Upload Node.js application to AWS EC2

I want to deploy my app (Node.js + MongoDB) on EC2. After installing Node.js and MongoDB on my EC2 instance, I do not know how to upload my source code to the instance, or where to put it.
Thanks!

When you created your instance, you should also have created an SSH key that you use to access the EC2 instance via SSH, which you clearly have if you were able to install additional packages.
You can use this same key to connect to the instance over SFTP and upload your files that way.
You can also use scp to copy the files directly from the command line or from a custom script.
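For example, a typical scp invocation might look like this (the key name, source directory, and hostname below are placeholders, and ec2-user is the default account on Amazon Linux):

# copy the application directory recursively to the instance's home directory
scp -i node.pem -r ./my-app ec2-user@ec2-12-34-56-78.compute-1.amazonaws.com:/home/ec2-user/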
As for where to upload the files, that depends on your setup. Which web server are you using (if any)? Depending on your application, you need to pick a location that is accessible to your web server. The default document root for an Apache server is /var/www.

You can use WinSCP (https://winscp.net/eng/download.php) to get "explorer-like" access to your EC2 Linux instance:
Note: the .ppk private key was created using the "node.pem" key from AWS. But you must first give the key the permissions SSH expects by running:
chown :Users node.pem
chmod 400 node.pem
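If you still need to create the .ppk file, one way to do it (assuming the puttygen command-line tool from the PuTTY tools package is available) is:

# convert the AWS-issued PEM key to PuTTY's .ppk format for WinSCP
puttygen node.pem -O private -o node.ppk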

Related

How to set up file upload from NodeJS/VueJS to another server?

I've created a VueJS app for file uploading, which will be my admin panel (for a CMS) with NodeJS as the back end. Now I want to take the files that were passed to NodeJS and move them to another VueJS app, which will be my primary website, so that I can access the files locally. How can I do it? Any suggestions or a different approach will do.
You have many options for moving the files to another site.
You could store them in a shared bucket or a shared directory between your two backends, or you could add another route to download the file.
You could configure a cron job to scp or rsync those files to your target machine.
This is really more a question of how to sync a directory to somewhere else.
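A minimal sketch of the cron approach mentioned above (the directories, user, and hostname are hypothetical):

# crontab entry: every 5 minutes, push new uploads to the other server over SSH
*/5 * * * * rsync -az /var/uploads/ deploy@primary-site.example.com:/var/www/uploads/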

Setting environment variables when deploying MEAN Stack app to AWS EC2

I'm deploying a project that I've been working on to AWS for the first time. Everything I've read about deploying a MEAN stack app to EC2 says to install the project via a git repo. However, I have environment variables for different API keys and my database string in a file that I placed in my .gitignore, so I'm facing the issue of setting those environment variables so that my web application runs correctly. Does anyone have any idea how to go about this?
An EC2 instance is an entire virtual system. When you create a new EC2 instance, you will need to connect to it, git clone your project, install any necessary dependencies (Node.js, npm, etc.), and then start your application with whatever environment variables you like.
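For example, a minimal way to do that from the shell (the variable names and values here are placeholders for your own):

# set the variables for this shell session, then launch the app
export MONGODB_URI="mongodb://localhost:27017/myapp"
export API_KEY="replace-with-your-key"
node server.js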
You will want to use a tool like scp to upload any non-version-controlled files, such as the file containing your database string, or create them on the instance with a text editor (Vim, nano, etc.).
You could create a startup script that does this when the instance is created; however, if you are deploying the project for the first time on a new machine, I don't think that is the way you want to go.

Web based method to get public key from .pem file

I have created an EC2 instance and now want to connect to it from a Chromebook. For the time being, I only have access to this Chromebook, and I am after a way of generating my public key from the .pem file that Amazon issues.
I am familiar with how to do this via the Linux command line, but I need a web based solution for this.
You need an SSH client for Chrome; try this one:
https://chrome.google.com/webstore/detail/secure-shell/pnhechapfaindjhompbnflcldabbghjo
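For reference, the command-line step the question alludes to, which you could run once any shell is available (the key filename is a placeholder), is:

# print the public key derived from the AWS-issued private key
ssh-keygen -y -f node.pem > node.pem.pub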

AWS: hosting assets and client-side static code in a central repository that Node can upload files to

I am using AWS to run my MEAN-stack application. I am using a load balancer with three Node application server instances and three MongoDB database server instances in a cluster. My application lets users upload file content to the server, mainly images, audio, videos, etc. I want the following:
I want to create one central content repository that is accessible from all three of my Node application servers, so that my Node code can upload files to it.
I want one URL for accessing this central content repository, which can be used in the user interface to load and display assets.
I also want to re-purpose this same central repository to host all of my client-side JavaScript, CSS, and images, and would like my index.html to reference client-side assets from the central repository URL.
I was looking into AWS options but got confused, and I am not able to work out the best and easiest approach.
Please advise.
I couldn't figure out how to solve this with EC2, so I changed my implementation approach: rather than hosting files on the file system, I use MongoDB GridFS to store them. That made the files available to each Node.js host.
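As a rough illustration of the GridFS approach (the database name and filename are hypothetical), the mongofiles tool that ships with MongoDB can store and retrieve files:

# store a file in GridFS, then fetch it back on any host that can reach the database
mongofiles --db cms put uploads/logo.png
mongofiles --db cms get uploads/logo.png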

How to deploy heavy files using Chef

I have a file of approximately 900 MB.
I want to deploy the file to the target machine. I used the file resource in Chef (cookbook_file), but I am unable to upload the cookbook to the server because of its size.
Is there any way to deploy the file to the target machine other than downloading it from the internet (using remote_file)?
I'm sorry, but the Chef server is not an artifact server; it is a bad idea to upload large binaries to it. Place them on some server on the intranet, or perhaps add an Apache HTTP server to the node running the Chef server and upload the files there. And yes, then use the remote_file resource.
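A minimal sketch of that setup, assuming an Apache document root of /var/www/html on an intranet host (the hostname, user, and paths are placeholders):

# copy the large artifact into the intranet web server's document root
scp bigfile.bin admin@intranet-host:/var/www/html/artifacts/
# the recipe's remote_file resource would then download it from a URL such as
# http://intranet-host/artifacts/bigfile.bin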
We had a similar issue, since our requirement was that the huge file be on the Chef server before deployment to the client.
We just used the split command, then used cat to join the pieces on the target node. It worked well.
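A sketch of that split/join approach (the filenames and chunk size are hypothetical):

# before uploading: break the 900 MB file into 100 MB chunks
split -b 100m bigfile.bin bigfile.part.
# on the target node: reassemble the chunks into the original file
cat bigfile.part.* > bigfile.bin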
