Setting environment variables when deploying MEAN Stack app to AWS EC2 - node.js

I'm deploying a project to AWS for the first time, and everything I've read about deploying a MEAN stack app to EC2 says to install the project from its git repo. However, my environment variables for the various API keys and my database connection string live in files I placed in .gitignore, so they never reach the repo, and I'm stuck on how to set those environment variables so that my web application runs correctly. Does anyone have any idea how to go about this?

An EC2 instance is an entire virtual system. When you create a new EC2 instance, you will need to connect to it, git clone your project, install any necessary dependencies (Node.js, npm, etc.), and then start your application with whatever environment variables you like.
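For example, once connected over SSH, you could export the variables in the shell before starting the app (a minimal sketch; the variable names and values are placeholders):
export MONGO_URI="mongodb://user:pass@host:27017/mydb"
export API_KEY="your-api-key"
node server.js
Or, for a single run, pass them inline: MONGO_URI="..." API_KEY="..." node server.js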
You will want to use a tool like scp to upload any non-version-controlled files, such as the one holding your database string, or create them on the instance with a text editor (Vim, nano, etc.).
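For instance, to copy a local .env file up to the instance (the key file, user, hostname, and paths here are placeholders):
scp -i my-key.pem .env ec2-user@ec2-xx-xx-xx-xx.compute-1.amazonaws.com:/home/ec2-user/myapp/.env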
You can also put these steps in a startup script that runs when the instance is created; however, if you are deploying the project for the first time on a new machine, I don't think this is the way you will want to go.
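For completeness, a rough sketch of what such a user data startup script might look like on an Amazon Linux instance (the repo URL, paths, and Node version are placeholders):
#!/bin/bash
# runs once as root on first boot
yum update -y
curl -sL https://rpm.nodesource.com/setup_16.x | bash -
yum install -y nodejs git
git clone https://github.com/your-user/your-app.git /opt/your-app
cd /opt/your-app && npm install
MONGO_URI="your-connection-string" node server.js &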

Related

How to dynamically change API URLs in a React app running in a Docker container, without rebuilding?

What is the best way to manage API URLs in an application (created with create-react-app) that runs in a Docker container?
Essentially, I want to build one Docker image and be able to run it in different environments (production and staging, for example) without building a new image for each.
My current solution is to start the container with an environment variable, like docker run -e ENV=dev.
Add logic to read the env from query params; if no query param is passed, use the default. That way you can easily switch between envs on the fly. If you want to remember your user's choice, store it in local storage and read it from there when the query param is not passed.
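A rough sketch of that idea (the key names, URLs, and default are illustrative):
// read the env from the query string, fall back to the remembered choice, then the default
const params = new URLSearchParams(window.location.search);
const env = params.get('env') || localStorage.getItem('env') || 'production';
// remember an explicit choice for next time
if (params.get('env')) localStorage.setItem('env', env);
const API_URL = { production: 'https://api.example.com', staging: 'https://staging.example.com' }[env];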
I don't consider myself a React dev, but I have come across and solved the same problem with Vue and docker.
After a little research specifically for React, it looks like you can use the public folder to hold a file/directory that can be mounted into the running container. Vue has a similar folder. The files in that directory are accessible from the app's root URL (/some-file.blah).
Your app's directory structure might look like:
./app
./app/src
./app/src/public/
./app/src/public/config.json
./app/src/... (the rest of your app)
I assume config.json would then be available at /config.json after a build. You could then either include that file via your HTML template's script tags or load it on demand using AJAX, depending on what stage of the page lifecycle it's needed at.
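Loading it on demand might look something like this (startApp and the apiUrl key are hypothetical names, standing in for however your app bootstraps):
// fetch the runtime config before bootstrapping the app
fetch('/config.json')
  .then((res) => res.json())
  .then((config) => startApp(config.apiUrl));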
Having very little experience with React myself, I assume someone more familiar can provide clarification (or better edits) to help out.
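The per-environment part would then just be a bind mount at run time, along these lines (the image name and host path are assumptions, and the in-container path depends on where your build output is served from):
docker run -p 80:80 -v /srv/config/staging.json:/usr/share/nginx/html/config.json my-react-app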

Continuous development with Docker containers

I'm using docker for a new project for learning purposes. I thought that it would make things a lot easier since I have to setup the environment only once.
I've created a setup like this: a base image that installs Ubuntu and Node.js, and a development image that copies a src (web application) folder into the container, runs npm install, and starts server.js. That's basically it in a nutshell.
But now, whenever I make a change to my source code, I have to stop the running container, rebuild the image, and run it again. Rebuilding and running the development image doesn't take long, but it gets a bit annoying to do every time I make a change to my code during development.
What I normally had was a Gulp task or Browsersync watching my local files; every time I made a change, those changes were automatically visible in the browser. That really speeds up the development process.
I could still work like this during development by installing everything locally, but that kind of defeats the purpose of having a "development image": it means I still have to configure every system that wants to work on this web application with the appropriate Node version, database schemas, port mappings, SSL settings, certificates, etc.
So my question is: is there a way to run a container and, whenever I change the source code locally, have the change automatically pushed to the running container, so that I have "continuous development"?
If I understood you correctly, you don't want to rebuild the development image each time you update the source. If that's the situation, what you can do is:
For the development phase (when the source code is updated frequently):
Write the Dockerfile so that it makes use of a shared volume, where the source code will reside (see the sketch after this list).
Because the volume is shared, you can update the source code from outside the container, and the running container sees the changes immediately.
Also, if you need to do some additional task on each update, you can put it in a script and call that script each time you update the source.
Later on, when the development phase ends, you can use your current Dockerfile to build the actual development image.
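A minimal sketch of the development run, assuming the image's working directory is /app, the host source lives in ./src, and nodemon is available to restart the server on file changes (image name and paths are placeholders):
# mount the host source over the copy baked into the image
docker run -p 3000:3000 -v "$(pwd)/src":/app/src my-dev-image npx nodemon src/server.js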

Deploy NWJS with Docker

I am developing an app with NWJS and am now thinking about the deployment process. What I need is to install the app on the different machines that will use it. The problem I see is that if I change some file, I will have to reinstall it on each machine. I was reading about Docker, and if I understood correctly, I can build an image and have each machine that uses the app download the latest version of it.
The question is: can I upload the app into a container and download that onto each machine? And where should I look in the documentation to do that?
Thanks for any help.
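For reference, the workflow the question describes would look roughly like this with an image registry (the image and registry names are hypothetical, and note that running a GUI app like NWJS inside a container additionally requires forwarding a display, which is its own topic):
# on the build machine
docker build -t registry.example.com/nwjs-app:latest .
docker push registry.example.com/nwjs-app:latest
# on each machine that uses the app
docker pull registry.example.com/nwjs-app:latest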
I think I've cheated my way into a solution, this could work for you, depending on what your exact requirements are.
In one scenario, I have a shared network folder that lets machines launch the NWJS app via the network share, so every time I update the file and someone relaunches their shortcut, they have a fresh copy.
The remote users, who are not directly on our network, have their copy in a Dropbox folder, which of course updates automatically as I drop the new copy into that folder.
None of these solutions is as "clean" as an installer, but for our use case they work rather well. It's a bonus that Dropbox handles downloading the new copy of the file automatically.

Where are source files stored on Google Cloud Platform when deployed from local machine

I have just deployed a basic Node.js Express app to Google Cloud Platform from IntelliJ IDEA. However, I cannot find or browse the source files. I have searched in the Development tab and the App Engine tab; they show the project but not the actual files. I can access the application from my browser and it is running fine, and I can see the activity and all the requests coming into the application, but I cannot see the source files. I tried searching for them from the terminal in the Google Cloud Console and I cannot locate the files there either. It's puzzling because I don't know where the files are being served from.
AFAIK seeing the live app code/static content directly in the developer console is not possible (at least not yet), not even for the standard environment apps.
For apps using the flexible environment (which includes Node.js apps), accessing the live app source code may be even more complex, as what's actually executed on GAE is a container image (as opposed to the plain app source files of a standard environment app). From Deploying your program:
Deploy your app using the gcloud app deploy command. This command automatically builds a container image for you by using the Container Builder service (Beta) before deploying the image to the App Engine flexible environment control plane. The container will include any local modifications you've made to the runtime image.
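In practice, deploying is a single command run from the app directory (the project ID here is a placeholder):
gcloud app deploy app.yaml --project my-project-id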
Since what actually runs is a Docker container, it might be possible to extract its contents using the docker export command:
Usage: docker export [OPTIONS] CONTAINER
Export a container's filesystem as a tar archive
Options:
--help Print usage
-o, --output string Write to a file, instead of STDOUT
The docker export command does not export the contents of volumes associated with the container. If a volume is mounted on top of an existing directory in the container, docker export will export the contents of the underlying directory, not the contents of the volume.
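So, assuming you can reach the host the container runs on, the extraction might look like this (the container name is hypothetical):
docker export -o app-files.tar my-app-container
tar -tf app-files.tar    # list the exported filesystem to find the app code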
One way of checking the exact structure of the deployed app (at least in the standard environment) is to download the app code and check it locally - this may be useful if a suspected incorrect deployment puts a question mark on the local development repository the deployment originated from. Not sure if this is possible with the flexible environment, though.
The accepted answer to the recent Deploy webapp on GAE then do changes online from GAE console post appears to indicate that reading and maybe even modifying live app code might be possible (but I didn't try it myself and it's not clear if it would also work for the flexible environment).

Deployed version of NodeJS site not loading on AWS

I am doing my first deployment on AWS (using Elastic Beanstalk), and I am completely new to this.
I built a personal website using Node.js/Express, and on my local machine it loads just fine. Once I was ready to deploy a v1, I created an AWS account and set up a new Elastic Beanstalk application environment for Node. I set the static files to load from /public, set my Node version, and set the launch command to node app.js, but those were the only options I changed.
I zipped up my site (using Ctrl + Click -> Compress on a selection of all the site files) and uploaded that zip, and after some time it came up all green. Clicking the link to load my site, though, I get a half-finished version. Looking at my console, I see four files returning 404, and because of that, four failures from RequireJS.
These four files are Backbone views, and they live in a folder with four other JS files that all load just fine (I can open them in the Chrome dev tools Source tab from the deployed version). I am confused how just these four files could go missing.
Is there some way to FTP into wherever my files are stored, to confirm the files are in fact not present? And barring that, what steps can I take to figure out what is going on here? Like I said, it looks and loads just fine locally, and I am at a loss as to where to even start debugging something like this. The AWS docs I have read so far only tell me to do exactly what I have been doing.
Repo for the project is here: https://github.com/RyanMG/trustycode
And the deployment is here: http://trustycode.elasticbeanstalk.com/
The files it is having trouble with are under public/javascript/views/ (CodeView, AboutView, PhotoView, DesignView)
Any ideas / advice?
Is there some way to FTP into where ever my files are contained, to confirm the files are in fact not present?
You can ssh into the EC2 instance of the Elastic Beanstalk app using your pem file.
Check files in /var/app/current
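A sketch of that check (the key file, user, and hostname are placeholders; the SSH user depends on the AMI):
ssh -i my-key.pem ec2-user@ec2-xx-xx-xx-xx.compute-1.amazonaws.com
ls /var/app/current/public/javascript/views    # confirm the four view files are present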
I don't have the reputation to comment, but this is one of those common gotchas I ran into myself when switching to OS X from GNU/Linux at work: OS X's default filesystem is case-insensitive, while the Linux world is case-sensitive. A file required as codeView.js will resolve locally on OS X even if the file on disk is CodeView.js, but on the Linux server it will 404.
