For Node servers I am able to figure out whether they are running in a Docker container using the is-docker package. This obviously doesn't work for React applications, which run in the browser rather than on the command line, so fs is unavailable. Is there any other way for a web application to figure out whether it is running in a Docker container?
I noticed that create-react-app supports injecting environment variables into process.env at build time using .env files. So I created a .env.development file containing REACT_APP_DOCKERENV=false and a .env.production file containing REACT_APP_DOCKERENV=true. These variables are then injected when running yarn start and yarn build, respectively.
Of course this assumes my development builds are always executed locally while the production builds are always executed in Docker, which works in my particular scenario.
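For reference, a minimal sketch of how the variable can then be checked in the app (the component-side code below is my own illustration, not part of the original setup):
// .env.development contains: REACT_APP_DOCKERENV=false
// .env.production contains:  REACT_APP_DOCKERENV=true
// create-react-app inlines REACT_APP_* variables at build time,
// so this check works in the browser without needing fs.
const isDocker = process.env.REACT_APP_DOCKERENV === 'true';

if (isDocker) {
  // e.g. point API calls at a container-internal hostname instead of localhost
  console.log('Production build, assumed to be running inside Docker');
}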
I'm trying to deploy a Next.js project using a Docker image, and I was wondering if it's possible to simply use an already generated build folder (.next) and start the Next.js server (npm run start) without having to trigger the build step again in the container.
The container will be hosted on AWS Elastic Beanstalk, and I also want to avoid uploading the source code and installing the npm packages there, as I already have a CI pipeline that generates the production artifacts.
Answering because I was going through issues with this myself and found this question. My research shows the only way to accomplish this is to run a Docker environment on Beanstalk rather than the Node.js one. The primary reason is that there are absolute paths in the .next build artifacts, so otherwise you have to build on each instance and make sure that BUILD_ID is synced across those instances.
If your CI pipeline can handle creating and pushing the Docker image, then it's pretty easy to deploy on Beanstalk without any rebuilding. Hope that helps!
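As a rough sketch, the CI pipeline could build and push an image along these lines (base image, script names, and paths are assumptions based on a default Next.js setup; because the build runs once during docker build, every container shares the same BUILD_ID and paths):
# Stage 1: build once, inside "docker build", on the CI machine.
FROM node:16-alpine AS builder
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm ci
COPY . .
# "build" is assumed to run "next build"
RUN npm run build

# Stage 2: runtime image; nothing is rebuilt on Beanstalk.
FROM node:16-alpine
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm ci --omit=dev
COPY --from=builder /app/.next ./.next
COPY --from=builder /app/public ./public
EXPOSE 3000
# "start" is assumed to run "next start"
CMD ["npm", "run", "start"]
Beanstalk's Docker platform then only pulls and runs this image, so neither the source code nor node_modules needs to be uploaded to the environment.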
I am building a Node.js application using MongoDB as the database. I am thinking that it will make more sense, in terms of portability across different platforms as well as versioning and comparison, to have the application deployed in Docker. Going through various recommendations on the internet, here are my specific questions:
(a) Do I copy my application code (Node.js) into the Docker image? Or do I keep the source code on the host machine and make the code base available to Docker using volumes? (Just for experimenting, I had a Dockerfile instruction pull the code from the repository directly into the image. It works, but is it good practice, or should I pull the code outside the container and make it available using volumes / copy it in?)
(b) When I install all my application dependencies, my node_modules size explodes to almost 250 MB. So would you recommend running npm install (for dependencies) as a Docker build step, which will increase the size of my image? Or is there any other alternative you can recommend?
(c) For connecting to the database, what is the recommendation? Would you recommend using another Docker container with the MongoDB image and defining the dependency between the web app and the db using Docker? Along with that, having a configurable runtime property so that the app in different environments (PROD, STAGE, DEV) can connect to a different database (MongoDB)?
Thoughts / suggestions greatly appreciated. I am sure these are questions all of you have run into at some point and have adopted different approaches to, with pros and cons.
Do I copy my application code (Node.js) into the Docker image? Or do I keep the source code on the host machine and make the code base available to Docker using volumes?
You should have the Node.js code inside the container. Keeping the source code on your host machine makes your image non-portable: if you switch to another machine, you need to copy the code there as well.
You can also pull the code directly into the container if you have git installed inside it. But remember to remove the .git folder to keep the image smaller.
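For example, a sketch of the git-clone variant (the repository URL is a placeholder; cloning and deleting .git in the same RUN instruction keeps the layer small):
FROM node:16-alpine
WORKDIR /usr/src/app
# Clone the code into the image, then drop the .git folder in the same
# layer so it does not end up in the final image.
# (dependency install and the start command are omitted here; see below)
RUN apk add --no-cache git \
 && git clone https://example.com/your/repo.git . \
 && rm -rf .git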
When I install all my application dependencies, my node_modules size explodes to almost 250 MB. So would you recommend running npm install (for dependencies) as a Docker build step, which will increase the size of my image? Or is there any other alternative you can recommend?
That is just Node pulling in half the internet; you do have to install your dependencies. However, you should run npm cache clean --force after the install to clean up and keep the image smaller.
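A sketch of how that can look in the Dockerfile when copying the code in (file and script names are assumptions; doing the install and the cache clean in the same RUN instruction keeps the cache out of the image layer):
FROM node:16-alpine
WORKDIR /usr/src/app
# Install dependencies and clean the npm cache in one layer,
# so the cache is never baked into the image.
COPY package.json package-lock.json ./
RUN npm install --production && npm cache clean --force
# Copy the application code (add node_modules and .git to .dockerignore).
COPY . .
CMD ["node", "server.js"]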
For connecting to the database, what is the recommendation? Would you recommend using another Docker container with the MongoDB image and defining the dependency between the web app and the db using Docker? Along with that, having a configurable runtime property so that the app in different environments (PROD, STAGE, DEV) can connect to a different database (MongoDB)?
It is a good idea to create a separate container for the database and connect your app to it using Docker networks. You can have multiple databases at the same time, but preferably keep one db container inside the network; if you want to use another one, just remove the old one from the network and add the new one.
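A rough sketch of that setup (the network, image, and variable names are made up; the app is assumed to read MONGO_URL from its environment):
# Create a user-defined network and put the database container on it.
sudo docker network create app-network
sudo docker run -d --net app-network --name mongo mongo
# Run the app on the same network; the connection string is injected at
# runtime, so PROD/STAGE/DEV can each point at a different database.
sudo docker run -d --net app-network -e MONGO_URL=mongodb://mongo:27017/myapp-dev my-node-app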
A
During development
Using a directory on the host is fast. You modify your code, relaunch the Docker container, and your app starts quickly.
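For example (the paths, port, and image name are assumptions):
# Bind-mount the project from the host so code changes are picked up
# without rebuilding the image.
sudo docker run -i -t -p 3000:3000 -v ~/my-project:/usr/src/app my-node-app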
Docker image for production/deployment
It is good to pull the code from git. It's heavier to build, but easier to deploy.
B
During development
Don't run npm install inside Docker; you can handle the dependencies manually.
Docker image for production/deployment
Make a single npm i while building the image, because the dependencies are supposed to be static anyway.
More explanation
When you are developing, you change your code, use a new package, adapt your package.json, update packages...
You basically need to control what happens with npm. It is easier to interact with it if you can directly execute command lines and access the files (outside Docker, in a local directory). You make your change, relaunch your Docker container, and it's up and running!
When you are deploying your app, you don't need to interact with npm modules. You want a packaged application with a fixed version number and release date that does not move and that you can rely on.
Because npm is not 100% trustworthy, it can happen that, with the exact same package.json, something pulled in by npm i makes the application crash. So I would not recommend running npm i at every application relaunch or deployment: imagine a package breaks and you have to rush to find a fix. Moreover, there is no need at all to reload packages that should be exactly the same (and they should be!). Deployment is not the time to update packages; do that in your development environment, where you can npm update safely and test everything.
C
Use two Docker images and connect them using a Docker network, so you can easily deploy your app anywhere.
Some commands that may help with Docker networking (I'm actually using this at my company):
# Create your own network with Docker
sudo docker network create --subnet=172.42.0.0/24 docker-network
# Run the MongoDB container
sudo docker run -i -t --net docker-network --ip 172.42.0.2 -v ~/DIRECTORY:/database mongodb-docker
# Run the app container
sudo docker run -i -t --net docker-network --ip 172.42.0.3 -v ~/DIRECTORY:/local-git backend-docker
We've created a Node.js/ReactJS app. We are using Bitbucket (git repo) and Docker containers, along with Jenkins + AWS ECS (Elastic Container Service).
The process we use today is: when we are ready to deploy and have a new version ready, we go into the /assets directory and run gulp build. This handles the whole build/minification process and at the end gives us the version number. From there we check this into the git repo, and since it has the version, this becomes the tag in the repo. All good, right? :)
From here, in Jenkins we can simply run the build, choosing Prod/Master for example, and it takes care of grabbing all the npm packages, pushing the Docker image to ECR, and updating the revision within ECS. And then the service is up and running.
It seems to me that we should not be running this gulp build command locally and having to check the output into the git repo. Not to mention this leaves the git repo a bit messy, and with other developers it's not a great solution to have the 'compiled', minified files in there.
Wouldn't the better practice be to have this gulp build run on Jenkins?
However, I believe we would still like to retain the tagging within the git repo? Is there another way to achieve this?
Has anyone dealt with a similar issue or has a best practice for something like this?
Really curious to hear what you think.
Thanks in advance.
There's no "best practice" but if you want it to be less messy you can look at using a Jenkinsfile: https://jenkins.io/doc/book/pipeline/jenkinsfile/
It doesn't matter what commands you have; it's just best practice for Jenkins to run them, so gulp build should run on Jenkins. The only local process should be commit and push; Jenkins should handle the rest.
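For instance, a minimal declarative Jenkinsfile along those lines (stage names, directories, and the registry URL are placeholders; the point is that the gulp build now happens on Jenkins rather than locally):
pipeline {
  agent any
  stages {
    stage('Install') {
      steps {
        sh 'npm install'
      }
    }
    stage('Build') {
      steps {
        // The same gulp build that used to be run locally in /assets.
        sh 'cd assets && gulp build'
      }
    }
    stage('Docker') {
      steps {
        // Build and push the image; tagging the git repo could be an extra step here.
        sh 'docker build -t my-registry/my-app:${BUILD_NUMBER} .'
        sh 'docker push my-registry/my-app:${BUILD_NUMBER}'
      }
    }
  }
}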
I'm trying to use Docker with a Node.js web app I'm working on.
I have familiarized myself with the docker concepts and gotten up and running with the example here: https://docs.docker.com/examples/nodejs_web_app/
I get the general process: write a Dockerfile -> build a Docker image -> run it in a VM.
However, it seems impractical to rebuild the image and restart the container every time I change a file.
I currently have a gulp / live-reload setup that works great for development so I was wondering if there was any recommended way of accomplishing something like this with docker.
Thanks!
You can mount the source directory into the container as a volume and use the same gulp/livereload setup that you use now. Here's an example project with this setup. If you run into port issues with livereload, see here.
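A minimal sketch of what that run command could look like (the image name, paths, and app port are assumptions; 35729 is the conventional livereload port, and the container is assumed to start the gulp/livereload task itself):
# Mount the source into the container and expose both the app port
# and the livereload port so the browser can reach them.
sudo docker run -i -t -p 8080:8080 -p 35729:35729 -v ~/my-app:/usr/src/app my-node-app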