For Node servers I am able to figure out whether they are running in a Docker container using the is-docker package. That obviously doesn't work for React applications, which run in the browser rather than on the command line, so fs is unavailable. Is there any other way for a web application to figure out whether it is running in a Docker container?
I noticed that create-react-app supports injecting environment variables into process.env at build time using .env files. So I created a .env.development file containing REACT_APP_DOCKERENV=false and a .env.production file containing REACT_APP_DOCKERENV=true. These variables are then injected when running yarn start and yarn build respectively.
Of course this assumes that my development builds are always run locally and my production builds are always run in Docker, which holds in my particular scenario.
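For reference, the build-time variable can then be read in application code. A minimal sketch, assuming the REACT_APP_DOCKERENV variable described above (the helper name isDockerBuild is hypothetical):

```javascript
// Minimal sketch: create-react-app inlines REACT_APP_* variables at build
// time, so this check is resolved when the bundle is built, not at runtime.
function isDockerBuild(env) {
  // env is expected to be process.env (or a stand-in when testing)
  return env.REACT_APP_DOCKERENV === 'true';
}

// In application code: if (isDockerBuild(process.env)) { ... }
```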
I was trying to deploy my Node.js/TypeScript web application to my cPanel shared hosting, but I am getting this error:
"Cloudlinux NodeJS Selector demands to store node modules for application in a separate folder (virtual environment) pointed by symlink called "node_modules". That's why the application should not contain folder/file with such name in application root."
I have created a Node.js application. The final Node.js/TypeScript folders and files were moved to the Node.js application directory. I was also able to install TypeScript and run yarn install after copying my virtual environment, working from my local terminal over SSH.
The issue is that, from my Node.js application end, I can't run any script or npm install, nor can I from my virtual terminal. But yarn runs fine.
I'm also aware that another folder, /nodevenv/, was created in my root hosting path, where another instance of my domain/sub-domain name lives, holding the Node executable folders/files.
What I don't know is: does it have anything to do with running npm scripts? Or does it have anything to do with my application path?
Another weird thing I couldn't figure out is how to run the frontend and backend together. They both run fine on my local machine.
Do I create a subdomain for the server-side and the frontend from the main domain?
I want them to run together as they did on localhost:8080 on my local machine. How do I set them to run on the same port on my shared hosting?
What I did was create a proxy in the frontend's package.json file, like so: "proxy": "example.com", and it ran fine on my local machine.
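Note that in a create-react-app or webpack-dev-server style setup, the proxy field in package.json only affects the development server and is ignored by a production build, which may explain why this worked locally but not on the host. A hypothetical dev configuration (the port is an assumption) looks like:

```json
{
  "name": "frontend",
  "proxy": "http://localhost:8080"
}
```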
Remove the node_modules folder and try to run your commands again.
Try moving node_modules to the public folder.
It's occurring because the node_modules folder already exists. If you remove the folder, the command sent by the "► Run NPM Install" button will execute successfully.
I'm experimenting with Docker microservices and I'm refactoring an existing Node.js app to run as several separate entities. What I'm used to doing is running it locally (using npm start), and whenever I save my changes, everything redeploys locally instantly.
How do I set up VSCode with its Docker extension to work this way? I've let VSCode create the Dockerfile / docker-compose.yml files for each microservice.
The guides I'm finding imply pushing each change to an image repo, which, when you're making small changes, seems long-winded.
EDIT - Okay, I got it building and deploying manually. I had to run the build command in CMD instead of Git Bash (a minor frustration, but nothing I can't live with). Is there a way to automate this instead, so that whenever a change happens inside the directory, the build process starts again?
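One common pattern for this kind of instant local redeploy is to bind-mount the source directory into the container and run the app under a file watcher such as nodemon instead of plain node. A docker-compose.yml sketch (the service name, paths, and port are assumptions, not from the original setup):

```yaml
# Development-time sketch; service name, paths, and port are assumptions.
services:
  api:
    build: ./api
    # Run under nodemon so the process restarts when a file changes
    command: npx nodemon server.js
    volumes:
      - ./api:/usr/src/app          # bind-mount the source from the host
      - /usr/src/app/node_modules   # keep the container's own node_modules
    ports:
      - "3000:3000"
```

With this, saving a file on the host restarts the process inside the container without rebuilding the image.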
I am building a Node.js application using MongoDB as the database. I think it will make more sense, in terms of portability across platforms as well as versioning and comparison, to deploy the application in Docker. Going through various recommendations on the internet, here are my specific questions:
(a) Do I copy my application code (Node.js) into the Docker image? Or do I keep the source code on the host machine and make the code base available to Docker using volumes? (Just for experimenting, I had a Dockerfile instruction pull the code from the repository directly into the image. It works, but is it good practice, or should I pull the code outside the container and make it available via volumes, or copy it in?)
(b) When I install all my application dependencies, my node_modules size explodes to almost 250 MB. Would you recommend running npm install (for dependencies) as a Docker build step, which will increase the size of my image? Or is there any other alternative you can recommend?
(c) For connecting to the database, what is the recommendation? Would you recommend using another Docker container with a MongoDB image and defining the dependency between the web app and the db using Docker? Along with that, having a configurable runtime property so that the app in different environments (PROD, STAGE, DEV) can connect to a different MongoDB database?
Thoughts and suggestions greatly appreciated. I am sure I'm asking questions that all of you have run into at some point, and that you have adopted different approaches to, each with pros and cons.
Do I copy my application code (nodejs) within Docker? Or do I keep source code on the host machine and have the code base available to Docker using Volumes?
You should have the Node.js code inside the container. Keeping the source code only on your machine makes your image non-portable: if you switch to another machine, you have to copy the code there as well.
You can also pull the code directly into the container if git is installed inside the container. But remember to remove the .git folder to keep the image smaller.
When I install all my application dependencies, my node_module size explodes to almost 250 MB. So would you recommend that run npm install (for dependencies) as Docker step, which will increase the size of my image? Or is there any other alternative that you can recommend?
That is just Node pulling in packages from all over the internet; you do have to install your dependencies. However, you should run npm cache clean --force after the install to clean up and keep the image smaller.
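Putting these points together (code copied into the image, dependencies installed as a build step, cache cleaned afterwards), a minimal Dockerfile might look like this; the base image tag, paths, and entry file are assumptions:

```dockerfile
# Minimal sketch; the base image tag, paths, and entry file are assumptions.
FROM node:alpine
WORKDIR /usr/src/app

# Install dependencies first, so this layer is cached between code changes
COPY package*.json ./
RUN npm install && npm cache clean --force

# Copy the application code into the image (use a .dockerignore to keep
# .git and node_modules out of the build context)
COPY . .

CMD ["node", "server.js"]
```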
For connecting to the database, what will be the recommendation? Would you recommend, using another docker container with MongoDB image and define the dependency between the web and the db using docker? Along with that have configurable runtime property such that app in different environments (PROD, STAGE, DEV) can have the ability to connect to different database (mongodb)
It is a good idea to create a separate container for the database and connect your app to it using Docker networks. You can have multiple DBs at the same time, but preferably keep one db container inside the network; if you want to use another one, remove the old one from the network and add the new one.
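To make the connection configurable per environment (PROD, STAGE, DEV), the usual approach is to read the connection string from an environment variable passed to the container. A minimal sketch; the variable name MONGODB_URI and the fallback host db (the database container's name on the Docker network) are assumptions:

```javascript
// Minimal sketch: choose the MongoDB connection string per environment.
// MONGODB_URI and the fallback host "db" are assumptions, not fixed names.
function mongoUri(env) {
  return env.MONGODB_URI || 'mongodb://db:27017/myapp';
}

// At startup, something like: MongoClient.connect(mongoUri(process.env))
```

Each environment then sets MONGODB_URI differently (e.g. via docker run -e or a compose file) without any code change.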
A
During development
Using a directory on the host is fast. You modify your code, relaunch the Docker image, and it starts your app quickly.
Docker image for production/deployment
It is good to pull the code from git. It's heavier to run, but easier to deploy.
B
During development
Don't run npm install inside Docker; you can handle the dependencies manually.
Docker image for production/deployment
Make a single npm i during image building, because it's supposed to be static anyway.
More explanation
When you are developing, you change your code, use a new package, adapt your package.json, update packages ...
You basically need to control what happens with npm. It is easier to interact with it if you can directly execute command lines and access the files (outside Docker, in a local directory). You make your change, you relaunch your Docker container, and off it goes!
When you are deploying your app, you don't need to interact with npm modules. You want a packaged application, with an appropriate version number and release date, that does not change and that you can rely on.
Because npm is not 100% deterministic, it can happen that with the exact same package.json, the packages you get from npm i make the application crash. So I would not recommend running npm i at every application relaunch or deployment: imagine a package breaks, and you have to rush to find a fix. Moreover, there is no need to re-download packages that should be exactly the same (they should be!). Deployment is not the moment to update packages; that belongs in your development environment, where you can npm update safely and test everything. (Committing a lockfile and installing with npm ci is one way to pin exact versions.)
(Sorry about my English!)
C
Use two Docker images and connect them using a Docker network, so you can easily deploy your app anywhere.
Here are some commands that may help with Docker networking (I'm actually using this at my company):
# Create your own network with Docker
sudo docker network create --subnet=172.42.0.0/24 docker-network
# Run the MongoDB container
sudo docker run -i -t --net docker-network --ip 172.42.0.2 -v ~/DIRECTORY:/database mongodb-docker
# Run the app container
sudo docker run -i -t --net docker-network --ip 172.42.0.3 -v ~/DIRECTORY:/local-git backend-docker
Bit of a n00b question:
I created a project with vue-cli using the webpack template.
On my Windows machine I run npm run dev and I get a frontend dev server with HMR and so on.
Now I want to deploy my app to a production machine - ubuntu on DigitalOcean.
What are the steps I must take? I'm not very familiar with the logic of how it's supposed to work. If my Ubuntu machine has NODE_ENV set to production, it won't install any of the devDependencies and I'm not able to build anything. So I guess I'll have to change that? If so, that doesn't make sense, since it's a production machine.
And do I have to create another node/express server to serve index.html? Isn't it supposed to work out of the box somehow?
Thanks :)
TL;DR Build on your local machine and everything you need will be outputted in the ./dist/ directory, just copy the contents over to the webroot on your production server and you're good to go.
The webpack template handles most of the stuff for you.
Steps you need to take to release:
Run npm run build on your local machine
Copy the contents of the generated ./dist/ directory to your server webroot
That's it!
When you run npm run build, the pre-configured build script sets the Node environment to production and builds only what should be in production; it also optimizes the code and removes debug capabilities. As for dependencies, webpack takes care of them and includes them in the generated JavaScript files under ./dist/js/, so you need not concern yourself with copying over the node_modules/ directory either.
It also copies everything in your static directory and your src/assets directory to the ./dist/ directory in preparation for a release, and resolves all references to the new paths generated by webpack.
The production server should not be concerned with building the Vue app; run the build command on your local machine to keep dev dependencies away from your production server. I recommend against installing webpack and other dev tools on your production server; it just pollutes the server with things that are not needed there.
Some development tools can potentially cause a lot of issues on production servers, so best practice is never to install them.
You could optionally create your own release script that uses FTP or rsync, whichever you prefer, to copy everything in the ./dist/ directory to the production server webroot. This could be a bash script; on Windows you can run it in Git Bash, for example.
Hope that cleared things up, congrats on your first vue release!
How do I test a heroku node task on my local machine using the heroku runtime environment (foreman?)
I can successfully run my task like so: node my_task.js. Now I need to (a) make it an executable in the bin directory, which makes it dependent on the location of the node binary on the system, and (b) have this specific task use environment variables defined in my .env file, which I can't mimic by just running it with node unless I hard-code them, which would defeat the purpose.
Is there a way to use foreman or heroku cli to run a task as it would be run in the heroku environment?
I spoke with Heroku support and figured this out. I didn't realize you can use foreman to run other processes:
foreman run node bin/my_task works for me. That way I can also keep the shebang node path as app/bin/node instead of having to switch it to usr/bin/node for my Ubuntu box.
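For reference, foreman run executes an arbitrary command with the variables from the local .env file loaded, so a minimal setup might look like this (the file contents and the MY_API_KEY variable are illustrative):

```
# Procfile
worker: node bin/my_task

# .env (kept out of version control)
MY_API_KEY=example-value
```

Then foreman run node bin/my_task executes the task with MY_API_KEY available in process.env, much as it would be on Heroku.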