Writing files in a Docker Node.js image

I'm building a Node.js (Express) application off the Node Docker image and need to generate files through the app. Whenever I run the application outside of Docker my code works and the file shows up as expected, but when I bring it up with docker-compose up the rest of the app works fine and no files are written. The scripts don't throw any errors either, so my suspicion is that this has more to do with how Docker manages files within containers.
I've tried two different implementations of my solution, one using the fs library and another generating files with ffmpeg, both with the same result.
I've also tried console.log(__dirname) to confirm I'm looking in the right directory, but, alas, no luck there.
If it's of any help, I'm working off the Node image and the MySQL image, using Docker Compose to link the two. I'm also using the PM2 process manager (recommended by a tutorial I was following), in case that gives any other useful context.
Should I be creating some kind of volume for Docker to write files to, or is it as simple as adding a specific library to my application?
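For what it's worth, my current understanding of what such a volume would look like in docker-compose (the service name and paths here are just placeholders):

    version: "3"
    services:
      app:
        build: .
        volumes:
          # map ./output on the host into the container, so files the app
          # writes there persist and are visible outside Docker
          - ./output:/usr/src/app/output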

Related

Docker vs Node.js

I'm fairly new to React Native. I'm sorry about quite a convoluted question, but I have a dilemma. I am building an API that communicates with a server app that I'm working on. I have been using Docker successfully to run containers, BUT I'm constantly being told that I don't need to run Docker at all. I understand the principles of Docker and Node.js, but in all honesty I can't imagine how I would run the server side without Docker. I've tried Node.js and it seemed to require a PHP server, which I was also told I did not need. Is this true? Which is better, Docker or Node.js? And if Node.js is better, how do I run it without a PHP server? It's my understanding that PHP serves the pages and React consumes them.
You can just install Node, frequently through your OS's package manager. It doesn't require PHP or other language interpreters. I find working directly with Node much easier than using Node in Docker: it is actually a local development environment that my IDE is comfortable with, and not a path to run a Node interpreter "somewhere else" that's isolated from my desktop tooling.
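On a Debian/Ubuntu system, for instance, that can be as simple as the following (package names vary by distro):

    sudo apt-get install nodejs npm   # Debian/Ubuntu package names
    node --version                    # confirm the install worked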
1) After a few weeks of research I found that I didn't need Docker at all. Node can run a server on its own using either Fastify or Express; I just needed to check the relevant documentation for usage (a minimal sketch follows this list).
2) I linked Fastify to ngrok and exposed my local IP address to a public-facing URL.
3) I pointed my FreeDNS domain at the ngrok URL and voila! It worked!
4) I had a small problem with the port, which was resolved by using the command ngrok http 127.0.0.1:5000
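For anyone else landing here, the Fastify server itself was only a few lines; something like this minimal sketch (the route is just a placeholder):

    // server.js - minimal Fastify server
    const fastify = require('fastify')();

    // placeholder route
    fastify.get('/', async () => {
      return { hello: 'world' };
    });

    // bind to 127.0.0.1:5000 so `ngrok http 127.0.0.1:5000` can reach it
    fastify.listen({ port: 5000, host: '127.0.0.1' }, (err) => {
      if (err) {
        console.error(err);
        process.exit(1);
      }
    });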

How to integrate an Angular CLI project with a Node.js/Express project

I have 2 projects, as shown in the following images.
The Node/Express project is in the root folder, and the Angular CLI project is inside the angular-src folder.
I can run each project individually: the back end on port 3000 with node app.js and the front end on port 4200 with ng serve.
What is the best way to integrate both projects, and the easiest to deploy?
[Image: full project folder]
[Image: full project folder with angular-src files shown]
Most people keep their frontend (FE) and backend (BE) in separate repos. This makes sense when you consider that you can have multiple frontends for a single backend.
However, it's often easier just to keep them in the same repo; then when you deploy you only need to clone one repo.
Also, it depends what route you go down. Will you install Node.js etc. on your server, or will you go down the Docker route?
People often like to use Docker to containerise their frontend and backend. The benefit of this approach is that you only need one dependency installed on each server you deploy to: Docker itself. Everything else exists in the Docker world.
Also, Docker makes it easy to docker pull from any server. If things go wrong, just kill your container and spin up a new one (assuming your data would not be lost in doing so; that's not a worry for the frontend, but it is a backend concern).
Obviously there is the Docker learning curve, but it's not too painful.
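If you do go the Docker route, a docker-compose file with one service per side is a common shape; a rough sketch (service names, paths, and ports below are placeholders):

    version: "3"
    services:
      backend:
        build: ./backend      # Node/Express API (placeholder path)
        ports:
          - "3000:3000"
      frontend:
        build: ./frontend     # built Angular app served by nginx (placeholder)
        ports:
          - "80:80"
        depends_on:
          - backend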

Push code to remote Docker

I have a GitLab repository containing a Node.js app with Express. I want to deploy this code to my Ubuntu server so I can use the Express server remotely and not only locally, but I don't want to install Node.js on the server; instead I want to try using Docker.
I have read a lot about Docker and I understand the fundamentals. My question is this: if I install Docker on my Ubuntu server, how can I deploy my code to Docker whenever I push to my repository?
Basically, you have to divide the process into two steps. One is dockerizing your app, which means creating a Docker image for your repository. The second is having your server use this image, possibly automating the process on push. So I would do something like this:
Dockerize your app. This means having a Dockerfile where you create an image that contains your app, runs it, and possibly exposes a port so it can be used externally (a sketch of this follows the list).
Run the image on your server. Your server will need to have Docker installed and be able to get the right image (more on this later). If only one image is involved, you can just use a simple docker run command. If there are more parts involved, such as a database or a web server, I would recommend docker-compose.
Make the image available on your server. You have more than one option here: you can publish your image to a Docker registry (private or public), or you can just clone the repository on your server and build the image there.
Lastly, you need to tie these steps together. For that you need a hook that reacts to commits, e.g. a webhook that sends a command to the server to fetch/build the image and run the newer version.
You have a lot of flexibility in how to do this, actually. I would start with a simpler process, where you build the image on your server, and build on top of that according to your needs.
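As a sketch of the dockerizing step, a Dockerfile for a typical Express app could look something like this (the base image tag, port, and start command are assumptions about your project):

    # start from the official Node image
    FROM node:18

    WORKDIR /usr/src/app

    # install dependencies first so this layer is cached between builds
    COPY package*.json ./
    RUN npm install

    # copy in the rest of the app
    COPY . .

    # port the Express app listens on (an assumption)
    EXPOSE 3000

    CMD ["node", "app.js"]

On the server you would then build and run it with something like docker build -t myapp . followed by docker run -d -p 3000:3000 myapp, where the image name myapp is just a placeholder.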
Dokku is a Docker-based PaaS that provides git push deployments. It supports Heroku buildpacks to build and run your application, or custom Dockerfile deployments.

Running multiple Node applications using the same configuration file outside of each project

I am using pm2 to run several Node applications. The problem is that I am using config files for every application, so what I want is to have, say, a single JSON file outside all of the application folders that they can ALL point to for common database connections etc.
I'd prefer not to use Linux environment variables, unless there is an easy and clean way of setting them up.
pm2 does have the ecosystem file, but it doesn't seem very well documented to me.
What other solutions are there?
pm2 ecosystem // this generates a .config.js, not a .json
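For reference, the generated ecosystem file is a JS module of roughly this shape (the app names and script paths are whatever you configure):

    module.exports = {
      apps: [
        { name: 'app-one', script: './app-one/index.js' },  // placeholder entries
        { name: 'app-two', script: './app-two/index.js' },
      ],
    };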
Create a JSON or YML file and put it in your root projects folder, then write a "configProvider" that reads the file and populates the configuration. It works really well for us. In particular, such a file can also be shared between different languages, not only JavaScript.
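A minimal sketch of such a configProvider, assuming the shared file is a config.json one level above each app folder (the path and keys are made up):

    // configProvider.js - reads the shared JSON file once and caches it
    const fs = require('fs');
    const path = require('path');

    // hypothetical location: one shared file above all the app folders
    const CONFIG_PATH = path.resolve(__dirname, '..', 'config.json');

    let cached = null;

    function getConfig() {
      if (!cached) {
        cached = JSON.parse(fs.readFileSync(CONFIG_PATH, 'utf8'));
      }
      return cached;
    }

    module.exports = { getConfig };

Each application can then require it and read, say, getConfig().database.host.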

docker compose django and node

I am trying to make an application in Django via Docker, and I want to separate the backend (Django) container from the frontend (Node, React) container while using only one repository.
I want to run Node commands from the Django container (for example, npm init to create the package.json in the main folder).
Is this good practice?
If yes, how can I do it?
Thanks in advance.
If you only need Node.js for building, you should have one Docker image just for building (and, if you want, deploying) the static files, and then use a completely separate Docker setup for the actual production environment.
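As a rough sketch, the build-only image can be as simple as this (it assumes an npm run build script that writes the bundle to dist/):

    # throwaway image used only to build the frontend bundle
    FROM node:18

    WORKDIR /build

    COPY package*.json ./
    RUN npm install

    COPY . .

    # assumes a build script that writes the bundle to /build/dist
    RUN npm run build

You can then copy dist/ out with docker cp, push it to S3 (as the project linked below does), or COPY it into the production image in a multi-stage build.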
You can look at https://github.com/dkarchmer/django-aws-template (disclaimer: I am the developer) to see an example. Unfortunately, the project is not yet fully tested and documented, but it shows how I propose to handle static files outside Django (it emulates what I actually do in production, just not fully tested).
You will see a top-level Docker image I use only for building the webpack-type project (using gulp) and releasing it directly to S3. The top-level index.html file gets copied to the Django templates directory, to be used as the base template by other Django templates (you may not need this if your frontend will be 100% independent of Django). But IMO it is useful to mix them; for example, I do the whole authentication portion with regular Django (django-allauth).
Your question is fairly open-ended (not exactly a good way to ask on SO), but I hope the link above gives you some ideas on how to implement what you need.
