I have a GitLab repository containing a Node.js app with Express. I want to "deploy" this code to my Ubuntu server so I can use the Express server remotely and not only locally, but I don't want to install Node.js on the server; instead I want to try using Docker.
I have read a lot about Docker and I understand the fundamentals. My question is this: if I install Docker on my Ubuntu server, how can I "deploy" my code to Docker when I push to my repository?
Basically, you have to divide the process into two steps. One is dockerizing your app, which means creating a Docker image for your repository. The second is having your server use this image, possibly automating the process on push. So I would do something like this:
Dockerize your app. This means writing a Dockerfile that creates an image containing your app, runs it, and possibly exposes a port so it can be used externally.
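For instance, a minimal sketch of such a Dockerfile (assuming the entry point is app.js and the app listens on port 3000; adjust both to your project):

```dockerfile
# Sketch of a Dockerfile for an Express app.
# Assumes the entry point is app.js and the app listens on port 3000.
FROM node:lts

WORKDIR /usr/src/app

# Install dependencies first so Docker can cache this layer.
COPY package*.json ./
RUN npm install --production

# Copy the rest of the application code.
COPY . .

# Document the port the app listens on.
EXPOSE 3000

CMD ["node", "app.js"]
```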
Run the image on your server. Your server will need to have Docker installed and be able to get the right image (more on this later). If only one image is involved, a simple docker run command is enough. If there are more parts involved, such as a database or a webserver, I would recommend docker-compose.
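For example (the image name and ports below are placeholders):

```sh
# Run the image, mapping container port 3000 to host port 80.
docker run -d --name my-express-app -p 80:3000 my-express-app:latest

# Or, if you have a docker-compose.yml describing several services:
docker-compose up -d
```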
Make the image available on your server. You have more than one option here. You can publish your image to a Docker registry (private or public), or you can simply clone the repository on your server and build the image there.
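Either way looks roughly like this (repository URL, registry, and image names are placeholders):

```sh
# Option A (sketch): clone the repository on the server and build there.
git clone https://gitlab.com/you/your-app.git
cd your-app
docker build -t my-express-app:latest .

# Option B (sketch): build locally, push to a registry, pull on the server.
docker tag my-express-app:latest registry.example.com/you/my-express-app:latest
docker push registry.example.com/you/my-express-app:latest
# ...then on the server:
docker pull registry.example.com/you/my-express-app:latest
```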
Lastly, you need to tie these steps together. For that you need a hook that reacts to pushes to the repository and tells the server to fetch/build the image and run the newer version.
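Since the repository is on GitLab, one way to wire this up (a sketch, and only one of several options) is a CI job that runs on push and tells the server to rebuild and restart. Host, paths, and names here are assumptions:

```yaml
# .gitlab-ci.yml sketch. Assumes an SSH key is available to the CI runner
# and that the server has Docker installed.
deploy:
  stage: deploy
  only:
    - master
  script:
    # SSH into the server, rebuild the image, and restart the container.
    - ssh user@your-server 'cd /srv/your-app && git pull && docker build -t my-express-app . && docker stop my-express-app || true && docker rm my-express-app || true && docker run -d --name my-express-app -p 80:3000 my-express-app'
```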
You actually have a lot of flexibility in how you do this. I would start with a simpler process, where you build the image on your server, and build on top of that according to your needs.
Dokku is a Docker-based PaaS that provides git push deployments. It supports Heroku buildpacks to build and run your application, or custom Dockerfile deployments.
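With Dokku, a deploy then reduces to roughly this (app and host names are placeholders):

```sh
# On the server (one-time): create the app.
dokku apps:create my-express-app

# On your machine: add dokku as a git remote and push to deploy.
git remote add dokku dokku@your-server:my-express-app
git push dokku master
```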
I have been learning about Docker recently, so I am not clear on this part. Suppose I am making a Node.js app where the frontend is connected to MongoDB, and I use a docker-compose file to spin up the Mongo container every time I do development. Now I have completely built my application and want to write a final Dockerfile for it so that I can publish my image. What is the best way to do this?
Idea 1: Should I also wrap my docker-compose file into the final image, so that the Dockerfile triggers this docker-compose file before starting the app with npm start?
Or is there some other way in which I can directly spin up the container?
I'm not a Docker/Compose expert either, but you usually reference your Dockerfile from docker-compose.yml, and not the other way around.
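For instance, a sketch of a compose file that builds the app image from your Dockerfile and spins up MongoDB next to it (service names, ports, and the environment variable are assumptions):

```yaml
# docker-compose.yml sketch: the app service is built from ./Dockerfile,
# the db service uses the stock mongo image.
version: "3"
services:
  app:
    build: .
    ports:
      - "3000:3000"
    environment:
      - MONGO_URL=mongodb://db:27017/myapp
    depends_on:
      - db
  db:
    image: mongo
```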
It'd be very helpful if you could share an example of your docker-compose.yml and expand on your deployment plan (local shell script? CI/CD on cloud? single host for all your services?).
Also, I encourage you to check out the official Use Compose in production guide.
I have 2 projects, as shown in the following images.
The Node and Express project is in the root folder, and the Angular CLI project is inside the angular-src folder.
I can run each project individually: the back-end on port 3000 with node app.js, and the front-end on port 4200 with ng serve.
What is the best way to integrate both projects, and the easiest to deploy?
[Image: full project folder]
[Image: full project folder with angular-src files shown]
Most people keep their frontend (fe) and backend (be) in separate repos. This makes sense when you think that you can have multiple fe's for a single be.
However, it's often easier to just keep them in the same repo. Then when you deploy, you only need to clone one repo.
Also, it depends on which route you go down. Will you install Node.js etc. on your server, or will you go down the Docker route?
People often like to use Docker to containerise their fe and be. The benefit of this approach is that you only need one dependency installed on each server you deploy to: Docker itself. Everything else lives in the Docker world.
Also, Docker makes it easy to docker pull from any server. If things go wrong, just kill your Docker container and spin up a new one (assuming your data would not be lost in doing so; not a worry for the fe, but a be concern).
Obviously, there is the docker learning curve but it's not too painful.
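As a rough sketch of the Docker route for this particular layout (the angular-src folder name is taken from the question; everything else is an assumption), a multi-stage Dockerfile can build the Angular app and let the Express be serve the result:

```dockerfile
# Stage 1 (sketch): build the Angular frontend from angular-src.
FROM node:lts AS frontend
WORKDIR /build
COPY angular-src/ .
RUN npm install && npm run build

# Stage 2: run the Express backend, serving the built frontend.
FROM node:lts
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install --production
COPY . .
# Copy the compiled Angular assets into a folder Express serves statically.
COPY --from=frontend /build/dist ./public
EXPOSE 3000
CMD ["node", "app.js"]
```

On the Express side, something like app.use(express.static('public')) would then serve the compiled frontend from the same port as the API.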
I've been searching around the web for the best/simplest way to deploy a Meteor app, and have found that Meteor Up has been the easiest way to do this.
However, while this works pretty well on small apps, now that one of our apps has grown larger than 250 MB, Meteor Up has to build and re-deploy the whole 250 MB app again and again for even the smallest change.
With other Node applications we have on DigitalOcean, a simple git pull does the trick, without having to re-upload the entire application.
Is there a way to maintain a Meteor application with a GitHub/Bitbucket repository?
Thanks!
Well, I have found a solution for this.
Reference: PM2 + Meteor Environment Setup
Using meteor build and following the README that it generates, I was able to run the bundle without using meteor up.
This helps with deployment because it skips uploading the entire bundle to the server: instead, you just git pull your code changes on the server, use meteor build to create the build, and run it with pm2.
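The flow on the server looks roughly like this (paths, app name, and env values are placeholders; the authoritative steps are in the README that meteor build generates):

```sh
# On the server: pull the latest code and build the bundle.
git pull
meteor build ../build --architecture os.linux.x86_64

# Unpack and install the server dependencies, per the generated README.
cd ../build
tar -xzf *.tar.gz
cd bundle/programs/server && npm install && cd ../..

# Run the bundle with pm2 instead of uploading it anywhere.
MONGO_URL='mongodb://localhost:27017/myapp' ROOT_URL='http://example.com' PORT=3000 \
  pm2 start main.js --name my-meteor-app
```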
Can someone with actual experience explain how these layers interact with each other and how a working setup (dev to production) should actually be, well, set up?
I understand there are buildpacks that serve to install Strongloop on Heroku. And that deploying the actual app is done with git push.
Some specific points that you could address...
How can I have (more or less) the same environment locally and at Heroku?
After setting up Strongloop Node, does the server environment stay in place? Or is it recreated every time I deploy an update? (if yes, how so?)
How does slnode fit into the picture?
Can I connect to a db hosted at Heroku from a dev machine?
I hope answers to this question can serve as a guide for people like me who are struggling to understand how all the pieces go together.
I understand there are buildpacks that serve to install Strongloop on Heroku. And that deploying the actual app is done with git push.
You don't need our buildpack, and yes, you deploy with git push.
How can I have (more or less) the same environment locally and at Heroku?
How much more or less? You can develop on your Mac laptop and push to Heroku using the same version of Node, or you can be more like Heroku and use Linux, or... what exactly about the Heroku env do you want to reproduce?
After setting up Strongloop Node, does the server environment stay in place? Or is it recreated every time I deploy an update? (if yes, how so?)
Not sure what setup you are referring to.
How does slnode fit into the picture?
It doesn't.
Can I connect to a db hosted at Heroku from a dev machine?
Don't know, sorry, try Heroku support pages for this kind of heroku-specific tech question, perhaps?
I followed the docs at the bottom of the page at http://docs.strongloop.com/display/DOC/Heroku (look for "Create Procfile and deploy"), and it worked OK for me.
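For reference, the Procfile is a one-line file in the project root, something like the following (the app.js entry point is an assumption; as noted above, slnode is not required):

```
web: node app.js
```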
Fast answers
Use Vagrant, download a Debian 6 box, and install whatever you need in it; remember to check which version of Node.js Heroku uses.
It will be partially rebuilt: your npm dependencies will be redownloaded, your application gets rebuilt, and so on.
You can use slnode on your dev machine. However, if it is necessary to use slnode on a dyno, fork a buildpack and install slnode as part of it.
You can connect to a database hosted on Heroku from any network-connected server or PC; you will be provided with an IP and credentials.
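For example, with the Heroku CLI you can read those credentials from your dev machine (the exact variable name depends on the add-on you use):

```sh
# List all config vars for the app, including database URLs.
heroku config --app your-app

# Or read a single one, e.g. for Heroku Postgres:
heroku config:get DATABASE_URL --app your-app
```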
Build packs
I'm not sure about the actual constraints of a buildpack, but it can do almost anything you could do in a Debian 6 virtualized environment with a shared kernel (a Debian 6 instance on an OpenVZ VPS).
Think of buildpacks as low-level dependency managers which resolve dependencies like Node.js, Redis, Apache2, and so on.
They also build up the environment: file system structure, ENV variables, and so on.
Heroku infrastructure
Heroku uses AWS as its raw hardware provider; AWS provides a fresh installation of an OS on virtualized hardware (a VPS).
Heroku builds dynos on top of the raw OS; at a guess, it shares one OS between at least 128 dynos.
Each dyno is isolated from the others. It has common software like ls built in, but it's the buildpack's duty to install any other software, such as Node.js for your application.
Heroku's Node buildpack installs Node.js and runs npm.
I develop locally and push to GitHub, and my staging environment is on an EC2 instance. I currently have to SSH into my remote instance, git pull, and restart the Node server. Is there a way this process can be automated?
Yes, I wrote a blog post about exactly this just last week. My server-side part uses Ruby/Sinatra rather than Node, but rewriting it in Node would be very simple.
http://baudehlo.wordpress.com/2012/02/09/github-continuous-deployment-to-ec2/
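A minimal Node version of the idea in that post might look like this (a sketch: the endpoint path, app path, and pm2 usage are all assumptions, and a real setup should also verify the webhook secret):

```js
// Sketch: a tiny webhook listener that redeploys on push.
// Assumes Express is installed and the app lives in /srv/my-app.
const express = require('express');
const { exec } = require('child_process');

const app = express();

app.post('/deploy', (req, res) => {
  // Pull the latest code, then restart the app (here via pm2, as an example).
  exec('cd /srv/my-app && git pull && pm2 restart my-app', (err, stdout, stderr) => {
    if (err) {
      console.error(stderr);
      return res.status(500).send('deploy failed');
    }
    console.log(stdout);
    res.send('deployed');
  });
});

app.listen(8080, () => console.log('webhook listener on :8080'));
```

Point a GitHub webhook at this endpoint and every push will trigger the same pull-and-restart sequence you currently run by hand.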