Deploy on DigitalOcean using docker-compose and docker-machine - node.js

I am trying to publish a Node.js application with docker-compose and docker-machine on DigitalOcean. When I do docker-compose up -d, the application works well on the server, but when I shut down my computer or stop Docker, everything on the server stops as well.
Could someone please help me move forward? Thanks in advance.
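For reference, the docker-machine workflow being described looks roughly like this (the machine name node-app and the $DO_TOKEN variable are illustrative assumptions, not taken from the question):
# Create a DigitalOcean droplet and provision Docker on it
docker-machine create --driver digitalocean --digitalocean-access-token $DO_TOKEN node-app
# Point the local docker/docker-compose clients at the remote daemon
eval $(docker-machine env node-app)
# Start the application on the droplet in detached mode
docker-compose up -d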

Related

Docker is not pointing to existing containers

I'm on Ubuntu and I'm using Docker for my Laravel application, but after a restart I can no longer execute commands like docker-compose up -d, docker-compose restart, or docker-compose exec backend php artisan migrate. I checked other solutions, like killing the containers running on that specific port, but the problem keeps coming back. What could the problem be?
Whenever I try docker-compose up -d, this is the result:
Error Logs
My application is fully running, but somehow I can't execute Laravel artisan commands, so I tried restarting it, but it's pointing to these containers instead of the ones that are currently running:
Containers

One Linux machine connects to another IP - how to use Docker to turn both into images and deploy them somewhere else

I have two Linux PCs. MongoDB runs on the first PC, whose IP is 192.168.1.33, and a Java application on the other Linux machine connects to the MongoDB at 192.168.1.33.
What I want to do is prepare everything and turn both Linux systems into Docker images, so that in the production environment I can simply restore the images I prepared and everything works, without complex deployment steps.
But the problem is that the IP of MongoDB will change, and the IP 192.168.1.33 is written into the configuration file of my Java application; it will not change automatically. Is there an automated way?
Basics
We create a Dockerfile with the minimal installation steps.
We build a Docker image from the Dockerfile of step 1.
We create a container from the step-2 image and expose the important ports as required (see the sketch below).
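A rough illustration of those three steps, using the image name from the article referenced below (the Dockerfile contents are illustrative, not a recommended MongoDB setup):
# Step 1: a Dockerfile with minimal installation steps
cat > Dockerfile <<'EOF'
FROM ubuntu:16.04
RUN apt-get update && apt-get install -y mongodb
RUN mkdir -p /data/db
EXPOSE 27017
CMD ["mongod", "--dbpath", "/data/db"]
EOF
# Step 2: build a Docker image from that Dockerfile
docker build -t my_new_mongodb .
# Step 3: run a container from the image, publishing the important port
docker run -d -p 27017:27017 my_new_mongodb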
For your problem:
1. creating-a-docker-image-with-mongodb - this article will help you dockerize MongoDB.
but the problem is, the IP of mongodb will change, and the IP 192.168.1.33 is written in my configuration file of my java application, it will not change automatically, is there an automated way?
If you expose the MongoDB port to the Docker host, you can use the same docker-host-IP:<exposed-port>.
Reference from the article: sudo docker run -p 27017:27017 -i -t my_new_mongodb
Example: 192.168.1.33 is your Docker host where the MongoDB container is running with exposed port 27017. You can point your Java app at 192.168.1.33:27017.
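Something along these lines (the container name and the configuration key are hypothetical; only the host IP and port come from the example above):
# On the Docker host 192.168.1.33, publish MongoDB's port to the host
sudo docker run -d --name mongodb -p 27017:27017 my_new_mongodb
# The Java application's configuration file then targets the Docker host, not a container IP,
# e.g. a line such as:  mongo.address=192.168.1.33:27017   (key name is illustrative)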
What I want to do is, prepare everything and make both Linux systems into docker images
You cannot convert your VMs directly into Docker images. Instead, you can follow the steps written in Basics and dockerize both the DB and the application layer.
2. dockerize-your-java-application - refer to this link and dockerize your application based on your requirements (a rough sketch follows below).
Steps 1 & 2 will help you build Docker images that you can deploy to multiple servers.
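For the application layer, a Dockerfile along these lines is typical (the base image, JAR name, and the idea of passing the MongoDB address via an environment variable are assumptions for illustration, not from the linked article):
# Illustrative Dockerfile for the Java application layer
cat > Dockerfile <<'EOF'
FROM openjdk:8-jre
COPY target/my-app.jar /app/my-app.jar
CMD ["java", "-jar", "/app/my-app.jar"]
EOF
docker build -t my_java_app .
# The MongoDB address could be passed in at run time instead of being hard-coded,
# provided the application is adapted to read it from the environment.
docker run -d -e MONGO_ADDRESS=192.168.1.33:27017 my_java_app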

Docker Cloud and JHipster console

Is it possible to set up the JHipster console on Docker Cloud? My application is deployed on Heroku.
If there is no such option, please advise where I can set up Docker in the cloud.
Regards!
Yes, Docker Cloud is an option, although I've never tried it. If you have simple needs and don't need container orchestration across multiple hosts, I would recommend creating a simple VM with Docker on your favorite cloud provider (using docker-machine, for example) and then deploying the console there using docker-compose. It's really easy to do:
1) SSH into your server
2) Install Docker and docker-compose
3) Get the docker-compose file from https://github.com/jhipster/jhipster-console/blob/master/bootstrap/docker-compose.yml
4) Run docker-compose up -d
The console will be available on port 5601.
Refer to the docs at: https://jhipster.github.io/monitoring/
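Concretely, steps 2-4 might look like this on a fresh Ubuntu server (the Docker convenience install script, the docker-compose release version, and the raw GitHub URL are common choices, not prescribed by the docs above, so verify them for your setup):
# 2) Install Docker and docker-compose (1.29.2 is just an example release)
curl -fsSL https://get.docker.com | sh
sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
sudo chmod +x /usr/local/bin/docker-compose
# 3) Get the JHipster console docker-compose file
wget https://raw.githubusercontent.com/jhipster/jhipster-console/master/bootstrap/docker-compose.yml
# 4) Start the console in detached mode; it will listen on port 5601
docker-compose up -d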
More advanced setups are possible, but this is the easiest way to go. Also note that it is perfectly possible to run the JHipster Console without Docker, but it requires some work: you would set up an ELK stack yourself with a simple Logstash configuration and scripts to preload the dashboards.
OK, locally everything is fine. So how do I (step by step) push the JHipster Console to Docker Cloud and connect it with my application on Heroku?

What's the best way to debug a Node.js app running in a Docker container on a remote host?

I have a Node.js app running in a Docker container on a remote server. I can access the app in the browser. I'm also able to deploy my app using PhpStorm and its remote server connection.
However, when I try to use PhpStorm's remote Node.js debug tool, it doesn't work. I always get "connection refused".
I know the debug port is open because I checked the Docker containers and 5858 is open. This port is also opened on the host, and it is the port I set for the debugger.
package.json:
"scripts": {
"start": "nodemon --debug=5858 index.js myApp"
}
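For context, a setup like this only works if the container actually publishes the debug port and the debugger is reachable from outside; the container and image names below are hypothetical, not from the question:
# Publish both the app port and the debug port when starting the container
docker run -d --name my-node-app -p 3000:3000 -p 5858:5858 my-node-image
# Verify on the remote host which ports the container publishes
docker port my-node-app
# And check that something is actually listening on 5858 on the host
ss -ltn | grep 5858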
I don't know if PhpStorm is the best solution for debugging this kind of app, so if someone has a better idea please let me know.
Thanks!
After further searching I found this great repository:
https://github.com/seelio/node-inspector-docker
It seems to me the easiest way to get the app running and debug it.
Definitely node-inspector.
I had to do the same for an app with microservices and clusters/workers.
Just in case you need it: clustered apps with node-inspector
You can use IntelliJ IDEA as your IDE.
It supports running the app directly from Docker and allows you to debug apps easily.
Once it is configured with your Docker image, it's done.
Next time, just click Run and it will quickly start Node.js inside your Docker container and show the logs etc., just like we do with a local Node instance.
https://www.jetbrains.com/help/idea/2016.3/running-and-debugging-node-js.html#node_docker_run_debug
Its EAP and Community editions are always free.

Deploy wso2esb in a Docker container with Kubernetes

Can someone help with how to deploy wso2esb in a Docker container with Kubernetes?
Currently I'm running only one node/master on a local machine with Ubuntu Server 14.04 LTS.
If I run it with this:
sudo docker run --name esb isim/wso2esb
it instantly starts the service inside the container,
but if I run it with this:
kubectl run esb1 --image=isim/wso2esb
the container just runs, without starting the service inside the container.
BTW, I'm using isim/wso2esb from Docker Hub.
Hope someone can help me.
From the comments above, it looks like you were connecting to the wrong IP address, which you discovered by running kubectl logs esb1.
In general, you can follow the Kubernetes Debugging FAQ when you see an issue like this to see if it is a common problem that has already been documented.
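For reference, these are the kinds of checks being referred to (the name esb1 comes from the question; the rest is generic kubectl usage, so adjust it to your setup):
# List the pod(s) created by 'kubectl run' and their cluster IPs
kubectl get pods -o wide
# Look at the container's output to see which address the service binds/advertises
kubectl logs esb1
# Inspect events and status if the service still does not come up
kubectl describe pod esb1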
