ISSUE: I have a Docker image running for Neo4j and one for Express.js. I can't get the Docker containers to communicate with each other.
I can run Neo4j Desktop, start a nodemon server, and they will communicate.
SETUP:
NEO4J official docker image
NEO4J_AUTH none
PORTS localhost:7474 localhost:7687
Version neo4j-community-4.3.3-unix.tar.gz
NODEJS Image
PORTS 0.0.0.0:3000 :::3000
Version 14.17.5
Express conf
DEV_DB_USER_NAME="neo4j"
DEV_DB_PASSWORD="test"
DEV_DB_URI="neo4j://localhost" // for the Docker image; for local it's bolt://localhost:7687
DEV_DB_SECRET_KEY=""
let driver = neo4j.driver(
  envConf.dbUri,
  neo4j.auth.basic(envConf.dbUserName, envConf.dbPassword)
);
package.json
"#babel/node": "^7.13.10",
"neo4j-driver": "^4.2.3",
I can reach the Neo4j container through http://localhost:7474/browser/, so it is running.
I cannot get the server container to reach a local Neo4j instance.
When I call the APIs in the server container I get these errors:
If I use the neo4j:// protocol:
Neo4jError: Could not perform discovery. No routing servers available. Known routing table: RoutingTable[database=default database, expirationTime=0, currentTime=1629484043610, routers=[], readers=[], writers=[]]
If I use the bolt:// protocol:
Neo4jError: Failed to connect to server. Please ensure that your database is listening on the correct host and port and that you have compatible encryption settings both on Neo4j server and driver. Note that the default encryption setting has changed in Neo4j 4.0. Caused by: connect ECONNREFUSED 127.0.0.1:7687
I've been scouring the documentation for a while; any ideas would be most welcome!
I was able to get the communication working by using docker-compose. The problem was that each container ran on its own network, so the server could not reach the database. Running docker-compose and building both containers within a single compose network allows them to communicate using the service names.
Take note: this file is indentation sensitive (YAML uses spaces, not tabs)!
docker-compose.yml
version: '3.7'

networks:
  lan:

# The different services that make up our "network" of containers
services:
  # Express is our first service
  express:
    container_name: exp_server
    networks:
      - lan
    # The location of the Dockerfile to build this service
    build: <location of dockerfile>
    # Command to run once the Dockerfile completes building
    command: npm run startdev
    # Volumes, mounting our files to parts of the container
    volumes:
      - .:/src
    # Ports to map, mapping our port 3000 to port 3000 on the container
    ports:
      - 3000:3000
    # Designating a file with environment variables
    env_file:
      - ./.env.express
  ## Defining the Neo4j Database Service
  neo:
    container_name: neo4j_server
    networks:
      - lan
    # The image to use
    image: neo4j:latest
    # Map the ports so we can check the db server is up
    ports:
      - "7687:7687"
      - "7474:7474"
    # Mounting named volumes to the container to track db data
    volumes:
      - $HOME/neo4j/conf:/conf
      - $HOME/neo4j/data:/data
      - $HOME/neo4j/logs:/logs
      - $HOME/neo4j/plugins:/plugins
    env_file:
      - .env.neo4j
With this you can use Docker to run the server, the database, and anything else, while still using change detection and rebuilds to develop, and even build multiple environment images at the same time. Neat!
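For illustration, here is a minimal sketch of what the driver setup can look like once both containers share the compose network. The bolt://neo:7687 URI uses the neo service name from the file above; the environment variable names mirror the question, but your actual config module may differ:

// driver.js - a sketch: connect to Neo4j by its compose service name instead of localhost
const neo4j = require('neo4j-driver');

const driver = neo4j.driver(
  process.env.DEV_DB_URI || 'bolt://neo:7687', // service name "neo", not localhost
  neo4j.auth.basic(
    process.env.DEV_DB_USER_NAME || 'neo4j',
    process.env.DEV_DB_PASSWORD || 'test'
  )
);

// optional startup check so a bad URI fails loudly
driver.verifyConnectivity()
  .then(() => console.log('Connected to Neo4j'))
  .catch((err) => console.error('Neo4j connection failed:', err));

module.exports = driver;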
Related
I have successfully set up Docker Desktop for Windows and installed my first Linux containers from Docker Hub. Network-wise, containers can communicate with each other on the Docker internal network. I am even able to communicate with the host network via host.docker.internal.
Now I am coming to the point where I want to access the outside network (just another server on the network of the Docker host) from within a Docker container.
I have read on multiple websites that network_mode: host does not seem to work with Docker Desktop for Windows.
I have not configured any switches within Hyper-V Manager and have not added any routes in Docker, as I am confused by the overall networking concept of Docker Desktop for Windows in combination with Hyper-V and Linux containers.
Below you can see my current docker-compose.yaml with NiFi and Zookeeper installed. NiFi is able to see Zookeeper and is able to query data from a database installed on the Docker host. However, I need to query data from a different server than the host.
version: "3.4"
services:
zookeeper:
restart: always
container_name: zookeeper
ports:
- 2181:2181
hostname: zookeeper
image: 'bitnami/zookeeper:latest'
environment:
- ALLOW_ANONYMOUS_LOGIN=yes
nifi:
restart: always
container_name: nifi
image: 'apache/nifi:latest'
volumes:
- D:\Docker\nifi:/data # Data directory
ports:
- 8080:8080 # Unsecured HTTP Web Port
environment:
- NIFI_WEB_HTTP_PORT=8080
- NIFI_CLUSTER_IS_NODE=false
- NIFI_CLUSTER_NODE_PROTOCOL_PORT=8082
- NIFI_ZK_CONNECT_STRING=zookeeper:2181
- NIFI_ELECTION_MAX_WAIT=1 min
depends_on:
- zookeeper
Check that the connection type for the DockerNAT adapter is set to the appropriate external network and that its IPv4 configuration is set to automatic.
I am facing a weird dilemma. I have created a Node application and this application needs to connect to MongoDB (running in a Docker container). I created a docker-compose file as follows:
version: "3"
services:
mongo:
image: mongo
expose:
- 27017
volumes:
- ./data/db:/data/db
my-node:
image: "<MY_IMAGE_PATH>:latest"
deploy:
replicas: 1
restart_policy:
condition: on-failure
working_dir: /opt/app
ports:
- "2000:2000"
volumes:
- ./mod:/usr/app
networks:
- webnet
command: "node app.js"
networks:
webnet:
I am using the official mongo image. I have omitted my own Docker image from the above configuration. I have tried many configurations, but I am unable to connect to MongoDB (yes, I have changed the MongoDB URI inside the Node.js application too). Whenever I deploy my docker-compose stack, my application always gives me a MongoNetworkError of TransientTransactionError on startup. I have been unable to figure out where the problem is for hours.
One more weird thing: when I run my docker-compose file I receive the following logs:
Creating network server_default
Creating network server_webnet
Creating service server_mongo
Creating service server_feed-grabber
Could it be that both services end up on different networks? If yes, then how do I fix that?
Other Info:
The MongoDB URI that I tried in the Node.js application is
mongodb://mongo:27017/MyDB
I am running my docker-compose with the command: docker stack deploy -c docker-compose.yml server
My Node.js image is based on Ubuntu 18.
Can anyone help me with this?
OK, so I tried a few things and finally figured it out after spending many, many hours. There were two things I was doing wrong, and they were both hitting me at the last step:
First, the startup logging showed that Docker was creating two networks, server_default and server_webnet; this was the first mistake. Both containers should be on the same network to talk to each other.
Second, I needed the Mongo container to be up before my Node.js application, which depends on it. I did this in my docker-compose configuration by introducing a depends_on property.
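Note that depends_on only controls start order, not readiness, so the Node app can still race Mongo on first boot. As a hedged sketch (the URI, database name, and retry values are just placeholders), a small retry loop around the connection avoids that:

// connect.js - sketch: retry the MongoDB connection until the mongo service is ready
const { MongoClient } = require('mongodb');

const uri = process.env.MONGO_URL || 'mongodb://mongo:27017/MyDB'; // compose service name "mongo"

async function connectWithRetry(retries = 10, delayMs = 3000) {
  for (let attempt = 1; attempt <= retries; attempt++) {
    try {
      const client = await MongoClient.connect(uri);
      console.log('Connected to MongoDB on attempt', attempt);
      return client;
    } catch (err) {
      console.error(`Attempt ${attempt} failed: ${err.message}`);
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
  throw new Error('Could not connect to MongoDB after retries');
}

module.exports = connectWithRetry;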
For me it was:
1- Get your IP by running the command
docker-machine ip
2- Don't go to localhost:port, go to your IP:port, for example: http://192.168.99.100:8080
I'm new to MEAN stack development and was wondering what's the ideal way to spin up a mongo + express environment.
Running synchronous bash script commands makes the mongo server stop further execution and listen for connections. What would be a local- and Docker-compatible script to initiate the environment?
Many people use docker-compose for a situation like this. You can set up a docker-compose configuration file where you describe the services that you would like to run. Each service defines a Docker image. In your case, you could have MongoDB, your Express app, and your Angular app defined as services. Then you can launch the whole stack with docker-compose up.
A sample docker-compose config file would look something like:
version: '2' # specify docker-compose version

# Define the services/containers to be run
services:
  angular: # name of the first service
    build: angular-client # specify the directory of the Dockerfile
    ports:
      - "4200:4200" # specify port forwarding
  express: # name of the second service
    build: express-server # specify the directory of the Dockerfile
    ports:
      - "3000:3000" # specify port forwarding
  database: # name of the third service
    image: mongo # specify image to build container from
    ports:
      - "27017:27017" # specify port forwarding
which comes from an article here: https://scotch.io/tutorials/create-a-mean-app-with-angular-2-and-docker-compose
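To illustrate how the services reach each other, here is a hedged sketch of the connection code inside the express-server app. It assumes mongoose is used and that "meanapp" is the database name; neither is specified in the compose file above:

// db.js - sketch: the hostname is the compose service name "database"
const mongoose = require('mongoose');

mongoose
  .connect('mongodb://database:27017/meanapp') // "meanapp" is a placeholder db name
  .then(() => console.log('Express connected to the database service'))
  .catch((err) => console.error('MongoDB connection error:', err));

module.exports = mongoose;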
Let's say we have three services:
- PHP + Apache
- MySQL
- Node.js
I know how to use docker-compose to set up the application and link the MySQL service with the PHP/Apache service. I was wondering how we can add a Node.js service just to manage JS/CSS assets. The purpose of the Node.js service is only to manage JavaScript/CSS resources. Since Docker provides this flexibility, I was wondering about using a Docker service instead of setting up Node.js on my host computer.
version: '3.2'
services:
  web:
    build: .
    image: lap
    volumes:
      - ./webroot:/var/www/app
      - ./configs/php.ini:/usr/local/etc/php/php.ini
      - ./configs/vhost.conf:/etc/apache2/sites-available/000-default.conf
    links:
      - dbs:mysql
  dbs:
    image: mysql
    ports:
      - "3307:3306"
    environment:
      - MYSQL_ROOT_PASSWORD=root
      - MYSQL_PASSWORD=rest
      - MYSQL_DATABASE=symfony_rest
      - MYSQL_USER=restman
    volumes:
      - /var/mysql:/var/lib/mysql
      - ./configs/mysql.cnf:/etc/mysql/conf.d/mysql.cnf
  node:
    image: node
    volumes:
      - ./webroot:/var/app
    working_dir: /var/app
I am not sure this is the correct strategy; I am sharing ./webroot with both the web and node services. docker-compose up -d only starts mysql and web and fails to start the node container; probably there is no valid entrypoint set.
If you want to run the Node.js service separately from the PHP service, you must set two more options to make the node container stay up: one is stdin_open and the other is tty, as below:
stdin_open: true
tty: true
This is equivalent to the -it flag on the CLI, as below:
docker container run --name nodeapp -it node:latest
If your Node app runs on its own port (e.g. your frontend is completely separate from your backend and you run it independently, say with npm run start), you must publish the port like below:
ports:
- 3000:3000
The ports structure is hostPort:containerPort.
This means: publish port 3000 from inside the node container to port 3000 on the host. In other words, it makes port 3000 inside your container accessible on your system, so you can reach it at localhost:3000.
In the end, your node service would look like below:
node:
  image: node
  stdin_open: true
  tty: true
  volumes:
    - ./webroot:/var/app
  working_dir: /var/app
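As a side note, a published port only helps if the app inside the container listens on all interfaces. A hypothetical minimal entry point for such a frontend/asset server is sketched below; Express here is just an illustrative assumption, not something the plain node image ships with:

// app.js - sketch: bind to 0.0.0.0 so a 3000:3000 mapping is reachable from the host
const express = require('express');
const app = express();

app.get('/', (req, res) => res.send('Hello from the node container'));

app.listen(3000, '0.0.0.0', () => {
  console.log('Asset/frontend server listening on port 3000');
});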
You can also add an nginx service to docker-compose; nginx can take care of forwarding requests to the PHP container or the Node.js container. You need some server that binds to port 80 and redirects requests to the designated container.
I have problems connecting a Node.js application, which is running as a Docker container, to MongoDB. Let me explain what I have done so far:
$ docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
3a3732cc1d90 mongo:3.4 "docker-entrypoint..." 3 weeks ago Up 3 weeks 27017/tcp mongo_live
As you can see, there is already a mongo Docker container running.
Now I'm running my Node.js application container (which is built from MeteorJS):
$ docker run -it 0b422defbd59 /bin/bash
In this docker container I want to run the application by running:
$ node main.js
Now I'm getting the error
Error: MONGO_URL must be set in environment
I already tried to set MONGO_URL by setting:
ENV MONGO_URL mongodb://mongo_live:27017/
But this doesn't work:
MongoError: failed to connect to server [mongo_live:27017] on first connect
So my question is how to connect to a DB which is, as far as I understand, 'outside' of the running container. Alternatively, how do I set up a new DB inside this container?
There are a couple of ways to do it.
Run your app in the same network as your mongodb container:
docker run --net container:mongo_live your_app_docker_image
# then you can reach mongodb on localhost
# MONGO_URL=mongodb://localhost:27017/
You can also link the two containers:
docker run --link mongo_live:mongo_live your_app_image ...
# Now mongodb is accessible via the hostname mongo_live
Or use the mongodb container's IP address:
docker inspect -f '{{.NetworkSettings.IPAddress}}' mongo_live
# you will get your container's IP here
$ docker run -it 0b422defbd59 /bin/bash
# MONGO_URL=mongodb://[ip from previous command]:27017/
You can bind your mongodb port to the host and use the host's hostname in your app.
You can use a Docker network and run both apps in the same network.
You could pass --add-host mongo_live:<ip of mongo container> to docker run for your application and then use mongo_live in the mongodb URL.
You can also use docker compose to make your life easier ;)
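Whichever option you pick, the application itself should just read the resulting URL from the environment instead of hard-coding it. A minimal sketch (the ping command and the error handling are only illustrative):

// main.js - sketch: read the MongoDB URL from the environment
const { MongoClient } = require('mongodb');

const mongoUrl = process.env.MONGO_URL; // e.g. mongodb://mongo_live:27017/
if (!mongoUrl) {
  throw new Error('MONGO_URL must be set in environment');
}

MongoClient.connect(mongoUrl)
  .then((client) => client.db('admin').command({ ping: 1 })) // simple connectivity check
  .then(() => console.log('Connected to MongoDB at', mongoUrl))
  .catch((err) => console.error('Connection failed:', err));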
...
When you run containers, each container gets its own independent network, so by default one container can't connect to another directly.
There are 3 ways to connect containers:
Have a little fuss with low-level Docker network magic.
Connect the containers through localhost. Each container must expose its ports on localhost (as your mongo_live does), and you need to add 127.0.0.1 mongo_live to the hosts file on localhost. (This is the simplest way.)
Use docker-compose. It is a convenient tool for running many containers together. (This is the right way.)
Adding mongodb to the application container is not the Docker way.
Please use the snippet below for your docker-compose.yml file, replacing the comments with your actual values. It should solve your problem.
version: '2'
services:
  db:
    build: <image for mongoDB>
    ports:
      - "27017:27017" # whatever port you are using
    environment:
      # you can specify mongo db username and related settings here
    volumes:
      - # load default config for mongodb from here
      - "db-data-store:/data/db" # path depends on which image you use
    networks:
      - network
  nodejs:
    build: # image for node js
    expose:
      - # mention port for nodejs
    volumes:
      - # mount project code on container
    networks:
      - network
    depends_on:
      - db
networks:
  network:
    driver: bridge
Please use the links below for reference:
1) NodeJs Docker
2) MongoDb docker
3) docker-compose tutorial
Best of Luck
I had a problem connecting my server.js to MongoDB as well, and this is how I solved it (see the screenshot linked in the original answer); I hope you find it useful.