I have a simple React front end that uses a Node.js proxy server to fetch data over HTTP and WebSocket (ws). This runs completely fine on my local machine.
When I run this within a docker container, the http calls still work, but the ws connection cannot be established.
If I publish my proxy server port with:
ports:
  - 2999:2999
the ws connection starts to work as well. I do not want to publish the proxy server to the host, though.
I thought about running this entire setup in another docker container, but that would be the last option for me.
I thought setting:
ports:
  - "2999"
would just make the internal port of the container available and solve my problem, but it does not.
How do I expose the proxy's port to the React app, but not to the outside network?
(Both the React app and the proxy are in the same Docker image.)
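For reference, the difference between the two compose keys matters here because the WebSocket connection is opened by the browser running on the host, not by a container. A minimal sketch of the three options (the service name proxy is an assumption):

services:
  proxy:
    build: .
    expose:
      - "2999"        # container-to-container only; not reachable from the host
    # ports:
    #   - "2999"      # publishes container port 2999 to a *random* host port
    # ports:
    #   - "2999:2999" # publishes to host port 2999, which is why ws then works

If the React app executes in the browser, the browser must reach the proxy from the host network, so expose alone cannot help; it only covers container-to-container traffic.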
Related
I am trying to get my Nuxt app working on the production servers. On my local machine, the generated Docker image runs well and it can access the Node.js app that runs on localhost; the axios setting baseURL: 'http://127.0.0.1:6008/' seems to work fine, and the Docker image can access it. On the production servers I used Docker to set up the Nuxt app the same way I tested it on my local machine, yet the Nuxt container cannot reach the Node.js app on the host server. I can see this must be some kind of network settings issue.
In a Vue.js app, I usually set up a ProxyPass in the Apache config to match incoming backend queries and rewrite them to the localhost address:
ProxyPass /app/query http://localhost:6008/query
The axios settings in the nuxt.config file look like this:
axios: {
  baseURL: 'http://127.0.0.1:6008/',
  browserBaseURL: ''
},
Does Docker need additional settings, or should I configure Apache for this communication between my Docker container and the Node app running on the host under Apache/PM2?
localhost or 127.0.0.1 from within the Docker container will not resolve to the server's localhost. Instead, you need to specify the name of the Node.js service (if you are using docker-compose) or the Node.js container's name (if you are just using docker run).
You can also try using the IP of the server where Docker is running instead of 127.0.0.1.
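For example, a minimal docker-compose sketch of the service-name approach (the service names nuxt and nodeapi and the build paths are assumptions, not from the question):

services:
  nuxt:
    build: .
    ports:
      - "3000:3000"
    # inside this container, axios would use baseURL: 'http://nodeapi:6008/'
  nodeapi:
    build: ./api
    expose:
      - "6008"   # reachable by service name from the other compose services

Both services join the same default compose network, so the service name resolves through Docker's built-in DNS.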
I am trying to run a Node application in a Docker container. The installation instructions specified that after adding host: '0.0.0.0' to config/local.js and running docker-compose up, the app should be accessible at localhost:3000, but I get an error message in the browser saying "The connection was reset - The connection to the server was reset while the page was loading."
I have tried adding host: '0.0.0.0' in different places, removing it entirely, accessing https://localhost:3000 instead of http, connecting to localhost:3000/webpack-dev-server/index.html, etc.
What could be going wrong?
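One likely cause, for context: Docker's port mapping forwards traffic to the container's network interface, so a server that binds 127.0.0.1 inside the container is unreachable from outside, and the browser sees a reset connection; host: '0.0.0.0' makes the server listen on all interfaces. A minimal compose sketch (the service name app is an assumption):

services:
  app:
    build: .
    ports:
      - "3000:3000"   # host:container; only works if the app binds 0.0.0.0 inside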
I have a Docker container running on localhost:3000.
Also, I have a Node app running on localhost:8081.
Now if I want to make a POST or GET request from localhost:3000 to localhost:8081, it does not work at all.
If I run the service as a binary (not in a Docker container) on localhost:3000, the same API requests work.
How do I communicate between the two when using Docker?
Each container runs on its own bridge network, where localhost means the container itself.
To access the host, you can add the option --add-host=host.docker.internal:host-gateway to the docker run command. Then you can access the host using the host name host.docker.internal. I.e. if you have a REST service on the host, your URL would look something like http://host.docker.internal:8081/my/service/endpoint.
The name host.docker.internal is the common name to use for the host, but you can use any name you like.
You need Docker version 20.10 or newer for this to work.
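If you are using docker-compose rather than docker run, the equivalent is the extra_hosts key; a sketch, assuming a service named app:

services:
  app:
    build: .
    extra_hosts:
      - "host.docker.internal:host-gateway"   # maps the name to the host's gateway IP

On Docker Desktop (Mac and Windows), host.docker.internal usually resolves out of the box; the explicit mapping is mainly needed on Linux.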
I have a backend app (Node.js) which is listening on some port (2345 in this example).
The client application is a React app. Both are leveraging socket.io in order to communicate between themselves. Both are containerized.
I then run them both on the same network, using the "network" flag:
docker run -it --network=test1 --name serverapp server-app
docker run -it --network=test1 client-app
In the client I have this code (on the server side the code is pretty standard, almost a copy-paste from the socket.io page):
const socket = io.connect('http://serverapp:2345');
socket.emit('getData');
socket.on('data', function(data) {
  // template literal needs backticks so ${data} is interpolated
  console.log(`got data: ${data}`);
});
What I suspect is that the problem has to do with the client (React) app being served by the http-server package: in the browser context, the hostname is not understood and therefore cannot be resolved. In the browser console I then see the following error: GET http://tbserver:2345/socket.io/?EIO=3&transport=polling&t=MzyGQLT net::ERR_NAME_NOT_RESOLVED
Now if I switch the hostname serverapp to localhost in the client app (which the browser understands, but which is not recommended in Docker as it is interpreted differently), I get this error when trying to connect to the server socket: GET http://localhost:2345/socket.io/?EIO=3&transport=polling&t=MzyFyAe net::ERR_CONNECTION_REFUSED.
Another piece of information: we currently build the React app (using npm run build), and then build and run the Docker container using the following Dockerfile:
FROM mhart/alpine-node
RUN npm install -g http-server
WORKDIR /app
# copy the prebuilt React bundle into the image
COPY /app/build/. /app/.
EXPOSE 2974 2326 1337 2324 7000 8769 8000 2345
CMD ["http-server", "-p", "8000"]
(So no build of the React app takes place while building the container; we rely on a prebuilt one.)
Not sure what I am missing here, or whether it has to do with http-server, but prior to this I had a fully working socket.io connection between two Node.js applications using the same Docker network, so it shouldn't be a code issue.
Your browser needs to know how to resolve the Socket.IO server address. You can publish port 2345 of serverapp by binding port 2345 of your host to port 2345 of the container:
docker run -it --network=test1 -p 2345:2345 --name serverapp server-app
That way you can use localhost in your client code:
const socket = io.connect('http://localhost:2345');
Also, remove 2345 from the EXPOSE line in your client Dockerfile.
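For completeness, a docker-compose sketch of the same two containers (service names follow the docker run commands above; the client port 8000 matches the http-server command in the Dockerfile):

services:
  serverapp:
    image: server-app
    ports:
      - "2345:2345"   # published so the browser on the host can reach the socket
  clientapp:
    image: client-app
    ports:
      - "8000:8000"   # http-server serving the prebuilt React bundle

The browser-side code then connects to http://localhost:2345, as in the answer above.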
I have 3 separate pieces to my dockerized application:
nodeapp: A node:latest Docker container running an Express.js app that returns a JSON object when accessed at /api. This server is also CORS-enabled according to this site.
nginxserver: An nginx:latest static server that simply hosts an index.html file with a button which makes the XMLHttpRequest to the node server above.
My host machine
The node:latest container has its port published to the host via 3000:80.
The nginx:latest container has its port published to the host via 8080:80.
From the host I am able to access both nodeapp and nginxserver individually: I can make requests and see the JSON object returned from the node server using curl on the command line, and the button (index.html) is visible on the screen when I hit localhost:8080.
However, when I click the button, the call XMLHttpRequest('GET', 'http://nodeapp/api', true) fails without apparently ever hitting the nodeapp server (no log entry appears). I'm assuming this is because the host does not understand http://nodeapp/api.
Is there a way to tell Docker, while a container is running, to add its container-linking alias to my hosts file?
I don't know if my question is the proper solution to my problem. It looks as though I'm getting a CORS error back, but I don't think the request ever hits my server. Does this have to do with accessing the application from my host machine?
Here is a link to an example repo
Edit: I've noticed that when using the stack, clicking the button gets a response from my nginx container. I'm confused as to why it routes through that server, since nodeapp is in my hosts file, so it should recognize the correlation there?
Problem:
nodeapp lives on an internal network that is visible to your nginxserver only; you can check this by entering nginxserver:
docker exec -it nginxserver bash
# cat /etc/hosts
Most importantly, your service setup is not correct: nginxserver should act as a reverse proxy in front of nodeapp:
host (client) -> nginxserver -> nodeapp
Quick and dirty solution:
If you really, really want your client (host) to access the internal application nodeapp, then simply change the code below
XMLHttpRequest('GET', 'http://nodeapp/api', true)
to
XMLHttpRequest('GET', 'http://localhost:3000/api', true)
This works because in your docker-compose.yml, nodeapp's service port 80 is published on the host as 3000, so it can be accessed directly.
Better solution
You need to redesign your service stack to make nginxserver the frontend node; see this sample: http://schempy.com/2015/08/25/docker_nginx_nodejs/
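A sketch of that layout in compose form (service names from the question; the nginx.conf contents shown in the comment are an assumption):

services:
  nginxserver:
    image: nginx:latest
    ports:
      - "8080:80"     # the only port published to the host
    volumes:
      - ./nginx.conf:/etc/nginx/conf.d/default.conf
    # nginx.conf would contain something like:
    #   location /api { proxy_pass http://nodeapp:80; }
  nodeapp:
    image: node:latest
    expose:
      - "80"          # reachable only from other containers on this network

The browser then calls http://localhost:8080/api and nginx forwards the request to nodeapp over the internal network; this also sidesteps CORS, since everything is served from the same origin.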