Setting NO_PROXY in HTTP call from Node.js + Express

I am building a Node.js + Express application which makes an HTTP GET call using the node-fetch library to retrieve a response from a service. The problem is that the host of the service I am calling does not work well with a proxy: it sometimes returns a response and sometimes fails, which is working as designed. So I was asked to add the hostname of that API to the NO_PROXY environment variable to get it to work. I am running the application in a Docker container, so I have added it as an ENV variable and also added it to the node app startup script as
NO_PROXY=xyz.abc.com node app.js
However, I am still running into the same issue, and it looks like the Node + Express application is not honoring the NO_PROXY variable and is still using the proxy to make the REST call. Is there any other way to configure it so that the environment variable is picked up?
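Worth noting: node-fetch does not read proxy environment variables on its own, so even a correctly set NO_PROXY will not change its behavior unless an agent is supplied per request. Below is a minimal sketch of that idea; it assumes the third-party https-proxy-agent package and a hypothetical https://xyz.abc.com/api/data endpoint, so treat it as an illustration rather than a drop-in fix.

// Honor NO_PROXY manually, since node-fetch ignores proxy env vars.
const fetch = require("node-fetch");
const { HttpsProxyAgent } = require("https-proxy-agent");

const proxyUrl = process.env.HTTPS_PROXY || process.env.HTTP_PROXY;
const noProxy = (process.env.NO_PROXY || "")
  .split(",")
  .map((h) => h.trim())
  .filter(Boolean);

// Return a proxy agent unless the host is covered by NO_PROXY.
function agentFor(url) {
  const { hostname } = new URL(url);
  const bypass = noProxy.some(
    (h) => h === "*" || hostname === h || hostname.endsWith("." + h)
  );
  return proxyUrl && !bypass ? new HttpsProxyAgent(proxyUrl) : undefined;
}

const target = "https://xyz.abc.com/api/data";
fetch(target, { agent: agentFor(target) })
  .then((res) => res.json())
  .then(console.log)
  .catch(console.error);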

Related

HTTP requests between two Docker services work fine but I don't know why [duplicate]

I have a ReactJS project with its own Dockerfile, exposing port 3000:3000.
I also have a PHP project with its own Dockerfile, exposing port 80:80. The PHP app also has containers for MySQL, Redis and Nginx.
For the PHP app, I have a docker-compose file that creates a network (my-net) for PHP, Nginx, MySQL and Redis to communicate on. However, I now want the ReactJS app (which is in a separate project) to be able to communicate with the PHP app.
I added a docker-compose file to the React project, joined it to the my-net network from the PHP project, and declared the network as external so that Compose doesn't try to create it.
This seems to work: from the ReactJS container, I can ping app (the name of my backend service) and it works properly. However, from the ReactJS code, if I use something like axios to try and hit the backend API, it can't resolve app or http://app or any variation. It can, however, reach the backend if I substitute the underlying IP address in axios.
So there seems to be some issue with hostname resolution, and presumably this is on the axios / JavaScript end. Is there something I'm missing, or a reason this isn't working?
When the JavaScript runs in a browser (outside of Docker), you cannot use app, because that name is only resolvable inside the Docker network (via the embedded DNS server).
To access your PHP server from outside, use localhost and the exposed port (80) instead.
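A minimal sketch of how that split might look, assuming the service name app, the published port 80, and a hypothetical /api/health route:

// Pick the API base URL depending on where the code runs.
// Server-side code inside the Docker network can use the service name;
// browser code must go through the host-mapped port instead.
import axios from "axios";

const baseURL =
  typeof window === "undefined"
    ? "http://app"           // resolves via Docker's embedded DNS
    : "http://localhost:80"; // port published by the PHP/Nginx container

const api = axios.create({ baseURL });
api.get("/api/health").then((res) => console.log(res.data));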

running frontend and backend on different ports

Hi, I'm running my frontend (create-react-app) and backend server (Express.js) on different ports but on the same host.
For example: the frontend is on 127.0.0.1:3000 and the backend on 127.0.0.1:3003.
In my package.json:
{...
"proxy": "http://localhost:3003",
...}
Everything worked fine until I migrated my app to a remote server.
My app started refreshing unexpectedly when trying to send an HTTP request (axios) to the server (probably due to bad proxy settings).
So I have the frontend app running on 35.125.320.10:3000 and the server running on 35.125.320.10:3003. My HTTP requests were unexpectedly cancelled (I checked the network tab). So I changed my proxy settings to
{...
"proxy": "35.125.320:10:3003",
...}
But my app still refreshes when I try to make an HTTP request to the server. I think the problem is that I can't reach my Express backend server, so the proxy is forwarding my requests badly.
UPDATE
Scenario (I'm doing two POST requests):
1) The first request passes (the app is not refreshed).
2) The same request passes again (but sometimes the app refreshes).
3) The second request is still cancelled by the browser.
QUESTION
How can my frontend communicate with the backend server via a proxy when they are running on different ports but on the same server and domain?
Thanks for the answer.
Solution:
The problem was that I was using a proxy in production that is only suitable for development (the create-react-app proxy setting only applies to the development server).
I added these lines to my Express.js server:
app.use(express.static(`${process.cwd()}/build`));
app.use(express.static(`${process.cwd()}/public`));
I make a build and serve the JS and CSS files from my build folder, and I also needed to serve static files (images and other assets) from my public folder.
This problem can also cause the browser to cancel HTTP requests in production, meaning the requests weren't able to reach the server.
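For reference, a minimal sketch of that setup, assuming Express 4, a CRA build directory under the working directory, and a hypothetical /api/ping route:

// Serve the production build and public assets from Express.
const path = require("path");
const express = require("express");
const app = express();

app.use(express.static(path.join(process.cwd(), "build")));
app.use(express.static(path.join(process.cwd(), "public")));

// Register API routes before the catch-all...
app.get("/api/ping", (req, res) => res.json({ ok: true }));

// ...then fall back to index.html so client-side routing still works.
app.get("*", (req, res) => {
  res.sendFile(path.join(process.cwd(), "build", "index.html"));
});

app.listen(3003, () => console.log("Listening on 3003"));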
To make your app publicly available, you will want to make a production build. You mentioned in a comment that you "run npm build and then serve this build as static file in express.js". This is a great way to serve your React app. As the create-react-app documentation says:
npm start or yarn start
Runs the app in development mode.
When running yarn start or npm start, you are also given a notification that says "Note that the development build is not optimized." The best option is to run yarn build or npm run build and find a way to serve those static files, as you are doing.

Why does my npm request module work in tonicdev but not in my Node app in OpenShift cloud?

I have a Node Express app that runs with no problems on the OpenShift cloud. It accepts router.post and router.get just fine. But I want the OpenShift app to also post some data to a Raspberry Pi which is also running a Node and Express app. To accomplish this, on the OpenShift app, I am using the Node npm request module. When I run it in the Node dev space at http://tonicdev.com, it works just fine. It sends its JSON data to the RPi and the Pi accepts and processes it.
But when I run the exact same request code in the Node app in the OpenShift cloud, it crashes with an EACCES error. Note that I am using the request-debug module, which displays the request module's headers immediately before the crash. They are as expected.
What the devil is going on in the OpenShift environment that changes the request module's execution and causes it to crash, and how can I address it?
OpenShift Online currently has a whitelist of acceptable outbound ports, and port 3000 is probably not in the list. We can add ports to the list, but typically we only do this for well-known service ports. Can you run the remote service on a different port (such as 80 or 8080)?
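If moving the port is feasible, only the listen call on the receiving side needs to change; a hypothetical sketch for the Raspberry Pi's Express app:

// Listen on 8080 (a common well-known service port) instead of 3000.
const express = require("express");
const app = express();
app.use(express.json());

app.post("/data", (req, res) => {
  console.log("received:", req.body);
  res.sendStatus(200);
});

const PORT = process.env.PORT || 8080;
app.listen(PORT, () => console.log(`Listening on ${PORT}`));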

NodeJS request -- how to disable automatic proxy from environment

I am using the request npm module for making HTTP requests in my program. The default behavior of request seems to be that it will try to use a proxy server when one is defined in the environment.
I am testing this on a shared Unix server used by multiple developers, who keep changing the proxy settings. Furthermore, I do not need a proxy, because I just connect to other web services within the LAN directly.
So, is there a way to tell request not to use a proxy, even if one is set in the environment?
Credit goes to @mh-cbon in the comments. Codifying here to complete the answer.
Either blank out the configured proxies prior to starting Node.js:
HTTPS_PROXY="" node script.js
Or use NO_PROXY to disable proxies for specific patterns (or all of them):
NO_PROXY="*" node script.js
Alternatively, within your Node.js script, do the same before loading and using the request module:
// Disable proxies from being used by the request module
process.env["NO_PROXY"] = "*";
// Then go on as per normal
const request = require("request");
// ... do stuff ...

How to launch node.js app with proxy (specify proxy to use without changing JavaScript code)

To launch a Node.js app, one runs node app.js.
The app reads data from the Internet.
Now I work behind a firewall, and the target server is only available via a proxy running at localhost:3213.
I don't want to change OS settings and allow all apps to use the proxy.
How do I specify a proxy for a launched Node.js app without changing the JavaScript code?
Node's options (listed with node -h) don't give a hint.
The question "How can I use an http proxy with node.js http.Client?" refers to altering the Node app itself.
For Java, I can pass system properties via -D parameters:
java -Dhttp.proxyHost=127.0.0.1 -Dhttp.proxyPort=3213 -Dhttp.nonProxyHosts="localhost|host.example.com" MyJavaApplication
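One Node-side equivalent (an assumption on my part, since it relies on the third-party global-agent package rather than anything built into Node) is to preload a proxy bootstrap with -r, which leaves app.js untouched:

npm install global-agent
GLOBAL_AGENT_HTTP_PROXY=http://localhost:3213 node -r global-agent/bootstrap app.js

global-agent also reads a GLOBAL_AGENT_NO_PROXY variable for exclusions, which mirrors the -Dhttp.nonProxyHosts flag in the Java example.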
