I am using the request npm module for making HTTP requests in my program. The default behavior of request seems to be that it will try to use a proxy server when one is defined in the environment.
I am testing this on a shared Unix server used by multiple developers, who keep changing the proxy settings. Furthermore, I do not need a proxy, because I just connect directly to other web services within the LAN.
So, is there a way to tell request not to use the proxy, even if it is set in the environment?
Credit goes to #mh-cbon in the comments. Codifying here to complete the answer.
Either blank out the configured proxies prior to starting Node.js:
HTTPS_PROXY="" node script.js
Or use NO_PROXY to disable proxies for specified patterns (or for all hosts):
NO_PROXY="*" node script.js
Alternatively, within your Node.js script, do the above before loading and using the request module:
// Disable the proxy for all hosts, mirroring NO_PROXY="*" above
process.env["NO_PROXY"] = "*";
// Then go on as per normal
const request = require("request");
// ... do stuff ..
Related
I have a working Node.js Express-based server (and client) application here that shows RPC over HTTP + WebSockets. It works perfectly when run locally (using devcontainers) and includes the Dockerfile as well as the devcontainer.json. However, when run from a Codespace, it fails with the following client-side error messages.
client.js:9 Mixed Content:
The page at 'https://aniongithub-jsonrpc-bidirectional-example-<redacted>-8080.preview.app.github.dev/'
was loaded over HTTPS, but attempted to connect to the insecure WebSocket endpoint
'ws://aniongithub-jsonrpc-bidirectional-example-<redacted>-8080.preview.app.github.dev/api'.
This request has been blocked; this endpoint must be available over WSS.
(anonymous) # client.js:9
client.js:9 Uncaught DOMException: Failed to construct 'WebSocket':
An insecure WebSocket connection may not be initiated from a page loaded over HTTPS
at 'https://aniongithub-jsonrpc-bidirectional-example-<redacted>-8080.preview.app.github.dev/client.js:9:10'
The documentation here states that "By default, GitHub Codespaces forwards ports using HTTP but you can update any port to use HTTPS, as needed." When I check the settings indicated, the port is set to HTTP. What am I missing here? How can I get it to serve my Express application over HTTP?
Note: My intention is that when the repo is locally cloned and opened in a devcontainer, the code works just as it would when opened in a Codespace. This means I need to ensure that the certs generated by Codespaces are somehow factored into my local devcontainer process, or that I forgo authentication altogether. Alternatively, I need to find out whether I'm running on Codespaces and do different things, which seems messy and shouldn't be necessary. Hope this makes my intentions for asking this question clearer!
It turns out that I just couldn't use an insecure (ws://) RPC endpoint when the page is served over HTTPS, so the solution was to check location.protocol and use ws or wss accordingly when initializing the client RPC endpoint.
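For illustration, a minimal sketch of that client-side selection (the /api path comes from the error message above; the variable names are just placeholders):
// Pick the WebSocket scheme to match how the page itself was loaded:
// wss:// when the page is served over https://, plain ws:// otherwise.
const scheme = location.protocol === "https:" ? "wss" : "ws";
const endpointUrl = `${scheme}://${location.host}/api`;
const socket = new WebSocket(endpointUrl);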
As the article says:
https://docs.telerik.com/fiddler-everywhere/knowledge-base/how-to-capture-nodejs-traffic
Is it possible to proxy the traffic to Fiddler globally, except for the libraries that use the http module? Libraries that use the http module, on the other hand, need to be proxied explicitly.
I need to inspect the traffic generated by a Node app, and I don't know which HTTP library it uses.
I've tried to proxy globally, but it doesn't work:
process.env.https_proxy = 'http://127.0.0.1:8888';
process.env.http_proxy = 'http://127.0.0.1:8888';
process.env.NODE_TLS_REJECT_UNAUTHORIZED = '0';
Is there a workaround?
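For reference, a minimal sketch of what "proxied explicitly" can look like for code using the built-in http module, with no global settings involved (plain HTTP only; 127.0.0.1:8888 is the Fiddler address from the snippet above, and example.com is just a placeholder target):
// For plain HTTP, a forward proxy such as Fiddler accepts the absolute target
// URL as the request path, so the request can be sent straight to the proxy.
const http = require("http");
const req = http.request({
  host: "127.0.0.1",            // Fiddler
  port: 8888,
  path: "http://example.com/",  // absolute URL of the real target
  headers: { Host: "example.com" },
}, (res) => {
  console.log("status:", res.statusCode);
  res.resume();
});
req.end();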
I have a problem with an Express.js service running in production that I'm not able to replicate on my localhost. I have already tried replaying all the URLs requested in production against my local machine, but on my machine everything works fine. So I suspect that the problem comes from the data in the HTTP headers (cookies, user agents, languages...).
So, is there a way (some Express module, or a sniffer that runs on Ubuntu) that allows me to easily create a dump on the server with the whole headers, so I can later repeat those exact requests against my localhost?
You can capture network packets with https://www.wireshark.org/, analyze them, and maybe find the difference between your local environment and the production one.
You can also try a proxy tool like Charles (https://www.charlesproxy.com/) or Fiddler (http://www.telerik.com/fiddler) to log your browser requests.
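As for the "some Express module" route mentioned in the question, here is a minimal sketch of middleware that dumps each incoming request's headers to a file (the requests.log filename and the port are just placeholders):
const express = require("express");
const fs = require("fs");

const app = express();

// Append the method, URL and full headers of every request to a log file,
// so the exact requests can later be replayed against localhost.
app.use((req, res, next) => {
  const entry = {
    time: new Date().toISOString(),
    method: req.method,
    url: req.originalUrl,
    headers: req.headers, // includes cookies, user-agent, accept-language, ...
  };
  fs.appendFile("requests.log", JSON.stringify(entry) + "\n", () => {});
  next();
});

app.listen(3000);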
I am building a development tool, and would like to monitor requests to specific domains on a user's system.
I have already written a MITM proxy server. I would like to observe all requests to, say, api.twitter.com, without requiring users to change their code to point to my proxy server. This might be called an HTTP sniffer, I'm not sure.
I have considered:
Browser plugin (eg: chrome dev tools plugin)
/etc/hosts (but this can't map one domain to another, I think, and even if it could, you wouldn't be able to reach the original one)
Native OSX app (learning curve)
Is there a way to observe system-wide HTTP requests using Node? I don't know where to start.
To launch a Node.js app, one runs node app.js.
The app reads data from the Internet.
I'm currently working behind a firewall, and the target server is only reachable via a proxy running at localhost:3213.
I don't want to change OS settings and allow all apps to use the proxy.
How do I specify a proxy for the launched Node.js app without changing the JavaScript code?
Node options (listed with node -h) don't give a hint.
"How can I use an http proxy with node.js http.Client?" refers to altering the Node app.
For Java, I can pass Java system properties via -D parameters:
java -Dhttp.proxyHost=127.0.0.1 -Dhttp.proxyPort=3213 -Dhttp.nonProxyHosts="localhost|host.example.com" MyJavaApplication
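For comparison, the closest environment-variable equivalent on the Node side is a sketch like the following; whether it is honored depends entirely on the HTTP library the app uses (libraries such as request read these variables, as noted earlier, while Node's built-in http module ignores them):
HTTP_PROXY="http://localhost:3213" HTTPS_PROXY="http://localhost:3213" NO_PROXY="localhost,host.example.com" node app.js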