Node.js getaddrinfo ENOTFOUND

When using Node.js to try to get the HTML content of the following web page:
eternagame.wikia.com/wiki/EteRNA_Dictionary
I get the following error:
events.js:72
throw er; // Unhandled 'error' event
^
Error: getaddrinfo ENOTFOUND
at errnoException (dns.js:37:11)
at Object.onanswer [as oncomplete] (dns.js:124:16)
I already looked up this error on Stack Overflow and realized that it means node.js cannot resolve the server through DNS (I think). However, I am not sure why this would be, as my code works perfectly with www.google.com.
Here is my code (practically copied and pasted from a very similar question, except with the host changed):
var http = require("http");
var options = {
host: 'eternagame.wikia.com/wiki/EteRNA_Dictionary'
};
http.get(options, function (http_res) {
// initialize the container for our data
var data = "";
// this event fires many times, each time collecting another piece of the response
http_res.on("data", function (chunk) {
// append this chunk to our growing `data` var
data += chunk;
});
// this event fires *one* time, after all the `data` events/chunks have been gathered
http_res.on("end", function () {
// you can use res.send instead of console.log to output via express
console.log(data);
});
});
Here is the source I copied and pasted from: How to make web service calls in Expressjs?
I am not using any modules with node.js.
Thanks for reading.

From the Node.js HTTP module's documentation: http://nodejs.org/api/http.html#http_http_request_options_callback
You can either call http.get('http://eternagame.wikia.com/wiki/EteRNA_Dictionary', callback), in which case the URL is parsed with url.parse(), or call http.get(options, callback), where options is
{
    host: 'eternagame.wikia.com',
    port: 8080,
    path: '/wiki/EteRNA_Dictionary'
}
Update
As stated in the comment by @EnchanterIO, the port field is a separate option, and the protocol http:// shouldn't be included in the host field. Other answers also recommend using the https module if SSL is required.
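For reference, here is a minimal sketch of both calling forms (the port is omitted so it defaults to 80):
var http = require('http');

// form 1: pass a URL string; Node parses it for you
http.get('http://eternagame.wikia.com/wiki/EteRNA_Dictionary', function (res) {
    console.log('status:', res.statusCode);
});

// form 2: pass an options object with host and path kept separate
http.get({ host: 'eternagame.wikia.com', path: '/wiki/EteRNA_Dictionary' }, function (res) {
    console.log('status:', res.statusCode);
});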

Another common cause of
Error: getaddrinfo ENOTFOUND
at errnoException (dns.js:37:11)
at Object.onanswer [as oncomplete] (dns.js:124:16)
is including the protocol (http://, https://, ...) when setting the host property in options:
// DON'T WRITE THE `http://`
var options = {
    host: 'http://yoururl.com',
    path: '/path/to/resource'
};

In the options for the HTTP request, switch it to
var options = {
    host: 'eternagame.wikia.com',
    path: '/wiki/EteRNA_Dictionary'
};
I think that'll fix your problem.

My problem was that my OS X (Mavericks) DNS service needed to be restarted. On Catalina and Big Sur, the DNS cache can be cleared with:
sudo dscacheutil -flushcache; sudo killall -HUP mDNSResponder
For older macOS versions, see here.

If you need to use https, then use the https library
var https = require('https');

// options
var options = {
    host: 'eternagame.wikia.com',
    path: '/wiki/EteRNA_Dictionary'
};

// get
https.get(options, callback);

var http = require('http');

http.get('http://eternagame.wikia.com/wiki/EteRNA_Dictionary', function (res) {
    var str = '';
    console.log('Response is ' + res.statusCode);
    res.on('data', function (chunk) {
        str += chunk;
    });
    res.on('end', function () {
        console.log(str);
    });
});

I think http makes the request on port 80 by default, even though I mentioned the complete host URL in the options object. When I ran the server application that exposes the API on port 80 (I was previously running it on port 3000), it worked. Note that to run an application on port 80 you will need root privileges.
Error with the request: getaddrinfo EAI_AGAIN localhost:3000:80
Here is a complete code snippet:
var http = require('http');

var options = {
    protocol: 'http:',
    host: 'localhost',
    port: 3000,
    path: '/iso/country/Japan',
    method: 'GET'
};

var callback = function (response) {
    var str = '';
    // another chunk of data has been received, so append it to `str`
    response.on('data', function (chunk) {
        str += chunk;
    });
    // the whole response has been received, so we just print it out here
    response.on('end', function () {
        console.log(str);
    });
};

var request = http.request(options, callback);
request.on('error', function (err) {
    // handle errors with the request itself
    console.error('Error with the request:', err.message);
});
request.end();

I fixed this error with this
$ npm info express --verbose
# Error message: npm info retry will retry, error on last attempt: Error: getaddrinfo ENOTFOUND registry.npmjs.org registry.npmjs.org:443
$ nslookup registry.npmjs.org
Server: 8.8.8.8
Address: 8.8.8.8#53
Non-authoritative answer:
registry.npmjs.org canonical name = a.sni.fastly.net.
a.sni.fastly.net canonical name = prod.a.sni.global.fastlylb.net.
Name: prod.a.sni.global.fastlylb.net
Address: 151.101.32.162
$ sudo vim /etc/hosts
# Add "151.101.32.162 registry.npmjs.org` to hosts file
$ npm info express --verbose
# Works now!
Original source: https://github.com/npm/npm/issues/6686

Note that this issue can also occur if the domain you are referencing goes down (e.g. it no longer exists).

In my case the error was caused by using an incorrect host value.
It was:
var options = {
    host: 'graph.facebook.com/v2.12/',
    path: path
}
It should be:
var options = {
    host: 'graph.facebook.com',
    path: path
}
So anything after .com, .net, etc. should be moved to the path value.
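If you start from a full URL, one way to split it into the separate host and path options is the WHATWG URL class (a quick sketch; the endpoint below is just a placeholder):
var parsed = new URL('https://graph.facebook.com/v2.12/some-endpoint');
var options = {
    host: parsed.hostname, // 'graph.facebook.com'
    path: parsed.pathname  // '/v2.12/some-endpoint'
};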

In my case the problem was a malformed URL.
I had double slashes in the URL.

I tried it using the request module, and was able to print the body of that page out pretty easily. Unfortunately with the skills I have, I can't help other than that.
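For anyone curious, a minimal sketch of what that looks like with the request module (which has since been deprecated):
var request = require('request');

request('http://eternagame.wikia.com/wiki/EteRNA_Dictionary', function (error, response, body) {
    if (error) return console.error(error);
    console.log(body); // the page HTML
});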

I got this error when going from development environment to production environment. I was obsessed with putting https:// on all links. This is not necessary, so it may be a solution for some.

I was getting the same error and used the link below to get help:
https://nodejs.org/api/http.html#http_http_request_options_callback
I was missing this in my code:
req.end();
(Node.js v5.4.0)
Once I added the req.end(); line, I got rid of the error and it worked fine for me.
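In other words, when using http.request() directly (unlike http.get(), which calls end() for you), the request is not sent until you call req.end(). A minimal sketch:
var http = require('http');

var req = http.request({ host: 'example.com', path: '/' }, function (res) {
    res.on('data', function () {});
    res.on('end', function () {
        console.log('done, status ' + res.statusCode);
    });
});
req.on('error', function (err) {
    console.error(err);
});
req.end(); // without this line the request never goes out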

Try using the server IP address rather than the hostname.
This worked for me. Hope it will work for you too.
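If you want to check whether the hostname resolves at all from your machine (and find its IP), the dns module can help; a quick sketch:
var dns = require('dns');

dns.lookup('eternagame.wikia.com', function (err, address, family) {
    if (err) return console.error(err); // ENOTFOUND here means DNS resolution is the problem
    console.log('resolved to ' + address + ' (IPv' + family + ')');
});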

I got rid of the http:// and the extra slash (/).
I just used 'node-test.herokuapp.com' and it worked.

If you are still facing this, check your proxy settings. For me it was the proxy settings that were missing: I was not able to make the request because direct http/https traffic is blocked. So I configured my organization's proxy while making the request.
npm install https-proxy-agent
or
npm install http-proxy-agent
const https = require('https');
const httpsProxyAgent = require('https-proxy-agent');

const agent = new httpsProxyAgent("http://yourorganization.proxy.url:8080");

const options = {
    hostname: 'encrypted.google.com',
    port: 443,
    path: '/',
    method: 'GET',
    agent: agent
};

// make the request through the proxy agent
https.get(options, (res) => {
    console.log('status:', res.statusCode);
});

I got this issue resolved by removing undesirable characters from the password for the connection. For example, I had the characters <##%, and they caused the problem (most probably the hash character was the root cause).

My problem was that we were parsing the URL and generating the http_options for http.request().
I was using request_url.host, which already had the port number along with the domain name, so I had to use request_url.hostname instead.
var request_url = new URL('http://example.org:4444/path');
var http_options = {};
http_options['hostname'] = request_url.hostname; // we were using request_url.host, which includes the port number
http_options['port'] = request_url.port;
http_options['path'] = request_url.pathname;
http_options['method'] = 'POST';
http_options['timeout'] = 3000;
http_options['rejectUnauthorized'] = false;
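The difference is easy to see by logging both properties (a quick illustration):
var request_url = new URL('http://example.org:4444/path');
console.log(request_url.host);     // 'example.org:4444' (includes the port)
console.log(request_url.hostname); // 'example.org'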

Related

Nodejs Fetch API: unable to verify the first certificate

I'm using the node-fetch module in my Philips Hue project, and when I make a call to the local IP address (my hub) it produces the error in the title.
const fetch = require('node-fetch');

const gateway = "192.168.0.12";
const username = "username";

let getLights = function () {
    fetch(`https://${gateway}/api/${username}/lights`, {
        method: 'GET'
    }).then((res) => {
        return res.json();
    }).then((json) => {
        console.log(json);
    });
};

module.exports = { getLights };
Is there any SECURE fix? This will eventually go onto the public internet so I can access my lights from anywhere.
To skip the SSL tests, you can use this:
process.env['NODE_TLS_REJECT_UNAUTHORIZED'] = 0;
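Note that this disables certificate verification for the entire process. A more scoped alternative (my own suggestion, not part of the answer above) is to pass node-fetch an HTTPS agent that skips verification only for this call:
const https = require('https');
const fetch = require('node-fetch');

const gateway = "192.168.0.12";
const username = "username";

// skip certificate verification for this request only
const insecureAgent = new https.Agent({ rejectUnauthorized: false });

fetch(`https://${gateway}/api/${username}/lights`, { method: 'GET', agent: insecureAgent })
    .then(res => res.json())
    .then(json => console.log(json));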
It seems like you tried to access it using HTTPS. Most likely, on your local network it is going to be HTTP, so changing https://${gateway}/api/${username}/lights to http://${gateway}/api/${username}/lights should work.
If you're trying to keep it HTTPS, then you will have to install an SSL certificate authority onto your network.
These may be useful sources if you're trying to get that done:
https://www.freecodecamp.org/news/how-to-get-https-working-on-your-local-development-environment-in-5-minutes-7af615770eec/
https://letsencrypt.org/docs/certificates-for-localhost/

Websocket 'Sec-WebSocket-Accept' header mismatch between ReactJS client and Node.js server

I'm writing an application using WebSockets with a React client on port 8080 (run using Webpack devServer) and Node server and sockets on port 5000. However, the initial handshake always fails with an error: WebSocket connection to 'ws://localhost:5000/' failed: Error during WebSocket handshake: Incorrect 'Sec-WebSocket-Accept' header value
To make sure, I checked the request and response of the React app using Chrome devtools and looked at the Sec-WebSocket-Key and Sec-WebSocket-Accept headers.
On my Node server, I logged the computed Sec-WebSocket-Accept value, as well as the headers of my response.
It looks like the keys indeed don't match. In fact, they don't seem to be the same keys at all. Is there something in between the React client and Node server (like the Webpack devServer that I'm using for React) that's changing them?
My React code:
componentDidMount() {
    this.socket = new WebSocket('ws://localhost:5000', ['json']);
    this.socket.onerror = err => {
        console.log(err);
    };
    this.socket.onmessage = e => {
        let res = JSON.parse(e.data);
        console.log(e, res);
        let copyArr = [...this.state.message];
        copyArr.push(res);
        this.setState({
            message: copyArr
        });
    };
}
My Node.js code:
const server = http.createServer();

server.on('upgrade', (req, socket) => {
    if (req.headers['upgrade'] !== "websocket") {
        socket.end('HTTP/1.1 400 Bad Request');
        return;
    }
    const acceptKey = req.headers['sec-websocket-key'];
    const acceptHash = generateValue(acceptKey);
    console.log('accepkey', acceptKey, 'hash', acceptHash);
    const resHeaders = [
        'HTTP/1.1 101 Web Socket Protocol Handshake',
        'Upgrade: WebSocket',
        'Connection: Upgrade',
        `Sec-WebSocket-Accept: ${acceptHash}`
    ];
    console.log(resHeaders);
    let protocols = req.headers['sec-websocket-protocol'];
    protocols = !protocols ? [] : protocols.split(',').map(name => name.trim());
    if (protocols.includes('json')) {
        console.log('json here');
        resHeaders.push(`Sec-WebSocket-Protocol: json`);
    }
    socket.write(resHeaders.join('\r\n') + '\r\n\r\n');
});
function generateValue(key){
return crypto
.createHash('sha1')
.update(key + '258EAFA5-E914–47DA-95CA-C5AB0DC85B11', 'binary')
.digest('base64');
}
The correct Accept hash for a key of 'S1cb73xifMvqiIpMjvBabg==' is 'R35dUOuC/ldiVp1ZTchRsiHUnvo='.
Your generateValue() function calculates an incorrect hash because it has an incorrect character in the GUID string '258EAFA5-E914–47DA-95CA-C5AB0DC85B11'. If you look very carefully you'll see that the second dash, in '...14–47...', is different from the other dashes. It should be a plain ASCII dash or hyphen with character code 45, but in fact it is a Unicode en-dash with character code 8211. That different character code throws off the calculation.
Fixing that character will make your WebSocket client much happier.
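With the GUID corrected to use a plain ASCII hyphen, the hash matches the expected value quoted above; a quick sketch to verify:
const crypto = require('crypto');

function generateValue(key) {
    return crypto
        .createHash('sha1')
        .update(key + '258EAFA5-E914-47DA-95CA-C5AB0DC85B11', 'binary')
        .digest('base64');
}

console.log(generateValue('S1cb73xifMvqiIpMjvBabg=='));
// prints 'R35dUOuC/ldiVp1ZTchRsiHUnvo=', the value the client expects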
For anyone wondering, the culprits causing the issues, in my case, were the extra new lines and returns I added after writing my headers in my Node.js server. Taking them out and doing:
socket.write(resHeaders.join('\r\n'));
instead of:
socket.write(resHeaders.join('\r\n') + '\r\n\r\n');
solved the handshake mismatch for me.

Node: Can't bind to IPv6 localAddress while using https.request()

I can bind to localAddress just fine when using HTTP, but as soon as I switch to HTTPS I get an error: bind EINVAL. Please consider this code:
var http = require('http');
var https = require('https');

var options = { host: 'icanhazip.com', path: '/', localAddress: '2604:a880:1:20::27:a00f', family: 6 };

var callback = function (response) {
    var data = '';
    response.on('data', function (chunk) { data += chunk; });
    response.on('error', function (error) { console.log("error: " + error.message); });
    response.on('end', function () {
        console.log(data);
    });
};

http.request(options, callback).end();  // Works. IP: 2604:a880:1:20::27:a00f
https.request(options, callback).end(); // Fails. IP: 2604:a880:1:20::27:a00f
https.request({ host: 'icanhazip.com', path: '/', family: 6 }, callback).end(); // Works. IP: 2604:a880:1:20::27:a00f
Here's the error while running node v5.0.0:
Error: bind EINVAL 2604:a880:1:20::27:a00f
at Object.exports._errnoException (util.js:860:11)
at exports._exceptionWithHostPort (util.js:883:20)
at connect (net.js:809:16)
at net.js:984:7
at GetAddrInfoReqWrap.asyncCallback [as callback] (dns.js:63:16)
at GetAddrInfoReqWrap.onlookup [as oncomplete] (dns.js:82:10)
The only difference between the working and the failing code is setting localAddress; ironically, the last example binds to the correct IP address, but won't let you do so using localAddress.
The problem is that in my use case I have to make the request from a completely separate IPv6 address, and it works fine with HTTP, but I need it to work for HTTPS requests. Currently I can only make this work using cURL. Could you please provide some insight as to why this is happening, or how I could make it work without additional libraries?
I had the same issue as you and figured it out: update your node to the latest stable version. They fixed it. Check it with node --version; I'm on 6.6.0 and it works great.
The version I got from apt-get was way too old.

why data event not working on http.get response stream

While handling images in node.js, I got stuck on an issue where the data event does not fire on a piped response stream.
var http = require('http'),
    url = require('url'),
    fs = require('fs');

var httpServer = http.createServer(function (req, res) {
    var path = url.parse(req.url).pathname;
    var fstream = fs.createReadStream('/root/image' + path);
    fstream.on('error', function () {
        res.writeHead(404);
        return res.end();
    });
    fstream.pipe(res);
});
httpServer.listen(8080);

http.get({
    host: '127.0.0.1',
    port: 8080,
    path: '/image.jpg'
}, function (res) {
    res.on('data', function () {
        console.log('data received'); // nothing happened
    });
}).on('error', function (er) {
    throw er;
});
Did I do anything wrong? Or is this a node bug?
In my testing if the response has a status of 304 (not modified), then no data event is fired. The solution is fairly easy if you are not afraid of the terminal (or command line in Windows). Open your command tool and cd to the directory containing your image. Once there, you can use the following command to make it appear that your file has been updated (replace filename.ext with the name of your file):
Unix/Linux
touch filename.ext
Windows
copy /b filename.ext +,,
To rerun the command, hit the ↑ up arrow and then hit enter
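You can also see this in the client from the question by logging the status code before relying on data events; a small sketch (my own addition, assuming the same local server on port 8080):
var http = require('http');

http.get({ host: '127.0.0.1', port: 8080, path: '/image.jpg' }, function (res) {
    console.log('status:', res.statusCode);
    if (res.statusCode === 304) {
        console.log('not modified: no data events will fire');
        res.resume(); // discard the (empty) body so the socket is freed
        return;
    }
    res.on('data', function (chunk) {
        console.log('data received:', chunk.length, 'bytes');
    });
});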

Change port without losing data

I'm building a settings manager for my http server. I want to be able to change settings without having to kill the whole process. One of the settings I would like to be able to change is change the port number, and I've come up with a variety of solutions:
Kill the process and restart it
Call server.close() and then do the first approach
Call server.close() and initialize a new server in the same process
The problem is, I'm not sure what the repercussions of each approach is. I know that the first will work, but I'd really like to accomplish these things:
Respond to existing requests without accepting new ones
Maintain data in memory on the new server
Lose as little uptime as possible
Is there any way to get everything I want? The API for server.close() gives me hope:
server.close(): Stops the server from accepting new connections.
My server will only be accessible by clients I create and by a very limited number of clients connecting through a browser, so I will be able to notify them of a port change. I understand that changing ports is generally a bad idea, but I want to allow for the edge-case where it is convenient or possibly necessary.
P.S. I'm using connect if that changes anything.
P.P.S. Relatively unrelated, but what would change if I were to use UNIX server sockets or change the host name? This might be a more relevant use-case.
P.P.P.S. This code illustrates the problem of using server.close(). None of the previous servers are killed, but more are created with access to the same resources...
var http = require("http");
var server = false,
curPort = 8888;
function OnRequest(req,res){
res.end("You are on port " + curPort);
CreateServer(curPort + 1);
}
function CreateServer(port){
if(server){
server.close();
server = false;
}
curPort = port;
server = http.createServer(OnRequest);
server.listen(curPort);
}
CreateServer(curPort);
Resources:
http://nodejs.org/docs/v0.4.4/api/http.html#server.close
I tested the close() function. It seems to do absolutely nothing. The server still accepts connections on its port. Restarting the process was the only way for me.
I used the following code:
var http = require("http");
var server = false;
function OnRequest(req,res){
res.end("server now listens on port "+8889);
CreateServer(8889);
}
function CreateServer(port){
if(server){
server.close();
server = false;
}
server = http.createServer(OnRequest);
server.listen(port);
}
CreateServer(8888);
I was about to file an issue on the node github page when I decided to test my code thoroughly to see if it really is a bug (I hate filing bug reports when it's user error). I realized that the problem only manifests itself in the browser, because apparently browsers do some weird kind of HTTP request keep alive thing where it can still access dead ports because there's still a connection with the server.
What I've learned is this:
Browser caches keep ports alive unless the process on the server is killed
Utilities that do not keep caches by default (curl, wget, etc) work as expected
HTTP requests in node also don't keep the same type of cache that browsers do
For example, I used this code to prove that node http clients don't have access to old ports:
Client-side code:
var http = require('http'),
    client,
    request;

function createClient(port) {
    client = http.createClient(port, 'localhost');
    request = client.request('GET', '/create');
    request.end();
    request.on('response', function (response) {
        response.on('end', function () {
            console.log("Request ended on port " + port);
            setTimeout(function () {
                createClient(port);
            }, 5000);
        });
    });
}

createClient(8888);
And server-side code:
var http = require("http");
var server,
curPort = 8888;
function CreateServer(port){
if(server){
server.close();
server = undefined;
}
curPort = port;
server = http.createServer(function (req, res) {
res.end("You are on port " + curPort);
if (req.url === "/create") {
CreateServer(curPort);
}
});
server.listen(curPort);
console.log("Server listening on port " + curPort);
}
CreateServer(curPort);
Thanks everyone for the responses.
What about using cluster?
http://learnboost.github.com/cluster/docs/reload.html
It looks interesting!
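Node's built-in cluster module offers a similar zero-downtime reload pattern (a rough sketch with the built-in module, not the library linked above): the master process stays up while workers are replaced one at a time, so the listening socket is never dropped:
const cluster = require('cluster');
const http = require('http');

if (cluster.isMaster) {
    // start one worker; to "reload", fork a replacement and retire the old one
    let worker = cluster.fork();
    // e.g. after a settings change:
    // const next = cluster.fork();
    // next.on('listening', () => worker.disconnect());
} else {
    http.createServer((req, res) => {
        res.end('handled by worker ' + process.pid);
    }).listen(8888);
}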
