I am trying to run a local server on Node.js that returns a simple HTML page. The difficulty at this point is understanding how to make the file system handler work correctly. I am looking for the right code to use when the /recipe route is requested in the browser.
I get the error "no such file or directory", even though the path I specify is correct and a file with that name exists there.
Do I have to manually install "fs" via npm?
Is there another mistake in my code?
Am I forgetting something?
I have the following code:
// strict mode catches JavaScript errors better
"use strict";
// localhost port on which you can access the application in DEV
// http-status-codes is an npm package that contains the main API status codes
// fs is the file system module used to read the html files
const port = 3000,
  http = require("http"),
  httpStatus = require("http-status-codes"),
  app = http.createServer(),
  fs = require("fs");

// set up route mapping for html file
const routeResponseMap = {
  "/recipe": "view/recipe.html",
  "/index": "<h2>this is the index page</h2>"
};
// need to open your browser at localhost:port for the request to be made
app.on("request", (request, response) => {
  response.writeHead(httpStatus.OK, { "Content-Type": "text/html" });
  if (routeResponseMap[request.url]) {
    response.end(routeResponseMap[request.url]);
    if (routesResponse[request.url]) {
      // the error is here, file does not get read
      // WHAT CODE DO I NEED HERE?
      fs.readFile(routesResponse[request.url]), (error, data) => {
        response.write(data);
        response.end();
      }
      console.log("route in mapping");
    }
  } else {
    response.end("<h3>Sorry not found</h3>");
  }
});
app.listen(port);
console.log("The server has started and is listening on port " + port);
I am working on setting up the Highcharts export server under Node.js using iisnode (https://github.com/tjanczuk/iisnode). It basically acts as a pipe between requests to IIS through to Node. Great! Only, how do I "install" the Highcharts export server so that it is used by iisnode? I followed the instructions for installing the highcharts-export-server node module, but it is installed under (Windows) AppData\Roaming\npm. How do I move it, or point iisnode at the export server?
This export server is run via the following once installed from npm:
highcharts-export-server --enableServer 1
So, to get this installed and used in IIS8 + iisnode:
1) What is the right directory to install the export server locally? (On Windows, modules pulled in via npm go to C:\Users\<username>\AppData\Roaming\npm\, where <username> is the logged-in user running npm to install the package.)
2) What is the iisnode configuration necessary for this?
I have iisnode set up and running on our development box and all the examples work. My confusion lies partly with the utter lack of documentation for iisnode. All the links I have found just repeat the items listed in the links from the iisnode developer, with no actual "here is how you take a node app that exists in npm and have it work in iisnode." I don't necessarily need my hand held every step of the way; I am just not seeing a list of steps to even follow.
install node (if not already installed)
install iisnode (if not already installed => https://github.com/tjanczuk/iisnode)
verify IIS has iisnode registered as a module
create a new Application Pool, set to "No Managed Code"
create a new empty web site
load the iisnode sample content into it, update the web.config
verify you can hit it, that it runs, and that it can write its logs
go to the IIS web site folder and run these npm commands
npm init /empty
npm install --save highcharts-export-server
npm install --save tmp
add the file hcexport.js below and reconfigure web.config:
// hcexport.js: accept a POSTed Highcharts configuration and return the rendered SVG
var fs = require('fs');
var http = require('http');
var path = require("path");
var tmp = require('tmp');
const exporter = require('highcharts-export-server');

http.createServer(function (req, res) {
  try {
    if (req.method !== 'POST') { throw "POST Only"; }
    var body = '';
    req.on('data', function (data) {
      body += data;
    });
    req.on('end', function () {
      if (body === '') { throw "Empty body"; }
      // write the export into a temp file inside the site folder
      var tempFile = tmp.fileSync({ discardDescriptor: true, postfix: ".svg", dir: process.cwd() });
      var input = JSON.parse(body);
      input.outfile = path.basename(tempFile.name);
      exporter.initPool();
      exporter.export(input, function (err, exres) {
        if (err) { throw "Export failed"; }
        var filename = path.join(process.cwd(), exres.filename);
        exporter.killPool();
        // send the generated file back to the caller, then clean up the temp file
        fs.readFile(filename, function (err, file) {
          res.writeHead(200, { 'Content-Type': 'image/svg+xml', 'Content-disposition': 'attachment; filename=' + exres.filename });
          res.write(file.toString());
          res.end();
          tempFile.removeCallback();
        });
      });
    });
  } catch (err) {
    console.log({ port: process.env.PORT, error: err });
    res.writeHead(409, { 'Content-Type': 'text/html' });
    res.end(String(err));
  }
}).listen(process.env.PORT);
Extend as needed to support the export types you plan to use.
The highcharts-export-server uses phantomjs internally, and under some error conditions it can run away and consume 100% of the available CPU. If you see this, you can kill it with:
Taskkill /IM phantomjs.exe /F
The solution from saukender seems to work, but it initializes a new pool of phantom workers on every request.
If you already have node and iisnode set up, another option is to start the Highcharts export server directly rather than calling the export function manually. This seems to perform better, since it doesn't initialize the worker pool on every request.
// app.js
const highcharts = require("highcharts-export-server");
highcharts.initPool();
highcharts.startServer(process.env.PORT || 8012);
Don't forget to point your web.config to app.js.
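For anyone looking for a starting point, a bare-bones iisnode web.config along these lines should do it (a minimal sketch, not the exact config from this setup; adjust the file name and rules to your site):

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <system.webServer>
    <!-- let iisnode handle requests for app.js -->
    <handlers>
      <add name="iisnode" path="app.js" verb="*" modules="iisnode" />
    </handlers>
    <!-- send every other URL to app.js as well -->
    <rewrite>
      <rules>
        <rule name="app">
          <match url="/*" />
          <action type="Rewrite" url="app.js" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>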
I found these two resources quite useful during setup:
https://www.galaco.me/node-js-on-windows-server-2012/
https://tomasz.janczuk.org/2011/08/hosting-nodejs-applications-in-iis-on.html
I'm dabbling with Node and trying to get a basic web server setup that'll accept HTML and return either a PDF or an image depending on the route used.
The below works when running it on my local machine. I've popped it onto two different servers, one using Apache and the other using Nginx. In both of those it's failing to return an image or a PDF. The PDF route returns a 502 and the image route returns an empty image.
It's possible I'm going about something the wrong way, but I'm somewhat at a loss right now as to what I'm missing. Any pointers would be greatly appreciated.
var url = require('url');
var http = require('http');
var wkhtmltox = require('wkhtmltox');

var converter = new wkhtmltox();

// Locations of the binaries can be specified, but this is
// only needed if the programs are located outside your PATH
// converter.wkhtmltopdf = '/opt/local/bin/wkhtmltopdf';
// converter.wkhtmltoimage = '/opt/local/bin/wkhtmltoimage';

http.createServer(function (req, res) {
  console.log(url.parse(req.url, true).query);
  if (req.url == "/pdf") {
    res.writeHead(200, {'Content-Type': 'application/pdf'});
    converter.pdf(req, url.parse(req.url, true).query).pipe(res);
  } else if (req.url == "/image") {
    res.writeHead(200, {'Content-Type': 'image/png'});
    converter.image(req, { format: "png", quality: 75 }).pipe(res);
  } else {
    res.end('Control is an illusion.');
  }
}).listen(1337, '127.0.0.1');

console.log('Server running at http://127.0.0.1:1337/');
This is the error logged on the server for the PDF route. No error is logged for the image route.
Error: This socket has been ended by the other party
at Socket.writeAfterFIN [as write] (net.js:285:12)
at performWork (/var/www/app_name/node_modules/wkhtmltox/lib/wkhtmltox.js:98:22)
at wkhtmltox.pdf (/var/www/app_name/node_modules/wkhtmltox/lib/wkhtmltox.js:113:16)
at Server.<anonymous> (/var/www/app_name/index.js:16:14)
at emitTwo (events.js:106:13)
at Server.emit (events.js:191:7)
at HTTPParser.parserOnIncoming [as onIncoming] (_http_server.js:543:12)
at HTTPParser.parserOnHeadersComplete (_http_common.js:105:23)
I was testing using curl.
curl -d @file_to_render.html -s "http://localhost:1337/pdf" -o test.pdf
curl -d @file_to_render.html -s "http://localhost:1337/image" -o test.png
Please try this command; it resolved the issue for me:
sudo apt-get install libfontconfig
Try adding an error handler to the pipe:
converter.image(req, { format: "png" , quality: 75 }).pipe(res).on('error', function(e){ console.log(e); });
I need to open an SFTP connection to a server and copy a file from there to my local machine.
To that end, I tried installing the node-sftp module using
npm install node-sftp
It didn't work out of the box; I had to replace the sftp.js file that npm installed with the one from the GitHub repository here: https://github.com/ajaxorg/node-sftp
(The npm version was using TTY and the GitHub version was using PTY. I am not sure what they are.)
After starting the server and invoking the code, I see this in the console.
launching: sftp -o Port=22 jash@xxx.63.xxx.49
listening...
The console just hangs here. I am trying to print all the files in the current directory after the connection is opened.
This is the code:
var http = require('http');
var Sftp = require('node-sftp');

var port = process.env.PORT || 1337;

var msgHandler = function (request, response) {
  var options = {
    host: "xxx.63.xxx.49",
    username: "jash",
    password: "mypassword",
    port: 22
  };
  var conn = new Sftp(options, function (err) {
    console.log(err);
  });
  conn.cd(".", function (err) {
    console.log(err);
    conn.ls(".", function (err, res) {
      console.log(res[0].path);
    });
  });
  console.log("listening...");
};

http.createServer(msgHandler).listen(port);
The credentials are fine; I used them in SecureCRT and was able to log in.
The second argument to Sftp() (the function(err) {...}) is where you want to place your conn.cd(...) code. That second argument is a callback that gets invoked once the connection has been established. Make sure to check err, of course.
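In other words, something along these lines (a minimal sketch that just rearranges the calls from the question; the node-sftp API is used exactly as shown there):

var conn = new Sftp(options, function (err) {
  // this callback fires once the SFTP connection has been established
  if (err) {
    console.log("connection failed:", err);
    return;
  }
  // only now is it safe to issue commands on the connection
  conn.cd(".", function (err) {
    if (err) { return console.log(err); }
    conn.ls(".", function (err, res) {
      if (err) { return console.log(err); }
      console.log(res[0].path);
    });
  });
});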
When using Node.js to try and get the html content of the following web page:
eternagame.wikia.com/wiki/EteRNA_Dictionary
I get the following error:
events.js:72
throw er; // Unhandled 'error' event
^
Error: getaddrinfo ENOTFOUND
at errnoException (dns.js:37:11)
at Object.onanswer [as oncomplete] (dns.js:124:16)
I already looked up this error on Stack Overflow and realized that it happens because Node.js cannot resolve the server via DNS (I think). However, I am not sure why that would be, since my code works perfectly on www.google.com.
Here is my code (practically copied and pasted from a very similar question, except with the host changed):
var http = require("http");
var options = {
host: 'eternagame.wikia.com/wiki/EteRNA_Dictionary'
};
http.get(options, function (http_res) {
// initialize the container for our data
var data = "";
// this event fires many times, each time collecting another piece of the response
http_res.on("data", function (chunk) {
// append this chunk to our growing `data` var
data += chunk;
});
// this event fires *one* time, after all the `data` events/chunks have been gathered
http_res.on("end", function () {
// you can use res.send instead of console.log to output via express
console.log(data);
});
});
Here is the source I copied and pasted from: How to make web service calls in Expressjs?
I am not using any modules with node.js.
Thanks for reading.
From the Node.js HTTP module's documentation: http://nodejs.org/api/http.html#http_http_request_options_callback
You can either call http.get('http://eternagame.wikia.com/wiki/EteRNA_Dictionary', callback), in which case the URL is parsed with url.parse(), or call http.get(options, callback), where options is
{
  host: 'eternagame.wikia.com',
  port: 8080,
  path: '/wiki/EteRNA_Dictionary'
}
Update
As stated in the comment by @EnchanterIO, the port field is also a separate option, and the protocol http:// shouldn't be included in the host field. Other answers also recommend using the https module if SSL is required.
Another common source of this error,
Error: getaddrinfo ENOTFOUND
at errnoException (dns.js:37:11)
at Object.onanswer [as oncomplete] (dns.js:124:16)
is including the protocol (http://, https://, ...) when setting the host property in options:
// DON'T WRITE THE `http://`
var options = {
  host: 'http://yoururl.com',
  path: '/path/to/resource'
};
In the options for the HTTP request, switch it to
var options = {
  host: 'eternagame.wikia.com',
  path: '/wiki/EteRNA_Dictionary'
};
I think that'll fix your problem.
My problem was that the DNS service on my OS X (Mavericks) machine needed to be restarted. On Catalina and Big Sur the DNS cache can be cleared with:
sudo dscacheutil -flushcache; sudo killall -HUP mDNSResponder
For older macOS versions, see here.
If you need to use https, then use the https library:
var https = require('https');

// options
var options = {
  host: 'eternagame.wikia.com',
  path: '/wiki/EteRNA_Dictionary'
};

// get (callback here is the same kind of response handler used with http.get above)
https.get(options, callback);
var http = require('http');

http.get('http://eternagame.wikia.com/wiki/EteRNA_Dictionary', function (res) {
  var str = '';
  console.log('Response is ' + res.statusCode);
  res.on('data', function (chunk) {
    str += chunk;
  });
  res.on('end', function () {
    console.log(str);
  });
});
I think http makes the request on port 80 even though I specified the complete host URL in the options object. When I ran the server application that hosts the API on port 80 (I had previously been running it on port 3000), it worked. Note that running an application on port 80 requires root privileges.
Error with the request: getaddrinfo EAI_AGAIN localhost:3000:80
Here is a complete code snippet:
var http = require('http');

var options = {
  protocol: 'http:',
  host: 'localhost',
  port: 3000,
  path: '/iso/country/Japan',
  method: 'GET'
};

var callback = function (response) {
  var str = '';

  // another chunk of data has been received, so append it to `str`
  response.on('data', function (chunk) {
    str += chunk;
  });

  // the whole response has been received, so we just print it out here
  response.on('end', function () {
    console.log(str);
  });
};

var request = http.request(options, callback);

request.on('error', function (err) {
  // handle errors with the request itself
  console.error('Error with the request:', err.message);
});

request.end();
I fixed this error with this
$ npm info express --verbose
# Error message: npm info retry will retry, error on last attempt: Error: getaddrinfo ENOTFOUND registry.npmjs.org registry.npmjs.org:443
$ nslookup registry.npmjs.org
Server: 8.8.8.8
Address: 8.8.8.8#53
Non-authoritative answer:
registry.npmjs.org canonical name = a.sni.fastly.net.
a.sni.fastly.net canonical name = prod.a.sni.global.fastlylb.net.
Name: prod.a.sni.global.fastlylb.net
Address: 151.101.32.162
$ sudo vim /etc/hosts
# Add "151.101.32.162 registry.npmjs.org` to hosts file
$ npm info express --verbose
# Works now!
Original source: https://github.com/npm/npm/issues/6686
Note that this issue can also occur if the domain you are referencing goes down (e.g. it no longer exists).
In my case the error was caused by an incorrect host value.
It was:
var options = {
  host: 'graph.facebook.com/v2.12/',
  path: path
}
It should be:
var options = {
  host: 'graph.facebook.com',
  path: path
}
So anything after .com, .net, etc. should be moved to the path parameter.
In my case the problem was a malformed URL.
I had double slashes in the URL.
I tried it using the request module, and was able to print the body of that page out pretty easily. Unfortunately with the skills I have, I can't help other than that.
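For reference, a minimal sketch of that approach (the request package is deprecated these days, but this is roughly what it looked like):

// npm install request
var request = require('request');

request('http://eternagame.wikia.com/wiki/EteRNA_Dictionary', function (error, response, body) {
  if (error) {
    return console.error(error);
  }
  // request parses the URL and follows redirects for you
  console.log(response.statusCode);
  console.log(body);
});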
I got this error when going from a development environment to a production environment. I had been obsessed with putting https:// on all links. This is not necessary, so it may be a solution for some.
I was getting the same error and used the link below for help:
https://nodejs.org/api/http.html#http_http_request_options_callback
What I was missing in my code was
req.end();
(Node.js v5.4.0). Once I added the req.end(); line, I got rid of the error and it worked fine for me.
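For context, http.request() does not send the request until you call end() on it (http.get() calls end() for you), so a minimal sketch looks like this:

var http = require('http');

var req = http.request({ host: 'eternagame.wikia.com', path: '/wiki/EteRNA_Dictionary' }, function (res) {
  console.log('status: ' + res.statusCode);
  res.resume(); // drain the response
});

req.on('error', function (err) {
  console.error(err);
});

// without this, the request is never actually sent
req.end();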
Try using the server IP address rather than the hostname.
This worked for me. Hope it will work for you too.
I got rid of the http:// and the extra slash (/). I just used 'node-test.herokuapp.com' and it worked.
If you are still facing this, check your proxy settings. In my case the proxy settings were missing and I was not able to make the request, since direct http/https traffic is blocked; I had to configure my organization's proxy when making the request.
npm install https-proxy-agent
or
npm install http-proxy-agent
const httpsProxyAgent = require('https-proxy-agent');

const agent = new httpsProxyAgent("http://yourorganization.proxy.url:8080");

const options = {
  hostname: 'encrypted.google.com',
  port: 443,
  path: '/',
  method: 'GET',
  agent: agent
};
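To round that off, the options (including the agent) are then passed to the https module as usual; a quick sketch assuming the same proxy setup as above:

const https = require('https');

// same `options` object (with `agent`) as defined above
const req = https.request(options, (res) => {
  let body = '';
  res.on('data', (chunk) => { body += chunk; });
  res.on('end', () => console.log(res.statusCode, body.length + ' bytes'));
});

req.on('error', (err) => console.error(err));
req.end();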
I got this issue resolved by removing undesirable characters from the password for the connection. For example, I had these characters: <##%, and they caused the problem (most probably the hash character was the root cause).
My problem was that we were parsing the URL and generating http_options for http.request().
I was using request_url.host, which already includes the port number along with the domain name, so I had to use request_url.hostname instead.
var request_url = new URL('http://example.org:4444/path');
var http_options = {};
http_options['hostname'] = request_url.hostname;//We were using request_url.host which includes port number
http_options['port'] = request_url.port;
http_options['path'] = request_url.pathname;
http_options['method'] = 'POST';
http_options['timeout'] = 3000;
http_options['rejectUnauthorized'] = false;
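Those options are then used in the usual way; a brief sketch of the rest, with a placeholder payload of your own:

var http = require('http');

var req = http.request(http_options, function (res) {
  var body = '';
  res.on('data', function (chunk) { body += chunk; });
  res.on('end', function () { console.log(res.statusCode, body); });
});

req.on('error', function (err) { console.error(err); });
req.write('{}'); // replace with the actual POST body
req.end();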