Node file system - node.js

Can someone explain how, in res.write(html), the html parameter maps to ./index.html?
Here is the code. Am I not understanding how callback functions work?
var http = require('http');
var fs = require('fs');

var host = 'localhost';
var port = '8888';

fs.readFile('./index.html', function (err, html) {
    if (err) {
        console.log(err);
        return;
    }
    var server = http.createServer(function (req, res) {
        res.statusCode = 200;
        res.setHeader('Content-Type', 'text/html');
        res.write(html);
        res.end();
    });
    server.listen(port, host, function () {
        console.log('Server running on port ' + port);
    });
});

This code says: run fs.readFile('./index.html', ...) to read the file './index.html' into memory. When the file has been read, call the callback you passed and put the contents into the function parameter you named html. Anywhere inside that callback, you can refer to the html parameter and it will contain the contents of the './index.html' file that was read from disk.
Then after that, create your server and define a request handler callback for it that will be called each time an incoming request is received by your web server.
That callback will then send the data in that html parameter as the response to that incoming request.
Then, start that server.
It's kind of an odd way to write things, but there's nothing technically wrong with it.
Note that the http.createServer() callback is nested inside the other callback. In JavaScript, you have access not only to your own local parameters and variables, but also to the parameters and local variables of any parent functions you are nested inside of.
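For example, here's a minimal sketch of that scoping rule (the file name is just a placeholder):
var fs = require('fs');

fs.readFile('./greeting.txt', function (err, greeting) {
    if (err) return console.log(err);
    // `greeting` is a parameter of the outer callback, but it stays
    // accessible to any function nested inside it, even one that
    // runs later, like this timer callback.
    setTimeout(function () {
        console.log('still in scope: ' + greeting);
    }, 1000);
});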
Am I not understanding how callback functions work?
I don't know what you do and don't understand about callback functions. In both the fs.readFile() and http.createServer() cases, these are callbacks that will be called sometime in the future when some operation completes. In the fs.readFile() case, its callback is called when the file contents have been entirely read into memory. In the http.createServer() case, the callback is called whenever any incoming request is received by the web server.

Related

Is it okay to not send a response to URLs that people are pentesting on my node/express site?

I log all 404s on my website. I keep getting them for pages I haven't linked to; it's clearly someone (a bot) trying to find admin pages / secure files on my site, such as /wp-admin.php:
router.get('/wp-admin.php', function(req, res, next) {});
I tried this, and it doesn't seem to hold up the server; it just outputs something like this a minute later:
GET /wp-admin.php - - ms - -
Is there any detriment to adding routes such as that, where no response is sent, possibly wasting their time?
router.get('/wp-admin.php', function(req, res, next) {});
This will cause Express to leave the connection open until it times out and is closed. That makes a Denial of Service attack easier for attackers and can jam up your Node server.
You can always use some kind of rate limiter to prevent continuous requests from a certain IP.
express-rate-limit
can be used for this. It is simple Express middleware.
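A minimal sketch of the wiring (the windowMs and max options follow the package's documented settings, but check the version you install):
var express = require('express');
var rateLimit = require('express-rate-limit');

var app = express();

// Allow at most 100 requests per 15-minute window from a single IP.
var limiter = rateLimit({
    windowMs: 15 * 60 * 1000,
    max: 100
});

app.use(limiter);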
As noted in the already accepted answer, an Express route like that will leave you vulnerable.
I recommend going one step further and tearing down those requests using req.destroy.
I'm not sure of the implications of Express being included here, though. For example, is the request body being read automatically by a middleware upstream of the request handler you've shown? If so, that would be an attack vector that makes the mitigation I'm suggesting useless.
Regardless, to demonstrate what I am suggesting with a vanilla HTTP server:
var h = require('http')

h.createServer(function (req, res) {
    // tear down the socket as soon as the request event is emitted
    req.destroy()
}).listen(8888, function () {
    // send a request to the server we just created
    var r = h.request({ port: 8888 })
    r.on('response', console.log.bind(console, 'on_response'))
    r.on('error', console.log.bind(console, 'on_error'))
    r.on('timeout', console.log.bind(console, 'on_timeout'))
    // abort will be emitted to the caller, but nothing else
    r.on('abort', console.log.bind(console, 'on_abort'))
    r.end()
})
You could also call socket.destroy in the connection event of the HTTP server if you're able to identify the calling agent as a bot (or whatever) somehow.
var h = require('http')

h.createServer(function (req, res) {
    // plain http has no res.send(); end the response directly
    res.end('foo')
}).on('connection', function (socket) {
    // pretend this ip address is the remote address of an attacker, for example
    if (socket.remoteAddress === '10.0.0.0') {
        socket.destroy()
    }
}).listen(8888, function () {
    // send a request to the server we just created
    var r = h.request({ port: 8888 })
    r.on('response', console.log.bind(console, 'on_response'))
    r.on('error', console.log.bind(console, 'on_error'))
    r.on('timeout', console.log.bind(console, 'on_timeout'))
    // abort will be emitted to the caller, but nothing else
    r.on('abort', console.log.bind(console, 'on_abort'))
    r.end()
})

NodeJS respond to http.ServerResponse via stream

I'm currently experimenting with NodeJS streams and had a deeper look into the http.ServerResponse object that's being created upon every http request of the http.createServer request handler.
What I'm now trying to do is to have the exact same API of the http.ServerResponse object in a different process, connected via some other arbitrary method (for instance streams), and pipe all output of this object (including headers!) to the actual request, like the following:
-[http]-> server1 -[stream]-> server2 -[stream]-> server1 -[http]->
I've tried a couple of variants, like the following (only local) example:
var http = require('http');
var net = require('net');
var through = require('through');
var StreamWrap = require('_stream_wrap').StreamWrap;

http.createServer(function (req, _res) {
    var resStream = through();
    resStream.pipe(_res.socket);
    resStream.__proto__.cork = function () {};
    resStream.__proto__.uncork = function () {};
    var resSocket = new StreamWrap(resStream);
    var res = new http.ServerResponse(req);
    res.assignSocket(resSocket);
    res.writeHead(201, { 'Content-Type': 'text/plain' });
    res.write('hello');
    res.end();
}).listen(8000, function () {
    console.log("Server listening");
});
which should essentially send the raw HTTP message to the underlying socket (or not?), but for some reason I'm getting a Segmentation fault: 11. I'm also not sure whether I'm using the StreamWrap object correctly. But I think this is not an optimal solution anyway, as the ServerResponse object handles a lot of things internally with regard to the socket.
So what would be the best way to tackle this?
If I'm piping directly to the response object, this results only in writing to the HTTP body (see also this answer), and I'm losing the header information. So would it be best to separate headers and body and transfer them separately? How do I know when the headers are finalized on the ServerResponse object and when the data starts (apart from other options such as trailing headers)?
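To illustrate the first point, here's a minimal sketch (the file name is a placeholder): piping into the response only supplies body bytes, while the status line and headers still come from the response object itself.
var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
    // The piped stream only feeds the body; the status line and
    // headers are still produced by `res` itself.
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    fs.createReadStream('./body.txt').pipe(res);
}).listen(8001);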
Another option would be to remotely call methods on the ServerResponse object, e.g. via dnode, but isn't there a better way to do this? I would really love to be able to use Express, for instance, on the remotely connected server; that's why I want to keep the http.ServerResponse API.
I'm glad for any input!
Thanks!

Sending emails without using mail server from customer premises

I have software running on customer servers on premises. There are multiple pieces of software, and on failure of any of them I want an email to be sent to me.
It can be a pain enabling and configuring this to work with the customers' mail servers.
I thought to write a simple socket program in NodeJS that reads the error log file and pushes those messages to my server, which would handle sending the email,
or maybe a web service to call for sending the email.
If anyone has used something like this, please tell me, or does an easy solution already exist somewhere?
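Something like this minimal sketch is what I have in mind (the host and path are hypothetical):
var http = require('http');

// Push one error line to a central collector that handles the email.
function pushError(line) {
    var req = http.request({
        host: 'collector.example.com',
        port: 8030,
        path: '/errors',
        method: 'POST',
        headers: { 'Content-Type': 'text/plain' }
    });
    req.on('error', function (err) {
        console.log('push failed: ' + err.message);
    });
    req.end(line);
}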
Updating my question
As per the comments I tried to implement this solution. Here is my main NodeJS server file; where exactly I am facing a problem now is in the socket event emit. I want to emit a socket event whenever the log.xml file changes, but this runs only one time.
var app = require('http').createServer(handler),
    io = require('socket.io').listen(app),
    parser = new require('xml2json'),
    fs = require('fs');

app.listen(8030);
console.log('server listening on localhost:8030');

// creating a new websocket to keep the content updated without REST call
io.sockets.on('connection', function (socket) {
    console.log(__dirname);
    // reading the log file
    fs.readFile(__dirname + '/var/home/apache/log.xml', function (err, data) {
        if (err)
            throw err;
        // parsing the new xml data and converting them into json file
        var json = parser.toJson(data);
        // send the new data to the client
        socket.emit('error', json);
    });
});

/* Email send service. This code runs on my client server, outside of the
   main socket server. This part is working fine; I tested it on a
   different server.

var socket = io.connect('http://localhost:8030');
socket.on('error', function (data) {
    // convert the json string into a valid javascript object
    var _data = JSON.parse(data);
    mySendMailTest(_data);
});
*/
Please excuse me, as I am new to the Stack Overflow community.
I think there is no problem in your socket code; you need to use fs.watchFile before reading the file. It is a watch function similar to an Angular watch: it will detect any change to your file and run another function in a callback, where you can emit the socket event.
https://nodejs.org/docs/latest/api/fs.html#fs_fs_watchfile_filename_options_listener
// creating a new websocket to keep the content updated without REST call
io.sockets.on('connection', function (socket) {
    console.log(__dirname);
    // watching the file
    fs.watchFile(__dirname + '/var/home/apache/log.xml', function (curr, prev) {
        // on file change just read it
        fs.readFile(__dirname + '/var/home/apache/log.xml', function (err, data) {
            if (err)
                throw err;
            // parsing the new xml data and converting them into json file
            var json = parser.toJson(data);
            // send the new data to the client
            socket.emit('error', json);
        });
    });
});

Node.js watching a file

I want to print a message whenever the file that I am watching has changed. I am able to do that using console.log but I couldn't figure out how to do that using response.write or similar functions.
var counter = 0;
const
    http = require('http'),
    fs = require('fs'),
    filename = process.argv[2];

var server = http.createServer(function (request, response) {
    response.writeHead(200, { 'Content-Type': 'text/plain' });
    counter = counter + 1;
    response.end('Hello client ' + Math.round(counter / 2));
});

server.on('listening', function () {
    var watcher = fs.watch(filename, function () {
        console.log('The file ' + filename + ' has just changed.');
    });
});

server.listen(8080);
Also, the reason I do Math.round(counter / 2) is that counter increases by 2 each time a client connects. I was wondering why this happens and whether there is a better technique to resolve it.
For you to be able to do it using response.write it would need to be part of your server request handler function.
File events can occur independently of someone sending a request, so it's handling is independent to that of handling a request. Because of this, there is no associated request for you to write to.
If you want to keep track of all the file change events and then show them to the user whenever they do send a request, consider storing the information about changes in an object outside your handler functions; when a request takes place, read that object to see if there have been changes and write a response to the user based on it.
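A minimal sketch of that idea (the structure and names are just for illustration):
var http = require('http');
var fs = require('fs');
var filename = process.argv[2];

// Shared state: written by the watcher, read by the request handler.
var changes = [];

fs.watch(filename, function () {
    changes.push('The file ' + filename + ' changed at ' + new Date().toISOString());
});

http.createServer(function (request, response) {
    response.writeHead(200, { 'Content-Type': 'text/plain' });
    // Report everything that happened since the last request, then reset.
    response.end(changes.length ? changes.join('\n') : 'No changes yet.');
    changes = [];
}).listen(8080);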
If you want to inform an end user that the file has change, for example in a web browser, then you have a number of options, including polling the server or using websockets.
I would recommend you take a look at Server-Sent Events.
They are easy to implement, and there are npm modules out there to make it even easier in Node.js, e.g. npm sse.
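For example, a minimal sketch with the plain http module (no extra package), reusing the filename-from-argv setup from the question:
var http = require('http');
var fs = require('fs');
var filename = process.argv[2];

http.createServer(function (request, response) {
    // text/event-stream tells the browser (via EventSource) to keep the
    // connection open and parse "data:" lines as events.
    response.writeHead(200, {
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache'
    });
    var watcher = fs.watch(filename, function () {
        response.write('data: The file ' + filename + ' has just changed.\n\n');
    });
    // Stop watching when the client goes away.
    request.on('close', function () {
        watcher.close();
    });
}).listen(8080);
On the browser side this is consumed with new EventSource('/').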
You can try the node module chokidar
https://github.com/paulmillr/chokidar
It is a great module for file watching.
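A minimal usage sketch (the path is a placeholder):
var chokidar = require('chokidar');

// Watch one file and log every change event.
chokidar.watch('path/to/watched-file.txt').on('change', function (path) {
    console.log('The file ' + path + ' has just changed.');
});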

How do I store request-level variables in node.js?

For data that only needs to be available during an individual request, where should it be stored?
I am creating new properties on the req and res objects so I don't have to pass that data from function to function.
req.myNewValue = 'just for this request'
Is the process object an option? Or is it shared globally across all requests?
In Express 4, the best practice is to store request level variables on res.locals.
An object that contains response local variables scoped to the request, and therefore available only to the view(s) rendered during that request / response cycle (if any). Otherwise, this property is identical to app.locals.
This property is useful for exposing request-level information such as the request path name, authenticated user, user settings, and so on.
app.use(function (req, res, next) {
    res.locals.user = req.user;
    res.locals.authenticated = !req.user.anonymous;
    next();
});
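Anything set that way stays visible for the rest of the same request/response cycle, for example in a later handler (the user.name property here is hypothetical):
app.get('/profile', function (req, res) {
    // Set by the middleware above; still visible because this is the
    // same request/response cycle.
    res.send('Hello ' + res.locals.user.name);
});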
The process object is shared by all requests and should not be used per request.
If you are talking about the variable passed like this:
http.createServer(function (req, res) {
    req.myNewValue = 'just for this request';
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end('Hello World\n');
}).listen(1337, '127.0.0.1');
then what you are doing is perfectly fine. req stores the request data; you can modify it as you want. If you are using some framework like Express, then it should be fine as well (keep in mind that you may overwrite some built-in properties of the req object).
If by "process object" you are refering to the global variable process, then absolutely not. The data here is global and shouldn't be modified at all.
If you want to preserve data across async callbacks, there could be scenarios where the request and response objects are not available. In that case the continuation-local-storage package is helpful.
It is used to access data, or the current Express request/response, from a point where they are not readily accessible. It uses the concept of a namespace.
Here is how I set this up.
Install the continuation-local-storage package
npm install continuation-local-storage --save
Create a namespace:
let app = express();
let cls = require('continuation-local-storage');
let namespace = cls.createNamespace('com.domain');
Then add the middleware:
app.use((req, res, next) => {
    var namespace = cls.getNamespace('com.domain');
    // wrap the events from request and response
    namespace.bindEmitter(req);
    namespace.bindEmitter(res);
    // run following middleware in the scope of the namespace we created
    namespace.run(function () {
        // set data on the namespace, makes it available for all continuations
        namespace.set('data', "any_data");
        next();
    });
});
Now in any file or function you can get this namespace and use the data saved in it:
//logger.ts
var getNamespace = require("continuation-local-storage").getNamespace;
let namespace = getNamespace("com.domain");
let data = namespace.get("data");
console.log("data : ", data);
No, it isn't shared across all requests; it only persists for the duration of that request.
