Node.js watching a file

I want to print a message whenever the file I am watching changes. I am able to do that using console.log, but I couldn't figure out how to do it using response.write or similar functions.
var counter = 0;
const
    http = require('http'),
    fs = require('fs'),
    filename = process.argv[2];

var server = http.createServer(function(request, response) {
    response.writeHead(200, { 'Content-Type': 'text/plain' });
    counter = counter + 1;
    response.end('Hello client ' + Math.round(counter / 2));
});

server.on('listening', function() {
    var watcher = fs.watch(filename, function() {
        console.log('The file ' + filename + ' has just changed.');
    });
});

server.listen(8080);
Also, the reason I do Math.round(counter / 2) is that counter increases by 2 each time a client connects. I was wondering why this happens and whether there is a better technique to resolve it.

For you to be able to do it using response.write, it would need to be part of your server's request handler function.
File events can occur independently of someone sending a request, so their handling is independent of request handling. Because of this, there is no associated request for you to write to. (As for the counter: it most likely increases by 2 because the browser also requests /favicon.ico, which hits the same handler.)
If you want to keep track of all the file change events and then show them to the user whenever they do send a request, consider storing the information about changes in an object outside your handler functions; when a request takes place, read that object to see if there have been changes and write a response to the user based on it.
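A minimal sketch of that approach, reusing the filename variable from the question's code:

var http = require('http');
var fs = require('fs');
var filename = process.argv[2];

// Changes recorded by the watcher, read later by the request handler.
var changes = [];

fs.watch(filename, function() {
    changes.push('The file ' + filename + ' changed at ' + new Date().toISOString());
});

http.createServer(function(request, response) {
    response.writeHead(200, { 'Content-Type': 'text/plain' });
    response.end(changes.length === 0 ? 'No changes yet.' : changes.join('\n'));
}).listen(8080);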

If you want to inform an end user that the file has changed, for example in a web browser, then you have a number of options, including polling the server or using WebSockets.
I would recommend you take a look at Server Sent Events.
They are easy to implement, and there are npm modules out there to make them even easier in node.js, e.g. npm sse.
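For illustration, here is a hand-rolled Server Sent Events endpoint (writing the wire format directly rather than using the sse module) hooked up to the file watcher:

var http = require('http');
var fs = require('fs');
var filename = process.argv[2];

http.createServer(function(request, response) {
    // SSE is just a long-lived HTTP response with a special content type.
    response.writeHead(200, {
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache',
        'Connection': 'keep-alive'
    });

    var watcher = fs.watch(filename, function() {
        // Each event is a "data:" line terminated by a blank line.
        response.write('data: The file ' + filename + ' has just changed.\n\n');
    });

    // Stop watching when the client goes away.
    request.on('close', function() {
        watcher.close();
    });
}).listen(8080);

In the browser, new EventSource('http://localhost:8080/') will then receive a message event for every change.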

You can try the node module chokidar:
https://github.com/paulmillr/chokidar
It is a great module for file watching.
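A sketch of its basic usage:

var chokidar = require('chokidar');

// Watch a single file and react to change events.
chokidar.watch('some/file.txt').on('change', function(path) {
    console.log('File ' + path + ' has been changed');
});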

Related

Node file system

Can someone explain how, in res.write(html), the html parameter maps to ./index.html?
Here is the code. Am I not understanding how callback functions work?
var http = require('http');
var fs = require('fs');
var host = 'localhost';
var port = '8888';

fs.readFile('./index.html', function(err, html){
    if (err) {
        console.log(err);
        return;
    }
    var server = http.createServer(function(req, res){
        res.statusCode = 200;
        res.setHeader('Content-Type', 'text/html');
        res.write(html);
        res.end();
    });
    server.listen(port, host, function(){
        console.log('Server running on port ' + port);
    });
});
This code says to run fs.readFile('./index.html', ...) to get the file './index.html' into memory. When the file is done being read into memory, call the callback that you passed it and put the contents into the function parameter you named html. At any point inside that callback function, you can refer to the html function parameter and it will contain the contents of the './index.html' file that was read from disk.
Then after that, create your server and define a request handler callback for it that will be called each time an incoming request is received by your web server.
That callback will then send the data in that html parameter as the response to that incoming request.
Then, start that server.
It's kind of an odd way to write things, but there's nothing technically wrong with it.
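For comparison, a sketch of the more common arrangement, which creates the server at the top level and reads the file inside the request handler (at the cost of a disk read per request):

var http = require('http');
var fs = require('fs');

var server = http.createServer(function(req, res) {
    // Read the file per request instead of caching it at startup.
    fs.readFile('./index.html', function(err, html) {
        if (err) {
            res.statusCode = 500;
            res.end('Error loading index.html');
            return;
        }
        res.statusCode = 200;
        res.setHeader('Content-Type', 'text/html');
        res.end(html);
    });
});

server.listen('8888', 'localhost', function() {
    console.log('Server running on port 8888');
});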
Note that the http.createServer() callback is nested inside the other callback. In JavaScript, you have access not only to your own local parameters and variables, but also to the parameters and local variables of any parent functions you are nested inside of.
Am I not understanding how callback functions work?
I don't know what you do and don't understand about callback functions. In both the fs.readFile() and http.createServer() cases, these are callbacks that will be called sometime in the future when some operation completes. In the fs.readFile() case, its callback is called when the file contents have been entirely read into memory. In the http.createServer() case, the callback is called whenever any incoming request is received by the web server.

NodeJS respond to http.ServerResponse via stream

I'm currently experimenting with Node.js streams and had a deeper look into the http.ServerResponse object that is created for every HTTP request handled by the http.createServer request handler.
What I'm now trying to do is have the exact same API of the http.ServerResponse object in a different process, connected via some other arbitrary method (for instance streams), and pipe all output of this object (including headers!) to the actual request, like the following:
-[http]-> server1 -[stream]-> server2 -[stream]-> server1 -[http]->
I've tried a couple of variants, like the following (only local) example:
var http = require('http');
var net = require('net');
var through = require('through');
var StreamWrap = require('_stream_wrap').StreamWrap;

http.createServer(function(req, _res) {
    var resStream = through();
    resStream.pipe(_res.socket);
    resStream.__proto__.cork = function() {};
    resStream.__proto__.uncork = function() {};
    var resSocket = new StreamWrap(resStream);

    var res = new http.ServerResponse(req);
    res.assignSocket(resSocket);

    res.writeHead(201, {'Content-Type': 'text/plain'});
    res.write('hello');
    res.end();
}).listen(8000, function() {
    console.log("Server listening");
});
which should essentially send the raw HTTP message to the underlying socket (or not?), but for some reason I'm getting Segmentation fault: 11. I'm also not sure whether I'm using the StreamWrap object correctly. In any case, I think this is not an optimal solution, as the ServerResponse object handles a lot of things with regard to the socket internally.
So what would be the best way to tackle this?
If I pipe directly to the response object, this results only in writing to the HTTP body (see also this answer), and I'm losing the header information. So would it be best to separate the headers and body and then transfer them separately? How do I know when the headers have been finalized on the ServerResponse object, and when the data starts (apart from other issues such as trailing headers)?
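(For reference, that body-only behavior is the normal piping pattern; headers still have to go through the ServerResponse API:)

var http = require('http');
var fs = require('fs');

http.createServer(function(req, res) {
    // Headers are set through the ServerResponse API;
    // the piped stream only ever becomes the response body.
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    fs.createReadStream('./some-file.txt').pipe(res);
}).listen(8000);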
Another option would be to remotely call methods on the ServerResponse object, e.g. via dnode, but isn't there a better way to do this? I would really love to be able to use Express, for instance, on the remotely connected server; that's why I want to keep the http.ServerResponse API.
I'm glad for any input!
Thanks!

Sending emails without using mail server from customer premises

I have software running on customer servers on premises. There are multiple programs, and I want an email sent to me on failure of any of them.
It can be a pain enabling and configuring this to work with the customers' mail servers.
I thought of writing a simple socket program in Node.js to read the error log file and push those messages to my server, which would handle sending the email,
or maybe a web service to call for sending the email.
If anyone has used something like this, please tell me, or does an easy solution already exist somewhere?
Updating my question:
As per the comments, I tried to implement that solution. Here is my main Node.js server file; where exactly I am facing a problem now is in the socket event emit. I want to emit a socket event whenever the log.xml file changes, but this runs only one time.
var app = require('http').createServer(handler),
    io = require('socket.io').listen(app),
    parser = require('xml2json'),
    fs = require('fs');

app.listen(8030);
console.log('server listening on localhost:8030');

// creating a new websocket to keep the content updated without REST call
io.sockets.on('connection', function (socket) {
    console.log(__dirname);
    // reading the log file
    fs.readFile(__dirname + '/var/home/apache/log.xml', function (err, data) {
        if (err)
            throw err;
        // parsing the new xml data and converting them into json file
        var json = parser.toJson(data);
        // send the new data to the client
        socket.emit('error', json);
    });
});
/* Email send service. This code runs on my client server, outside of the main socket server cloud.
   This part is working fine; I tested it on my different server.

var socket = io.connect('http://localhost:8030');
socket.on('error', function (data) {
    // convert the json string into a valid javascript object
    var _data = JSON.parse(data);
    mySendMailTest(_data);
});
*/
Please excuse me, as I am new to the Stack Overflow community.
I think there is no problem in your socket code, but you need to use fs.watchFile before reading the file. It is a watch function similar to Angular's watch: it will detect any change to your file and run a callback, in which you can emit the socket event.
https://nodejs.org/docs/latest/api/fs.html#fs_fs_watchfile_filename_options_listener
// creating a new websocket to keep the content updated without REST call
io.sockets.on('connection', function (socket) {
    console.log(__dirname);
    // watching the file
    fs.watchFile(__dirname + '/var/home/apache/log.xml', function (curr, prev) {
        // on file change just read it
        fs.readFile(__dirname + '/var/home/apache/log.xml', function (err, data) {
            if (err)
                throw err;
            // parsing the new xml data and converting them into json file
            var json = parser.toJson(data);
            // send the new data to the client
            socket.emit('error', json);
        });
    });
});

Automatically Refresh an 'app.get()' in Node.js

I tried to find this topic everywhere, unsuccessfully.
I have a really simple app.get() that reads one piece of information from a variable that is constantly changing ("equip[id].status"). I want it to check from time to time whether the variable has changed.
Another option is to keep track of this variable and, when it changes, run app.get() again.
My code:
This is the one I want to refresh:
app1.get("/", function(req,res){
id = 1;
res.render(__dirname + '/connected.html', {equipment: 'Esteira 1', status: equip[id].status, historico: equip[id].hist});
});
And this are the ones that change "status"
app1.get("/open", function(req,res){
id = 1;
equip[id].status = 'Conectado';
equip[id].hist = equip[id].hist + equip[id].status;
var now = new Date();
time = dateFormat(now,"h:MM:ss");
console.log(time+" - Equipment 'Esteira 1' has connected on port 3001");
res.end
});
app1.get("/closed", function(req,res){
id = 1;
equip[id].status = 'Desconectado';
equip[id].hist = equip[id].hist + equip[id].status;
var now = new Date();
time = dateFormat(now,"h:MM:ss");
console.log(time+" - Equipment 'Esteira 1' has disconnected");
res.end
});
app.get() is server-side code, and on its own it has no power to change the client side.
On the client side, you need to employ JavaScript code to either poll the server regularly (ajax) or actively listen to the server for changes through WebSockets. That way you can also choose to either refresh the whole page or just load the relevant changes (like this very site does!).
You should look into these relevant technologies: JavaScript, ajax, long-polling, socket.io.
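As a rough illustration of the polling approach (the /status route and the id="status" element are assumptions for this sketch, not part of your code):

// Server side: expose the changing variable as JSON so the client can poll it.
app1.get('/status', function(req, res) {
    var id = 1;
    res.json({ status: equip[id].status });
});

// Client side (e.g. in connected.html): poll every 2 seconds and update the page.
setInterval(function() {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/status');
    xhr.onload = function() {
        if (xhr.status === 200) {
            document.getElementById('status').textContent =
                JSON.parse(xhr.responseText).status;
        }
    };
    xhr.send();
}, 2000);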

Socket.IO & private messages

This must have been asked already a thousand times, but I do not find any of the answers satisfying, so I'll try having another go, being as clear as possible.
I am starting out with a clean Express install, the kind that is usually created via the following terminal commands:
user$ express
user$ npm install
then I proceed to install socket.io, this way:
user$ npm install socket.io --save
In my main js file I then have the following:
//app.js
var express = require('express'),
    http = require('http'),
    path = require('path'),
    io = require('socket.io'),
    routes = require('./routes');

var app = express();
I start my socket.io server by attaching it to my express one:
//app.js
var server = http.createServer(app).listen(app.get('port'), function(){
    console.log('express server started!');
});
var sIo = io.listen(server);
What I do now is set the usual routes for Express to work with:
//app.js
app.get('/', routes.index);
app.get('/send/:recipient/:text', routes.sendMessage);
Now, since I like to keep things organized, I want to put my socket.io code in another file. So, instead of using the usual code:
//app.js
sIo.sockets.on('connection', function(socket){
    console.log('got a connection');
});
I use the following to be able to access both the socket and the sIo object (as that object contains all the connection info (important)):
//app.js
sIo.sockets.on('connection', function(socket){
    routes.connection(sIo, socket);
});

// index.js (./routes)
exports.connection = function(sIo, socket){
    console.log('got a connection.');
};
This way I can do all my socket.io jobs in here. I know that I can now access all my clients' information from the sIo object, but of course, it does not contain any of their session data.
My questions now are the following:
Suppose a user makes an HTTP request to send a message and the handler in my routes is like this:
exports.sendMessage = function(req, res){
    //do stuff here
};
How can I get this to "fire" something in my socket.io to send a message? I do not want to know all the underlying work that needs to be done, like keeping track of messages, users, etc. I only want to understand how to "fire" socket.io to do something.
How can I make sure that socket.io sends the message only to one particular person, and be 100% sure that nobody else gets it? From what I can see, there is no way to get the session info from the sIo object.
Thanks in advance.
Question one: the cleanest way to separate the two would probably be to use an EventEmitter. You create an EventEmitter that emits when an HTTP message comes in. You can pass session information along with the event to tie it back to the user who sent the message, if necessary.
// index.js (./routes)
var EventEmitter = require('events').EventEmitter;
module.exports.messageEmitter = messageEmitter = new EventEmitter();

module.exports.sendMessage = function(req, res) {
    messageEmitter.emit('private_message', req.params.recipient, req.params.text);
};
Question two: you can access the socket when the initial connection is made. An example, mostly borrowed from this answer:
var connect = require('connect'),
    userMap = {};

routes.messageEmitter.on('private_message', function(recipient, text) {
    userMap[recipient].emit('private_message', text);
});

io.on('connection', function(socket_client) {
    var cookie_string = socket_client.request.headers.cookie;
    var parsed_cookies = connect.utils.parseCookie(cookie_string);
    var connect_sid = parsed_cookies['connect.sid'];
    if (connect_sid) {
        session_store.get(connect_sid, function (error, session) {
            userMap[session.username] = socket_client;
        });
    }
    socket_client.on('private_message', function(username, message) {
        userMap[username].emit('private_message', message);
    });
});
So we're just creating a map between a session's username and a socket connection. Now, whenever you need to send a message, you can easily look up which socket is associated with that user and send it to them over their socket. Just make sure to handle disconnects, reconnects, connecting in multiple tabs, etc.
I have built something like what you are describing. If a user can make a socket request, the message is pushed via the socket, and the server then broadcasts or emits it. But if a user can't connect to the socket, it falls back to the HTTP POST, like what you describe with sendMessage. Rather than having sendMessage fire off a socket event, I also have my clients make an ajax request every 5 seconds or so. That brings back new messages, and any that were not received via socket.io get added to my client-side array. This acts as a safety net, so I don't have to fully trust socket.io.
See below, in pseudocode:
client

if canSendSocketMessage()
    sendSocketMessage(message)
else
    sendAjaxMessage(message)

setInterval( ->
    // ajax call
    getNewMessages()
), 5000

server

// socket stuff
socket.on 'message' ->
    saveMessage()
    socket.emit(message)

// ajax endpoints
app.post 'sendMessage'
    saveMessage()

app.get 'getNewMessages'
    res.send getNewMessages()
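A rough JavaScript rendering of that pseudocode; canSendSocketMessage, saveMessage, and getNewMessages are placeholders for your own logic, not real APIs:

// Client side: prefer the socket, fall back to ajax, and poll as a safety net.
if (canSendSocketMessage()) {
    sendSocketMessage(message);
} else {
    sendAjaxMessage(message); // e.g. POST to /sendMessage
}

setInterval(function() {
    // ajax call: pick up anything that slipped past socket.io
    getNewMessages();
}, 5000);

// Server side: the socket path and the ajax endpoints mirror each other.
io.sockets.on('connection', function(socket) {
    socket.on('message', function(message) {
        saveMessage(message);
        socket.broadcast.emit('message', message);
    });
});

app.post('/sendMessage', function(req, res) {
    saveMessage(req.body);
    res.end();
});

app.get('/getNewMessages', function(req, res) {
    res.send(getNewMessages());
});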
