Tracking all the requests and responses of a server using Node.js

I am a beginner in Node.js app development. I have the following requirement:
A web app runs on server-A on one host. Another service, to be implemented and run on a different host, should intercept all the incoming requests to and outgoing responses from the web app running on server-A. All these details have to be stored in a file.
I'm looking for an example or thoughts on how to implement the intercepting service. Thank you.

Take a look at socket.io; it's a socket client and server for Node.js (here is a demo: http://socket.io/get-started/chat/).
You can use socket.io to send the data between both apps, and then write the data to a file using the fs module:
var fs = require('fs');
fs.writeFile(__dirname + "/file.txt", "Hello World!", function (err) {
  if (err) {
    return console.log(err);
  }
  console.log("File saved!");
});
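For the intercepting service itself, a minimal sketch is a small reverse proxy built on Node's http module that logs every request and response before forwarding it to server-A. This assumes clients can be pointed at the interceptor instead of server-A directly; the target host, ports, and log file name below are made up.

// intercept-proxy.js - minimal logging reverse proxy (sketch)
var http = require('http');
var fs = require('fs');

var TARGET_HOST = 'server-a.example.com'; // hypothetical host of server-A
var TARGET_PORT = 80;
var log = fs.createWriteStream(__dirname + '/proxy.log', { flags: 'a' });

http.createServer(function (clientReq, clientRes) {
  // record the incoming request
  log.write('REQUEST  ' + new Date().toISOString() + ' ' +
            clientReq.method + ' ' + clientReq.url + '\n');

  // forward the request to server-A
  var proxyReq = http.request({
    host: TARGET_HOST,
    port: TARGET_PORT,
    method: clientReq.method,
    path: clientReq.url,
    headers: clientReq.headers
  }, function (proxyRes) {
    // record the outgoing response
    log.write('RESPONSE ' + new Date().toISOString() + ' ' +
              proxyRes.statusCode + ' for ' + clientReq.url + '\n');
    clientRes.writeHead(proxyRes.statusCode, proxyRes.headers);
    proxyRes.pipe(clientRes); // forward the response body unchanged
  });

  clientReq.pipe(proxyReq); // forward the request body unchanged
}).listen(8080);

Each line appended to proxy.log records the timestamp, method and URL of the request and the status code of the matching response; you could extend it to log headers or bodies as well.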


What is the most efficient way of sending files between NodeJS servers?

Introduction
Say that on the same local network we have two Node.js servers set up with Express: Server A for the API and Server F for the form.
Server A is an API server that takes the request and saves it to a MongoDB database (files are stored as a Buffer and their details as other fields).
Server F serves up a form, handles the form post and sends the form's data to Server A.
What is the most efficient way to send files between two Node.js servers where the receiving server is an Express API? And where does the file size matter?
1. HTTP Way
If the files I'm sending are PDF files (that won't exceed 50 MB), is it efficient to send the whole contents as a string over HTTP?
The algorithm is as follows:
Server F handles the file request using https://www.npmjs.com/package/multer and saves the file
then Server F reads this file and makes an HTTP request via https://github.com/request/request along with some details on the file
Server A receives this request and turns the file contents from string to Buffer and saves a record in MongoDB along with the file details.
In this algorithm, both Server A (when storing into MongoDB) and Server F (when sending it over to Server A) have read the file into memory, and the request between the two servers was about the same size as the file. (Are 50 MB requests alright?)
However, one thing to consider is that, with this method, I would be using the Express style of API for the whole process, and it would be consistent with the rest of the app, where the /list and /details requests are also defined in the routes. I like consistency.
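As an illustration of this HTTP way, the Server F route could forward the upload roughly like the sketch below. The Server A URL and its /transactions endpoint are assumptions; note that passing fs.createReadStream to the request library streams the file into the multipart request instead of reading it fully into memory on Server F.

var express = require('express');
var multer = require('multer');
var fs = require('fs');
var request = require('request');
var router = express.Router();

// Server F (sketch): receive the form upload, then forward it to Server A
// as a multipart request together with the other form fields.
router.post('/', multer({dest: './uploads/'}).single('file'), function (req, res) {
  request.post({
    url: 'http://server-a.local:3000/transactions', // hypothetical Server A endpoint
    formData: {
      name: req.body.name,
      file: fs.createReadStream(req.file.destination + req.file.filename)
    }
  }, function (err, httpResponse, body) {
    if (err) return res.status(500).send(err);
    res.send(body); // relay Server A's answer back to the form client
  });
});

module.exports = router;

On the Server A side the file would arrive as a normal multipart upload and could be parsed again (for example with multer) before being stored in MongoDB.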
2. Socket.IO Way
In contrast to this algorithm, I've explored the https://github.com/nkzawa/socket.io-stream way, which breaks away from the consistency of the HTTP API on Server A (the handlers for socket.io events are defined not in the routes but in the file that has var server = http.createServer(app);).
Server F handles the form data as such in routes/some_route.js:
router.post('/', multer({dest: './uploads/'}).single('file'), function (req, res) {
  var api_request = {};
  api_request.name = req.body.name;
  //add other fields to api_request ...
  var has_file = req.hasOwnProperty('file');

  var io = require('socket.io-client');
  var transaction_sent = false;
  var socket = io.connect('http://localhost:3000');

  socket.on('connect', function () {
    console.log("socket connected to 3000");
    if (transaction_sent === false) {
      var ss = require('socket.io-stream');
      var stream = ss.createStream();
      ss(socket).emit('transaction new', stream, api_request);
      if (has_file) {
        var fs = require('fs');
        var filename = req.file.destination + req.file.filename;
        console.log('sending with file: ', filename);
        fs.createReadStream(filename).pipe(stream);
      }
      if (!has_file) {
        console.log('sending without file.');
      }
      transaction_sent = true;
      //get the response via socket
      socket.on('transaction new sent', function (data) {
        console.log('response from 3000:', data);
        //there might be a better way to close socket. But this works.
        socket.close();
        console.log('Closed socket to 3000');
      });
    }
  });
});
I said I'd be dealing with PDF files that are under 50 MB. However, if I use this program to send larger files in the future, is socket.io a better way to handle 1 GB files, since it uses streams?
This method does send the file and the details across, but I'm new to this library and don't know if it should be used for this purpose or if there is a better way of utilizing it.
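For context, the receiving end of this on Server A (outside the Express routes, next to var server = http.createServer(app);) would look roughly like the sketch below with socket.io-stream. The event names match the client code above; the upload path, the response payload and the omitted MongoDB save are assumptions.

var fs = require('fs');
var path = require('path');
var ss = require('socket.io-stream');
var io = require('socket.io')(server); // the same http server created from the Express app

io.on('connection', function (socket) {
  ss(socket).on('transaction new', function (stream, api_request) {
    // made-up naming scheme; in practice you'd derive it from api_request
    var filename = path.join(__dirname, 'uploads', Date.now() + '.pdf');
    stream.pipe(fs.createWriteStream(filename));
    stream.on('end', function () {
      // save api_request plus filename to MongoDB here (omitted)
      socket.emit('transaction new sent', { saved: true, file: filename });
    });
  });
});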
Final thoughts
What alternative methods should I explore?
Should I send the file over SCP and make an HTTP request with the file details, including where I've sent it, thus separating the protocol for files from the API requests?
Should I always use streams because they don't store the whole file in memory? (That's how they work, right?)
What about https://github.com/liamks/Delivery.js?
References:
File/Data transfer between two node.js servers (this got me to try the socket-stream way)
transfer files between two node.js servers over http (for the HTTP way)
There are plenty of ways to achieve this, but not so many to do it right!
Socket.IO and WebSockets are efficient when you use them with a browser, but since you don't, there is no need for them.
The first method you can try is the built-in net module of Node.js: basically it will make a TCP connection between the servers and pass the data.
You should also keep in mind that you need to send chunks of data, not the entire file; the socket.write method of the net module seems to be a good fit for your case. Check it out: https://nodejs.org/api/net.html
But depending on the size of your files and the concurrency, memory consumption can be quite large.
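A minimal sketch of that net-module approach is below; the port, host name and file names are assumptions. Using pipe means the file is written to the socket in chunks, so neither side holds the whole file in memory.

// On the receiving server (sketch): write incoming chunks straight to disk.
var net = require('net');
var fs = require('fs');

net.createServer(function (socket) {
  socket.pipe(fs.createWriteStream('received.pdf'));
}).listen(4000);

// On the sending server (sketch): stream the file over a raw TCP connection.
var client = net.connect(4000, 'server-a.local', function () {
  fs.createReadStream('./uploads/document.pdf').pipe(client);
});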
If you are running Linux on both servers you could even send the files at ground zero with a simple Linux command called scp:
nohup scp -rpC /var/www/httpdocs/* remote_user@remote_domain.com:/var/www/httpdocs &
You can even do this from Windows to Linux or the other way around.
http://www.chiark.greenend.org.uk/~sgtatham/putty/download.html
The scp client for Windows is pscp.exe.
Hope this helps!

Node.js http server: "getifaddres: Too many open files"

I'm currently running a Node.js server and using GazeboJS to connect to the Gazebo server in order to send messages.
The problem is:
From my searches it seems like it's due to the Linux open-file limit, which defaults to 1024 (using Ubuntu 14.04). Most solutions seem to be to increase the open-file limit.
However, I don't know why my script is opening files and not closing them. It seems like each HTTP request opens a connection which is not closed even though a response is sent? The HTTP requests are coming from a Lua script using async.
The error
getifaddres: Too many open files
occurs after exactly 1024 requests.
I have no experience with web servers, so I hope someone could give an explanation.
The details of the Node.js server I'm running:
The server is created using
http.createServer(function(req, res))
When an HTTP GET request is received, the response is sent as a string. Example of one response:
gazebo.subscribe(obj.states[select].type, obj.states[select].topic, function (err, msg) { // msg is a JSON object
  if (err) {
    console.log('Error: ' + err);
    return;
  }
  res.setHeader('Content-Type', 'text/html');
  res.end(JSON.stringify(msg));
  gazebo.unsubscribe(obj.states[select].topic);
});
The script makes use of the publish/subscribe topics in the Gazebo server to extract information or publish actions. More information about Gazebo communication is here.

WADO Protocol implemented in node.js

I am creating a very simple DICOM ECHO server with Node.js; however, I am facing a problem where the clients always report that they can't connect. I am unsure what I am missing. Does someone here have experience in writing a DICOM ECHO server?
This is the code I have
var net = require('net');

net.createServer(function (socket) {
  socket.on('data', function (data) {
    var datat = String.fromCharCode.apply(null, new Uint16Array(data));
    console.log(datat);
    socket.write(data);
    socket.end();
  });

  socket.on('error', function (error) {
    console.log("Caught server socket error: ");
    console.log(error.stack);
    console.log(error);
  });
}).listen(8041);
console.log('Server running at 127.0.0.1 on port 8041');
I have tried responding with the binary data and also with text data but neither one seems to work.
DICOM Echo is not as simple as a ping. You must implement a subset of the full stack of the DICOM network protocol. Instead of writing your own server with node.js, I would advise you to rely on an existing DICOM server. Orthanc is an example of a free DICOM server designed to act as a back-end service to Web applications. Orthanc has built-in support of DICOM C-Echo, which can be triggered by an AJAX request to its REST API (URI /modalities/{dicom}/echo).
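For illustration, triggering that C-Echo from Node.js could look like the sketch below. The Orthanc host, its default HTTP port 8042, the modality name 'sample', and the empty request body are assumptions; depending on the Orthanc version and its authentication settings you may need to adjust the request.

var http = require('http');

// Ask Orthanc to perform a DICOM C-Echo against a configured modality.
var req = http.request({
  host: 'localhost',
  port: 8042,              // Orthanc's default HTTP port (assumption)
  method: 'POST',
  path: '/modalities/sample/echo'
}, function (res) {
  console.log('C-Echo HTTP status:', res.statusCode); // 200 means the echo succeeded
});
req.end();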
Disclaimer: I am the author of Orthanc.

Need to know something regarding socket.io and redis and nginx

My goal is to build a chat application, similar to WhatsApp.
To my understanding, socket.io is a real-time communication library written in JavaScript, and it is very simple to use.
For example:
// Server side
io.on('connection', function (socket) {
  socket.on('chat', function (msg) {
    io.emit('chat', msg);
  });
});

// Client side (using jQuery)
var socket = io();

$('form').submit(function () {
  socket.emit('chat', $('#m').val());
  $('#m').val('');
  return false;
});

socket.on('chat', function (msg) {
  $('#messages').append($('<li>').text(msg));
});
1) Do I always need to start with io.on('connection') to use the real-time feature, or could I just use socket.on instead? For example, I have a route:
app.post('/postSomething', function (req, res) {
  // Do I need to start an io.on or socket.on here?
});
because I want the real-time feature to listen only on a specific route.
2) Redis is a data structure store which handles pub/sub; why do we need to use the pub/sub mechanism?
I read a lot of articles but couldn't grasp the concept. Example article: http://ejosh.co/de/2015/01/node-js-socket-io-and-redis-intermediate-tutorial-server-side/
For example, the code below:
// Do I need redis for this, and if so, why? Is it for caching purposes?
// Where does redis fit in this code?
var redis = require("redis");
var client = redis.createClient();

io.on('connection', function (socket) {
  socket.on('chat', function (msg) {
    io.emit('chat', msg);
  });
});
3) Just wondering why I need nginx to scale a Node.js application? I found this Stack Overflow answer:
Strategy to implement a scalable chat server
It says something about load balancing; I read about that online and couldn't grasp the concept either.
So far I have only been dealing with simple Node.js and Mongoose CRUD applications, but I'm willing to work really hard if you guys could share some of your knowledge and some useful resources, so that I could deepen my knowledge of all of these technologies.
Cheers!
Q. Socket.on without IO.on
io.on("connection" ... )
Is called when you receive a new connection. Socket.on listens to all the emits at the client side. If you want your client to act as a server for some reason then (in short) yes io.on is required
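If the goal is just to emit from a specific route, one common pattern is to create the io instance once when the server starts and share it with the routes, for example via app.set. A sketch follows; the 'io' key name and the /postSomething payload are assumptions.

var express = require('express');
var bodyParser = require('body-parser');
var app = express();
var server = require('http').createServer(app);
var io = require('socket.io')(server);

app.use(bodyParser.json());
app.set('io', io); // make the io instance reachable from any route

app.post('/postSomething', function (req, res) {
  // No io.on/socket.on needed inside the route: just broadcast to connected clients.
  req.app.get('io').emit('chat', req.body.msg);
  res.sendStatus(200);
});

server.listen(3000);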
Q. Redis pub/sub vs Socket.IO
Take a look at this SO question/answer, quoting:
Redis pub/sub is great in case all clients have direct access to redis. If you have multiple node servers, one can push a message to the others.
But if you also have clients in the browser, you need something else to push data from a server to a client, and in this case, socket.io is great.
Now, if you use socket.io with the Redis store, socket.io will use Redis pub/sub under the hood to propagate messages between servers, and servers will propagate messages to clients.
So using socket.io rooms with socket.io configured with the Redis store is probably the simplest for you.
Redis can act like a message queue if that is a requirement. Redis is a data store that supports many data types.
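As a sketch of what "socket.io configured with the Redis store" looks like in practice, each Node server in the cluster would be set up roughly like this, using the socket.io-redis adapter; the Redis host and port are assumptions.

var app = require('express')();
var server = require('http').createServer(app);
var io = require('socket.io')(server);
var redisAdapter = require('socket.io-redis');

// All socket.io servers sharing this Redis instance relay emits to each
// other via Redis pub/sub under the hood.
io.adapter(redisAdapter({ host: 'localhost', port: 6379 }));

io.on('connection', function (socket) {
  socket.on('chat', function (msg) {
    io.emit('chat', msg); // now reaches clients connected to any of the servers
  });
});

server.listen(3000);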
Q. Why Nginx with Node.js
Node.js can work standalone, but nginx is faster at serving static content.
Since nginx is a reverse proxy, servers are configured with nginx to handle all the static work (serving static files, doing redirects, handling SSL certificates and serving error pages), and every other request is sent to Node.js.
Check this Quora post as well: Should I host a node.js project without nginx?
Quoting:
Nginx can be used to remove some load from the Node.js processes, for example, serving static files, doing redirects, handling SSL certificates and serving error pages.
You can do everything without Nginx, but it means you have to code it yourself, so why not use a fast and proven solution for this?

How to embed a Node.js application in a client website?

First of all, sorry for my non-standard English. :D
I have a chat application built with Node.js and Express that is up and running on port 3000. Now I want to embed it in a client's website; how can I do that?
I tried to load it with AJAX, but it doesn't work:
jQuery.ajax({
  type: "GET",
  url: "http://localhost:3000/client/",
  success: function (data) {
    jQuery('body').append(data);
  }
});
Since you are using socket.io on the server side, you can get a reference to it on the client using the lines below (jQuery required):
$.getScript(host + 'socket.io/socket.io.js', function () {
  var clientSocket = io.connect(host);
  // ... do other stuff with your socket
});
Where host is your server's host name, e.g. http://192.168.1.5:3000/, where your Node.js application is deployed.
For more information on the events exposed for client-server communication, see here.
Try the example from http://book.mixu.net/node/ch3.html (it uses long polling); once you're done with it, use socket.io.
