Problem writing data with NodeJS TLSSocket - node.js

Background:
I have a NodeJS TCP stream server with SSL (a TLS server) which simply forwards data from client A to client B, i.e. it listens for connections on port 9000 (client A) and writes the received data to an existing persistent connection on port 9001 (client B). The connection between NodeJS and client B is persistent and never broken. Everything is running on my local machine.
Problem:
When using JMeter as client A and sending 300 requests (threads) with a ramp-up period of 1 second, a few requests never arrive at client B.
Troubleshooting done so far:
First I checked the logs of NodeJS application, which indicated that 300 requests were received from client A, and 300 requests were sent to client B.
Then I checked the logs of the client B, which indicated that only 298 requests arrived.
Next I monitored the local network using Wireshark, which to my surprise indicated that 300 requests arrived from client A at the server, but only 298 of them were sent to client B.
Server side function which writes data to client B:
The function is a class member, where vSocket is a NodeJS TLSSocket representing the persistent connection with client B. In the server logs, I get "Writing data to socket." and "Wrote data to socket." 300 times. I never get "Unexpected error while writing data to socket." or "Write returned false.".
public async SendAsync (pData: string): Promise<void> {
  return new Promise<void>((pResolve, pReject) => {
    try {
      this.uLogger.Debug('Writing data to socket.', pData);
      const rc = this.vSocket.write(pData, (pError) => {
        if (pError) {
          this.uLogger.Error('Unexpected error while writing data to socket.', pError);
          return pReject(pError);
        }
        this.uLogger.Debug('Wrote data to socket.', pData);
        if (this.vKeepConnectionOpen == false) {
          this.uLogger.Debug('Not keeping this socket open.');
          this.CloseAsync();
        }
        return pResolve();
      });
      if (rc == false) {
        this.uLogger.Debug('Write returned false.', this.uId);
      }
    } catch (error) {
      this.uLogger.Error('Unexpected error while writing data to socket.', error);
      return pReject(error);
    }
  });
}
socket.write(data[, encoding][, callback]):
The docs say:
Returns true if the entire data was flushed successfully to the kernel buffer. Returns false if all or part of the data was queued in user memory.
Since in my case this function never returns false, I assume NodeJS has successfully written the data to the OS buffer, and the data should have been sent.
This problem is readily reproducible, and more likely to occur when I send hundreds of requests with JMeter. It is less likely to occur with 10 requests, and has never happened with < 5 requests.
I don't understand what's happening here, any help is appreciated.
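For reference, the usual way to make that backpressure explicit is to stop writing until the socket emits 'drain' whenever write() returns false. A minimal sketch of that pattern (the helper name is made up; this is not the original class):

async function writeWithBackpressure (socket, chunk) {
  // write() returns false when the chunk had to be queued in user memory;
  // in that case, wait for 'drain' before issuing the next write.
  if (!socket.write(chunk)) {
    await new Promise((resolve) => socket.once('drain', resolve))
  }
}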

Related

Progress bar for express / react communicating with backend

I want to make a progress bar that tells the user where in the process of fetching the API my backend is. But it seems like every time I send a response it stops the request. How can I avoid this, and what should I google to learn more, since I didn't find anything online?
React:
const { data, error, isError, isLoading } = useQuery('posts', fetchPosts)
if (isLoading) { return <p>Loading..</p> }
return (data && <p>{data}</p>)
Express:
app.get("api/v1/testData", async (req, res) => {
try {
const info = req.query.info
const sortByThis = req.query.sortBy;
if (info) {
let yourMessage = "Getting Data";
res.status(200).send(yourMessage);
const valueArray = await fetchData(info);
yourMessage = "Data retrived, now sorting";
res.status(200).send(yourMessage);
const sortedArray = valueArray.filter((item) => item.value === sortByThis);
yourMessage = "Sorting Done now creating geojson";
res.status(200).send(yourMessage);
createGeoJson(sortedArray)
res.status(200).send(geojson);
}
else { res.status(400) }
} catch (err) { console.log(err) res.status(500).send }
}
You can only send one response to a request in HTTP.
In case you want to have status updates using HTTP, the client needs to poll the server i.e. request status updates from the server. Keep in mind though that every request needs to be processed on the server side and will take resources away which are then not available for other (more important) requests from other clients. So don't poll too frequently.
If you want to support long running operations using HTTP have a look at the following API design pattern.
Alternatively you could also use a WebSockets connection to push updates from the server to the client. I assume your computation on the backend will not be minutes long and you want to update the client in real-time, so probably WebSockets will be the best option for you. A WebSocket connection has, once established, considerably less overhead than sending huge HTTP requests/ responses between client and server.
Have a look at this thread, which discusses the above-mentioned and other possibilities.
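As a rough illustration of the WebSocket option, here is a minimal sketch using the ws package; the port, the message shape, and the reuse of fetchData/createGeoJson from the question are assumptions made for the example:

const { WebSocketServer } = require('ws');

// Hypothetical port; pick whatever fits your setup.
const wss = new WebSocketServer({ port: 8081 });

wss.on('connection', async (socket) => {
  // Push each stage to the client as it happens, instead of trying to send
  // several HTTP responses to a single request.
  socket.send(JSON.stringify({ status: 'Getting Data' }));
  const valueArray = await fetchData('someInfo');   // assumed helper from the question
  socket.send(JSON.stringify({ status: 'Data retrieved, now sorting' }));
  const sorted = valueArray.filter((item) => item.value === 'someFilter');
  socket.send(JSON.stringify({ status: 'Sorting done, now creating geojson' }));
  const geojson = createGeoJson(sorted);            // assumed helper from the question
  socket.send(JSON.stringify({ status: 'done', geojson }));
});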

Asynchronous processing of data in Expressjs

I've an Express route which receives some data and process it, then insert into mongo (using mongoose).
This is working well if I return a response after the following steps are done:
Receive request
Process the request data
Insert the processed data into Mongo
Return 204 response
But the client will be calling this API concurrently for millions of records, so the requirement is not to block the client while the data is being processed. I therefore made a small change in the code:
Receive request
Return response immediately with 204
Process the requested data
Insert the processed data into Mongo
The above works fine for the first few thousand requests; after that the client gets a socket exception: connection reset by peer. I guess it is because the server keeps holding the connections, as the ports are not freed, and at some point I notice my nodejs process throwing an out-of-memory error.
Sample code is as follows:
async function enqueue(data) {
  // 1. Process the data
  // 2. Insert the data in mongo
}

async function expressController(request, response) {
  logger.info('received request')
  response.status(204).send()
  try {
    await enqueue(request.body)
  } catch (err) {
    throw new Error(err)
  }
}
Am I doing something wrong here?
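If the problem is simply that nothing bounds how many enqueue() calls are in flight at once, one possible mitigation is to shed load past a limit. A rough sketch under that assumption (the limit value is made up; enqueue and logger are taken from the question):

const MAX_IN_FLIGHT = 1000; // hypothetical limit, tune for your workload
let inFlight = 0;

async function expressController(request, response) {
  if (inFlight >= MAX_IN_FLIGHT) {
    // Too much unfinished work already queued in memory: shed load
    // instead of accumulating it until the process runs out of memory.
    return response.status(503).send();
  }
  response.status(204).send();
  inFlight++;
  try {
    await enqueue(request.body); // process + insert into mongo, as in the question
  } catch (err) {
    logger.info(err);
  } finally {
    inFlight--;
  }
}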

Write process output to response as they are generated

I have a background process which generates lots of output. I want to send the output to the client as it becomes available. My expectation is that the client sends an HTTP GET/POST request, the server opens a connection and keeps it alive, and then the server keeps streaming data as it becomes available.
Real World Example: When you run a test or shell command in AWS/Heroku/GoogleAppEngine, it shows the output in real time, as if the command was running on my local machine. How are they doing that?
Server: In this sample server, the process generates a message every second.
function handleRequest (request, response) {
  for (let i = 0; i < 10; i++) {
    response.write('This is message.' + i)
    require('child_process').execSync("sleep 1")
  }
  response.end()
}
Client: The client should receive data as it becomes available, i.e. one by one. But the output is not what I expected: the entire data is collected at once and sent back to the client.
request.put(formData)
  .on('data', function (data) {
    console.log(data.toString())
  })
I am pretty new to NodeJS, but I am hoping maybe I can return some form of writable stream back to the client, and as data is written to this stream on the server side, the client will receive it on its end?
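One common suggestion for this kind of handler is to avoid blocking the event loop between writes, since execSync stalls everything, including the flushing of queued chunks. A sketch of the same handler with the blocking sleep replaced by a timer (assuming nothing in between buffers the response):

function handleRequest (request, response) {
  let i = 0
  // Write one chunk per second without blocking the event loop,
  // so each chunk can go out to the client as it is produced.
  const timer = setInterval(() => {
    response.write('This is message.' + i)
    i++
    if (i === 10) {
      clearInterval(timer)
      response.end()
    }
  }, 1000)
}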

Node SSDP Client not finding Server broadcast

I've implemented a server and a client with node-ssdp (npm install node-ssdp). Everything "appears" to work, but my client does not pick up the server's packet. I am receiving a lot of other payloads from different devices/locations, but not the payload from my node-ssdp server.
I'm running both on the same machine, on OSX. I have two separate node projects: one for my client and one for my server.
Note: I've also tried running the server on one machine and the client on a separate machine, in case there was an issue with loopback or something. I also verified via Wireshark that the packets from the server were reaching the client machine. It is sending NOTIFY * HTTP/1.1 in the headers.
Here are my implementations for client and server:
Server
var SSDP = require('node-ssdp').Server
  , server = new SSDP({
      location: 'http://' + require('ip').address() + ':33333',
      ssdpPort: 33333
    })

console.log(require('ip').address())

server.addUSN('upnp:rootdevice')
server.addUSN('urn:schemas-upnp-org:device:MediaServer:1')
server.addUSN('urn:schemas-upnp-org:service:ContentDirectory:1')
server.addUSN('urn:schemas-upnp-org:service:ConnectionManager:1')

server.on('advertise-alive', function (heads) {
  console.log('advertise-alive', heads)
  // Expire old devices from your cache.
  // Register advertising device somewhere (as designated in http headers heads)
})

server.on('advertise-bye', function (heads) {
  console.log('advertise-bye', heads)
  // Remove specified device from cache.
})

// start server on all interfaces
console.log('starting ssdp server')
server.start()
Client
var ssdp = require('node-ssdp').Client
  , client = new ssdp({
    })

client.on('notify', function () {
  //console.log('Got a notification.')
})

client.on('response', function inResponse(headers, code, rinfo) {
  console.log('Got a response to an m-search:\n%d\n%s\n%s', code, JSON.stringify(headers, null, ' '), JSON.stringify(rinfo, null, ' '))
})

client.search('ssdp:all')

// And after 10 seconds, you want to stop
setTimeout(function () {
  client.stop()
}, 10000)
I am running out of ideas. It's weird because I've previously implemented a UDP multicast solution and it works. SSDP is, from what I understand, UDP multicast behind the scenes.
From the GitHub issue itself, it appears that adding sourcePort to the configuration solved the issue. https://github.com/diversario/node-ssdp/issues/75#issuecomment-292054892
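Based on that comment, the change is apparently along these lines (shown here on the client; the linked comment has the details, and 1900 is only an example value):

var ssdp = require('node-ssdp').Client
  , client = new ssdp({
      // Assumption based on the linked issue: bind the source port explicitly.
      sourcePort: 1900
    })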

Node.js outbound http request concurrency

I've got a node.js script that pulls data from an external web API for local storage. The first request is a query that returns a list of IDs that I need to get further information on. For each ID returned, I spawn a new http request from node.js and reach out to the server for the data (POST request). Once the job is complete, I sleep for 3 minutes, and repeat. Sometimes the number of IDs is in the hundreds. Each individual http request for those returns maybe 1kb of data, usually less, so the round trip is very short.
I got an email this morning from the API provider begging me to shut off my process because I'm "occupying all of the API servers with hundreds of connections" (which I am actually pretty proud of, but that is not the point). To be nice, I increased the sleep from 3 minutes to 30 minutes, and that has so far helped them.
On to the question... now I've not set maxSockets or anything, so I believe the default is 5. Shouldn't that mean I can only create 5 live http request connections at a time? How does the admin have hundreds? Is their server not hanging up the connection once the data is delivered? Am I not doing so? I don't have an explicit disconnect at the end of my http request, so perhaps I am at fault here. So what does maxSockets actually set?
Sorry, for some reason I didn't read your question correctly.
maxSockets is the max number of connections the http module will make for that current process. You can check to see what yours is currently set at by accessing it from http.globalAgent.maxSockets.
You can see some information on the current number of connections you have to a given host with the following:
console.log("Active socket connections: %d", http.globalAgent.sockets['localhost:8080'].length )
console.log("Total queued requests: %d", http.globalAgent.requests['localhost:8080'].length)
Substitute localhost:8080 with whatever host and port you are making the request to.
You can see how node handles these connections at the following two points:
Adding a new connection and storing to the request queue
https://github.com/joyent/node/blob/master/lib/_http_agent.js#L83
Creating connections from queued requests
https://github.com/joyent/node/blob/master/lib/_http_agent.js#L148
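If you want an explicit cap rather than relying on the global agent default, you can pass your own Agent with maxSockets set when making requests; a minimal sketch (host and path are placeholders):

var http = require('http');

// Agent with an explicit cap on concurrent sockets per host.
var limitedAgent = new http.Agent({ maxSockets: 5 });

var req = http.request({
  host: 'api.example.com',   // placeholder
  path: '/items/123',        // placeholder
  method: 'POST',
  agent: limitedAgent
}, function (res) {
  res.on('data', function (chunk) {
    console.log(chunk.toString());
  });
});
req.end();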
I wrote this up really quickly to give you an idea of how you could stagger those requests out a bit. This particular code doesn't check how many requests are "pending"; you could easily modify it to only allow a set number of requests to be in flight at any given time (which honestly would be the better way to do it).
var Stagger = function (data, stagger, fn, cb) {
  var self = this;
  this.timerID = 0;
  this.data = [].concat(data);
  this.fn = fn;
  this.cb = cb;
  this.stagger = stagger;
  this.iteration = 0;
  this.store = {};

  this.start = function () {
    (function __stagger() {
      self.fn(self.iteration, self.data[self.iteration], self.store);
      self.iteration++;
      if (self.iteration != self.data.length)
        self.timerID = setTimeout(__stagger, self.stagger);
      else
        cb(self.store);
    })();
  };

  this.stop = function () {
    clearTimeout(self.timerID);
  };
};

var t = new Stagger([1,2,3,4,5,6], 1000, function (i, item, store) {
  console.log(i, item);
  if (!store.out) store.out = [];
  store.out[i] = Math.pow(2,i);
},
function (store) {
  console.log('Done!', store);
});

t.start();
This code could definitely be improved, but it should give you an idea of where to start; a rough sketch of the concurrency-limited variant follows the demo link below.
Live Demo: http://jsbin.com/ewoyik/1/edit (note: requires console)
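As mentioned above, only allowing a set number of requests in flight at a time is probably the better approach; a rough sketch of that idea (doRequest is a placeholder for your actual HTTP call):

function runWithLimit (items, limit, doRequest, done) {
  var results = [];
  var inFlight = 0;
  var next = 0;

  if (!items.length) return done(results);

  function launch () {
    // Keep at most `limit` requests in flight at any given time.
    while (inFlight < limit && next < items.length) {
      (function (i) {
        inFlight++;
        doRequest(items[i], function (err, result) {
          results[i] = err || result;
          inFlight--;
          if (next === items.length && inFlight === 0) return done(results);
          launch();
        });
      })(next++);
    }
  }

  launch();
}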
