NodeJS batching async.parallel - node.js

I have
async.parallel(tasksGetContentFromGitHub, function(err, res) {
  // all requests over, do something
});
the problem is that I might have a large number of tasks, and each of them is sending a request to GitHub.
Since I am a nice citizen, I don't want to send 1000+ queries to GitHub at once, so I would like to batch those requests 10 at a time, and then execute my inner code.
Is there an easy way to do that?

You can try async.parallelLimit:
async.parallelLimit(tasksGetContentFromGitHub, 10, function(err, res) {
  // all requests over, do something
});
Hope it helps!
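For illustration, a minimal sketch of building the task array and running it with a concurrency of 10 (repoList and fetchRepoFromGitHub are hypothetical placeholders, not from the original question):
var async = require('async');

// repoList and fetchRepoFromGitHub are assumed placeholders.
var tasksGetContentFromGitHub = repoList.map(function (repo) {
  return function (callback) {
    // each task must call back with (err, content)
    fetchRepoFromGitHub(repo, callback);
  };
});

// At most 10 requests are in flight at any time; the final callback
// runs once every task has finished (or as soon as one errors).
async.parallelLimit(tasksGetContentFromGitHub, 10, function (err, res) {
  // all requests over, do something
});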

Related

Returning multiple asynchronous responses

I'm currently looking to set up an endpoint that accepts a request, and returns the response data in increments as they load.
The application of this is that given one upload of data, I would like to calculate a number of different metrics for that data. As each metric gets calculated asynchronously, I want to return this metric's value to the front-end to render.
For testing, my controller looks as follows, trying to use res.write
uploadData = (req, res) => {
  res.write("test");
  setTimeout(() => {
    res.write("test 2");
    res.end();
  }, 3000);
}
However, I think the issue stems from my client side, which I'm writing in React-Redux and which calls that route through Axios. From my understanding, the axios request closes once it receives the first response, so the connection doesn't stay open. Here is what my axios call looks like:
axios.post('/api', data)
  .then((response) => {
    console.log(response);
  })
  .catch((error) => {
    console.log(error);
  });
Is there an easy way to do this? I've also thought about streaming; however, my concern with streaming is that I would like each connection to be direct and unique between clients, and open only for a short amount of time (i.e. only open while the metrics are being calculated).
I should also mention that the resource being uploaded is a db, and I would like to avoid parsing and opening a connection multiple times as a result of multiple endpoints.
Thanks in advance, and please let me know if I can provide any more context
One way to handle this while still using a traditional API would be to store the metrics in an object somewhere, either a database or redis for example, then just long poll the resource.
For a real-world example, say you want to calculate the following metrics for foo: time completed, length of request, bar, foobar.
You could create an object in storage that looks like this:
{
  id: 1,
  lengthOfRequest: 123,
  .....
}
Then you would create an endpoint in your API like metrics/{id} that returns the object. Just keep calling the route until everything completes.
There are some obvious drawbacks to this of course, but once you get enough information to know how long the metrics will take to complete on average you can tweak the time in between the calls to your API.
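A rough sketch of that approach with Express and an in-memory object standing in for the store (the route names, the computeLengthOfRequest helper, and the shape of the metrics object are illustrative assumptions):
var express = require('express');
var app = express();

// In-memory stand-in for a database or Redis.
var metricsById = {};

app.post('/api', function (req, res) {
  var id = Date.now().toString();
  metricsById[id] = { id: id };
  // Respond right away with the id the client should poll.
  res.json({ id: id });
  // Each metric fills in its own field whenever it finishes.
  computeLengthOfRequest(req.body, function (value) { // hypothetical async metric
    metricsById[id].lengthOfRequest = value;
  });
});

// The client keeps calling this route until the fields it needs are present.
app.get('/metrics/:id', function (req, res) {
  res.json(metricsById[req.params.id] || {});
});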

NodeJs request handling on server side

Hi all. Currently my system queues multiple requests, but when the first request takes too much time, the next requester has to wait until it is finished, like an FCFS (First Come First Serve) algorithm.
Now I want something like this: if 5 requests are already in the queue, the next request should get an error such as "the server is too busy right now, please try again after some time".
Please let me know of any other technique that can handle requests better than this, or share any other ideas that may be helpful.
Thanks
I don't think it's a good idea to serve only one request at a time; servers are for handling multiple requests. But if you want to limit active requests to 5, you can use a closure variable to achieve that, e.g. this Express router:
// number of requests currently being handled
var noOfActiveReq = 0;
router.get('/handle', function (req, res, next) {
  if (noOfActiveReq >= 5) {
    return next(new Error("server is too busy right now, please try after some time"));
  }
  noOfActiveReq++;
  db.get(req.query.id, function (err, result) {
    noOfActiveReq--;
    if (err) return next(err);
    res.json(result);
  });
});
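The same idea can be lifted into a middleware so every route shares one counter; this is only a sketch of that variation (the 503 status and the MAX_ACTIVE name are my own choices):
var noOfActiveReq = 0;
var MAX_ACTIVE = 5;

// Reject new requests while MAX_ACTIVE are still being handled.
function limitConcurrency(req, res, next) {
  if (noOfActiveReq >= MAX_ACTIVE) {
    return res.status(503).send("server is too busy right now, please try after some time");
  }
  noOfActiveReq++;
  // 'finish' fires once the response has been sent, freeing the slot.
  res.on('finish', function () { noOfActiveReq--; });
  next();
}

router.use(limitConcurrency);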

Node.js - process millions of http post items without blocking

Using node.js, what is the best way to process a million items in an HTTP post request without blocking the server? My only guess is some sort of message queue, but I really have no idea.
You would want to use a lib like async.js to create non-blocking loops.
https://github.com/caolan/async
var async = require("async");
async.each(yourArrayOfThings, function(oneItem, callback) {
  // do something
  // ...
  return callback(null);
}, function(err) {
  // if any of the callbacks returned an error, err would equal that error
});
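async.each starts every item at once, so for a million items you may still flood downstream services; a hedged variation that bounds the concurrency with async.eachLimit (the limit of 100 and the processItem helper are assumptions):
var async = require("async");

// Work on at most 100 items at a time.
async.eachLimit(yourArrayOfThings, 100, function (oneItem, callback) {
  // processItem is a hypothetical async helper that calls back when done
  processItem(oneItem, callback);
}, function (err) {
  // err is the first error reported by any item, if any
});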
Give some more information on what your processing needs are, if this is not an applicable solution for you.

Why can't we do multiple response.send in Express.js?

Three years ago I could call res.send multiple times in Express.js, and even write a setTimeout to show live output.
response.send('<script class="jsbin" src="http://code.jquery.com/jquery-1.7.1.min.js"></script>');
response.send('<html><body><input id="text_box" /><button>submit</button></body></html>');
var initJs = function() {
  $('.button').click(function() {
    $.post('/input', { input: $('#text_box').val() }, function() { alert('has send'); });
  });
}
response.send('<script>' + initJs + '</script>');
Now it will throw:
Error: Can't set headers after they are sent
I know Node.js and Express have been updated. Why can't I do that now? Any other ideas?
Found the solution, but res.write is not in the API reference: http://expressjs.com/4x/api.html
Maybe you need: response.write
response.write("foo");
response.write("bar");
//...
response.end()
res.send implicitly calls res.write followed by res.end. If you call res.send multiple times, it will work the first time. However, since the first res.send call ends the response, you cannot add anything to the response.
response.send sends an entire HTTP response to the client, including headers and content, which is why you are unable to call it multiple times. In fact, it even ends the response, so there is no need to call response.end explicitly when using response.send.
It appears to me that you are attempting to use send like a buffer: writing to it with the intention to flush later. This is not how the method works, however; you need to build up your response in code and then make a single send call.
Unfortunately, I cannot speak to why or when this change was made, but I know that it has been like this at least since Express 3.
res.write immediately sends bytes to the client
I just wanted to make this point about res.write clearer.
It does not build up the reply and wait for res.end(). It just sends right away.
This means that the first time you call it, it will send the HTTP reply headers including the status in order to have a meaningful response. So if you want to set a status or custom header, you have to do it before that first call, much like with send().
Note that write() is not what you usually want to do in a simple web application. The browser getting the reply little by little increases the complexity of things, so you will only want to do it if it is really needed.
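As a small sketch of that ordering (the route, status, and header values are only for illustration):
app.get('/stream', function (req, res) {
  // Status and headers must be set before the first write,
  // because that first write flushes them to the client.
  res.status(200).set('Content-Type', 'text/plain');
  res.write('first chunk\n');
  setTimeout(function () {
    res.write('second chunk\n');
    res.end();
  }, 1000);
});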
Use res.locals to build the reply across middleware
This was my original use case, and res.locals fits well. I can just store data in an Array there, then in the very last middleware join it up and do a final send to deliver everything at once, something like:
async (err, req, res, next) => {
  res.locals.msg = ['Custom handler']
  next(err)
},
async (err, req, res, next) => {
  res.locals.msg.push('Custom handler 2')
  res.status(500).send(res.locals.msg.join('\n'))
}

Do I need to wait for a callback on a call to WATCH in Redis (in node.js)?

I'm using node-redis. In code like this:
var store = require('redis').createClient();
store.watch('some:key');
store.get('some:key', function (err, results) {
  var multi = store.multi();
  // COMPUTE SOMETHING WITH results
  multi.set('something:or:other', 25);
  multi.exec(checkAllIsWell);
});
Should lines 1-2 read
store.watch('some:key', function (err, alwaysok) {
  store.get('some:key', function (err, result) {
or will watch always have immediate effect?
EDIT: To reframe the question a little, is sequence guaranteed on sequential calls on the same Redis client? Or could the WATCH happen after the GET?
Having reframed my question, I realize that it must surely be sequence-preserving, and I'm actually duplicating this question: Are Redis updates synchronous?
So the answer is surely that I don't need to wait for WATCH to call back and my original code is OK.
Sorry to noise up the web, folks!
WATCH always returns OK: http://redis.io/commands/watch
It is only useful if you later use MULTI/EXEC and check the EXEC return value.
For more information about Redis transactions, see http://redis.io/topics/transactions
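A minimal sketch of checking that return value with node-redis (the computeFrom helper is hypothetical); when the watched key changes before EXEC, exec reports a null result and the transaction is discarded:
var store = require('redis').createClient();

store.watch('some:key', function (err) {
  if (err) throw err;
  store.get('some:key', function (err, results) {
    if (err) throw err;
    var multi = store.multi();
    multi.set('something:or:other', computeFrom(results)); // computeFrom is a placeholder
    multi.exec(function (err, replies) {
      if (err) throw err;
      if (replies === null) {
        // some:key was modified by another client; the transaction did not run
      }
    });
  });
});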
