Node.js - Calling asynchronous functions whenever a toggle is changed

In my website, I have a toggle button that determines whether or not a given user will receive messages from a given source. Whenever this toggle is changed, an asynchronous function needs to be called. However, if the toggle is changed and then quickly changed again, my program must wait for the previous asynchronous call to finish. Here are the two functions that make the asynchronous calls:
_enable() {
    let params = determineSubscriptionParams(this.endpoint, this.level);
    SNS_CLIENT.subscribe(params, (err, data) => {
        if (err) {
            throw err;
        }
        else {
            // do stuff here
        }
    });
}

_disable() {
    let params = {
        SubscriptionArn: this.subscriptionArn
    };
    SNS_CLIENT.unsubscribe(params, (err, data) => {
        if (err) {
            throw err;
        }
        else {
            // do stuff here
        }
    });
}
Both of these functions are members of a class, and subscribe and unsubscribe are the asynchronous calls.

You are looking for a lock for critical sections of code. If you don't want to implement your own, you could use async-lock to accomplish this.
This will make a second request made in quick succession unable to enter the critical section before the first request has released it.
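For illustration, here is a minimal sketch of what that could look like with async-lock; the 'subscription' lock key, the class wrapper, and the error handling are my assumptions, not part of the original code:

const AsyncLock = require('async-lock');

class SubscriptionToggle {
    constructor(endpoint, level) {
        this.endpoint = endpoint;
        this.level = level;
        this.lock = new AsyncLock();
    }

    _enable() {
        // acquire() queues tasks per key: a second toggle made in quick
        // succession waits here until done() releases the lock.
        this.lock.acquire('subscription', (done) => {
            let params = determineSubscriptionParams(this.endpoint, this.level);
            SNS_CLIENT.subscribe(params, (err, data) => {
                if (err) return done(err);
                // do stuff here, then release the lock
                done();
            });
        }, (err) => {
            if (err) console.error(err);
        });
    }

    // _disable() would wrap SNS_CLIENT.unsubscribe the same way, using the
    // same 'subscription' key so the two calls serialize.
}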

Why is my RxJS Observable completing right away?

I'm a bit new to RxJS and it is kicking my ass, so I hope someone can help!
I'm using RxJS (5) on my express server to handle behaviour where I have to save a bunch of Document objects and then email each of them to their recipients. The code in my documents/create endpoint looks like this:
// Each element in this stream is an array of `Document` model objects: [<Document>, <Document>, <Document>]
const saveDocs$ = Observable.fromPromise(Document.handleCreateBatch(docs, companyId, userId));

const saveThenEmailDocs$ = saveDocs$
    .switchMap((docs) => sendInitialEmails$$(docs, user))
    .do(x => {
        // Here x is the `Document` model object
        debugger;
    });

// First saves all the docs, and then begins to email them all.
// The reason we want to save them all first is because, if an email fails,
// we can still ensure that the document is saved
saveThenEmailDocs$
    .subscribe(
        (doc) => {
            // This never hits
        },
        (err) => {},
        () => {
            // This hits immediately.. Why though?
        }
    );
The sendInitialEmails$$ function returns an Observable and looks like this:
sendInitialEmails$$ (docs, fromUser) {
    return Rx.Observable.create((observer) => {
        // Emails each document to their recipients
        docs.forEach((doc) => {
            mailer.send({...}, (err) => {
                if (err) {
                    observer.error(err);
                } else {
                    observer.next(doc);
                }
            });
        });
        // When all the docs have finished sending, complete the
        // stream
        observer.complete();
    });
}
The problem is that when I subscribe to saveThenEmailDocs$, my next handler is never called, and it goes straight to complete. I have no idea why... Conversely, if I remove the observer.complete() call from sendInitialEmails$$, the next handler is called every time and the complete handler in subscribe is never called.
Why isn't the expected behaviour of next, next, complete happening? Instead it's one or the other... Am I missing something?
I can only assume that mailer.send is an asynchronous call.
Your observer.complete() is called when all the asynchronous calls have been launched, but before any of them could complete.
In such cases I would make a stream of observable values from the docs array rather than wrapping it like this.
Or, if you would like to wrap it manually into an observable, I suggest you look into the async library and use
async.each(docs, function(doc, callback) {...}, function finalized(err){...})
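Sketching that second suggestion against the question's code (assuming mailer.send's callback signature from the question), the complete() call moves into async.each's final callback so it only fires after every send has finished:

const async = require('async');

sendInitialEmails$$ (docs, fromUser) {
    return Rx.Observable.create((observer) => {
        async.each(docs, (doc, callback) => {
            mailer.send({...}, (err) => {
                if (err) return callback(err);
                observer.next(doc);
                callback();
            });
        }, (err) => {
            // runs exactly once: on the first error, or after all sends succeed
            if (err) return observer.error(err);
            observer.complete();
        });
    });
}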

Design pattern for use of promises in error conditions [duplicate]

I'm writing a JavaScript function that makes an HTTP request and returns a promise for the result (but this question applies equally for a callback-based implementation).
If I know immediately that the arguments supplied for the function are invalid, should the function throw synchronously, or should it return a rejected promise (or, if you prefer, invoke callback with an Error instance)?
How important is it that an async function should always behave in an async manner, particularly for error conditions? Is it OK to throw if you know that the program is not in a suitable state for the async operation to proceed?
e.g.:
function getUserById(userId, cb) {
    if (userId !== parseInt(userId)) {
        throw new Error('userId is not valid')
    }
    // make async call
}

// OR...

function getUserById(userId, cb) {
    if (userId !== parseInt(userId)) {
        return cb(new Error('userId is not valid'))
    }
    // make async call
}
Ultimately the decision to synchronously throw or not is up to you, and you will likely find people who argue either side. The important thing is to document the behavior and maintain consistency in the behavior.
My opinion on the matter is that your second option - passing the error into the callback - seems more elegant. Otherwise you end up with code that looks like this:
try {
    getUserById(7, function (response) {
        if (response.isSuccess) {
            //Success case
        } else {
            //Failure case
        }
    });
} catch (error) {
    //Other failure case
}
The control flow here is slightly confusing.
It seems like it would be better to have a single if / else if / else structure in the callback and forgo the surrounding try / catch.
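As a sketch of that callback-only route, the validation error can also be deferred with process.nextTick so the function never calls back synchronously (the deferral is my addition, not from the question):

function getUserById(userId, cb) {
    if (userId !== parseInt(userId)) {
        // defer even the synchronous failure, so cb is always invoked asynchronously
        return process.nextTick(function () {
            cb(new Error('userId is not valid'));
        });
    }
    // make async call, then cb(null, user)
}

getUserById('not-a-number', function (err, user) {
    if (err) {
        // single failure path; no surrounding try / catch needed
        return console.error(err.message);
    }
    // success case
});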
This is largely a matter of opinion. Whatever you do, do it consistently, and document it clearly.
One objective piece of information I can give you is that this was the subject of much discussion in the design of JavaScript's async functions, which as you may know implicitly return promises for their work. You may also know that the part of an async function prior to the first await or return is synchronous; it only becomes asynchronous at the point it awaits or returns.
TC39 decided in the end that even errors thrown in the synchronous part of an async function should reject its promise rather than raising a synchronous error. For example:
async function someAsyncStuff() {
    return 21;
}

async function example() {
    console.log("synchronous part of function");
    throw new Error("failed");
    const x = await someAsyncStuff();
    return x * 2;
}

try {
    console.log("before call");
    example().catch(e => { console.log("asynchronous:", e.message); });
    console.log("after call");
} catch (e) {
    console.log("synchronous:", e.message);
}
There you can see that even though throw new Error("failed") is in the synchronous part of the function, it rejects the promise rather than raising a synchronous error.
That's true even for things that happen before the first statement in the function body, such as determining the default value for a missing function parameter:
async function someAsyncStuff() {
    return 21;
}

async function example(p = blah()) {
    console.log("synchronous part of function");
    throw new Error("failed");
    const x = await Promise.resolve(42);
    return x;
}

try {
    console.log("before call");
    example().catch(e => { console.log("asynchronous:", e.message); });
    console.log("after call");
} catch (e) {
    console.log("synchronous:", e.message);
}
That fails because it tries to call blah, which doesn't exist, when it runs the code to get the default value for the p parameter I didn't supply in the call. As you can see, even that rejects the promise rather than throwing a synchronous error.
TC39 could have gone the other way, and had the synchronous part raise a synchronous error, like this non-async function does:
async function someAsyncStuff() {
    return 21;
}

function example() {
    console.log("synchronous part of function");
    throw new Error("failed");
    return someAsyncStuff().then(x => x * 2);
}

try {
    console.log("before call");
    example().catch(e => { console.log("asynchronous:", e.message); });
    console.log("after call");
} catch (e) {
    console.log("synchronous:", e.message);
}
But they decided, after discussion, on consistent promise rejection instead.
So that's one concrete piece of information to consider in your decision about how you should handle this in your own non-async functions that do asynchronous work.
How important is it that an async function should always behave in an async manner, particularly for error conditions?
Very important.
Is it OK to throw if you know that the program is not in a suitable state for the async operation to proceed?
Yes, I personally think it is OK when that is a very different error from any asynchronously produced ones, and needs to be handled separately anyway.
If some userids are known to be invalid because they're not numeric, and some will be rejected on the server (e.g. because they're already taken), you should consistently make an (async!) callback for both cases. If the async errors would only arise from network problems etc., you might signal them differently.
You may always throw when an "unexpected" error arises. If you demand valid userids, you might throw on invalid ones. If you want to anticipate invalid ones and expect the caller to handle them, you should use a "unified" error route, which would be the callback/rejected promise for an async function.
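A sketch of that same unified route with promises, where fetchUser is a hypothetical async call standing in for the real one:

function getUserById(userId) {
    if (userId !== parseInt(userId)) {
        // invalid input becomes a rejection, not a synchronous throw
        return Promise.reject(new Error('userId is not valid'));
    }
    return fetchUser(userId); // hypothetical async call returning a promise
}

getUserById('abc')
    .then(function (user) { /* success case */ })
    .catch(function (err) {
        // validation errors and network errors land in the same place
        console.error(err.message);
    });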
And to repeat @Timothy: you should always document the behavior and maintain consistency in the behavior.
Callback APIs ideally shouldn't throw, but in practice they do, because avoiding it would require a try/catch literally everywhere. Remember that an explicit throw statement is not required for a function to throw. Another thing that adds to this is that the user callback can easily throw too, for example by calling JSON.parse without a try/catch.
So this is what the code would look like that behaves according to these ideals:
readFile("file.json", function(err, val) {
if (err) {
console.error("unable to read file");
}
else {
try {
val = JSON.parse(val);
console.log(val.success);
}
catch(e) {
console.error("invalid json in file");
}
}
});
Having to use two different error-handling mechanisms is really inconvenient, so if you don't want your program to be a fragile house of cards (built by never writing any try/catch), you should use promises, which unify all exception handling under a single mechanism:
readFile("file.json").then(JSON.parse).then(function(val) {
console.log(val.success);
})
.catch(SyntaxError, function(e) {
console.error("invalid json in file");
})
.catch(function(e){
console.error("unable to read file")
})
Ideally you would have a multi-layer architecture: controllers, services, and so on. If you do validation in the services, throw immediately, and have a catch block in your controller to catch the error, format it, and send an appropriate HTTP error code. This way you can centralize all bad-request handling logic. If you handle each case separately, you'll end up writing more code. But that's just how I would do it; it depends on your use case.
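As a rough Express-style sketch of that layering (the route, the ValidationError class, and findUserById are all illustrative assumptions):

class ValidationError extends Error {}

// service layer: validate and throw immediately
async function getUserById(userId) {
    if (!Number.isInteger(userId)) {
        throw new ValidationError('userId is not valid');
    }
    return findUserById(userId); // hypothetical data-access call
}

// controller layer: one catch block maps errors to HTTP codes
app.get('/users/:id', async (req, res) => {
    try {
        res.json(await getUserById(Number(req.params.id)));
    } catch (err) {
        if (err instanceof ValidationError) {
            return res.status(400).send(err.message);
        }
        res.status(500).send('internal error');
    }
});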

how to make this function async in node.js

Here is the situation:
I am new to node.js. I have a 40MB file containing a multilevel JSON structure like:
[{},{},{}] This is an array of objects (~7000 objects). Each object has properties, and one of those properties is also an array of objects.
I wrote a function to read the content of the file and iterate it. I succeeded to get what I wanted in terms of content but not usability. I thought that I wrote an async function that would allow node to serve other web requests while iterating the array but that is not the case. I would be very thankful if anyone can point me to what I've done wrong and how to rewrite it so I can have a non-blocking iteration. Here's the function that handles the situation:
function getContents(callback) {
    fs.readFile(file, 'utf8', function (err, data) {
        if (err) {
            console.log('Error: ' + err);
            return;
        }
        js = JSON.parse(data);
        callback();
        return;
    });
}
getContents(iterateGlobalArr);
var count = 0;

function iterateGlobalArr() {
    if (count < js.length) {
        innerArr = js.nestedProp;
        //iterate nutrients
        innerArr.forEach(function(e, index) {
            //some simple if condition here
        });
        var schema = {
            //.....get props from forEach iteration
        }
        Model.create(schema, function(err, post) {
            if (err) {
                console.log('\ncreation error\n', err);
                return;
            }
            if (!post) {
                console.log('\nfailed to create post for schema:\n' + schema);
                return;
            }
        });
        count++;
        process.nextTick(iterateGlobalArr);
    }
    else {
        console.log("\nIteration finished");
        next();
    }
}
Just so it is clear how I've tested the above situation: I open two tabs, one loading this iteration (which takes some time) and a second hitting another node route, which does not load until the iteration is over. So essentially I've written blocking code, but I'm not sure how to refactor it! I suspect that because everything is happening in the callback I am unable to release the event loop to handle another request...
Your code is almost correct. What you are doing is inadvertently adding ALL the items to the very next tick... which still blocks.
The important piece of code is here:
Model.create(schema, function(err, post) {
    if (err) {
        console.log('\ncreation error\n', err);
        return;
    }
    if (!post) {
        console.log('\nfailed to create post for schema:\n' + schema);
        return;
    }
});

// add EVERYTHING to the very same next tick!
count++;
process.nextTick(iterateGlobalArr);
Let's say you are in tick A of the event loop when getContents() runs and count is 0. You enter iterateGlobalArr and call Model.create. Because Model.create is async, it returns immediately, and process.nextTick() adds the processing of item 1 to the next tick, let's say B. Tick B then calls iterateGlobalArr, which does the same thing, adding item 2 to the next tick, which is still B. Then item 3, and so on.
What you need to do is move the count increment and process.nextTick() into the callback of Model.create(). This will make sure the current item is processed before nextTick is invoked... which means the next item is actually added to the next tick AFTER the model item has been created... which will give your app time to handle other things in between. The fixed version of iterateGlobalArr is here:
function iterateGlobalArr() {
    if (count < js.length) {
        innerArr = js.nestedProp;
        //iterate nutrients
        innerArr.forEach(function(e, index) {
            //some simple if condition here
        });
        var schema = {
            //.....get props from forEach iteration
        }
        Model.create(schema, function(err, post) {
            // schedule our next item to be processed immediately.
            count++;
            process.nextTick(iterateGlobalArr);
            // then move on to handling this result.
            if (err) {
                console.log('\ncreation error\n', err);
                return;
            }
            if (!post) {
                console.log('\nfailed to create post for schema:\n' + schema);
                return;
            }
        });
    }
    else {
        console.log("\nIteration finished");
        next();
    }
}
Note also that I would strongly suggest that you pass in your js and counter with each call to iterateGlobalArr, as it will make iterateGlobalArr a lot easier to debug, among other things, but that's another story.
Cheers!
Node is single-threaded, so async will only help you if you are relying on another system/subsystem to do the work (a shell script, external database, web service, etc.). If you have to do the work in Node, you are going to block while you do it.
It is possible to create one node process per core. This solution would result in only blocking one of the node processes and leaving the rest to service your requests, but this feature is still listed as experimental: http://nodejs.org/api/cluster.html.
A single instance of Node runs in a single thread. To take advantage of multi-core systems the user will sometimes want to launch a cluster of Node processes to handle the load. The cluster module allows you to easily create child processes that all share server ports.
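A minimal sketch of that approach (the port and response body are placeholders):

var cluster = require('cluster');
var http = require('http');
var numCPUs = require('os').cpus().length;

if (cluster.isMaster) {
    // one worker per core; a worker stuck in a long iteration
    // no longer blocks the others from serving requests
    for (var i = 0; i < numCPUs; i++) {
        cluster.fork();
    }
} else {
    http.createServer(function (req, res) {
        res.end('handled by worker ' + process.pid + '\n');
    }).listen(8000);
}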

What's going on with Meteor and Fibers/bindEnvironment()?

I am having difficulty using Fibers/Meteor.bindEnvironment(). I tried to write code that updates and inserts into a collection if the collection starts empty. This is all supposed to run server-side on startup.
function insertRecords() {
    console.log("inserting...");
    var client = Knox.createClient({
        key: apikey,
        secret: secret,
        bucket: 'profile-testing'
    });
    console.log("created client");
    client.list({ prefix: 'projects' }, function(err, data) {
        if (err) {
            console.log("Error in insertRecords");
        }
        for (var i = 0; i < data.Contents.length; i++) {
            console.log(data.Contents[i].Key);
            if (data.Contents[i].Key.split('/').pop() == "") {
                Projects.insert({ name: data.Contents[i].Key, contents: [] });
            } else if (data.Contents[i].Key.split('.').pop() == "jpg") {
                Projects.update({ name: data.Contents[i].Key.substr(0, data.Contents[i].Key.lastIndexOf('.')) },
                    { $push: { contents: data.Contents[i].Key } });
            } else {
                console.log(data.Contents[i].Key.split('.').pop());
            }
        }
    });
}
if (Meteor.isServer) {
    Meteor.startup(function () {
        if (Projects.find().count() === 0) {
            boundInsert = Meteor.bindEnvironment(insertRecords, function(err) {
                if (err) {
                    console.log("error binding?");
                    console.log(err);
                }
            });
            boundInsert();
        }
    });
}
The first time I wrote this, I got errors saying that I needed to wrap my callbacks in a Fiber() block; then in a discussion on IRC someone recommended trying Meteor.bindEnvironment() instead, since that should put it in a Fiber. That didn't work (the only output I saw was inserting..., meaning that bindEnvironment() didn't throw an error, but it also didn't run any of the code inside the block). Then I got to this. My error now is: Error: Meteor code must always run within a Fiber. Try wrapping callbacks that you pass to non-Meteor libraries with Meteor.bindEnvironment.
I am new to Node and don't completely understand the concept of Fibers. My understanding is that they're analogous to threads in C/C++/every language with threading, but I don't understand what the implications are for my server-side code, or why my code throws an error when trying to insert into a collection. Can anyone explain this to me?
Thank you.
You're using bindEnvironment slightly incorrectly: the place where it's being used is already in a fiber, while the callback that comes off the Knox client isn't in a fiber anymore.
There are two use cases of bindEnvironment (that I can think of; there could be more!):
You have a global variable that has to be altered but you don't want it to affect other user's sessions
You are managing a callback using a third party api/npm module (which looks to be the case)
Meteor.bindEnvironment creates a new Fiber and copies the current Fiber's variables and environment to the new Fiber. The point where you need this is when you use your npm module's method callback.
Luckily there is an alternative, Meteor.wrapAsync, that takes care of waiting for the callback and binds the callback in a fiber for you.
So you could do this:
Your startup function already runs in a fiber and has no callback, so you don't need bindEnvironment here.
Meteor.startup(function () {
    if (Projects.find().count() === 0) {
        insertRecords();
    }
});
And here is your insertRecords function (using wrapAsync) so you don't need a callback:
function insertRecords() {
    console.log("inserting...");
    var client = Knox.createClient({
        key: apikey,
        secret: secret,
        bucket: 'profile-testing'
    });
    client.listSync = Meteor.wrapAsync(client.list.bind(client));
    console.log("created client");
    try {
        var data = client.listSync({ prefix: 'projects' });
    }
    catch (e) {
        console.log(e);
    }
    if (!data) return;
    for (var i = 0; i < data.Contents.length; i++) {
        console.log(data.Contents[i].Key);
        if (data.Contents[i].Key.split('/').pop() == "") {
            Projects.insert({ name: data.Contents[i].Key, contents: [] });
        } else if (data.Contents[i].Key.split('.').pop() == "jpg") {
            Projects.update({ name: data.Contents[i].Key.substr(0, data.Contents[i].Key.lastIndexOf('.')) },
                { $push: { contents: data.Contents[i].Key } });
        } else {
            console.log(data.Contents[i].Key.split('.').pop());
        }
    }
}
A couple of things to keep in mind. Fibers aren't like threads. There is only a single thread in NodeJS.
Fibers are more like events that can run at the same time but without blocking each other if there is a waiting-type scenario (e.g. downloading a file from the internet).
So you can have synchronous code and not block the other users' events. They take turns to run but still run in a single thread. This is how Meteor has synchronous code on the server side that can wait for stuff, yet other users won't be blocked by it and can do stuff, because their code runs in a different fiber.
Chris Mather has a couple of good articles on this on http://eventedmind.com
What does Meteor.wrapAsync do?
Meteor.wrapAsync takes in the method you give it as the first parameter and runs it in the current fiber.
It also attaches a callback to it (it assumes the method takes a last param that is a callback, where the first param is an error and the second is the result, as in function(err, result)).
The callback is bound with Meteor.bindEnvironment and blocks the current Fiber until the callback is fired. As soon as the callback fires it returns the result or throws the err.
So it's very handy for converting asynchronous code into synchronous code since you can use the result of the method on the next line instead of using a callback and nesting deeper functions. It also takes care of the bindEnvironment for you so you don't have to worry about losing your fiber's scope.
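As a tiny illustration (my example, not from the question; it assumes server-side Meteor, where core Node modules are available via Npm.require):

var fs = Npm.require('fs');
var readFileSync2 = Meteor.wrapAsync(fs.readFile);

// blocks this fiber (but not the process) until the file is read;
// throws if fs.readFile calls back with an error
var contents = readFileSync2('/tmp/example.txt', 'utf8');
console.log(contents);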
Update: Meteor._wrapAsync is now Meteor.wrapAsync and is documented.

How to wait for all async calls to finish

I'm using Mongoose with Node.js and have the following code that will call the callback after all the save() calls have finished. However, I feel that this is a very dirty way of doing it and would like to see the proper way to get this done.
function setup(callback) {
    // Clear the DB and load fixtures
    Account.remove({}, addFixtureData);

    function addFixtureData() {
        // Load the fixtures
        fs.readFile('./fixtures/account.json', 'utf8', function(err, data) {
            if (err) { throw err; }
            var jsonData = JSON.parse(data);
            var count = 0;
            jsonData.forEach(function(json) {
                count++;
                var account = new Account(json);
                account.save(function(err) {
                    if (err) { throw err; }
                    if (--count == 0 && callback) callback();
                });
            });
        });
    }
}
You can clean up the code a bit by using a library like async or Step.
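For instance, a sketch of the async version of the question's addFixtureData, where async.each tracks the outstanding save() calls and callback comes from the enclosing setup():

var async = require('async');

function addFixtureData() {
    fs.readFile('./fixtures/account.json', 'utf8', function(err, data) {
        if (err) return callback(err);
        async.each(JSON.parse(data), function(json, done) {
            new Account(json).save(done);
        }, callback); // fires once: on the first error, or after every save succeeds
    });
}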
Also, I've written a small module that handles loading fixtures for you, so you just do:
var fixtures = require('./mongoose-fixtures');

fixtures.load('./fixtures/account.json', function(err) {
    //Fixtures loaded, you're ready to go
});
GitHub: https://github.com/powmedia/mongoose-fixtures
It will also load a directory of fixture files, or objects.
I did a talk about common asynchronous patterns (serial and parallel) and ways to solve them:
https://github.com/masylum/i-love-async
I hope it's useful.
I've recently created a simpler abstraction called wait.for to call async functions in sync mode (based on Fibers). It's at an early stage but works. It is at:
https://github.com/luciotato/waitfor
Using wait.for, you can call any standard nodejs async function, as if it were a sync function, without blocking node's event loop. You can code sequentially when you need it.
Using wait.for, your code will be:
//in a fiber
function setup(callback) {
    // Clear the DB and load fixtures
    wait.for(Account.remove, {});
    // Load the fixtures
    var data = wait.for(fs.readFile, './fixtures/account.json', 'utf8');
    var jsonData = JSON.parse(data);
    jsonData.forEach(function(json) {
        var account = new Account(json);
        wait.forMethod(account, 'save');
    });
    callback();
}
That's actually the proper way of doing it, more or less. What you're doing there is a parallel loop. You can abstract it into its own "async parallel foreach" function if you want (and many do), but that's really the only way of doing a parallel loop.
Depending on what you intended, one thing that could be done differently is the error handling. Because you're throwing, if there's a single error, that callback will never get executed (count won't be decremented). So it might be better to do:
account.save(function(err) {
    if (err) return callback(err);
    if (!--count) callback();
});
And handle the error in the callback. It's better node-convention-wise.
I would also change another thing to save you the trouble of incrementing count on every iteration:
var jsonData = JSON.parse(data)
  , count = jsonData.length;

jsonData.forEach(function(json) {
    var account = new Account(json);
    account.save(function(err) {
        if (err) return callback(err);
        if (!--count) callback();
    });
});
If you are already using underscore.js anywhere in your project, you can leverage its after method. You need to know how many async calls will be out there in advance, but aside from that it's a pretty elegant solution.
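A sketch of that with the question's variables; note that _.after has no error path of its own, so errors still need separate handling:

var _ = require('underscore');

var done = _.after(jsonData.length, callback); // runs callback on the Nth call
jsonData.forEach(function(json) {
    new Account(json).save(function(err) {
        if (err) return callback(err); // errors bypass the counter
        done();
    });
});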
