node / express - render page before db callback?

Full disclosure: I'm very new to the totally asynchronous model.
In my application there are a number of instances where information needs to be committed to the db, but the application can continue on without knowing the result. Is it acceptable to render a page before waiting for a db write to complete?

Yes. For example:
app.get('/', function (req, res, next) {
  // Respond right away...
  res.jsonp({
    message: 'Hello World!'
  });

  // ...then block the single thread forever.
  var i = 0;
  while (true) {
    i++;
  }
});
When a user visits '/', they will see the result immediately. But if only one Node instance is running, the next user to visit '/' won't receive any response, because that single instance is stuck in an infinite loop.
If you have a lot of heavy work to do (for example, CPU-bound work), it's much better to hand it off to a message queue such as MSMQ or an AMQP broker instead of doing all of that work inside the Node instance.

Sure. But how would you notify the user of an error if something did go wrong? Unless you're doing sockets or ajax or something, requests are the standard way.
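For what it's worth, here is a minimal sketch of that pattern: respond first, then let the write finish on its own. The db.collection('events') call and the collection name are illustrative and assume a callback-style MongoDB driver; since nobody is waiting on the write, failures are simply logged on the server, which is also one answer to the error-reporting question above.
app.get('/dashboard', function (req, res) {
  // Render right away - the user does not need to wait for this write.
  res.render('dashboard');

  // Kick off the write afterwards; report failures server-side only.
  db.collection('events').insertOne({ page: 'dashboard', at: new Date() }, function (err) {
    if (err) console.error('audit write failed:', err);
  });
});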

Related

Google Cloud Functions: release resources without responding to the request

I want to release the resources associated with a Node.js request without sending any kind of response to the client.
This might sound weird, but my goal is very simple: over the last few days my servers have been targeted by hackers... I'm trying to improve the defenses, and if I identify a malicious request I want to just DROP IT without sending any response. Making the attacker wait for a connection timeout would give me a little more of an advantage.
I tried:
exports.test = (req, res) => {
  res.end();
};
but in this case the server sends an empty response, which isn't my goal, since I want to make the client wait forever.
I also tried:
exports.test = (req, res) => {
  res.socket.destroy();
};
which throws an exception on Google Cloud Functions.
Does anyone know whether, on GCF, simply returning from the function releases everything, or whether the connection and socket are kept open until the timeout?
exports.test = (req, res) => {
  return; // will Google release all resources, or will the connection and socket be kept until timeout?
};
Cloud Functions does not enable what you're trying to do. The only way it will keep the connection open is if your function times out with no response. You can't instruct it to keep the connection open while also terminating the function. Or, to put it another way, you're going to have to pay the usual Cloud Functions rate for execution-seconds in order to keep that connection open.
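For contrast, on a self-hosted Node/Express server (not Cloud Functions) the second attempt above does work: you can tear down the socket and drop the request without ever writing an HTTP response. A minimal sketch, where looksMalicious is a hypothetical stand-in for whatever detection logic you use:
app.use(function (req, res, next) {
  if (looksMalicious(req)) {
    // Close the TCP connection without sending any HTTP response.
    req.socket.destroy();
    return;
  }
  next();
});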

Node.js: prevent new requests before sending the response to the last request

How can I prevent new requests from being accepted before the response to the last request has been sent? In other words, how can I process only one request at a time?
app.get('/get', function (req, res) {
  // Stop accepting new requests here
  someAsyncFunction(function (result) {
    res.send(result);
    // New requests can enter now
  });
});
Even though I agree with jfriend00 that this might not be the optimal way to do it, if you decide it's the way to go, I would just use some kind of state management to check whether that /get request may be entered, and return a different response if it may not.
You can use your database for this. I'd strongly recommend Redis because it's in-memory and very quick, so it's super convenient; you can use MongoDB or MySQL if you prefer, but Redis would be the best fit. Abstractly, this is how it would look:
Let's say you have an entry in your database called isLoading, and it's set to false by default.
// In-memory stand-in for the isLoading entry; in practice you would read and
// write it in Redis (or MongoDB/MySQL) rather than a local variable.
let isLoading = false;

app.get('/get', function (req, res) {
  // Get isLoading from your state management of choice and check its value
  if (isLoading === true) {
    // If the app is busy, notify the client that it should wait.
    // You can check for this status code in your client and react accordingly.
    return res.status(226).json({ message: "I'm currently being used, hold on" });
  }

  // The code below executes only if isLoading is not true.
  // Set your isLoading variable to true, then proceed with the work.
  isLoading = true;
  someAsyncFunction(function (result) {
    // Only after this is done is isLoading set back to false, so someAsyncFunction can run again.
    isLoading = false;
    return res.send(result);
  });
});
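If you do back the flag with Redis as recommended, a rough sketch with the node-redis v4 client could look like this; the key name "isLoading" and the 30-second expiry are illustrative. SET with NX acquires the flag atomically, and EX keeps a crash from leaving it stuck forever:
const { createClient } = require('redis');
const client = createClient();
client.connect().catch(console.error);

app.get('/get', async function (req, res) {
  // NX: only set the key if it does not already exist; EX: expire it as a safety net.
  const acquired = await client.set('isLoading', '1', { NX: true, EX: 30 });
  if (acquired === null) {
    return res.status(226).json({ message: "I'm currently being used, hold on" });
  }
  someAsyncFunction(async function (result) {
    await client.del('isLoading'); // release the flag when the work is done
    res.send(result);
  });
});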
Hope this helps
Uhhhh, servers are designed to handle multiple requests from multiple users so while one request is being processed with asynchronous operations, other requests can be processed. Without that, they don't scale beyond a few users. That is the design of any server framework for node.js, including Express.
So, whatever problem you're actually trying to solve, that is NOT how you should solve it.
If you have some sort of concurrency issue that is pushing you to ask for this, then please share the ACTUAL concurrency problem you need to solve because it's much better to solve it a different way than to handicap your server into one request at a time.

Does the node.js express framework create a new lightweight process per client connection?

Say the code below is run inside a Node.js Express application, and two different clients, ClientA and ClientB, request the index resource, with ClientA's request arriving before ClientB's. In this case the console will log the value 1 for ClientA and the value 2 for ClientB. My main question is: does each client request get its own lightweight process, with the router being the code shared between those processes, the variables visible to the router but not part of it being the shared heap, and each client getting its own stack? My sub-question: if the answer to my main question is yes, then in this example each of these clients would have to queue waiting for a lock on global_counter before incrementing it, correct?
var global_counter = 0;
router.get('/', function (req, res) {
  global_counter += 1;
  console.log(global_counter);
  res.render('index');
});
Nope. It's a single thread and a single process. Concurrency is accomplished via a work queue; some of the ways to get things into the work queue are setTimeout() and process.nextTick(). Check out http://howtonode.org/understanding-process-next-tick
Only one thing is running at a time, so no need to do any locking.
It takes a while to get your brain to warm up to the idea.
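A small illustration of that (not from the answer itself): each callback runs to completion before the next queued one starts, so the counter never needs a lock.
let counter = 0;

function handleRequest() {
  counter += 1; // nothing can interleave with this synchronous code
  process.nextTick(function () {
    // queued to run only after the current call stack has finished
    console.log('counter is now', counter);
  });
}

handleRequest();
handleRequest();
// Prints "counter is now 2" twice: both increments ran before either queued callback.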

Spawning a Node.js task to run on its own

Sorry if this is a basic question. I'm just starting my 3rd week of doing Node.js programming! I looked around and didn't see an answer to this, specifically. Maybe it's just assumed when answering questions about child_process.spawn/fork by those who know this stuff better than I do.
I have a Node/Express app where I want to take in an HTTP request, save a bit of data to Mongo, return success/error, but...at the same time kick off a process to take some of the data and do a lookup against a web API. I want to save that data back to Mongo, but there's no need to have that communicated back to the HTTP client. (I'll probably log the success/error of that call somewhere.)
How do I kick off that second task so it runs independently of the main request and doesn't cause the response to wait for it to complete?
The 2nd task will also be written in Node.js. I'd like it to just be another function in the same file, if possible.
Thanks in advance!
I don't see why you would need to spawn another process just for that. In Node, unlike some other frameworks, you are not limited to the HTTP request lifecycle to run code. This should do it:
function yourHandler(req, res, next) {
  dataAccess.writeToMongo(someData, function (err, writeResult) {
    var status = err ? 500 : 200;
    // write back to the response already!
    res.status(status);
    res.end();
    // execution does not stop here:
    // kick off the web API call after the response has been sent
    apiClient.doSomething();
  });
}
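Since the question also wants the lookup result saved back to Mongo and its success or failure logged, the tail of that handler could continue along these lines. The callback signature for apiClient.doSomething and the dataAccess.saveLookupResult helper are hypothetical, standing in for whatever the real API client and data layer expose:
apiClient.doSomething(function (err, lookupData) {
  if (err) {
    console.error('background lookup failed:', err);
    return;
  }
  dataAccess.saveLookupResult(lookupData, function (err) {
    // success or failure is only logged; the HTTP client got its response long ago
    if (err) console.error('saving lookup result failed:', err);
  });
});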

How to return a result from a request and then perform an async task without a callback in Node.js

An iOS application performs a request to send messages to users. I want to return the result to the application and, after that, send push notifications to the users, without waiting to find out whether the notifications were pushed successfully or not.
app.post("/message", function(req, res, next) {
User.sendMessages(query, options, function(err, results) {
res.json(results);
sendPushNotifications();
});
});
How can I do this?
That's how it works.
Keep in mind everything that happens in node is in a single thread, unlike other back-end languages you might be used to.
Requests, jobs, everything happens in that single thread. Unless, of course, you use cluster or something like that.
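The snippet in the question already does exactly that: the response goes out first and the pushes happen afterwards. One hedged refinement, assuming sendPushNotifications accepts a completion callback (the original call takes no arguments): since the response has already been sent, push failures can only be logged, never reported back to the caller.
app.post("/message", function (req, res, next) {
  User.sendMessages(query, options, function (err, results) {
    if (err) return next(err); // sending the messages failed - the client still cares about this
    res.json(results);         // reply immediately

    // Fire-and-forget; the callback signature here is assumed, purely for illustration.
    sendPushNotifications(function (pushErr) {
      if (pushErr) console.error('push notifications failed:', pushErr);
    });
  });
});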
