How do I prevent new requests from being accepted before the response to the last request has been sent? In other words, I want to process only one request at a time.
app.get('/get', function (req, res) {
    // Stop accepting new requests here
    someAsyncFunction(function (result) {
        res.send(result);
        // New requests can enter now
    });
});
Even though I agree with jfriend00 that this might not be the optimal way to do this, if you see that it's the way to go, I would just use some kind of state management to check whether it's allowed to access that /get request, and return a different response if it's not.
You can use your database to do this. I strongly recommend Redis for it because it's in-memory and really quick, so it's super convenient. You can use MongoDB or MySQL if you prefer, but Redis would be the best choice. This is how it would look, abstractly:
Let's say you have an entry in your database called isLoading, and it's set to false by default.
app.get('/get', function (req, res) {
    // Get isLoading from your state management of choice and check its value
    if (isLoading === true) {
        // If the app is busy, notify the client that it should wait.
        // You can check for the status code in your client and react accordingly.
        return res.status(226).json({ message: "I'm currently being used, hold on" });
    }
    // Code below executes only if isLoading is not true.
    // Set your isLoading DB variable to true, then proceed to do the work.
    isLoading = true;
    someAsyncFunction(function (result) {
        // Only after this is done is isLoading set back to false,
        // so someAsyncFunction can be run again
        isLoading = false;
        return res.send(result);
    });
});
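If you do go with Redis, a minimal sketch of the same idea could look like the following, assuming the node-redis client (v4+) and a key called isLoading; the key name and client setup are just placeholders for illustration:

const express = require('express');
const { createClient } = require('redis');

const app = express();
const client = createClient(); // assumes a local Redis instance

app.get('/get', async function (req, res) {
    // Read the flag from Redis instead of an in-process variable
    const isLoading = await client.get('isLoading');
    if (isLoading === 'true') {
        return res.status(226).json({ message: "I'm currently being used, hold on" });
    }
    await client.set('isLoading', 'true');
    someAsyncFunction(async function (result) {
        // Clear the flag once the work is finished
        await client.set('isLoading', 'false');
        res.send(result);
    });
});

client.connect().then(() => app.listen(3000));

Note that a separate GET followed by a SET is not atomic, so under real concurrency you would want to claim the flag in one step, for example with SET and the NX option.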
Hope this helps
Uhhhh, servers are designed to handle multiple requests from multiple users, so while one request is being processed with asynchronous operations, other requests can be processed. Without that, they wouldn't scale beyond a few users. That is the design of any server framework for node.js, including Express.
So, whatever problem you're actually trying to solve, that is NOT how you should solve it.
If you have some sort of concurrency issue that is pushing you to ask for this, then please share the ACTUAL concurrency problem you need to solve because it's much better to solve it a different way than to handicap your server into one request at a time.
I would like to implement a system that allows users to add each other as friends and share data between them. I have gotten the authentication done and am currently researching ways to do this in real time. This project of mine is purely a learning experience, so I am looking for many ways to perform this task to grow my knowledge.
I have experience using WebSockets on a previous project and they were easy to use. WebSockets seem like the best solution to my problem, as they allow the user to send and receive invites through the open socket. However, I have also learnt that the downside would be a long open socket connection that might be potentially performance taxing(?). Since I'm only sending/receiving information when an invite is sent/received, WebSockets might be overkill for such a simple function.
At the same time I would like to learn about new technologies, and I found out about Server-Sent Events, which would be less performance heavy(?). Using SSE would be much more efficient, as it only sends HTTP requests to the clients/server whenever the user sends an invite.
Please correct me if I'm wrong about what I typed out above, as this is what I gathered through my reading online. So now I'm having a hard time understanding whether SSE is better than WebSockets for my project. If there are other technologies, please do let me know too! Thank you
How are you doing?
The best advice would be to always use WebSockets in this context, because your project can grow and end up needing some feature that is better served by WebSockets.
But you have other options, and one of them is Firebase. Yes, FIREBASE!
You can build a nice reactive application with Firebase, because its observers update data in real time, just like WebSockets do.
But here are some pros and cons.
WebSocket: it can make your project scalable, it's more complete, and you can use it in any context. BUT: it is harder to implement and takes more time to learn and understand.
Firebase: easy and fast to implement, you can build a chat in 20 minutes, and it would surely help with your problem. There is Firestore and the Realtime Database, and even Firestore updates in real time. BUT: Firebase costs can get expensive in a big project, so I don't think it's a good option for one.
That's it, the best options for a real-time data application, in my opinion.
A little bit more about Firebase vs WebSocket:
https://ably.com/compare/firebase-vs-socketio
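If you go the WebSocket route, a minimal sketch using the ws package could look like this; the registration message, the invite payload shape, and the port are made up purely for illustration:

const WebSocket = require('ws');

// One WebSocket server that all logged-in clients connect to
const wss = new WebSocket.Server({ port: 8080 });

// Track connected users so invites can be pushed to the right socket
const clientsByUserId = new Map();

wss.on('connection', function (ws) {
    ws.on('message', function (raw) {
        const msg = JSON.parse(raw);
        if (msg.type === 'register') {
            // Client identifies itself after connecting (hypothetical protocol)
            clientsByUserId.set(msg.userId, ws);
        } else if (msg.type === 'friend-invite') {
            // Forward the invite to the target user if they are online
            const target = clientsByUserId.get(msg.targetUserId);
            if (target && target.readyState === WebSocket.OPEN) {
                target.send(JSON.stringify({ type: 'friend-invite', from: msg.userId }));
            }
        }
    });
});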
To send a friend invitation, you just send an API request; WebSocket is for real-time communication. From React, get the email and send it to the server:
import axios from "axios";

export const sendFriendInvitation = async (data) => {
    try {
        return await axios.post("/friend-invitation", data);
    } catch (exception) {
        console.error(exception);
    }
};
On the Node.js side, write a controller to handle this request:
const invitationRequest = async (req, res) => {
    // get the email of the user we want to invite
    const { targetMail } = req.body;

    // write code here to handle the case where someone sends an invitation to himself

    // get the details of the user whose email was submitted (the user to invite)
    const targetUser = await User.findOne({
        mail: targetMail.toLowerCase(),
    });

    if (!targetUser) {
        return res
            .status(404)
            .send("send error message");
    }

    // you should have an Invitations model
    // check if an invitation has already been sent
    // check if the user we would like to invite is already our friend

    // now create a new invitation
    // if the invitation has been successfully created, update the user's friends

    return res.status(201).send("Invitation has been sent");
};
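The commented-out steps could be filled in with small helpers along these lines; the Invitation model, its path, and the senderId/receiverId fields are hypothetical and would need to match your own schema:

// Hypothetical Mongoose model; adjust the path and field names to your schema
const Invitation = require("../models/Invitation");

// Returns a truthy value if senderId has already invited receiverId
const invitationAlreadySent = async (senderId, receiverId) => {
    return Invitation.exists({ senderId, receiverId });
};

// Creates the invitation document
const createInvitation = async (senderId, receiverId) => {
    return Invitation.create({ senderId, receiverId });
};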
I want to release the resources associated with a Node.js request without sending any kind of response to the client.
This might sound weird, but my goal is very simple: over the last few days my servers have been targeted by hackers. I'm trying to improve the defenses, and if I identify a malicious request I could just DROP IT without sending any response; making the attacker wait for the connection timeout would give me a little more of an advantage.
I tried:
exports.test = (req, res) => {
res.end();
};
but in this case the server sends an empty response, which isn't my goal since I want to make the client wait forever.
I also tried:
exports.test = (req, res) => {
res.socket.destroy();
};
which throws an exception on Google Cloud Functions.
Does anyone know whether, on GCF, if I simply return from the function the resources will be released, or whether the connection will be kept hanging?
exports.test = (req, res) => {
  return; // will Google release all resources, or will the connection and socket be kept until timeout?
};
Cloud Functions does not enable what you're trying to do. The only way it will keep the connection open is if your function times out with no response. You can't instruct it to keep the connection open while also terminating the function. Or, to put it another way, you're going to have to pay the usual Cloud Functions rate for execution-seconds in order to keep that connection open.
I have a Node.js / background-process issue that I don't know how to solve in an 'elegant', straightforward, correct way.
The user submits some (~10 or more) URLs via a textarea, and then they should be processed asynchronously. [A screenshot has to be taken with Puppeteer, some information gathered, the screenshot processed with sharp, and the result persisted in MongoDB: the screenshot via GridFS and the URL in its own collection with a reference to the screenshot.]
While this async process runs in the background, the page should be updated whenever a URL has been processed.
There are so many ways to do that, but which one is the most correct/straightforward/resource-saving way?
Browserify, and I do it in the browser? No, too much stuff on the client side. AJAX/Axios posts that wait for the URLs to be processed and reflect the results on the page? Trigger the process before the response gets sent back to the client, or let the client start the processing?
So, I made a workflow engine of sorts that supports long-running jobs, and I followed this tutorial: https://farazdagi.com/2014/rest-and-long-running-jobs/
The gist of it: when a request comes in, you just return a status code immediately, and when the job completes you log the result somewhere and use that.
For this I used an EventEmitter inside a promise chain. It's only my solution, maybe not elegant, maybe outright wrong. I made a little POC for you.
const express = require('express');
const events = require('events');

const app = express();
const emitter = new events.EventEmitter();

const actualWork = function () {
    // stand-in for the real long-running job
    return new Promise((res, rej) => {
        setTimeout(res, 1000);
    });
};

emitter.on('workCompleted', function (payload) {
    // log the completed job somewhere, e.g. keyed by payload.id
});

app.get('/someroute', (req, res) => {
    // respond immediately, then let the job run in the background
    res.json({ msg: 'request initiated', id: 'some_id' });
    actualWork()
        .then(() => {
            emitter.emit('workCompleted', { id: 'some_id' });
        });
});

app.get('/someroute/:id/status', (req, res) => {
    // get the log for req.params.id and return it
});
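The two commented handlers above could be filled in with a simple in-memory job store; the Map and the 'pending'/'done' states below are just illustrative, and in a real setup you would persist this in Redis or a database:

// Hypothetical in-memory job store keyed by job id
const jobs = new Map();

emitter.on('workCompleted', function (payload) {
    // mark the job as finished so the status route can report it
    jobs.set(payload.id, { status: 'done' });
});

app.get('/someroute/:id/status', (req, res) => {
    const job = jobs.get(req.params.id) || { status: 'pending' };
    res.json(job);
});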
Sorry if this is a basic question. I'm just starting my 3rd week of doing Node.js programming! I looked around and didn't see an answer to this, specifically. Maybe it's just assumed when answering questions about child_process.spawn/fork by those who know this stuff better than I do.
I have a Node/Express app where I want to take in an HTTP request, save a bit of data to Mongo, return success/error, but...at the same time kick off a process to take some of the data and do a lookup against a web API. I want to save that data back to Mongo, but there's no need to have that communicated back to the HTTP client. (I'll probably log the success/error of that call somewhere.)
How do I kick off that 2nd task so it runs independently of the main request and doesn't cause the response to wait for it to complete?
The 2nd task will also be written in Node.js. I'd like it to just be another function in the same file, if possible.
Thanks in advance!
I don't see why you would need to spawn another process just for that. In Node you are not limited to the HTTP request lifecycle to run work, unlike some other frameworks. This should do it:
function yourHandler(req, res, next) {
    // note: the callback's second argument must not be named `res`,
    // or it will shadow the Express response object
    dataAccess.writeToMongo(someData, function (err, result) {
        var status = err ? 500 : 200;
        // write back to the response already!
        res.status(status);
        res.end();
        // the handler does not completely terminate yet,
        // so kick off the web API call here
        apiClient.doSomething();
    });
}
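Since the question mentions logging the success or failure of that background call, here is the same pattern with the result handled, assuming apiClient.doSomething() returns a promise (apiClient itself is a placeholder from the snippet above):

function yourHandler(req, res, next) {
    dataAccess.writeToMongo(someData, function (err, result) {
        res.status(err ? 500 : 200).end();

        // fire-and-forget: the client never waits on this
        apiClient.doSomething()
            .then(function (apiResult) {
                // persist apiResult back to Mongo and/or log success here
                console.log('background lookup finished');
            })
            .catch(function (apiErr) {
                // the response has already been sent, so just log the failure
                console.error('background lookup failed', apiErr);
            });
    });
}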
Full disclosure: I'm very new to the totally asynchronous model.
In my application there are a number of instances where information needs to be committed to the db, but the application can continue on without knowing the result. Is it acceptable to render a page before waiting for a db write to complete?
Yes. For example:
app.get('/', function(req, res, next) {
    res.jsonp({
        message: 'Hello World!'
    });
    // deliberately block the event loop forever to illustrate the point
    var i = 0;
    while (true) {
        i++;
    }
});
When a user visits '/', he will see the result immediately. But if there is only one Node instance running, when another user visits '/', he won't receive any response, as the only instance is stuck in an infinite loop.
If you have a lot of heavy work to do (for example, CPU-bound work), it's much better to use a message queue such as MSMQ or AMQP instead of having all the work done in the Node instance.
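For the message-queue route, a minimal sketch using AMQP via the amqplib package could look like this; the queue name and connection URL are placeholders:

const amqp = require('amqplib');

// Producer: the web process pushes heavy jobs onto a queue instead of doing them inline
async function enqueueJob(payload) {
    const conn = await amqp.connect('amqp://localhost'); // placeholder URL
    const channel = await conn.createChannel();
    await channel.assertQueue('heavy-jobs', { durable: true });
    channel.sendToQueue('heavy-jobs', Buffer.from(JSON.stringify(payload)), { persistent: true });
    await channel.close();
    await conn.close();
}

// Consumer: a separate worker process does the CPU-bound work off the request path
async function startWorker() {
    const conn = await amqp.connect('amqp://localhost');
    const channel = await conn.createChannel();
    await channel.assertQueue('heavy-jobs', { durable: true });
    channel.consume('heavy-jobs', (msg) => {
        const job = JSON.parse(msg.content.toString());
        // ...do the heavy work here...
        channel.ack(msg);
    });
}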
Sure. But how would you notify the user of an error if something did go wrong? Unless you're doing sockets or ajax or something, requests are the standard way.