I am using Node.js with the elasticsearch package (https://www.npmjs.com/package/elasticsearch).
The use case is this: when a link is clicked on the page, I make a request to the Node.js server, which in turn uses the elasticsearch package to fetch the data from the ES server and sends it back to the client.
The issue is that when two requests are made in quick succession (two links clicked in a short span), the client receives the response to the first request and then the response to the second. The UI depends on this response, and I would like to show only the second request's response.
So, the question is: is there any way to cancel the previous request made to the ES server before starting a new one?
Code:
ES Client:
var elasticsearch = require('elasticsearch');

var client = new elasticsearch.Client({
    host: 'HostName',
    log: 'trace'
});
Route:
app.get('/data/:reportName', dataController.getReportData);
DataController:
function getReportData(req, res) {
    var query = getQueryForReport(req.params.reportName);
    client.search(query)
        .then(function (response) {
            res.json(parseResponse(response));
        });
}
So, the same API /data/:reportName is called twice in succession with different report names. I would like to send back only the second report's data and cancel out the first request.
If you're only concerned about the UX, rather than the load on your ES cluster, then aborting the AJAX request is what you want.
Since you didn't post your client side code, I'll give you a generic example:
var xhr = $.ajax({
    type: "GET",
    url: "searching_route",
    data: "name=John&location=Boston",
    success: function (msg) {
        alert("Data Saved: " + msg);
    }
});

// kill the request
xhr.abort();
Remember that aborting the request may not prevent the Elasticsearch query from being processed; it only prevents the client from receiving the data.
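To avoid sprinkling abort calls around, the "only show the newest response" rule can also be enforced with a small framework-agnostic guard (a sketch, not tied to jQuery or the names in your code): each call gets a sequence number, and any response belonging to an older call is dropped before it reaches the UI.

```javascript
// "Latest request wins" guard: stamp each request with a sequence
// number and ignore responses from anything but the newest request.
let latestSeq = 0;

function makeLatestOnly(fetcher, onData) {
    return function (arg) {
        const seq = ++latestSeq;                 // stamp this request
        return fetcher(arg).then(function (data) {
            if (seq === latestSeq) onData(data); // drop stale responses
        });
    };
}
```

Wrapping the report fetch with makeLatestOnly guarantees that even when the first response arrives last, only the newest report reaches the UI.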
I'm running a small server that needs to receive web forms. The server checks the request and sends back "success" or "fail", which is then displayed on the form (client screen).
Now, checking the form may take a few seconds, so the user may be tempted to send the form again.
What is the correct way to ignore the second request?
So far I have come up with these solutions for when the form is a duplicate of the previous one:
1. Don't check it, and send some error status back (like 429, or 102, or some other one).
2. Close the connection directly with req.destroy(); res.destroy();.
3. Ignore the request and return from the requestListener function.
With solutions 1 and 2, the form (in the client's browser) displays an error message (even though the first request they sent was correct, and so were the duplicates), so they're not good options.
Solution 3 gives the desired outcome... but I'm not sure it's the right way to go about it: basically leaving req and res untouched instead of destroying them. Could this cause issues, or slow down the server (like... do they stack up?)? Of course the first request, once it has been checked, will get a response with the outcome code. My concern is with the duplicate requests, which I neither destroy nor answer...
Some details on the setup: a Node.js application using the default code from the http module.
const http = require("http");

const requestListener = function (req, res) {
    var requestBody = '';
    req.on('data', (data) => {
        requestBody += data;
    });
    req.on('end', () => {
        if (isduplicate(requestBody))
            return;
        else
            evalRequest(requestBody, res);
    });
};

http.createServer(requestListener).listen(8080); // port is arbitrary here
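For completeness, one way the isduplicate check referenced above might be sketched (a minimal in-memory version, assuming an exact comparison of the raw body is good enough; a real server would also expire old entries):

```javascript
// Remember every body we've already seen; an exact repeat is a duplicate.
// Note: this set grows forever -- real code would evict old entries.
const seenBodies = new Set();

function isduplicate(requestBody) {
    if (seenBodies.has(requestBody)) return true;
    seenBodies.add(requestBody);
    return false;
}
```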
I'm trying to implement an application, and one of the things I need to do is use Server-Sent Events to send data from the server to the client. The basis of SSE is a single connection that stays open while the server keeps pushing data to the client. The problem I'm having right now is that every time the client connects using EventSource(), multiple requests are made instead of one.
Client:
const eventSource = new EventSource('http://localhost:8000/update?nick=' + username + '&game=' + gameId)

eventSource.onmessage = function (event) {
    const data = JSON.parse(event.data)
    console.log(data)
}
Server (Node.Js):
case '/update':
    res.writeHead(200, {
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache',
        'Connection': 'keep-alive'
    })
    res.write('data: 1')
    res.write('\n\n')
    res.end('{}')
    break
This is what I see in the Chrome dev tools: when the client tries to connect using SSE, it makes multiple requests to the server, even though only one request was supposed to be made.
Does anyone know how to fix this? Thank you in advance.
The way to do that is to not call res.end(), since the connection has to be kept alive: ending the response makes EventSource treat the stream as closed, so it reconnects, which is why you see the repeated requests. On top of that, I had to keep track of the response objects of the HTTP requests made by the users, so I created a separate module with the following methods:
let responses = []

module.exports.remember = function (res) {
    responses.push(res)
}

module.exports.forget = function (res) {
    let pos = responses.findIndex((response) => response === res)
    if (pos > -1) {
        responses.splice(pos, 1)
    }
}

module.exports.update = function (data) {
    for (let response of responses) {
        response.write(`data: ${data}\n\n`)
    }
}
This way one can access the response objects and use the function update() to send data to the connected clients.
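To see the broadcast behavior without a browser, the same remember/forget/update logic can be exercised with stub response objects (the functions below mirror the module above, inlined so the example is self-contained; the stubs just record what they are sent):

```javascript
// Same tracking logic as the module above, inlined for a quick check.
const responses = [];
function remember(res) { responses.push(res); }
function forget(res) {
    const pos = responses.indexOf(res);
    if (pos > -1) responses.splice(pos, 1);
}
function update(data) {
    for (const res of responses) res.write(`data: ${data}\n\n`);
}

// Stub clients that record each SSE chunk instead of sending it.
function fakeClient() {
    return { sent: [], write(chunk) { this.sent.push(chunk); } };
}

const a = fakeClient(), b = fakeClient();
remember(a);
remember(b);
update('first');   // both connected clients receive this
forget(a);
update('second');  // only b is still connected
```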
I'm trying to build a real-time app where users can set a marker on a Google Map, and everyone who is connected gets that same marker. Everything seems to work fine, except that after a few minutes the server side submits the data a second time.
To clarify: a client sets a marker on the map, and the marker is sent to the server (running Node.js with Express) in JSON format. The server relays the data to all connected clients. Minutes later, the server sends the same data it received once more, causing an "ERR_EMPTY_RESPONSE" client-side on the last line of the "Client code" example.
Client code:
var data = new Array();
data.push({lat: Gmap.markers[0].lat, lng: Gmap.markers[0].lng});
var xhttp = new XMLHttpRequest();
xhttp.open("POST", "/marker", true);
xhttp.setRequestHeader('Content-type', 'application/json; charset=UTF-8');
xhttp.send(JSON.stringify(data));
Server-side:
var app = express();

app.post('/marker', function (req, res) {
    io.emit('marker', req.body);
})
Anyone have any idea of whats going on?
You need to send a response to the HTTP request. If you don't, the browser will eventually time the request out and may attempt to retry it.
var app = express();

app.post('/marker', function (req, res) {
    io.emit('marker', req.body);
    res.send("ok"); // <== send a response to the http request here
})
I'm using Express, and the POST request looks like this:
router.post('/', function (req, res, next) {
    var data = req.body;
    getRandom(data, function (value) {
        res.json({value: value});
    });
});
The POST is sent through AJAX, and the response is used to update a textarea with the new data:
$.ajax({
    type: "POST",
    url: "/",
    data: JSON.stringify(datareq),
    dataType: 'json',
    contentType: 'application/json',
    success: function (x) {
        $.each(x, function (index, value) {
            $('.textarea').append(value + '\n');
        });
    },
    error: function (x) {
        console.log(x + 'error');
    }
});
How can I send one POST and receive several responses, so that the user sees one piece of data in the textarea as soon as the callback finishes it, then another, and so on until the end?
<textarea>
data 1 - 1sec
data 2 - 2sec leater
data 3 - 3 second later
...
</textarea>
I added the times (1 sec, ...) only to show that the callback has a lot of work to do between pieces.
Of course this doesn't work as written, because res.send() closes the connection and I get an error.
So how can I achieve this idea of sending data continuously after the POST request? I want to give the user each piece of data as soon as it's ready, instead of waiting for everything before sending a single response.
You can't.
Reason:
HTTP closes the connection after the response is sent. You cannot keep it open and send multiple responses to the client; HTTP doesn't support that.
Solution 1:
Put a timer on the client side and poll the server periodically.
Solution 2 (recommended):
Use a socket and pass data through it. socket.io is a socket library for Node.js applications, and it is very easy to use: set up a connection, keep sending data from the server, and receive it on the client side.
Just to add to the answer: this answer explains why res.send closes the connection.
When saving a model to a Node.js endpoint, I'm not getting a success or error response every time, particularly on the first save and then sometimes on later attempts. The Node.js server sends a success response every time, and if I use a Chrome REST client it works every time.
var mailchimpModel = new MailchimpModel();

var data = {
    "email": $('#email').val()
};

mailchimpModel.save(data, {
    success: function (model, response) {
        console.log("success");
        console.log(response);
    },
    error: function (model, response) {
        console.log("error");
    }
});
What I have found is that the Node.js server receives two requests when it fails:
OPTIONS /api/mailchimp 200
POST /api/mailchimp 200
and I only get a success response if I submit the request again straight afterwards.
It's possible your model is failing client-side validation. To check, try:
console.log(mailchimpModel.save(data));
If the value is false, then your model is failing client-side validation (usually defined in a validate function on the model). You can check the errors with:
console.log(mailchimpModel.validationError);
OK, found that I need to handle the OPTIONS method on the server; using the solution in this post worked for me:
https://stackoverflow.com/a/13148080/10644
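For reference, handling the CORS preflight with Node's plain http module might look like the sketch below (the header values are placeholders; tighten the allowed origin for a real app):

```javascript
// Answer CORS preflight (OPTIONS) requests before the real handler runs.
// Returns true when the request was a preflight and has been answered.
function handlePreflight(req, res) {
    if (req.method !== 'OPTIONS') return false;
    res.writeHead(204, {
        'Access-Control-Allow-Origin': '*', // placeholder: restrict in production
        'Access-Control-Allow-Methods': 'POST, OPTIONS',
        'Access-Control-Allow-Headers': 'Content-Type'
    });
    res.end();
    return true;
}
```

The browser sends the OPTIONS request automatically before a cross-origin POST with a JSON content type; once it gets a successful preflight response, it follows up with the actual POST.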