I'm trying to implement an application, and one of the things I need to do is use Server-Sent Events to push data from the server to the client. The basis of SSE is to have a single connection over which data keeps flowing from the server without the connection being closed. The problem I'm having right now is that every time I make an HTTP request from the client using EventSource(), multiple requests are made.
Client:
const eventSource = new EventSource('http://localhost:8000/update?nick=' + username + '&game=' + gameId)
eventSource.onmessage = function(event) {
    const data = JSON.parse(event.data)
    console.log(data)
}
Server (Node.Js):
case '/update':
    res.writeHead(200, {
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache',
        'Connection': 'keep-alive'
    })
    res.write('data: 1')
    res.write('\n\n')
    res.end('{}')
    break
This is what I see in the Chrome dev tools: when the client tries to connect using SSE, it makes multiple requests to the server. However, only one request was supposed to be made.
Do any of you know how to fix this? Thank you in advance.
The way to do this is to not call res.end(), since the connection has to be kept alive. On top of that, I had to keep track of the response objects from the HTTP requests made by the clients, so I created a separate module with the following methods:
let responses = []

module.exports.remember = function(res) {
    responses.push(res)
}

module.exports.forget = function(res) {
    let pos = responses.findIndex((response) => response === res)
    if (pos > -1) {
        responses.splice(pos, 1)
    }
}

module.exports.update = function(data) {
    for (let response of responses) {
        response.write(`data: ${data} \n\n`)
    }
}
This way one can access the response objects and use the update() function to send data to all connected clients.
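For context, here is a minimal sketch of how a server route could use that module; the filename ./responses.js and the broadcast payload are assumptions, not part of the original code. The key points are that the SSE response is registered with remember(), never ended, and forgotten when the client disconnects:

const http = require('http')
const clients = require('./responses') // the module shown above, saved as responses.js

http.createServer((req, res) => {
    if (req.url.startsWith('/update')) {
        res.writeHead(200, {
            'Content-Type': 'text/event-stream',
            'Cache-Control': 'no-cache',
            'Connection': 'keep-alive'
        })
        clients.remember(res)                      // keep the response open; no res.end()
        req.on('close', () => clients.forget(res)) // clean up when the client disconnects
    } else {
        res.writeHead(404)
        res.end()
    }
}).listen(8000)

// Somewhere in the game logic, broadcast to every connected client:
// clients.update(JSON.stringify({ nick: 'example', state: 'example' }))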
I have a client app in React and a server in Node (with Express).
On the server side, I have an endpoint like the following (it's not the real endpoint, just an idea of what I'm doing):
function endpoint(req, res) {
    res.writeHead(200, {
        'Content-Type': 'text/plain',
        'Transfer-Encoding': 'chunked'
    });
    for (let x = 0; x < 1000; x++) {
        res.write(some_string + '\n');
        wait(a_couple_of_seconds); // just to make the process slower for testing purposes
    }
    res.end();
}
This works perfectly; I mean, when I call this endpoint, I receive the whole stream with all 1,000 rows.
The thing is that I cannot manage to get this data chunk by chunk (for each 'write', or a bunch of 'writes') in order to show it on the frontend as soon as I receive it (think of a table that shows the rows as soon as I get them from the endpoint call).
In the frontend I'm using Axios to call the API with the following code:
async function getDataFromStream(_data): Promise<any> {
    const { data, headers } = await Axios({
        url: `http://the.api.url/endpoint`,
        method: 'GET',
        responseType: 'stream',
        timeout: 0,
    });
    // this next line doesn't work. it says that 'on' is not a function
    data.on('data', chunk => console.log('chunk', chunk));
    // data actually has the whole response data (all the rows)
    return Promise.resolve();
}
The thing is that the Axios call returns the whole data object only after res.end() is called on the server, but I need to get the data as soon as the server starts sending the chunks with the rows (on each res.write, or whenever the server decides it is ready to send a bunch of chunks).
I have also tried not using await and reading the value of the promise in the then() of the Axios call, but the behavior is the same: the 'data' value arrives with all the 'writes' together, once the server calls res.end().
So, what am I doing wrong here? Maybe this is not possible with Axios or Node and I should use something like WebSockets to solve it.
Any help would be very much appreciated, because I have read a lot but couldn't get a working solution yet.
For anyone interested in this, what I ended up doing is the following:
On the client side, I used the Axios onDownloadProgress handler, which allows handling progress events for downloads.
So, I implemented something like this:
function getDataFromStream(_data): Promise<any> {
    return Axios({
        url: `http://the.api.url/endpoint`,
        method: 'GET',
        onDownloadProgress: progressEvent => {
            const dataChunk = progressEvent.currentTarget.response;
            // dataChunk contains the data that has been obtained so far (the whole data so far)..
            // So here we do whatever we want with this partial data..
            // In my case I'm storing that on a redux store that is used to
            // render a table, so now, table rows are rendered as soon as
            // they are obtained from the endpoint.
        }
    }).then(({ data }) => Promise.resolve(data));
}
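One caveat worth noting: progressEvent.currentTarget.response always contains everything received so far, not just the newest chunk. Here is a minimal sketch of handling only the new part, assuming the server writes newline-delimited rows and handleRow is a hypothetical callback (e.g. dispatching each row to the Redux store):

function getRowsFromStream(handleRow): Promise<any> {
    let processedUpTo = 0; // index just past the last newline already handled
    return Axios({
        url: `http://the.api.url/endpoint`,
        method: 'GET',
        onDownloadProgress: progressEvent => {
            const full = progressEvent.currentTarget.response; // everything received so far
            const lastNewline = full.lastIndexOf('\n');
            if (lastNewline < processedUpTo) return; // no complete new row yet
            full.slice(processedUpTo, lastNewline)
                .split('\n')
                .filter(Boolean)   // drop empty strings
                .forEach(handleRow);
            processedUpTo = lastNewline + 1;
        }
    }).then(({ data }) => data);
}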
My Node server behaves strangely when it comes to a GET endpoint that replies with a big JSON body (30-35 MB).
I am not using any npm package, just the core API.
The unexpected behaviour only happens when the server is queried from the Internet; it behaves fine when queried from the local network.
The problem is that the server stops writing to the response after the first 1260 bytes of the content body. It does not close the connection or throw an error. Insomnia (the REST client I use for testing) just states that it received a 1260 B chunk. If I query the same endpoint from a local machine, it says that it received more and bigger chunks (a few KB each).
I don't even think the problem is caused by Node, but since I am on a clean Raspberry Pi (I installed Raspbian and then just Node v13.0.1) and the only process I run is node.js, I don't know how to find the source of the problem; there is no load balancer or web server to blame. Also, the public IP seems OK: every other endpoint works fine (they reply with less than 1260 B per request).
The code for that endpoint looks like this
const text = url.parse(req.url, true).query.text;
if (text.length > 4) {
    let results = await models.fullTextSearch(text);
    results = await results.map(async result => {
        result.Data = await models.FindData(result.ProductID, 30);
        return result;
    });
    results = await Promise.all(results);
    results = JSON.stringify(results);
    res.writeHead(200, {
        'Content-Type': 'application/json',
        'Transfer-Encoding': 'chunked',
        'Access-Control-Allow-Origin': '*',
        'Cache-Control': 'max-age=600'
    });
    res.write(results);
    res.end();
    break;
}
res.writeHead(403, {'Content-Type': 'text/plain', 'Access-Control-Allow-Origin': '*'});
res.write("You made an invalid request!");
break;
Here are a number of things to do in order to debug this:
Add console.log(results.length) to make sure the length of the data is what you expect it to be.
Add a callback to res.end(function() { console.log('finished sending response')}) to see if the http library thinks it is done sending the response.
Check the return value from res.write(). If it is false (indicating that not all data has yet been sent), add a handler for the drain event and see if it gets called (see the sketch after this list).
Try increasing the sending timeout with res.setTimeout() in case it's just taking too long to send all the data.
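As an illustration of the third suggestion, here is a minimal sketch (not the original code) of checking the return value of res.write() and waiting for the 'drain' event, combined with the res.end() callback from the second suggestion:

const flushed = res.write(results);
if (!flushed) {
    console.log('socket buffer is full, waiting for drain...');
    res.once('drain', function() {
        console.log('buffered data has been flushed to the socket');
        res.end(function() { console.log('finished sending response'); });
    });
} else {
    res.end(function() { console.log('finished sending response'); });
}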
I'm using Express, and the POST request looks like this:
router.post('/', function(req, res, next) {
    var data = req.body;
    getRandom(data, function(value) {
        res.json({value: value});
    });
});
The POST is sent through AJAX, and the textarea is then updated with the new data.
$.ajax({
    type: "POST",
    url: "/",
    data: JSON.stringify(datareq),
    dataType: 'json',
    contentType: 'application/json',
    success: function(x) {
        $.each(x, function(index, value) {
            $('.textarea').append(value + '\n');
        });
    },
    error: function(x) {
        console.log(x + 'error');
    }
});
How can I send this using one POST and several responses? The user should receive one piece of data in the textarea when the callback finishes, then another piece, and so on until the end.
<textarea>
data 1 - 1sec
data 2 - 2sec leater
data 3 - 3 second later
...
</textarea>
I added the times (1 sec ...) only to show that the callback has a lot of work to do before it can send the next piece of data.
Of course this doesn't work, because res.send() closes the connection and I get an error.
So how can I achieve my idea of sending data continuously after the POST request? I want to give the user data very quickly, then another piece when it is ready, instead of waiting for everything before sending the response.
You can't.
Reason:
HTTP closes the connection after the response has been sent. You cannot keep it open and send multiple separate responses to one request; the plain request/response model doesn't support it.
Solution 1:
Simply put a timer on the client side and poll the server periodically.
Solution 2 (Recommended):
Use a socket and pass data through it. socket.io is a socket library for Node.js applications and is very easy to use: set up a connection, keep sending data from the server, and receive it on the client side.
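To make Solution 2 concrete, here is a minimal sketch assuming socket.io v4 on top of the existing Express app, and assuming getRandom from the question calls its callback each time another piece of data is ready; names and the port are illustrative only:

// server.js
const express = require('express');
const http = require('http');
const { Server } = require('socket.io');

const app = express();
const server = http.createServer(app);
const io = new Server(server);

app.use(express.json());

app.post('/', function(req, res) {
    // Acknowledge the POST right away; the actual data arrives over the socket.
    res.json({ status: 'started' });
    getRandom(req.body, function(value) {
        // Broadcasts to all clients; a real app would emit only to the requesting
        // socket (e.g. by sending the socket id along with the POST body).
        io.emit('partial-result', value);
    });
});

server.listen(3000);

On the client, each piece is appended to the textarea as it arrives:

// client side (socket.io client script served at /socket.io/socket.io.js)
var socket = io();
socket.on('partial-result', function(value) {
    $('.textarea').append(value + '\n');
});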
Just to add to the answer: this answer explains why res.send closes the connection.
I am using Node.js with the https://www.npmjs.com/package/elasticsearch package.
The use case is like this: when a link is clicked on the page, I make a request to the Node.js server, which in turn uses the ES node package to fetch the data from the ES server and sends it back to the client.
The issue is that when two requests are made in quick succession (two links clicked in a short span), the response of the first request and then the response of the second request both reach the client. The UI depends on this response, and I would like to show only the second request's response.
So, the question is: is there any way to cancel the previous request made to the ES server before starting a new one?
Code:
ES Client:
var elasticsearch = require('elasticsearch');
var client = new elasticsearch.Client({
    host: 'HostName',
    log: 'trace'
});
Route:
app.get('/data/:reportName', dataController.getReportData);
DataController:
function getReportData(req, res) {
    const report = req.params.reportName;
    const query = getQueryForReport(report);
    client.search(query)
        .then(function(response) {
            res.json(parseResponse(response));
        });
}
So, the same API /data/:reportName is called twice in succession with different report names. I would like to send only the second report's data back and cancel out the first request.
If you're only concerned about the UX, rather than the load on your ES server, then aborting the AJAX request is what you want.
Since you didn't post your client-side code, I'll give you a generic example:
var xhr = $.ajax({
    type: "GET",
    url: "searching_route",
    data: "name=John&location=Boston",
    success: function(msg) {
        alert("Data Saved: " + msg);
    }
});

//kill the request
xhr.abort();
Remember that aborting the request may not prevent the Elasticsearch query from being processed, but it will prevent the client from receiving the data.
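Applied to the report links from the question, here is a minimal sketch (the URL pattern and handler names are assumptions) of keeping a reference to the in-flight request and aborting it before firing the next one:

// Track the request that is currently in flight.
var pendingReportRequest = null;

function loadReport(reportName) {
    // Cancel the previous request (if any) so its late response never reaches the UI.
    if (pendingReportRequest) {
        pendingReportRequest.abort();
    }
    pendingReportRequest = $.ajax({
        type: "GET",
        url: "/data/" + reportName,
        success: function(reportData) {
            pendingReportRequest = null;
            renderReport(reportData); // hypothetical UI update function
        },
        error: function(xhr, textStatus) {
            // 'abort' is expected when we cancel on purpose; ignore it.
            if (textStatus !== 'abort') {
                console.log('request failed: ' + textStatus);
            }
        }
    });
}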
I am having an issue getting the result set back to the client using Node.js. I am new to it and using it for a project, but I am stuck and not sure why. Here's the situation: I have a webpage, a server, a request handler, and a database interface. I am able to send data back and forth between the client and server without any issue. The only time it doesn't work is when I try to send the result from my query back to the client.
function doSomething(response)
{
    var data = {
        'name': 'doSomething'
    };
    response.writeHead(200, {'Content-Type': 'text/html', 'Access-Control-Allow-Origin': '*'});
    response.end(JSON.stringify(data));
}
This works fine as I can read the name from the object on the client side, but
function fetchAllIDs(response)
{
    dbInterface.fetchAllIDs(function(data) {
        // console.log(data) prints the correct information here
        response.writeHead(200, {'Content-Type': 'text/html', 'Access-Control-Allow-Origin': '*'});
        response.end(data);
        // console.log(data) from the client side is just blank
    });
}
I believe the issue is the way I handle my callback and response, because the rest of my code works fine when MySQL isn't involved. Thanks!
EDIT: I removed a piece of code that seems to confuse people. It was just to show that if I have the response code outside the callback, then I am able to get any data back to the client. In my actual code, I do not have the two response statements together. I just can't get the rows from the fetchAllIDs function back to the client.
The way you have it written means the
response.end('This string is received by the client');
line is always called before the callback function, meaning the response has already ended.
Under JS, all the current code will finish before the next event (your callback function) is taken off the queue. Comment out the above line with // and test it.
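For completeness, a minimal sketch (not the original dbInterface code) of the pattern the answer describes: the response is only ended inside the callback, and since the query result is typically an array of rows, it is serialized first (the application/json content type is an assumption):

function fetchAllIDs(response) {
    dbInterface.fetchAllIDs(function(data) {
        // Write the response only here, inside the callback,
        // after the database has actually returned the rows.
        response.writeHead(200, {
            'Content-Type': 'application/json',
            'Access-Control-Allow-Origin': '*'
        });
        // response.end() expects a string or Buffer, so serialize the rows first.
        response.end(JSON.stringify(data));
    });
    // No response.end() out here: this line would run before the callback fires.
}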