I'm currently looking to set up an endpoint that accepts a request and returns the response data in increments as they load.
The application of this is that, given one upload of data, I would like to calculate a number of different metrics for that data. As each metric is calculated asynchronously, I want to return its value to the front-end to render.
For testing, my controller looks as follows, using res.write:
uploadData = (req, res) => {
  res.write("test");
  setTimeout(() => {
    res.write("test 2");
    res.end();
  }, 3000);
}
However, I think the issue stems from my client side, which I'm writing in React-Redux and which calls that route through an axios call. From my understanding, the axios request resolves once it receives the first response, and the connection doesn't stay open. Here is what my axios call looks like:
axios.post('/api', data)
  .then((response) => {
    console.log(response);
  })
  .catch((error) => {
    console.log(error);
  });
Is there an easy way to do this? I've also thought about streaming; however, my concern with streaming is that I would like each connection to be direct and unique between clients, and open only for a short amount of time (i.e. only while the metrics are being calculated).
I should also mention that the resource being uploaded is a database, and I would like to avoid parsing it and opening a connection multiple times as a result of having multiple endpoints.
Thanks in advance, and please let me know if I can provide any more context
One way to handle this while still using a traditional API would be to store the metrics in an object somewhere, either a database or Redis for example, and then long poll the resource.
For a real-world example, say you want to calculate the following metrics for foo: time completed, length of request, bar, foobar.
You could create an object in storage that looks like this:
{
  id: 1,
  lengthOfRequest: 123,
  .....
}
Then you would create an endpoint in your API, say metrics/{id}, that returns the object. Just keep calling the route until everything completes.
There are some obvious drawbacks to this, of course, but once you have enough information to know how long the metrics take to complete on average, you can tweak the time between the calls to your API.
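For illustration, here is a minimal client-side polling sketch; the metrics/{id} endpoint, the complete flag, and the 2-second interval are assumptions for the example, not part of your API:

function pollMetrics(id, onUpdate, intervalMs = 2000) {
  const timer = setInterval(() => {
    axios.get(`/api/metrics/${id}`)
      .then(({ data }) => {
        onUpdate(data); // render whatever metrics are filled in so far
        if (data.complete) clearInterval(timer); // stop once everything is calculated
      })
      .catch(() => clearInterval(timer)); // stop polling on error
  }, intervalMs);
}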
I have run into an unforeseen problem with my socket.io setup.
I use socket.io to live load data from my database (MongoDB, Node.js, React).
To accomplish this, I use MongoDB's change stream to detect changes and then push them to the front-end via socket.io.
Now this works perfectly as long as the user is connected. Right now, when the user reconnects, it just reloads all the data. While this is fine for most users, there is a small group with very bad network connections, so the front-end is reloading data all the time, which makes it unresponsive for some time.
So, I am looking for a way to only send events that occurred while the front-end was offline. The front-end can handle this quite easily: https://socket.io/docs/v4/client-offline-behavior/
But it doesn't seem possible to do this on the server side, since socket.io (server side) immediately forgets sockets that have disconnected and thus can't buffer events.
So, I was wondering if there is a good way to do this? Or would this need a full "wrapper" around socket.io that caches disconnected sockets?
Any help or advice would be appreciated!
I find this a really interesting and painful problem! ^^'
If you can give more details, it may help people give you a better answer.
For instance:
How much data is stored in the database, how much does a typical user receive, and how many events are triggered in a given time frame?
How quickly should an event become visible? I mean, if users receive an event with a 10s or 30s delay, does it harm the service you provide?
How is your data structured? Is it a simple JSON array with the same fields, custom fields, dynamic JSON objects, etc.?
How is your React app structured? Do you run heavy logic when your data is updated, etc.?
I think you should put more controls in your front-end code and update only when there is new data.
Some paths to explore
1. Put more controls in your front end
As you stated, for users with a bad connection, the React client seems to update its state too quickly, reloading data after the websocket reconnects, again and again. The UI may indeed freeze in this case.
For this, I can think of two approaches:
Before updating the state, check whether React's current state is the same as the data you receive from the websocket connection. If the reconnection is quick enough and no new data has arrived, it should be the same; in that case, do not update the React state (see the sketch right below).
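A rough sketch of this check, reusing the setEvents state setter from the pseudo code further down, and assuming each event carries a unique id field (an assumption, not something stated in the question):

// Sketch: skip the state update when nothing new arrived while offline.
websocket.on("connected", (newEvents) => {
  setEvents((prevState) => {
    const knownIds = new Set(prevState.datas.map((e) => e.id)); // `id` is assumed
    const fresh = newEvents.filter((e) => !knownIds.has(e.id));
    // Returning the same state object lets React skip the re-render.
    return fresh.length === 0
      ? prevState
      : { datas: prevState.datas.concat(fresh) };
  });
});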
If too many events are triggered and new data arrives after each reconnection, you can buffer the data from the websocket and apply it only once per time frame. By time frame, I mean you can use functions like setInterval or requestAnimationFrame to trigger the React update. Here is some pseudo React code to illustrate this:
function App() {
  const [events, setEvents] = useState({ datas: [] });
  const bufferedEvents = useRef([]);

  useEffect(() => {
    websocket.on("connected", (newEvents) => {
      bufferedEvents.current = bufferedEvents.current.concat(newEvents);
    });
    websocket.on("data", (newEvent) => {
      bufferedEvents.current = bufferedEvents.current.concat(newEvent);
    });

    // In the setInterval callback, take all the events received at connection
    // time plus any new events and use them to update the React state,
    // cleaning bufferedEvents at the same time.
    const intervalId = setInterval(() => {
      const events = bufferedEvents.current;
      bufferedEvents.current = [];
      // Update only if new data arrived.
      if (events.length > 0) {
        setEvents((prevState) => ({ datas: prevState.datas.concat(events) }));
      }
    }, 1000); // Trigger a data update every second. You could replace this
              // with requestAnimationFrame, and adapt the refresh time as needed.

    // Do not forget to clear the interval when the component unmounts.
    return () => {
      clearInterval(intervalId);
    };
  }, []);

  return (
    <div>
      <span>Total events: {events.datas.length}</span>
      <br />
      {events.datas.map((event) => (
        // key assumes each event has a unique id
        <div key={event.id}>{event.data}</div>
      ))}
    </div>
  );
}
You can look at this article for details on using requestAnimationFrame.
I think modifying the front end is needed in any case, but on its own it is not great for performance.
2. Fetch only new data in your back end
For this approach, it really depends on how your data is structured in the database.
If the data has a timestamp in it, I can think of a naive but simple solution: a cookie with a timestamp in it.
When the user connects for the first time, this cookie is null.
When they fetch the data on the websocket connection, they receive all the data. When the data arrives, you update the cookie timestamp with the most recent date in the data.
When the websocket disconnects and reconnects, you open a new websocket with the cookie timestamp on it. With this information you can query only the data more recent than the timestamp in the cookie.
This way, you don't have to download the entirety of the data, only the fresh parts.
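A minimal server-side sketch of this idea, assuming a MongoDB collection named events with a createdAt field, and that the client sends its last-seen timestamp in the socket.io handshake auth payload (all of these names are assumptions):

io.on("connection", async (socket) => {
  // e.g. an ISO date string, or undefined on the very first connection
  const lastSeen = socket.handshake.auth.lastSeen;
  const query = lastSeen ? { createdAt: { $gt: new Date(lastSeen) } } : {};
  // Send only the events the client has not seen yet; the client then
  // updates its stored timestamp from the newest event it receives.
  const missedEvents = await db.collection("events").find(query).toArray();
  socket.emit("connected", missedEvents);
});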
Other approaches may be more helpful, but without more information on your data and more precise requirements, it is hard to say.
If you have a lot of data, I would personally look at a pagination mechanism, and maybe combine classic HTTP requests for fetching the data with websockets, SSE, or long polling for live events.
You can leave a comment if needed and I will update my answer!
Cheers
The lambda's job is to see if a query returns any results and alert subscribers via an SNS topic. If no rows are returned, all good, no action needed. This has to be done every 10 minutes.
For various reasons, I was told that we can't add any triggers on the database, and no on-prem environment is suitable to host a cron job.
Here comes lambda.
This is what I have in the handler, inside a loop for each database.
sequelize.authenticate()
  .then(() => {
    for (let j = 0; j < databases[i].rawQueries.length; j++) {
      sequelize.query(databases[i].rawQueries[j])
        .then(results => {
          if (results[0].length > 0) {
            let message = "Temporary message for testing purposes" // + query results
            publishSns("Auto Query Alert", message)
          }
        })
        .catch(err => {
          publishSns("Auto Query SQL Error", `The following query could not be executed: ${databases[i].rawQueries[j]}\n${err}`)
        })
    }
  })
  .catch(err => {
    publishSns("Auto Query DB Connection Error", `The following database could not be accessed: ${databases[i].database}\n${err}`)
  })
  .then(() => sequelize.close())
// sns publisher
function publishSns(subject, message) {
  const params = {
    Message: message,
    Subject: subject,
    TopicArn: process.env.SNStopic
  }
  return SNS.publish(params).promise()
}
I have 3 separate database configurations, and for those few SELECT queries I thought I could just loop through the connection instances inside a single lambda.
The process is asynchronous and takes 9 to 12 seconds per invocation, which I assume is far from optimal.
The whole thing feels very suboptimal, but that's my current level :)
To make things worse, I have now read that Lambda and Sequelize don't really play well together.
I am using Sequelize because that's the only way I could get 3 connections to the database working in the same invocation without issues. I tried the mssql and tedious packages and couldn't manage it with either of them.
It now feels like using an ORM is overkill for this very simple task of a SELECT query, and I would really like to at least have the connections and their queries run concurrently to save some execution time.
I am looking into different ways to accomplish this, and I went down the rabbit hole; now I have more questions than before! Generators: are they still useful? Observables with RxJS: could they apply here? Async/await or just promises? Do I even need Sequelize?
Any guidance/opinion/criticism would be very appreciated
I'm not familiar with sequelize.js, but I hope I can help. I don't know your level with RxJS and Observables, but it's worth a try.
I think you could definitely use Observables and RxJS.
I would start with an interval() that will run the code at whatever period you define.
You can then pipe the interval, since it's an Observable: do the auth bit, then do a map() to get an array of Observables, one for each .query call. I am assuming all your calls, authenticate and query, are Promises, so it's possible to transform them into Observables with from(). You can then use something like forkJoin() with that array to get a response after all calls are done.
In the .subscribe at the end, you would call publishSns().
You can pipe a catchError() too and process errors there.
The map() part might not be necessary; you could do it beforehand and store it in a variable, since it doesn't depend on the authenticate value. A minimal sketch of the whole idea is below.
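Here is a rough sketch, assuming sequelize.authenticate() and sequelize.query() return Promises, a rawQueries array of SQL strings, and the publishSns helper from the question; the 10-minute period and the names are illustrative:

const { interval, from, forkJoin, of } = require("rxjs");
const { mergeMap, catchError } = require("rxjs/operators");

interval(10 * 60 * 1000).pipe( // fire every 10 minutes
  mergeMap(() =>
    from(sequelize.authenticate()).pipe( // Promise -> Observable
      // Run all queries in parallel and wait for all of them.
      mergeMap(() => forkJoin(rawQueries.map(q => from(sequelize.query(q))))),
      catchError(err => {
        publishSns("Auto Query Error", String(err));
        return of([]); // keep the outer interval alive after an error
      })
    )
  )
).subscribe(allResults => {
  allResults.forEach((results, idx) => {
    if (results[0] && results[0].length > 0) {
      publishSns("Auto Query Alert", `Query ${idx} returned rows`);
    }
  });
});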
I'm certain my solution isn't the only one or the best, but I think it would work.
Hope it helps and let me know if it works!
I'm using Express on Node for my API service, and I'm using Sequelize for query handling.
In some use cases, like creating or updating a record, the query simply returns "1", or sometimes nothing.
In this case, I'm just using
res.sendStatus(200);
or sometimes
res.send("success");
Is there a better way, or is this the correct way to handle it? Or do I need .end() in order to end the response?
What is a good way to handle these kinds of responses where there is no data we need to send back?
This is where status 204 comes into play: https://en.wikipedia.org/wiki/List_of_HTTP_status_codes#2xx_success
It states: everything is OK (like 200), but there is simply no need to send a body.
Using express, it's just as simple as: res.sendStatus(204)
Usually a JSON body is sent back to the client with information that lets the client side know whether the operation requested through the API succeeded or failed. For example, you can define a standard JSON response like this:
// in case of success of the request
{
  requestStatus: 1,
  ...other data if you want, like the new id of the created/updated data, or other info depending upon the use case
}
or
// in case of failure of the request
{
  requestStatus: 0,
  ...other data if you want, like the reason for failure or an error message, depending upon your use case
}
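As an illustration, here is a minimal Express handler following that convention; the Record model and the route are hypothetical:

// Sketch of the requestStatus convention; `Record` stands in for
// whatever Sequelize model you are updating.
app.put('/records/:id', async (req, res) => {
  try {
    await Record.update(req.body, { where: { id: req.params.id } });
    res.json({ requestStatus: 1, id: req.params.id });
  } catch (err) {
    res.json({ requestStatus: 0, error: err.message });
  }
});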
Just add this piece of code and you should be fine:
// For all bad endpoints
app.use('*', (req, res) => {
  res.status(404).json({ msg: 'Not a good endpoint, why are you here? Go play FIFA.' })
})
If you want, you can generate an error HTML file, but since it's a back end, JSON format is strongly suggested. You can also add success: false for more clarity.
I am currently using the server-side rendering logic below (React + Node + Redux) to fetch the data on the server the first time and set it as the initial state in the store.
fetchInitialData.js
export function fetchInitialData(q, callback) {
  const url = 'http://....'
  axios.get(url)
    .then((response) => {
      callback(response.data);
    })
    .catch((error) => {
      console.log(error);
    });
}
I fetch the data asynchronously and load the output into the store via the callback the first time the page loads.
handleRender(req, res) {
  fetchInitialData(q, apiResult => {
    const data = apiResult;
    const results = { data, fetched: true, fetching: false, queryValue: q };
    const store = configureStore(results, reduxRouterMiddleware);
    ....
    const html = renderToString(component);
    res.status(200);
    res.send(html);
  })
}
I have a requirement to make 4 or 5 API calls on initial page load, so I wanted to check whether there is an easy way to make multiple calls on page load.
Do I need to chain the API calls, manually merge the responses from the different calls, and send the result back to load the initial state?
Update 1:
I am thinking of using the axios.all approach; can someone let me know if that is an ideal approach?
You want to make sure that requests happen in parallel, and not in sequence.
I have solved this previously by creating a Promise for each API call and waiting for all of them with axios.all(). The code below is basically pseudocode; I could dig into a better implementation at a later time.
handleRender(req, res) {
  fetchInitialData()
    .then(initialResponse => {
      return axios.all([
        buildFirstCallResponse(),
        buildSecondCallResponse(),
        buildThirdCallResponse()
      ]);
    })
    .then(allResponses => res.send(renderToString(component)));
}

buildFirstCallResponse() {
  return axios.get('https://jsonplaceholder.typicode.com/posts/1')
    .then(response => response.data)
    .catch(err => console.error('Something went wrong in the first call', err));
}
Notice how all responses are bundled up into an array.
The Redux documentation goes into server-side rendering a bit, but you might already have seen that: redux.js.org/docs/recipes/ServerRendering
Also check out the Promise docs to see exactly what .all() does: developer.mozilla.org/en/docs/Web/JavaScript/Reference/Global_Objects/Promise/all
Let me know if anything is unclear.
You could try express-batch; GraphQL is another option.
You could also use Redux-Saga to trigger multiple API calls with pure Redux actions and handle all of those calls using pure actions. Introduction to Sagas
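For illustration, here is a minimal saga sketch; the action types and the two endpoints are made up, and all() runs the calls in parallel much like Promise.all:

import axios from 'axios';
import { all, call, put, takeEvery } from 'redux-saga/effects';

// Runs both requests in parallel and dispatches plain actions with the results.
function* fetchInitialDataSaga() {
  try {
    const [posts, users] = yield all([
      call(axios.get, '/api/posts'), // hypothetical endpoints
      call(axios.get, '/api/users'),
    ]);
    yield put({ type: 'INITIAL_DATA_LOADED', posts: posts.data, users: users.data });
  } catch (err) {
    yield put({ type: 'INITIAL_DATA_FAILED', error: err.message });
  }
}

// Watcher: every LOAD_INITIAL_DATA action triggers the fetch above.
export function* rootSaga() {
  yield takeEvery('LOAD_INITIAL_DATA', fetchInitialDataSaga);
}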
Three years ago, I could call res.send multiple times in Express.
I could even write a setTimeout to show live output.
response.send('<script class="jsbin" src="http://code.jquery.com/jquery-1.7.1.min.js"></script>');
response.send('<html><body><input id="text_box" /><button>submit</button></body></html>');

var initJs = function() {
  $('.button').click(function() {
    $.post('/input', { input: $('#text_box').val() }, function() { alert('has send'); });
  });
};

response.send('<script>' + initJs + '</script>');
Now it will throw:
Error: Can't set headers after they are sent
I know nodejs and express have updated. Why can't do that now? Any other idea?
Found the solution, but res.write is not in the API reference: http://expressjs.com/4x/api.html
Maybe you need: response.write
response.write("foo");
response.write("bar");
//...
response.end()
res.send implicitly calls res.write followed by res.end. If you call res.send multiple times, it will work the first time. However, since the first res.send call ends the response, you cannot add anything to the response.
response.send sends an entire HTTP response to the client, including headers and content, which is why you are unable to call it multiple times. In fact, it even ends the response, so there is no need to call response.end explicitly when using response.send.
It appears to me that you are attempting to use send like a buffer: writing to it with the intention to flush later. This is not how the method works, however; you need to build up your response in code and then make a single send call.
Unfortunately, I cannot speak to why or when this change was made, but I know that it has been like this at least since Express 3.
res.write immediately sends bytes to the client
I just wanted to make this point about res.write clearer.
It does not build up the reply and wait for res.end(). It just sends right away.
This means that the first time you call it, it will send the HTTP reply headers including the status in order to have a meaningful response. So if you want to set a status or custom header, you have to do it before that first call, much like with send().
Note that write() is not what you usually want to do in a simple web application. The browser getting the reply little by little increases the complexity of things, so you will only want to do it if it is really needed.
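A small sketch of that ordering constraint; the /stream route is made up:

// Status and headers must be set before the first write(), because the
// first write() flushes them to the client.
app.get('/stream', (req, res) => {
  res.status(200).set('Content-Type', 'text/plain');
  res.write('first chunk\n'); // headers go out here
  setTimeout(() => {
    res.write('second chunk\n');
    res.end(); // closes the response
  }, 1000);
});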
Use res.locals to build the reply across middleware
This was my original use case, and res.locals fits well. I can just store data in an array there, then in the very last middleware join it all up and do a final send to deliver everything at once, something like:
async (err, req, res, next) => {
  res.locals.msg = ['Custom handler']
  next(err)
},
async (err, req, res, next) => {
  res.locals.msg.push('Custom handler 2')
  res.status(500).send(res.locals.msg.join('\n'))
}