Node.js manual progress indicator for POST request

I'm trying to get the progress of a normal POST request that takes at least 5 to 10 seconds. This request validates data and uploads files to a third-party API.
The issue is, I want to display some kind of progress feedback in my React app for it.
I've been using axios' onUploadProgress and onDownloadProgress and checking what happens, but nothing useful is reported.
onUploadProgress: (progress) => {
  console.log("onUpload", progress)
},
onDownloadProgress: (progress) => {
  console.error("onDownload ", progress)
}
How can I send back a manual progress indicator for every step I have in my Node controller?
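One approach (not from the original post) is to stream newline-delimited JSON progress chunks from the controller with res.write and let the client parse them as they arrive. A minimal sketch, assuming an Express route; validateData and uploadToThirdParty are hypothetical stand-ins for the real steps:

app.post('/upload', async (req, res) => {
  // Send the headers now so the client can start reading progress chunks.
  // (If you use compression middleware, exclude this route or chunks may be buffered.)
  res.set('Content-Type', 'application/x-ndjson');
  res.flushHeaders();

  const report = (step, percent) =>
    res.write(JSON.stringify({ step, percent }) + '\n');

  report('validating', 10);
  await validateData(req.body);       // hypothetical step

  report('uploading', 50);
  await uploadToThirdParty(req.body); // hypothetical step

  report('done', 100);
  res.end();
});

On the client, onDownloadProgress fires as each chunk arrives; depending on the axios version the partial body can be read from the underlying XHR object (for example progressEvent.target.responseText), and Server-Sent Events are an equally valid alternative.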

Related

Nodejs how to separate multiple "multipart/form-data" POST requests

In Nodejs I have developed a small Client application that sends multiple “multipart/form-data” to my Server application using POST requests.
Each form to be sent is composed of a file (loaded from the Client hard-disk) and a string. Basically I have the following situation:
Form 1: (File 1, String 1)
Form 2: (File 2, String 2)
Form 3: (File 3, String 3)
Etc..
To make the POST requests I’m using the “form-data” library (https://www.npmjs.com/package/form-data).
The problem that I’m facing is that all the POST requests are sent after the end of the execution of my Client application, but I would like to be able to send each POST request separately.
Here is a part of the code that I’m using :
function FormSubmit(item) {
  var FileStream = fs.createReadStream(item.path);

  // Create an "Upload" Form and set all form parameters.
  let form = new FormData();
  form.append('Text1', 'test');
  form.append('file', FileStream);

  // Form Submit.
  form.submit('http://localhost:5000/upload', function (err, res) {
    if (err) {
      console.log(err);
    }
    if (res != undefined)
      res.resume();
    else
      console.log('Res undefined: ', res);
  });
}
I'm calling the "FormSubmit" function multiple times, and I was expecting the Server application to receive a POST request each time "form.submit" executes, but in reality I receive all the POST requests together after the entire application execution finishes.
In particular, the Server receives the requests on the command "self.emit('connect');" inside the "afterConnect" function in the "net.js" file in the core module.
It seems that it has nothing to do with timing, because even if I put a breakpoint and wait for some minutes after the first execution of the "FormSubmit" function, I don't receive anything on the Server application.
It is probably not something related to the "form-data" library, because I get the same behaviour using "request", etc.
I guess it is something related to Node.js itself or to how I wrote the Client application.
I am new to Node.js, so any help/advice would be appreciated.
Thanks.
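For what it's worth, one way to see each request go out on its own is to wrap the callback-style submit in a Promise and await it, so the next form is only sent after the previous one finishes. A rough sketch against the form-data API shown above; formSubmit and sendAll are illustrative names:

const fs = require('fs');
const FormData = require('form-data');

// Wrap the callback-style submit in a Promise so callers can await it.
function formSubmit(item) {
  return new Promise((resolve, reject) => {
    const form = new FormData();
    form.append('Text1', 'test');
    form.append('file', fs.createReadStream(item.path));
    form.submit('http://localhost:5000/upload', (err, res) => {
      if (err) return reject(err);
      res.resume();           // drain the response so the socket is released
      res.on('end', resolve);
    });
  });
}

// Send the forms one at a time instead of queueing them all at once.
async function sendAll(items) {
  for (const item of items) {
    await formSubmit(item);
    console.log('Sent', item.path);
  }
}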

How to return real-time JSON without HTML code

I am trying to make an API that will send back real-time JSON. I am using Node.js with Express.js and Socket.io, and the problem is that res.send cannot be called more than once, and I really don't know how to send my (real-time) data without requiring a refresh of my page.
Basically, I made a timer that changes the value every second.
I also tried to send a file, but I can't use that method, because my iOS app is asking for JSON data without HTML code.
setInterval(function () {
  var msg = Math.random().toString();
  io.emit('message', msg);
  console.log(msg);
  res.send(msg);
}, 1000);
Maybe there is another framework than Express that I could use and that could refresh my data automatically? The console.log line works well and my data is updated every 1000ms.
Thank you in advance
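One possible direction (an assumption, not from the original question) is to keep the response open and push each update as a Server-Sent Event, which is plain JSON with no HTML and needs no page refresh. A minimal Express sketch; the /stream route name is made up:

// Streaming endpoint that pushes a JSON payload once per second as SSE.
app.get('/stream', (req, res) => {
  res.set({
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive',
  });

  const timer = setInterval(() => {
    const msg = Math.random().toString();
    // Each SSE message is "data: <payload>\n\n"
    res.write(`data: ${JSON.stringify({ value: msg })}\n\n`);
  }, 1000);

  // Stop the timer when the client disconnects.
  req.on('close', () => clearInterval(timer));
});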

expressjs sending err_empty_response

I have a problem with my Express.js framework: when I send a response with a delay of 200 seconds, it returns err_empty_response with status code 324.
Here is a fakeTimeout example:
fakeTimeout(req, res) {
  setTimeout(() => { res.json({ success: true }) }, 200000)
}
ERR_EMPTY_RESPONSE is a Google Chrome error code.
Chrome will automatically time out requests when they exceed 300 seconds, and unfortunately there is no way to change that setting.
One workaround could be to change the Keep-Alive headers.
However, if a task takes longer than a minute, you really shouldn't make the user wait for it; respond right away and give feedback on the UI later, when the task is completed.
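As an illustration of that last suggestion, here is a rough sketch of the "respond immediately, poll for the result" pattern; the in-memory jobs map and route names are hypothetical:

const jobs = new Map();

app.post('/long-task', (req, res) => {
  const id = Date.now().toString();
  jobs.set(id, { done: false });

  // Stand-in for the real 200-second job.
  setTimeout(() => jobs.set(id, { done: true, result: { success: true } }), 200000);

  // Respond right away so the browser never times out.
  res.status(202).json({ jobId: id });
});

app.get('/long-task/:id', (req, res) => {
  const job = jobs.get(req.params.id);
  if (!job) return res.status(404).end();
  res.json(job);
});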

React app with Server-side rendering crashes with load

I'm using react-boilerplate (with react-router, sagas, express.js) for my React app, and on top of it I've added SSR logic so that once it receives an HTTP request it renders React components to a string based on the URL and sends the HTML string back to the client.
While React rendering is happening on the server side, it also makes fetch requests through sagas to some APIs (up to 5 endpoints, based on the URL) to get data for the components before it actually renders them to a string.
Everything works great if I make only a few requests to the Node server at the same time, but once I simulate a load of 100+ concurrent requests, at some point it crashes with no indication of any exception.
What I've noticed while trying to debug the app is that once the Node server starts processing 100+ incoming requests, it sends requests to the APIs at the same time but receives no actual responses until it stops stacking those requests.
The code that's used for rendering on the server side:
async function renderHtmlDocument({ store, renderProps, sagasDone, assets, webpackDllNames }) {
  // 1st render phase - triggers the sagas
  renderAppToString(store, renderProps);
  // send signal to sagas that we're done
  store.dispatch(END);
  // wait for all tasks to finish
  await sagasDone();
  // capture the state after the first render
  const state = store.getState().toJS();
  // prepare style sheet to collect generated css
  const styleSheet = new ServerStyleSheet();
  // 2nd render phase - the sagas triggered in the first phase are resolved by now
  const appMarkup = renderAppToString(store, renderProps, styleSheet);
  // capture the generated css
  const css = styleSheet.getStyleElement();
  const doc = renderToStaticMarkup(
    <HtmlDocument
      appMarkup={appMarkup}
      lang={state.language.locale}
      state={state}
      head={Helmet.rewind()}
      assets={assets}
      css={css}
      webpackDllNames={webpackDllNames}
    />
  );
  return `<!DOCTYPE html>\n${doc}`;
}
// The code that's executed by express.js for each request
function renderAppToStringAtLocation(url, { webpackDllNames = [], assets, lang }, callback) {
  const memHistory = createMemoryHistory(url);
  const store = createStore({}, memHistory);
  syncHistoryWithStore(memHistory, store);
  const routes = createRoutes(store);
  const sagasDone = monitorSagas(store);
  store.dispatch(changeLocale(lang));
  match({ routes, location: url }, (error, redirectLocation, renderProps) => {
    if (error) {
      callback({ error });
    } else if (renderProps) {
      renderHtmlDocument({ store, renderProps, sagasDone, assets, webpackDllNames })
        .then((html) => {
          callback({ html });
        })
        .catch((e) => callback({ error: e }));
    } else {
      callback({ error: new Error('Unknown error') });
    }
  });
}
So my assumption is that something goes wrong once it receives too many HTTP requests, which in turn generate even more requests to the API endpoints while rendering the React components.
I've noticed that it blocks the event loop for 300ms after renderAppToString() for every client request, so once there are 100 concurrent requests it is blocked for about 10 seconds. I'm not sure whether that's normal or bad.
Is it worth trying to limit simultaneous requests to the Node server?
I couldn't find much information on the topic of SSR + Node crashes, so I'd appreciate any suggestions as to where to look to identify the problem, or possible solutions if anyone has experienced a similar issue in the past.
In the above image, I am doing ReactDOM.hydrate(...). I can also load my initial and required state and send it down in hydrate.
I have written the middleware file, and I use it to decide, based on the URL, which file to send in response.
Above is my middleware file. I create the HTML string of whichever file was requested based on the URL, then I add this HTML string and return it using Express's res.render.
The above image is where I compare the requested URL path with the dictionary of path-file associations. Once a match is found, I use ReactDOMServer's renderToString to convert it into HTML. This HTML can then be sent with the handlebars file using res.render, as discussed above.
This way I have managed to do SSR on most of my web apps built using the MERN.io stack.
Hope my answer helped you, and please write a comment for discussion.
1. Run express in a cluster
A single instance of Node.js runs in a single thread. To take advantage of multi-core systems, the user will sometimes want to launch a cluster of Node.js processes to handle the load.
As Node is single threaded, the problem may also be in a file lower down the stack where you are initialising Express.
There are a number of best practices for running a Node app that are not generally mentioned in React threads.
A simple solution to improve performance on a server with multiple cores is to use the built-in Node cluster module (a minimal sketch is included at the end of this point):
https://nodejs.org/api/cluster.html
This will start an instance of your app on each core of your server, giving you a significant performance improvement for concurrent requests (if you have a multi-core server).
See the Express performance best practices for more information:
https://expressjs.com/en/advanced/best-practice-performance.html
You may also want to throttle your incoming connections, because once the thread starts context switching, response times drop rapidly; this can be done by putting something like NGINX / HAProxy in front of your application.
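As a reference for the cluster suggestion above, here is a minimal sketch of the built-in module; the Express setup is reduced to a placeholder:

const cluster = require('cluster');
const os = require('os');

if (cluster.isMaster) {
  // Fork one worker per CPU core.
  for (let i = 0; i < os.cpus().length; i++) cluster.fork();

  // Restart a worker if it dies.
  cluster.on('exit', (worker) => {
    console.log(`worker ${worker.process.pid} died, restarting`);
    cluster.fork();
  });
} else {
  const express = require('express');
  const app = express();
  // ...mount the SSR middleware here...
  app.listen(3000, () => console.log(`worker ${process.pid} listening`));
}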
2. Wait for the store to be hydrated before calling render to string
You don't want to render your layout until your store has finished updating; as other comments note, rendering blocks the thread.
Below is an example taken from the saga repo which shows how to run the sagas without rendering the template until they have all resolved:
store.runSaga(rootSaga).done.then(() => {
  console.log('sagas complete')
  res.status(200).send(
    layout(
      renderToString(rootComp),
      JSON.stringify(store.getState())
    )
  )
}).catch((e) => {
  console.log(e.message)
  res.status(500).send(e.message)
})
https://github.com/redux-saga/redux-saga/blob/master/examples/real-world/server.js
3. Make sure node environment is set correctly
Also ensure you are setting NODE_ENV=production when bundling and running your code, as both Express and React optimise for this.
The calls to renderToString() are synchronous, so they block the thread while they are running. It's no surprise, then, that with 100+ concurrent requests you end up with an extremely backed-up queue hanging for ~10 seconds.
Edit: It was pointed out that React v16 natively supports streaming, but you need to use the renderToNodeStream() method to stream the HTML to the client. It should produce the exact same markup as renderToString(), but streams it instead, so you don't have to wait for the full HTML to be rendered before you start sending data to the client.
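For illustration, a bare-bones sketch of what the renderToNodeStream() approach can look like; <App /> and the surrounding HTML shell are placeholders, and a real setup would still inject state and styles:

import { renderToNodeStream } from 'react-dom/server';

app.get('*', (req, res) => {
  // Send the head of the document immediately.
  res.write('<!DOCTYPE html><html><body><div id="root">');

  // Stream the React markup as it is rendered instead of buffering it all.
  const stream = renderToNodeStream(<App />);
  stream.pipe(res, { end: false });
  stream.on('end', () => {
    res.write('</div></body></html>');
    res.end();
  });
});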

NodeJs Execute function multiple times without delay

I will share the code directly:
app.get('/ListBooks', function (req, res) {
  console.log("Function called");
  // internally calls another URL and sends its response to browser
  request({
    url: 'someURLinRESTServer',
    method: 'POST',
    json: MyJsonData
  }, function (error, response, body) {
    if (error) {
      console.log("/Call Failed ->" + error);
      res.status(200).send('Failed');
    } else {
      console.log("/Call got Response");
      console.log(response.statusCode, body);
      res.send(body);
      res.end();
    }
  })
})
Now when the browser generates a request to http://localhost/ListBooks, my Node console shows the first message "Function called" and waits for the internal REST URL response.
The real problem occurs only when the REST SERVER is down.
If I then try to call http://localhost/ListBooks from another browser tab, the Node server console doesn't show any changes, and only after the response of the previous REST call does it display the console message of the second call to app.get('/ListBooks').
I thought Node.js makes functions async, but here I don't want functions to wait like this for multiple instance calls.
Or is it just a delay in printing the message, and each function call executes separately? Please clarify.
If this is only occurring when the REST server is down (as your comment indicates), then that's just a function of how long your calls to request() take to fail. And, each separate call to request() goes through its own cycle of trying to connect and then eventually timing out. If both are timing out, then you will issue request1, then request2, then some timeout amount of time will pass and request1 will fail and then request2 will fail shortly after it. This has nothing to do with how express handles multiple requests and everything to do with how the calls to your REST server behave.
You can set the timeout option for request() if you want to shorten how long it will wait for a response, but you do need to make sure you don't shorten it so much that a busy REST server that just takes a little while to actually respond gets timed out.
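For example, here is a rough sketch of the same request() call inside the app.get('/ListBooks') handler with the timeout option set; the 5-second value and the error handling are illustrative:

request({
  url: 'someURLinRESTServer',
  method: 'POST',
  json: MyJsonData,
  timeout: 5000 // ms; tune so a slow-but-healthy REST server isn't cut off
}, function (error, response, body) {
  if (error && (error.code === 'ETIMEDOUT' || error.code === 'ESOCKETTIMEDOUT')) {
    res.status(504).send('REST server timed out');
  } else if (error) {
    res.status(502).send('Failed');
  } else {
    res.send(body);
  }
});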
or is it just a delay in printing message and each function call executes separately
Each call is acting completely separately. There is no serialization of these responses by node.js or by Express. The appearance of serialization is just because they both take the same amount of time to fail with a timeout so they will fail one after the other.
