Prevent post request timeout - node.js

I'm making a POST request to a nodejs server that contains just a file name in the body. Upon receiving the request, node uploads the corresponding local file to an Amazon S3 bucket. The files can sometimes take a while to upload, and sometimes the request times out. How can I lengthen the timeout or prevent it from happening?

You can send the response back to the browser while you still continue to work on the upload. You don't have to wait until the upload is done before finishing the response. For example, you can do:
res.end("done"); // or put whatever HTML you want in the response
And, that will end the post response and the browser will be happy. Meanwhile, your server continues to finish the upload.
The upside to this is that the browser and user go on their merry way to the next thing they want to do without waiting for the server to finish its business. The only downside I'm aware of is that if the upload fails, you don't have a simple means of informing the browser that it failed (because the response is already done).
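For example, a route along these lines responds right away and lets the S3 upload continue in the background (a minimal sketch, assuming Express and the v2 aws-sdk; the bucket name and request body field are made up):

const express = require('express');
const fs = require('fs');
const AWS = require('aws-sdk');

const app = express();
const s3 = new AWS.S3();

app.post('/upload', express.json(), (req, res) => {
    // Respond immediately so the browser never gets near its timeout.
    res.end("done");

    // Meanwhile, the server keeps working on the upload.
    s3.upload({
        Bucket: 'my-bucket',                               // made-up bucket name
        Key: req.body.filename,
        Body: fs.createReadStream(req.body.filename)
    }, (err) => {
        if (err) console.error('S3 upload failed:', err);  // can't tell the browser any more
        else console.log('S3 upload finished');
    });
});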
The timeout itself is controlled by the browser. It is not something that you can directly control from the server. If you are continually sending data to the browser, then it will likely not time out. So, as long as the upload is continuing, you could do something like this:
res.write(" ");
every 15 seconds or so as sort of a keep-alive that keeps the browser happy and keeps it from timing out. You'd probably set an interval timer for 15 seconds to do the small send of data and then when the upload finishes you would stop the timer.
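Put together, that keep-alive could look something like this (a minimal sketch, assuming an Express handler and a made-up uploadToS3() helper):

app.post('/upload', (req, res) => {
    // Dribble a space out every 15 seconds so the browser doesn't give up.
    const keepAlive = setInterval(() => res.write(" "), 15000);

    uploadToS3(req.body.filename, (err) => {   // hypothetical upload helper
        clearInterval(keepAlive);              // stop the keep-alive timer
        res.end(err ? "upload failed" : "done");
    });
});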
If you control the Javascript in the browser that makes the post, then you can set the timeout value on an Ajax call from the Javascript in the browser. When you create an XMLHttpRequest object in the browser, it has a property called timeout which you can set for asynchronous ajax calls. See MDN for more info.
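On the browser side, that might look like this (a sketch; the URL and payload are made up):

const xhr = new XMLHttpRequest();
xhr.open("POST", "/upload");
xhr.setRequestHeader("Content-Type", "application/json");
xhr.timeout = 10 * 60 * 1000;   // allow up to ten minutes (value is in milliseconds)
xhr.ontimeout = () => console.log("request timed out");
xhr.onload = () => console.log("server said:", xhr.responseText);
xhr.send(JSON.stringify({ filename: "video.mp4" }));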

Related

Chrome V3 synchronous notification before request

I have to add a header whose value is derived from the content of a file. I've solved the problem by polling the file for changes and calling declarativeNetRequest.updateDynamicRules each time the file changes. The only way to avoid this polling would be to be synchronously notified each time a request is about to happen. I've not found a way to do this with this API. Does it exist?
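For reference, the polling workaround the question describes can be sketched roughly like this (assuming the header value can be fetched over HTTP; the alarm name, rule id, header name, and URL are all made up):

chrome.alarms.create("refresh-header", { periodInMinutes: 1 });

chrome.alarms.onAlarm.addListener(async (alarm) => {
    if (alarm.name !== "refresh-header") return;
    const value = (await (await fetch("https://example.com/header-value.txt")).text()).trim();
    await chrome.declarativeNetRequest.updateDynamicRules({
        removeRuleIds: [1],
        addRules: [{
            id: 1,
            priority: 1,
            action: {
                type: "modifyHeaders",
                requestHeaders: [{ header: "X-Custom-Header", operation: "set", value }]
            },
            condition: { resourceTypes: ["main_frame", "xmlhttprequest"] }
        }]
    });
});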

show progress during processing big data

I have a question. I have a web app that needs to process a big file before it can run. This can take around 5 seconds, so in the meantime I want to show the user that the file is being processed, or better yet, send information about how much time is left. This is the first page, and I cannot call res.render twice, so how do I do this?
var fileinarray = require('./readfile');
app.get('/', function(req, res){
    var dataready = fileinarray;
    res.render('index', {data: dataready});
});
So how can I do this? I read a little about socket.io, but I don't know how to use it in my case.
Thank you for your help
If you are loading a page (which it looks like) with this request, then you can't show progress with the way you have it structured because the browser is waiting to download your page and thus can't show anything in that window until you render the page for it. But, you want to show progress before the page gets there.
So, you have to restructure the way things work. First off, in order to show progress, the browser has to have a working page that you can show progress in. That means the browser can't be waiting for the page to arrive. The most straightforward way I can think of to do this is to render a shell page initially that your server returns immediately (no waiting for the original content). In that shell page, you have code to show progress and you initiate a request to your server to fetch the long running data. There are several ways this could be done using Ajax and webSockets. I'll outline some combinations:
All with Ajax
Browser requests the / page and your server renders the shell page
Meanwhile, after rendering the page, the server starts the long running process of fetching the desired data.
Rendered inside the shell page is a Javascript variable that contains a transaction ID
Meanwhile, client-side Javascript can regularly send Ajax requests to the server to check on the progress of the given transaction id (say every 10 seconds). The server returns whatever useful progress info and client-side Javascript displays the progress info.
At some point the server is done with the data and one of the regular Ajax requests that was checking on progress returns with all the data. The client-side Javascript then inserts this data into the current page and the operation is complete.
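A minimal server-side sketch of that Ajax flow, assuming Express; the transaction store, the getBigData() helper, and the route names are made up:

const crypto = require('crypto');
const transactions = {};   // transaction id -> { progress, done, data }

app.get('/', (req, res) => {
    const id = crypto.randomUUID();
    transactions[id] = { progress: 0, done: false, data: null };

    // Kick off the long-running work; record progress as it goes.
    getBigData(
        (percent) => { transactions[id].progress = percent; },                 // progress callback
        (data) => { Object.assign(transactions[id], { done: true, data }); }   // completion callback
    );

    // Render the shell page immediately, embedding the transaction id.
    res.render('shell', { transactionId: id });
});

// The shell page polls this every 10 seconds or so.
app.get('/progress/:id', (req, res) => {
    res.json(transactions[req.params.id] || { error: 'unknown transaction' });
});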
Mostly with WebSocket
Browser requests the / page and your server renders the shell page
Client-side code inside the shell page makes a webSocket or socket.io connection to the server and sends an initial request for the data that belongs with this page.
The server receives the webSocket connection and the initial request for the data and starts the long running process of fetching the data.
As the server has meaningful progress information, it sends that to the client over the webSocket/socket.io connection and when the client receives that information, it renders appropriate progress in the page.
At some point the server is done fetching the data and sends the client a message containing the final data. The client-side Javascript receives this data and inserts it into the page.
The client can then close the webSocket/socket.io connection.
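A minimal sketch of the socket.io variant (server and client); getBigData(), the event names, and the UI helpers are made up:

// Server
const io = require('socket.io')(server);
io.on('connection', (socket) => {
    socket.on('fetch-data', () => {
        getBigData(
            (percent) => socket.emit('progress', { percent }),   // progress updates
            (data) => socket.emit('done', { data })              // final payload
        );
    });
});

// Client (inside the shell page)
const socket = io();
socket.emit('fetch-data');
socket.on('progress', (msg) => showProgress(msg.percent));   // hypothetical UI helpers
socket.on('done', (msg) => insertData(msg.data));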

Play a sound when download is ready

The user selects some options then clicks "download". At that point, my php script starts preparing the file, and it can take 5-10 minutes before the file is ready and starts downloading. I want to notify the user with a sound that the download has started.
How can I do that?
According to this question:
Is there a way to detect the start of a download in JavaScript?
There is no programmatic way to detect when a download begins. That question is six years old now, so perhaps it is out of date, but I could not find any more recent information to contradict it.
An alternative approach would be to break the download process into two parts so that you can control when the actual data transfer begins:
Instead of initiating the download immediately, have the button send an AJAX request to the server asking it to prepare the file for download.
The server should not reply to the AJAX immediately, but should prepare the file and save it in a temporary file storage area with a unique generated name/ID.
Once the file is ready, the server should reply to the AJAX with the name/ID of the file.
On the client, the AJAX completion callback can play the sound, since it knows the download is about to begin.
It then uses window.open() to request the file from the server.
Now the server can respond with the appropriate headers as you used to do.
Finally, the server can delete the file from temporary storage (or just wait for a cron job to do it).
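A minimal client-side sketch of that two-step approach; the /prepare and /download endpoints, collectOptions(), and playDownloadSound() are made up:

document.getElementById('download-btn').addEventListener('click', async () => {
    // Step 1: ask the server to prepare the file; it replies only when the file is ready.
    const resp = await fetch('/prepare', { method: 'POST', body: collectOptions() });
    const { fileId } = await resp.json();

    playDownloadSound();                 // the actual download is about to begin

    // Step 2: request the prepared file; the server sends the usual download headers.
    window.open('/download/' + fileId);
});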

is it possible to load offline version of site, while sending request to server

I have a single-page app written in node.js which has a fair amount of Javascript and CSS.
Now, is it possible to load an offline version of the webpage as soon as the URL is entered, and at the same time send the request to the server and wait for the response while the offline version shows a nice splash screen?
In other words, instead of waiting for the response from the server and then rendering the application, I would prefer that the browser render the app while the request is being sent.
This way the page loads instantly (with the splash page), and while the requests are being sent and the responses returned, the Javascript and CSS are being loaded, which saves some time.
Is this possible with modern technologies? And is it even a good idea?
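For context, the app-shell pattern with a service worker is one modern way to get this behavior; a minimal sketch, with a made-up cache name and file list:

// sw.js — cache the shell at install time, serve it instantly afterwards
const SHELL_FILES = ['/', '/app.js', '/app.css', '/splash.html'];   // made-up paths

self.addEventListener('install', (event) => {
    event.waitUntil(caches.open('shell-v1').then((cache) => cache.addAll(SHELL_FILES)));
});

self.addEventListener('fetch', (event) => {
    // Serve the cached shell (splash screen included) immediately;
    // everything else still goes to the network as usual.
    event.respondWith(
        caches.match(event.request).then((cached) => cached || fetch(event.request))
    );
});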

Node/Express - Is it bad to not respond to a particular request?

I have a route that takes about a minute to run before a response is sent back (this is on purpose). The problem I ran into was that when a user logged out after the request was made (via Ajax) but before a response was returned, the user's session would be re-initialized and stored back into Redis when the response was finally sent. This was leaving my Redis instance with a lot of extra entries that weren't useful anymore.
My solution was to listen for the req.close event, and when that happened, set a variable that prevented any response from being sent back at all (basically don't call next or res.end). This fixed my problem, but I'm wondering if it is bad to have an unresolved request. I don't actually care what the response is when the request is cancelled, since the user has navigated away from the page that made the request.
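For illustration, the approach described (listening for the close event and suppressing the response) might look roughly like this; a sketch assuming an Express route, with a made-up doSlowWork() helper:

app.get('/slow', (req, res) => {
    let clientGone = false;
    req.on('close', () => { clientGone = true; });   // user navigated away or aborted

    doSlowWork((err, result) => {                    // hypothetical minute-long job
        if (clientGone) return;                      // send nothing; don't touch the session
        if (err) return res.status(500).end();
        res.json(result);
    });
});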
