Is it possible to detect an immediate rejection when sending a chunked POST request with Axios? - node.js

I am using Axios in a web client to POST a file to an Express backend as an upload. Since both the file's size and the client user's bandwidth are variable, the POST request can take a while to finish. On the backend some logic applies and the request is promptly rejected.
The problem is that the client receives the response only after the request is finished, which can be several seconds.
I have already verified that it is not the backend's fault, as the behavior is the same when POSTing to any arbitrary POST-enabled URL on the web, regardless of the technology. Here is an (over)simplified example of the case.
Here's the post action. Notice the commented-out request to an arbitrary POST-enabled URL; it behaves exactly the same:
try {
  console.log("posting....")
  const res = await axios.post("http://localhost:4000/upload", formData)
  // const res = await axios.post("https://github.com/logout", formData)
  console.log("result:")
  console.log(res)
} catch (err) {
  console.error(err)
}
And the demo express backend route:
app.post("/upload", (req, res) => {
console.log("Rejecting...")
res.status(403).send()
console.log("Rejected.")
return
})
For testing purposes I chose a 3.7 MB file and throttled my browser's bandwidth down to the Fast 3G preset.
The backend immediately outputs:
Rejecting...
Rejected.
Whereas the request stays pending for about 43 seconds before returning the 403 error.
Am I missing something obvious here? This is such common functionality that it makes me doubt this is the correct way to handle it. And if it really is, do we have any information on whether Express's thread is active during that time, or is it just a client-side inconvenience?
Thanks in advance!

I believe you could just use res.status(403) rather than res.status(403).send().
You could also try using res.status(403).end(), and I am not sure why you need a return statement in the route handler.

It seems that first sending the response headers and then manually destroying the request does the trick:
app.post("/upload", (req, res) => {
console.log("Rejecting...")
res.status(403).send("Some message")
return req.destroy()
})
The Axios request stays pending only until the current chunk is uploaded, and then immediately resolves with the correct status and message. In the throttled Fast 3G example, pending time went down from 43s to 900ms.
Also, this solution emerged through trial and error, so it may well not be best practice.
I would still be interested in an AXIOS oriented solution, if one exists.

Related

Can't create a listener to the database while also being able to update it. (using References.on and References.put)

I'm trying to display a list of comments on my react page.
For this I have set up a Node.js server which loads the data from Firebase and passes it on to React. I am able to get it to load the comments list and display them, but when I try to add a comment, the server crashes with the following error:
@firebase/database: FIREBASE WARNING: Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client
This is because I am using:
firebase.database().ref('my-path').on("value", ...)
However, if I use firebase.database().ref('my-path').once("value", ...) then I lose the ability to update the comments as soon as a new comment is posted. Is there a way to be able to have a listener attached to the database and still be able to update the contents of that database?
Here is my NodeJS code:
app.get("/comments/:id", (req, res) => {
const itemsRef = firebase.database().ref(`comments/${req.params.id}`);
itemsRef.on('value', (snapshot) => {
let comments = snapshot.val();
return res.status(200).json(comments);
})
})
app.post("/comments/:id", (req, res) => {
const itemsRef = firebase.database().ref(`comments/${req.params.id}`);
itemsRef.push(req.body);
})
The error occurs after the post request is called.
You're sending a response back to the client with:
res.status(200).json(comments)
This sets the status code and at least one header (the response content type) and then sends the response. The next time you get an update from the database, this code runs once more and tries to send the status and headers again. But in HTTP, all headers must come before the body of the response, so the second time this code runs it throws an error.
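If each HTTP request only needs a single response, a minimal fix (my sketch, using the once() call the question already mentions, at the cost of the server-side realtime updates) is:
app.get("/comments/:id", (req, res) => {
  const itemsRef = firebase.database().ref(`comments/${req.params.id}`);
  // once() invokes the callback a single time, so only one response is sent per request
  itemsRef.once('value', (snapshot) => {
    res.status(200).json(snapshot.val());
  });
})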
If you want to keep sending more data to the client, you'll need to use more primitive methods on the response object so that you don't resend headers or send otherwise illegal data. While possible, it's more complex than you may think, as the client also needs to handle this response stream, which most clients won't.
I'd highly recommend looking at Doug's alternative, which is to just use the Firebase Realtime Database from the client directly. That way you can use the client SDK that it has, which handles this (and many more complexities) behind the scenes.
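For completeness, a rough sketch of that client-side approach (assuming the v8-style Firebase Web SDK and the same comments/<id> path; setComments is a hypothetical React state setter):
const commentsRef = firebase.database().ref(`comments/${id}`);
commentsRef.on('value', (snapshot) => {
  // Re-render whenever a comment is added, without going through the Express server
  setComments(snapshot.val() || {});
});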

Microsoft Graph API calendarView delta endpoint does not respond on initial request

When sending the first request to the calendarView API, the request does not return or time out. This only happens to some of the requests, and seems to happen only on the first request (perhaps because the first request has a larger response size).
An example request:
GET /me/calendarView/delta?startDateTime=2019-06-27T22:00:00.000Z&endDateTime=2019-09-08T13:17:30.659Z
The current solution I found was reducing the odata.maxpagesize to a very small number (currently 2 is the highest number which works for all calendars I have tested).
The requests are sent using the Node.js client "@microsoft/microsoft-graph-client": "1.7.0".
// Initialize client with credentials
const client = graph.Client.init({
  authProvider: done => {
    done(null, credentials.access_token);
  }
});
const url = "/me/calendarView/delta?startDateTime=2019-06-27T22:00:00.000Z&endDateTime=2019-09-08T13:17:30.659Z";

console.log("Request start")
const result = await client
  .api(url)
  .header("prefer", "odata.maxpagesize=10")
  .get();
console.log("Got result", result);
Here the last console.log is never called.
The expected result is that the request returns, at least with an error code. I also expect the API to be able to handle a lot more items than 2 per page.
The current solution of setting a small maxpagesize works for now; however, I expect there is another root-cause issue.
Any idea what is wrong, and how this can be resolved?
After a lot of debugging I traced the issue to the Node library. When asking for a raw response from the API, I got back the result regardless of page size.
The solution was to manually parse the response myself, after asking for the raw response from the library. This was based on the implementation in the library at https://github.com/microsoftgraph/msgraph-sdk-javascript/blob/dev/src/GraphResponseHandler.ts#L98
I could not find the root cause issue in the library, and ended up just parsing it on my end. I also analysed the raw response from the API, but the content-type header was correctly application/json, and the response status was 200 OK.
// Ask the SDK for the raw Response object and parse the JSON body manually
// (ResponseType is exported by @microsoft/microsoft-graph-client)
const rawResponse = await client.api(url).responseType(ResponseType.RAW).get()
const parsedResponse = await rawResponse.json()

How to add express middleware at the end of the chain that gets invoked no matter what (OK/FAIL responses)?

Is there a way to add middleware to the end of an Express app or router chain that gets called so that you can track whether or not a response was sent?
I mean, regardless of whether:
A response is sent (string, JSON, etc.)
A static served file.
No file found in the static folder.
A catch-all callback was reached.
An error middleware was reached.
Example
For instance, if I wanted to log everything...
whether a response was successful (i.e. it served a file via an express.static( ... ) middleware, some data fetched from a DB, or a custom middleware) or it failed / threw an error...,
is there a way to invoke a callback at the very end?
So far from what I can understand, it seems like, by design, if a static file gets served successfully (via express.static), it doesn't call next(), so the chain stops there.
And for any custom-made middlewares using res.send(), you normally wouldn't want to call next() afterwards since it could cause some undesirable side-effects (errors with headers getting resent).
For error-handlers, that's easier since all unsuccessful responses can be caught here.
But how can it output both successful / unsuccessful responses? Could this be something that should be done without middlewares?
The solution I went with ended up being slightly different from this one by @idbehold. In a nutshell, at the very top of the Express app's middleware chain, I hook a callback to the res response object's finish event, which gets triggered for most (all?) of the HTTP status codes I needed in order to track a successfully served request.
// Assumes the 'colors' string extensions (.green / .red) and custom req.fullUrl() / trace() helpers
app.use( ( req, res, next ) => {
  res.on( 'finish', () => {
    var codeStr = String( res.statusCode );
    codeStr = codeStr[res.statusCode < 400 ? 'green' : 'red'];
    var output = [req.method.green, req.fullUrl().green, codeStr];
    trace( output.join( ' ' ) );
  } );
  next();
});
I can now get a colorized log line for every request.
EDIT
Alright! So, provided you also have an error handler at the "end" of your middleware chain that sends something with an error code such as 404, that will also trigger the finish event on the res object.
Example of such an error-handler:
app.use( ( err, req, res, next ) => {
  trace( "Error!".red );
  trace( err );
  res.status( 404 ).send(); // Triggers 'finish' on res.
})
There's a conceptual difficulty with the asynchronous architecture of node.js and Express for doing this. I'll describe the general problem and then discuss a few possible work-arounds.
First, each Express handler can be asynchronous. Thus, it gets called and returns pretty much immediately and nobody outside of that world knows whether it is still waiting for some asynchronous operation to finish before eventually sending its response or if it just failed to do anything. You literally can't tell from the outside world.
Second, you can monitor a given request to see if it either calls an error handler or sends a response. There is no way to monitor a request handler to see if it just failed to send anything, because of the reason above - you have no way of knowing whether it's still waiting for some asynchronous thing to finish.
So, here's the best I could recommend:
1. Hook res.end() to see when it gets called. This is an indication that the response is now done (whether error or success). You can see an example of doing that in the express-afterware module that Medet linked in a comment above. The general idea is that you'd have your own middleware somewhere very early in the chain that overrides res.end() so you can see when it's called. That early middleware would just install the override and call next() to continue the handler chain. Then, when the response is finished, your override would see that res.end() got called. This should work for all cases where the response is sent.
2. You still need to handle cases where no response is sent (which is probably due to faulty code, since every request should get a response eventually). The only way I know of to do that is to implement some sort of timeout for the request. You can either use the built-in mechanism server.setTimeout() or implement your own inside your middleware (the same middleware as described in step 1). Then, after the timeout you specify, if no response has been sent yet, you would take over and send some error response.
3. Install your own error middlewares early in the chain that will see and log all errors. Note that res.end() will still be called, so the behavior in step 1 will still be triggered even for errors (error responses still call res.end()).
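A rough sketch of steps 1 and 2 above (my own code, not an existing module; the 30 second limit is an arbitrary example):
app.use((req, res, next) => {
  let timer;
  const originalEnd = res.end;

  // Step 1: hook res.end() so every finished response (success or error) is seen
  res.end = function (...args) {
    clearTimeout(timer);
    console.log(`${req.method} ${req.originalUrl} -> ${res.statusCode}`);
    return originalEnd.apply(this, args);
  };

  // Step 2: safety net for handlers that never send anything
  timer = setTimeout(() => {
    if (!res.headersSent) {
      res.status(503).send('No response from handler');
    }
  }, 30000);

  next();
});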
You can trigger a piece of code at the end of a request by using the finish event of the response object. The finish event is emitted when the response has been sent to the client and all the data has been flushed to the network.
app.use(function(req, res, next) {
  res.on('finish', function() {
    console.log('Request finished');
  });
  next();
});

How to access all response times on every route in Express 4+?

I'm struggling to pass response-time data at the application level to every route in my Express app. I've Googled and Googled, but everything comes back suggesting one throw a new Date() into API calls, which is gross, or the Express response-time package (including many results from here on SO). While I'm sure it's great, I don't understand what purpose it serves other than adding a header, seeing as I can get response times in my browser's dev tools.
What I want to do is access all response time data from the server into the view, on every route.
Express already provides some request data, but not all of it. I just can't seem to access all responses. My terminal looks like this when I load up a basic page, even though I have 6 images on the page besides the one CSS file.
GET / 200 64.439 ms - 1663
GET /styles/index.css 200 36.582 ms - 2035
If I use the response-time package, I can't seem to access the 'X-Response-time' header. req.headers only seems to return a subset of all headers, similar to the Express traffic output mentioned above.
Maybe I'm just dense; the docs of the response-time package explain how to configure it with Express, but I still don't understand what it's supposed to add or how I would access it outside of my console.
Create a new middleware that records the response time of a request and makes this available to your own function fn. The fn argument will be invoked as fn(req, res, time), where time is a number in milliseconds.
Couldn't you just do this:
var responseTime = require('response-time')

app.use(responseTime(function(req, res, time) {
  res.header('X-Response-Time', time);
}));
Now every route below this will have a response time header on it.
The response-time package will add the X-Response-Time header at the same time as you send the response:
app.use(responseTime());

app.get('/', function(req, res) {
  console.log(res.get('X-Response-Time')); // undefined
  res.send('Hello');
  console.log(res.get('X-Response-Time')); // 3.720ms
});
It does this by listening for when the headers are about to be written (i.e. when a response is about to be sent) and appending the response time just before that happens (using the on-headers package on npm).
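To illustrate that idea, here is a rough equivalent using on-headers directly (my sketch, not the actual response-time source):
const onHeaders = require('on-headers');

app.use((req, res, next) => {
  const start = process.hrtime();
  onHeaders(res, () => {
    // Runs just before the headers are written out
    const [s, ns] = process.hrtime(start);
    res.setHeader('X-Response-Time', (s * 1e3 + ns / 1e6).toFixed(3) + 'ms');
  });
  next();
});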
It sounds like what you want to do is record the response time, then set it into res.locals so you can render it in your view.
I've played around for a while trying to use the on-headers package to set the response time inside res.locals, but I think it must be too late by then to write to the response... Maybe this is the right path? Sorry I don't have more time right now to play around :(
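If the goal is just to show a duration in the rendered view, one option (my own sketch, measuring up to the moment the view is rendered rather than the full response time) is to record the start time yourself and expose it through res.locals:
app.use((req, res, next) => {
  const start = process.hrtime.bigint();
  // Hypothetical helper the template can call, e.g. <%= elapsedMs().toFixed(1) %> ms
  res.locals.elapsedMs = () => Number(process.hrtime.bigint() - start) / 1e6;
  next();
});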

Express request is called twice

To learn node.js I'm creating a small app that gets some RSS feeds stored in MongoDB, processes them and creates a single feed (ordered by date) from them.
It parses a list of ~50 RSS feeds with ~1000 blog items, so parsing the whole thing takes quite a while. I therefore added req.connection.setTimeout(60*1000); to get a long enough timeout to fetch and parse all the feeds.
Everything runs quite fine, but the request is called twice. (I checked with Wireshark; I don't think it's about the favicon here.)
I really don't get it.
You can test yourself here : http://mighty-springs-9162.herokuapp.com/feed/mde/20 (it should create a rss feed with the last 20 articles about "mde").
The code is here: https://github.com/xseignard/rss-unify
And if we focus on the interesting bits:
I have a route defined like this: app.get('/feed/:name/:size?', topics.getFeed);
And topics.getFeed looks like this:
function getFeed(req, res) {
  // 1 minute timeout to get enough time for the request to be processed
  req.connection.setTimeout(60 * 1000);
  var name = req.params.name;
  var callback = function(err, topic) {
    // if the topic has been found
    if (topic) {
      // aggregate the corresponding feeds
      rssAggregator.aggregate(topic, function(err, rssFeed) {
        if (err) {
          res.status(500).send({error: 'Error while creating feed'});
        }
        else {
          res.send(rssFeed);
        }
      },
      req);
    }
    else {
      res.status(404).send({error: 'Topic not found'});
    }
  };
  // look for the topic in the db
  findTopicByName(name, callback);
}
So nothing fancy, but still, this getFeed function is called twice.
What's wrong there? Any idea?
This annoyed me for a long time. It's most likely the Firebug extension which is sending a duplicate of each GET request in the background. Try turning off Firebug to make sure that's not the issue.
I faced the same issue while using the Google Cloud Functions Framework (which uses Express to handle requests) on my local machine. Each fetch request (in the browser console and within the web page) resulted in two requests to the server. The issue was related to CORS (because I was using different ports): Chrome made an OPTIONS method call before the actual call. Since the OPTIONS method was not necessary in my code, I used an if-statement to return an empty response.
if (req.method == "OPTIONS") {
  res.set('Access-Control-Allow-Origin', '*');
  res.set('Access-Control-Allow-Headers', 'Content-Type');
  res.status(204).send('');
  return; // don't fall through to the normal handler after answering the preflight
}
Spent nearly 3hrs banging my head. Thanks to user105279's answer for hinting at this.
If you have a favicon on your site, remove it and try again. If that resolves your problem, fix your favicon URL.
I'm doing more or less the same thing now, and noticed the same thing.
I'm testing my server by entering the api address in chrome like this:
http://127.0.0.1:1337/links/1
my Node.js server is then responding with a json object depending on the id.
I set up a console log in the get method and noticed that when I change the id in Chrome's address bar it sends a request (before I hit enter to actually send the request), and the server accepts another request after I actually hit enter. This happens with and without the Chrome dev console open.
IE 11 doesn't seem to work in the same way but I don't have Firefox installed right now.
Hope that helps someone even if this was a kind of old thread :)
/J
I managed to fix this with listen.setTimeout and axios.defaults.timeout = 36000000.
Node.js
var timeout = require('connect-timeout'); // express v4

// in CORS, answer preflight requests with 200 and don't pass them on to the next handler
app.use(cors({ preflightContinue: false, optionsSuccessStatus: 200 }));

// put this middleware at the end of the middleware chain
app.use(timeout(36000000)); // 36000000 ms = 10 hours
app.use((req, res, next) => {
  if (!req.timedout) next();
});

var listen = app.listen(3333, () => console.log('running'));
listen.setTimeout(36000000);
React
import axios from 'axios';

axios.defaults.timeout = 36000000; // 36000000 ms = 10 hours
After 2 days of trying.
You might have to increase the timeout even more. I haven't seen the Express source, but it sounds like it retries on timeout.
Ensure you call res.send(); the Axios call expects a response from the server and otherwise sends the request again after 120 seconds.
I had the same issue doing this with Express 4. I believe it has to do with how it resolves request params. The solution is to ensure your params are resolved, for example by checking them in an if block:
app.get('/:conversation', (req, res) => {
  let url = req.params.conversation;
  // Only handle the request when params have resolved
  if (url) {
    res.redirect(301, 'http://' + url + '.com')
  }
})
In my case, my Axios POST requests were received twice by Express: the first one without a body, the second one with the correct payload. The same request sent from Postman was only received once. It turned out that Express was running on a different port, so my requests were cross-origin. This caused Chrome to send a preflight OPTIONS request to the same URL (the POST URL), and my app.all routing in Express processed that one too.
app.all('/api/:cmd', require('./api.js'));
Separating POST from OPTIONS solved the issue:
app.post('/api/:cmd', require('./api.js'));
app.options('/', (req, res) => res.send());
I met the same problem. I tried adding a bare return and it didn't work, but it works when I add return res.redirect('/path');
I had the same problem. Then I opened the Chrome dev tools and found out that favicon.ico was being requested from my Express.js application. I needed to fix the way I registered the middleware.
Screenshot of Chrome dev tools
I also had double requests. In my case it was caused by forwarding from the http to the https protocol. You can check whether that's the case by inspecting
req.headers['x-forwarded-proto']
It will either be 'http' or 'https'.
I could fix my issue simply by adjusting the order in which my middlewares trigger.
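A small sketch of that check (my own code; it assumes the app sits behind a proxy that sets x-forwarded-proto, and that the redirect middleware runs before anything that would otherwise handle the request twice):
app.enable('trust proxy');

app.use((req, res, next) => {
  console.log('forwarded protocol:', req.headers['x-forwarded-proto']);
  // Only redirect plain-HTTP requests, so the HTTPS request is handled exactly once
  if (req.headers['x-forwarded-proto'] === 'http') {
    return res.redirect(301, 'https://' + req.headers.host + req.originalUrl);
  }
  next();
});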
