I'm looking to delay all requests to an endpoint by 30 minutes from when each one was originally received. I'm working with Shopify and am responding to an order creation webhook, passing along data to another 3rd party service.
This is what happens: an order is placed, and then a webhook fires via Shopify, which contacts my Koa endpoint with the associated order data.
I want to receive the webhook body (which includes order data) immediately, but then wait 30 minutes, then send along that information to another service. But I want to send a 200 header back to Shopify immediately, regardless of whether or not the code executes properly after the 30 minute delay (if it doesn't, I'll be informed via an error email anyway and can handle it manually, it wouldn't be the end of the world).
I've tried setTimeout methods, but the issue seems to be this: I can't send a 200 header back to Shopify until after the 30 minutes is up (or I shouldn't anyway, according to Koa docs?).
Meaning, if I do something like
context.response.status = 200; // tried to send the response code immediately because of the 30 minute timeout
await handleRequest(context.request.body); // code waits 30 mins via setTimeout
the 200 header isn't being sent right away
By the time 30 minutes have passed, Shopify has long since given up on receiving a 200 header and has automatically put the request in its retry queue. That means the webhook will fire again before the 30 minutes are up, so I'll be sending duplicate requests to the service I'm trying to delay.
I've considered cron, but it's not that I want these to be processed every 30 minutes, I want each individual request to wait 30 minutes to send to a third party service after it was received.
I'm really trying to avoid a database.
I'm fine with using setTimeout, if there is a way to send a 200 header back to Shopify before the timeout and subsequent service call after 30 minutes.
The status code goes out with the first part of the HTTP message, so you need to start sending, or end, the HTTP response for it to reach the client.
In this case, you don't need to block the response on the 3rd party call. Move the logic you want to run in 30 minutes into its own function and don't await the timer; it doesn't belong in your response logic.
Some pseudo code:
function talkTo3rdPartyDelayed(body, delay) {
  setTimeout(() => {
    // talk to the 3rd party using the captured body
  }, delay);
  // returns immediately; the timer fires later, so this is non-blocking
}

function listenForShopify(req, res, next) {
  talkTo3rdPartyDelayed(req.body, 30 * 60 * 1000);
  res.statusCode = 200;
  next(); // or res.end(), which sends the status code
}
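Applied to the Koa handler from the question, a minimal sketch could look like the following. This assumes a koa-router route and a body parser, and that handleRequest is refactored to call the 3rd party directly instead of sleeping itself; sendErrorEmail stands in for the error email mentioned in the question.

const Koa = require('koa');
const Router = require('@koa/router');
const bodyParser = require('koa-bodyparser');

const app = new Koa();
const router = new Router();

router.post('/webhooks/orders-create', async (ctx) => {
  const order = ctx.request.body; // capture the order data now

  setTimeout(() => {
    // runs ~30 minutes later, completely outside the request/response cycle
    handleRequest(order).catch(sendErrorEmail);
  }, 30 * 60 * 1000);

  ctx.status = 200; // sent back to Shopify as soon as this handler returns
});

app.use(bodyParser());
app.use(router.routes());
app.listen(3000);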
As an aside, in a robust system, you'd store that 3rd party request you want delayed for 30 minutes in a queue system, with a visibility timer set to 30 minutes ahead. That way, if the process dies, you don't lose that work you intended to do. Having to manually handle any issues that come up will get tiresome.
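For illustration only, here is what that could look like with a delayed job in a Redis-backed queue such as BullMQ; the queue name, job name, connection details, and forwardToThirdParty helper are assumptions, not part of the original setup.

const { Queue, Worker } = require('bullmq');
const connection = { host: '127.0.0.1', port: 6379 }; // assumed Redis instance

const orders = new Queue('shopify-orders', { connection });

// In the webhook handler: enqueue the work with a 30 minute delay, then return 200 right away.
async function enqueueOrder(body) {
  await orders.add('forward-order', body, { delay: 30 * 60 * 1000 });
}

// In a worker process: the job only becomes available after the delay,
// and it survives a restart because it lives in Redis rather than in a setTimeout.
new Worker('shopify-orders', async (job) => {
  await forwardToThirdParty(job.data);
}, { connection });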
Related
I have a problem with my Express.js app: when I send a response with a 200 second delay, it returns ERR_EMPTY_RESPONSE with status code 324.
Here is the fakeTimeout example:
fakeTimeout(req, res) {
  setTimeout(() => { res.json({ success: true }) }, 200000)
}
ERR_EMPTY_RESPONSE is a Google Chrome error code.
Actually, Chrome will automatically time out requests when they exceed 300 seconds, and unfortunately there is no way to change that setting.
One workaround could be to change the Keep-Alive headers.
However, if a task takes longer than a minute, you really shouldn't make the user wait that long; respond right away and give feedback in the UI later, once the task has completed.
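A common shape for that, sketched below with made-up route names and an assumed long-running runLongTask function: accept the work immediately with a 202 and let the client poll a status endpoint (or get notified over a socket) once the job finishes.

const express = require('express');
const app = express();
app.use(express.json());

const jobs = new Map(); // in-memory job store, for illustration only

app.post('/long-task', (req, res) => {
  const id = Date.now().toString();
  jobs.set(id, { done: false, result: null });

  runLongTask(req.body) // assumed long-running function returning a promise
    .then((result) => jobs.set(id, { done: true, result }))
    .catch(() => jobs.set(id, { done: true, result: { error: true } }));

  res.status(202).json({ id }); // respond well before any browser timeout
});

app.get('/long-task/:id', (req, res) => {
  res.json(jobs.get(req.params.id) || { done: false });
});

app.listen(3000);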
My request looks like this:
$http.post('/api/internal/Properties/' + options.property.id + '/Upload', { Buildings: Buildings })
  .success(function (data) {
    // handle response
  });
On the server side, this request can take a long time to process - up to 5 minutes. Before it can finish (res.send()), the $http.post request gets called again, every few minutes.
When it finally does finish processing, the res.send(data) is never caught on the client side. It's like it just disappears.
Anything would help.
Thanks,
Each browser has its own HTTP request timeout, so it will not wait 5 minutes for the request to complete; it will just fail.
Browser Timeouts.
In Chrome it's 30 or 60 seconds.
In your case I suggest using sockets or something similar to show the user the status of the upload.
Hope this helps.
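A rough sketch of that idea with Socket.IO, assuming an Express server; the event names and the processUpload helper are assumptions, not part of the original code. The server acknowledges the upload immediately and pushes a completion event when the long job is done.

const express = require('express');
const http = require('http');

const app = express();
app.use(express.json());

const server = http.createServer(app);
const io = require('socket.io')(server); // push channel back to the browser

app.post('/api/internal/Properties/:id/Upload', (req, res) => {
  const propertyId = req.params.id;

  processUpload(propertyId, req.body.Buildings) // assumed long-running job returning a promise
    .then((data) => io.emit('upload:done', { propertyId, data }))
    .catch((err) => io.emit('upload:failed', { propertyId, message: err.message }));

  res.status(202).send({ accepted: true }); // respond before the browser times out
});

server.listen(3000);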
I am building a REST API with Node.js.
My requirement is that if I get repeated requests within a certain time interval, I need to block that client for 15 minutes (because the token expires in 15 minutes).
My condition allows only 15 requests in 10 seconds from the same client.
Once the 16th request arrives from the same client within that window, the client needs to be blacklisted and should get a Forbidden status from then on, until their token expires.
The token expiry limit is 15 minutes from the time of token creation; this is maintained in the Node session.
How can I achieve this? I am using the method below:
var rateLimit = require('express-rate-limit');
var limiter = rateLimit({/* config */});
app.use('/users', limiter);
I am using the default config values here. If I send repeated requests, it says 429 Too Many Requests; after a few seconds it works again with no error response. What I need is to block requests from the same place for 15 minutes, after which the token will have expired; I then want to continue the process by creating a new token.
You need to decide 2 things:
A time limit
How many requests you want to allow in that time limit
If you want to limit requests to a maximum of 15 every 10 seconds, you would configure express-rate-limit like so:
var rateLimit = require('express-rate-limit');
var limiter = rateLimit({
  windowMs: 10 * 1000, // 10 seconds
  max: 15,
});
app.use('/users', limiter);
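That alone won't blacklist a client for a full 15 minutes, though: once the 10 second window rolls over, the client can send requests again. One way to get the behaviour described in the question, sketched here as an assumption on top of express-rate-limit's handler option, is to keep a small in-memory blocklist keyed by IP:

var express = require('express');
var rateLimit = require('express-rate-limit');

var app = express();
var blocked = new Map(); // ip -> timestamp when the block expires

var limiter = rateLimit({
  windowMs: 10 * 1000, // 10 seconds
  max: 15,
  handler: function (req, res) {
    // 16th request inside the window: block this client for 15 minutes
    blocked.set(req.ip, Date.now() + 15 * 60 * 1000);
    res.status(403).send('Forbidden');
  },
});

app.use('/users', function (req, res, next) {
  var until = blocked.get(req.ip);
  if (until && until > Date.now()) return res.status(403).send('Forbidden'); // still blacklisted
  blocked.delete(req.ip); // block expired, let the request through
  next();
}, limiter);

Note that an in-memory Map only works for a single process; with multiple instances you would need a shared store.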
I have an app hosted on Heroku with a Heroku Postgres instance.
My REST API fetches some data from the database and returns it to the client.
I am using Sequelize.js as the ORM and restify.js as the server.
My API function looks like this:
app.get('/test', function (req, res, next) {
  var t1 = new Date();
  db.getSomeData().done(function (err, result) {
    var t2 = new Date();
    console.log((t2.getTime() - t1.getTime()) + 'ms');
    res.json({}); // not a mistake: I return empty JSON to rule out response size as the cause of the delay
    next();
  });
});
For these tests I always fetch exactly the same data from the DB. console.log says that the DB query takes 400-500 ms. However, when I test AJAX requests (in Chrome) I can see that the average request time is 500-800 ms, but from time to time (around 2-3 times per 10 tries) the response is received after 3-5 seconds. This is strange because the DB query time is still normal (400 ms, for example) and I am not sending any data back to the client. The API is used only by me, so there is no load on the Heroku server. I thought it might be caused by my internet connection, so I made a second test with the following handler:
app.get('/test', function (req, res, next) {
  res.json({}); // again returning empty JSON, as above
  next();
});
And over 100 tries there is no delay - all requests complete in 73-89 ms.
Do you know what the problem might be? It seems that a request sometimes takes more time simply because a DB query was involved, no matter how much data is sent back (even though the DB query itself does not take longer than normal)...
That is super-strange because I tried something like this:
app.get('/test', function (req, res, next) {
  setTimeout(function () {
    res.json({}); // again returning empty JSON, as above
    next();
  }, 500);
});
And the problem still occurs: 3-4 times per 10 tries the response is received after 1.5-3.5 seconds, where the average is 800-900 ms.
Also, the Response-Time header set by restify is correct despite the delay. For example:
Chrome says: 3.5s
Response-Time: 600ms
Does that mean the lag is caused by Heroku?
Resolved. It was a Heroku platform issue.
https://status.heroku.com/incidents/649
Investigating
Our automated systems have detected potential platform errors.
We are investigating.
Posted Jul 12, 2014 23:38 UTC
I'm writing a proxy in Node.js + Express 2. The proxy should:
decrypt the POST payload and issue an HTTP request to a server based on the result;
encrypt the reply from the server and send it back to the client.
The encryption-related part works fine. The problem I'm facing is timeouts. The proxy should process requests in less than 15 seconds, and most of them are under 500 ms, actually.
The problem appears when I increase the number of parallel requests. Most requests complete OK, but some fail after 15 seconds plus a couple of milliseconds. ab -n5000 -c300 works fine, but with a concurrency of 500 some requests fail with a timeout.
I can only speculate, but it seems the problem is the order of callback execution. Is it possible that the requests that come first hang until ETIMEDOUT because Node is focused on the latest ones, which are still being processed in time, under 500 ms?
P.S.: There is no problem with the remote server. I'm using request for interactions with it.
Update:
The way things work, with some code:
function queryRemote(req, res) {
  var options = {}; // built based on req object (URI, body, authorization, etc.)
  request(options, function (err, httpResponse, body) {
    return err ? send500(req, res)
               : res.end(encrypt(body));
  });
}

app.use(myBodyParser); // reads hex string in payload
                       // and calls next() on 'end' event

app.post('/', [checkHeaders,   // check Content-Type and Authorization headers
               authUser,       // query DB and call next()
               parseRequest],  // decrypt payload, parse JSON, call next()
         function (req, res) {
           req.socket.setTimeout(TIMEOUT);
           queryRemote(req, res);
         });
My problem is the following: when ab issues, let's say, 20 POSTs to /, the Express route handler gets called something like thousands of times. That doesn't always happen; sometimes 20 and only 20 requests are processed in a timely fashion.
Of course, ab is not the problem. I'm 100% sure that only 20 requests are sent by ab, but the route handler gets called multiple times.
I can't find a reason for this behaviour; any advice?
Timeouts were caused by using http.globalAgent, which by default can process up to 5 concurrent requests to one host:port (which isn't enough in my case).
Thousands of requests (instead of tens) were actually sent by ab (a fact confirmed with Wireshark under OS X; I cannot reproduce this under Ubuntu inside Parallels).
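For reference, a sketch of raising that limit, either on the shared global agent or per request via the request module's pool option; the value 100, the remote URL, and the payload are only illustrative.

var http = require('http');
var request = require('request');

// Option 1: raise the limit on the shared global agent
http.globalAgent.maxSockets = 100;

// Option 2: give the request module its own, larger connection pool
var payload = JSON.stringify({ example: true }); // placeholder body
request(
  { url: 'http://remote-server/', method: 'POST', body: payload,
    pool: { maxSockets: 100 } },
  function (err, httpResponse, body) {
    // handle the proxied response here
  }
);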
You can have a look at the node-http-proxy module and how it handles connections. Make sure you don't buffer any data and that everything works by streaming. You should also try to see where the time is spent for those long requests: instrument parts of your code with console.time and console.timeEnd and see what takes the most time. If the time is mostly spent in JavaScript, you should try to profile it. Basically, you can use the V8 profiler by adding the --prof option to your node command, which produces a v8.log that can be processed with a V8 tool found in node-source-dir/deps/v8/tools. It only works if you have installed the d8 shell via scons (scons d8). You can have a look at this article to help you get this working.
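As a small illustration of that console.time / console.timeEnd instrumentation, applied here to the queryRemote function from the question (the timer label is arbitrary):

function queryRemote(req, res) {
  var options = {}; // built based on req, as before
  console.time('remote-request'); // start a named timer
  request(options, function (err, httpResponse, body) {
    console.timeEnd('remote-request'); // logs "remote-request: <n>ms"
    return err ? send500(req, res)
               : res.end(encrypt(body));
  });
}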
You can also use node-webkit-agent, which uses the WebKit developer tools to show the profiler results; you can also have a look at my fork, which adds a bit of sugar.
If that doesn't work, you can try profiling with dtrace (which only works on illumos-based systems like SmartOS).