I have been working on the answer to this question: How to make a socket a stream? (connecting an https response to S3 after imagemagick). As per loganfsmyth's recommendation I commented out the req.end(image) line, but when I attempt to upload a file the server simply times out. I see similar behaviour when I uncomment the req.end(image) line, except that the image successfully uploads to S3. Can someone clarify which way is correct? And if it is right to uncomment the req.end(image) line, what is the best way to send a response to the browser to prevent it from timing out?
https.get(JSON.parse(queryResponse).data.url, function (res) {
    graphicsmagick(res)
        .resize('50', '50')
        .stream(function (err, stdout, stderr) {
            var ws = fs.createWriteStream(output)
            var i = []
            stdout.on('data', function (data) {
                i.push(data)
            })
            stdout.on('close', function () {
                var image = Buffer.concat(i)
                var req = S3Client.put("new-file-name", {
                    'Content-Length': image.length,
                    'Content-Type': res.headers['content-type']
                })
                req.on('response', function (res) { // prepare 'response' callback from S3
                    if (200 == res.statusCode)
                        console.log('it worked')
                })
                //req.end(image) // send the content of the file and an end
            })
        })
})
Basically the page was being requested twice, which caused the image to be overwritten, because of the favicon. See: node.js page refresh calling resources twice?
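For reference, a minimal guard along those lines (a sketch on my part, not from the original answer) simply short-circuits the browser's automatic favicon request so the handler body only runs once per page load:

http.createServer(function (req, res) {
    // Sketch: ignore the favicon request the browser fires alongside the page.
    if (req.url === '/favicon.ico') {
        res.end();
        return;
    }
    // ... image resize / S3 upload code runs here, once ...
});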
In the question you link to, the user was using putStream, so calling req.end() is incorrect; in your case, however, you are using put directly, so you need to call req.end(). With it commented out you never actually use the image value except for its length, so you never send the image data.
It is hard to tell without seeing the server handler that actually runs this code, but you need to (optionally) return some response and then .end() the actual connection to the browser too, or it will sit there waiting.
So if you have something like this:
http.createServer(function (req, browserResponse) {
    // Other code.
    req.on('response', function (s3res) { // prepare 'response' callback from S3
        if (200 == s3res.statusCode) console.log('it worked')
        // Close the response. You can also pass it data to send to the browser.
        browserResponse.end();
    })
    // Other code.
});
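Putting the pieces together, here is a fuller sketch (assuming a knox-style S3Client and the queryResponse variable from the question): req.end(image) sends the image bytes to S3, and browserResponse.end() answers the browser once S3 replies, so neither side times out.

http.createServer(function (browserRequest, browserResponse) {
    https.get(JSON.parse(queryResponse).data.url, function (res) {
        graphicsmagick(res)
            .resize('50', '50')
            .stream(function (err, stdout, stderr) {
                var chunks = [];
                stdout.on('data', function (data) { chunks.push(data); });
                stdout.on('close', function () {
                    var image = Buffer.concat(chunks);
                    var req = S3Client.put('new-file-name', {
                        'Content-Length': image.length,
                        'Content-Type': res.headers['content-type']
                    });
                    req.on('response', function (s3res) {
                        // Answer the browser once S3 replies, so it doesn't time out.
                        browserResponse.end(200 == s3res.statusCode ? 'it worked' : 'upload failed');
                    });
                    req.end(image); // actually send the image bytes to S3
                });
            });
    });
}).listen(8080);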
I'm running a small server that needs to receive web forms. The server checks the request and sends back "success" or "fail", which is then displayed on the form (on the client's screen).
Now, checking the form may take a few seconds, so the user may be tempted to send the form again.
What is the correct way to ignore the second request?
So far I have come up with these solutions for when the form is a duplicate of the previous one:
1. Don't check it, and send some server error back (like 429, or 102, or some other one).
2. Close the connection directly: req.destroy(); res.destroy();
3. Ignore the request and exit from the requestListener function.
With solutions 1 and 2 the form (in the client's browser) displays an error message (even though the first request they sent was correct, as are the duplicates). So those are not good options.
Solution 3 gives the desired outcome... but I'm not sure it is the right way to go about it: basically leaving req and res untouched instead of destroying them. Could this cause issues, or slow down the server? (Like... do they stack up?) Of course the first request, once it has been checked, will get a response with the outcome code. My concern is with the duplicate requests, which I neither destroy nor answer...
Some details on the setup: a Node.js application using the very default code from the http module:
const http = require("http");

const requestListener = function (req, res) {
    var requestBody = '';
    req.on('data', (data) => {
        requestBody += data;
    });
    req.on('end', () => {
        if (isduplicate(requestBody))
            return; // duplicate: neither answered nor destroyed
        else
            evalRequest(requestBody, res);
    });
}
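If you stay with solution 3, one safeguard worth considering (an assumption on my part, not something from the question) is a server-wide timeout, so the sockets of ignored duplicates are eventually released instead of lingering:

const server = http.createServer(requestListener);

// Ignored requests hold their sockets open; a server-wide timeout makes
// Node destroy those idle sockets eventually, so they cannot stack up.
server.setTimeout(30 * 1000); // 30 seconds, an arbitrary value
server.listen(8080);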
I'm using superagent (although willing to use another node lib) with the goal of solely getting the redirect URL, not the body. The latter is overkill, and I want to prevent my code from downloading the body if I can help it.
I cannot use HEAD requests since these are not guaranteed to be enabled on the server.
Instead, my idea was to pipe the superagent response to a write stream and stop the write stream on receiving the first data. However, .on('data', fn) is only available on a read stream, not a write stream.
I then tried to see whether superagent.get(...) itself was a read stream on which I could listen with .on('data', fn) to kill the stream, but that doesn't appear to be the case either.
In short, is there another way to cancel a request early, while still getting the redirect url, but not incurring the download overhead of the entire body?
Solved: I needed buffer(false). With that, the response can be treated as a proper read stream:
const url = "http://www.google.com";
superagent.get(url)
    .buffer(false) // don't buffer the body; the response stays a readable stream
    .end(function (err, res) {
        const urlRedirect = res.redirects.length ? res.redirects.pop() : url;
        res.destroy(); // abort before the body is downloaded
        // do stuff with 'urlRedirect'
    });
I have a page with file uploading/downloading functionality.
When I try to download a file AND cancel the save-file prompt, which appears after the res.writeHead part, the headers are left set while the server waits for the res.write and res.end parts.
The problem is that those are skipped if the prompt is cancelled, making every other response fail with the error "Can't set headers after they are sent".
Is there any way to end the response by catching the cancelled-prompt event in some way, or any other way to avoid this?
The part that sets the headers and streams the file data (located in a function that is called in the /download/:filename route) is:
res.writeHead(200, {'Content-Type': 'application/octet-stream'});
var readstream = gfs.createReadStream({
    filename: files[0].filename
});
readstream.on('data', function (data) {
    res.write(data);
});
readstream.on('end', function () {
    res.end();
});
If this sequence is not completed, every other response fails.
example:
res.status(403).send('You have no access to this file');
in another controller, called on the same page.
(I guess if I redirected to another page the headers would actually get cleared?)
* If I select a download location and press OK, no error occurs.
* I am not sending a double response in a loop (mentioning this to head off that common-mistake answer :)
In Express you can check res.headersSent; that way you can avoid the exception:
if (res.headersSent) {
    return;
} else {
    // set your headers
}
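For example, a small helper (hypothetical name safeSend, assuming an Express response object) keeps that guard in one place:

// Hypothetical helper: respond only if no response has started yet.
function safeSend(res, status, message) {
    if (res.headersSent) {
        return; // a download already sent headers; skip this response
    }
    res.status(status).send(message);
}

// e.g. in the other controller:
safeSend(res, 403, 'You have no access to this file');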
I've been trying to stream binary data (PDF, images, other resources) directly from a request to a remote server but have had no luck so far. To be clear, I don't want to write the document to any filesystem. The client (browser) will make a request to my node process which will subsequently make a GET request to a remote server and directly stream that data back to the client.
var request = require('request');

app.get('/message/:id', function (req, res) {
    // db call for specific id, etc.
    var options = {
        url: 'https://example.com/document.pdf',
        encoding: null // return the body as a Buffer rather than a string
    };

    // First try - unsuccessful
    request(options).pipe(res);

    // Second try - unsuccessful
    request(options, function (err, response, body) {
        var binaryData = body.toString('binary');
        res.header('content-type', 'application/pdf');
        res.send(binaryData);
    });
});
Putting both data and binaryData in a console.log shows that the proper data is there, but the PDF that is subsequently downloaded is corrupt. I can't figure out why.
Wow, never mind. It turned out Postman (the Chrome app) was hijacking the request and response somehow. The // First try example in my code excerpt works properly in the browser.
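For reference, a minimal sketch of the working streaming approach (assuming the request module and an Express app; the content-type forwarding and error handler are additions, not part of the original code):

var request = require('request');

app.get('/message/:id', function (req, res) {
    request('https://example.com/document.pdf')
        .on('response', function (remote) {
            // Forward the remote content type (e.g. application/pdf)
            // so the browser handles the bytes correctly.
            res.setHeader('content-type', remote.headers['content-type']);
        })
        .on('error', function (err) {
            res.status(502).end(); // bad gateway if the remote fetch fails
        })
        .pipe(res);
});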
I have an endpoint in a node app which is used to download images:
var request = require('request');

var images = {
    'car': 'http://someUrlToImage.jpg',
    'boat': 'http://someUrlToImage.jpg',
    'train': 'http://someUrlToImage.jpg'
};

app.get('/api/download/:id', function (req, res) {
    var id = req.params.id;
    res.setHeader("content-disposition", "attachment; filename=image.jpg");
    request.get(images[id]).pipe(res);
});
Now, this code works fine, but after a few hours of the app running the endpoint just hangs.
I am monitoring the memory usage of the app, which remains consistent, and other endpoints that just return some JSON respond as normal, so it is not as if the event loop is somehow being blocked. Is there a gotcha of some kind that I am missing when using the request module to pipe a response? Or is there a better way to achieve this?
I am also using the Express module.
You should add an error listener on your request, because errors are not propagated through pipes. That way, if your request has an error, it will close the connection and you'll get the reason:
request
    .get(...)
    .on('error', function (err) {
        console.log(err);
        res.end(); // close the browser connection so it does not hang
    })
    .pipe(res);