koa.js streaming response from remote url - node.js

I want to create a Koa route that acts like a proxy for another URL, which delivers a file that is usually a few dozen megabytes. Therefore I would like to avoid blocking while serving the response. I am currently using this.body = yield request.get(url);, where request is the co-request module.
How do I stream the response back to the client?
Edit:
I am now doing the following:
var req = require('request');
//...
this.body = req(url).pipe(fs.createWriteStream(this.params.what));
If I paste the url in my browser, I get the file just fine. However, in my route I get an Error: Cannot pipe. Not readable.

Turns out the solution was simply:
var req = require('request');
//...
this.body = req(url);
This is because this.body has to be a readable stream, which is exactly what req(url) returns. Thanks to @danneu for the explanation.
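Put together, the whole route can be as small as this. A minimal sketch, using Koa 1.x generator syntax as in the question; koa-route and the upstream URL are stand-ins:

var app = require('koa')();
var route = require('koa-route');
var req = require('request');

app.use(route.get('/proxy/:what', function* (what) {
  // req() returns a readable stream, and Koa streams any
  // stream body straight to the client without buffering it
  this.body = req('https://example.com/files/' + what);
}));

app.listen(3000);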

Related

Piping readable stream using superagent

I'm trying to create a multer middleware that pipes a file streamed from the client to a 3rd party via superagent.
const superagent = require('superagent');
const multer = require('multer');

// my middleware
function streamstorage() {
  function StreamStorage() {}

  StreamStorage.prototype._handleFile = function (req, file, cb) {
    console.log(file.stream); // <-- is a readable stream
    const post = superagent.post('www.some-other-host.com');
    file.stream.pipe(post);
    // need to call cb(null, {some: data}); but how
    // do I get/handle the response from this post request?
  };

  return new StreamStorage();
}

const streamMiddleware = multer({ storage: streamstorage() });

app.post('/someupload', streamMiddleware.single('rawimage'), function (req, res) {
  res.send('some token based on the superagent response');
});
This seems to work, but I'm not sure how to handle the response from the superagent POST request, since I need to return a token received from it.
I've tried post.end(fn...), but apparently end and pipe can't both be used together. I feel like I'm misunderstanding how piping works, or whether what I'm trying to do is practical at all.
Superagent's .pipe() method is for downloading (piping data from a remote host to the local application).
It seems you need piping in the other direction: upload from your application to a remote server. In superagent (as of v2.1) there's no method for that, and it requires a different approach.
You have two options:
The easier, less efficient one: tell multer to buffer/save the file, and then upload the whole file using .attach().
The harder one: "pipe" the file "manually", as sketched after this list:
Create a superagent instance with the URL, method and HTTP headers you want for the upload,
Listen to data events on the incoming file stream, and call superagent's .write() method with each chunk of data,
Listen to the end event on the incoming file stream, and call superagent's .end() method to read the server's response.
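(For the first option, multer's disk storage plus superagent's .attach('file', req.file.path) is enough.) A minimal sketch of the second option inside the question's _handleFile; the upload URL, the content type, and the token field on the response are assumptions:

StreamStorage.prototype._handleFile = function (req, file, cb) {
  // hypothetical upload endpoint and content type
  const post = superagent
    .post('http://www.some-other-host.com/upload')
    .type('application/octet-stream');

  file.stream.on('data', (chunk) => post.write(chunk)); // forward each chunk
  file.stream.on('error', cb);
  file.stream.on('end', () => {
    // .end() finishes the upload and hands us the remote response
    post.end((err, response) => {
      if (err) return cb(err);
      cb(null, { token: response.body.token }); // hypothetical response shape
    });
  });
};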

change content-disposition on piped response

I have the following controller that gets a file from a service and pipes the response to the browser.
function (req, res) {
  request.get(serviceUrl).pipe(res);
}
I'd like to change the content-disposition (from attachment to inline) so the browser opens the file instead of downloading it directly.
I already tried this, but it is not working:
function (req, res) {
  res.set('content-disposition', 'inline');
  request.get(serviceUrl).pipe(res);
}
The versions I'm using are:
NodeJS: 0.12.x
Express: 4.x
To do this you can use an intermediate pass-through stream between the request and the response; that way the headers from the request won't be passed on to the response:
var through2 = require('through2'); // or whatever you like better

function (req, res) {
  var passThrough = through2(); // this stream keeps request's headers off the response
  res.set('content-disposition', 'inline');
  request.get(serviceUrl).pipe(passThrough).pipe(res);
}
But be careful: this ignores all of the upstream headers, so you will probably need to set 'Content-Type' and the like yourself.
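For example, one way to keep the upstream Content-Type while still overriding Content-Disposition is to copy it from request's response event before the data starts flowing. A sketch, with serviceUrl as in the question:

var through2 = require('through2');
var request = require('request');

function inlineProxy(req, res) {
  var upstream = request.get(serviceUrl);
  upstream.on('response', function (serviceRes) {
    // keep the service's type, but force inline display
    res.set('content-type', serviceRes.headers['content-type']);
    res.set('content-disposition', 'inline');
  });
  // the pass-through keeps request from writing its own headers to res
  upstream.pipe(through2()).pipe(res);
}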

NodeJS middleware how to read from a writable stream (http.ServerResponse)

I'm working on an app (using Connect, not Express) composed of a set of middlewares plus the node-http-proxy module, that is, I have a chain of middlewares like:
midA -> midB -> http-proxy -> midC
In this scenario, the response is written by http-proxy, which proxies the request to some target and returns the content.
I would like to create a middleware (say midB) that acts as a cache. The idea is:
If the url is cached, the cache middleware writes the response and avoids continuing the middleware chain.
If the url is not cached, the cache middleware passes the request along the middleware chain but needs to read the final response content in order to cache it.
How can I achieve this? Or is there another approach?
Cheers
Answering myself.
Say you have a middleware like function(req, res, next) {...} and need to read the content of the response object.
In this case res is an http.ServerResponse object: a writable stream that every middleware in the chain is allowed to write content to, which together makes up the response we want to return.
Do not confuse it with the response you get when making a request with http.request(); that is an http.IncomingMessage, which in fact is a readable stream.
The way I found to read the content all middlewares write to the response is redefining the write method:
var middleware = function (req, res, next) {
  var data = "";
  res._oldWrite = res.write;
  res.write = function (chunk, encoding, cb) {
    data += chunk; // accumulate everything written to the response
    return res._oldWrite.call(res, chunk, encoding, cb);
  };
  ...
};
Any other solutions will be appreciated.
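For what it's worth, here is a fuller sketch of the same idea that also wraps res.end (the final chunk is often passed there rather than to write) and wires in the cache check from the question. The in-memory Map and what exactly gets cached are assumptions:

var cache = new Map(); // hypothetical in-memory store, keyed by URL

function cacheMiddleware(req, res, next) {
  var cached = cache.get(req.url);
  if (cached) {
    return res.end(cached); // serve from cache, skip the rest of the chain
  }

  var chunks = [];
  var oldWrite = res.write;
  var oldEnd = res.end;

  function collect(chunk, encoding) {
    if (!chunk || typeof chunk === 'function') return; // res.end(cb) passes no data
    chunks.push(Buffer.isBuffer(chunk)
      ? chunk
      : Buffer.from(chunk, typeof encoding === 'string' ? encoding : 'utf8'));
  }

  res.write = function (chunk, encoding, cb) {
    collect(chunk, encoding);
    return oldWrite.call(res, chunk, encoding, cb);
  };

  res.end = function (chunk, encoding, cb) {
    collect(chunk, encoding);
    cache.set(req.url, Buffer.concat(chunks));
    return oldEnd.call(res, chunk, encoding, cb);
  };

  next();
}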

Node.js - Stream Binary Data Straight from Request to Remote server

I've been trying to stream binary data (PDFs, images, other resources) directly from a request to a remote server, but have had no luck so far. To be clear, I don't want to write the document to any filesystem. The client (browser) makes a request to my node process, which subsequently makes a GET request to a remote server and streams that data directly back to the client.
var request = require('request');

app.get('/message/:id', function (req, res) {
  // db call for specific id, etc.
  var options = {
    url: 'https://example.com/document.pdf',
    encoding: null
  };

  // First try - unsuccessful
  request(options).pipe(res);

  // Second try - unsuccessful
  request(options, function (err, response, body) {
    var binaryData = body.toString('binary');
    res.header('content-type', 'application/pdf');
    res.send(binaryData);
  });
});
Putting both data and binaryData in a console.log shows that the proper data is there, but the PDF that is subsequently downloaded is corrupt. I can't figure out why.
Wow, never mind. It turns out Postman (the Chrome app) was somehow hijacking the request and response. The // First try example in my code excerpt works properly in the browser.

How to get the data from a node read stream to use it for a HTTP post request?

I'd like to send an HTTP POST request using scoped-http-client, like this:
client('http://url-to-post-to.com/').post({filedata: <data from stream>})
How would I pass in the data of a node stream?
From their documentation, you can get access to the http.ClientRequest, which is a writable stream. From there, you can just pipe the file data into the request. For example, if you were to send JSON file data:
var fs = require("fs"),
    scopedClient = require("scoped-http-client");

var file = fs.createReadStream("./test.json");
var client = scopedClient
  .create("http://url-to-post-to.com/")
  .header("Content-Type", "application/json");

client.post(function (err, req) {
  file.pipe(req);
});
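If you would rather not add a dependency, the same pattern works with Node's built-in http module: http.request() also returns a writable http.ClientRequest. A sketch with placeholder host and path:

var fs = require("fs"),
    http = require("http");

var file = fs.createReadStream("./test.json");

var req = http.request({
  method: "POST",
  host: "url-to-post-to.com",
  path: "/",
  headers: { "Content-Type": "application/json" }
}, function (res) {
  res.pipe(process.stdout); // the server's response is a readable stream
});

file.pipe(req); // pipe calls req.end() when the file has been fully read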
