I'm trying to use streams to send data to the browser with Hapi, but can't figure out how. Specifically, I am using the request module. According to the docs the reply object accepts a stream, so I have tried:
reply(request.get('https://google.com'));
That throws an error. The docs say the stream object must be compatible with streams2, so then I tried:
reply(streams2(request.get('https://google.com')));
Now that does not throw a server-side error, but in the browser the request never loads (using Chrome).
I then tried this:
var stream = request.get('https://google.com');
stream.on('data', data => console.log(data));
reply(streams2(stream));
And the data was logged to the console, so I know the stream is not the issue; the problem must be with Hapi. How can I get streaming in Hapi to work?
Try using Readable.wrap:
var Readable = require('stream').Readable;
...
function (request, reply) {
    var s = Request('http://www.google.com');
    reply(new Readable().wrap(s));
}
Tested using Node 0.10.x and hapi 8.x.x. In my code example Request is the node-request module and request is the incoming hapi request object.
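For context, here is a minimal sketch of a complete hapi 8 style route registration around that handler (the port and path are arbitrary examples):

var Hapi = require('hapi');
var Request = require('request');
var Readable = require('stream').Readable;

var server = new Hapi.Server();
server.connection({ port: 3000 });

server.route({
    method: 'GET',
    path: '/proxy',
    handler: function (request, reply) {
        // wrap the streams1 stream from node-request in a streams2 Readable
        var s = Request('http://www.google.com');
        reply(new Readable().wrap(s));
    }
});

server.start();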
UPDATE
Another possible solution would be to listen for the 'response' event from Request and then reply with the http.IncomingMessage which is a proper read stream.
function (request, reply) {
    Request('http://www.google.com')
        .on('response', function (response) {
            reply(response);
        });
}
This requires fewer steps and also allows the developer to attach user-defined properties to the stream before transmission, which can be useful for setting status codes other than 200; see the sketch below.
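For example, a hedged sketch that propagates the upstream status code via hapi's documented .code() method (rather than relying on properties copied from the stream):

function (request, reply) {
    Request('http://www.google.com')
        .on('response', function (response) {
            // forward the upstream status code to the client
            reply(response).code(response.statusCode);
        });
}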
2020
I found it! The problem was the gzip compression.
To disable it just for text/event-stream, you need to provide the following config to the Hapi server:
const server = Hapi.server({
    port: 3000,
    ...
    mime: {
        override: {
            'text/event-stream': {
                compressible: false
            }
        }
    }
});
In the handler I use axios because it supports the newer streams2 interface:
async function handler(req, h) {
    const response = await axios({
        url: `http://some/url`,
        headers: req.headers,
        responseType: 'stream'
    });

    return response.data.on('data', function (chunk) {
        console.log(chunk.toString());
    });

    /* Another option with h2o2, not fully checked */
    // return h.proxy({
    //     passThrough: true,
    //     localStatePassThrough: true,
    //     uri: `http://some/url`
    // });
}
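Note that the mime override above only applies when the response is actually served as text/event-stream. A sketch of a handler that sets the content type explicitly (hapi v17+ style, matching the code above; the URL is a placeholder):

async function handler(req, h) {
    const response = await axios({
        url: `http://some/url`, // placeholder upstream
        responseType: 'stream'
    });

    // mark the response as an event stream so the override kicks in
    return h.response(response.data).type('text/event-stream');
}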
Related
I'm trying to pass a NodeJS Stream API PassThrough object as the response to a http request. I'm running an Express server and am doing something similar to the following:
const { PassThrough } = require('stream')
const createPassThrough = () => {
  return PassThrough()
}
app.get('/stream', (req, res) => {
  res.writeHead(200, {
    'Content-Type': 'audio/mpeg',
    'Transfer-Encoding': 'chunked'
  })
  res.write(createPassThrough())
})
However, when I try to do this, Express throws the following error:
The first argument must be of type string or an instance of Buffer. Received an instance of PassThrough
Is there any way to do this with Express, or am I going to need to use a different framework? I've read that Hapi.js is able to return a PassThrough object.
The stream write() operation is intended to write a chunk of data, not a reference to a readable source.
It seems that passThrough.pipe(res) might be what the OP intended to achieve. This will propagate all data written to the passthrough into the Express response.
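A minimal sketch of that approach (someAudioSource is a hypothetical readable stream standing in for whatever feeds the passthrough):

const { PassThrough } = require('stream')

app.get('/stream', (req, res) => {
  const passThrough = new PassThrough()

  res.writeHead(200, {
    'Content-Type': 'audio/mpeg',
    'Transfer-Encoding': 'chunked'
  })

  // pipe the readable side of the passthrough into the response
  passThrough.pipe(res)

  // anything written to passThrough now reaches the client
  someAudioSource.pipe(passThrough) // hypothetical source stream
})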
Calling the Riot API, I'm receiving incomplete JSON on an https GET request.
After debugging, I realized that depending on how long I wait (at a breakpoint) before the https 'data' callback executes, I actually receive the complete JSON object.
(Average API response time for me is 200-300 ms.)
let getOptions = function(url) {
    return {
        host: 'na.api.pvp.net',
        port: 443,
        path: `${url}?api_key=${apiKey}`,
        method: 'GET'
    };
}
exports.Call = function(url, callback) {
    let response = {};
    let req = https.request(getOptions(url), function(res) {
        response.statusCode = res.statusCode;
        res.on('data', function(data) {
            response.json = JSON.parse(data);
            callback(response);
        });
    });
    req.on('error', function(err) {
        response.err = err;
        callback(response);
    });
    req.end();
};
Running the code without breakpoints, or only breaking for a short time, I run into either the error:
JSON.parse(data): Unexpected token in JSON at position ...
or
JSON.parse(data): Unexpected end of JSON input
As I expect the 'data' callback to be executed only after the request is complete, I'm confused about how to fix this (without artificially delaying it, of course).
http.request returns a stream; it's not a simple callback that receives the whole response at once. The 'data' event fires once per chunk, so you will have to buffer and concatenate everything if you want to parse the whole response, as shown in the sketch below.
I would also strongly recommend using a helper library like got or request.
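A sketch of that buffering approach, applied to the code from the question:

exports.Call = function(url, callback) {
    let response = {};
    let req = https.request(getOptions(url), function(res) {
        response.statusCode = res.statusCode;
        let chunks = [];
        res.on('data', function(chunk) {
            // 'data' can fire many times per response; collect every chunk
            chunks.push(chunk);
        });
        res.on('end', function() {
            // only now is the body complete and safe to parse
            response.json = JSON.parse(Buffer.concat(chunks).toString());
            callback(response);
        });
    });
    req.on('error', function(err) {
        response.err = err;
        callback(response);
    });
    req.end();
};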
I'm trying to POST a raw body with restify. I have the receive side correct: when using Postman I can send a raw zip file, and the file is correctly created on the server's file system. However, I'm struggling to write my test in mocha. Here is the code I have; any help would be greatly appreciated.
I've tried this approach.
const should = require('should');
const restify = require('restify');
const fs = require('fs');
const port = 8080;
const url = 'http://localhost:' + port;
const client = restify.createJsonClient({
    url: url,
    version: '~1.0'
});
const testPath = 'test/assets/test.zip';
fs.existsSync(testPath).should.equal(true);
const readStream = fs.createReadStream(testPath);
client.post('/v1/deploy', readStream, function(err, req, res, data) {
    if (err) {
        throw new Error(err);
    }
    should(res).not.null();
    should(res.statusCode).not.null();
    should(res.statusCode).not.undefined();
    res.statusCode.should.equal(200);
    should(data).not.null();
    should(data.endpoint).not.undefined();
    data.endpoint.should.equal('http://endpointyouhit:8080');
    done();
});
Yet the file size on the file system is always 0. I'm not using my readStream correctly, but I'm not sure how to correct it. Any help would be greatly appreciated.
Note that I want to stream the file, not load it into memory on transmit and receive; the file can potentially be too large for an in-memory operation.
Thanks,
Todd
One thing is that you would need to specify a content type of multipart/form-data. However, it looks like restify doesn't support that content type, so you're probably out of luck using the restify client to post a file.
To answer my own question, it doesn't appear to be possible to do this with the restify client. I also tried the request module, which claims to have this capability. However, when using their streaming examples, I always had a file size of 0 on the server. Below is a functional mocha integration test.
const testPath = 'test/assets/test.zip';
fs.existsSync(testPath).should.equal(true);
const readStream = fs.createReadStream(testPath);
var options = {
    host: 'localhost',
    port: port,
    path: '/v1/deploy/testvalue',
    method: 'PUT'
};
var req = http.request(options, function (res) {
    // this feels a bit backwards, but these are evaluated AFTER the read stream has closed
    var buffer = '';
    // pipe body to a buffer
    res.on('data', function (data) {
        buffer += data;
    });
    res.on('end', function () {
        should(res).not.null();
        should(res.statusCode).not.null();
        should(res.statusCode).not.undefined();
        res.statusCode.should.equal(200);
        const json = JSON.parse(buffer);
        should(json).not.null();
        should(json.endpoint).not.undefined();
        json.endpoint.should.equal('http://endpointyouhit:8080');
        done();
    });
});
req.on('error', function (err) {
    if (err) {
        throw new Error(err);
    }
});
// pipe the readstream into the request
readStream.pipe(req);
/**
 * Close the request on the close of the read stream
 */
readStream.on('close', function () {
    req.end();
    console.log('I finished.');
});
// note that if we end up with larger files, we may want to support the 'continue' event, much as S3 does; a client-side sketch follows below
//https://nodejs.org/api/http.html#http_event_continue
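For reference, a hedged sketch of what that 'continue' handshake could look like on the client side (untested against the server above; Node's http client emits 'continue' when the server answers the Expect header with 100 Continue):

var options = {
    host: 'localhost',
    port: port,
    path: '/v1/deploy/testvalue',
    method: 'PUT',
    headers: { 'Expect': '100-continue' }
};

var req = http.request(options);

// only start streaming the body once the server has said it is ready
req.on('continue', function () {
    readStream.pipe(req);
});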
I'm doing a simple request with express & the request module and piping its response to res:
var pipe = request.get({
    url: 'url-to-file'
});

pipe.on('response', function (response) {
    req.on('close', function () {
        // these methods won't work:
        // pipe.unpipe();
        // pipe.end();
        // pipe.finish();
        // pipe.close();
    });
    pipe.on('end', function () {
        // this will never fire if I cancel the request
    });
    res.writeHead(response.statusCode, response.headers);
    pipe.pipe(res);
});
This works like a charm, except if I cancel downloads. The end event won't fire and some seconds later, an ESOCKETTIMEDOUT error gets thrown.
How can I close the pipe? These node docs claim that I can call .unpipe, but all node gives me is pipe.unpipe is not a function (tested with v0.12.7, 4.2.2 & 5.0.0), probably because it's not an original node stream.
I also tried using events like end, finish and close, but none of them work.
request.get() does not return a pure node.js stream but rather a Request object which inherits from the native Stream class and adds some custom methods. The method you are looking for is Request#abort() (Source link).
Your code example would look like the following:
var pipe = request.get({
    url: 'url-to-file'
});

pipe.on('response', function (response) {
    req.on('close', function () {
        pipe.abort();
    });
    pipe.on('end', function () {
        // this will never fire if I cancel the request
    });
    res.writeHead(response.statusCode, response.headers);
    pipe.pipe(res);
});
I'm using Node.js and connect to create a simple web server. I have something similar to the following code and I can't figure out how to access the actual request message body from the request object. I'm new to this so bear with me. I'm also taking out some of the stuff that's not necessary for the example.
function startServer(dir) {
    var port = 8888,
        svr = connect().use(connect.static(dir, {"maxAge": 86400000}))
                       .use(connect.directory(dir))
                       /*
                        * Here, I call a custom function for when
                        * connect.static can't find the file.
                        */
                       .use(custom);
    http.createServer(svr).listen(port);
}
function custom(req, res) {
    var message = /* the message body in the req object */;
    // Do some stuff with message...
}
startServer('dirName');
Make sense? I've tried logging that object to the console and it is full of TONS of stuff. I can easily see headers in there plus the request URL and method. I just can't seem to isolate the actual message body.
You should include the connect.bodyParser middleware as well:
svr = connect().use(connect.static(dir, {"maxAge": 86400000}))
               .use(connect.directory(dir))
               .use(connect.bodyParser())
               .use(custom);
That will provide the parsed message body as req.body to your handler.
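For example, the custom handler could then read the parsed body directly (a minimal sketch):

function custom(req, res) {
    // connect.bodyParser has already populated req.body at this point
    var message = req.body;
    console.log(message);
    res.end('ok');
}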
If you want the raw message body, you shouldn't use it but instead read the req stream yourself:
function custom(req, res) {
    var chunks = [];
    req.on('data', function(chunk) {
        chunks.push(chunk);
    });
    req.on('end', function() {
        var rawbody = Buffer.concat(chunks);
        // ...do stuff...
        // end the request properly
        res.end();
    });
}
if(req.method == "POST"){
var body = '';
req.on('data', function(data){
body += data;
});
}
Once 'end' fires, body should contain your message if you posted correctly.
A better idea would be to use Express with the body-parser middleware, which gives you this functionality out of the box and lets you guard against somebody hammering your server. The code above has NO protection against attacks such as oversized request bodies, but it will get you started. A sketch follows below.
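A sketch of that Express setup using the standalone body-parser module with a size limit, which is what protects against oversized bodies (the limit value and route are arbitrary examples):

const express = require('express');
const bodyParser = require('body-parser');

const app = express();

// reject bodies larger than 1mb instead of buffering them blindly
app.use(bodyParser.json({ limit: '1mb' }));

app.post('/message', (req, res) => {
    res.json({ received: req.body });
});

app.listen(8888);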