Error When Sending PassThrough Stream in Response with NodeJS Express Server - node.js

I'm trying to pass a NodeJS Stream API PassThrough object as the response to a http request. I'm running an Express server and am doing something similar to the following:
const { PassThrough } = require('stream')

const createPassThrough = () => {
  return PassThrough()
}

app.get('/stream', (req, res) => {
  res.writeHead(200, {
    'Content-Type': 'audio/mpeg',
    'Transfer-Encoding': 'chunked'
  })
  res.write(createPassThrough())
})
However, when I try to do this, Express throws the following error:
The first argument must be of type string or an instance of Buffer. Received an instance of PassThrough
Is there any way to do this with Express, or am I going to need to use a different framework? I've read that Hapi.js is able to return a PassThrough object.

The stream write() operation is intended to write a chunk of data, not a reference to a readable source.
It seems that passThrough.pipe(res) is what the OP intended to achieve. This will propagate all data written to the passthrough into the Express response.

Related

Receiving XML request in strong-soap

I am currently developing a NodeJS project with the strong-soap library to receive and transmit SOAP requests and responses. Before asking the question, I want to present a code sample so that I can ask it efficiently.
server.js
//Address of this project is http://localhost:8080
const soap = require('strong-soap').soap;
const http = require('http');
const fs = require('fs');

const myService = { // My SOAP server's service (group of methods)
  // My question literally and directly relates to `data`
  Method: function (data) {
    // ... function content
  }
};

const xmlFile = fs.readFileSync('sample.wsdl', 'utf8');

const server = http.createServer(function (req, res) {
  res.end('404: Not Found: ' + req.url);
}).listen(8080);

// Attach the SOAP service once, to the server itself (not per request)
soap.listen(server, '/wsdl', myService, xmlFile);
Assume that I have a server with the codebase seen above, and another Node.js project (or another .js file) that sends requests to and receives responses from this server.
client.js
const soap = require('strong-soap').soap;

soap.createClient('http://localhost:8080/wsdl', { forceSoap12Headers: true }, function (error, client) {
  if (error) {
    console.error(error);
    return;
  }
  client.Method(requestArgs, function (err, result, envelope) {
    // I want to be able to send XML data as requestArgs.
    // `result` is the body of the SOAP envelope (the wrapper element).
    console.log(JSON.stringify(result));
  }, null, customRequestHeader); // requestArgs and customRequestHeader are defined elsewhere
});
My question is: how can I send XML in client.js? Or how can I receive requests in XML format in server.js without any problems? Most probably I will be dealing with raw XML requests, and I want to handle both XML and JSON requests. Help me out, developers!

How do we encrypt a file with AES in node http proxy?

I have created an http proxy in NodeJS using the http-proxy package. What I want to do through this proxy is listen to requests coming to the proxy, encrypt the file contents, and forward them to the respective service. So I am using the proxyReq event to listen to requests coming to the proxy and am trying to encrypt the file in the request.
const { createEncryptStream, setPassword } = require('aes-encrypt-stream');
const PassThroughStream = require('stream').PassThrough;

proxy.on('proxyReq', (proxyReq, pReq, pRes) => {
  setPassword(Buffer.from(pReq.encryptionKey, 'hex'));
  const stream = new PassThroughStream();
  createEncryptStream(pReq).pipe(stream);
});
I am not that good with streams, but how do I encrypt the files in the request and forward them? I tried various solutions that chunk the file contents in the case of multipart/form-data and forward them, but those use memory extensively. So I am looking for a solution where I can encrypt within the stream.
In terms of memory usage, the problem with the code you've shown here is that the stream will buffer all the incoming data.
Assuming that targetRequestStream is the upstream request and targetResponseStream is the upstream response, the following makes sure that the data is encrypted and sent to the target server as it is read, and that the response is decrypted and sent back to the client as it is received.
const { createEncryptStream, createDecryptStream, setPassword } = require('aes-encrypt-stream');

proxy.on('proxyReq', (proxyReq, pReq, pRes) => {
  setPassword(Buffer.from(pReq.encryptionKey, 'hex'));
  createEncryptStream(pReq).pipe(targetRequestStream);
  createDecryptStream(targetResponseStream).pipe(pRes);
});
Another way to prevent all the data from buffering is to add a 'data' event listener.
const { createEncryptStream, createDecryptStream, setPassword } = require('aes-encrypt-stream');

proxy.on('proxyReq', (proxyReq, pReq, pRes) => {
  setPassword(Buffer.from(pReq.encryptionKey, 'hex'));
  createEncryptStream(pReq)
    .on('data', (data) => { /* handle encrypted data here */ });
  createDecryptStream(targetResponseStream)
    .on('data', (data) => { /* handle decrypted data here */ });
});

Streaming a large remote file

I need to serve files hosted on a different server using Node. The API endpoints are handled by Express.
The goal is not to hold the entire file in memory but to stream the data, so the output is shown to the end user progressively.
By reading the stream API documentation I came up with this solution, in combination with the Express response. Here is the example:
const url = require('url');
const fs = require('fs');

const open = (req, res) => {
  const formattedUrl = new url.URL(
    "https://dl.bdebooks.com/Old%20Bangla%20Books/Harano%20Graher%20Jantra%20Manob%20-%20Shaktimoy%20Biswas.pdf"
  );
  const src = fs.createReadStream(formattedUrl);
  return src.pipe(res);
};
But when I hit this Express endpoint (http://localhost:3000/open) it throws the following error:
TypeError [ERR_INVALID_URL_SCHEME]: The URL must be of scheme file
I would like to display the file content inline! What am I doing wrong? Any suggestions will be greatly appreciated. :)
fs.createReadStream() operates on the local file system; it does not accept an http or https URL. Instead, you need to use something like https.get() to make the request, which gives you a readable stream that you can then pipe from.
const https = require('https');
const url = require('url');

const open = (req, res) => {
  const formattedUrl = new url.URL("https://dl.bdebooks.com/Old%20Bangla%20Books/Harano%20Graher%20Jantra%20Manob%20-%20Shaktimoy%20Biswas.pdf");
  https.get(formattedUrl, (stream) => {
    // `stream` is the http.IncomingMessage for the remote file
    stream.pipe(res);
  }).on('error', (err) => {
    // send some sort of error response here
  });
};

Express custom log crashes while using express.static

I'm currently trying to log the full response time of every Express request with a simple middleware function like this:
module.exports = (req, res, next) => {
  const startTime = new Date();
  const oldEnd = res.end;
  res.end = (chunks, encoding) => {
    const responseTime = new Date() - startTime;
    res.set('Server-Timing', `total;dur=${responseTime}`);
    console.log(req.path, `Response Time: ${responseTime}`);
    res.end = oldEnd;
    res.end(chunks, encoding);
  };
  next();
}
This code works fine with normal Express endpoints, but when I try to serve a static file like this: app.use('/static/path', express.static('path')), I get the following error:
Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client
This happens because of the res.set for the server timing, but does that mean express.static calls .end() twice? When I console.log in my middleware function, it only gets called once.
I'm using NodeJS 10 and Express 4.16.4
Does anyone know how to solve this problem?
res.end is not called twice.
serve-static streams the file to the client, and when the first chunk of the file is written to the stream, the headers are sent. From the Node.js docs for response.writeHead:
If response.write() or response.end() are called before calling this, the implicit/mutable headers will be calculated and call this function.
So it is not possible to set headers after the stream has started sending data to the client. But it is possible to pass a setHeaders function in the options to serve-static.
express.static('./public', {
  setHeaders: (res, path, stat) => {
    // assumes the timing middleware stored the start time on res.locals
    const responseTime = new Date() - res.locals.startTime;
    res.set('Server-Timing', `total;dur=${responseTime}`);
  },
});
However, since the headers are sent at the start of the stream, this is not an accurate response time; it is more of a response time for just the headers.
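For completeness, a sketch of how the start time could reach that setHeaders callback: a tiny middleware (hypothetical, not part of the answer above) that stores it on res.locals, registered before express.static in the app's middleware chain.

```javascript
// Records the request start time so a later handler (such as the setHeaders
// option of express.static) can compute a duration from res.locals.startTime.
const timing = (req, res, next) => {
  res.locals = res.locals || {}; // Express normally provides this object
  res.locals.startTime = new Date();
  next();
};

// Usage (order matters -- timing must run before the static handler):
// app.use(timing);
// app.use('/static', express.static('./public', { setHeaders: ... }));
```

Because the header is computed before the first chunk is written, this still measures only time-to-first-byte, as the answer notes.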

How do I stream data to browsers with Hapi?

I'm trying to use streams to send data to the browser with Hapi, but can't figure out how. Specifically, I am using the request module. According to the docs, the reply object accepts a stream, so I have tried:
reply(request.get('https://google.com'));
That throws an error. The docs say the stream object must be compatible with streams2, so then I tried:
reply(streams2(request.get('https://google.com')));
Now that does not throw a server-side error, but in the browser the request never loads (using Chrome).
I then tried this:
var stream = request.get('https://google.com');
stream.on('data', data => console.log(data));
reply(streams2(stream));
And data was logged to the console, so I know the stream is not the issue, but rather Hapi. How can I get streaming in Hapi to work?
Try using Readable.wrap:
var Readable = require('stream').Readable;
...
function (request, reply) {
  var s = Request('http://www.google.com');
  reply(new Readable().wrap(s));
}
Tested using Node 0.10.x and hapi 8.x.x. In my code example Request is the node-request module and request is the incoming hapi request object.
UPDATE
Another possible solution would be to listen for the 'response' event from Request and then reply with the http.IncomingMessage which is a proper read stream.
function (request, reply) {
  Request('http://www.google.com')
    .on('response', function (response) {
      reply(response);
    });
}
This requires fewer steps and also allows the developer to attach user defined properties to the stream before transmission. This can be useful in setting status codes other than 200.
2020
I found it!! The problem was the gzip compression.
To disable it just for the event-stream response, you need to provide the following config to the Hapi server:
const server = Hapi.server({
  port: 3000,
  ...
  mime: {
    override: {
      'text/event-stream': {
        compressible: false
      }
    }
  }
});
In the handler I use axios because it supports streaming responses:
async function handler(req, h) {
  const response = await axios({
    url: `http://some/url`,
    headers: req.headers,
    responseType: 'stream'
  });

  return response.data.on('data', function (chunk) {
    console.log(chunk.toString());
  });

  /* Another option with h2o2, not fully checked */
  // return h.proxy({
  //   passThrough: true,
  //   localStatePassThrough: true,
  //   uri: `http://some/url`
  // });
}
