I have created an HTTP proxy in Node.js using the http-proxy package. Through this proxy I want to listen to incoming requests, encrypt the file contents, and forward them to the respective service. So I am using the proxyReq event to listen to requests coming into the proxy and trying to encrypt the file in the request.
const { createEncryptStream, setPassword } = require('aes-encrypt-stream');
const PassThroughStream = require('stream').PassThrough;
proxy.on('proxyReq', (proxyReq, pReq, pRes) => {
  setPassword(Buffer.from(pReq.encryptionKey, 'hex'));
  const stream = new PassThroughStream();
  createEncryptStream(pReq).pipe(stream);
});
I am not that good with streams, but how do I encrypt the files in the request and forward them? I tried various solutions that chunk the file contents in the case of multipart/form-data and forward them, but those use memory extensively. So I am looking for a solution where I can encrypt within the stream.
In terms of memory usage, the problem with the code you have shown here is that the stream will buffer all the incoming data.
Assuming that targetRequestStream is the upstream request and targetResponseStream is the upstream response, this will make sure that the data is encrypted and sent to the target server as it is read, and that the response is decrypted and sent to the client as it is received.
const { createEncryptStream, createDecryptStream, setPassword } = require('aes-encrypt-stream');

proxy.on('proxyReq', (proxyReq, pReq, pRes) => {
  setPassword(Buffer.from(pReq.encryptionKey, 'hex'));
  createEncryptStream(pReq).pipe(targetRequestStream);
  createDecryptStream(targetResponseStream).pipe(pRes);
});
Another way to prevent all the data from buffering is to add a data event listener.
const { createEncryptStream, createDecryptStream, setPassword } = require('aes-encrypt-stream');

proxy.on('proxyReq', (proxyReq, pReq, pRes) => {
  setPassword(Buffer.from(pReq.encryptionKey, 'hex'));
  createEncryptStream(pReq)
    .on('data', (data) => { /* handle encrypted data here */ });
  createDecryptStream(targetResponseStream)
    .on('data', (data) => { /* handle decrypted data here */ });
});
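If you would rather not depend on aes-encrypt-stream, here is a minimal sketch of the same idea using Node's built-in crypto module together with http-proxy's buffer option, which substitutes the body stream that gets forwarded. It assumes AES-256-CTR, a 32-byte hex key attached to the request, and that the target can recover the IV out of band; the target URL is a placeholder:

const crypto = require('crypto');

server.on('request', (req, res) => {
  const key = Buffer.from(req.encryptionKey, 'hex'); // 32 bytes for aes-256
  const iv = crypto.randomBytes(16);
  const cipher = crypto.createCipheriv('aes-256-ctr', key, iv);
  // `buffer` replaces the request body that http-proxy forwards, so the
  // target receives encrypted bytes chunk by chunk; nothing is held in memory.
  proxy.web(req, res, { target: 'http://localhost:9000', buffer: req.pipe(cipher) });
});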
I'm trying to pass a Node.js Stream API PassThrough object as the response to an HTTP request. I'm running an Express server and am doing something similar to the following:
const { PassThrough } = require('stream')

const createPassThrough = () => {
  return PassThrough()
}

app.get('/stream', (req, res) => {
  res.writeHead(200, {
    'Content-Type': 'audio/mpeg',
    'Transfer-Encoding': 'chunked'
  })
  res.write(createPassThrough())
})
However, when I try to do this, Express throws the following error:
The first argument must be of type string or an instance of Buffer. Received an instance of PassThrough
Is there any way to do this with Express, or am I going to need to use a different framework? I've read that Hapi.js is able to return a PassThrough object.
The stream write() operation is intended to write a chunk of data rather than a reference to a readable source.
It seems that passThrough.pipe(res) might be what the OP intended to achieve. This will propagate all data written to the PassThrough into the Express response.
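For example, a minimal sketch of that route; someAudioSource here is a hypothetical readable stream feeding the PassThrough:

const { PassThrough } = require('stream')

app.get('/stream', (req, res) => {
  res.writeHead(200, {
    'Content-Type': 'audio/mpeg',
    'Transfer-Encoding': 'chunked'
  })
  const passThrough = new PassThrough()
  // everything written to the PassThrough is forwarded to the client as it arrives
  passThrough.pipe(res)
  someAudioSource.pipe(passThrough) // hypothetical source of audio data
})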
I have made a simple server and client program where the server reads the data from a file and sends it to the client through a TCP socket, but the data I am getting is an object and not a simple string.
So why can't I see the data as plain text, as it is in my data.txt file?
An explanation with an example would be appreciated.
Here is my code :-
SERVER CODE
const fs = require('fs');
const net = require('net');

const readableData = fs.createReadStream('data.txt', 'utf8');

const server = net.createServer(socket => {
  socket.on('data', chunk => {
    console.log(chunk.toString());
    socket.write(JSON.stringify(readableData));
  });
  socket.on('end', () => {
    console.log("done");
  });
  socket.on('close', () => {
    console.log("closed");
  });
});

server.listen(3000);
CLIENT CODE
const fs = require('fs');
const net = require('net');

const client = new net.Socket();

client.connect('3000', () => {
  console.log("connected");
  client.write("Server please send the data");
});

client.on('data', chunk => {
  console.log("Data received:" + chunk.toString());
});

client.on('finish', () => {
  console.log("Work completed");
});

client.on('close', () => {
  console.log("connection closed");
});
And here is my data.txt file which has simple data
Hello client how are you ?
And the output I'm getting is here :-
Data received:{"_readableState":{"objectMode":false,"highWaterMark":65536,"buffer":{"head":{"data":"Hello client how are you ?","next":null},"tail":{"data":"Hello client how are you ?","next":null},"length":1},"length":26,"pipes":null,"pipesCount":0,"flowing":null,"ended":true,"endEmitted":false,"reading":false,"sync":false,"needReadable":false,"emittedReadable":false,"readableListening":false,"resumeScheduled":false,"paused":true,"emitClose":false,"autoDestroy":false,"destroyed":false,"defaultEncoding":"utf8","awaitDrain":0,"readingMore":false,"decoder":{"encoding":"utf8"},"encoding":"utf8"},"readable":true,"_events":{},"_eventsCount":1,"path":"data.txt","fd":35,"flags":"r","mode":438,"end":null,"autoClose":true,"bytesRead":26,"closed":false}
The question is why I can't see the data as plain text on the client side, as it is in the data.txt file.
Your variable readableData contains a node.js stream object. It's only of use in the current node.js instance, so it doesn't do anything useful to try to send that stream object to the client.
If you want to get all the data from that 'data.txt' file, you have a couple of choices:

1. You can just read the whole file into a local variable with fs.readFile() and then send all that data with socket.write().

2. You can create a new stream attached to the file for each new incoming request, and then as the data comes in on the readStream, send it out on the socket (this is often referred to as piping one stream into another); see the sketch after option #1's code below. If you use higher-level server constructs such as an http server, they make piping really easy.
Option #1 would look like this:
const server = net.createServer(socket => {
  socket.on('data', chunk => {
    console.log(chunk.toString());
    fs.readFile('data.txt', 'utf8', (err, data) => {
      if (err) {
        // insert error handling here
        console.log(err);
      } else {
        socket.write(data);
      }
    });
  });
  socket.on('end', () => {
    console.log("done");
  });
  socket.on('close', () => {
    console.log("closed");
  });
});
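And a minimal sketch of option #2, creating a fresh read stream per request and piping it into the socket (end: false keeps the socket open once the file has been sent):

const server = net.createServer(socket => {
  socket.on('data', chunk => {
    console.log(chunk.toString());
    const readStream = fs.createReadStream('data.txt', 'utf8');
    readStream.on('error', err => console.log(err));
    // send the file to the client as it is read from disk
    readStream.pipe(socket, { end: false });
  });
});

server.listen(3000);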
FYI, you should also know that socket.on('data', chunk => {...}) can give you any size chunk of data. TCP streams do not make any guarantees about delivering the exact same chunks of data in the same pieces that they were originally sent in. They will come in order, but if you sent three 1k chunks from the other end, they might arrive as three separate 1k chunks, they might arrive as one 3k chunk or they might arrive as a whole bunch of much smaller chunks. How they arrive will often depend upon what intermediate transports and routers they had to travel over and if there were any recoverable issues along that transmission. For example, data sent over a satellite internet connection will probably arrive in small chunks because the needs of the transport broke it up into smaller pieces.
This means that reading any data over a plain TCP connection generally needs some sort of protocol so that the reader knows when they've gotten a full, meaningful chunk that they can process. If the data is plain text, it might be as simple a protocol as every message ends with a line feed character. But, if the data is more complex, then the protocol may need to be more complex.
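For example, a sketch of the receiving side of that simplest protocol, where every message ends with a line feed; handleMessage is a hypothetical handler for one complete message:

let pending = '';

socket.on('data', chunk => {
  pending += chunk.toString();
  let idx;
  // process every complete, newline-terminated message; keep any partial remainder
  while ((idx = pending.indexOf('\n')) !== -1) {
    const message = pending.slice(0, idx);
    pending = pending.slice(idx + 1);
    handleMessage(message); // hypothetical
  }
});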
I'm trying to set up HTTP/2 for an Express app I've built. As I understand it, Express does not support the npm http2 module, so I'm using SPDY. Here's how I'm thinking of going about it; I'd appreciate advice from people who've implemented something similar.
1) Server setup: I want to wrap my existing app with SPDY to keep the existing routes. Options are just an object with a key and a cert for SSL.
const app = express();
...all existing Express stuff, followed by:
spdy
  .createServer(options, app)
  .listen(CONFIG.port, (error) => {
    if (error) {
      console.error(error);
      return process.exit(1);
    } else {
      console.log('Listening on port: ' + CONFIG.port + '.');
    }
  });
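For reference, the options object mentioned in step 1 might look something like this (the file paths are hypothetical):

const fs = require('fs');

const options = {
  key: fs.readFileSync('./certs/server.key'),   // SSL private key
  cert: fs.readFileSync('./certs/server.crt')   // SSL certificate
};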
2) At this point, I want to enhance some of my existing routes with a conditional PUSH response. I want to check whether there are any updates for the client making a request to the route (the client is called an endpoint, and the updates are an array of JSON objects called endpoint changes), and if so, push to the client.
My idea is that I will write a function which takes res as one of its parameters, save the endpoint changes as a file (I haven't found a way to push non-file data), add them to a push stream, and then delete the file. Is this the right approach? I also notice that there is a second parameter that the stream takes, which is a req/res object; am I formatting it properly here?
const checkUpdates = async (obj, res) => {
  if (res.push) {
    const endpointChanges = await updateEndpoint(obj).endpointChanges;
    if (endpointChanges) {
      const changePath = `../../cache/endpoint-updates${new Date().toISOString()}.json`;
      const savedChanges = await jsonfile(changePath, endpointChanges);
      if (savedChanges) {
        let stream = res.push(changePath, {
          req: {'accept': '**/*'},
          res: {'content-type': 'application/json'}
        });
        stream.on('error', function (err) {
          console.log(err);
        });
        stream.end();
        res.end();
        fs.unlinkSync(changePath);
      }
    }
  }
};
3) Then, within my routes, I want to call the checkUpdates method with the relevant parameters, like this:
router.get('/somePath', async (req, res) => {
  await checkUpdates({someInfo}, res);
  ReS(res, {
    message: 'keepalive succeeded'
  }, 200);
});
Is this the right way to implement HTTP2?
I am trying to implement the ._read function of a readable stream. A problem happens when ._read is called and there isn't any data yet: the documentation says that I can push('') until more data comes, and that I should only return false when the stream will never have more data.
https://nodejs.org/api/stream.html#stream_readable_read_size_1
But it also says that if I need to do that then something is terribly wrong with my design.
https://nodejs.org/api/stream.html#stream_stream_push
But I can't find an alternative to that.
code:
var http = require('http');
var https = require('https');
var Readable = require('stream').Readable;
var router = require('express').Router();

var buffer = [];

router.post('/', function(clientRequest, clientResponse) {
  var delayedMSStream = new Readable();

  delayedMSStream._read = function() {
    var a = buffer.shift();
    if (typeof a === 'undefined') {
      this.push('');
      return true;
    } else {
      this.push(a);
      if (a === null) {
        return false;
      }
      return true;
    }
  };

  // I need to get a url from example.com
  https.request({hostname: 'example.com'}, function(exampleResponse) {
    var data = '';
    exampleResponse.on('data', function(chunk) { data += chunk; });
    exampleResponse.on('end', function() {
      var MSRequestOptions = {hostname: data, method: 'POST'};
      var MSRequest = https.request(MSRequestOptions, function(MSResponse) {
        MSResponse.on('end', function() {
          console.log("MSResponse.on(end)");
        });
      }); // end MSRequest
      delayedMSStream.pipe(MSRequest);
    });
  }).end(); // send the request

  clientRequest.on('data', function(chunk) {
    buffer.push(chunk);
  });

  clientRequest.on('end', function() { // when done streaming audio
    buffer.push(null);
  });
}); // end router.post('/')
explanation:
The client sends a POST request streaming audio to my server; my server requests a URL from example.com; when example.com responds with the URL, my server streams the audio to it.
What's a smarter way to do it?
So if I understand the code correctly, you:
1. receive a request,
2. make your own request to a remote endpoint and fetch a URL,
3. make a new request to that URL and pipe that to the original response.
There are other ways to do this than yours, and even your way would look cleaner to me if you just improved the naming a bit. Also, splitting the huge request handler into a few functions with smaller responsibility scopes might help.
I would make the endpoint this way:
let https = require('https');
let router = require('express').Router();

/**
 * Gets some data from a remote host. Calls back when done.
 * We cannot pipe this directly into your stream chain as we need the complete data to get the end result.
 */
function getHostname(cb) {
  https.request({
    hostname: 'example.com'
  }, function(response) {
    let data = '';
    response.on('error', err => cb(err)); // shortened for brevity
    response.on('data', function(chunk) {
      data = data + chunk;
    });
    response.on('end', function() {
      // we're done here.
      cb(null, data.toString());
    });
  }).end(); // don't forget to actually send the request
}

router.post('/', function(request, response) {
  // first let's get that url.
  getHostname(function(err, hostname) {
    if (err) { return response.status(500).end(); }
    // now make that other request which we can stream.
    let upstream = https.request({
      hostname: hostname,
      method: 'POST'
    }, function(upstreamResponse) {
      upstreamResponse.pipe(response);
    });
    // stream the incoming audio straight to the new host;
    // pipe() also ends the upstream request when the client is done
    request.pipe(upstream);
  });
});
Now, as said in the comments, with streams2 you don't have to manage your streams. With node versions pre-0.10 you had to listen to 'read', 'data', etc. events; with newer node versions, it's handled for you. Furthermore, you don't even need that here: streams are smart enough to handle backpressure on their own.
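In other words, the manual buffer plus _read machinery from the question can collapse into a single pipe. A one-line sketch, assuming MSRequest is the upstream client request from the original code:

// pipe() pauses and resumes the source automatically,
// so backpressure is handled without a custom _read implementation
clientRequest.pipe(MSRequest);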
I'm trying to use streams to send data to the browser with Hapi, but can't figure out how. Specifically, I am using the request module. According to the docs the reply object accepts a stream, so I have tried:
reply(request.get('https://google.com'));
That throws an error. The docs say the stream object must be compatible with streams2, so then I tried:
reply(streams2(request.get('https://google.com')));
Now that does not throw a server-side error, but in the browser the request never loads (using Chrome).
I then tried this:
var stream = request.get('https://google.com');
stream.on('data', data => console.log(data));
reply(streams2(stream));
And the data was output to the console, so I know the stream is not the issue, but rather Hapi. How can I get streaming in Hapi to work?
Try using Readable.wrap:
var Readable = require('stream').Readable;
...
function (request, reply) {
  var s = Request('http://www.google.com');
  reply(new Readable().wrap(s));
}
Tested using Node 0.10.x and hapi 8.x.x. In my code example Request is the node-request module and request is the incoming hapi request object.
UPDATE
Another possible solution would be to listen for the 'response' event from Request and then reply with the http.IncomingMessage which is a proper read stream.
function (request, reply) {
  Request('http://www.google.com')
    .on('response', function (response) {
      reply(response);
    });
}
This requires fewer steps and also allows the developer to attach user defined properties to the stream before transmission. This can be useful in setting status codes other than 200.
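For instance, a sketch of that second approach which also propagates the upstream status code, using hapi's reply(...).code(...) chain rather than properties attached to the stream itself (assuming the same hapi 8 interface as above):

function (request, reply) {
  Request('http://www.google.com')
    .on('response', function (response) {
      // forward the upstream status instead of always replying 200
      reply(response).code(response.statusCode);
    });
}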
2020
I found it!! The problem was the gzip compression.
To disable it just for event-stream, you need to provide the following config to the Hapi server:
const server = Hapi.server({
  port: 3000,
  ...
  mime: {
    override: {
      'text/event-stream': {
        compressible: false
      }
    }
  }
});
In the handler I use axios because it supports the new streams2 protocol:
async function handler(req, h) {
  const response = await axios({
    url: `http://some/url`,
    headers: req.headers,
    responseType: 'stream'
  });

  return response.data.on('data', function (chunk) {
    console.log(chunk.toString());
  });

  /* Another option with h2o2, not fully checked */
  // return h.proxy({
  //   passThrough: true,
  //   localStatePassThrough: true,
  //   uri: `http://some/url`
  // });
}