I am trying to act as a proxy between a client and an IP camera using a Node.js server. When I request a real-time stream from the camera, it responds with
HTTP/1.0 200 OK
Content-Type: Application/octet-stream
followed by a continuous stream of data. If I open the camera stream in Chrome it initiates a never-ending download, and curling it likewise produces a continuous response.
Node appears to buffer the response from the camera and run each buffer through its HTTP parser. This works fine the first time, since the buffer begins with valid headers, but on the second buffer of data it errors with
HPE_INVALID_HEADER_TOKEN
Can someone please help explain why this is happening? It's a continuous stream of data, so why is it trying to parse HTTP headers again on the second buffer? I am not sure whether there is an option I am missing or whether my camera is not following the HTTP specification properly.
Edit: Example Code
const http = require('http');

const options = {
  family: 4,
  headers: {
    Authorization: 'Basic ' + base64EncodedAuth,
  },
  host: '192.168.1.131',
  method: 'GET',
  path: '/cgi-bin/realmonitor.cgi?action=getStream&channel=1&subtype=0',
  port: 80,
  protocol: 'http:',
};
const req = http.request(options, (res) => {
  console.log(`STATUS: ${res.statusCode}`);
  console.log(`HEADERS: ${JSON.stringify(res.headers)}`);
  res.on('data', (chunk) => {
    console.log(`BODY: ${chunk}`);
  });
  res.on('end', () => {
    console.log('No more data in response.');
  });
});
req.on('error', (e) => {
  console.log(`problem with request: ${e.message}`);
});
req.end();
The only callback that is hit is the 'error' one.
I further examined the curl log from the camera and noticed that everything was being marked as:
<= Recv header
The camera never sends the empty line (a bare CRLF) that the HTTP specification requires to signal the end of the headers. That is why the parser kept trying to interpret the stream as headers and, quite rightly, threw an error.
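Since the camera's response can't be made spec-compliant, one workaround is to bypass Node's HTTP parser entirely and read the stream over a raw TCP socket. A minimal sketch with the net module, reusing the host, path, and base64EncodedAuth from the code above:

const net = require('net');

const socket = net.connect(80, '192.168.1.131', () => {
  // Write the HTTP request by hand; the camera's reply arrives as raw
  // bytes and never touches Node's HTTP parser.
  socket.write(
    'GET /cgi-bin/realmonitor.cgi?action=getStream&channel=1&subtype=0 HTTP/1.0\r\n' +
    'Authorization: Basic ' + base64EncodedAuth + '\r\n' +
    '\r\n'
  );
});

socket.on('data', (chunk) => {
  // The first chunk(s) contain the status line and the never-terminated
  // header block; skip past them before forwarding stream data to clients.
  console.log(`received ${chunk.length} bytes`);
});

From there the raw bytes can be forwarded to proxy clients without Node ever attempting to parse them.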
Related
I'm using the request library to send a binary (PDF) file in the body of a request using HTTP POST (note: this API does not accept multipart forms). However, I have only been able to get it to work using fs.readFileSync(). For some reason, when I try to use fs.createReadStream(), the PDF file is still sent, but it is empty, and the request never finishes (I never get a response back from the server).
Here is my working version using fs.readFileSync():
const request = require('request');
const fs = require('fs');

const filename = 'test.pdf';

request({
  url: 'http://localhost:8083/api/v1/endpoint',
  method: 'POST',
  headers: {
    'Content-Type': 'application/octet-stream',
    'Accept': 'application/vnd.api+json',
    'Content-Disposition': `file; filename="${filename}"`
  },
  encoding: null,
  body: fs.readFileSync(filename)
}, (error, response, body) => {
  if (error) {
    console.log('error:', error);
  } else {
    console.log(JSON.parse(response.body.toString()));
  }
});
If I try to replace the body with the below, it doesn't work:
body: fs.createReadStream(filename)
I have also tried piping the read stream into the request, as the request library docs describe, but I get the same result:
fs.createReadStream(filename).pipe(request({...}))
I've tried to monitor the stream by doing the following:
const req = request({...}); // the same request options as shown above

const upload = fs.createReadStream('test.pdf');
upload.pipe(req);

let upload_progress = 0;
upload.on("data", function (chunk) {
  upload_progress += chunk.length;
  console.log(new Date(), upload_progress);
});
upload.on("end", function (res) {
  console.log('Finished');
  req.end();
});
I see progress for the stream and Finished, but still no response is returned from the API.
I'd prefer to use a read stream because streams handle large files better, but I am clueless as to what is going wrong. I have also made sure I'm not altering the file with any special encoding.
Is there some way to get some kind of output to see what process is taking forever?
UPDATE:
I decided to test with a simple 1 KB .txt file. I found that the upload is still empty when using fs.createReadStream(); however, this time I got a response back from the server. The test PDF I'm working with is 363 KB, which isn't outrageous in size, but still... weren't streams made for large files anyway? Using fs.readFileSync() also worked fine for the text file.
I'm beginning to wonder if this is a synchronous vs. asynchronous issue. I know that fs.readFileSync() is synchronous. Do I need to wait until fs.createReadStream() finishes before trying to attach it to the body?
I was able to get this working by doing the following:
const request = require('request');
const fs = require('fs');

const filename = 'test.pdf';
const readStream = fs.createReadStream(filename);
let chunks = [];

readStream.on('data', (chunk) => chunks.push(chunk));
readStream.on('end', () => {
  const data = Buffer.concat(chunks);
  request({
    url: 'http://localhost:8083/api/v1/endpoint',
    method: 'POST',
    headers: {
      'Content-Type': 'application/octet-stream',
      'Accept': 'application/vnd.api+json',
      'Content-Disposition': `file; filename="${filename}"`
    },
    encoding: null,
    body: data
  }, (error, response, body) => {
    if (error) {
      console.log('error:', error);
    } else {
      console.log(JSON.parse(response.body.toString()));
    }
  });
});
I collected the chunks and concatenated them into a single Buffer before making the request.
I noticed in the documentation it said this:
The Buffer class was introduced as part of the Node.js API to enable interaction with octet streams in TCP streams, file system operations, and other contexts.
The API I'm calling requires the application/octet-stream header, so I need to use the buffer rather than streaming it directly.
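For what it's worth, one possible explanation for the stalled stream upload: when the body is a stream, request falls back to Transfer-Encoding: chunked, and some servers reject or mishandle chunked uploads. A sketch of a workaround under that assumption, supplying an explicit Content-Length from the file size so the upload is not chunked:

const request = require('request');
const fs = require('fs');

const filename = 'test.pdf';
const { size } = fs.statSync(filename); // byte length of the file on disk

fs.createReadStream(filename).pipe(request({
  url: 'http://localhost:8083/api/v1/endpoint',
  method: 'POST',
  headers: {
    'Content-Type': 'application/octet-stream',
    'Accept': 'application/vnd.api+json',
    'Content-Disposition': `file; filename="${filename}"`,
    'Content-Length': size // an explicit length avoids chunked encoding
  },
  encoding: null
}, (error, response, body) => {
  if (error) {
    console.log('error:', error);
  } else {
    console.log(JSON.parse(response.body.toString()));
  }
}));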
I have a Node web server which listens on
this.app.use('/register-receiver', (req, res) => {
  const receiverId = req.query.id;
  res.status(200).set({
    connection: 'keep-alive',
    'Cache-Control': 'no-cache',
    'Content-Type': 'application/json',
  });
  this.receivers[receiverId] = res;
});
It periodically sends a JSON payload to clients connected to /register-receiver with
this.receivers[receiverId].write(JSON.stringify({
  // some big json
}))
I have another node program which acts as a client that makes a connection to the server on startup.
const options = {
  agent: false,
  host: <my host>,
  defaultPort: <my port>,
  path: `/register-receiver?id=${activeConfig.receiver.id}`,
  headers: {
    connection: 'keep-alive',
    'Content-Type': 'application/json',
  }
};
http.get(options, res => {
  res.setEncoding('utf8');
  res.on('data', data => {
    try {
      const response = JSON.parse(data);
      // do stuff with response
    } catch (e) {
      console.log(`error ${e} with ${data}`);
    }
  });
  res.on('end', () => console.log(`connection finished`));
});
The server needs to periodically send JSON payloads to the client, and the client should receive these payloads and do something with them. However, the problem is that large JSON writes are chunked, so the client receives the JSON in pieces. This breaks JSON.parse(data), and the client no longer knows how to process the server's payloads. I can't rely on res.on('end') to detect the completion of a chunked write, because this is a keep-alive request that should stay open forever.
I want to avoid designing my own protocol for combining JSON chunks because of its complexity. It's not as simple as concatenating the strings, since I could end up with interleaved JSON chunks if the server sends two large JSON payloads at the same time.
Is it possible to force the server to write the entire JSON as 1 chunk in the stream? How can I setup my client such that it establishes a "forever" connection with my server and listens for complete JSON payloads?
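A minimal sketch of the simplest common framing, newline-delimited JSON, assuming each payload is written with a single write() call (Node keeps individual writes ordered, so two payloads will not interleave). payload below stands for the big JSON object from earlier:

// Server: frame each payload with a trailing newline.
this.receivers[receiverId].write(JSON.stringify(payload) + '\n');

// Client: accumulate chunks and parse one message per newline.
let buffer = '';
res.on('data', data => {
  buffer += data;
  let index;
  while ((index = buffer.indexOf('\n')) !== -1) {
    const line = buffer.slice(0, index);
    buffer = buffer.slice(index + 1);
    if (line.trim()) {
      const response = JSON.parse(line); // a complete JSON payload
      // do stuff with response
    }
  }
});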
I am trying to set up an Express.js server backed by a MongoDB database. Everything is pretty standard: I have some routes open that take in data from the client and store it in the database.
Here is my query string:
let url = "http://xxx.xxx.xx.xxx:3000/update/data=" + JSON.stringify(params);
What I have noticed is that if params doesn't contain much information, it works fine. However, if params contains a lot of information, then the client throws this error:
Failed to load resource: The network connection was lost.
Http failure response for (unknown url): 0 Unknown Error
(This same error is happening in both Safari and Chrome.)
For example, if params is as below:
{
  "accountId": "12345678910",
  "data": [
    1, 2, 3, 4
  ]
}
then there is no issue. However, if params.data is a huge array with a ton of information in it instead of just [1, 2, 3, 4], then the error is thrown.
Also, my Express server never even seems to receive the request. No logs; nothing. What I would expect is just a normal response and result; instead, it seems like the client just gives up on sending anything large. Perhaps it has something to do with sending it as one big string?
You are putting your data in the URL, but URLs have a limited length.
You need to use POST and put the data in the HTTP request body instead.
You haven't shown us how you use that URL, so it's hard to suggest exact changes to your code, but http.request() is the way to go. Something like this might work...
const http = require('http');

const payload = JSON.stringify(params);
const url = 'http://xxx.xxx.xx.xxx:3000/update/';
const options = {
  method: 'POST', // <--- tell it to POST
  headers: {
    'Content-Type': 'application/json', // <--- tell it you're posting JSON
    'Content-Length': Buffer.byteLength(payload) // <--- byte length of the body, not character count
  }
};
const req = http.request(url, options, (res) => {
  /* handle stuff coming back from the request here */
  console.log(`STATUS: ${res.statusCode}`);
  console.log(`HEADERS: ${JSON.stringify(res.headers)}`);
  res.setEncoding('utf8');
  let chunks = [];
  res.on('data', (chunk) => {
    chunks.push(chunk);
    console.log(`BODY: ${chunk}`);
  });
  res.on('end', () => {
    const resultingData = chunks.join(''); // join with no separator
    console.log('No more data in response.');
  });
});
req.on('error', (e) => {
  console.error(`problem with request: ${e.message}`);
});
// write data to request body
req.write(payload);
req.end();
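On the server side (a sketch, assuming an Express route, since the original handler isn't shown), the JSON body can be read with Express's built-in parser; note that its default 100kb limit would also reject very large payloads:

const express = require('express');
const app = express();

// Raise the body-size limit; the default of 100kb would reject big arrays.
app.use(express.json({ limit: '10mb' }));

app.post('/update', (req, res) => {
  const params = req.body; // { accountId: '...', data: [...] }
  // ...store params in the database here...
  res.json({ ok: true });
});

app.listen(3000);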
I am using Node.js with the Express module to write an API wrapper that forwards requests to another server. I can successfully forward a request to the target server, and I receive a valid response containing a JSON payload. However, after the initial request, any subsequent request produces the following error.
Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client
This is a sample of the code I wrote for the HTTP GET Express route:
app.get('/companyrecords/:name', function(req, res) {
  const options = {
    protocol: 'http:',
    hostname: 'myhost',
    port: 5001,
    method: 'GET',
    path: '/path/to/resource/?name=name',
    auth: 'username:password',
    headers: {
      Connection: 'close'
    }
  };
  const myAppReq = http.request(options, (myAppRes) => {
    console.log(`STATUS: ${myAppRes.statusCode}`);
    myAppRes.on('data', (chunk) => {
      console.log(`BODY: ${chunk}`);
      res.send(chunk);
    });
    myAppRes.on('end', () => {
      res.end('No more data to send.');
    });
  });
  myAppReq.on('error', (err) => {
    console.error(`Problem with request: ${err.message}`);
  });
  myAppReq.write('');
  myAppReq.end();
});
I'm not sure why I am getting this error, since I am calling the req.write() method so that the request's headers are sent. Looking at the stack trace, the error occurs when I call res.send() inside the callback for the 'data' event. Perhaps I'm not understanding the flow of execution within the request, or the sequence in which the events are emitted. Any guidance would be greatly appreciated.
You shouldn't send the response inside the data event callback, because res.send() ends the response as soon as the first chunk of data arrives; every later chunk then tries to set headers on an already-sent response. Instead, write each chunk to the response stream and end the response inside the end event callback:
const myAppReq = http.request(options, (myAppRes) => {
  myAppRes.on('data', (chunk) => {
    res.write(chunk);
  });
  myAppRes.on('end', () => {
    res.end();
  });
});
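Since both myAppRes and res are streams, an equivalent and arguably more idiomatic version is to pipe the upstream response straight into the Express response, assuming the body doesn't need rewriting along the way:

const myAppReq = http.request(options, (myAppRes) => {
  res.status(myAppRes.statusCode);
  myAppRes.pipe(res); // forwards every chunk and ends res when the upstream ends
});
myAppReq.end();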
I am a beginner with Node.js. As per my project requirements, I am trying to call a REST service from Node.js; I learned how to make the call from this SO question. Here is the code that makes the REST call:
var http = require('http');

var options = {
  host: url,
  port: 80,
  path: '/resource?id=foo&bar=baz',
  method: 'POST'
};
http.request(options, function(res) {
  console.log('STATUS: ' + res.statusCode);
  console.log('HEADERS: ' + JSON.stringify(res.headers));
  res.setEncoding('utf8');
  res.on('data', function (chunk) {
    console.log('BODY: ' + chunk); // I want to send this 'chunk' as a response to the browser
  });
}).end();
The problem is that I want to send chunk as a response to the browser. I tried res.write(), but it throws the error "write method not found". I looked everywhere in the docs, but all they show is console.log. Can anyone tell me how I can send that data as a response to the browser?
The res passed to the http.request callback is an instance of IncomingMessage, which is a Readable stream and has no write method. When making an HTTP request with http.request, you cannot send a response: HTTP is a request-response protocol, and here you are the one making the request.
how can I send that data as a response to a browser?
For a browser to be able to get response, it must make a request first. You'll have to have a server running which calls the REST service when it receives a request.
http.createServer(function(req, res) {
  // getDataFromREST stands in for your call to the REST service;
  // the browser's response is written once the REST data arrives.
  getDataFromREST(function(data) {
    res.write(data);
    res.end();
  });
}).listen(8080);
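A fuller sketch of the same idea, wiring in the REST call from the question (the host here is a hypothetical placeholder; the path and method are taken from the original options):

var http = require('http');

http.createServer(function(req, res) {
  var options = {
    host: 'api.example.com', // hypothetical host; substitute the real REST service
    port: 80,
    path: '/resource?id=foo&bar=baz',
    method: 'POST'
  };
  var restReq = http.request(options, function(restRes) {
    res.writeHead(restRes.statusCode, { 'Content-Type': 'application/json' });
    restRes.pipe(res); // stream the REST body straight to the browser
  });
  restReq.on('error', function(e) {
    res.statusCode = 502;
    res.end('Upstream error: ' + e.message);
  });
  restReq.end();
}).listen(8080);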