How to make HTTP/2 request with body payload? - node.js

I am looking to open a HTTP/2 stream and use that stream to make multiple HTTP/2 POST requests. Each POST request will have its own body payload.
I currently have the code below, which works for requests that don't require a payload, but I'm not sure how to customize it for requests that do require a payload.
I've read RFC 7540 and nearly every related post on SO, but I'm still finding it difficult to write working HTTP/2 code that sends a payload body.
For example:
Is using stream.write the recommended way to send DATA frames, or should I be using a built-in function provided by http2?
Do I pass arguments in plaintext, and the http2 protocol takes care of the binary encoding, or do I encode it myself?
How should I modify the code below to send a payload body?
const http2 = require('http2')
const connection = http2.connect('https://www.example.com:443')
const stream = connection.request({
  ':authority': 'www.example.com',
  ':scheme': 'https',
  ':method': 'POST',
  ':path': '/custom/path',
}, { endStream: false })
stream.setEncoding('utf8')
stream.on('response', (headers) => {
  console.log('RESPONSE', headers)
  stream.on('data', (data) => console.log('DATA', data))
  stream.on('end', () => console.log('END'))
})
stream.write(Buffer.from('POST-request-payload-body-here?'))

The first thing you need to do is convert the body data into a buffer:
const buffer = Buffer.from(JSON.stringify(body));
Then you need to update the headers passed to connection.request with Content-Type and Content-Length keys. Note that Content-Length is the length of the buffer.
const stream = connection.request({
  ':authority': 'www.example.com',
  ':scheme': 'https',
  ':method': 'POST',
  ':path': '/custom/path',
  'Content-Type': 'application/json',
  'Content-Length': buffer.length
}, { endStream: false })
Finally, send the body by converting the body data into a string and ending the stream:
stream.end(JSON.stringify(body));
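Putting that together, here is a minimal sketch of the full flow (the host, path, and JSON payload are placeholders, not taken from the original question):
const http2 = require('http2')
const client = http2.connect('https://www.example.com:443')
const body = { hello: 'world' } // hypothetical payload
const buffer = Buffer.from(JSON.stringify(body))
const req = client.request({
  ':method': 'POST',
  ':path': '/custom/path',
  'content-type': 'application/json',
  'content-length': buffer.length,
})
req.setEncoding('utf8')
req.on('response', (headers) => console.log('RESPONSE', headers))
req.on('data', (chunk) => console.log('DATA', chunk))
req.on('end', () => client.close())
// Writing the buffer and calling end() produces the DATA frames;
// the http2 module takes care of the binary framing for you.
req.end(buffer)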

It works in my project:
const body = { /* any JSON-serialisable payload */ }
const req = client.request({
  ':path': '/',
  ':method': 'POST',
  'content-type': 'application/json', // tell the server the body is JSON
});
req.write(JSON.stringify(body), 'utf8'); // send the body
req.end(); // finish the stream so the server can respond

Related

What is the Axios equivalent of req.pipe(request()) / Pipe express request data into Axios request

Using request, you can forward a POST multipart/form-data request from Express to another server without modifying the body or parameters of the initial request, and then return the response from the other server back to Express. With axios this capability appears to be missing.
CODE:
Common setup:
const request = require('request');
const axios = require('axios');
const express = require('express');
const app = express();
app.listen(3000);
const FORWARD_URL = 'https://example.com/'
Working example using request
app.post('/test/0', (req, res) => {
  req.pipe(request(FORWARD_URL)).pipe(res);
})
Attempt #1
app.post('/test/1', (req, res) => {
  req.pipe(axios.post(FORWARD_URL)).pipe(res);
})
// internal/streams/readable.js:827
// dests[i].emit('unpipe', this, { hasUnpiped: false });
// ^
// TypeError: dests[i].emit is not a function
// at IncomingMessage.Readable.unpipe (internal/streams/readable.js:827:16)
// at unpipe (S:\_Work\[REDACTED]\node_modules\unpipe\index.js:47:12)
// at send (S:\_Work\[REDACTED]\node_modules\finalhandler\index.js:306:3)
// at Immediate.<anonymous> (S:\_Work\[REDACTED]\node_modules\finalhandler\index.js:133:5)
// at Immediate.<anonymous> (S:\_Work\[REDACTED]\node_modules\express\lib\router\index.js:635:15)
// at processImmediate (internal/timers.js:466:21)
Attempt #2
app.post('/test/2', (req, res) => {
  req.pipe(axios({
    url: FORWARD_URL,
    method: 'POST',
    responseType: 'stream'
  })).pipe(res);
})
// SAME ERROR AS ABOVE
Attempt #3
app.post('/test/3', async (req, res) => {
  const axiosRequest = await axios({
    url: FORWARD_URL,
    method: 'POST',
    responseType: 'stream',
    data: req
  })
  axiosRequest.data.pipe(res);
})
// server at FORWARD_URL receives an improperly formatted request body; changing the request content-type headers has no effect
// ------WebKitFormBoundaryc4BjPwpdR4mG7CFN
// Content-Disposition: form-data; name="field_name"
//
// field_value
// ------WebKitFormBoundaryc4BjPwpdR4mG7CFN--
A similar issue has been addressed here; the accepted answer, while not very clear, does answer the question. However, it only covers GET and POST application/x-www-form-urlencoded requests, whereas this question is about POST multipart/form-data requests.
Ideally I'm looking for an axios solution that functions identically to the request example. That approach works great for my use case, which includes file uploads; because of this, I want to avoid parsing the body and instead just forward it on to the next server.
Testing of the above routes was performed with Postman.
Changing the Axios request headers to the equivalent ones obtained from the original request resolved the error for me.
Code:
const axiosRequest = await axios({
  url: FORWARD_URL,
  method: 'POST',
  responseType: 'stream',
  data: req
})
I just copied the original request headers onto the new request:
const axiosRequest = await axios({
  url: FORWARD_URL,
  method: 'POST',
  responseType: 'stream',
  data: req,
  headers: {
    ...req.headers
  }
})
The following code worked for me for POSTing streams with content type multipart/form-data:
app.post('/test/0', async (req, res) => {
  const response = await axios.post(FORWARD_URL, req, {
    headers: {
      'content-type': req.headers['content-type'],
    },
  });
  return res.send(response.data);
})
In my case I did not need to set the responseType option to stream. This might depend on the response you are getting from the API.
What is important is:
- posting the entire request object (not just the body!)
- setting the content-type header of the outgoing request to the same value as the incoming request
Passing just the body of the incoming request to the outgoing request will result in the API (the reader of the stream) endpoint receiving an empty object (which it will initialise with default values).
Passing the body of the incoming request and setting the content-type header will result in the API endpoint receiving a null body.
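For completeness, here is a minimal sketch of a forwarding route under the same setup as above that also relays the upstream status back to the client (the '/forward' route is a placeholder, not from the original question):
app.post('/forward', async (req, res) => {
  // Pipe the raw incoming request as the outgoing body and preserve the
  // incoming content-type (it carries the multipart boundary).
  const upstream = await axios.post(FORWARD_URL, req, {
    headers: { 'content-type': req.headers['content-type'] },
    responseType: 'stream',
  });
  // Relay the upstream status and stream the upstream body back.
  res.status(upstream.status);
  upstream.data.pipe(res);
})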

Stream binary file with http post

I'm using the request library to send a binary (pdf) file in the body of the request using HTTP POST (NOTE: this API does not accept multipart forms). However, I have only been able to get it to work using fs.readFileSync(). For some reason, when I try to use fs.createReadStream() the pdf file is still sent, but it is empty, and the request never finishes (I never get a response back from the server).
Here is my working version using fs.readFileSync():
const request = require('request');
const fs = require('fs');
const filename = 'test.pdf';
request({
  url: 'http://localhost:8083/api/v1/endpoint',
  method: 'POST',
  headers: {
    'Content-Type': 'application/octet-stream',
    'Accept': 'application/vnd.api+json',
    'Content-Disposition': `file; filename="${filename}"`
  },
  encoding: null,
  body: fs.readFileSync(filename)
}, (error, response, body) => {
  if (error) {
    console.log('error:', error);
  } else {
    console.log(JSON.parse(response.body.toString()));
  }
});
If I try to replace the body with the below, it doesn't work:
body: fs.createReadStream(filename)
I have also tried piping the http request on to the stream, like it says in the request library docs, but I get the same result:
fs.createReadStream(filename).pipe(request({...}))
I've tried to monitor the stream by doing the following:
var upload = fs.createReadStream('test.pdf');
upload.pipe(req);
var upload_progress = 0;
upload.on("data", function (chunk) {
upload_progress += chunk.length
console.log(new Date(), upload_progress);
})
upload.on("end", function (res) {
console.log('Finished');
req.end();
})
I see progress for the stream and Finished, but still no response is returned from the API.
I'd prefer to use a read stream because it handles larger files better, but I'm clueless as to what is going wrong. I am also making sure I'm not altering the file with any special encoding.
Is there some way to get some kind of output to see what process is taking forever?
UPDATE:
I decided to test with a simple 1 KB .txt file. I found that the body is still empty when using fs.createReadStream(); however, this time I got a response back from the server. The test PDF I'm working with is 363 KB, which isn't outrageous in size, but still... weren't streams made for large files anyway? Using fs.readFileSync() also worked fine for the text file.
I'm beginning to wonder if this is a synchronous vs. asynchronous issue. I know that fs.readFileSync() is synchronous. Do I need to wait until fs.createReadStream() finishes before trying to append it to the body?
I was able to get this working by doing the following:
const request = require('request');
const fs = require('fs');
const filename = 'test.pdf';
const readStream = fs.createReadStream(filename);
let chunks = [];
readStream.on('data', (chunk) => chunks.push(chunk));
readStream.on('end', () => {
  const data = Buffer.concat(chunks);
  request({
    url: 'http://localhost:8083/api/v1/endpoint',
    method: 'POST',
    headers: {
      'Content-Type': 'application/octet-stream',
      'Accept': 'application/vnd.api+json',
      'Content-Disposition': `file; filename="${filename}"`
    },
    encoding: null,
    body: data
  }, (error, response, body) => {
    if (error) {
      console.log('error:', error);
    } else {
      console.log(JSON.parse(response.body.toString()));
    }
  });
});
I collected the chunks and concatenated them into a single Buffer before making the request.
I noticed in the documentation it said this:
The Buffer class was introduced as part of the Node.js API to enable interaction with octet streams in TCP streams, file system operations, and other contexts.
The API I'm calling requires the application/octet-stream header, so I need to use the buffer rather than streaming it directly.
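As a side note, if the stream events aren't needed, the same buffer-then-POST approach can be written more compactly with fs.promises.readFile. This is just a sketch under the same endpoint and header assumptions as above:
const request = require('request');
const fs = require('fs');
const filename = 'test.pdf';
fs.promises.readFile(filename).then((data) => {
  request({
    url: 'http://localhost:8083/api/v1/endpoint',
    method: 'POST',
    headers: {
      'Content-Type': 'application/octet-stream',
      'Accept': 'application/vnd.api+json',
      'Content-Disposition': `file; filename="${filename}"`
    },
    encoding: null,
    body: data // whole file as a single Buffer
  }, (error, response) => {
    if (error) return console.log('error:', error);
    console.log(JSON.parse(response.body.toString()));
  });
});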

How to consume multiple JSONs from a HTTPs keep-alive request?

I have a node web server which listens on
this.app.use('/register-receiver', (req, res) => {
  const receiverId = req.query.id;
  res.status(200).set({
    connection: 'keep-alive',
    'Cache-Control': 'no-cache',
    'Content-Type': 'application/json',
  });
  this.receivers[receiverId] = res;
});
It periodically sends a JSON payload to clients connected to /register-receiver with
this.receivers[receiverId].write(JSON.stringify({
  // some big json
}))
I have another node program which acts as a client that makes a connection to the server on startup.
const options = {
  agent: false,
  host: <my host>,
  defaultPort: <my port>,
  path: `/register-receiver?id=${activeConfig.receiver.id}`,
  headers: {
    connection: 'keep-alive',
    'Content-Type': 'application/json',
  }
};
http.get(options, res => {
  res.setEncoding('utf8');
  res.on('data', data => {
    try {
      const response = JSON.parse(data);
      // do stuff with response
    } catch (e) {
      console.log(`error ${e} with ${data}`);
    }
  });
  res.on('end', () => console.log(`connection finished`));
})
The server needs to periodically send JSON payloads to the client. The client should receive these JSONs and do something with them. However, the problem is that large JSON writes are chunked, so the client receives the JSON in pieces. This breaks JSON.parse(data), and the client no longer knows how to process the server payloads. I can't rely on res.on('end') to detect the completion of a chunked write because this is a keep-alive request that should stay open forever.
I want to avoid designing my own protocol for combining JSON chunks because of its complexity. It's not as simple as concatenating the strings, since I could end up with interleaved JSON chunks if the server sends two large JSON payloads at the same time.
Is it possible to force the server to write the entire JSON as 1 chunk in the stream? How can I setup my client such that it establishes a "forever" connection with my server and listens for complete JSON payloads?
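The thread does not include an answer, but one common framing approach is newline-delimited JSON: the server writes each payload with a single res.write call and appends a newline, and the client buffers incoming data and only parses complete lines. This is a sketch of that idea, not taken from the original thread, and it assumes each payload is written with one write call:
// Server side: one complete JSON document per line.
this.receivers[receiverId].write(JSON.stringify(payload) + '\n');

// Client side: buffer chunks and parse complete lines only.
let buffered = '';
res.on('data', chunk => {
  buffered += chunk;
  let newline;
  while ((newline = buffered.indexOf('\n')) !== -1) {
    const line = buffered.slice(0, newline);
    buffered = buffered.slice(newline + 1);
    if (line.trim()) {
      const message = JSON.parse(line);
      // do stuff with message
    }
  }
});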

Making a POST request using puppeteer with JSON payload

I'm trying to make a POST request using puppeteer and send a JSON object in the request; however, I'm getting a timeout. If I try to send normal URL-encoded form data, I at least get a reply from the server saying the request is invalid.
Here is the relevant part of the code:
await page.setRequestInterception(true);
const request = {"mac": macAddress, "cmd": "block"};
page.on('request', interceptedRequest => {
  var data = {
    'method': 'POST',
    'postData': request
  };
  interceptedRequest.continue(data);
});
const response = await page.goto(configuration.commandUrl);
let responseBody = await response.text();
I'm using the same code to make a GET request (without payload) and it's working.
postData needs to be encoded as form data (in the format key1=value1&key2=value2).
You can create the string on your own or use the built-in querystring module:
const querystring = require('querystring');
// ...
var data = {
  'method': 'POST',
  'postData': querystring.stringify(request)
};
In case you need to submit JSON data:
'postData': JSON.stringify(request)
If you are sending JSON, you need to add the "content-type": "application/json" header. If you don't send it, you may receive an empty response.
var data = {
  method: 'POST',
  postData: '{"test":"test_data"}',
  headers: { ...interceptedRequest.headers(), "content-type": "application/json" }
};
interceptedRequest.continue(data);
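For reference, here is a minimal sketch that combines the two answers above into a complete script (the command URL and payload are placeholders, not from the original question):
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  const payload = { mac: '00:11:22:33:44:55', cmd: 'block' }; // hypothetical payload

  await page.setRequestInterception(true);
  // Every intercepted request is rewritten as a JSON POST; this is fine for a
  // bare API endpoint that loads no subresources.
  page.on('request', interceptedRequest => {
    interceptedRequest.continue({
      method: 'POST',
      postData: JSON.stringify(payload),
      headers: {
        ...interceptedRequest.headers(),
        'content-type': 'application/json',
      },
    });
  });

  const response = await page.goto('https://example.com/command'); // hypothetical URL
  console.log(await response.text());
  await browser.close();
})();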

How to convert gzip stream into a readable content and pipe it out in the request?

I am trying to make a proxy call to a website (somesite.com) and get the HTML from it. The response from somesite.com is chunked and gzipped, so I was unable to parse the buffer (responseFromServer in my code) into HTML (currently I get a bunch of jumbled text when I do res.write).
I tried res.end and res.send but neither of them works.
function renderProxyRequest(req, res) {
  // somesite.com is gzipped and also is chunked.
  var options = {
    protocol: 'http:',
    hostname: 'somesite.com',
    // passing in my current headers
    headers: req.headers,
    maxRedirects: 0,
    path: req.url,
    socketTimeout: 200000,
    connectTimeout: 1800,
    method: 'GET'
  }
  var proxyrequest = someProxyApi.request(options);
  proxyrequest.on('response', function (postresponse) {
    // postresponse is a buffer
    //postresponse.pipe(res);
    var responseFromServer = ''
    postresponse.on('data', function (data) {
      responseFromServer += data;
    });
    postresponse.on('end', function () {
      // getting some jumbled string onto the browser.
      res.write(responseFromServer);
      res.end();
    })
  });
  req.pipe(proxyrequest);
}
If postresponse is a stream, you can probably do something like this:
const zlib = require('zlib');
...
postresponse.pipe(zlib.createGunzip()).pipe(res);
You have to check if the response is gzipped to begin with, by checking the Content-Encoding header from the remote server.
Alternatively, if you pass the original headers from the remote server to the client you're proxying the request for, you should be able to just pass the response data along as-is (because the original headers will tell the client that the data was gzipped). This obviously depends on the client being able to handle compressed responses (browsers will).
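Here is a sketch of that check, assuming postresponse exposes the remote response headers on a headers object the way Node's http responses do:
const zlib = require('zlib');
proxyrequest.on('response', function (postresponse) {
  // Only gunzip when the remote server actually compressed the body.
  var encoding = (postresponse.headers['content-encoding'] || '').toLowerCase();
  if (encoding === 'gzip') {
    postresponse.pipe(zlib.createGunzip()).pipe(res);
  } else {
    postresponse.pipe(res);
  }
});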
