I'm using the request library to send a binary (PDF) file in the body of an HTTP POST request (NOTE: this API does not accept multipart forms). However, I have only been able to get it to work using fs.readFileSync(). For some reason, when I try to use fs.createReadStream(), the PDF file is still sent, but it is empty, and the request never finishes (I never get a response back from the server).
Here is my working version using fs.readFileSync():
const request = require('request');
const fs = require('fs');
const filename = 'test.pdf';
request({
    url: 'http://localhost:8083/api/v1/endpoint',
    method: 'POST',
    headers: {
        'Content-Type': 'application/octet-stream',
        'Accept': 'application/vnd.api+json',
        'Content-Disposition': `file; filename="${filename}"`
    },
    encoding: null,
    body: fs.readFileSync(filename)
}, (error, response, body) => {
    if (error) {
        console.log('error:', error);
    } else {
        console.log(JSON.parse(response.body.toString()));
    }
});
If I try to replace the body with the below, it doesn't work:
body: fs.createReadStream(filename)
I have also tried piping the read stream into the request, as the request library docs suggest, but I get the same result:
fs.createReadStream(filename).pipe(request({...}))
I've tried to monitor the stream by doing the following:
var upload = fs.createReadStream('test.pdf');
upload.pipe(req); // req is the request object returned by request({...})

var upload_progress = 0;
upload.on("data", function (chunk) {
    upload_progress += chunk.length;
    console.log(new Date(), upload_progress);
});

upload.on("end", function () {
    console.log('Finished');
    req.end();
});
I see progress for the stream and Finished, but still no response is returned from the API.
I'd prefer to create a read stream because it works better with larger files, but I'm clueless as to what is going wrong. I've also made sure I'm not altering the file with any special encoding.
Is there some way to get some kind of output to see what process is taking forever?
UPDATE:
I decided to test with a simple 1 KB .txt file. I found that the file still arrives empty when using fs.createReadStream(); however, this time I got a response back from the server. The test PDF I'm working with is 363 KB, which isn't outrageous in size, but still... weren't streams made for large files anyway? Using fs.readFileSync() also worked fine for the text file.
I'm beginning to wonder if this is a synchronous vs. asynchronous issue. I know that fs.readFileSync() is synchronous. Do I need to wait until fs.createReadStream() finishes before trying to use it as the body?
I was able to get this working by doing the following:
const request = require('request');
const fs = require('fs');

const filename = 'test.pdf';
const readStream = fs.createReadStream(filename);
let chunks = [];

readStream.on('data', (chunk) => chunks.push(chunk));

readStream.on('end', () => {
    const data = Buffer.concat(chunks);
    request({
        url: 'http://localhost:8083/api/v1/endpoint',
        method: 'POST',
        headers: {
            'Content-Type': 'application/octet-stream',
            'Accept': 'application/vnd.api+json',
            'Content-Disposition': `file; filename="${filename}"`
        },
        encoding: null,
        body: data
    }, (error, response, body) => {
        if (error) {
            console.log('error:', error);
        } else {
            console.log(JSON.parse(response.body.toString()));
        }
    });
});
I collected the chunks and concatenated them into a Buffer before making the request.
I noticed in the documentation it said this:
The Buffer class was introduced as part of the Node.js API to enable interaction with octet streams in TCP streams, file system operations, and other contexts.
The API I'm calling requires the application/octet-stream header, so I need to use the buffer rather than streaming it directly.
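Note that collecting chunks and concatenating them is essentially what fs.readFile() does for you, so a shorter, still non-blocking variant might look like this (a sketch reusing the same endpoint and headers as above):

const request = require('request');
const fs = require('fs');

const filename = 'test.pdf';

// fs.readFile delivers the whole file as a single Buffer, asynchronously
fs.readFile(filename, (err, data) => {
    if (err) return console.log('read error:', err);
    request({
        url: 'http://localhost:8083/api/v1/endpoint',
        method: 'POST',
        headers: {
            'Content-Type': 'application/octet-stream',
            'Accept': 'application/vnd.api+json',
            'Content-Disposition': `file; filename="${filename}"`
        },
        encoding: null,
        body: data // a complete Buffer, which the body option expects
    }, (error, response, body) => {
        if (error) {
            console.log('error:', error);
        } else {
            console.log(JSON.parse(body.toString()));
        }
    });
});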
Related
I am looking to open a HTTP/2 stream and use that stream to make multiple HTTP/2 POST requests. Each POST request will have its own body payload.
I currently have the code below, which works for requests that don't require a payload, but I'm not sure how to customize it for requests that do require a payload.
I've read RFC 7540 and nearly every related post on SO, but I'm still finding it difficult to write working HTTP/2 code that sends a payload body.
For example:
Is using stream.write the recommended way to send DATA frames, or should I be using a built-in function provided by http2?
Do I pass arguments in plaintext, and the http2 protocol takes care of the binary encoding, or do I encode it myself?
How should I modify the code below to send a payload body?
const http2 = require('http2')
const connection = http2.connect('https://www.example.com:443')

const stream = connection.request({
    ':authority': 'www.example.com',
    ':scheme': 'https',
    ':method': 'POST',
    ':path': '/custom/path',
}, { endStream: false })

stream.setEncoding('utf8')

stream.on('response', (headers) => {
    console.log('RESPONSE', headers)
    stream.on('data', (data) => console.log('DATA', data))
    stream.on('end', () => console.log('END'))
})

stream.write(Buffer.from('POST-request-payload-body-here?'))
The first thing you need to do is convert the body data into a buffer:
var buffer = Buffer.from(JSON.stringify(body));
Then you need to update the connection.request object with Content-Type and Content-Length keys. Note that Content-Length is the length of the buffer.
const stream = connection.request({
    ':authority': 'www.example.com',
    ':scheme': 'https',
    ':method': 'POST',
    ':path': '/custom/path',
    'Content-Type': 'application/json',
    'Content-Length': buffer.length
}, { endStream: false })
Finally, you need to send the request body by converting the body data into a string and ending the stream:
stream.end(JSON.stringify(body));
This worked in my project:
const body = { /* any JSON-serializable payload */ };
const req = client.request({
    ':path': '/',
    ':method': 'POST',
    'Content-Type': 'application/json', // set the body content type
});

req.write(JSON.stringify(body), 'utf8'); // set the body
req.end();
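Putting both answers together, a minimal end-to-end sketch (www.example.com and /custom/path are the question's placeholders, and the JSON payload here is a made-up example):

const http2 = require('http2');

const client = http2.connect('https://www.example.com:443');
const payload = Buffer.from(JSON.stringify({ hello: 'world' }));

const req = client.request({
    ':method': 'POST',
    ':path': '/custom/path',
    'content-type': 'application/json',
    'content-length': payload.length,
});

req.setEncoding('utf8');
req.on('response', (headers) => console.log('RESPONSE', headers[':status']));

let data = '';
req.on('data', (chunk) => { data += chunk; });
req.on('end', () => {
    console.log('DATA', data);
    client.close();
});

// end() sends the payload as DATA frames and half-closes our side of the
// stream, so there is no need for { endStream: false } plus a separate write()
req.end(payload);

Since end() both writes the final DATA frames and closes the request side, the http2 module handles the frame construction and binary encoding for you; you don't build DATA frames by hand.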
I am receiving a PDF file from a Node server (it is running jsreport on this server), and I need to download this PDF on the client (I am using React on the client). The problem is that when I download the file, it comes out all blank, and the title contains strange symbols. After a lot of tests and research, I found that the problem may be that the file is arriving encoded as chunked (I can see that in the headers of the response), and I need to decode it to get a file back.
So, how do I decode this chunked string back into a file?
On the client, I am just downloading the file that comes in the response:
handleGerarRelatorioButtonClick(){
    axios.post(`${REQUEST_URL}/relatorios`, this.state.selectedExam).then((response) => {
        fileDownload(response.data, this.state.selectedExam.cliente.nome.replace(' ', '_') + ".pdf");
    });
}
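(As an aside, for binary downloads axios generally needs an explicit responseType so the PDF bytes are not decoded as text; a sketch of that variant of the same call, not verified against this API:)

axios.post(`${REQUEST_URL}/relatorios`, this.state.selectedExam, {
    responseType: 'blob' // keep the PDF bytes intact instead of decoding them as text
}).then((response) => {
    fileDownload(response.data, this.state.selectedExam.cliente.nome.replace(' ', '_') + ".pdf");
});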
On my server, I am making a request to jsreport, which is another Node server, and it returns the report as a PDF:
app.post('/relatorios', (request, response) => {
    var exame = new Exame(request.body);
    var body = {
        "template": {
            "shortid": "S1C9birB-",
            "data": exame
        }
    };
    var options = {
        hostname: 'localhost',
        port: 5488,
        path: '/api/report',
        method: 'POST',
        headers: {
            'Content-Type': 'application/json'
        }
    };
    var bodyparts = [];
    var bodylength = 0;
    var post = http.request(options, (res) => {
        res.on('data', (chunk) => {
            bodyparts.push(chunk);
            bodylength += chunk.length;
        });
        res.on('end', () => {
            // Concatenate the collected chunks into a single Buffer
            var pdf = Buffer.concat(bodyparts, bodylength);
            response.setHeader('Content-Type', 'application/pdf');
            response.setHeader('Content-Disposition', 'attachment; filename="' + exame._id + '.pdf"');
            response.setHeader('Content-Length', bodylength);
            response.end(pdf);
        });
    });
    post.write(JSON.stringify(body));
    post.end();
});
I am sure that my report is being rendered as expected, because if I make the request from Postman, it returns the PDF just fine.
Your solution is simply relaying data chunks, but you are not telling your front end what to expect of these chunks or how to assemble them. At a minimum you should be setting the Content-Type response header to application/pdf and, to be complete, should also be sending Content-Disposition as well as Content-Length. You may need to collect the PDF from your third-party source into a buffer and then send that buffer to your client if you are not able to set headers and pipe to the response successfully.
[edit] - I'm not familiar with jsreport, but it is possible (and likely) that the response it sends is a buffer. If that is the case, you could use something like this in place of your response to the client:
myGetPDFFunction(params, (err, res) => {
    if (err) {
        // handle it
    } else {
        response.writeHead(200, {
            'Content-Type': 'application/pdf',
            'Content-Length': [your buffer's content length]
        });
        response.end(Buffer.from([the res PDF buffer]));
    }
});
What you haven't shown is the request made to obtain that PDF, so I can't be more specific at this time. You should look into the documentation of jsreport to see what it sends in its response, and you can also read up on buffers in the Node.js documentation.
This is rough pseudocode, but the point is to respond with the PDF buffer after setting the headers to their proper values.
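If jsreport's response can be piped straight through, a sketch of the header-first pipe approach (names reused from the question; the header values are the ones recommended above):

var post = http.request(options, (res) => {
    // Send the headers before any body bytes, then stream the PDF through untouched
    response.writeHead(200, {
        'Content-Type': 'application/pdf',
        'Content-Disposition': 'attachment; filename="' + exame._id + '.pdf"'
    });
    res.pipe(response);
});
post.write(JSON.stringify(body));
post.end();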
I'm switching one of my projects from request over to something a bit more lightweight (such as got, axios, or fetch). Everything is going smoothly; however, I'm having an issue when attempting to upload a file stream (PUT and POST). It works fine with the request package, but any of the other three returns a 500 from the server.
I know that a 500 generally means an issue on the server's end, but it is consistent only with the HTTP packages that I'm testing out. When I revert my code to use request, it works fine.
Here is my current Request code:
Request.put(`http://endpoint.com`, {
    headers: {
        Authorization: `Bearer ${account.token.access_token}`
    },
    formData: {
        content: fs.createReadStream(localPath)
    }
}, (err, response, body) => {
    if (err) {
        return callback(err);
    }
    return callback(null, body);
});
And here is one of the attempts using another package (in this case, got):
got.put(`http://endpoint.com`, {
    headers: {
        'Content-Type': 'multipart/form-data',
        Authorization: `Bearer ${account.token.access_token}`,
    },
    body: {
        content: fs.createReadStream(localPath)
    }
})
    .then(response => {
        return callback(null, response.body);
    })
    .catch(err => {
        return callback(err);
    });
Per the got documentation, I've also tried using the form-data package in conjunction with it according to its example, and I still get the same issue.
The only difference between the two that I can gather is that with got I have to manually specify the Content-Type header, otherwise the endpoint gives me a proper error about it. Beyond that, I'm not sure how the two packages construct the body from the stream, but as I said, fetch and axios produce the exact same error as got.
If you want any of the snippets using fetch or axios I'd be happy to post them as well.
I know this question was asked a while ago, but I too am missing the simple pipe support of the request package:
const request = require('request');
const fs = require('fs');

request
    .get("https://res.cloudinary.com/demo/image/upload/sample.jpg")
    .pipe(request.post("http://127.0.0.1:8000/api/upload/stream"))

// Or any readable stream
fs.createReadStream('/Users/file/path/localFile.jpeg')
    .pipe(request.post("http://127.0.0.1:8000/api/upload/stream"))
and had to do some experimenting to find similar features from current libraries.
Unfortunately, I haven't worked with got, but I hope the following two examples help anyone else who is interested in working with the native http/https libraries or the popular axios library.
HTTP/HTTPS
Supports piping!
const http = require('http');
const https = require('https');

console.log("[i] Test pass-through: http/https");

// Note: http/https must match the URL protocol
https.get(
    "https://res.cloudinary.com/demo/image/upload/sample.jpg",
    (imageStream) => {
        console.log("  [i] Received stream");
        imageStream.pipe(
            http.request("http://localhost:8000/api/upload/stream/", {
                method: "POST",
                headers: {
                    "Content-Type": imageStream.headers["content-type"],
                },
            })
        );
    }
);
// Or any readable stream (here the content type must be set by hand,
// since a file stream has no response headers to copy from)
const fs = require('fs');

fs.createReadStream('/Users/file/path/localFile.jpeg')
    .pipe(
        http.request("http://localhost:8000/api/upload/stream/", {
            method: "POST",
            headers: {
                "Content-Type": "image/jpeg",
            },
        })
    );
Axios
Note the usage of imageStream.data and that it's being attached to data in the Axios config.
const axios = require('axios');

(async function selfInvokingFunction() {
    console.log("[i] Test pass-through: axios");

    const imageStream = await axios.get(
        "https://res.cloudinary.com/demo/image/upload/sample.jpg",
        {
            responseType: "stream", // Important to ensure axios provides a stream
        }
    );
    console.log("  [i] Received stream");

    const upload = await axios({
        method: "post",
        url: "http://127.0.0.1:8000/api/upload/stream/",
        data: imageStream.data,
        headers: {
            "Content-Type": imageStream.headers["content-type"],
        },
    });

    console.log("Upload response", upload.data);
})();
Looks like this was a headers issue. If I use the headers directly from FormData (i.e., headers: form.getHeaders()) and just add in my additional headers afterwards (Authorization), then this ends up working just fine.
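For reference, a sketch of what that fix can look like with got and form-data (endpoint, token, and field name are carried over from the question; details may vary by got version):

const fs = require('fs');
const got = require('got');
const FormData = require('form-data');

const form = new FormData();
form.append('content', fs.createReadStream(localPath));

got.put(`http://endpoint.com`, {
    // form.getHeaders() supplies the multipart Content-Type with its boundary;
    // the Authorization header is layered on afterwards
    headers: Object.assign(form.getHeaders(), {
        Authorization: `Bearer ${account.token.access_token}`,
    }),
    body: form,
})
    .then(response => callback(null, response.body))
    .catch(err => callback(err));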
For me, it only worked once I added another parameter to FormData.
before
const form = new FormData();
form.append('file', fileStream);
after
const form = new FormData();
form.append('file', fileStream, 'my-whatever-file-name.mp4');
That way I can send a stream from my backend to another Node backend that expects a file in multipart/form-data under the field name 'file'.
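A fuller sketch of that pattern (the receiving URL is a placeholder; here the form is posted with axios plus form-data's own headers):

const fs = require('fs');
const axios = require('axios');
const FormData = require('form-data');

const form = new FormData();
// The third argument names the file; many multipart parsers will not treat
// the part as a file upload without a file name
form.append('file', fs.createReadStream('./video.mp4'), 'my-whatever-file-name.mp4');

axios.post('http://other-backend.example/upload', form, {
    headers: form.getHeaders(), // includes the multipart boundary
}).then((res) => console.log(res.status));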
I have used the Winston module to create a daily log file for my offline app. I now need to be able to send or upload that file to a remote server via POST (that part already exists).
I know I need to write the file in chunks so it doesn't hog the memory, so I'm using fs.createReadStream. However, I seem to only get a 503 response, even when sending just sample text.
EDIT
I worked out that the receiver was expecting the data to be named 'data'. I have removed createReadStream, as I could only get it to work with 'application/x-www-form-urlencoded' and a synchronous fs.readFileSync. If I change this to 'multipart/form-data' on the PHP server, would I be able to use createReadStream again, or is that only if I change to physically uploading the JSON file?
I've only been learning node for the past couple of weeks so any pointers would be gratefully received.
var http = require('http'),
    fs = require('fs');

var post_options = {
    host: 'logger.mysite.co.uk',
    path: '/',
    port: 80,
    timeout: 120000,
    method: 'POST',
    headers: {
        'Content-Type': 'application/x-www-form-urlencoded'
    }
};

var sender = http.request(post_options, function(res) {
    if (res.statusCode < 399) {
        var text = "";
        res.on('data', function(chunk) {
            text += chunk;
        });
        res.on('end', function(data) {
            console.log(text);
        });
    } else {
        console.log("ERROR", res.statusCode);
    }
});

var POST_DATA = 'data={[';
POST_DATA += fs.readFileSync('./path/file.log').toString().replace(/\,+$/, '');
POST_DATA += ']}';
console.log(POST_DATA);
sender.write(POST_DATA);
sender.end();
After a gazillion trials and failures, this worked for me: using FormData with node-fetch. Oh, and request was deprecated two days ago, by the way.
const FormData = require('form-data');
const fetch = require('node-fetch');

function uploadImage(imageBuffer) {
    const form = new FormData();
    form.append('file', imageBuffer, {
        contentType: 'image/jpeg',
        filename: 'dummy.jpg',
    });
    // node-fetch needs an absolute URL, including the protocol
    return fetch('https://myserver.cz/upload', { method: 'POST', body: form });
}
In place of imageBuffer there can be numerous things. I had a buffer containing the image, but you can also pass the result of fs.createReadStream('/foo/bar.jpg') to upload a file from disk.
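For example, the read-stream variant might look like this (same placeholder server; the content type and file name are assumptions):

const fs = require('fs');

const form = new FormData();
// Streaming from disk instead of passing a buffer; form-data accepts both
form.append('file', fs.createReadStream('/foo/bar.jpg'), {
    contentType: 'image/jpeg',
    filename: 'bar.jpg',
});
fetch('https://myserver.cz/upload', { method: 'POST', body: form });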
copied from https://github.com/mikeal/request#forms
var fs = require('fs');
var path = require('path');
var request = require('request');

var r = request.post('http://service.com/upload', function optionalCallback(err, httpResponse, body) {
    if (err) {
        return console.error('upload failed:', err);
    }
    console.log('Upload successful! Server responded with:', body);
});

var form = r.form();
form.append('my_field1', 'my_value23_321');
form.append('my_field2', '123123sdas');
form.append('my_file', fs.createReadStream(path.join(__dirname, 'doodle.png')));
Have a look at the request module.
It will provide you the ability to stream a file to POST requests.
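For instance, a minimal sketch of that streaming ability, reusing the log file and host from the question (what the receiving server does with the raw body is up to it):

const fs = require('fs');
const request = require('request');

// Pipe the log file straight into the POST body without buffering it in memory
fs.createReadStream('./path/file.log')
    .pipe(request.post('http://logger.mysite.co.uk/'));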
I would like to stream the contents of an HTTP response into a variable. My goal is to get an image via request() and store it in MongoDB, but the image is always corrupted.
This is my code:
request('http://google.com/doodle.png', function (error, response, body) {
    image = new Buffer(body, 'binary');
    db.images.insert({ filename: 'google.png', imgData: image }, function (err) {
        // handle errors etc.
    });
});
What is the best way to use Buffer/streams in this case?
The request module buffers the response for you. In the callback, body is a string (or Buffer).
You only get a stream back from request if you don't provide a callback; request() returns a Stream.
See the docs for more detail and examples.
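To illustrate the difference (a sketch using the question's example URL):

const fs = require('fs');
const request = require('request');

// With a callback: request buffers everything and hands you the whole body at once
request('http://google.com/doodle.png', function (error, response, body) {
    // body is the complete (buffered) response
});

// Without a callback: the return value is a stream you can pipe
request('http://google.com/doodle.png').pipe(fs.createWriteStream('doodle.png'));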
request assumes that the response is text, so it tries to convert the response body into a string (regardless of the MIME type). This will corrupt binary data. If you want the raw bytes, specify a null encoding:
request({ url: 'http://google.com/doodle.png', encoding: null }, function (error, response, body) {
    db.images.insert({ filename: 'google.png', imgData: body }, function (err) {
        // handle errors etc.
    });
});
var options = {
    headers: {
        'Content-Length': contentLength,
        'Content-Type': 'application/octet-stream'
    },
    url: 'http://localhost:3000/lottery/lt',
    body: formData,
    encoding: null, // make the response body a Buffer
    method: 'POST'
};
Set encoding to null to get the response body back as a Buffer.
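A usage sketch with those options (formData and contentLength are the answer's own placeholders):

request(options, function (error, response, body) {
    if (error) return console.error(error);
    // With encoding: null, body arrives as a raw Buffer
    console.log(Buffer.isBuffer(body)); // true
});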
Have you tried piping it, like this?
request.get('http://google.com/doodle.png').pipe(request.put('{your mongo path}'))
(Though I'm not familiar enough with Mongo to know if it supports direct inserts of binary data like this, I know CouchDB and Riak do.)
Nowadays, you can easily retrieve a file in binary with Node 8, RequestJS (presumably via a promise wrapper such as request-promise-native), and async/await. I used the following:
const buffer = await request.get(pdf.url, { encoding: null });
The response was a Buffer containing the bytes of the PDF. Much cleaner than big option objects and old-school callbacks.