I would like to stream the contents of an HTTP response to a variable. My goal is to get an image via request() and store it in MongoDB - but the image is always corrupted.
This is my code:
request('http://google.com/doodle.png', function (error, response, body) {
    image = new Buffer(body, 'binary');
    db.images.insert({ filename: 'google.png', imgData: image }, function (err) {
        // handle errors etc.
    });
});
What is the best way to use Buffer/streams in this case?
The request module buffers the response for you. In the callback, body is a string (or Buffer).
You only get a stream back from request if you don't provide a callback; request() returns a Stream.
See the docs for more detail and examples.
request assumes the response is text, so it tries to convert the response body into a string (regardless of the MIME type). This corrupts binary data. If you want the raw bytes, specify a null encoding.
request({ url: 'http://google.com/doodle.png', encoding: null }, function (error, response, body) {
    db.images.insert({ filename: 'google.png', imgData: body }, function (err) {
        // handle errors etc.
    });
});
var options = {
    headers: {
        'Content-Length': contentLength,
        'Content-Type': 'application/octet-stream'
    },
    url: 'http://localhost:3000/lottery/lt',
    body: formData,
    encoding: null, // keep the response body as a Buffer
    method: 'POST'
};
Setting encoding to null makes request return the response body as a Buffer instead of a string.
Have you tried piping this?
request.get('http://google.com/doodle.png').pipe(request.put('{your mongo path}'))
(Though I'm not familiar enough with Mongo to know whether it supports direct inserts of binary data like this; I know CouchDB and Riak do.)
Nowadays, you can easily retrieve a file in binary with Node 8, RequestJS and async/await. Note that this assumes a promise-returning wrapper such as request-promise-native; plain request does not return a promise. I used the following:
const buffer = await request.get(pdf.url, { encoding: null });
The response was a Buffer containing the bytes of the PDF. Much cleaner than big option objects and old-school callbacks.
Related
I'm using the request library to send a binary (PDF) file in the body of the request using HTTP POST (NOTE: this API does not accept multi-part forms). However, I have only been able to get it to work using fs.readFileSync(). For some reason, when I try to use fs.createReadStream() the PDF file is still sent, but it is empty, and the request never finishes (I never get a response back from the server).
Here is my working version using fs.readFileSync():
const request = require('request');
const fs = require('fs');
const filename = 'test.pdf';
request({
    url: 'http://localhost:8083/api/v1/endpoint',
    method: 'POST',
    headers: {
        'Content-Type': 'application/octet-stream',
        'Accept': 'application/vnd.api+json',
        'Content-Disposition': `file; filename="${filename}"`
    },
    encoding: null,
    body: fs.readFileSync(filename)
}, (error, response, body) => {
    if (error) {
        console.log('error:', error);
    } else {
        console.log(JSON.parse(response.body.toString()));
    }
});
If I try to replace the body with the below, it doesn't work:
body: fs.createReadStream(filename)
I have also tried piping the http request on to the stream, like it says in the request library docs, but I get the same result:
fs.createReadStream(filename).pipe(request({...}))
I've tried to monitor the stream by doing the following:
var upload = fs.createReadStream('test.pdf');
upload.pipe(req);
var upload_progress = 0;
upload.on("data", function (chunk) {
upload_progress += chunk.length
console.log(new Date(), upload_progress);
})
upload.on("end", function (res) {
console.log('Finished');
req.end();
})
I see progress for the stream and Finished, but still no response is returned from the API.
I'd prefer to create a read stream because streams handle larger files better, but I'm clueless as to what is going wrong. I am making sure I'm not altering the file with any special encoding as well.
Is there some way to get some kind of output to see what process is taking forever?
UPDATE:
I decided to test with a simple 1 KB .txt file. I found that it is still empty using fs.createReadStream(), however, this time I got a response back from the server. The test PDF I'm working with is 363 KB, which isn't outrageous in size, but still... weren't streams made for large files anyway? Using fs.readFileSync() also worked fine for the text file.
I'm beginning to wonder if this is a synchronous vs. asynchronous issue. I know that fs.readFileSync() is synchronous. Do I need to wait until fs.createReadStream() finishes before trying to append it to the body?
I was able to get this working by doing the following:
const request = require('request');
const fs = require('fs');
const filename = 'test.pdf';
const readStream = fs.createReadStream(filename);
let chunks = [];

readStream.on('data', (chunk) => chunks.push(chunk));

readStream.on('end', () => {
    const data = Buffer.concat(chunks);
    request({
        url: 'http://localhost:8083/api/v1/endpoint',
        method: 'POST',
        headers: {
            'Content-Type': 'application/octet-stream',
            'Accept': 'application/vnd.api+json',
            'Content-Disposition': `file; filename="${filename}"`
        },
        encoding: null,
        body: data
    }, (error, response, body) => {
        if (error) {
            console.log('error:', error);
        } else {
            console.log(JSON.parse(response.body.toString()));
        }
    });
});
I collected the stream's chunks and concatenated them into a single Buffer before making the request.
I noticed in the documentation it said this:
The Buffer class was introduced as part of the Node.js API to enable interaction with octet streams in TCP streams, file system operations, and other contexts.
The API I'm calling requires the application/octet-stream header, so I need to use the buffer rather than streaming it directly.
I'm using the request module on node.js, but there is a problem with the encoding option. The code beneath is a simple POST request, but I don't know how to set the encoding of the form field data. I already set the headers to 'Content-Type': 'application/x-www-form-urlencoded; charset=euc-kr', but it doesn't work. The field data is Korean, like "안녕하세요", and I should post it with EUC-KR encoding. (The site takes EUC-KR, not UTF-8.)
In the equivalent Java application, I coded it like this:
PrintWriter wr = new PrintWriter(new OutputStreamWriter(conn.getOutputStream(), "euc-kr"));
But I don't know how to do this in Node.js. Can anyone give me a solution?
Code Sample
//Load the request module
var request = require('request');

//Let's configure the request
request({
    url: 'http://google.com', //URL to hit
    headers: { 'Content-Type': 'application/x-www-form-urlencoded; charset=euc-kr' },
    method: 'POST',
    form: {
        field1: 'data',
        field2: 'data'
    }
}, function (error, response, body) {
    if (error) {
        console.log(error);
    } else {
        console.log(response.statusCode, body);
    }
});
Finally I got a solution, and I solved this problem.
If you send data as a form using the request module, the module forces your form encoding to UTF-8. So even if you set your form encoding to another charset, the module changes it back to UTF-8. You can see that in request.js on lines 1120-1130.
So you'd better send the data with the 'body' option, not the 'form' option.
Node doesn't support EUC-KR so you can use iconv-lite to extend the native encodings available and set the encoding option in request.
List of Natively Supported Encodings
iconv.extendNodeEncodings(); only works for Node versions before v4. See here to get this working for a newer version of Node.
var iconv = require('iconv-lite');
var request = require('request');

// This will add to the native encodings available.
iconv.extendNodeEncodings();

request({
    url: 'http://google.com', //URL to hit
    method: 'POST',
    form: {
        field1: 'data',
        field2: 'data'
    },
    encoding: 'EUC-KR'
}, function (error, response, body) {
    if (error) {
        console.log(error);
    } else {
        console.log(response.statusCode, body);
    }
});

iconv.undoExtendNodeEncodings();
I'm trying to upload a file to s3 using request.js but the file data seems to be incorrect after I upload it. Should I be using a data property of the response object instead?
var flickrPhotoUrl = 'http://c1.staticflickr.com/3/2207/2305523141_1f98981685_z.jpg?zz=1';
request.get(flickrPhotoUrl, function (error, response, body) {
    if (body) {
        s3.upload({
            Body: body,
            Bucket: 'my-uploads',
            Key: 'photo.jpg',
        }, function (err) {
            done(err);
        });
    } else {
        done('no response');
    }
});
When I grab the file from s3 after the upload, it's not a recognizable image and seems to be twice as big.
request by default converts your image's binary data to a UTF-8 string. That's why the file size is larger than the actual size. Try passing encoding: null to keep the body as a Buffer:
request.get({ encoding: null, uri: flickrPhotoUrl }, function (error, response, body) {
    ...
});
Update:
I think you can also pass a readable stream in the Body parameter. This is faster than the above for large files.
var stream = request.get(flickrPhotoUrl);
s3.upload({
    Body: stream,
    Bucket: 'my-uploads',
    Key: 'photo.jpg',
}, function (err) {
    done(err);
});
I am using https.request() to make a HTTPS request using the following familiar pattern:
var request = https.request(options, function (response) {
    var chunks = [];
    response.on('data', function (chunk) {
        chunks.push(chunk);
    });
    response.on('end', function () {
        var buffer = Buffer.concat(chunks);
        ...
    });
});
...
request.end();
...
Once I have the finished response Buffer, it needs to be packaged into a JSON object. The reason for this is because I am creating a kind of tunnel, whereby the HTTP response (its headers, status, and body) are to be sent as JSON through another protocol.
So that both textual and binary responses may be supported, what works for me so far is to encode the Buffer to Base64 (using buffer.toString('base64')) and decode it at the other end using new Buffer(theJsonObject.body, 'base64'). While this works, it would be more efficient if I could selectively perform Base64 encoding only when the HTTP response is known to be of binary type (e.g. images). Otherwise, in the https.request() callback shown above, I could simply do chunk.toString() and convey the response body in the JSON object as a UTF-8 string. My JSON object would probably contain an additional property that indicates to the opposite end of the tunnel whether the 'body' is a UTF-8 string (e.g. for .htm, .css, etc.) or Base64-encoded (e.g. images).
What I could do is try to use the MIME type in the response content-type header to work out whether the response is going to be binary. I would probably maintain a 'white list' of types that I know it's safe to assume are UTF-8 (such as 'text/html' and so on). All others (including e.g. 'image/png') would be Base64-encoded.
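The whitelist check itself is simple; a minimal sketch (the type list below is illustrative, not exhaustive):

```javascript
// Treat only these MIME types as safe to send as UTF-8 text;
// everything else would get Base64-encoded.
var textTypes = ['text/html', 'text/css', 'text/plain', 'application/json'];

function isTextual(contentTypeHeader) {
    if (!contentTypeHeader) return false;
    // Strip any "; charset=..." parameter before comparing.
    var mime = contentTypeHeader.split(';')[0].trim().toLowerCase();
    return textTypes.indexOf(mime) !== -1;
}
```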
Can anyone propose a better solution?
Could you use the file-type package to detect the file type by checking the magic number of the buffer?
Install
npm install --save file-type
Usage
var fileType = require('file-type');

var safeTypes = ['image/gif'];

var request = https.request(options, function (response) {
    var chunks = [];
    response.on('data', function (chunk) {
        chunks.push(chunk);
    });
    response.on('end', function () {
        var buffer = Buffer.concat(chunks);
        var file = fileType(buffer);
        console.log(file);
        //=> { ext: 'gif', mime: 'image/gif' }

        // mime isn't in the safe list
        if (safeTypes.indexOf(file.mime) === -1) {
            // do your Base64 thing
        }
    });
});
...
request.end();
...
If you want to keep your code package free have a look at the package source on Github, it's pretty minimal.
I'm simply trying to create a node server that outputs the HTTP status of a given URL.
When I try to flush the response with res.write, I get the error: throw new TypeError('first argument must be a string or Buffer');
But if I replace them with console.log, everything is fine (but I need to write them to the browser not the console).
The code is
var server = http.createServer(function (req, res) {
    res.writeHead(200, {"Content-Type": "text/plain"});
    request({
        uri: 'http://www.google.com',
        method: 'GET',
        maxRedirects: 3
    }, function (error, response, body) {
        if (!error) {
            res.write(response.statusCode);
        } else {
            //response.end(error);
            res.write(error);
        }
    });
    res.end();
});

server.listen(9999);
I believe I should add a callback somewhere, but I'm pretty confused and any help is appreciated.
I get this error message and it mentions options.body
I had this originally
request.post({
    url: apiServerBaseUrl + '/v1/verify',
    body: {
        email: req.user.email
    }
});
I changed it to this:
request.post({
    url: apiServerBaseUrl + '/v1/verify',
    body: JSON.stringify({
        email: req.user.email
    })
});
and it seems to work now without the error message... it seems like a bug, though. I think this is the more official way to do it:
request.post({
    url: apiServerBaseUrl + '/v1/verify',
    json: true,
    body: {
        email: req.user.email
    }
});
response.statusCode is a number, e.g. response.statusCode === 200, not '200'. As the error message says, write expects a string or Buffer object, so you must convert it.
res.write(response.statusCode.toString());
You are also correct about your callback comment though. res.end(); should be inside the callback, just below your write calls.
Request takes a callback method; it's async! So I am assuming that by the time the callback is executed, res.end() might already have been called. Try closing the request within the callback.
Well, obviously you are trying to send something which is not a string or buffer. :) It works with console, because console accepts anything. Simple example:
var obj = { test : "test" };
console.log( obj ); // works
res.write( obj ); // fails
One way to convert anything to string is to do that:
res.write( "" + obj );
whenever you are trying to send something. The other way is to call .toString() method:
res.write( obj.toString( ) );
Note that it still might not be what you are looking for. You should always pass strings/buffers to .write without such tricks.
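For example, the two conversions give very different results for plain objects; JSON.stringify is usually the one you actually want:

```javascript
var obj = { test: "test" };

// Both "" + obj and obj.toString() collapse to "[object Object]",
// losing the data; JSON.stringify preserves it.
console.log("" + obj);            // [object Object]
console.log(JSON.stringify(obj)); // {"test":"test"}
```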
As a side note: I assume that request is a asynchronous operation. If that's the case, then res.end(); will be called before any writing, i.e. any writing will fail anyway ( because the connection will be closed at that point ). Move that line into the handler:
request({
    uri: 'http://www.google.com',
    method: 'GET',
    maxRedirects: 3
}, function (error, response, body) {
    if (!error) {
        res.write(response.statusCode);
    } else {
        //response.end(error);
        res.write(error);
    }
    res.end();
});
If you want to write a JSON object to the response, then change the header content type to application/json:
response.writeHead(200, {"Content-Type": "application/json"});

var d = new Date(parseURL.query.iso);
var postData = {
    "hour": d.getHours(),
    "minute": d.getMinutes(),
    "second": d.getSeconds()
};

response.write(JSON.stringify(postData));
response.end();
And there is another possibility (not in this case) when working with Ajax (XMLHttpRequest): when sending information back to the client, you should use res.send(responsetext) instead of res.end(responsetext).
Although the question is solved, I'm sharing this to clarify the correct meaning of the error.
The error says that the parameter passed to the function in question is not in the required format, i.e. a string or Buffer.
The solution is to convert the parameter to a string
breakingFunction(JSON.stringify(offendingParameter), ... other params...);
or buffer
breakingFunction(BSON.serialize(offendingParameter), ... other params...);
The first argument must be one of type string or Buffer. Received type object
at write_
I was getting the above error while passing body data to the request module.
I passed another parameter, json: true, and it worked.
var option = {
    url: "https://myfirstwebsite/v1/appdata",
    json: true,
    body: { name: 'xyz', age: 30 },
    headers: { /* my credentials */ }
};

rp(option)
    .then((response) => {
        res.send({ response: response });
    })
    .catch((error) => {
        res.send({ response: error });
    });