nodejs handling arraybuffers

Suppose I make a multipart, application/octet-stream request with responseType set to 'arraybuffer', receive the response in Node.js, and try to write it to a file. How can I handle this without corrupting the contents?
My current approach is something like this:
var req = restler.post(url, opts)
  .on('data', function (data) {
    console.log('receiving data...');
    console.log(data);
  }).on('complete', function (data) {
    var buff = new Buffer(data); //this is prolly incorrect, but I can't figure this out at all
    fs.writeFile(file_name, buff.toString('binary'), function (err) {
      console.log('done!');
    });
  });
Here I write the contents into file_name. If I fetch, say, a Microsoft Word file this way, I end up with a corrupt file. I'm using the restler package for this.

According to the restler documentation, you can set decoding: 'buffer' in your opts and the binary data will be kept intact as a Buffer instead of the default utf8-encoded string. From there, it's just a matter of passing the buffer directly to fs.writeFile() without calling buffer.toString().
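A minimal sketch of that fix, reusing the url, opts, and file_name from the question:

var restler = require('restler');
var fs = require('fs');

opts.decoding = 'buffer'; // keep the raw bytes instead of decoding to utf8
restler.post(url, opts)
  .on('complete', function (data) {
    // data is a Buffer here; write it untouched, no toString() anywhere
    fs.writeFile(file_name, data, function (err) {
      if (err) throw err;
      console.log('done!');
    });
  });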

Related

Node: Express: How to handle application/octet-stream;charset=;UTF-8 response?

I have a node-express application.
There, I'm trying to call an API that responds with a raw xlsx object as
'Content-Type': 'application/octet-stream;charset=;UTF-8'
Here's how I'm calling the API:
var unirest = require("unirest");
var reqClient = unirest("POST", "https://api.application.com/getExcel");

reqClient.headers({
  "Authorization": "Bearer " + req.session.passport.user.token,
  "content-type": "application/json"
});
reqClient.type("json");
reqClient.send(JSON.stringify(requestbody));
reqClient.end(function (res) {
  if (res.error) throw new Error(res.error);
  console.log(res.body);
});
Now there are two things I'm trying to do with this data.
First, write it into an Excel file. Below is the code for how I'm trying it:
let data = res.body; // res is the response coming from the API
let buf = Buffer.from(data);
let excelfile = fs.createWriteStream("result.xlsx");
excelfile.write(buf);
excelfile.end();
Second, send it to the UI, where the Excel file will get created.
Below is my code for that:
let data = res.body; // res is the response coming from the API
let buf = Buffer.from(data);
response.write(buf); //response is the response to the request to ui
response.end();
In both cases the file comes out corrupted. But the API response itself is perfect, because when it's consumed directly by the UI, the xlsx file is generated properly.
When dealing with binary data, you have to set the encoding to null:
reqClient.encoding(null)
reqClient.end(function (res) {
  if (res.error) {
    return response.status(500).send('error');
  }
  // res.body is now a buffer
  response.write(res.body);
  response.end();
});
Otherwise, the data is converted to UTF-8, and you can't convert from UTF-8 back to binary and get the same data back, which is what you were doing with:
Buffer.from(res.body)
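To see why that round trip is lossy, here's a quick illustration with made-up bytes:

const orig = Buffer.from([0xff, 0xfe, 0xfd]); // not valid UTF-8
const mangled = Buffer.from(orig.toString('utf8'));
console.log(orig.equals(mangled)); // false: the bytes were replaced during decoding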
The recommended approach is to use streams directly. I don't see a simple way to do that with unirest, so I recommend using request or got, which let you .pipe the response directly to a file or to the express res.
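For example, a minimal sketch with got (assuming got v11, the last CommonJS release; the URL, token, and requestbody come from the question):

const fs = require('fs');
const got = require('got');

// Stream the binary response straight to disk; the bytes are never
// decoded into a string, so nothing gets mangled along the way.
got.stream.post('https://api.application.com/getExcel', {
  headers: { Authorization: 'Bearer ' + req.session.passport.user.token },
  json: requestbody
}).pipe(fs.createWriteStream('result.xlsx'));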

How to construct and extract value from simple HTTPS request in Node.js?

I have a simple HTTPS request -
https://api.pro.coinbase.com/products/btc-eur/ticker
In the browser this returns one object. What's the simplest code that will allow me to retrieve and display this object (as is) in the terminal of Node?
const https = require('https')
const url = https.get('https://api.pro.coinbase.com/products/btc-eur/ticker')
const myObject = JSON.parse(url)
console.log(myObject)
A simple copy/paste of the above code in VS Code returns the error SyntaxError: Unexpected token o in JSON at position 1.
@mamba76, welcome to the SO community. Please use the node-fetch package; it is much simpler to use. You can install it with npm install node-fetch.
Following code might help:
"use strict";
const fetch = require('node-fetch')
async function getValue() {
// Invoke the API.
// Wait until data is fetched.
let response = await fetch('https://api.pro.coinbase.com/products/btc-eur/ticker');
let value = await response.json();
return value;
}
getValue().then(result => {console.log(result.price);});
As a good practice, always assume that API calls over HTTP (whether within your own network or outside it) might take time to return data, so use the async-await pattern for these requests.
Extending @Akshay.N's answer, and without using external dependencies:
const https = require('https')

https.get("https://api.pro.coinbase.com/products/btc-eur/ticker", res => {
  let body = '';
  res.on('data', (chunk) => { body += chunk; });
  res.on('end', () => {
    const myObject = JSON.parse(body);
    console.log(myObject);
  })
})
What we're doing here is listening for the data event for as long as data is coming in, appending each chunk to the variable body. Once the end event fires, we take that as the signal that all data has been received and parse body into an object with JSON.parse (assuming the data was serialized as JSON; if it wasn't, JSON.parse will throw an error).
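As a side note, the same pattern works for the binary payloads in the questions above, as long as you collect Buffer chunks and join them instead of concatenating strings. A minimal sketch:

const chunks = [];
res.on('data', (chunk) => chunks.push(chunk)); // each chunk is a Buffer
res.on('end', () => {
  const body = Buffer.concat(chunks); // the intact bytes, with no encoding applied
});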
This tutorial is helpful: https://davidwalsh.name/nodejs-http-request
Try something like this:
https.get("https://api.pro.coinbase.com/products/btc-eur/ticker", res => {
  res.on('data', (chunk) => { console.log(JSON.parse(chunk)) })
})
With Node (you need the request module):
// display object
(require("request")).get({
  url: "myurl",
  json: true
}, function (e, r, b) {
  console.log(b);
});

// display as string
(require("request")).get({
  url: "myurl",
  json: false
}, function (e, r, b) {
  console.log(b);
});
With just curl in your terminal (without Node):
curl myurl

hitting a multipart url in nodejs

I have client code that uses the form-data module to hit a URL that returns a content-type of image/jpeg. Below is my code:
var FormData = require('form-data');
var fs = require('fs');
var form = new FormData();
//form.append('POLICE', "hello");
//form.append('PAYSLIP', fs.createReadStream("./Desert.jpg"));
console.log(form);
//https://fbcdn-profile-a.akamaihd.net/hprofile-ak-xfp1/v/t1.0- 1/c8.0.50.50/p50x50/10934065_1389946604648669_2362155902065290483_n.jpg?oh=13640f19512fc3686063a4703494c6c1&oe=55ADC7C8&__gda__=1436921313_bf58cbf91270adcd7b29241838f7d01a
form.submit({
  protocol: 'https:',
  host: 'fbcdn-profile-a.akamaihd.net',
  path: '/hprofile-ak-xfp1/v/t1.0-1/c8.0.50.50/p50x50/10934065_1389946604648669_2362155902065290483_n.jpg?oh=13640f19512fc3686063a3494c6c1&oe=55ADCC8&__gda__=1436921313_bf58cbf91270adcd7b2924183',
  method: 'get'
}, function (err, res) {
  var data = "";
  res.on("data", function (chunks) {
    data += chunks;
  });
  res.on("end", function () {
    console.log(data);
    console.log("Response Headers - " + JSON.stringify(res.headers));
  });
});
I'm getting some chunk data, and the response headers I received were:
{"last-modified":"Thu, 12 Feb 2015 09:49:26 GMT","content-type":"image/jpeg","timing-allow-origin":"*","access-control-allow-origin":"*","content-length":"1443","cache-control":"no-transform, max-age=1209600","expires":"Thu, 30 Apr 2015 07:05:31 GMT","date":"Thu, 16 Apr 2015 07:05:31 GMT","connection":"keep-alive"}
I am now stuck on how to turn the response I received into a proper image. I tried base64 decoding, but it seemed to be the wrong approach. Any help will be much appreciated.
I expect that data, once the file has been completely downloaded, contains a Buffer.
If that is the case, you should write the buffer as is, without any decoding, to a file:
fs.writeFile('path/to/file.jpg', data, function onFinished (err) {
  // Handle possible error
})
See the fs.writeFile() documentation: it accepts either a string or a buffer as data input.
Extra awesomeness by using streams
Since the res object is a readable stream, you can simply pipe the data directly to a file without keeping it in memory. This has the added benefit that if you download a really large file, Node.js will not have to keep the whole file in memory (as it does now), but will write it to the filesystem continuously as it arrives.
form.submit({
  // ...
}, function (err, res) {
  // res is a readable stream, so let's pipe it to the filesystem
  var file = fs.createWriteStream('path/to/file.jpg')
  res.on('end', function writeDone (err) {
    // File is saved, unless err happened
  })
  .pipe(file) // Send the incoming file to the filesystem
})
The chunk you got is the raw image. Do whatever you want with it: save it to disk, let the user download it, whatever.
So if I understand your question correctly, you want to download a file from an HTTP endpoint and save it to your computer, right? If so, you should look into using the request module instead of form-data.
Here's a contrived example for downloading things using request:
var fs = require('fs');
var request = require('request')
request('http://www.example.com/picture.jpg')
  .pipe(fs.createWriteStream('picture.jpg'))
Where 'picture.jpg' is the location to save to disk. You can open it up using a normal file browser.

What is the correct way to encode image data in nodejs buffers?

I'm trying to fetch an image, apply a transform, and save it in a database like MongoDB. Here's my code:
var stor = function (inStream, sizeType) {
  console.log('entering stor function');
  var hashCode = '';
  var img = new Buffer(1024 * 1024 * 5 * 1.1, 'binary'); //5mb size + 10% space
  var hash = crypto.createHash('sha1');
  inStream.on('data', function (chunk) {
    Buffer.concat([img, chunk]);
    hash.update(chunk);
  });
  inStream.on('end', function () {
    hashCode = hash.digest('hex');
    var retUrl = "http://playground.com/" + hashCode;
    //post this url using requests, set encoding : binary
  });
};

server.post('/resize', function (req, res) {
  req.accepts('application/json');
  console.log('received a resize request for image =', req.body.host + req.body.path);
  var request = requests.get({
    url: req.body.url,
    headers: {'accept-encoding': 'gzip'}
  });
  //error handling
  request.on('response', function (response) {
    console.log('succesfully fetched image...');
    response.setEncoding('binary');
    //save original in stor
    stor(response, 'original');
    res.writeHead(201);
    res.send();
  });
});

module.exports = server;
When I do this (receive an image from the internet, then save it to my database for future use), the image data saved in the database is not the original image. It is corrupt. I have narrowed the problem down to the encoding of the data I buffer in the function stor (variable img). I verified this by piping the data directly from the response to the database POST call, which worked. I can't do that for my purpose because I need to compute the hash of the image.
I want to know if my assumptions are correct:
1. Images from the internet can be read as 'binary'.
2. You can load that data onto a buffer as 'binary'.
3. You can PUT the image to a store with the encoding set to 'binary'.
I think one or all of these assumptions are wrong, as I get back only corrupted data from the database.
The issue was that I was using exec, which outputs a buffer. Using spawn solved it: spawn outputs a stream, which handles binary correctly. Of course, I set the encoding to binary as well.
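For illustration, a minimal sketch of that exec-to-spawn switch, assuming an ImageMagick convert child process; the command, its arguments, and the inStream/outStream endpoints are placeholders:

const { spawn } = require('child_process');

// spawn exposes stdin/stdout as streams, so the raw bytes flow through
// without ever being decoded into a string (which is where exec's
// default utf8 buffering corrupts binary data):
const convert = spawn('convert', ['-', '-resize', '50%', '-']);
inStream.pipe(convert.stdin);   // raw image bytes in
convert.stdout.pipe(outStream); // transformed image bytes out, still binary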

nodejs gm content-length implementation hangs browser

I've written a simple image manipulation service that uses node gm on an image from an HTTP response stream. If I use Node.js's default transfer-encoding: chunked, things work just fine. But as soon as I try to add a Content-Length implementation, Node.js hangs the response or I get content-length mismatch errors.
Here's the gist of the code in question (some variables have been omitted for brevity):
var image = gm(response);

// gm getter used to get origin properties of image
image.identify({bufferStream: true}, function (error, value) {
  this.setFormat(imageFormat)
    .compress(compression)
    .resize(width, height);
  // instead of default transfer-encoding: chunked, calculate content-length
  this.toBuffer(function (err, buffer) {
    console.log(buffer.length);
    res.setHeader('Content-Length', buffer.length);
    gm(buffer).stream(function (stError, stdout, stderr) {
      stdout.pipe(res);
    });
  });
});
This spits out the desired image and a content length that looks right, but the browser hangs, suggesting there's a mismatch or something else wrong. I'm using node gm 1.9.0.
I've seen similar posts on nodejs gm content-length implementation, but I haven't seen anyone post this exact problem yet.
Thanks in advance.
I ended up changing my approach. Instead of using this.toBuffer(), I save the new file to disk with this.write(fileName, callback), then read it with fs.createReadStream(fileName) and pipe it to the response. Something like:
var filePath = './output/' + req.param('id') + '.' + imageFormat;

this.write(filePath, function (writeErr) {
  var stat = fs.statSync(filePath);
  res.writeHead(200, {
    'Content-Type': 'image/' + imageFormat,
    'Content-Length': stat.size
  });
  var readStream = fs.createReadStream(filePath);
  readStream.pipe(res);
  // async delete the file from filesystem
  ...
});
You end up getting all of the headers you need, including your new content-length, to return to the client.
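A hypothetical alternative that avoids the temp file (untested against gm 1.9.0): once toBuffer() has produced the finished image, send the buffer directly, so the Content-Length always matches the bytes written. The original hang likely came from re-streaming through gm(buffer).stream(), which re-encodes the image and can change its length.

this.toBuffer(function (err, buffer) {
  if (err) return res.end();
  res.writeHead(200, {
    'Content-Type': 'image/' + imageFormat,
    'Content-Length': buffer.length
  });
  res.end(buffer); // the length is guaranteed to match the header
});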
