POST request to retrieve pdf in node.js

I am making a POST request to retrieve a PDF. The request works fine if I do it in Postman, but I get an empty PDF if I do it through Node.js using the request package. Here's my request using the request package:
const fs = require("fs");
// Assumption: "request" here is a promise-returning wrapper such as request-promise, since .then() is used below.
const request = require("request-promise");

let body = {
    attr1: "attr1",
    attr2: "attr2"
};

let opts = {
    url: "some_url",
    method: "post",
    headers: {
        "Content-Type": "application/x-www-form-urlencoded",
    },
    body
};

request(opts).then(pdf => {
    console.log(pdf); // prints out the binary version of the pdf file
    fs.writeFileSync("testing.pdf", pdf);
});
I use the exact same request parameters when I use Postman, but it returns the PDF with the correct content.
Can someone help? Or is the way I am saving my PDF incorrect?
Thanks in advance!

Solution: I had to set encoding: false in the request options.
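For reference, here is a minimal sketch of that fix, based on the snippet above (some_url is the question's placeholder; the form option is used to match the urlencoded Content-Type). Per the request documentation, setting encoding to null returns the body as a Buffer instead of a decoded string, which is what keeps the PDF intact on write:
const fs = require("fs");
const request = require("request-promise"); // assumed promise wrapper, as in the question

const opts = {
    url: "some_url",                          // placeholder from the question
    method: "POST",
    form: { attr1: "attr1", attr2: "attr2" }, // sends application/x-www-form-urlencoded
    encoding: null                            // return the raw body as a Buffer, not a string
};

request(opts).then(pdf => {
    // pdf is a Buffer here, so writeFileSync stores the bytes without re-encoding them
    fs.writeFileSync("testing.pdf", pdf);
});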

Try
fs.writeFileSync("testing.pdf", pdf, 'binary');
The third argument here tells fs to write binary rather than trying to UTF-8 encode it.

According to the docs the third parameter should be a string that represents the encoding.
For pdf files the encoding is 'application/pdf'
So this should work for you: fs.writeFileSync("testing.pdf", pdf, 'application/pdf');

Related

Getting a bad request 400 when trying to upload zipped wkt file to here-maps rest api

We have a problem with fleet.ls.hereapi.com when uploading a new layer for geofencing.
const JSZip = require('jszip');       // assumed imports, based on the calls below
const FormData = require('form-data');
const axios = require('axios');

const zip = new JSZip();
const myId = 'MYLAYER'; // just an id to check
zip.file('data.wkt', `NAME\tWKT\n${myId}\t${wkt}`);
const content = await zip.generateAsync({ type: 'nodebuffer' });

const formData = new FormData();
formData.append('zipfile', content);

await axios.post(config.HERE_MAPS_UPLOAD_ENDPOINT, formData, {
    headers: {
        'content-type': 'multipart/form-data',
    },
    params: {
        apiKey: config.HERE_MAPS_REST_API_KEY,
        layer_id: myId,
    },
});
We get a bad request without a message and do not know what the problem is. The same implementation works in the frontend (with 'blob' as the zip type). Is there a parameter to get a better error message from the API?
We got the instructions on how to implement it from this tutorial: https://www.youtube.com/watch?v=Ayw9GcS1V-8 and, as I mentioned, it works fine in the frontend. It also works if I write a file in Node and upload it via curl. Thank you for any help in advance!
Edit: I'm getting the following issue from the API: 'Multipart should contain exactly one part but contains 0'
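Not part of the original post, but one generic way to surface that kind of message: axios attaches the server's reply to err.response when a request fails, so logging it shows the API's error body (the snippet below extends the upload call above):
try {
    await axios.post(config.HERE_MAPS_UPLOAD_ENDPOINT, formData, { /* same options as above */ });
} catch (err) {
    if (err.response) {
        // the API answered with an error status; its body often contains the real reason
        console.error(err.response.status, err.response.data);
    } else {
        console.error(err.message);
    }
}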
I fixed it!
The problem was that the API needed a filename for the form data. This filename can be provided as the third parameter, as described here.
So I basically changed formData.append('zipfile', content); to formData.append('zipfile', content, 'zipfile.zip'); and it worked.
Hope this will help somebody in the future!
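A minimal sketch of the corrected append (the filename string is the fix; the getHeaders() line is not from the original post, but form-data can generate the multipart headers, including the boundary, which is often needed when posting from Node):
const FormData = require('form-data');

const formData = new FormData();
// the third argument supplies the filename the API was missing
formData.append('zipfile', content, 'zipfile.zip');

// optional, not from the original post: let form-data build the multipart headers itself
// await axios.post(config.HERE_MAPS_UPLOAD_ENDPOINT, formData,
//     { headers: formData.getHeaders(), params: { apiKey: config.HERE_MAPS_REST_API_KEY, layer_id: myId } });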

AWS Lambda return both a base64 encoded data and a text string in body response

I am using an AWS Lambda implementation with Node.js to generate a PDF file. I have the following callback that returns the PDF as a base64-encoded result. This works great for me:
return callback(null, {
    statusCode: 200,
    body: new Buffer(data).toString('base64'),
    isBase64Encoded: true,
    headers: {
        'Content-Type': 'text',
    },
})
However, I would like to add further information to my response - not just the base64-encoded PDF data, but also some string results that I can use further down in the application connected to this Lambda function. I'd like to return both the base64 data and the string data, something like this:
return callback(null, {
    statusCode: 200,
    body: JSON.stringify(
        {
            message: 'hello world',
            report: new Buffer(data).toString('base64')
        }
    ),
    isBase64Encoded: true,
    headers: {
        'Content-Type': 'text',
    },
})
But this is failing for me. How would I refactor the above to return both string data and base64 data? I'm also having to force the isBase64Encoded setting to true, which may clash with my new requirement to return both base64 and normal string data.
The Content-Type of your response is not text - since you are returning JSON, an application/json value would make more sense and might alleviate some of the issues you are having. It would be helpful if the post could be updated with some more relevant details of the errors you are encountering.
One other possible work-around would be to add the message (and any other string values) as HTTP headers on the base64 encoded response you already have working. Then your client can decode the HTTP response whose body contains the base64 encoded PDF and HTTP headers x-custom-message (or something similar) set to hello world.
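A minimal sketch of the first suggestion (assuming an API Gateway proxy-style response, as the question implies): return a JSON body with the base64 report as one field, set Content-Type to application/json, and leave isBase64Encoded off, because the body itself is plain JSON text rather than base64. Buffer.from is used here in place of the deprecated new Buffer:
return callback(null, {
    statusCode: 200,
    headers: {
        'Content-Type': 'application/json',
    },
    // the body is a JSON string, so it must not be flagged as base64-encoded;
    // the PDF bytes travel inside the "report" field as a base64 string
    isBase64Encoded: false,
    body: JSON.stringify({
        message: 'hello world',
        report: Buffer.from(data).toString('base64'),
    }),
});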

How to handle JPEG data received in HTTP res.body?

I'm using the request.js node module to make a GET request for an image. The body I get back looks like this:
body: 'u0014�����T���8�\u00029�\u001fZ\u0000m(\u0007�\u001d�A\u0014�9E9Oz#E8s��`8d�x�j`�<rq... etc'
How do I read that as a JPEG?
What I'm doing is just forwarding that content as a PUT request to another endpoint. This is working, except that the image data is not readable at the new URL (which is a CouchDB document attachment).
My PUT request looks like this:
request({
    url: newDocUrl + '/' + aName + "?rev=" + resRev,
    method: 'PUT',
    headers: headersAttachment, // {'Content-Type': 'image/jpeg'}
    body: attachment
}, function(e, r, b) {
    console.log('body', b);
});
Questions: How do I read JPEG data from an HTTP res? What format should JPEG data be to forward an image? (i.e. base64, hex, something else?)
I found the answer here:
Node.js get image from web and encode with base64
I didn't know that request.js encodes response bodies. By turning that off, the image was posted fine.
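For illustration, a sketch of that approach (imageUrl is a placeholder; newDocUrl, aName, resRev and headersAttachment come from the question): with the request package, setting encoding to null keeps the downloaded body as a Buffer, which can then be forwarded unchanged in the PUT:
const request = require('request');

request({ url: imageUrl, encoding: null }, function (err, res, imageBuffer) {
    // imageBuffer is a Buffer of raw JPEG bytes, not a mangled UTF-8 string
    request({
        url: newDocUrl + '/' + aName + '?rev=' + resRev,
        method: 'PUT',
        headers: headersAttachment, // {'Content-Type': 'image/jpeg'}
        body: imageBuffer
    }, function (e, r, b) {
        console.log('body', b);
    });
});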

File Corruption when Uploading Excel files to Microsoft Graph API Beta

We're trying to upload Microsoft Excel file to OneDrive but the file gets corrupted every time we do so.
We've tried using [these instructions] to make a PUT request to the following permutations of settings:
Content-Types:
text/plain
application/vnd.openxmlformats-officedocument.spreadsheetml.sheet
POST bodies:
XLSX file bytes raw from disk
XLSX file encoded as UTF8 string
XLSX file encoded as base64
If we download the file that gets uploaded, it looks almost the same, but a few binary regions are different.
If you feel comfortable opening an Excel file off the internet, I've uploaded an example of the file we upload and the corrupted file OneDrive saves.
This has all the smell of a bug that can be fixed with a single parameter modification... I just can't figure out what it is.
Anyone have thoughts? Thanks!
Thanks #GSM. Here's our code in TypeScript.
// FileSystem is assumed to be Node's fs module
var fileContent = FileSystem.readFileSync(localFile);
var url = `https://graph.microsoft.com/beta/me/drive/root/children/${doc.name}.xlsx:/content`;
var opts = {
    url: url,
    method: 'PUT',
    headers: {
        'Content-Type': 'text/plain',
        'Authorization': token
    },
    body: fileContent
};
var requestOpts = {
    url: `https://${domain}${opts.path}`,
    method: opts.method,
    headers: {},
};
request(opts, cb);
The only difference I see is that you're using an alternate path to upload the file, which is also documented on the Graph API page. If we use the path you're using, we get back the error message:
{
    "error": {
        "code": "BadRequest",
        "message": "Entity only allows writes with a JSON Content-Type header.",
        "innerError": {
            "request-id": "2a2e7588-3217-4337-bee3-f8aff208510c",
            "date": "2016-05-30T16:35:50"
        }
    }
}
...which is strange, because it suggests your code shouldn't have worked either.
Update -- the answer
By reading the file into a string and then writing it to the JSON object that defined the PUT parameters, we were corrupting it. We solved the problem by simply piping a file read stream right to the open HTTP request.
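A minimal sketch of that streaming approach, assuming the request package and the url and token variables from the snippet above (the Content-Type here is an assumption; the point is that the bytes flow from disk to the socket without ever becoming a JavaScript string):
var fs = require('fs');
var request = require('request');

fs.createReadStream(localFile).pipe(request({
    url: url,            // the .../content upload URL from above
    method: 'PUT',
    headers: {
        'Content-Type': 'application/octet-stream', // assumed; raw bytes, no text conversion
        'Authorization': token
    }
}, cb));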
It would be easier to help if you posted your code.
However, here's some code that can be used to upload files to OneDrive. I tested it with your file and was able to upload and download just fine:
using (var client = new HttpClient())
{
    client.DefaultRequestHeaders.Add("Authorization", "Bearer " + t.AccessToken);
    var byteContent = File.ReadAllBytes(@"C:\Temp\sheet-uploaded.xlsx");
    var url = resource + "beta/me/drive/root:/Documents/sheet-uploaded.xlsx:/content";
    var result = client.PutAsync(url, new ByteArrayContent(byteContent)).Result;
    result.Content.ReadAsStringAsync().Dump();
}

Response encoding with node.js "request" module

I am trying to get data from the Bing search API, and since the existing libraries seem to be based on old, discontinued APIs I thought I'd try it myself using the request library, which appears to be the most common choice. My code looks like this:
var SKEY = "myKey...." ,
ServiceRootURL = 'https://api.datamarket.azure.com/Bing/Search/v1/Composite';
function getBingData(query, top, skip, cb) {
var params = {
Sources: "'web'",
Query: "'"+query+"'",
'$format': "JSON",
'$top': top, '$skip': skip
},
req = request.get(ServiceRootURL).auth(SKEY, SKEY, false).qs(params);
request(req, cb)
}
getBingData("bookline.hu", 50, 0, someCallbackWhichParsesTheBody)
Bing returns some JSON and I can work with it sometimes, but if the response body contains a large amount of non-ASCII characters, JSON.parse complains that the string is malformed. I tried switching to an ATOM content type, but there was no difference; the XML was invalid. Inspecting the response body available in the request() callback actually shows garbled characters.
So I tried the same request with some python code, and that appears to work fine all the time. For reference:
r = requests.get(
    'https://api.datamarket.azure.com/Bing/Search/v1/Composite?Sources=%27web%27&Query=%27sexy%20cosplay%20girls%27&$format=json',
    auth=HTTPBasicAuth(SKEY, SKEY))
stuffWithResponse(r.json())
I am unable to reproduce the problem with smaller responses (e.g. limiting the number of results) and unable to identify a single result which causes the issue (by stepping up the offset).
My impression is that the response gets read in chunks, transcoded somehow and reassembled back in a bad way, which means the json/atom data becomes invalid if some multibyte character gets split, which happens on larger responses but not small ones.
Being new to node, I am not sure if there is something I should be doing (setting the encoding somewhere? Bing returns UTF-8, so this doesn't seem needed).
Anyone has any idea of what is going on?
FWIW, I'm on OSX 10.8, node is v0.8.20 installed via macports, request is v2.14.0 installed via npm.
I'm not sure about the request library, but the default Node.js one works well for me. It also seems a lot easier to read than your library and does indeed come back in chunks.
http://nodejs.org/api/http.html#http_http_request_options_callback
or for https (like your req) http://nodejs.org/api/https.html#https_https_request_options_callback (the same really though)
For the options, a little tip: use url.parse:
var url = require('url');
var params = '{}';
var dataURL = url.parse(ServiceRootURL);
var post_options = {
    hostname: dataURL.hostname,
    port: dataURL.port || 80,
    path: dataURL.path,
    method: 'GET',
    headers: {
        'Content-Type': 'application/json; charset=utf-8',
        'Content-Length': params.length
    }
};
Obviously, params needs to be the data you want to send.
I think your request authentication is incorrect. Authentication has to be provided before request.get.
See the documentation for request HTTP authentication. qs is an object that has to be passed to request options just like url and auth.
Also, you are using the same req for a second request. You should know that request.get returns a stream for the GET of the given url. Your next request using req will go wrong.
If you only need HTTPBasicAuth, this should also work
//remove req = request.get and subsequent request
request.get('http://some.server.com/', {
    'auth': {
        'user': 'username',
        'pass': 'password',
        'sendImmediately': false
    }
}, function (error, response, body) {
});
The callback argument gets 3 arguments. The first is an error when applicable (usually from the http.Client option not the http.ClientRequest object). The second is an http.ClientResponse object. The third is the response body String or Buffer.
The second object is the response stream. To use it you must use events 'data', 'end', 'error' and 'close'.
Be sure to use the arguments correctly.
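Not from the original answer, but a sketch of consuming that response stream safely with the built-in https module: collecting raw Buffer chunks and decoding them once at the end avoids splitting multibyte UTF-8 characters across chunk boundaries, which matches the suspicion in the question (post_options is the options object from the answer above):
var https = require('https');

var req = https.get(post_options, function (res) {
    var chunks = [];
    res.on('data', function (chunk) {
        chunks.push(chunk); // keep the raw Buffer; don't decode per chunk
    });
    res.on('end', function () {
        // decode once over the whole body so multibyte characters stay intact
        var body = Buffer.concat(chunks).toString('utf8');
        var data = JSON.parse(body);
        // ... use data ...
    });
});
req.on('error', function (err) {
    console.error(err);
});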
You have to pass the option {json: true} to enable JSON parsing of the response.
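A minimal sketch of that with the request package (ServiceRootURL, params and SKEY are the names from the question): with json: true, request parses the JSON response itself, so the callback's body argument is already an object:
var request = require('request');

request({
    url: ServiceRootURL,
    qs: params,
    json: true, // parse the JSON response body automatically
    auth: { user: SKEY, pass: SKEY, sendImmediately: false }
}, function (error, response, body) {
    // body is already a parsed object here
    console.log(body);
});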
