serverless express can't retrieve pdf file (base64 encoding) - node.js

I have set up an express/serverless application to retrieve a PDF file on a GET request, but all I get back is a corrupted PDF response. I'm wondering whether my settings are correct to achieve a proper response.
I'm using aws-serverless-express and want to return my PDF buffer to the client browser (it should open in the browser).
My code:
const status = 200;
const fileName = "demo.pdf";
res.setHeader('Content-Disposition', 'inline; filename="' + fileName + '"');
res.setHeader('Content-Type', 'application/pdf');
res.setHeader('isBase64Encoded', true); // isBase64Encoded: true
const pdf = pdfBuffer.toString('base64');
res.status(status).send(pdf);
So I'm sending a base64-encoded string to API Gateway. I'm not actually sure whether I can set the isBase64Encoded flag via a header; I read that somewhere before, but I'm not certain about it.
I have done this whole procedure before, but without aws-serverless-express (where I could set the isBase64Encoded flag easily).
I'm also using serverless-apigw-binary to automatically set up API Gateway for the correct decoding of the base64-encoded data.

Lambda is automatically encoding the response to base64, so I had to remove my own encoding and send the buffer directly.
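A minimal sketch of sending the buffer directly, assuming aws-serverless-express with 'application/pdf' registered as a binary MIME type (the /pdf route and demo.pdf file are placeholders):
const fs = require('fs');
const express = require('express');
const awsServerlessExpress = require('aws-serverless-express');
const app = express();

app.get('/pdf', (req, res) => {
    const pdfBuffer = fs.readFileSync('demo.pdf'); // placeholder for the generated PDF
    res.setHeader('Content-Disposition', 'inline; filename="demo.pdf"');
    res.setHeader('Content-Type', 'application/pdf');
    // Send the raw buffer: because 'application/pdf' is in the binary MIME type list,
    // aws-serverless-express base64-encodes the body and sets isBase64Encoded on the proxy response.
    res.status(200).send(pdfBuffer);
});

// The third argument is the list of MIME types to treat as binary.
const server = awsServerlessExpress.createServer(app, null, ['application/pdf']);
exports.handler = (event, context) => awsServerlessExpress.proxy(server, event, context);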

I came across a similar problem while using serverless Express. API Gateway needs an HTTP response like this:
{
    "headers": { ... },
    "body": "/9j/4AAQSkZ...",
    "isBase64Encoded": true
}
It's useless to put isBase64Encoded in the HTTP response headers; API Gateway only checks for isBase64Encoded outside of the HTTP headers. If it's true, it decodes the body before sending it to the HTTP client.
Express (its response object) doesn't seem to allow adding anything outside the HTTP headers of an HTTP response.
If you don't want to give up using Express, the workaround is to do the decoding in the browser, which is pretty easy:
var decodedData = Buffer.from(body, 'base64');
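Note that Buffer is a Node API, so in a browser it needs a bundler polyfill. A plain-browser sketch of the same decode, assuming the response body arrives as a base64 string named body:
// Turn the base64 body into bytes, then into a Blob the browser can open as a PDF.
const byteString = atob(body);
const bytes = new Uint8Array(byteString.length);
for (let i = 0; i < byteString.length; i++) {
    bytes[i] = byteString.charCodeAt(i);
}
const blob = new Blob([bytes], { type: 'application/pdf' });
window.open(URL.createObjectURL(blob));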

Related

How to read a file from form-data in an AWS Lambda function

I am trying to send a PDF or text file to an AWS Lambda function (Node.js). I'd like to be able to process this file in the Lambda function. I know the best practice is often to use a trigger from an S3 bucket; however, I'd like to be able to send this Lambda function a file from form data, extract information from the file, and then return the extracted info.
I have been able to base64-encode the file and send it to AWS Lambda in a JSON payload, but when I try to decode the file (especially a PDF) in the Lambda function, it is often corrupted or empty.
Image files seem to work well with this type of encoding, but I have been unsuccessful with PDFs. Any help greatly appreciated. Rather than encoding in base64, is there a way I can obtain the file from form data? My code is below:
export const handler = async (event) => {
    console.log("event", event);
    var converted = atob(event.body); // RATHER: HOW WOULD I READ A FILE FROM FORM DATA?
    const response = {
        "statusCode": 200,
        "headers": {
            "Content-Type": "text/html", // "application/pdf", // "multipart/form-data",
            "Access-Control-Allow-Origin": "*",
            "Access-Control-Allow-Headers": "Content-Type,X-Amz-Date,Authorization,X-Api-Key,X-Amz-Security-Token"
        },
        "body": event,
        "isBase64Encoded": true
    };
    return response;
};
thanks so much
I assume that you are using an API Gateway to trigger the Lambda function. In that case, you have to enable multipart/form-data in API Gateway as mentioned in the documentation. The gist of the documentation is as follows:
1. In the Settings pane, choose Add Binary Media Type in the Binary Media Types section. Type a required media type, for example image/png, in the input text field.
2. Add Content-Type and Accept to the request headers for your proxy method.
3. Add those same headers to the integration request headers.
4. Deploy the API.
PS: If you are using Lambda proxy integration ({proxy+}), just steps 1 and 4 are enough.
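Once the binary media type is configured, the Lambda receives the multipart body base64-encoded. A rough sketch of reading the uploaded file out of it, assuming the busboy package is bundled with the function (the field names and response shape are placeholders):
import busboy from 'busboy';

// Parse a multipart/form-data body delivered by API Gateway and return the uploaded files as Buffers.
function parseMultipart(event) {
    return new Promise((resolve, reject) => {
        const contentType = event.headers['content-type'] || event.headers['Content-Type'];
        const bb = busboy({ headers: { 'content-type': contentType } });
        const files = [];
        bb.on('file', (name, stream, info) => {
            const chunks = [];
            stream.on('data', (chunk) => chunks.push(chunk));
            stream.on('close', () => files.push({ field: name, filename: info.filename, data: Buffer.concat(chunks) }));
        });
        bb.on('close', () => resolve(files));
        bb.on('error', reject);
        // API Gateway hands the raw body over base64-encoded when binary media types are enabled.
        bb.end(Buffer.from(event.body, event.isBase64Encoded ? 'base64' : 'utf8'));
    });
}

export const handler = async (event) => {
    const [file] = await parseMultipart(event);
    console.log('received', file.filename, file.data.length, 'bytes');
    return { statusCode: 200, body: JSON.stringify({ filename: file.filename, size: file.data.length }) };
};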

Firebase bucket / axios url in request body decoding issue

Interesting issue when using Firebase buckets and axios in a JS environment.
When I upload a file into a bucket and send the file link returned by Firebase to the server in a request body, the link gets auto-decoded on the server.
1. Upload a file to the bucket from the web.
2. Firebase returns a link: https://firebasestorage.googleapis.com/v0/b/[BUCKET_NAME]/o/[POINTER]%2Fimages%2F[FILE_NAME] (note the URL-encoded %2F that Firebase uses around 'images').
3. Save this to the DB via a Cloud Function call using axios.post(), with headers: {'Content-Type': 'application/x-www-form-urlencoded'} due to Cloud Function limitations here. The URL is nested in a JSON object as a string.
4. When this request is picked up in the Cloud Function, the URL in the object has been automatically URL-decoded, resulting in: https://firebasestorage.googleapis.com/v0/b/[BUCKET_NAME]/o/[POINTER]/images/[FILE_NAME] (note the plain / around 'images').
Problem: Firebase doesn't return the file when %2F is replaced with / in the URL; it only returns the error:
Invalid HTTP method/URL pair.
I understand that my only option here is to prevent this string from being URL-decoded during the client-server axios call. Since I am using the headers mentioned above, I'm not sure how this can be achieved.
Side quest: why does Firebase enforce the URL encoding this strictly and not return the file regardless of how the path is represented (encoded or not)?
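One way to keep the decode from mangling the link is to form-encode the value once more on the client, so the server-side decode of the x-www-form-urlencoded body yields the original %2F instead of /. A rough sketch, assuming axios and a hypothetical Cloud Function endpoint and field name:
const axios = require('axios');

// Hypothetical endpoint; the download URL keeps the placeholders used above.
const FUNCTION_URL = 'https://us-central1-[PROJECT_ID].cloudfunctions.net/saveFileUrl';
const downloadUrl = 'https://firebasestorage.googleapis.com/v0/b/[BUCKET_NAME]/o/[POINTER]%2Fimages%2F[FILE_NAME]';

// URLSearchParams percent-encodes the value, so the literal '%2F' goes over the wire
// as '%252F' and comes back out of the server-side form decode as '%2F'.
const body = new URLSearchParams({ fileUrl: downloadUrl });

axios.post(FUNCTION_URL, body.toString(), {
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
}).then((response) => console.log(response.status));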

node-http-proxy How to proxy pdfkit response?

I'm trying to figure out how to proxy a request for a PDF file generated at runtime with pdfkit.
The response headers of the backend service are set to:
res.setHeader('Content-type', 'application/pdf');
// only if req.params.view != undefined
res.setHeader('Content-disposition', 'attachment; filename=' + req.params.template + '_' + id + '.pdf');
This allows viewing the PDF inside the browser (by sending only the first header) or downloading it (by also sending the second header).
While pdfkit generates the file, it is piped to the response.
If I contact the backend directly it works, but going through the proxy raises an ECONNRESET error.
I guess it might depend on the client terminating the request before receiving the chunked response. How can I let the backend service keep piping the response while it is still generating the PDF?
I'm surely missing something... thank you!!
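For context, a stripped-down sketch of the backend route as described above (Express and pdfkit assumed; the route shape and document contents are placeholders):
const express = require('express');
const PDFDocument = require('pdfkit');
const app = express();

app.get('/pdf/:template/:id/:view?', (req, res) => {
    res.setHeader('Content-Type', 'application/pdf');
    // only if req.params.view != undefined
    if (req.params.view !== undefined) {
        res.setHeader('Content-Disposition', 'attachment; filename=' + req.params.template + '_' + req.params.id + '.pdf');
    }
    // The document is streamed straight into the (chunked) response while it is being generated.
    const doc = new PDFDocument();
    doc.pipe(res);
    doc.text('placeholder content for template ' + req.params.template);
    doc.end();
});

app.listen(3000);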

Node.js: How to stream remote file through my server to the user?

There's a large binary file at somewhere.com/noo.bin.
I want to send this to the user of my web app.
I don't want to save this file on my server and then serve it; I'm wondering if there's a way to stream the file so that my web app acts as a proxy (the file would appear at mysite.com/noo.bin).
Install request, then:
var http = require('http'),
    request = require('request'),
    remote = 'http://somewhere.com';

http.createServer(function (req, res) {
    // e.g. http://somewhere.com/noo.bin
    var remoteUrl = remote + req.url;
    request(remoteUrl).pipe(res);
}).listen(8080);
Though I would have written exactly #LaurentPerrin's answer myself, for completeness' sake I should say this:
The drawback of that method is that the request headers you send to somewhere.com are unrelated to the request headers your server received. For example: if the request sent to you has a specific value for Accept-Language, it is likely that (as the code stands) you are not going to specify the same value for Accept-Language when proxying to somewhere.com. Thus the resource might be returned to you (and then from you to the original requester) in the wrong language.
Or, if the request to you comes in with Accept-Encoding: gzip, the current code will fetch the large file uncompressed and stream it back uncompressed, when you could have saved bandwidth and time by accepting and streaming back a compressed file.
This may or may not be relevant to you.
If there are important headers you feel you need to pass, you could either add some code to explicitly copy them from your request to the request you send to somewhere.com (and then copy relevant response headers back), or use node-http-proxy from https://github.com/nodejitsu/node-http-proxy.
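A rough sketch of the first option (copying a few headers by hand, still with the request package; which headers matter depends on your use case):
var http = require('http');
var request = require('request');
var remote = 'http://somewhere.com';

http.createServer(function (req, res) {
    var proxied = request({
        url: remote + req.url,
        headers: {
            // Forward the content negotiation headers mentioned above.
            'accept-language': req.headers['accept-language'],
            'accept-encoding': req.headers['accept-encoding']
        }
    });

    // Copy the relevant response headers back before streaming the body through.
    proxied.on('response', function (remoteRes) {
        ['content-type', 'content-length', 'content-encoding', 'content-language'].forEach(function (name) {
            if (remoteRes.headers[name]) res.setHeader(name, remoteRes.headers[name]);
        });
        res.statusCode = remoteRes.statusCode;
    });

    proxied.pipe(res);
}).listen(8080);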
An example of a forward proxy using node-http-proxy is https://github.com/nodejitsu/node-http-proxy/blob/master/examples/http/forward-proxy.js

Express (node.js) seems to be replacing my content-type with application/json

I've written an Express web app route that is essentially a proxy: it pipes the contents of an input stream (a zip file) into the server response's output stream.
I'd like the browser to prompt the user for a download, or whatever is most appropriate for a zip file. However, when I load this route in a browser, the contents of the input stream (the zip file's contents) show up in the browser window as text rather than prompting a download.
This is the code sending the response:
res.statusCode = 200;
res.setHeader('Content-Length', size);
res.setHeader('Content-Type', 'application/zip');
console.log("content-type is " + res.header('Content-Type'));
inputStream.pipe(res);
The console.log statement above outputs "content-type is application/zip".
However, when I examine the request in Chrome's network tab, I see that the response's Content-Type is "application/json". This implies that Express, or something else, is overriding my Content-Type header, or has perhaps already sent it.
Does anyone know what is changing the Content-Type on me, and how I can make sure the Content-Type is the one I set?
Thanks for any help.
You should check the order of your middleware; it's really tricky, and things can break if they are not in the correct order.
You can check the correct order on the Connect webpage.
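As an illustration only (a hypothetical app, not the asker's code), this is the kind of ordering problem meant above: a middleware registered earlier that already answers with JSON wins, and the Content-Type set inside the route never reaches the client.
const express = require('express');
const fs = require('fs');
const app = express();

// If a middleware like this runs first and responds, the zip route below
// never gets the chance to set its own headers.
app.use((req, res, next) => {
    if (!req.query.token) {
        return res.json({ error: 'missing token' }); // sends Content-Type: application/json
    }
    next();
});

app.get('/download', (req, res) => {
    res.statusCode = 200;
    res.setHeader('Content-Type', 'application/zip');
    fs.createReadStream('archive.zip').pipe(res); // hypothetical input stream
});

app.listen(3000);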
