Firebase bucket / axios url in request body decoding issue - node.js

Here's an interesting issue when using Firebase buckets and axios in a JS environment.
When I upload a file to a bucket and send the file link returned by Firebase to the server in a request body, the link is automatically URL-decoded on the server.
Upload a file to bucket from web
Firebase returns a link: https://firebasestorage.googleapis.com/v0/b/[BUCKET_NAME]/o/[POINTER]%2Fimages%2F[FILE_NAME]
Note the URL-encoded %2F that Firebase uses around 'images'.
Save this to DB via a Cloud Function call by using axios.post()
Using headers: {'Content-Type': 'application/x-www-form-urlencoded'} due to Cloud Function limitations here. The URL is nested in a JSON object as a string.
When this request is picked up in the Cloud Function, the URL in the object has been automatically urldecoded, resulting in:
https://firebasestorage.googleapis.com/v0/b/[BUCKET_NAME]/o/[POINTER]/images/[FILE_NAME]
Note the plain / around 'images'.
Problem: Firebase doesn't return the file when %2F is replaced with / in the URL, only returning the error:
Invalid HTTP method/URL pair.
I understand that I have only one option here: prevent this string from being URL-decoded during the client-server axios call. Since I am using the headers mentioned above, I'm not sure how this can be achieved.
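A minimal workaround sketch (the variable names here are assumptions for illustration): URL-encode the link once more on the client, so the single automatic decode on the server restores the original %2F-encoded form instead of turning %2F into /.

```javascript
const fileUrl =
  'https://firebasestorage.googleapis.com/v0/b/my-bucket/o/abc%2Fimages%2Fpic.png';

// Client side: pre-encode once before the value goes into the
// x-www-form-urlencoded body, e.g. axios.post(url, { link: sent }, ...).
const sent = encodeURIComponent(fileUrl);

// Server side: the auto-decode that currently mangles the link now simply
// undoes the extra encoding and yields the usable Firebase URL.
const receivedOnServer = decodeURIComponent(sent);

console.log(receivedOnServer === fileUrl); // true
console.log(receivedOnServer.includes('%2F')); // true
```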
Side quest: Why does Firebase enforce the URL encoding this strictly and not return the file regardless of how the path is represented (encoded or not)?

Related

how to read a file from form-Data in an AWS Lambda function

I am trying to send a PDF or text file to an AWS Lambda function (Node.js). I'd like to be able to process this file in the Lambda function. I know the best practice is often to use a trigger function from an S3 bucket. However, I'd like to be able to send this Lambda function a file from form data, extract information from the file, and then return the extracted info.
I have been able to encode the file as base64 data and send it to AWS Lambda via JSON, but often when I try to decode the file (especially a PDF) in the Lambda function it is corrupted or empty.
Image files seem to work well for this type of encoding, but I've been unsuccessful with PDFs. Any help greatly appreciated. Rather than encode in base64, is there a way I can obtain the file from form data? My code is below:
export const handler = async (event) => {
  console.log("event", event);
  var converted = atob(event.body); // RATHER, HOW WOULD I READ A FILE FROM FORMDATA?
  const response = {
    "statusCode": 200,
    "headers": {
      "Content-Type": "text/html", // "application/pdf", // "multipart/form-data",
      "Access-Control-Allow-Origin": "*",
      "Access-Control-Allow-Headers": "Content-Type,X-Amz-Date,Authorization,X-Api-Key,X-Amz-Security-Token"
    },
    "body": event,
    "isBase64Encoded": true
  };
  return response;
};
thanks so much
I assume that you are using API Gateway to trigger the Lambda function. In that case, you have to enable multipart/form-data in API Gateway as mentioned in the documentation. The gist of the documentation is as follows:
1. In the Settings pane, choose Add Binary Media Type in the Binary Media Types section. Type a required media type, for example, image/png, in the input text field.
2. Add Content-Type and Accept to the request headers for your proxy method.
3. Add those same headers to the integration request headers.
4. Deploy the API.
PS: If you are using Lambda Proxy integration ({proxy+}), just steps 1 and 4 are enough.

how to create a readable stream from a remote URL and pass it into another API without saving it into the local file system in nodejs using got?

I want to access an image from a remote URL and pass it into another API request as request body.
I am using the got library's stream API to stream the data from the external URL.
const url = "https://media0.giphy.com/media/4SS0kfzRqfBf2/giphy.gif";
got.stream(url).pipe(createWriteStream('image.gif'));
const response = await axios.post('post_url', fs.createReadStream('image.gif'));
The download operation is working as expected, but I don't want to write the data to the local file system. Instead, I would like to pass the response from the got stream API to another API as a request body.

encode request.query in express

I wrote a simple API to return request.query in the response.
Sample Url:
http://localhost:8082/redirect?requesttype=click&id=79992&redirectto=http://localhost:8081/redirect?name=john&id=123
This returns the following response,
but on passing the encoded URL things work as expected (screenshot attached):
http://localhost:8082/redirect?requesttype=click&id=79992&redirectto=http%3A%2F%2Flocalhost%3A8081%2Fredirect%3Fname%3Djohn%26id%3D123
But our API is used by various customers. Is there any method in Express.js by which I can automatically encode the URL and serve the request as expected?
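Express can't re-encode on the caller's behalf, because by the time the query string has been parsed into req.query the nested parameters have already been split off. One workaround sketch (the helper name is an assumption): take the raw URL, which Express exposes as req.originalUrl, and treat everything after `redirectto=` as the target URL.

```javascript
// Recover an unencoded nested URL from the raw request URL, so inner
// query parameters like "?name=john&id=123" stay attached to it.
function extractRedirectTo(originalUrl) {
  const marker = 'redirectto=';
  const at = originalUrl.indexOf(marker);
  if (at === -1) return null;
  return originalUrl.slice(at + marker.length);
}

const url =
  '/redirect?requesttype=click&id=79992&redirectto=http://localhost:8081/redirect?name=john&id=123';
console.log(extractRedirectTo(url));
// http://localhost:8081/redirect?name=john&id=123
```

This only works when the nested URL is the last parameter; clients that can be changed should send it with encodeURIComponent instead.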

serverless express can't retrieve pdf file (base64 encoding)

I have set up an express/serverless application to retrieve a PDF file on a GET request, but I just retrieve a corrupted PDF response. I'm wondering if my settings are correct to achieve a correct response.
I'm using aws-serverless-express and want to return my pdf buffer to the client browser (it should open in the browser)
My code:
status = 200;
let fileName = "demo.pdf";
res.setHeader('Content-disposition', 'inline; filename="' + fileName + '"');
res.setHeader('Content-type', 'application/pdf');
res.setHeader('isBase64Encoded', true);//isBase64Encoded: true
let pdf = pdfBuffer.toString('base64');
res.status(status).send(pdf);
So I'm sending a base64-encoded string to API Gateway. I'm not actually sure if I can set the isBase64Encoded flag via a header. I read this before, but I'm not so certain about that.
I have done this whole procedure before, but didn't make use of aws-serverless-express (where I Could set the isBase64Encoded flag easily)
I'm also using serverless-apigw-binary to automatically setup APIGW for the correct decoding of the base64 encoded data
Lambda is automatically encoding to base64, so I had to remove my own encoding and directly send the buffer.
I came across a similar problem while using serverless Express. AWS API Gateway needs an HTTP response like this:
{
  "headers": {....},
  "body": "/9j/4AAQSkZ...",
  "isBase64Encoded": true
}
It's useless to put isBase64Encoded in the HTTP response headers; AWS API Gateway only checks for isBase64Encoded outside the HTTP headers. If it's true, it decodes the body before sending it to the HTTP client.
Express (its response object) doesn't seem to allow adding anything outside the HTTP headers in an HTTP response.
If you don't want to give up using Express, the workaround is to do the decoding in the browser, which is pretty easy:
var decodedData = Buffer.from(body, 'base64') // Node.js; in a browser use atob(body), since Buffer is not available there

How to use aws s3 image url in node js lambda?

I am trying to use an AWS S3 image in a Node.js Lambda, but it throws the error 'no such file or directory', even though I have made that image public and all permissions are granted.
fs = require('fs');
exports.handler = function(event, context) {
  var img = fs.readFileSync('https://s3-us-west-2.amazonaws.com/php-7/pic_6.png');
  res.writeHead(200, { 'Content-Type': 'image/png' });
  res.end(img, 'binary');
};
fs is the Node.js file system core module. It is for writing and reading files on the local machine. That is why it gives you that error.
There are multiple things wrong with your code.
fs is a core module used for file operations and can't be used to access S3.
You seem to be using Express.js code in your example. In Lambda, there is no built-in res defined (unless you define it yourself) that you can use to send a response.
You need to use the methods on context or the newer callback mechanism. The context methods are used on the older Lambda Node version (0.10.42). You should be using a newer Node version (4.3.2 or 6.10), which returns the response via the callback parameter.
It seems like you are also using the API gateway, so assuming that, I'll give a few suggestions. If the client needs access to the S3 object, these are some of your options:
Read the image from S3 using the AWS SDK and return the image using the appropriate binary media type. AWS recently added support for binary data to API Gateway. See this link. OR
Send the public S3 URL to the client in your JSON response. Consider whether the S3 objects need to be public. OR
Use the S3 SDK to generate pre-signed URLs, valid for a configured duration, back to the client.
I like the pre-signed URL approach. I think you should check that out. You might also want to check the AWS lambda documentation
To get a file from S3, you need to use the path that S3 gives you. The base path is https://s3.amazonaws.com/{your-bucket-name}/{your-file-name}.
In your code, you must replace the following line:
var img = fs.readFileSync('https://s3.amazonaws.com/{your-bucket-name}/pic_6.png');
If you don't have a bucket, you should create one and grant the appropriate permissions.
