Upload multipart/form-data to S3 from Lambda (Node.js)

I want to upload a multipart/form-data file to S3 using Node.js.
I have tried various approaches, but none of them worked: I was able to write content to S3 from Lambda, but when the file was downloaded from S3 it was corrupted.
Can someone provide a working example or steps that could help me? Please suggest an alternative if you know a better one.
Thanks in advance.
Following is my Lambda code:
export const uploadFile = async event => {
  const parser = require("lambda-multipart-parser");
  const result = await parser.parse(event);
  const { content, filename, contentType } = result.files[0];
  const params = {
    Bucket: "name-of-the-bucket",
    Key: filename,
    Body: content,
    ContentDisposition: `attachment; filename="${filename}";`,
    ContentType: contentType,
    ACL: "public-read"
  };
  const res = await s3.upload(params).promise();
  return {
    statusCode: 200,
    body: JSON.stringify({
      docUrl: res.Location
    })
  };
};

If you want to upload a file through Lambda, one way is to open your AWS API Gateway console.
Go to
"API" -> {YourAPI} -> "Settings"
There you will find the "Binary Media Types" section.
Add the following media type:
multipart/form-data
Save your changes.
Then go to "Resources" -> your proxy method (e.g. "ANY") -> "Method Request" -> "HTTP Request Headers" and add the headers "Content-Type" and "Accept".
Finally, deploy your API.
For more info visit: https://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-payload-encodings-configure-with-console.html
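Once multipart/form-data is registered as a binary media type, API Gateway delivers the request body base64-encoded and sets isBase64Encoded on the proxy event. A minimal sketch (the helper name is mine, not from the source) of recovering the raw bytes before handing them to a parser:

```javascript
// Recover the raw multipart body from an API Gateway proxy event.
// When binary media types are configured, event.isBase64Encoded is true
// and event.body holds the base64-encoded bytes.
function getRawBody(event) {
  return event.isBase64Encoded
    ? Buffer.from(event.body, "base64")
    : Buffer.from(event.body);
}
```

Parsers like lambda-multipart-parser typically handle this check internally; the helper is mainly useful for verifying that the body survives the trip intact.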

There are two possible points of failure: either Lambda receives corrupted data, or you corrupt the data while sending it to S3.
Sending multipart/form-data content to Lambda is not straightforward; the API Gateway configuration described in the answer above covers it.
After you have done this and you are sure your data is correct inside Lambda, check that you send it to S3 correctly (see the S3 docs and examples for that).
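For the S3 side, a minimal sketch, assuming the file has already been parsed into a Buffer (the helper names and bucket argument are illustrative, not from the source):

```javascript
// Build S3 upload params from a parsed multipart file
// ({ filename, content, contentType }, as lambda-multipart-parser returns).
function buildUploadParams(file, bucket) {
  return {
    Bucket: bucket,
    Key: file.filename,
    Body: file.content, // must be a Buffer, not a utf8-decoded string
    ContentType: file.contentType,
  };
}

// Assumes the aws-sdk v2 client available in the Lambda runtime.
async function uploadToS3(file, bucket) {
  const AWS = require("aws-sdk");
  const s3 = new AWS.S3();
  return s3.upload(buildUploadParams(file, bucket)).promise();
}
```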

Related

Unable to send a file via an HTTP POST request because the file extension is incorrect for a Buffer retrieved from AWS S3

I am in a Node.js application (Node version 14, if that matters).
I am also using node-fetch (v2) to send my request.
I am trying to send a Send Fax Request to Documo. The documentation can be found here: https://docs.documo.com/#da8dc725-8327-470c-83e4-34b40205cfa2.
As part of the request I want to send a pdf file that I have stored in S3. I have the following code to accomplish that:
const s3Object = await s3Client.send(
  new GetObjectCommand({
    Bucket: bucket,
    Key: key,
  })
);
const formData = new FormData();
formData.append("faxNumber", faxNumber);
formData.append("coverPage", "true");
formData.append("recipientName", recipientName);
formData.append("senderName", senderName);
formData.append("subject", subject);
formData.append("notes", note);
formData.append("webhookId", webhookId);
formData.append("attachments", s3Object.Body);
// ERROR: File extension 'pdf%22&x-id=getobject' is not allowed. Allowed formats: doc,docx,fodt,gif,htm,html,jpeg,jpg,odp,ods,odt,pdf,png,ppt,pptx,rtf,tif,tiff,txt,xls,xlsx,csv
const response = await fetch(`${BASE_DOCUMO_URL}/v1/faxes`, {
  method: "POST",
  headers: {
    Authorization: DEMO_API_KEY,
    ContentType: "multipart/form-data",
  },
  body: formData,
});
I am getting the following error from Documo:
"File extension 'pdf%22&x-id=getobject' is not allowed. Allowed formats: doc,docx,fodt,gif,htm,html,jpeg,jpg,odp,ods,odt,pdf,png,ppt,pptx,rtf,tif,tiff,txt,xls,xlsx,csv"
From what I understand it looks like the GetObjectCommand from S3 is appending an x-id in the file stream which the Documo client is not happy with. Ideally I don't want to recreate this file in memory and I just want to take the result from S3 to send through my POST request. (Although I could be convinced to just do that if there is no better option or I don't need to worry about holding files in memory).
What are my options? I've tried playing around with ResponseContentDisposition in GetObjectCommand, to no avail.
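One thing worth trying, an assumption based on the error text rather than a confirmed fix: pass form-data an explicit filename (and content type) for the attachment, so it does not derive one from the S3 stream's URL and its query string. The helper below is illustrative:

```javascript
// Derive a clean filename from an S3 object key,
// e.g. "folder/report.pdf" -> "report.pdf".
function filenameFromKey(key) {
  return key.split("/").pop();
}

// Hypothetical usage with the form-data package from the question:
// formData.append("attachments", s3Object.Body, {
//   filename: filenameFromKey(key),
//   contentType: "application/pdf",
// });
```

Also note that the header in the snippet above is spelled ContentType; the real header name is Content-Type, and with form-data it is usually better to omit it entirely so the multipart boundary gets set for you.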

Lambda to S3 image upload shows a black background with white square

I am using CDK to upload an image file from a multipart form-data request to S3. There are now no errors in the console, but what is saved to S3 is a black background with a white square, which I'm sure is down to a corrupt file or something.
Any thoughts as to what I'm doing wrong?
I'm using aws-lambda-multipart-parser to parse the form data.
In my console, the actual image from the form is getting logged.
My upload file function looks like this
const uploadFile = async (image: any) => {
  const params = {
    Bucket: BUCKET_NAME,
    Key: image.filename,
    Body: image.content,
    ContentType: image.contentType,
  }
  return await S3.putObject(params).promise()
}
When I log image.content I get a log of the buffer, which seems to be the format I should be uploading the image in.
My CDK stack initialises the S3 construct like so.
const bucket = new s3.Bucket(this, "WidgetStore");
bucket.grantWrite(handler);
bucket.grantPublicAccess();
table.grantStreamRead(handler);
handler.addToRolePolicy(lambdaPolicy);
const api = new apigateway.RestApi(this, "widgets-api", {
  restApiName: "Widget Service",
  description: "This service serves widgets.",
  binaryMediaTypes: ['image/png', 'image/jpeg'],
});
Any ideas what I could be missing?
Thanks in advance
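As the other answers on this page note, the request's own content type has to be registered as a binary media type; listing only the image types may leave the multipart body mangled before the parser sees it. A sketch of the CDK change, offered as an untested assumption:

```javascript
// Register the request content type (multipart/form-data) as binary too,
// not just the image types used in responses.
const api = new apigateway.RestApi(this, "widgets-api", {
  restApiName: "Widget Service",
  description: "This service serves widgets.",
  binaryMediaTypes: ["multipart/form-data", "image/png", "image/jpeg"],
});
```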

Getting a bad request 400 when trying to upload zipped wkt file to here-maps rest api

We have a problem with fleet.ls.hereapi.com when uploading a new layer for geofencing.
const myId = 'MYLAYER'; // just an id to check
zip.file('data.wkt', `NAME\tWKT\n${myId}\t${wkt}`);
const content = await zip.generateAsync({ type: 'nodebuffer' });
const formData = new FormData();
formData.append('zipfile', content);
await axios.post(config.HERE_MAPS_UPLOAD_ENDPOINT, formData, {
  headers: {
    'content-type': 'multipart/form-data',
  },
  params: {
    apiKey: config.HERE_MAPS_REST_API_KEY,
    layer_id: myId,
  },
});
We get a bad request without a message and do not know what the problem is. The same implementation works in the frontend (with 'blob' as the zip type). Is there a parameter to get a better error message from the API?
We got the instructions on how to implement it from this tutorial: https://www.youtube.com/watch?v=Ayw9GcS1V-8 and, as I mentioned, it works fine in the frontend. It also works if I write a file in Node and upload it via curl. Thank you for any help in advance!
Edit: I'm getting the following issue from the API: 'Multipart should contain exactly one part but contains 0'
I fixed it!
The problem was that the API needed a filename for the form data. This filename can be provided as the third parameter of formData.append.
So I basically changed formData.append('zipfile', content); to formData.append('zipfile', content, 'zipfile.zip'); and it worked.
Hope this will help somebody in the future!
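Put together, the working upload looks roughly like this (a sketch based on the question's code; the tab-separated layout is what the question already builds):

```javascript
// Build the tab-separated WKT file body the layer API expects:
// a NAME/WKT header row followed by one row per geometry.
function buildWktFile(id, wkt) {
  return `NAME\tWKT\n${id}\t${wkt}`;
}

// Hypothetical usage, mirroring the question:
// zip.file('data.wkt', buildWktFile(myId, wkt));
// const content = await zip.generateAsync({ type: 'nodebuffer' });
// formData.append('zipfile', content, 'zipfile.zip'); // third arg = filename
```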

AWS lambda function issue with FormData file upload

I have a nodejs code which uploads files to S3 bucket.
I have used koa web framework and following are the dependencies:
"@types/koa": "^2.0.48",
"@types/koa-router": "^7.0.40",
"koa": "^2.7.0",
"koa-body": "^4.1.0",
"koa-router": "^7.4.0",
following is my sample router code:
import Router from "koa-router";
const router = new Router({ prefix: '/' })
router.post('file/upload', upload)
async function upload(ctx: any, next: any) {
  const files = ctx.request.files
  if (files && files.file) {
    const extension = path.extname(files.file.name)
    const type = files.file.type
    const size = files.file.size
    console.log("file Size--------->:: " + size);
    sendToS3();
  }
}
function sendToS3() {
  const params = {
    Bucket: bName,
    Key: kName,
    Body: imageBody,
    ACL: 'public-read',
    ContentType: fileType
  };
  s3.upload(params, function (error: any, data: any) {
    if (error) {
      console.log("error", error);
      return;
    }
    console.log('s3Response', data);
    return;
  });
}
The request body is sent as FormData.
Now when I run this code locally and hit the request, the file gets uploaded to my S3 bucket and can be viewed.
In the console the logged file size matches the actual size of the file.
But when I deploy this code as a Lambda function and hit the request, I see in the CloudWatch logs that the file size has suddenly increased.
The file still gets uploaded to S3, but when I open it, it shows an error.
I further tried to find out whether this behaviour persisted on a standalone instance on AWS; it did not. So the problem occurs only when the code is deployed as a serverless Lambda function.
I tried with Postman as well as my own front-end app, but the issue remains.
I don't know whether I have overlooked any configuration when setting up the Lambda function that handles such scenarios.
I have not encountered this issue before, and I would really like to know if anyone else has. I am also not able to debug why the file size is increasing; I can only assume that some kind of encoding/padding is applied to the file when it reaches the service.
Finally I was able to fix this issue. I had to add a "Binary Media Type" in AWS API Gateway.
The following steps helped:
AWS API Gateway console -> "API" -> "Settings" -> "Binary Media Types" section.
Add the following media type:
multipart/form-data
Save changes.
Deploy the API.
More info: https://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-payload-encodings-configure-with-console.html
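The size increase described above is consistent with the body arriving base64-encoded, which inflates it by roughly 4/3 until the binary media type is configured. A small illustrative check (the function is mine, not from the source):

```javascript
// A base64-encoded body is about 4/3 the size of the raw bytes,
// which matches the inflated sizes logged in CloudWatch before the fix.
function base64InflationRatio(rawBytes) {
  const encoded = Buffer.from(rawBytes).toString("base64");
  return encoded.length / rawBytes.length;
}
```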

How to read a remote image file from aws lambda and return that image file as a response

I want to read a file which is in a remote location, say https://abc/image.jpeg or https://abc/image.png, and send it back as the response from a Lambda function. One solution in a Node.js Express app is to use res.sendFile, but I am not sure whether I can use that in a Lambda, or how to do it.
Another alternative is to first copy the image to an S3 bucket and then send it back. Any suggestions better than the S3 copy option?
You can leverage axios and the API Gateway isBase64Encoded option.
First, request the image and convert it to base64, using Buffer:
const imageBase64 = await axios.get(url, { responseType: 'arraybuffer' })
  .then(response => Buffer.from(response.data, 'binary').toString('base64'));
Next, return it from your Lambda through API Gateway:
return {
  statusCode: 200,
  body: imageBase64,
  isBase64Encoded: true, // the most important part
}
However, keep in mind that API Gateway allows a payload of up to 10 megabytes. You'll get an error if your images are bigger.
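A sketch of that response with a guard for the 10 MB limit; the 413 status and helper name are my choices, not from the source:

```javascript
// Build an API Gateway proxy response carrying a binary image as base64,
// refusing payloads beyond the (assumed) 10 MB API Gateway limit.
const PAYLOAD_LIMIT = 10 * 1024 * 1024;

function buildImageResponse(imageBuffer) {
  const imageBase64 = imageBuffer.toString("base64");
  // Base64 inflates the data by ~4/3, so check the encoded length.
  if (Buffer.byteLength(imageBase64) > PAYLOAD_LIMIT) {
    return { statusCode: 413, body: JSON.stringify({ error: "Image too large" }) };
  }
  return {
    statusCode: 200,
    headers: { "Content-Type": "image/jpeg" }, // illustrative; match the real type
    body: imageBase64,
    isBase64Encoded: true,
  };
}
```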
With request and Express (note that the request package is now deprecated):
var request = require("request");
request.get('https://www.example.com/static/img/logo-light.png').pipe(res);
