Multipart file upload inside AWS Lambda function - node.js

I need to do a multipart file upload inside a Lambda function. The Lambda is triggered by an S3 bucket on image upload; I need to fetch the uploaded image and then send it to another API service. Any suggestions?

Found the solution:
import FormData from "form-data";
import axios from "axios";
import AWS from "aws-sdk";

const S3 = new AWS.S3();
const file = await S3.getObject({ Bucket, Key }).promise();

const formData = new FormData();
// file.Body is a Buffer, so the filename has to be supplied explicitly
formData.append("image", file.Body, { filename: fileName });

const res = await axios.post("https://someUrl", formData, {
  headers: formData.getHeaders(),
});
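Since the Lambda is triggered by the S3 upload event, Bucket, Key, and fileName don't need to be hard-coded; they can be read from the event inside the handler. A minimal sketch (field names follow the standard S3 event notification shape):

const Bucket = event.Records[0].s3.bucket.name;
// keys arrive URL-encoded in S3 event notifications, with "+" for spaces
const Key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, " "));
const fileName = Key.split("/").pop();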

Related

amazon s3 - uploading empty image to bucket when using createWriteStream

When using createWriteStream, the upload succeeds without any error, but the image in the bucket is empty (size 0 B).
const uploadImage = async (filePath, fileId) => {
  const fileStream = fs.createWriteStream(filePath);
  const uploadParams = {
    Bucket: bucket,
    ACL: "public-read",
    Body: fileStream,
    Key: filePath,
    ContentType: "image/png",
  };
  console.log(filePath);
  const data = await s3.upload(uploadParams).promise();
  console.log(data);
  return;
};
But when using readFileSync, it uploads the image correctly:
const uploadImage = async (filePath, fileId) => {
  const fileStream = fs.readFileSync(filePath);
  const uploadParams = {
    Bucket: bucket,
    ACL: "public-read",
    Body: fileStream,
    Key: filePath,
    ContentType: "image/png",
  };
  console.log(filePath);
  const data = await s3.upload(uploadParams).promise();
  console.log(data);
  return;
};
why?
The problem here is logical rather than technical.
When you use createWriteStream you are creating a new, empty file on your file system; uploading that empty file to S3 naturally produces an empty object.
When you use readFileSync, on the other hand, you are reading the existing file (your picture) from the file system and sending its bytes to S3. Those bytes are not empty, which is why that version works.
For the first approach, Body must be a ReadStream that reads the file data from the path: use fs.createReadStream(filePath) instead.
Flow: read file from path -> write to S3.
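A minimal corrected sketch of the streaming version (same bucket and s3 client as in the question):

const uploadImage = async (filePath, fileId) => {
  // createReadStream reads the existing file; createWriteStream would create an empty one
  const fileStream = fs.createReadStream(filePath);
  const uploadParams = {
    Bucket: bucket,
    ACL: "public-read",
    Body: fileStream,
    Key: filePath,
    ContentType: "image/png",
  };
  const data = await s3.upload(uploadParams).promise();
  return data;
};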

How to give an EFS directory path to s3 putObject using nodejs lambda function?

I am trying to put data from EFS to S3 using a Node.js Lambda function. First I zip all the files on EFS, then I move that zipped file to S3. Here is my code:
export const lambdaHandler = async (event, context) => {
  try {
    // the files are zipped here
    ZipLocal.sync.zip("/mnt/data").compress().save("/mnt/data/zipped.zip");
    console.log("files zipped");

    // the file is uploaded to S3
    const fileToUpload = "/mnt/data/zipped.zip";
    let bucketName = "matomo-lambda-test-bucket";
    const params = {
      Bucket: bucketName,
      Key: "zipped.zip",
      Body: fileToUpload,
      ContentType: "application/json; charset=utf-8",
    };
    // await s3Client.putObject(params).promise();
    const s3Data = await s3Client.send(new PutObjectCommand(params));
    console.log("Upload Completed", s3Data);
  } catch (err) {
    console.error(err);
  }
};
I don't know how to set Body, which usually accepts an object or a string; in my case I want to point it at the zipped file's path on EFS. How do I do this?
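Body does not take a path; it needs the file's contents as a string, Buffer, or readable stream. A minimal sketch under that assumption, reading the zip from EFS into a Buffer first (bucket and key as in the question; application/zip is also a more accurate ContentType for an archive):

import fs from "fs";
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const s3Client = new S3Client({});
const fileToUpload = "/mnt/data/zipped.zip";

const params = {
  Bucket: "matomo-lambda-test-bucket",
  Key: "zipped.zip",
  // read the zipped file from EFS into memory and hand the bytes to S3
  Body: fs.readFileSync(fileToUpload),
  ContentType: "application/zip",
};
await s3Client.send(new PutObjectCommand(params));

For large archives, fs.createReadStream(fileToUpload) plus an explicit ContentLength (from fs.statSync(fileToUpload).size) avoids holding the whole zip in memory.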

Node.js aws s3 sdk v3 showing error while putting object

I am uploading a file to S3 from a Node.js GraphQL API using graphql-upload:
const { createReadStream, mimetype, filename } = file;
const fileStream = createReadStream();
const prefix = encodeURIComponent(folder) + "/";
const uploadParam = {
  Bucket: bucket,
  Key: "testing",
  Body: fileStream,
  ContentType: mimetype,
};
await s3.send(new PutObjectCommand(uploadParam));
Every time I upload, it shows this error:
NotImplemented: A header you provided implies functionality that is not implemented
Code: 'NotImplemented',
Header: 'Transfer-Encoding',
What am I doing wrong?
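The error comes from S3 rejecting chunked Transfer-Encoding: PutObject wants the body length up front, and a stream from graphql-upload has no known length, so the request falls back to chunked encoding. One common fix is the multipart Upload helper from @aws-sdk/lib-storage, which accepts streams of unknown length; a minimal sketch (bucket, fileStream, mimetype, and the s3 client as in the question):

import { Upload } from "@aws-sdk/lib-storage";

const upload = new Upload({
  client: s3,
  params: {
    Bucket: bucket,
    Key: "testing",
    Body: fileStream, // a stream of unknown length is fine here
    ContentType: mimetype,
  },
});
await upload.done();

Alternatively, supplying ContentLength in the PutObjectCommand input (if the byte size is known) avoids chunked encoding as well.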

How to upload base64 encoded pdf directly to s3 with nodejs/aws-sdk?

I'm attempting to upload a base64 encoded pdf to S3 with the following code without having to write the file to the filesystem.
const AWS = require('aws-sdk');
const S3 = new AWS.S3();

exports.putBase64 = async (object_name, buffer, bucket) => {
  const params = {
    Key: object_name,
    Body: buffer,
    Bucket: bucket,
    ContentEncoding: 'base64',
    ContentType: 'application/pdf'
  };
  const response = await S3.upload(params).promise();
  return response;
};
Where buffer is a blank pdf encoded to base64. When attempting to open the file on S3, I get "We can't open this file. Something went wrong."
However, if I write the base64 encoding into a file and THEN upload it, it works:
fs.writeFileSync('./somepdf.pdf', base_64, 'base64');

exports.put = async (object_name, file_location, bucket, content_type) => {
  const file_content = fs.readFileSync(file_location);
  const params = {
    Key: object_name,
    Body: file_content,
    Bucket: bucket,
    ContentType: 'application/pdf'
  };
  const response = await S3.upload(params).promise();
  return response;
};
I notice that when uploading the written file directly, the object viewed in a text editor is not base64 encoded, whereas the object uploaded with ContentEncoding: 'base64' still shows the base64 text. I tried converting the base64 to a blob using atob, but that yielded the same result, so I assume I'm missing a parameter or header.
I had the same issue and managed to solve it by making this change:
const AWS = require('aws-sdk');
const S3 = new AWS.S3();

exports.putBase64 = async (object_name, buffer, bucket) => {
  const params = {
    Key: object_name,
    Body: Buffer.from(buffer, 'base64'), // <---------
    Bucket: bucket,
    ContentType: 'application/pdf'
  };
  return await S3.upload(params).promise();
};
If the input string carries a data-URI prefix, strip it first to get a clean base64 payload:
const newBuffer = buffer.replace(/^data:.+;base64,/, "");
Now use this new value in params. This should work!
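Putting the two steps together, a usage sketch (base_64 stands for the incoming base64 string, and the bucket name is a placeholder):

// strip any data-URI prefix, then putBase64 decodes the rest via Buffer.from
const base64Data = base_64.replace(/^data:.+;base64,/, "");
await putBase64("somepdf.pdf", base64Data, "my-bucket");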

How to send a file from node to Amazon S3 using request?

I'm trying to send a file using fs.createReadStream(), but it doesn't work or give any errors.
I tried using request to PUT the file to S3 after streaming it from node:
const fs = require("fs");
const request = require("request");
const path = require("path");
let imagePath = path.join(__dirname,"../../public/images/img.jpg");
fs.createReadStream(imagePath).pipe(request.put(signedRequest));
when I change the first part to get an image from a url;
request.get('http://example.com/img.png').pipe(request.put(signedRequest));
it works and uploads the image to s3.
Is there a reason to why this is happening? Or is there any other method I can use to send a file from node to s3?
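A likely reason: S3 does not accept chunked Transfer-Encoding on a presigned PUT (the same NotImplemented: Transfer-Encoding error shown in the previous question). request.get provides a Content-Length it can forward, while a bare file stream has no known length, so request falls back to chunked encoding. A hedged sketch of one workaround, setting Content-Length from the file size:

const fs = require("fs");
const request = require("request");
const path = require("path");

const imagePath = path.join(__dirname, "../../public/images/img.jpg");
const { size } = fs.statSync(imagePath); // file size in bytes

fs.createReadStream(imagePath).pipe(
  request.put({
    url: signedRequest, // presigned S3 URL, as in the question
    headers: { "Content-Length": size },
  })
);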
Try the aws-sdk npm package. Your code would look something like this (note that ACL, CacheControl, and ContentType belong in the params object; upload()'s second argument only configures part size and concurrency):

const fs = require('fs');
const AWS = require('aws-sdk');

const s3 = new AWS.S3({
  region: 'us-east-1',
  apiVersion: '2006-03-01',
});

const params = {
  Bucket: bucket,
  Key: fileName,
  Body: fs.createReadStream(imagePath), // stream the file straight from disk
  ACL: 'private',
  CacheControl: 'max-age=86400',
  ContentType: 'image/jpeg',
};

await s3.upload(params).promise();
