How to send a file from node to Amazon S3 using request? - node.js

I'm trying to send a file using fs.createReadStream(), but it doesn't work and it doesn't give any errors either.
I'm using request to PUT the file to S3 after streaming it from node.
const fs = require("fs");
const request = require("request");
const path = require("path");
let imagePath = path.join(__dirname,"../../public/images/img.jpg");
fs.createReadStream(imagePath).pipe(request.put(signedRequest));
When I change the first part to get an image from a URL:
request.get('http://example.com/img.png').pipe(request.put(signedRequest));
it works and uploads the image to S3.
Is there a reason why this is happening? Or is there another method I can use to send a file from node to S3?

Try the aws-sdk npm package.
Your code would look something like this:
const { S3 } = require('aws-sdk');

const s3 = new S3({
  region: 'us-east-1',
  apiVersion: '2006-03-01',
});

const params = {
  Bucket: bucket,
  Key: fileName,
  Body: data,
  ACL: 'private',
  CacheControl: 'max-age=86400',
  ContentType: 'text/plain',
};

await s3.upload(params).promise();
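For the original question (streaming a local image), a minimal sketch along the same lines; the bucket variable and credentials setup are assumed:
const fs = require('fs');
const path = require('path');
const { S3 } = require('aws-sdk');

const s3 = new S3({ region: 'us-east-1' });
const imagePath = path.join(__dirname, '../../public/images/img.jpg');

await s3.upload({
  Bucket: bucket,                         // your bucket name
  Key: 'images/img.jpg',
  Body: fs.createReadStream(imagePath),   // stream the file instead of buffering it in memory
  ContentType: 'image/jpeg',
}).promise();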

Related

amazon s3 - uploading empty image to bucket when using createWriteStream

When using createWriteStream, it uploads the image to the bucket without any error, but the file is empty (size 0 B).
const uploadImage = async (filePath, fileId) => {
  const fileStream = fs.createWriteStream(filePath);
  const uploadParams = {
    Bucket: bucket,
    ACL: "public-read",
    Body: fileStream,
    Key: filePath,
    ContentType: "image/png",
  };
  console.log(filePath);
  const data = await s3.upload(uploadParams).promise();
  console.log(data);
  return;
};
But when using readFileSync, it uploads the image correctly.
const uploadImage = async (filePath, fileId) => {
  const fileStream = fs.readFileSync(filePath);
  const uploadParams = {
    Bucket: bucket,
    ACL: "public-read",
    Body: fileStream,
    Key: filePath,
    ContentType: "image/png",
  };
  console.log(filePath);
  const data = await s3.upload(uploadParams).promise();
  console.log(data);
  return;
};
Why?
The problem you have is a logical one.
When you use createWriteStream you are creating a new, empty file on your file system, so when you upload that empty file to S3, the object you get is empty too.
On the other hand, when you use readFileSync you are reading the file (in your case the picture) from your file system and sending its bytes to S3. Those bytes are not empty, because they were read from an existing file.
The first solution needs a ReadStream to read the file data from the path: use fs.createReadStream(filePath).
Flow: read file from path -> write to S3.
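A minimal sketch of the corrected function from the question, with the write stream replaced by a read stream (bucket and s3 are the same variables the question already uses):
const fs = require("fs");

const uploadImage = async (filePath, fileId) => {
  const fileStream = fs.createReadStream(filePath); // read the existing file instead of creating an empty one
  const uploadParams = {
    Bucket: bucket,
    ACL: "public-read",
    Body: fileStream,
    Key: filePath,
    ContentType: "image/png",
  };
  const data = await s3.upload(uploadParams).promise();
  return data.Location;
};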

uploading a uri or should it be converted?

I am sending image data from my React Native application to my Node.js backend, which I want to upload to S3. I want to know exactly which format I must convert the data to in order to upload it to S3. Below is the form data I am logging in my backend at the moment.
[
  'file',
  {
    uri: 'file:///var/mobile/Containers/Data/Application/CA974BC6-6943-4135-89DE-235BC593A54F/Library/Caches/ExponentExperienceData/%2540lb2020%252Fmy/ImagePicker/D7119C77-60D0-46CC-A194-4F1FDE0D9A3D.jpg',
    type: 'image/jpeg',
    name: 'hi.jpg'
  }
]
My backend also has the code below. Would setting file equal to the data above work? If not, suggestions would be appreciated.
const params = {
  Bucket: "myarrowbucket", // bucket you want to upload to
  Key: "filename" + ".png",
  Body: file,
  ContentType: 'image/png',
  ACL: "public-read",
};
I have tried uploading, but the image doesn't open correctly on S3, or it gives me Error: Unsupported body payload object.
Updated code: 'no path found' error
app.post("/upload", async (req, res) => {
const uri = (req.body._parts[0][1].uri)
const file = uri.substring(7);
const fileStream = fs.createReadStream(file);
const params = {
Bucket:"myarrowbucket", // bucket you want to upload to
Key: "filename"+".png",
Body: fileStream,
ContentType:'image/png',
ACL: "public-read",
};
const data = await client.upload(params).promise();
return data.Location; // returns the url location
});
You need to provide the actual file data (a Buffer or stream) to the S3 client. The uri in your form data points to a file on the phone, not on your server, which is why fs.createReadStream on the backend cannot find the path.
app.post("/upload", fileUpload(), async (req, res) => {
const uri = (req.body._parts[0][1].uri)
const file = uri.substring(7);
const params = {
Bucket:"myarrowbucket", // bucket you want to upload to
Key: "filename"+".png",
Body: Buffer.from(req.files[0].data, 'binary'), <-- PROVIDE DATA FROM FORM-DATA
ACL: "public-read",
};
const data = await client.upload(params).promise();
return data.Location; // returns the url location
});
You can use a middleware such as express-fileupload or multer to parse the incoming multipart form data on the backend.
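If the client appends the image under the field name file (as the logged form data suggests), a minimal sketch of the route using the express-fileupload middleware could look like this; the bucket name is taken from the question, everything else is an assumption:
const express = require("express");
const fileUpload = require("express-fileupload");
const AWS = require("aws-sdk");

const app = express();
app.use(fileUpload());              // parses multipart/form-data into req.files
const client = new AWS.S3();

app.post("/upload", async (req, res) => {
  const uploaded = req.files.file;  // 'file' is the field name the client used
  const params = {
    Bucket: "myarrowbucket",
    Key: uploaded.name,             // e.g. 'hi.jpg'
    Body: uploaded.data,            // Buffer with the raw file bytes
    ContentType: uploaded.mimetype, // e.g. 'image/jpeg'
    ACL: "public-read",
  };
  const data = await client.upload(params).promise();
  res.send(data.Location);          // URL of the uploaded object
});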

Multipart file upload inside AWS Lambda function

I need to do a multipart file upload inside a Lambda function. The Lambda is triggered by the S3 bucket on image upload. I need to get the uploaded image and then send it to another API service. Any suggestions?
Found the solution:
import FormData from "form-data";
import AWS from "aws-sdk";
import axios from "axios";

const S3 = new AWS.S3();
// Bucket and Key come from the S3 event that triggered the Lambda
const file = await S3.getObject({ Bucket, Key }).promise();
const formData = new FormData();
// since file.Body is a Buffer we need to add the filename explicitly
formData.append("image", file.Body, { filename: fileName });
const res = await axios.post('https://someUrl', formData, {
  headers: formData.getHeaders()
});
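For context, a hypothetical handler wrapper (not part of the original answer) showing where Bucket, Key, and fileName would come from when the Lambda is triggered by the S3 upload event:
export const handler = async (event) => {
  const record = event.Records[0].s3;  // S3 put event record
  const Bucket = record.bucket.name;
  const Key = decodeURIComponent(record.object.key.replace(/\+/g, " "));
  const fileName = Key.split("/").pop();
  // ...getObject + FormData + axios code from the answer above...
};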

Node.js aws s3 sdk v3 showing error while putting object

I am uploading a file to S3 from a Node.js GraphQL API using graphql-upload:
const { createReadStream, mimetype, filename } = file;
const fileStream = createReadStream();
const prefix = encodeURIComponent(folder) + '/';
const uploadParam = {
  Bucket: bucket,
  Key: `testing`,
  Body: fileStream,
  ContentType: mimetype,
};
await s3.send(new PutObjectCommand(uploadParam));
Every time I upload, it shows this error:
NotImplemented: A header you provided implies functionality that is not implemented
  Code: 'NotImplemented',
  Header: 'Transfer-Encoding',
What am I doing wrong?
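One common way around this Transfer-Encoding error (a sketch, not from the original thread, assuming the question's fileStream and mimetype and an existing bucket) is to upload the stream with the Upload helper from @aws-sdk/lib-storage, which sends it as parts of known length instead of a chunked PutObject:
const { S3Client } = require("@aws-sdk/client-s3");
const { Upload } = require("@aws-sdk/lib-storage");

const s3 = new S3Client({ region: "us-east-1" }); // assumed region
const upload = new Upload({
  client: s3,
  params: {
    Bucket: bucket,      // same bucket variable as in the question
    Key: "testing",
    Body: fileStream,    // stream returned by createReadStream()
    ContentType: mimetype,
  },
});
await upload.done();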

how to stop image download instead of image display in aws s3 using node.js

I have uploaded an image to an AWS S3 bucket using Node.js and it uploaded successfully, but when I try to view the image it downloads instead of displaying. I used the following code to upload the image to AWS S3:
var AWS = require('aws-sdk');
var config = require('../../server/config');

AWS.config.update({
  accessKeyId: config.aws.accessKeyId,
  secretAccessKey: config.aws.secretAccessKey,
  region: config.aws.region
});

var s3 = new AWS.S3();
var Busboy = require('busboy');
var busboyBodyParser = require('busboy-body-parser');
app.use(busboyBodyParser());

app.post('/upload', function(req, res) {
  var directory = req.body.directory;
  console.log(req.files.file);
  var image = req.files.file.name;
  var contenttype = req.files.file.mimetype;
  if (req.body.directory) {
    var file = directory + '/' + image;
  } else {
    var file = image;
  }
  var data = req.files.file.data;
  var keys = {
    Bucket: req.body.bucket,
    Key: file,
    Body: data,
    ACL: 'public-read',
    contentType: contenttype
  };
  s3.upload(keys, function(err, result) {
    if (err) {
      res.send({
        isError: true,
        status: 400,
        message: "File Not Uploaded",
        data: err
      });
    } else {
      var data = {
        Location: result.Location,
        key: result.key,
        Bucket: result.Bucket
      };
      res.send({
        isError: false,
        status: 200,
        message: "File Uploaded",
        data: data
      });
    }
  });
});
I was stuck with this as well, but the following works:
let params = {
  ACL: 'public-read',
  Bucket: process.env.BUCKET_NAME,
  Body: fs.createReadStream(req.file.path),
  ContentType: req.file.mimetype,
  Key: `avatar/${req.file.originalname}`
};
ContentType: req.file.mimetype is what fixed it. It is basically the same as ContentType: 'image/jpeg', but it picks up the MIME type of whatever file the user uploaded, instead of hardcoding image/jpeg or image/png.
I hope your issue is fixed though.
I have found the answer:
Use ContentType: 'image/jpeg' (or ContentType: yourVariable) in keys when uploading the image to S3. Note the capital C: the lowercase contentType key in the question's code is not the parameter the SDK expects, so the object's content type never gets set and the browser downloads the file instead of displaying it.
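For reference, a sketch of the corrected keys object built from the question's own variables; the ContentDisposition line is optional and not from the original answers:
var keys = {
  Bucket: req.body.bucket,
  Key: file,
  Body: data,
  ACL: 'public-read',
  ContentType: contenttype,      // capital C, using the MIME type detected from the upload
  ContentDisposition: 'inline'   // optionally ask browsers to display rather than download
};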
