I am uploading a file to S3 from a Node.js GraphQL API using graphql-upload:
const { createReadStream, mimetype, filename } = file;
const fileStream = createReadStream();
const prefix = encodeURIComponent(folder) + '/';
const uploadParam = {
  Bucket: bucket,
  Key: `testing`,
  Body: fileStream,
  ContentType: mimetype,
};
await s3.send(new PutObjectCommand(uploadParam));
Every time I upload, it shows this error:
NotImplemented: A header you provided implies functionality that is not implemented
Code: 'NotImplemented',
Header: 'Transfer-Encoding',
What am I doing wrong?
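One way around this (a sketch, not a confirmed fix for this exact setup) is to hand the unknown-length upload stream to the multipart uploader in @aws-sdk/lib-storage instead of PutObjectCommand, reusing the s3 client, bucket, mimetype, and fileStream from above:

// Sketch: Upload from @aws-sdk/lib-storage accepts bodies whose length is not
// known in advance; it buffers the stream into parts of known size, which
// avoids the chunked Transfer-Encoding request that S3 rejects here.
const { Upload } = require('@aws-sdk/lib-storage');

const upload = new Upload({
  client: s3,
  params: {
    Bucket: bucket,
    Key: `testing`,
    Body: fileStream,
    ContentType: mimetype,
  },
});
await upload.done();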
Related
When using createWriteStream, it uploads the image to the bucket without any error, but the file is empty (size 0 B).
const uploadImage = async (filePath, fileId) => {
  const fileStream = fs.createWriteStream(filePath);
  const uploadParams = {
    Bucket: bucket,
    ACL: "public-read",
    Body: fileStream,
    Key: filePath,
    ContentType: "image/png",
  };
  console.log(filePath);
  const data = await s3.upload(uploadParams).promise();
  console.log(data);
  return;
};
but when using readFileSync it uploads the image correctly.
const uploadImage = async (filePath, fileId) => {
  const fileStream = fs.readFileSync(filePath);
  const uploadParams = {
    Bucket: bucket,
    ACL: "public-read",
    Body: fileStream,
    Key: filePath,
    ContentType: "image/png",
  };
  console.log(filePath);
  const data = await s3.upload(uploadParams).promise();
  console.log(data);
  return;
};
Why?
The problem you have here is one of logic.
When you use createWriteStream you are creating a new, empty file on your file system. So when you upload that empty file to S3, the object in the bucket is empty as well.
On the other hand, when you use readFileSync you are reading the file (in your case a picture) from your file system and sending its bytes to S3. That array of bytes is not empty, because it was read from the existing file.
The first snippet needs a read stream to pull the file data from the path: use fs.createReadStream(filePath) instead of fs.createWriteStream(filePath).
Flow: read the file from the path -> write it to S3, as in the sketch below.
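A minimal sketch of that flow, reusing the same s3 client, bucket, and parameters as in the question:

const fs = require("fs");

// Open a read stream over the existing file (instead of creating an empty
// one with createWriteStream) and hand it to S3.
const uploadImage = async (filePath, fileId) => {
  const fileStream = fs.createReadStream(filePath); // read, don't write
  const uploadParams = {
    Bucket: bucket,
    ACL: "public-read",
    Body: fileStream,
    Key: filePath,
    ContentType: "image/png",
  };
  const data = await s3.upload(uploadParams).promise();
  return data;
};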
I am trying to copy data from EFS to S3 using a Node.js Lambda function. First I zip all the files on EFS, then I move that zipped file to S3. Here is my code:
export const lambdaHandler = async (event, context) => {
  try {
    // The files are zipped here
    ZipLocal.sync.zip("/mnt/data").compress().save("/mnt/data/zipped.zip");
    console.log("files zipped");

    // The file is going to be uploaded to S3
    const fileToUpload = "/mnt/data/zipped.zip";
    let bucketName = "matomo-lambda-test-bucket";
    const params = {
      Bucket: bucketName,
      Key: "zipped.zip",
      Body: fileToUpload,
      ContentType: 'application/json; charset=utf-8'
    }
    //await s3Client.putObject(params).promise();
    const s3Data = await s3Client.send(
      new PutObjectCommand(params)
    );
    console.log("Upload Completed", s3Data);
  } catch (err) {
    console.error(err);
  }
};
I don't know how to set Body: it usually accepts a Buffer, a stream, or a string of content, but in my case I want to point it at the zipped file's path on EFS. How do I do this?
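One way to do that (a sketch, not from the original thread) is to give Body a readable stream of the zipped file rather than its path string, reusing s3Client and PutObjectCommand from the snippet above:

import fs from "fs";

// Inside the handler, after the zip has been written to EFS:
const fileToUpload = "/mnt/data/zipped.zip";
const params = {
  Bucket: "matomo-lambda-test-bucket",
  Key: "zipped.zip",
  Body: fs.createReadStream(fileToUpload),        // stream the file, not its path
  ContentLength: fs.statSync(fileToUpload).size,  // known size, so the SDK need not guess
  ContentType: "application/zip",
};
const s3Data = await s3Client.send(new PutObjectCommand(params));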
I am sending image data from my React Native application to my Node.js backend, which I want to upload to S3. I want to know exactly which format I must convert the data to in order to upload it to S3. Below is the form data I am logging in my backend at the moment.
[
'file',
{
uri: 'file:///var/mobile/Containers/Data/Application/CA974BC6-6943-4135-89DE-235BC593A54F/Library/Caches/ExponentExperienceData/%2540lb2020%252Fmy/ImagePicker/D7119C77-60D0-46CC-A194-4F1FDE0D9A3D.jpg',
type: 'image/jpeg',
name: 'hi.jpg'
}
]
My backend also has the code below. Would setting file equal to the object above work? If not, suggestions are appreciated.
const params = {
  Bucket: "myarrowbucket", // bucket you want to upload to
  Key: "filename" + ".png",
  Body: file,
  ContentType: "image/png",
  ACL: "public-read",
};
I have tried uploading, but the image doesn't open correctly on S3, or I get Error: Unsupported body payload object.
Updated code (now getting a "no path found" error):
app.post("/upload", async (req, res) => {
const uri = (req.body._parts[0][1].uri)
const file = uri.substring(7);
const fileStream = fs.createReadStream(file);
const params = {
Bucket:"myarrowbucket", // bucket you want to upload to
Key: "filename"+".png",
Body: fileStream,
ContentType:'image/png',
ACL: "public-read",
};
const data = await client.upload(params).promise();
return data.Location; // returns the url location
});
You need to provide the actual file data (a Buffer or a stream) to the S3 client, not the URI object from the form data.
app.post("/upload", fileUpload(), async (req, res) => {
const uri = (req.body._parts[0][1].uri)
const file = uri.substring(7);
const params = {
Bucket:"myarrowbucket", // bucket you want to upload to
Key: "filename"+".png",
Body: Buffer.from(req.files[0].data, 'binary'), <-- PROVIDE DATA FROM FORM-DATA
ACL: "public-read",
};
const data = await client.upload(params).promise();
return data.Location; // returns the url location
});
You can use a library like form-data to build the multipart form data on the client side.
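For completeness, a sketch of what the React Native side might look like when it sends the actual file as multipart form data; the endpoint URL is a placeholder and image is assumed to be the { uri, type, name } object from the log above:

// Client-side sketch (React Native). FormData accepts a { uri, type, name }
// object for file fields; the backend then receives the file bytes, not just
// the uri string.
const form = new FormData();
form.append("file", {
  uri: image.uri,   // file:///var/mobile/... .jpg
  type: image.type, // "image/jpeg"
  name: image.name, // "hi.jpg"
});

await fetch("https://my-backend.example.com/upload", {
  method: "POST",
  body: form,
  headers: { "Content-Type": "multipart/form-data" },
});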
I'm attempting to upload a base64-encoded PDF to S3 with the following code, without having to write the file to the filesystem.
const AWS = require('aws-sdk');
const S3 = new AWS.S3();

exports.putBase64 = async (object_name, buffer, bucket) => {
  const params = {
    Key: object_name,
    Body: buffer,
    Bucket: bucket,
    ContentEncoding: 'base64',
    ContentType: 'application/pdf'
  };
  const response = await S3.upload(params).promise();
  return response;
};
Where buffer is a blank PDF encoded to base64. When attempting to open the file on S3, I get "We can't open this file. Something went wrong."
However, if I write the base64 encoding into a file and THEN upload it, it works.
fs.writeFileSync(`./somepdf.pdf`, base_64, 'base64');
exports.put = async (object_name, file_location, bucket, content_type) => {
  const file_content = fs.readFileSync(file_location);
  const params = {
    Key: object_name,
    Body: file_content,
    Bucket: bucket,
    ContentType: 'application/pdf'
  };
  const response = await S3.upload(params).promise();
  return response;
};
I notice that the file uploaded from disk, when viewed in a text editor, is not base64 encoded, whereas the object uploaded directly with ContentEncoding: 'base64' still shows the base64 text. I attempted to convert the base64 to a blob using atob, but that yielded the same result, so I assume there's a parameter or header I may be missing.
I had the same issue and managed to solve it by making this change:
const AWS = require('aws-sdk');
const S3 = new AWS.S3();

exports.putBase64 = async (object_name, buffer, bucket) => {
  const params = {
    Key: object_name,
    Body: Buffer.from(buffer, 'base64'), // decode the base64 string into raw bytes
    Bucket: bucket,
    ContentType: 'application/pdf'
  };
  return await S3.upload(params).promise();
};
If your base64 string still carries a data URI prefix (e.g. data:application/pdf;base64,), create a new string with the prefix stripped:
const newBuffer = buffer.replace(/^data:.+;base64,/, "")
Now use this newBuffer (decoded via Buffer.from(newBuffer, 'base64')) in params. This should work!
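Putting both steps together, a sketch assuming buffer is the base64 string (possibly with a data URI prefix) and S3 is the AWS.S3 client from above:

exports.putBase64 = async (object_name, buffer, bucket) => {
  // Strip any data URI prefix, then decode the base64 string into raw bytes.
  const base64Data = buffer.replace(/^data:.+;base64,/, '');
  const params = {
    Key: object_name,
    Body: Buffer.from(base64Data, 'base64'),
    Bucket: bucket,
    ContentType: 'application/pdf',
  };
  return S3.upload(params).promise();
};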
I'm trying to send a file using fs.createReadStream(), but it neither works nor gives any errors.
I tried using request to PUT the file to S3 after streaming it from Node.
const fs = require("fs");
const request = require("request");
const path = require("path");
let imagePath = path.join(__dirname,"../../public/images/img.jpg");
fs.createReadStream(imagePath).pipe(request.put(signedRequest));
When I change the first part to get an image from a URL:
request.get('http://example.com/img.png').pipe(request.put(signedRequest));
it works and uploads the image to S3.
Is there a reason why this is happening? Or is there another method I can use to send a file from Node to S3?
Try the aws-sdk npm package; its s3.upload() accepts a readable stream as Body (and handles multipart uploads for large files), so you can stream the file from disk without request.
Your code would look something like this:
const { S3 } = require('aws-sdk');
const fs = require('fs');

const s3 = new S3({
  region: 'us-east-1',
  apiVersion: '2006-03-01',
});

const params = {
  Bucket: bucket,
  Key: fileName,
  Body: fs.createReadStream(imagePath), // stream the file from disk (imagePath as above)
  ACL: 'private',
  CacheControl: 'max-age=86400',
  ContentType: 'image/jpeg', // match the file you are uploading
};

await s3.upload(params).promise();