createReadStream without path - node.js

I'm new to programming and web development, and I'm not a native English speaker, so my explanation might be hard to understand.
I'm using the AWS SDK, AWS S3, Apollo Server, Apollo Client, React, and Node.
When a file is sent from a client to the Apollo server, the server destructures the file to create a readable stream so I can upload it to S3.
In the Node filesystem module docs, fs.createReadStream requires a path, but my code works without one:
I just call createReadStream() without any argument, and it works fine; the file is uploaded to S3.
let { createReadStream, filename, mimetype, encoding } = await file;
let stream = createReadStream();
// don't mind the Bucket field
s3.upload({
  Bucket: 'myBucket',
  Key: 'images/' + filename,
  Body: stream,
  ContentType: mimetype
});
Why does this work without a path argument?
Am I missing something?

It works because the createReadStream you destructure is not fs.createReadStream. The resolved `file` object (provided by graphql-upload, which Apollo Server uses for file uploads) exposes a createReadStream method that is already bound to the incoming file's data, so it takes no path. You can also pass the stream directly as Body:
let { createReadStream, filename, mimetype, encoding } = await file;
// don't mind the Bucket field
s3.upload({
  Bucket: 'myBucket',
  Key: 'images/' + filename,
  Body: createReadStream(),
  ContentType: mimetype
});

Related

amazon s3 - uploading empty image to bucket when using createWriteStream

When using createWriteStream, the image uploads to the bucket without any error, but it's empty (size 0 B).
const uploadImage = async (filePath, fileId) => {
  const fileStream = fs.createWriteStream(filePath);
  const uploadParams = {
    Bucket: bucket,
    ACL: "public-read",
    Body: fileStream,
    Key: filePath,
    ContentType: "image/png",
  };
  console.log(filePath);
  const data = await s3.upload(uploadParams).promise();
  console.log(data);
  return;
};
But when using readFileSync, it uploads the image correctly.
const uploadImage = async (filePath, fileId) => {
  const fileStream = fs.readFileSync(filePath);
  const uploadParams = {
    Bucket: bucket,
    ACL: "public-read",
    Body: fileStream,
    Key: filePath,
    ContentType: "image/png",
  };
  console.log(filePath);
  const data = await s3.upload(uploadParams).promise();
  console.log(data);
  return;
};
Why?
The problem you have is a logical one.
When you use createWriteStream you are creating a new, empty file on your file system, so when you upload it to S3 the object is empty as well.
When you use readFileSync, on the other hand, you are reading the file (in your case, the picture) from your file system and sending its bytes to S3. That byte array is not empty, because it was read from the file.
The first version needs a ReadStream to read the file data from the path: use fs.createReadStream(filePath).
Flow: read file from path -> write to S3.

Broken image from image upload to Amazon s3 via base64 string

I'm having issues getting the full image back from Amazon S3 after sending a base64 string (about 2.43 MB when converted to an image).
If I compress this image via https://compressnow.com/ and then upload it, it works fine and I get the full image back.
Is it possible to compress the base64 string before sending it to Amazon S3?
Here is the logic to upload to Amazon S3:
await bucket
  .upload({
    Bucket: "test",
    Key: "test",
    Body: "test",
    ContentEncoding: 'base64',
    Metadata: { MimeType: "png" },
  })
A similar issue is discussed here: Node base64 upload to AWS S3 bucket makes image broken
The ContentEncoding parameter specifies the header that S3 should send along with the HTTP response, not the encoding of the object as it is passed to the AWS SDK. According to the documentation, the Body parameter is simply the "Object data". In other words, you should probably just drop the ContentEncoding parameter unless you have a specific need for it, and pass along raw bytes:
const fs = require('fs');
const AWS = require('aws-sdk');
const s3 = new AWS.S3({ apiVersion: '2006-03-01' });

// Read the contents of a local file
const buf = fs.readFileSync('source_image.jpg');
// Or, if the contents are base64 encoded, decode them into a buffer of raw data:
// const buf = Buffer.from(fs.readFileSync('source_image.b64', 'utf-8'), 'base64');

const params = {
  Bucket: '-example-bucket-',
  Key: 'path/to/example.jpg',
  ContentType: 'image/jpeg',
  ACL: 'public-read',
  Body: buf,
  ContentLength: buf.length,
};

s3.putObject(params, function (err, data) {
  if (err) {
    console.log('Error uploading data: ', err);
  } else {
    console.log('Successfully uploaded the image!');
  }
});

Is there a way to stream large data directly to S3 files instead saving locally first?

My Node.js app produces data too big to save into a file first, so my question is: is there a way to stream this data directly to AWS S3?
You can use Upload from @aws-sdk/lib-storage, which allows you to upload buffers, blobs, or streams.
For example, if you have a stream you can pass it as Body:
const { S3Client } = require('@aws-sdk/client-s3');
const { Upload } = require('@aws-sdk/lib-storage');

async function upload(stream, fileName, bucketName, contentType) {
  const s3Client = new S3Client({ region: "us-east-1" });
  const upload = new Upload({
    client: s3Client,
    params: {
      Bucket: bucketName,
      Key: fileName,
      Body: stream,
      ContentType: contentType,
    }
  });
  return await upload.done();
}

Node.js aws s3 sdk v3 showing error while putting object

I am uploading a file to S3 from a Node.js GraphQL API using graphql-upload:
const { createReadStream, mimetype, filename } = file;
const fileStream = createReadStream();
const prefix = encodeURIComponent(folder) + '/';
const uploadParam = {
  Bucket: bucket,
  Key: `testing`,
  Body: fileStream,
  ContentType: mimetype,
};
await s3.send(new PutObjectCommand(uploadParam));
Every time I upload, it shows this error:
NotImplemented: A header you provided implies functionality that is not implemented
  Code: 'NotImplemented',
  Header: 'Transfer-Encoding',
What am I doing wrong?
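This question has no answer in the thread, but a likely cause (an assumption, not confirmed here): PutObjectCommand needs a known content length, and a graphql-upload stream has none, so the request is sent with Transfer-Encoding: chunked, which S3's PutObject rejects with exactly this NotImplemented error. Common workarounds are switching to Upload from @aws-sdk/lib-storage (as in the previous answer) or buffering the stream first so its length is known; a sketch of the latter:

```javascript
// Sketch: buffer the upload stream so PutObjectCommand receives a Body
// with a known length instead of a stream of unknown size.
async function streamToBuffer(stream) {
  const chunks = [];
  for await (const chunk of stream) {
    chunks.push(Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk));
  }
  return Buffer.concat(chunks);
}

// Usage (the s3 client and PutObjectCommand wiring are assumed, as in the
// question):
//   const body = await streamToBuffer(createReadStream());
//   await s3.send(new PutObjectCommand({
//     Bucket: bucket, Key: 'testing', Body: body,
//     ContentLength: body.length, ContentType: mimetype,
//   }));
```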

How to upload base64 encoded pdf directly to s3 with nodejs/aws-sdk?

I'm attempting to upload a base64 encoded PDF to S3 with the following code, without having to write the file to the filesystem.
const AWS = require('aws-sdk');

exports.putBase64 = async (object_name, buffer, bucket) => {
  const params = {
    Key: object_name,
    Body: buffer,
    Bucket: bucket,
    ContentEncoding: 'base64',
    ContentType: 'application/pdf'
  };
  const response = await S3.upload(params).promise();
  return response;
};
Where buffer is a blank PDF encoded to base64. When attempting to open the file on S3, I get "We can't open this file.
Something went wrong."
However, if I write the base64 encoding into a file and THEN upload it, it works:
fs.writeFileSync(`./somepdf.pdf`, base_64, 'base64');
exports.put = async (object_name, file_location, bucket, content_type) => {
  const file_content = fs.readFileSync(file_location);
  const params = {
    Key: object_name,
    Body: file_content,
    Bucket: bucket,
    ContentType: 'application/pdf'
  };
  const response = await S3.upload(params).promise();
  return response;
};
I notice that when I upload the file this way, viewing it in a text editor shows it isn't base64 encoded, whereas the object uploaded with ContentEncoding: 'base64' still shows the base64 text. I attempted to convert the base64 to a blob using atob, but that yielded the same result, so I assume there's a parameter or header I may be missing.
I had the same issue and managed to solve it by making this change:
const AWS = require('aws-sdk');
const S3 = new AWS.S3();

exports.putBase64 = async (object_name, buffer, bucket) => {
  const params = {
    Key: object_name,
    Body: Buffer.from(buffer, 'base64'), // <---------
    Bucket: bucket,
    ContentType: 'application/pdf'
  };
  return await S3.upload(params).promise();
};
If your base64 string carries a data-URL prefix, strip it first:
const cleanBase64 = buffer.replace(/^data:.+;base64,/, "");
Now pass this cleaned string to Buffer.from(cleanBase64, 'base64') in params. This should work!
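Both steps can be checked locally (a sketch; the %PDF magic bytes stand in for a real document): strip any data-URL prefix, then decode with Buffer.from(..., 'base64') so S3 receives raw bytes rather than base64 text:

```javascript
// Decode a (possibly data-URL-prefixed) base64 string into raw bytes,
// suitable for the Body parameter.
function base64ToBuffer(base64String) {
  const stripped = base64String.replace(/^data:.+;base64,/, '');
  return Buffer.from(stripped, 'base64');
}

// Hypothetical input: the start of a PDF, base64 encoded with a data-URL prefix.
const input = 'data:application/pdf;base64,' +
  Buffer.from('%PDF-1.4 example').toString('base64');

console.log(base64ToBuffer(input).toString()); // %PDF-1.4 example
```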
