Uploading a URI or should it be converted? - node.js

I am sending image data from my React Native application to my Node.js backend, which I want to upload to S3. I want to know exactly which format I must convert the data to in order to upload it to S3. Below is the form data I am logging in my backend at the moment.
[
  'file',
  {
    uri: 'file:///var/mobile/Containers/Data/Application/CA974BC6-6943-4135-89DE-235BC593A54F/Library/Caches/ExponentExperienceData/%2540lb2020%252Fmy/ImagePicker/D7119C77-60D0-46CC-A194-4F1FDE0D9A3D.jpg',
    type: 'image/jpeg',
    name: 'hi.jpg'
  }
]
My backend has the code below. Would making the above data equal to file work? If not, suggestions would be appreciated.
const params = {
  Bucket: "myarrowbucket", // bucket you want to upload to
  Key: "filename" + ".png",
  Body: file,
  ContentType: 'image/png',
  ACL: "public-read",
};
I have tried uploading, but the image doesn't open correctly on S3, or I get Error: Unsupported body payload object.
Updated code (now giving a "no path found" error):
app.post("/upload", async (req, res) => {
  const uri = req.body._parts[0][1].uri;
  const file = uri.substring(7); // strip the "file://" scheme
  const fileStream = fs.createReadStream(file);
  const params = {
    Bucket: "myarrowbucket", // bucket you want to upload to
    Key: "filename" + ".png",
    Body: fileStream,
    ContentType: 'image/png',
    ACL: "public-read",
  };
  const data = await client.upload(params).promise();
  return res.send(data.Location); // respond with the object's URL
});

You need to provide the actual file bytes to the S3 client; the file:// URI points at a path on the phone, not on your server.
app.post("/upload", fileUpload(), async (req, res) => {
  const params = {
    Bucket: "myarrowbucket", // bucket you want to upload to
    Key: "filename" + ".png",
    Body: req.files.file.data, // <-- the file bytes parsed from the form-data by express-fileupload
    ACL: "public-read",
  };
  const data = await client.upload(params).promise();
  return res.send(data.Location); // respond with the object's URL
});
You can use a library like form-data to handle the form data conversion.

Related

Node.js aws s3 sdk v3 showing error while putting object

I am uploading a file to S3 from a Node.js GraphQL API using graphql-upload:
const { createReadStream, mimetype, filename } = file;
const fileStream = createReadStream();
const prefix = encodeURIComponent(folder) + '/';
const uploadParam = {
  Bucket: bucket,
  Key: `testing`,
  Body: fileStream,
  ContentType: mimetype,
};
await s3.send(new PutObjectCommand(uploadParam));
Every time I upload, it shows this error:
NotImplemented: A header you provided implies functionality that is not implemented
  Code: 'NotImplemented',
  Header: 'Transfer-Encoding',
What am I doing wrong?

The upload URL provided by AWS results in SignatureDoesNotMatch when accessing it

I am trying to follow a tutorial to upload files directly from the browser into a bucket using a presigned URL.
I tried this:
export const myFunction = async (imageType: string) => {
  const s3 = new AWS.S3();
  const imageName = "somename";
  const s3Params = {
    Bucket: bucketname,
    Key: imageName,
    ContentType: 'image/' + imageType,
    Expires: 300,
  };
  const uploadURL = await s3.getSignedUrlPromise('putObject', s3Params);
  return JSON.stringify({
    uploadURL,
    imageName,
  });
};
And I do get back a URL in Postman. I accessed it, it redirected me to a new request, and I set the method to PUT. I tried sending the request both without any file and with a file of the same name as in the URL (the image name appears in the URL) in the form-data, but both tries resulted in this error:
SignatureDoesNotMatch. The request signature we calculated does not match the signature you provided. Check your key and signing method.
I read something on Stack Overflow suggesting it could be caused by exotic characters in the file name, but that isn't my case.
How can I solve this problem?

Upload image to s3 from url

I am trying to upload an image, for which I have a URL, into S3.
I want to do this without downloading the image to local storage.
filePath = imageURL;
let params = {
  Bucket: 'bucketname',
  Body: fs.createReadStream(filePath),
  Key: "folder/" + id + "originalImage"
};
s3.upload(params, function (err, data) {
  if (err) console.log(err);
  if (data) console.log("original image success");
});
I'm expecting success but getting this error (myURL is a publicly accessible https URL):
[Error: ENOENT: no such file or directory, open '<myURL HERE>']
  errno: -2,
  code: 'ENOENT',
  syscall: 'open',
  path: '<myURL HERE>'
There are two ways to place files into an Amazon S3 bucket:
Upload the contents via PutObject(), or
Copy an object that is already in Amazon S3
So, in the case of your web-accessible Cloudinary image, it will need to be retrieved, then uploaded. You could either fully download the image and then upload it, or you could do some fancy stuff with streaming bodies where the source image is read into a buffer and then written to S3. Either way, the file will need to be "read" from Cloudinary, then "written" to S3.
As for the "web accessible Amazon S3 link", you could use the CopyObject() command to instruct S3 to directly copy the object from the source bucket to the destination bucket. (It also works within a bucket.)
This will make a GET request to the image URL, and pipe the response directly to the S3 upload stream, without the need to save the image to the local file system.
const request = require('request');
const AWS = require('aws-sdk');

const imageURL = 'https://example.com/image.jpg';
const s3 = new AWS.S3();
const params = {
  Bucket: 'bucketname',
  Key: "folder/" + id + "originalImage",
  Body: request(imageURL) // the HTTP response stream is uploaded as it arrives
};
s3.upload(params, function (err, data) {
  if (err) console.log(err);
  if (data) console.log("original image success");
});
You can use the following snippet to upload files to S3 using the file's URL.
import axios from 'axios';
import awsSDK from 'aws-sdk';

const uploadImageToS3 = () => {
  const url = 'www.abc.com/img.jpg';
  axios({
    method: 'get',
    url: url,
    responseType: 'arraybuffer',
  })
    .then(function (response) {
      console.log('res', response.data);
      const arrayBuffer = response.data;
      if (arrayBuffer) {
        awsSDK.config.update({
          accessKeyId: 'aws_access_key_id',
          secretAccessKey: 'aws_secret_access_key',
          region: 'aws_region',
        });
        const s3Bucket = new awsSDK.S3({
          apiVersion: '2006-03-01',
          params: {
            Bucket: 'aws_bucket_name'
          }
        });
        const data = {
          Body: arrayBuffer,
          Key: 'unique_key.fileExtension',
          ACL: 'public-read'
        };
        s3Bucket.upload(data, (err, res) => {
          if (err) {
            console.log('s3 err:', err);
          }
          if (res) {
            console.log('s3 res:', res);
          }
        });
      }
    });
};

uploadImageToS3();

Why is my S3 upload not uploading correctly?

I upload an image file using the following format:
var body = fs.createReadStream(tempPath).pipe(zlib.createGzip());
var s3obj = new AWS.S3({params: {Bucket: myBucket, Key: myKey}});
var params = {
  Body: body,
  ACL: 'public-read',
  ContentType: 'image/png'
};
s3obj.upload(params, function(err, data) {
  if (err) console.log("An error occurred with S3 fig upload: ", err);
  console.log("Uploaded the image file at: ", data.Location);
});
The image successfully uploads to my S3 bucket (there are no error messages and I see it in the S3-console), but when I try to display it on my website, it returns a broken img icon. When I download the image using the S3-console file downloader I am unable to open it with the error that the file is "damaged or corrupted".
If I upload a file manually using the S3-console, I can correctly display it on my website, so I'm pretty sure there's something wrong with how I'm uploading.
What is going wrong?
I eventually found the answer to my question. I needed to pass one more parameter, because the file is gzipped (from using var body = ...zlib.createGzip()). This fixed my problem:
var params = {
  Body: body,
  ACL: 'public-read',
  ContentType: 'image/png',
  ContentEncoding: 'gzip'
};
There's a very nice node module, s3-upload-stream, to upload (and first compress) images to S3. Here's their example code, which is very well documented:
var AWS = require('aws-sdk'),
    zlib = require('zlib'),
    fs = require('fs');

// Set the client to be used for the upload.
AWS.config.loadFromPath('./config.json');
// or do AWS.config.update({accessKeyId: 'akid', secretAccessKey: 'secret'});

var s3Stream = require('s3-upload-stream')(new AWS.S3());

// Create the streams
var read = fs.createReadStream('/path/to/a/file');
var compress = zlib.createGzip();
var upload = s3Stream.upload({
  "Bucket": "bucket-name",
  "Key": "key-name"
});

// Optional configuration
upload.maxPartSize(20971520); // 20 MB
upload.concurrentParts(5);

// Handle errors.
upload.on('error', function (error) {
  console.log(error);
});

/* Handle progress. Example details object:
   { ETag: '"f9ef956c83756a80ad62f54ae5e7d34b"',
     PartNumber: 5,
     receivedSize: 29671068,
     uploadedSize: 29671068 }
*/
upload.on('part', function (details) {
  console.log(details);
});

/* Handle upload completion. Example details object:
   { Location: 'https://bucketName.s3.amazonaws.com/filename.ext',
     Bucket: 'bucketName',
     Key: 'filename.ext',
     ETag: '"bf2acbedf84207d696c8da7dbb205b9f-5"' }
*/
upload.on('uploaded', function (details) {
  console.log(details);
});

// Pipe the incoming filestream through compression, and up to S3.
read.pipe(compress).pipe(upload);

How can I upload a file from an HTTP response body to S3 using putObject() from the AWS SDK (in Node.js)?

I'm trying to save a PDF file into S3 with the AWS SDK.
I'm getting the PDF through the body of a POST request (Application/PDF).
When saving the file to the local disk with fs.writeFile, the file looks OK. But when uploading it to S3, the file is corrupted (it's just a single-page PDF).
Any help or hint would be greatly appreciated!
var data = body; // body from a POST request.
var fileName = "test.pdf";
fs.writeFile(fileName, data, {encoding: "binary"}, function(err, data) {
  console.log('saved'); // File is OK!
});
s3.putObject({ Bucket: "bucketName", Key: fileName, Body: data }, function(err, data) {
  console.log('uploaded'); // File uploads incorrectly.
});
EDIT:
It works if I write the file, read it back, and then upload it.
fs.writeFile(fileName, data, {encoding: "binary"}, function(err, data) {
  fs.readFile(fileName, function(err, fileData) {
    s3.putObject({ Bucket: "bucketName", Key: fileName, Body: fileData }, function(err, data) {
      console.log('uploaded'); // File uploads correctly.
    });
  });
});
Try setting the ContentType and/or ContentEncoding on your put to S3, e.g.:
ContentType: 'binary', ContentEncoding: 'utf8'
See the code sample here for working example putObject makes object larger on server in Nodejs
I think it is because the data is consumed (i.e. it's a stream).
That would explain why, after writing the data, you send nothing to S3, whereas after reading it back you can send a valid PDF.
Try and see if it works by just sending the data directly to S3 without writing it to disk.
Yes, you forgot the callback of the writeFile function, so when you started uploading to Amazon S3 your file wasn't saved completely yet. Don't forget that Node.js is asynchronous: the app won't wait for fs.writeFile to finish its work; it simply runs s3.putObject at the same time.
My code is as below:
/**
 * JS library: Promise.promisify from bluebirdjs
 **/
global.Promise = require('bluebird');
const aws = require('aws-sdk');
const aswAccessKey = {
  accessKeyId: 'your-accesskey-id',
  secretAccessKey: 'your-secret-access-key'
};
const fs = require('fs');
const path = require('path');
const uuidV4 = require('uuid/v4');

// Create S3 service object
// available apiVersion: '2006-03-01', '2013-04-01'
const s3 = new aws.S3(Object.assign(aswAccessKey, {
  apiVersion: '2013-04-01'
}));

function putObject(bucketName, file) {
  console.log('putObject into ', bucketName);
  /**
   * If we don't use a versioned bucket, we must not pass VersionId
   */
  const params = {
    Bucket: bucketName,
    Key: '',
    Body: 'Plain text',
    ACL: 'public-read',
    ContentType: 'binary',
    CacheControl: 'max-age=172800'
  };
  return Promise
    .promisify(fs.readFile, { context: fs })(file)
    .then((fileData) => {
      console.log(fileData);
      params.Body = fileData;
      params.Key = 'g01/' + uuidV4() + '-' + path.basename(file);
      return Promise
        .promisify(s3.putObject, { context: s3 })(params)
        .then((data) => {
          console.log('successful');
          console.log(data);
        })
        .catch((err) => {
          console.log('Error', err);
        });
    })
    .catch(() => {
    });
}
