I am trying to upload an image through a React app to an S3 bucket and then receive back the URL and show the image on the screen.
I am able to upload the image (sort of) and get the URL back from the S3 server, but when I download the file I am unable to open it: the format is unsupported, and I can't use the img tag to show it on the webpage. I guess it has something to do with the conversion to base64, but I can't figure out why it is not working.
The frontend (React) is:
const uploadImageToBucket = async (image) => {
  console.log("fff", image);
  try {
    // await the request and read the URL off the response data
    const response = await axios.post("http://localhost:5000/user/blogmanage/uploadimage", image);
    const image_location = response.data.body;
    console.log("img loc", image_location);
    return image_location;
  } catch (error) {
    console.log(error);
  }
};
The backend (Node.js) is:
router.post("/blogmanage/uploadimage", async (req, res) => {
  const s3 = new AWS.S3({
    accessKeyId: process.env["AWS_ACCESS_KEY_ID"],
    secretAccessKey: process.env["AWS_SECRET_KEY"],
    region: process.env["AWS_REGION"]
  });
  const BUCKET_NAME = "mrandmrseatmedia";
  var base64data = new Buffer.from('binary', req.body);
  const params = {
    Bucket: BUCKET_NAME,
    Key: "test/test2.jpg",
    Body: base64data
  };
  s3.upload(params, function (err, data) {
    if (err) {
      console.log(err);
      res.status(404).json({ msg: err });
    } else {
      const image_location = `${data.Location}`;
      console.log(`File uploaded successfully. ${data.Location}`);
      res.status(200).json({ body: image_location });
    }
  });
});
Thanks!
After a lot of testing, retesting, and rewriting, using this repo as an example:
https://github.com/Jerga99/bwm-ng/blob/master/server/services/image-upload.js
it works.
The use of base64 is wrong in this case; it corrupts the file in some way. The multer library fixes it.
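For reference, here is a minimal sketch in the spirit of that repo, using multer and multer-s3. The bucket name comes from the question; the field name 'image' and the key pattern are assumptions, and credentials are assumed to come from the environment:
const multer = require('multer');
const multerS3 = require('multer-s3');
const aws = require('aws-sdk');

const s3 = new aws.S3(); // credentials/region picked up from the environment

const upload = multer({
  storage: multerS3({
    s3: s3,
    bucket: 'mrandmrseatmedia',
    contentType: multerS3.AUTO_CONTENT_TYPE, // detect the MIME type, no manual base64 handling
    key: (req, file, cb) => cb(null, `test/${Date.now()}-${file.originalname}`)
  })
});

// multer parses the multipart body and streams the file to S3 untouched
router.post('/blogmanage/uploadimage', upload.single('image'), (req, res) => {
  res.status(200).json({ body: req.file.location });
});
On the client this means sending multipart/form-data rather than the raw image, e.g. building a FormData, appending the file under the 'image' field, and posting that with axios.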
I'm trying to get an image as a response using the public URL of the file:
var request = require('request');
request('https://bucket-name.s3.amazonaws.com/file-name').pipe(res);
When I send the request, the response is not an image file. I need to know how I can get an image file as the response instead of that.
Here's the upload function that works fine:
const fs = require('fs');
const AWS = require('aws-sdk');

const s3 = new AWS.S3();
const BUCKET_NAME = 'bucket-name'; // your bucket name

const uploadFile = (fileName) => {
  const fileContent = fs.readFileSync(fileName);

  // Setting up S3 upload parameters
  const params = {
    Bucket: BUCKET_NAME,
    Key: '30.png', // File name you want to save as in S3
    Body: fileContent,
  };

  // Uploading files to the bucket
  s3.upload(params, function (err, data) {
    if (err) {
      throw err;
    }
    console.log(`File uploaded successfully. ${data.Location}`);
  });
};
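A minimal sketch of one way to return the image itself, assuming an Express route (the route path is hypothetical): fetch the object from S3 and forward its content type so the browser can render it instead of showing raw data.
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

// Hypothetical route: GET /image/30.png streams the stored object back
app.get('/image/:key', (req, res) => {
  const params = { Bucket: 'bucket-name', Key: req.params.key };
  s3.getObject(params, (err, data) => {
    if (err) return res.status(404).end();
    res.set('Content-Type', data.ContentType || 'image/png'); // assumed PNG fallback
    res.send(data.Body); // Body is a Buffer containing the image bytes
  });
});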
I'm trying to upload files from a MERN application I'm working on. I'm almost done with the Node.js back-end part.
The application will allow users to upload images (jpg, jpeg, png, gifs, etc.) to an Amazon AWS S3 bucket that I created.
Well, let's put it this way. I created a helper:
const aws = require('aws-sdk');
const fs = require('fs');

// Enter copied or downloaded access ID and secret key here
const ID = process.env.AWS_ACCESS_KEY_ID;
const SECRET = process.env.AWS_SECRET_ACCESS_KEY;

// The name of the bucket that you have created
const BUCKET_NAME = process.env.AWS_BUCKET_NAME;

const s3 = new aws.S3({
  accessKeyId: ID,
  secretAccessKey: SECRET
});

const uploadFile = async images => {
  // Read content from the file
  const fileContent = fs.readFileSync(images);

  // Setting up S3 upload parameters
  const params = {
    Bucket: BUCKET_NAME,
    // Key: 'cat.jpg', // File name you want to save as in S3
    Body: fileContent
  };

  // Uploading files to the bucket
  s3.upload(params, function(err, data) {
    if (err) {
      throw err;
    }
    console.log(`File uploaded successfully. ${data.Location}`);
  });
};

module.exports = uploadFile;
That helper takes three of my environment variables: the name of the bucket, the key ID, and the secret key.
When adding files from the form (which will eventually be built on the front end), the user will be able to send more than one file.
Right now my post route looks exactly like this:
req.body.user = req.user.id;
req.body.images = req.body.images.split(',').map(image => image.trim());
const post = await Post.create(req.body);
res.status(201).json({ success: true, data: post });
That right there works great, but it takes req.body.images as a string with each image separated by a comma. What would the right approach be to upload the many files selected from the Windows file dialog to AWS S3? I tried doing this, but it did not work:
// Add user to req.body
req.body.user = req.user.id;
uploadFile(req.body.images);
const post = await Post.create(req.body);
res.status(201).json({ success: true, data: post });
Thanks, and hopefully you guys can help me out with this one. Right now I'm testing it with Postman, but later on the files will be sent via a form.
Well, you could just call uploadFile once for each file:
try {
  const promises = [];
  for (const img of images) {
    promises.push(uploadFile(img));
  }
  await Promise.all(promises);
  // rest of logic
} catch (err) {
  // handle err
}
On a side note, you should wrap s3.upload in a promise:
const AWS = require('aws-sdk');

const s3 = new AWS.S3({
  accessKeyId: process.env.AWS_ACCESS_KEY,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
});

module.exports = ({ params }) => {
  return new Promise((resolve, reject) => {
    s3.upload(params, function (s3Err, data) {
      if (s3Err) return reject(s3Err);
      console.log(`File uploaded successfully at ${data.Location}`);
      return resolve(data);
    });
  });
};
Bonus: if you wish to avoid having your backend handle uploads, you can use AWS S3 presigned URLs and let the client browser handle the upload, thus saving your server resources.
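For illustration, a minimal sketch of generating a presigned upload URL with the v2 SDK; the bucket name, key, and expiry here are assumptions:
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

// Sign a short-lived PUT URL; the browser then uploads the file body
// directly to S3 with a plain HTTP PUT, bypassing your server.
const getUploadUrl = (key, contentType) =>
  s3.getSignedUrlPromise('putObject', {
    Bucket: 'my-bucket',      // assumed bucket name
    Key: key,
    ContentType: contentType, // must match the type the client sends
    Expires: 60               // URL stays valid for 60 seconds
  });

getUploadUrl('uploads/cat.jpg', 'image/jpeg').then(url => console.log(url));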
One more thing: your Post object should only contain the URLs of the media, not the media itself.
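For example, a hypothetical Mongoose schema along those lines (the field names are assumptions, not the asker's actual model):
const mongoose = require('mongoose');

const PostSchema = new mongoose.Schema({
  user: { type: mongoose.Schema.Types.ObjectId, ref: 'User' },
  images: [String] // the S3 URLs returned after upload, never the file bytes
});

module.exports = mongoose.model('Post', PostSchema);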
// Setting up S3 upload parameters
const params = {
  Bucket: bucket, // bucket name
  Key: fileName, // file name you want to save as in S3
  Body: Buffer.from(imageStr, 'binary'), // image must be in a buffer
  ACL: 'public-read', // allow the file to be read by anyone
  ContentType: 'image/png', // image header so the browser can render the image
  CacheControl: 'max-age=31536000, public' // caching header for the browser
};

// Uploading files to the bucket
try {
  const result = await s3.upload(params).promise();
  return result.Location;
} catch (err) {
  console.log('upload error', err);
  throw err;
}
I have a Node.js script that uploads files to AWS S3 through the command line. The problem I'm having is that when I try to view the file in the browser, it automatically downloads it.
I have done some research, and most other posts point to the headers, but I have verified that the headers are correct (image/png).
Additionally, when I upload the same file through the AWS console (logged into AWS), I am able to view the file within the browser.
var AWS = require('aws-sdk');
var fs = require('fs');
var path = require('path');

AWS.config.update({ region: myRegion });
var s3 = new AWS.S3({ apiVersion: '2006-03-01' });

var uploadParams = {
  Bucket: process.argv[2],
  Key: '', // Key set below
  Body: '', // Body set below after createReadStream
  ContentType: 'image/jpeg',
  ACL: 'public-read',
  ContentDisposition: 'inline'
};

var file = process.argv[3];
var fileStream = fs.createReadStream(file);
fileStream.on('error', function(err) {
  console.log('File Error', err);
});
uploadParams.Body = fileStream;
uploadParams.Key = path.basename(file);

s3.putObject(uploadParams, function(errBucket, dataBucket) {
  if (errBucket) {
    console.log("Error uploading data: ", errBucket);
  } else {
    console.log(dataBucket);
  }
});
The upload succeeds, but I am unable to view the file in the browser because it auto-downloads.
You have to specify the content disposition as part of the request headers; you cannot specify it as part of the request parameters. Set it in the headers explicitly, as below:
var params = { Bucket: "bucketname", Key: "keyName", Body: "actualData" };
s3.putObject(params).
  on('build', function(req) {
    req.httpRequest.headers['Content-Type'] = 'application/pdf'; // whatever you want
    req.httpRequest.headers['Content-Disposition'] = 'inline';
  }).
  send(function(err, data) {
    if (err) {
      console.log(err);
      return res.status(400).json({ success: false });
    } else {
      console.log(data);
      return res.status(200).json({ success: true });
    }
  });
Code to upload objects/images to S3:
module.exports = function(app, models) {
  var fs = require('fs');
  var AWS = require('aws-sdk');

  var accessKeyId = "ACCESS KEY HERE";
  var secretAccessKey = "SECRET KEY HERE";

  AWS.config.update({
    accessKeyId: accessKeyId,
    secretAccessKey: secretAccessKey
  });

  var s3 = new AWS.S3();

  app.post('/upload', function(req, res) {
    var params = {
      Bucket: 'bucketname',
      Key: 'keyname.png',
      Body: "GiveSomeRandomWordOraProperBodyIfYouHave"
    };
    s3.putObject(params, function (perr, pres) {
      if (perr) {
        console.log("Error uploading data: ", perr);
      } else {
        console.log("Successfully uploaded data to myBucket/myKey");
      }
    });
  });
};
The above code will make sure the object has been uploaded to S3. You can see it listed in the S3 bucket in the browser, but you can't view its contents there.
You cannot view items within S3 itself; S3 is a storage box, so you can only download and upload the elements in it. If you need to view the contents, you have to download the file and open it in the browser or any viewer of your choice. If you simply need to list the objects in S3, use the code below.
Code to list the objects of S3:
var AWS = require('aws-sdk');
AWS.config.update({ accessKeyId: 'mykey', secretAccessKey: 'mysecret', region: 'myregion' });

var s3 = new AWS.S3();
var params = {
  Bucket: 'bucketName',
  Delimiter: '/',
  Prefix: 's/prefix/objectPath/'
};

s3.listObjects(params, function (err, data) {
  if (err) throw err;
  console.log(data);
});
Use the S3 listing to enumerate the elements of S3; this way you can view them. Create a hyperlink for each listed item and make it point to the S3 download URL. That way you can view an item in the browser and also download it if you need to.
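For illustration, a minimal sketch of turning that listing into browsable links, assuming the objects are publicly readable (otherwise you would generate presigned URLs instead):
s3.listObjects(params, function (err, data) {
  if (err) throw err;
  // Build a public URL for every key returned by the listing
  var links = data.Contents.map(function (obj) {
    return 'https://' + params.Bucket + '.s3.amazonaws.com/' + encodeURIComponent(obj.Key);
  });
  console.log(links); // render these as <a href="..."> hyperlinks in your page
});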
If you need to view the contents via Node.js, use the code below to load the image as if you were loading it from a remote URL.
Code to download contents:
var fs = require('fs'),
    request = require('request');

var download = function(uri, filename, callback) {
  request.head(uri, function(err, res, body) {
    console.log('content-type:', res.headers['content-type']);
    console.log('content-length:', res.headers['content-length']);
    request(uri).pipe(fs.createWriteStream(filename)).on('close', callback);
  });
};

download('http://s3/URL', 'name.png', function() {
  console.log('done');
});
Code to load an image into a buffer:
const request = require('request');

let url = 'http://s3url/image.png';
request({ url, encoding: null }, (err, resp, buffer) => {
  // typeof buffer === 'object'
  // Use the buffer
  // This buffer now contains the image data
});
Use the above to load the image into a buffer. Once it's in a buffer, you can manipulate it the way you need. The above code won't download the image, but it helps you manipulate the image in S3 using a buffer.
The linked example code contains specific Node.js examples for uploading and manipulating S3 objects; use it for reference.
I have a little app to upload a file to AWS S3. It uploads OK, but when I download the file from the S3 bucket, it is encoded and shows type: Buffer, etc.
If I upload the same file from the console, it shows fine.
Here is the code to upload:
const fs = require('fs');
const AWS = require('aws-sdk');

const s3 = new AWS.S3({
  accessKeyId: process.env.AWS_ACCESS_KEY,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
});

const fileName = 'su.csv';

const uploadFile = () => {
  fs.readFile(fileName, (err, data) => {
    if (err) throw err;
    const params = {
      Bucket: 'mybukk22-test', // pass your bucket name
      Key: 'su.csv', // file will be saved as testBucket/contacts.csv
      Body: JSON.stringify(data, null, 2)
    };
    s3.upload(params, function (s3Err, data) {
      if (s3Err) throw s3Err;
      console.log(`File uploaded successfully at ${data.Location}`);
    });
  });
};

uploadFile();
Is the problem in the Body? How do I save the file the same as it is on the client?
Thanks.
The issue is that you are trying to do a JSON.stringify on a Buffer, because fs.readFile returns a Buffer. To make it work, change your params to the following:
const params = {
  Bucket: 'mybukk22-test', // pass your bucket name
  Key: 'su.csv', // file will be saved as testBucket/contacts.csv
  Body: data
};
(Just pass data 1:1 as the Body of your upload operation.)
Otherwise, if you'd like to stick to your solution, convert data to a string like this: JSON.stringify(data.toString(), null, 2).
I want to:
1. Choose an image from my filesystem and upload it to the server/local storage.
2. Get its URL back using a Node.js service.
I managed to do step 1, and now I want to get the image URL instead of the success message in res.end. Here is my code:
app.post("/api/Upload", function(req, res) {
upload(req, res, function(err) {
if (err) {
return res.end("Something went wrong!");
}
return res.end("File uploaded sucessfully!.");
});
});
I'm using multer to upload the image.
You can do something like this using AWS S3; it returns the URL of the uploaded image:
const AWS = require('aws-sdk');

AWS.config.update({
  accessKeyId: <AWS_ACCESS_KEY>,
  secretAccessKey: <AWS_SECRET>
});

const uploadImage = file =>
  new Promise((resolve, reject) => {
    // strip the data-URI prefix and decode the base64 payload
    const replaceFile = file.data_uri.replace(/^data:image\/\w+;base64,/, '');
    const buf = Buffer.from(replaceFile, 'base64');
    const s3 = new AWS.S3();
    s3.upload({
      Bucket: <YOUR_BUCKET>,
      Key: <NAME_TO_SAVE>,
      Body: buf,
      ACL: 'public-read'
    }, (err, data) => {
      if (err) return reject(err);
      resolve(data.Location); // this is the URL
    });
  });
Also, you can check this Express generator, which has a route to upload images to AWS S3: https://www.npmjs.com/package/speedbe
I am assuming that you are saving the image on the server filesystem and not in a storage solution like AWS S3 or Google Cloud Storage, where you get the URL after upload.
Since you are storing it on the filesystem, you can rename the file with a unique identifier, like a UUID.
Then you can make a GET route that takes that ID as a query or path parameter, reads the file having that ID as its name, and sends it back.
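For illustration, a minimal sketch of that idea, assuming Express with multer disk storage; the uploads/ directory and route paths are hypothetical:
const express = require('express');
const multer = require('multer');
const path = require('path');
const { randomUUID } = require('crypto');

const app = express();

// Rename each upload to a UUID, keeping the original extension
const storage = multer.diskStorage({
  destination: 'uploads/', // hypothetical directory
  filename: (req, file, cb) => cb(null, randomUUID() + path.extname(file.originalname))
});
const upload = multer({ storage });

app.post('/api/Upload', upload.single('image'), (req, res) => {
  // Return the image URL instead of a success message
  res.json({ url: `/api/images/${req.file.filename}` });
});

app.get('/api/images/:id', (req, res) => {
  // Read the file whose name is the ID and send it back
  res.sendFile(path.resolve('uploads', req.params.id));
});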