I want to:
1. Choose an image from my filesystem and upload it to the server (local).
2. Get its URL back using a Node.js service.
I managed to do step 1, and now I want to get the image URL instead of the success message in res.end.
Here is my code:
app.post("/api/Upload", function(req, res) {
  upload(req, res, function(err) {
    if (err) {
      return res.end("Something went wrong!");
    }
    return res.end("File uploaded successfully!");
  });
});
I'm using multer to upload the image.
You can do something like this using AWS S3; it returns the URL of the uploaded image:
const AWS = require('aws-sdk')

AWS.config.update({
  accessKeyId: <AWS_ACCESS_KEY>,
  secretAccessKey: <AWS_SECRET>
})

const uploadImage = file => {
  // Strip the data-URI prefix and decode the base64 payload
  const base64Data = file.data_uri.replace(/^data:image\/\w+;base64,/, '')
  const buf = Buffer.from(base64Data, 'base64') // new Buffer() is deprecated
  const s3 = new AWS.S3()
  s3.upload({
    Bucket: <YOUR_BUCKET>,
    Key: <NAME_TO_SAVE>,
    Body: buf,
    ACL: 'public-read'
  }, (err, data) => {
    if (err) throw err;
    return data.Location; // this is the URL (note: returned to the S3 callback, not to uploadImage's caller)
  })
}
You can also check this Express generator, which has a route to upload images to AWS S3: https://www.npmjs.com/package/speedbe
I am assuming that you are saving the image on the server filesystem rather than a storage solution like AWS S3 or Google Cloud Storage, where you get the URL back after the upload.
Since you are storing it on the filesystem, you can rename the file with a unique identifier, e.g. a uuid.
Then you can add a GET route that takes that ID as a query or path parameter, reads the file with that ID as its name, and sends it back.
I've created an API to upload images to an Amazon S3 bucket with Node.js, multer, and multer-s3.
It works fine in development, returning a response with an image URL that is downloadable and accessible. But when I host my Node app on AWS Lambda (behind API Gateway), the API returns the same response with the image URL, yet this time when I open the downloaded image it shows as an invalid file format.
Here is my code:
const uploadImage = async (req, res) => {
  let myFile = req.file.originalname.split(".")
  const fileType = myFile[myFile.length - 1]

  const params = {
    Bucket: 'test-bucket2601',
    Key: `${Date.now()}.${fileType}`,
    Body: req.file.buffer
  }

  s3.upload(params, (error, data) => {
    if (error) {
      return res.status(500).send(error) // return here, or res.json below also runs
    }
    res.json({ data })
  })
}
Route with middleware (the first argument to .post is the upload middleware):
routes.route('/upload').post(upload, uploadImage);
Middleware code:
const s3 = new aws.S3({
  credentials: {
    accessKeyId: awsKeys?.accessKeyId,
    secretAccessKey: awsKeys?.secretAccessKey
  }
});

// memoryStorage takes no options; the file is kept in memory as req.file.buffer
const storage = multer.memoryStorage()

const upload = multer({ storage }).single('imageUrl')
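The "invalid file format" after deploying is often two separate issues. First, the upload params above never set ContentType, so S3 stores everything as application/octet-stream; second, API Gateway re-encodes request bodies as UTF-8 unless the relevant MIME types (e.g. multipart/form-data) are registered as binary media types, which corrupts the bytes before multer ever sees them. A sketch of the first fix, using a hypothetical stand-in for the multer-populated request:

```javascript
// Hypothetical stand-in for the multer-populated request (illustration only)
const req = {
  file: {
    originalname: 'photo.png',
    buffer: Buffer.from('fake-image-bytes'),
    mimetype: 'image/png'
  }
};

const fileType = req.file.originalname.split('.').pop();

const params = {
  Bucket: 'test-bucket2601',
  Key: `${Date.now()}.${fileType}`,
  Body: req.file.buffer,
  ContentType: req.file.mimetype // without this, S3 serves application/octet-stream
};
```

The binary media types setting lives in the API Gateway configuration (or your serverless framework config), not in the Node code.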
I'm trying to get an image as a response using the public URL of the file:

var request = require('request');
request('https://bucket-name.s3.amazonaws.com/file-name').pipe(res);

When I send the request, this is the response I get. I need to know how I can get an image file as the response instead.
Here's the upload function, which works fine (the opening line was missing from the original snippet; a signature taking the file name is assumed from the body):

const uploadFile = (fileName) => {
  // Read content from the file
  const fileContent = fs.readFileSync(fileName);

  // Setting up S3 upload parameters
  const params = {
    Bucket: BUCKET_NAME,
    Key: '30.png', // File name you want to save as in S3
    Body: fileContent,
  };

  // Uploading files to the bucket
  s3.upload(params, function (err, data) {
    if (err) {
      throw err;
    }
    console.log(`File uploaded successfully. ${data.Location}`);
  });
};
I am trying to upload an image through a React app to an S3 bucket and then receive back the URL and show the image on the screen.
I am able to upload the image (sort of) and get the URL back from the S3 server, but when I download it I am unable to open it: the format is unsupported, and I can't use the img tag to show it on the webpage. I guess it has something to do with the conversion to base64, but I can't figure out why it is not working.
The frontend (React) is:
const uploadImageToBucket = async (image) => {
  console.log("fff", image)
  try {
    // await the request; the original mixed an un-awaited axios.post with .then(),
    // so image_location was a pending Promise
    const response = await axios.post("http://localhost:5000/user/blogmanage/uploadimage", image)
    const image_location = response.data.body
    console.log("img loc", image_location)
    return image_location;
  } catch (error) {
    console.error(error) // the original swallowed errors silently
  }
}
The backend (Node.js) is:
router.post("/blogmanage/uploadimage", async (req, res) => {
  const s3 = new AWS.S3({
    accessKeyId: process.env["AWS_ACCESS_KEY_ID"],
    secretAccessKey: process.env["AWS_SECRET_KEY"],
    region: process.env['AWS_REGION']
  });
  const BUCKET_NAME = "mrandmrseatmedia";
  // Bug: the arguments are reversed (Buffer.from(data, encoding)) and Buffer.from
  // is not a constructor; this line is what corrupts the file
  var base64data = new Buffer.from('binary', req.body);
  const params = {
    Bucket: BUCKET_NAME,
    Key: "test/test2.jpg",
    Body: base64data
  }
  s3.upload(params, function (err, data) {
    if (err) {
      console.log(err)
      res.status(404).json({ msg: err });
    } else {
      const image_location = `${data.Location}`;
      console.log(`File uploaded successfully. ${data.Location}`);
      res.status(200).json({ body: image_location });
    }
  })
});
Thanks!
After a lot of testing, retesting, and rewriting, using this repo as an example:
https://github.com/Jerga99/bwm-ng/blob/master/server/services/image-upload.js
it works.
The use of base64 is wrong in this case; it corrupts the file in some way. The multer library fixes it.
I'm trying to upload files from a MERN application I'm working on. I'm almost done with the Node.js back-end part.
The application will allow users to upload images (jpg, jpeg, png, gifs, etc.) to an Amazon AWS S3 bucket that I created.
Let's put it this way: I created a helper:
const aws = require('aws-sdk');
const fs = require('fs');

// Enter copied or downloaded access ID and secret key here
const ID = process.env.AWS_ACCESS_KEY_ID;
const SECRET = process.env.AWS_SECRET_ACCESS_KEY;

// The name of the bucket that you have created
const BUCKET_NAME = process.env.AWS_BUCKET_NAME;

const s3 = new aws.S3({
  accessKeyId: ID,
  secretAccessKey: SECRET
});

const uploadFile = async images => {
  // Read content from the file
  const fileContent = fs.readFileSync(images);

  // Setting up S3 upload parameters
  const params = {
    Bucket: BUCKET_NAME,
    // Key: 'cat.jpg', // File name you want to save as in S3
    // Note: Key is required; with it commented out, s3.upload rejects the request
    Body: fileContent
  };

  // Uploading files to the bucket
  s3.upload(params, function(err, data) {
    if (err) {
      throw err;
    }
    console.log(`File uploaded successfully. ${data.Location}`);
  });
};

module.exports = uploadFile;
That helper takes three of my environment variables: the bucket name, the key ID, and the secret key.
When adding files from the form (which will eventually be added on the front end), the user will be able to send more than one file.
Right now my post route looks exactly like this:
req.body.user = req.user.id;
req.body.images = req.body.images.split(',').map(image => image.trim());
const post = await Post.create(req.body);
res.status(201).json({ success: true, data: post });
That right there works great, but it takes req.body.images as a string with each image separated by a comma. What would the right approach be to upload the many files selected from the Windows file picker to AWS S3? I tried this, but it did not work:
// Add user to req.body
req.body.user = req.user.id;
uploadFile(req.body.images);
const post = await Post.create(req.body);
res.status(201).json({ success: true, data: post });
Thanks, and hopefully you guys can help me out with this one. Right now I'm testing it with Postman, but later on the files will be sent via a form.
Well, you could just call uploadFile multiple times, once for each file:
try {
  const promises = []
  for (const img of images) {
    promises.push(uploadFile(img))
  }
  await Promise.all(promises)
  // rest of logic
} catch (err) {
  // handle err
}
On a side note, you should wrap S3.upload in a promise:
const AWS = require('aws-sdk')

const s3 = new AWS.S3({
  accessKeyId: process.env.AWS_ACCESS_KEY,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
})

module.exports = ({ params }) => {
  return new Promise((resolve, reject) => {
    s3.upload(params, function (s3Err, data) {
      if (s3Err) return reject(s3Err)
      console.log(`File uploaded successfully at ${data.Location}`)
      return resolve(data)
    })
  })
}
Bonus: if you wish to avoid having your backend handle uploads, you can use AWS S3 signed URLs and let the client browser handle the upload, saving your server resources.
One more thing: your Post object should only contain the URLs of the media, not the media itself.
// Setting up S3 upload parameters
const params = {
  Bucket: bucket, // bucket name
  Key: fileName, // file name you want to save as in S3
  Body: Buffer.from(imageStr, 'binary'), // image must be in a buffer
  ACL: 'public-read', // allow file to be read by anyone
  ContentType: 'image/png', // image header so the browser can render the image
  CacheControl: 'max-age=31536000, public' // caching header for the browser
};

// Uploading files to the bucket
try {
  const result = await s3.upload(params).promise();
  return result.Location;
} catch (err) {
  console.log('upload error', err);
  throw err;
}
I'm trying to upload images to AWS S3 using multer-s3. Everything works fine (i.e. uploading videos, images, and files), but it's incomplete: I have no idea how to track the progress or percentage of the upload.
My multer code is set up as middleware like this:
const multer = require('multer');
const AWS = require('aws-sdk');
const multerS3 = require('multer-s3');

var s3 = new AWS.S3();

const s3Storage = multerS3({
  s3: s3,
  bucket: 'app-bucket',
  acl: 'public-read',
  key: function (req, file, callback) {
    callback(null, file.originalname);
  }
});

module.exports.s3Upload = multer({ storage: s3Storage });
Then I attach the middleware to my route like this:
router.route('/image/upload').get(uploadController.getUploadImageController)
.post(middleware.s3Upload.single('myImage'),
uploadController.postUploadPhotoToAlbumController );
Then my controller is a simple post handler that saves the path to the database:
module.exports.postUploadPhotoToAlbumController = (req, res) => {
  let query = Images.findById({ _id: req.params.id });
  query.exec((err, images) => {
    if (err) {
      return res.status(500).send({ success: false, error: err, message: 'Something went wrong.' });
    }
    if (!images) {
      return res.status(200).send({ success: false, message: 'That image does not exist in your album.' });
    }
    images.image = !!req.file ? AwsS3PublicURL.setAwsPublicUrlSingle(req) : null;
    images.save(err => {
      if (err) {
        return res.status(500).send({ success: false, error: err, message: 'Something went wrong.' });
      }
      req.flash('message', 'Your image was successfully uploaded.');
      res.redirect('/album/photos');
    });
  });
}
AwsS3PublicURL.setAwsPublicUrlSingle builds the public path to my Amazon S3 bucket, which is set to public.
My problem is that I don't know how to properly track the progress (or percentage) of my upload and display it on the frontend or in the console. Thank you in advance if anyone knows the answer.