I am working on Node.js and Express.js applications. I want to delete an image from AWS S3 that I uploaded with multer-s3.
I have tried many examples that I found online, but none of them worked.
I tried this example, but it didn't work:
router.delete("/:id/delete", async (req, res) => {
  const params = {
    Bucket: bucketname,
    Key: "https://nodeilders.s3.us-east-2.amazonaws.com/public/uploads/programImages/church4%20%281%29.jpeg",
  };
  s3.deleteObject(params, (error, data) => {
    if (error) {
      res.status(500).send(error);
    } else {
      console.log("File has been deleted successfully");
    }
  });
});
I also tried this example, but it didn't work either:
router.delete("/:id/delete", async (req, res) => {
  const s3delete = function (params) {
    return new Promise((resolve, reject) => {
      s3.createBucket({
        Bucket: BUCKET_NAME /* Put your bucket name */
      }, function () {
        s3.deleteObject(params, function (err, data) {
          if (err) console.log(err);
          else console.log("Successfully deleted file from bucket");
          console.log(data);
        });
      });
    });
  };
});
The first example logged that the file was deleted successfully, but the object was still in the bucket.
What can I try to resolve this?
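One thing worth knowing: `deleteObject` expects the object key (e.g. `public/uploads/...`), not the full object URL, and it reports success even when the given key does not exist in the bucket, which would explain the misleading log. A small sketch of a helper (hypothetical, assuming a virtual-hosted-style URL like the one above) that derives the key from an object URL:

```javascript
// Derive the S3 object key from a full object URL.
// Assumes a virtual-hosted-style URL: https://bucket.s3.region.amazonaws.com/<key>
function keyFromUrl(objectUrl) {
  const { pathname } = new URL(objectUrl);
  // Drop the leading "/" and decode percent-escapes such as %20 and %28
  return decodeURIComponent(pathname.slice(1));
}

const key = keyFromUrl(
  "https://nodeilders.s3.us-east-2.amazonaws.com/public/uploads/programImages/church4%20%281%29.jpeg"
);
console.log(key); // → public/uploads/programImages/church4 (1).jpeg
```

Passing that key as `Key` (together with the bucket the object actually lives in) should make the delete take effect; you can verify afterwards with `s3.headObject`, which errors if the object is gone.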
I am trying to use async and await with Product.findOneAndUpdate(), but I am getting an "await is only valid in async function" error on the await Product.findOneAndUpdate() line. Here is my code. Many thanks in advance; any help is greatly appreciated.
router.post('/product/saveeditproduct/:id', JWTAuthenticatToken, async (req, res) => {
  let form = formidable.IncomingForm()
  form.parse(req, (err, fields, files) => {
    if (err) {
      return res.json({ statusCode: "400", msg: "upload denied" })
    }
    const { productname, productdescription } = fields
    const productslug = slugify(productname)
    const { image } = files
    const product = await Product.findOneAndUpdate({ productslug: req.params.id },
      { $set: { productname: productname, productdescription: productdescription } }, { new: true })
    if (image) {
      //---Remove old image from AWS S3---
      const deleteParams = {
        Bucket: process.env.AWS_BUCKET_NAME,
        Key: `image/${product.productslug}`,
        Body: fs.readFileSync(image.path),
        ACL: 'public-read',
        ContentType: `image/jpg`
      }
      s3.deleteObject(deleteParams, (err, data) => {
      })
      //---Remove old image from AWS S3---
      //----Upload new image to AWS S3----
      const params = {
        Bucket: process.env.AWS_BUCKET_NAME,
        Key: `image/${productslug}`,
        Body: fs.readFileSync(image.path),
        ACL: 'public-read',
        ContentType: `image/jpg`
      }
      s3.upload(params, async (err, data) => {
        if (err) {
          res.json({ status: true, error: err })
        } else {
          product.productimageurl = data.Location
          const productresult = await product.save()
          return res.json({ statusCode: "200", data: productresult })
        }
      })
    }
    //----Upload new image to AWS S3----
    return res.json({ statusCode: "200" })
  })
})
I think you forgot to add async at:
form.parse(req, async (err, fields, files) => {
  // code....
})
await is only valid inside a function declared async.
You made the outer function async:
router.post('/product/saveeditproduct/:id', JWTAuthenticatToken, async (req, res) => {
});
But you forgot to add async on the inner function (the parent of that particular await).
The solution is to make that callback async:
form.parse(req, async (err, fields, files) => {
})
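The fix can be reproduced in miniature without formidable or Mongoose (parseLike and fakeUpdate below are hypothetical stand-ins for form.parse and Product.findOneAndUpdate):

```javascript
// A callback-taking function, like form.parse
function parseLike(input, cb) {
  return cb(null, input);
}

// An async function, like Product.findOneAndUpdate
async function fakeUpdate(value) {
  return value * 2;
}

// Because the callback is marked async, the inner await is legal;
// without the async keyword this line would be a SyntaxError.
const done = parseLike(21, async (err, fields) => {
  const result = await fakeUpdate(fields);
  return result;
});

done.then((result) => console.log(result)); // → 42
```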
I am trying to compress an uploaded file using Node.js zlib. The compression works, but trying to uncompress the result throws an error. I created a compress route, a POST request that uploads the file to be compressed:
app.post('/compress', upload.single('file'), (req, res) => {
  try {
    var streamInstance = new stream.Readable();
    const destination = createWriteStream(`compressed/${req.file.originalname}.gz`);
    const source = streamInstance.push(Buffer.from(JSON.stringify(req.file)))
    res.json(source)
    streamInstance.push(null);
    pipeline(source, gzip, destination, (err, file) => {
      if (err) {
        console.log(err)
        return res.json('An error occurred:', err);
      } else {
        console.log({ file: file })
        return res.json(file)
      }
    })
  } catch (err) {
    console.log(err)
    res.json(err)
  }
})
This adds the compressed file to the compressed directory, but trying to uncompress it throws an error.
Is there another method I could use to compress this file with zlib?
Multer gives you the memoryStorage method, so you can get the actual buffer of the file. With that you can do this:
const multer = require('multer')
const zlib = require('zlib')
const fs = require('fs')
const path = require('path')

var storage = multer.memoryStorage()
var upload = multer({ storage: storage })

app.post('/compress', upload.single('file'), (req, res) => {
  try {
    const destination = `compressed/${req.file.originalname}.gz`;
    // gzip the in-memory buffer, then write the result to disk
    zlib.gzip(req.file.buffer, (err, response) => {
      if (err) {
        return res.status(500).json({ error: err })
      }
      fs.appendFile(path.join(__dirname, destination), response, (err) => {
        if (err) {
          return res.status(500).json({ error: err })
        }
        res.download(path.join(__dirname, destination), `${req.file.originalname}.gz`);
      })
    })
  } catch (err) {
    console.log(err)
    res.json(err)
  }
})
I successfully upload my files to the AWS S3 bucket, but I cannot get the file's location back to store it in my DB.
Here is my function:
const uploadFile = (filename, key) => {
  return new Promise((resolve, reject) => {
    fs.readFile(filename, (err, data) => {
      if (err) {
        return reject(err);
      }
      const params = {
        Bucket: "BUCKET_NAME",
        Key: `student_${key}`, // File name you want to save as in S3
        Body: data,
        ACL: 'public-read'
      };
      s3.upload(params, function (err, data) {
        if (err) {
          return reject(err);
        }
        resolve(data.Location);
      });
    });
  })
};
My router :
uploadFile.uploadFile(request.file.path, request.file.originalname).then((addr) => {
  student_photo = addr;
})
Eventually I get an empty string (when I console.log it).
The solution I found was to have uploadFile return a Promise, which makes it "thenable". In the .then() part I make the query to store the info in my SQL database.
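In other words, student_photo is read before the .then callback has run. The same pattern can be sketched with await (uploadFileFake below is a hypothetical stand-in for the real uploadFile, since s3.upload needs credentials):

```javascript
// Hypothetical stand-in for uploadFile: resolves with a location
// string asynchronously, the way s3.upload's callback fires later.
function uploadFileFake(filename, key) {
  return new Promise((resolve) => {
    setImmediate(() => resolve(`https://example.com/student_${key}`));
  });
}

async function handler() {
  // await pauses here until the promise resolves, so student_photo
  // is populated before any DB query that uses it would run.
  const student_photo = await uploadFileFake("photo.jpg", "42");
  return student_photo;
}

handler().then((addr) => console.log(addr)); // → https://example.com/student_42
```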
I'm struggling to find a solution for uploading two files to S3. I can upload one file with multer, and I have learnt how to do that. But when I map over all the files in the form data and upload each one, I push each location URL into an array, which is what I save in my database. When I print each URL, to my surprise they print inside the if statement but not when I save them to the database outside of it. Could this be an asynchronicity problem?
Thanks.
tournamentsCtrl.createTournament = async (req, res) => {
  var files_upload = []
  if (req.files) {
    aws.config.setPromisesDependency();
    aws.config.update({
      accessKeyId: process.env.AWS_ACCESS_KEY_ID,
      secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
      //region: process.env.REGION
    });
    const s3 = new aws.S3();
    req.files.map((item) => {
      var params = {
        ACL: 'public-read',
        Bucket: process.env.AWS_BUCKET_NAME,
        Body: fs.createReadStream(item.path),
        Key: `tournament_img/${uuidv4()/* +req.file.originalname */}`
      };
      await s3.upload(params, (err, data) => {
        if (err) {
          console.log('Error occured while trying to upload to S3 bucket', err);
        }
        if (data) {
          fs.unlinkSync(item.path); // Empty temp folder
          const locationUrl = data.Location;
          files_upload.push(locationUrl);
          console.log(files_upload)
        }
      });
    });
  }
  console.log(files_upload)
  const new_data = { ...JSON.parse(req.body.values), img_source: files_upload[0], info_url: files_upload[1] }
  console.log(new_data)
  const newUser = new Tournaments(new_data);
  newUser
    .save()
    .then(user => {
      res.json({ message: 'User created successfully', user });
    })
    .catch(err => {
      console.log('Error occured while trying to save to DB');
    });
};
If you look at the docs for upload, it does not return a promise, so you should not call await on it. The default map method is not compatible with async code in this form. You need to either use async.map or wrap the async code in a promise like:
return await new Promise((resolve, reject) => {
  ...
  if (data) {
    fs.unlinkSync(item.path);
    resolve(data.Location);
  }
})
Your other code has some issues as well. A map callback should return a value; if you don't want to return anything, you should use forEach.
This is a bad place to ask for code advice, but something like the following should work:
function uploadFile(s3, element) {
  return new Promise((resolve, reject) => {
    let folder;
    if (element.fieldname.includes('img')) {
      folder = 'club_images'
    } else if (element.fieldname.includes('poster')) {
      folder = 'poster_tournament'
    } else {
      folder = 'info_tournament'
    }
    const params = {
      ACL: 'public-read',
      Bucket: process.env.AWS_BUCKET_NAME,
      Body: fs.createReadStream(element.path),
      Key: `${folder + '/' + uuidv4() + element.fieldname}`
    };
    s3.upload(params, (err, data) => {
      if (err) {
        return reject(err);
      }
      if (data) {
        return fs.unlink(element.path, err => {
          if (err) {
            console.error("Failed to unlink file", element.path);
          }
          return resolve({ [element.fieldname]: data.Location });
        }); // Empty temp folder
      }
      return resolve();
    });
  })
}
tournamentsCtrl.createTournament = async (req, res) => {
  aws.config.setPromisesDependency();
  aws.config.update({
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
    //region: process.env.REGION
  });
  const s3 = new aws.S3();
  let returnData;
  try {
    const uploadData = await Promise.all(req.files.map(element => uploadFile(s3, element)));
    returnData = Object.assign({}, ...uploadData);
    console.log(Object.assign(JSON.parse(req.body.values), returnData));
  } catch (e) {
    console.error('Failed to upload file', e);
    return res.sendStatus(500);
  }
  const newUser = new Tournaments(Object.assign(JSON.parse(req.body.values), returnData));
  console.log(newUser)
  try {
    const user = await newUser.save()
    res.json({ message: 'User created successfully', user });
  } catch (e) {
    console.error('Error occured while trying to save to DB');
    return res.sendStatus(500);
  }
};
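The core of the fix, reduced to a runnable sketch (fakeUpload below is a hypothetical stand-in for s3.upload's callback style):

```javascript
// Callback-style API, like s3.upload(params, cb)
function fakeUpload(name, cb) {
  setImmediate(() => cb(null, { Location: `s3://bucket/${name}` }));
}

// Wrap one callback-style upload in a promise
function uploadOne(name) {
  return new Promise((resolve, reject) => {
    fakeUpload(name, (err, data) => (err ? reject(err) : resolve(data.Location)));
  });
}

// map now returns an array of promises; Promise.all waits for all
// of them, so the URLs exist before they are used.
Promise.all(["img.png", "info.pdf"].map(uploadOne)).then((urls) => {
  console.log(urls); // → [ 's3://bucket/img.png', 's3://bucket/info.pdf' ]
});
```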
I'm trying to build a file uploader with a progress bar. Angular's way of listening to progress events (reportProgress) looks simple and useful in the documentation, but I had trouble implementing it in my project. For some reason, when I upload the image, the progress shows 100% as soon as the request is sent and doesn't show real progress.
My service.ts
public upload(file) {
  const formData = new FormData();
  formData.append('file', file);
  return this.http.post<any>(`${config.apiUrl}/articles/image-upload`, formData, {
    reportProgress: true,
    observe: 'events'
  }).pipe(map((event) => {
    switch (event.type) {
      case HttpEventType.UploadProgress:
        const progress = Math.round(100 * event.loaded / event.total);
        return { status: 'progress', message: progress };
      case HttpEventType.Response:
        return event.body;
      default:
        return `Unhandled event: ${event.type}`;
    }
  }));
}
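The percentage arithmetic in the UploadProgress branch, pulled out on its own (plain JavaScript, inputs assumed):

```javascript
// Round loaded/total into a whole-number percentage, as in the
// HttpEventType.UploadProgress case above.
function progressPercent(loaded, total) {
  return Math.round((100 * loaded) / total);
}

console.log(progressPercent(512, 2048));  // → 25
console.log(progressPercent(2048, 2048)); // → 100
```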
And in my component
addImg() {
  this.articlesService.upload(this.file).subscribe(
    (res) => console.log(res),
    (err) => console.log(err)
  );
}
On the backend I'm using multer-s3 to upload files to AWS with an Express server.
const upload = multer({
  storage: multerS3({
    s3,
    contentLength: 500000000,
    bucket: 'bucketss',
    acl: 'public-read',
    metadata: function (req, file, cb) {
      cb(null, { fieldName: 'TESTING_META_DATA!' });
    },
    key: function (req, file, cb) {
      cb(null, Date.now().toString() + randtoken.uid(16))
    }
  })
})
const singleUpload = upload.single('file');
router.post('/image-upload', async (req, res) => {
  await singleUpload(req, res, function (err) {
    if (err) {
      console.log(err)
      return res.status(422).send({ errors: [{ title: 'File Upload Error', detail: err.message }] });
    }
    return res.json({ 'imageUrl': req.file.location });
  });
});