Node.js - how to upload a file from req to AWS S3

I'm working on a web app with a React client and a Node server.
The client sends a file to the server, and the server should upload it to an S3 bucket.
Using import S3 from 'aws-sdk/clients/s3'; I found the upload function.
The upload function expects a Buffer or a Stream, and I don't know how to convert the incoming file to one.
Code:
app.get('/upload', (req, res) => {
  const { file } = req;
  s3.upload({ Bucket: 'MY_BUCKET', Key: 'MY_KEY', Body: streamifyFile(file) });
});

const streamifyFile = () => {
  // how to implement
};

Solution:
const stream = require('stream'); // Node core module, needed for PassThrough

const uploadToS3 = async (req) => {
  const { file } = req.body;
  const { filename, createReadStream } = await file;
  const pass = new stream.PassThrough();
  const params = {
    Bucket: process.env.S3_BUCKET,
    Key: `${context.user.email}/${new Date().toISOString()}.${filename}`,
    Body: pass,
  };
  // start the upload first; it consumes the PassThrough as data is piped in
  const uploadPromise = s3.upload(params).promise();
  const streamInput = createReadStream();
  streamInput.pipe(pass);
  const uploadData = await uploadPromise;
  logger.info(
    `Successfully uploaded - file: ${filename}, location: ${uploadData.Location}`,
  );
};
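If the file reaches the server through a plain Express route (as in the question's req.file) rather than a GraphQL-style upload, one common approach is to let multer parse the multipart request and hand the resulting buffer to s3.upload. A minimal sketch under that assumption; the route path, field name, and bucket variable are illustrative, not taken from the question:

const express = require('express');
const multer = require('multer');
const S3 = require('aws-sdk/clients/s3');

const app = express();
const upload = multer(); // no storage option: the file is kept in memory on req.file.buffer
const s3 = new S3();

// 'file' is a hypothetical form field name
app.post('/upload', upload.single('file'), async (req, res) => {
  try {
    const data = await s3.upload({
      Bucket: process.env.S3_BUCKET,
      Key: req.file.originalname,
      Body: req.file.buffer, // s3.upload accepts a Buffer directly
    }).promise();
    res.json({ location: data.Location });
  } catch (err) {
    res.status(500).json({ message: err.message });
  }
});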

Related

AWS-SDK does not return multiple images

I have a Node multer and aws-sdk image upload project that uploads to AWS S3, and I am facing issues with getting the images back.
s3.js
const fs = require('fs');
const S3 = require('aws-sdk/clients/s3');

// function that uploads a file to S3
function uploadFile(file) {
  const fileStream = fs.createReadStream(file.path);
  const uploadParams = {
    Bucket: bucketName,
    Body: fileStream,
    Key: file.originalname
  };
  return s3.upload(uploadParams).promise();
}

// function that downloads a file from S3
function getFileStream(fileKey) {
  const downloadedParams = {
    Bucket: bucketName,
    Key: fileKey
  };
  return s3.getObject(downloadedParams).createReadStream();
  // return s3.getObject(downloadedParams);
}
Here is controller.js:
exports.getUserImages = async (req, res) => {
  try {
    const userID = req.params.userId;
    const images = await Gallery.find({ userID });
    if (!images) {
      res.status(400).json({ message: 'No images found' });
      return;
    }
    console.log(images);
    // pipe each image stream into the response
    for (const image of images) {
      const readStream = getFileStream(image.key);
      readStream.pipe(res);
    }
  } catch (e) {
    res.status(500).json({ message: e.message });
  }
};
This works when I use findOne and it displays the image, but when I try to get all the images it does not work; I just get a blank screen. I have the names of the images stored as the Key in my local database, and my AWS S3 bucket is public. I only want to access the images, not download them.
MongoDB document:
key: "PIC.jpeg"

Extract a zip file in an S3 bucket using an AWS Lambda function in Node.js: "Error: Invalid CEN header (bad signature)"

I am struggling with unzipping contents in AWS S3. S3 does not provide the functionality to unzip a zip file in the bucket directly, and I am facing one error (the upload code is below):
"Error: Invalid CEN header (bad signature)"
Any advice or guidance would be greatly appreciated.
My Node.js Lambda code that generates a presigned URL for uploading the zip file:
const AWS = require('aws-sdk');
const s3 = new AWS.S3({ signatureVersion: 'v4' });

exports.handler = async (event, context) => {
  const bucket = 'bucket-name';
  console.log(event);
  const body = event.body;
  const key = JSON.parse(body).key;
  console.log(key);
  const params = {
    Bucket: bucket,
    Key: key,
    ContentType: 'application/zip',
    Expires: 60
  };
  try {
    const signedURL = await s3.getSignedUrl('putObject', params);
    const response = {
      err: {},
      body: "url send",
      url: signedURL
    };
    return response;
  } catch (e) {
    const response = {
      err: e.message,
      body: "error occured"
    };
    return response;
  }
};
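One thing worth checking, because it is a common cause of "Invalid CEN header (bad signature)": the object stored in S3 has to be the raw zip bytes. If the client sends the file to the presigned URL wrapped in FormData (multipart/form-data), the stored object is no longer a valid zip and any later unzip step fails with a bad-signature error. A minimal client-side sketch, assuming the url returned by the Lambda above and a File object from a file input:

// hypothetical client-side upload to the presigned URL; send the raw file, not FormData
async function putZipToPresignedUrl(url, file) {
  const res = await fetch(url, {
    method: 'PUT',
    headers: { 'Content-Type': 'application/zip' }, // must match the ContentType used when signing
    body: file, // raw bytes; wrapping this in FormData would corrupt the zip
  });
  if (!res.ok) {
    throw new Error(`Upload failed with status ${res.status}`);
  }
}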
My Node.js code to extract the zip file:
const S3Unzip = require('s3-unzip');

exports.s3_unzip = function(event, context, callback) {
  const filename = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));
  const bucketname = event.Records[0].s3.bucket.name;
  console.log(event.Records[0].s3.object.key);
  new S3Unzip({
    bucket: bucketname,
    file: filename,
    deleteOnSuccess: true,
    verbose: true,
  }, function(err, success) {
    if (err) {
      callback(err);
    } else {
      callback(null);
    }
  });
}

AWS Lambda unzip from S3 to S3

I'm trying to write a Lambda function that unzips zip files in one S3 directory and extracts them into another. I had this working in Python, but nobody else in my group likes Python, so I'm converting it to Node.js, which I'm not very good at.
I'm trying to use the unzipper package, and I'm able to get a list of files in the zip file using unzipper.Open.s3, but I can't figure out how to stream the files in the zip into S3.
The meat of the code looks like this:
const directory = await unzipper.Open.s3(s3, { Bucket: bucket, Key: zip_file });
directory.files.forEach(file => {
  console.log("file name = " + file.path + ", type = " + file.type);
  const key = dir[0] + "/output/" + file.path;
  const params = { Bucket: bucket, Key: key };
  const { writeStream, promise } = uploadStream(params);
  file.stream().pipe(writeStream);
  promise.then(() => {
    console.log('upload completed successfully');
  }).catch((err) => {
    console.log('upload failed.', err.message);
  });
});

const uploadStream = ({ Bucket, Key }) => {
  const pass = new stream.PassThrough();
  return {
    writeStream: pass,
    promise: s3.upload({ Bucket, Key, Body: pass }).promise()
  };
};
I get the console.log for each file, but neither the promise.then nor the .catch log ever comes out, and no new files appear in S3.
Never mind, I found this code that works better:
exports.handler = async (event) => {
  const params = {
    Key: zip_directory + "/" + zip_file,
    Bucket: input_bucket
  };
  const zip = s3
    .getObject(params)
    .createReadStream()
    .pipe(unzipper.Parse({ forceStream: true }));
  const promises = [];
  let num = 0;
  for await (const e of zip) {
    const entry = e;
    const fileName = entry.path;
    const type = entry.type;
    if (type === 'File') {
      const uploadParams = {
        Bucket: output_bucket,
        Key: output_directory + fileName,
        Body: entry,
      };
      promises.push(s3.upload(uploadParams).promise());
      num++;
    } else {
      entry.autodrain();
    }
  }
  await Promise.all(promises);
};
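For what it's worth, the first version most likely failed because the handler returned before the uploads settled: forEach starts every pipe, but nothing awaits the resulting promises, so the Lambda execution environment can freeze or exit before promise.then or .catch ever runs. A minimal sketch of that fix, assuming the same directory object and uploadStream helper from the first snippet (dir[0] is kept as in the original):

// collect the upload promises and wait for all of them before the handler returns
await Promise.all(
  directory.files
    .filter((file) => file.type === 'File')
    .map((file) => {
      const key = dir[0] + "/output/" + file.path;
      const { writeStream, promise } = uploadStream({ Bucket: bucket, Key: key });
      file.stream().pipe(writeStream);
      return promise;
    })
);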

NodeJS piping data to AWS S3 TypeError: dest.on is not a function

Can I please get some help as to why my code is throwing a pipe error when using a PassThrough stream: TypeError: dest.on is not a function
I originally thought it was because I was not returning the PassThrough, as outlined in another post.
Now I'm not so sure. Thanks in advance.
const { google } = require('googleapis')
const auth = require('./googleOAuth')
const aws = require('aws-sdk')
const fs = require('fs')
const stream = require('stream')

// AWS S3 bucket name to upload to
const awsBucketName = 'my-backup'

// get AWS keys stored in a local file and pass them through to AWS auth
const getAWSKeys = async () => {
  const awsKeys = await auth.read('./cred/awskeys.json').then(result => {return result})
  aws.config.update({
    accessKeyId: awsKeys.keys.aws_access_key_id,
    secretAccessKey: awsKeys.keys.aws_secret_access_key
  })
}

// upload a file to AWS S3 by passing the file stream from getGFileContent into the 'Body' parameter of the upload
const s3Upload = async () => {
  await getAWSKeys()
  let pass = new stream.PassThrough()
  let params = {
    Bucket: awsBucketName, // bucket-name
    Key: 'filePath.jpg',   // file will be saved as bucket-name/[uniquekey.csv]
    Body: pass             // file data passed through the stream
  }
  new aws.S3().upload(params).promise()
    .then(() => console.log(`Successfully uploaded data to bucket`))
    .catch(err => console.log(`Error, unable to upload to S3: ${err}`))
  return pass
}

// download a gFile (non Google Docs file) as a stream of data and pipe it into the s3Upload function
const getGFileContent = async () => {
  const gKeys = await auth.get()
  const drive = google.drive({version: 'v3', auth: gKeys})
  let params = {fileId: '1bNr_ZM90fM0EnPcFPfdd2LnB7Z2Tts3LiQ', mimeType: "image/jpeg", alt: 'media'}
  return drive.files.get(params, {responseType: 'stream'})
    .then(res => {
      return new Promise((resolve, reject) => {
        res.data
          .on('end', () => {resolve()})
          .on('error', err => {reject(`Error downloading Google Docs file: ${err}`)})
          .pipe(s3Upload())
      })
    })
}

getGFileContent()
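The likely cause of the TypeError is that s3Upload is declared async, so calling it returns a Promise rather than the PassThrough stream, and .pipe() expects a writable stream with an .on method. A minimal sketch of one adjustment, assuming the rest of the code above stays the same: await the PassThrough first, then pipe into it.

const getGFileContent = async () => {
  const gKeys = await auth.get()
  const drive = google.drive({version: 'v3', auth: gKeys})
  const params = {fileId: '1bNr_ZM90fM0EnPcFPfdd2LnB7Z2Tts3LiQ', mimeType: "image/jpeg", alt: 'media'}
  const dest = await s3Upload() // resolves to the PassThrough, not a pending Promise
  const res = await drive.files.get(params, {responseType: 'stream'})
  return new Promise((resolve, reject) => {
    res.data
      .on('end', () => resolve())
      .on('error', err => reject(`Error downloading Google Docs file: ${err}`))
      .pipe(dest)
  })
}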

In Node.js, how to download files from S3

In ExpressJS, I would like to download files previously uploaded to an Amazon S3 bucket.
Here is my current route:
const express = require('express');
const AWS = require('aws-sdk');
const mammoth = require('mammoth');
const fs = require('fs').promises;
const path = require('path');

const router = express.Router();

router.put('/:id/download', async (req, res, next) => {
  console.log('hitting download route');
  var id = req.params.id;
  let upload = await Upload.query().findById(id).eager('user');
  console.log("file to download is: ", upload.name);
  AWS.config.update({
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
  });
  const s3 = new AWS.S3();
  // var fileStream = fs.createWriteStream('/tmp/file.docx');
  // var s3Stream = s3.getObject(params).createReadStream();
  const downloadFromS3 = async () => {
    const params = {
      Bucket: process.env.AWS_BUCKET,
      Key: upload.file_url.split("com/").reverse()[0]
    };
    const { Body } = await s3.getObject(params).promise();
    await fs.writeFile(`${__dirname}/download.docx`, Body);
    return Body;
  };
  // mammoth.convertToHtml({ path: '/Users/dariusgoore/Downloads/1585930968750.docx' })
  //   .then(async function(result) {
  //     await Upload.query().findById(id)
  //       .patch({
  //         html: result.value,
  //         conversion_messages: result.messages
  //       })
  //     res.json(result);
  //   })
  //   .done();
  res.send(downloadFromS3);
});
I get no errors, but the file is not created, or if I manually create the file, it remains empty.
If I've understood you correctly, the issue is that you're not waiting for the file to be written to the local file system; you're returning it in the response via Express (and res.send(downloadFromS3) sends the function itself, since it is never called or awaited).
Give this code a go.
const express = require('express')
const AWS = require('aws-sdk')
const mammoth = require('mammoth')
const fs = require('fs').promises
const path = require('path')

const router = express.Router()
const s3 = new AWS.S3()

AWS.config.update({
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
})

const downloadFromS3 = async (key, location) => {
  const params = {
    Bucket: process.env.AWS_BUCKET,
    Key: key,
  }
  const { Body } = await s3.getObject(params).promise()
  await fs.writeFile(location, Body)
  return true
}

router.put('/:id/download', async (req, res, next) => {
  console.log('hitting download route')
  const upload = await Upload.query()
    .findById(req.params.id)
    .eager('user')
  console.log('file to download is: ', upload.name)
  const key = upload.file_url.split('com/').reverse()[0]
  const location = `${__dirname}/${key}.docx`
  await downloadFromS3(key, location)
  res.send({ key, location })
})
Another approach is a small TypeScript service class that streams the object to a local file:
import { S3 } from 'aws-sdk';
import fs from 'fs';

export default class S3Service {
  s3: S3;

  constructor() {
    this.s3 = new S3({
      apiVersion: *****,
      region: ********
    });
  }

  // Download a file from S3 to a local destination
  async download(bucketName: string, keyName: string, localDest?: string): Promise<any> {
    if (typeof localDest == 'undefined') {
      localDest = keyName;
    }
    const params = {
      Bucket: bucketName,
      Key: keyName
    };
    console.log("params: ", params);
    let writeStream = fs.createWriteStream(localDest);
    return new Promise<any>((resolve, reject) => {
      const readStream = this.s3.getObject(params).createReadStream();
      // Error handling on the read stream
      readStream.on("error", (e) => {
        console.error(e);
        reject(e);
      });
      // Resolve only once we are done writing
      writeStream.once('finish', () => {
        resolve(keyName);
      });
      // pipe will automatically finish the write stream once done
      readStream.pipe(writeStream);
    });
  }
}
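A usage sketch for the class above; the module path, bucket, key, and local destination are hypothetical, and credentials are assumed to come from the environment:

import S3Service from './S3Service';

const s3Service = new S3Service();

// resolves with the key name once the local write stream has finished
s3Service
  .download('my-bucket', 'reports/report.docx', '/tmp/report.docx')
  .then((key) => console.log(`Downloaded ${key}`))
  .catch((err) => console.error('Download failed:', err));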
