I tried to upload an mp4 video file using an S3 pre-signed URL. The video uploads successfully, but when I download the same video and try to play it, it does not play.
Here is how I did it:
// generating the pre-signed URL
const getSignedUrlForPut = async (bucketName, filename) => {
  const params = {
    Bucket: bucketName,
    Key: filename,
    Expires: 60 * 5, // URL stays valid for 5 minutes
    ContentType: "video/mp4",
  };
  try {
    const url = await s3.getSignedUrlPromise('putObject', params)
    return url
  } catch (err) {
    console.log("error generating s3 url ", err)
    throw err
  }
}
// uploading it to s3
const url = await getSignedUrlForPut(buckets.toConvertCoursesVideos, fileId)
try {
  // file is an mp4 file uploaded using form-data
  const resp = await axios.put(url, file)
} catch (err) {
  console.log(err)
}
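A likely culprit, though the question does not confirm it: when the URL is signed with ContentType: "video/mp4", the PUT request must send the raw file bytes with a matching Content-Type header; sending the file wrapped in form-data makes S3 store the multipart envelope as the object, which no player can decode. A minimal sketch of the corrected upload under that assumption:

// `url` comes from getSignedUrlForPut above; `file` must be the raw File/Blob/Buffer
const resp = await axios.put(url, file, {
  headers: {
    'Content-Type': 'video/mp4', // must match the ContentType the URL was signed with
  },
});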
Related question:
I'm receiving a POST request from the FE to my BE code, in TypeScript, and I want to upload a file sent in that request from FE to BE to an AWS S3 bucket.
Here's the code I'm using:
import S3 from 'aws-sdk/clients/s3';
import aws from 'aws-sdk';
import fs from 'fs';

async writeDataset(datasetFile: any): Promise<boolean> {
  const FOLDER = 'sample';
  const BUCKET = 'sample-bkt';
  const bucket = new S3();
  const credentials = new aws.SharedIniFileCredentials({
    profile: 'test-dev'
  });
  bucket.config.credentials = credentials;
  bucket.config.region = 'us-east-1';
  fs.readFile(datasetFile.path, function(err, data) {
    const params = {
      Bucket: BUCKET,
      Key: FOLDER + datasetFile.originalname,
      Body: data,
      ACL: 'private'
    };
    bucket.upload(params, function(uploadErr) {
      // delete the temp file whether or not the upload succeeded
      fs.unlink(datasetFile.path, function(unlinkErr) {
        if (unlinkErr) {
          console.error(unlinkErr);
        }
      });
      if (uploadErr) {
        return false;
      } else {
        console.log('Successfully uploaded data');
        return true;
      }
    });
  });
}
The upload works perfectly fine, since I'm using Postman (to mock the FE call) and the file's path is available in the datasetFile.path field of the object; that is why I use fs.readFile(datasetFile.path, function(err, data) { ... }, as the file is being uploaded from my local PC through Postman.
What I want to do is upload the file from memory, from request.files, to S3 without using fs.readFile and datasetFile.path.
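A minimal sketch of that, assuming the upload middleware keeps the file in memory and exposes its contents as a Buffer on datasetFile.buffer (as multer's memoryStorage does; that field name is an assumption, not something stated in the question):

// assumes datasetFile.buffer is a Buffer holding the uploaded file in memory
const params = {
  Bucket: BUCKET,
  Key: FOLDER + datasetFile.originalname,
  Body: datasetFile.buffer, // upload() accepts a Buffer directly, no fs round-trip
  ACL: 'private'
};
await bucket.upload(params).promise(); // .promise() fits the async method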
I am trying to upload a file to AWS S3 using putObject, but it results in files of 0 byte size.
I do get a successful response back from the putObject call.
Node.js code:
const aws = require("aws-sdk");
const s3 = new aws.S3();

module.exports = {
  upload: function(req, res, next) {
    console.log("Going to upload");
    console.log(req.files);
    let uploadFile = req.files.file;
    const s3PutParams = {
      Bucket: process.env.S3_BUCKET_NAME,
      Key: uploadFile.name,
      Body: uploadFile.data,
      ACL: "public-read"
    };
    const s3GetParams = {
      Bucket: process.env.S3_BUCKET_NAME,
      Key: uploadFile.name
    };
    console.log(s3PutParams);
    s3.putObject(s3PutParams, function(err, response) {
      if (err) {
        console.error(err);
      } else {
        console.log("Response is", response);
        var url = s3.getSignedUrl("getObject", s3GetParams);
        console.log("The URL is", url);
        res.json({
          returnedUrl: url,
          publicUrl: `https://${process.env.S3_BUCKET_NAME}.s3.amazonaws.com/${uploadFile.name}`
        });
      }
    });
  }
};
Testing through Postman (request screenshot omitted); backend console log (screenshot omitted).
Can anyone help me figure out what is wrong?
EDIT on 11/20:
@EmmanuelNK helped in spotting the fact that Buffer.byteLength(req.files.file.data) is 0. He had the below questions:
Are you trying to write the whole buffer into memory, or are you trying to stream it to S3?
Sorry if the answer is not to the point; I'm still getting my feet wet.
Basically I want to upload an image to S3 and then later use that URL to show it on a webpage. In other words, something like Photobucket.
How are you using upload?
For now I am just testing my backend code (posted in the question) using Postman. Once I get that going, I will have a file upload form on the front end calling this route.
Is that helpful? Thanks in advance for your help.
If you're using express-fileupload as the file-upload middleware and you've set the useTempFiles option to true, keep in mind that the file's data buffer will be empty (see the usage docs), which matches the issue you're facing. To get around this, simply read the temp file once more to get the intended file buffer.
import fs from 'fs';
// OR
const fs = require('fs');

// in your route
let uploadFile = req.files.file;

// read the temp file back in, since the in-memory buffer is empty
fs.readFile(uploadFile.tempFilePath, (err, uploadedData) => {
  if (err) { throw err; }
  const s3PutParams = {
    Bucket: process.env.S3_BUCKET_NAME,
    Key: uploadFile.name,
    Body: uploadedData, // <--- the buffer read back from the temp file
    ACL: "public-read"
  };
  const s3GetParams = {
    Bucket: process.env.S3_BUCKET_NAME,
    Key: uploadFile.name
  };
  console.log(s3PutParams);
  s3.putObject(s3PutParams, function(err, response) {
    if (err) {
      console.error(err);
      throw err;
    } else {
      console.log("Response is", response);
      var url = s3.getSignedUrl("getObject", s3GetParams);
      console.log("The URL is", url);
      res.json({
        returnedUrl: url,
        publicUrl: `https://${process.env.S3_BUCKET_NAME}.s3.amazonaws.com/${uploadFile.name}`
      });
    }
  });
});
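For reference, a minimal sketch of the middleware setup this answer assumes; the tempFileDir value is an illustrative assumption:

const fileUpload = require('express-fileupload');

// with useTempFiles: true the upload is streamed to disk, so
// req.files.<field>.data stays empty and tempFilePath must be read instead
app.use(fileUpload({
  useTempFiles: true,
  tempFileDir: '/tmp/' // hypothetical temp directory
}));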
I need to upload a PDF file from the UI (written in JavaScript) to Amazon S3, but when I upload the file to S3, the stored file contains some Unicode-format text; when I copy that text to Notepad, or any other text editor, I can see the human-readable text.
I am using pdfmake to get the content of the file, and I upload it using the getBuffer method.
var content = generatePDF(base64Img);
pdfMake.createPdf(content).getBuffer(function (data) {
  // upload code goes here
});
The code that I used to upload the file to S3:
var params = {
  Bucket: bucketName,
  Key: file_name,
  Body: data.toString(),
  ContentType: 'application/pdf'
}
s3.upload(params, function (err, data) {
  if (err) {
    // code
  } else {
    // code
  }
});
The file is getting uploaded successfully, but I am getting text like:
!
" #$%&!' ()*')+,
!
!
!
!
But when I paste it into another text editor, I am getting:
Date: 04/20/19
I solved the above problem by passing the data from getBuffer to S3 as a Buffer instead of a string.
I converted the data to a Buffer like this:
var data = Buffer.from(event.data, 'binary'); // Buffer.from replaces the deprecated new Buffer()
and uploaded the data to S3:
var params = {
  Bucket: bucketName,
  Key: file_name,
  Body: data, // the Buffer itself, not data.toString()
  ContentType: 'application/pdf'
}
s3.upload(params, function (err, data) {
  if (err) {
    // code
  } else {
    // code
  }
});
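The underlying issue: calling toString() on the buffer decodes the PDF's binary bytes as text (UTF-8 by default), which mangles any byte sequence that is not valid text, so the stored object is no longer a valid PDF. Passing the Buffer itself preserves the bytes exactly.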
To upload a file from the client end directly to an S3 bucket, you can use multer-s3.
FROM CLIENT END:
axios.post(url, data, {
  onUploadProgress: ProgressEvent => {
    this.setState({
      loaded: (ProgressEvent.loaded / ProgressEvent.total * 100),
    })
  },
})
  .then(res => { // print success status
    toast.success('Upload Success!')
  })
  .catch(err => { // print failure status
    toast.error('Upload Failed!')
  })
SERVER SIDE:
const aws = require('aws-sdk');
const multer = require('multer');
const multerS3 = require('multer-s3');

const s3 = new aws.S3();

const upload = multer({
  storage: multerS3({
    s3: s3,
    acl: 'public-read',
    bucket: BUCKET_NAME,
    key: function (req, file, cb) {
      UPLOADED_FILE_NAME = Date.now() + '-' + file.originalname;
      cb(null, UPLOADED_FILE_NAME);
    }
  })
}).array('file');

app.post('/upload', function (req, res) {
  upload(req, res, function (err) {
    if (err instanceof multer.MulterError) {
      // a Multer error occurred when uploading
      return res.status(500).json(err)
    } else if (err) {
      // an unknown error occurred when uploading
      return res.status(500).json(err)
    }
    // everything went fine
    console.log('REQUEST FILE IS', UPLOADED_FILE_NAME)
    return res.status(200).send(UPLOADED_FILE_NAME)
  })
});
Below is the REST API to upload a video to S3; I am unable to play the video after downloading it, as players throw an invalid file format error.
app.post('/insert-video', async (req, res) => {
  const {
    data,
    name: fileName,
    size: fileSize,
    type: fileType
  } = req.body
  // AWS code start
  const base64data = new Buffer(data, 'binary');
  let params = {
    Bucket: "uploadvidoe",
    Key: fileName,
    ContentType: fileType,
    Body: base64data
  };
  try {
    let uploadPromise = await new AWS.S3().putObject(params).promise();
    console.log("Successfully uploaded data to bucket");
  } catch (e) {
    console.log("Error uploading data: ", e);
  }
});
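A likely cause, though the question does not show how the front end produces data: if the browser sends the file as a base64 data URL string, decoding it with the 'binary' encoding stores garbage instead of the video bytes. A minimal sketch assuming base64 input; the prefix-stripping regex is an illustrative assumption:

// assumes req.body.data looks like "data:video/mp4;base64,AAAA..."
const base64String = data.replace(/^data:.+;base64,/, ''); // drop the data-URL prefix
const videoBuffer = Buffer.from(base64String, 'base64');   // decode to raw bytes

const putParams = {
  Bucket: "uploadvidoe",
  Key: fileName,
  ContentType: fileType,
  Body: videoBuffer // raw binary body, so the stored mp4 stays playable
};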
I have a .gz file in S3 and need to fetch the content of the file. The code below doesn't print the actual content of the file.
const params = {
  Bucket: mybucket,
  Key: 'mykey.gz',
};
s3.getObject(params, (err, data) => {
  if (err) {
    console.log(err);
    const message = `Error getting object`;
    console.log(message);
    callback(message);
  } else {
    const payload = data.Body.toString('ascii');
    console.log('printing contents ', payload)
  }
});
How can we fetch the contents of a .gz file from S3?
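The object is gzip-compressed, so data.Body holds compressed bytes and toString('ascii') just renders that compressed data as garbage. A minimal sketch of decompressing with Node's built-in zlib, assuming the file contains UTF-8 text once decompressed:

const zlib = require('zlib');

s3.getObject(params, (err, data) => {
  if (err) {
    return callback(`Error getting object`);
  }
  // data.Body is a Buffer of gzip-compressed bytes; decompress it first
  zlib.gunzip(data.Body, (gzErr, decompressed) => {
    if (gzErr) {
      return callback(gzErr);
    }
    console.log('printing contents ', decompressed.toString('utf8'));
  });
});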