AWS API Gateway & CSV Upload Issue - Node.js

I am getting some meta-information / junk in the CSV inside my Lambda function, and I need to remove it. If I save the file directly to S3, the junk is included. Could anyone guide me on how to remove this?
----------------------------362648820336892682391117 // remove this
Content-Disposition: form-data; name="file"; filename="Book1.csv" // remove this
Content-Type: text/csv // remove this
o;?name,age // remove this o;?
andy,33
hello,34
----------------------------362648820336892682391117-- // remove this
I could also upload directly to S3 using a pre-signed URL; however, that is not what I am looking for.
const AWS = require('aws-sdk');

exports.handler = async (event) => {
  try {
    console.log(JSON.stringify(event, null, 2));
    // API Gateway delivers the multipart body base64-encoded
    const data = Buffer.from(event.body, 'base64');
    const text = data.toString('ascii');
    const s3 = new AWS.S3();
    const params = { Bucket: 'bucket', Key: 'key', Body: text };
    await s3.upload(params).promise();
    return {
      statusCode: 200,
      body: JSON.stringify('uploaded successfully'),
    };
  } catch (e) {
    return {
      statusCode: 500,
      body: JSON.stringify(e.message),
    };
  }
};
Thanks

I assume you are uploading the file using multipart/form-data. If so, you will need to do further processing of the request body. You can either do something very rudimentary, like manually parsing the contents with a regex, or use a library like busboy, which helps process HTML form data.
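For illustration only, here is a very rough sketch of the manual approach (using string splitting rather than a regex, and assuming a single part named "file" with an unquoted boundary); a real parser such as busboy is far more robust:
// Fragile manual parse: derive the boundary from the Content-Type header,
// split the body on it, and keep only the content of the "file" part.
const parseManually = (event) => {
  const contentType = event.headers['content-type'] || event.headers['Content-Type'];
  const boundary = contentType.split('boundary=')[1];
  const body = Buffer.from(event.body, event.isBase64Encoded ? 'base64' : 'utf8')
    .toString('binary');
  const filePart = body
    .split(`--${boundary}`)
    .find(part => part.includes('name="file"'));
  // Each part looks like: \r\n<part headers>\r\n\r\n<content>\r\n
  const content = filePart.slice(filePart.indexOf('\r\n\r\n') + 4);
  return Buffer.from(content.slice(0, content.lastIndexOf('\r\n')), 'binary');
};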
A quick example using busboy for your scenario could look something like this.
const Busboy = require('busboy');
const AWS = require('aws-sdk');

// This function uses busboy to process the event body and
// return an object containing the file data and other details.
const parseFile = (event) => new Promise((resolve, reject) => {
  let contentType = event.headers['content-type'];
  if (!contentType) {
    contentType = event.headers['Content-Type'];
  }

  const busboy = new Busboy({ headers: { 'content-type': contentType } });
  const uploadedFile = {};

  busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
    file.on('data', data => {
      // Accumulate chunks in case the file arrives in more than one piece.
      uploadedFile.data = uploadedFile.data
        ? Buffer.concat([uploadedFile.data, data])
        : data;
    });
    file.on('end', () => {
      uploadedFile.filename = filename;
      uploadedFile.contentType = mimetype;
    });
  });
  busboy.on('error', error => {
    reject(error);
  });
  busboy.on('finish', () => {
    resolve(uploadedFile);
  });

  busboy.write(event.body, event.isBase64Encoded ? 'base64' : 'binary');
  busboy.end();
});

exports.handler = async (event) => {
  // Use the parse function here
  const { data } = await parseFile(event);

  const s3 = new AWS.S3();
  const params = { Bucket: 'bucket', Key: 'key', Body: data };
  await s3.upload(params).promise();

  return {
    statusCode: 200,
    body: 'uploaded successfully',
  };
};

Related

Nodejs - how to upload file from req to aws s3

I am working on a web app with a React client and a Node server.
The client app sends a file to the server, and the server should upload it to an S3 bucket.
Using import S3 from 'aws-sdk/clients/s3'; I found the upload function.
The upload function expects a Buffer or a Stream, and I don't know how to convert the file to a buffer/stream.
Code
app.get('/upload', (req, res) => {
  const { file } = req;
  s3.upload({ Bucket: 'MY_BUCKET', Key: 'MY_KEY', Body: streamifyFile(file) });
});

const streamifyFile = (file) => {
  // how to implement
};
Solution:
// Assumes `s3`, `logger`, and `context` (with the current user) are available
// in the surrounding scope.
const stream = require('stream');

const uploadToS3 = async (req) => {
  const { file } = req.body;
  const { filename, createReadStream } = await file;

  // Pipe the incoming file stream through a PassThrough stream that S3 consumes.
  const pass = new stream.PassThrough();
  const params = {
    Bucket: process.env.S3_BUCKET,
    Key: `${context.user.email}/${new Date().toISOString()}.${filename}`,
    Body: pass,
  };
  const uploadPromise = s3.upload(params).promise();

  const streamInput = createReadStream();
  streamInput.pipe(pass);

  const uploadData = await uploadPromise;
  logger.info(
    `Successfully uploaded - file: ${filename}, location: ${uploadData.Location}`,
  );
};

Saving uploaded file to Pinata IPFS in NodeJS

I've been trying to save uploaded image files to IPFS in Node.js. While it seems Pinata saves them, the files are pretty much gibberish (after downloading, the images are broken).
My code:
// Nodejs route.
exports.postImage = async (req, res, next) => {
  // Using multer to get the file.
  fileUploadMiddleware(req, res, async (err) => {
    // Getting a bunch of data from the query string.
    let meta = {
      origin,
      originid,
      context,
      ownerid,
      format
    } = req.query;

    if (!meta.format || !req.files) {
      return next(new ErrorResponse("File format not specified", 404));
    }
    if (!meta.originid) {
      meta.originid = uuidv4();
    }

    // NOTE: is this the right way to get the data of the file?
    const buffer = req.files[0].buffer;
    const filename = `${meta.origin}_${meta.originid}.${meta.format}`;

    let stream;
    try {
      stream = Readable.from(buffer);
      // HACK to make PINATA WORK.
      stream.path = filename;
    } catch (e) {
      logger.logError(e);
      return false;
    }

    const options = {
      pinataMetadata: {
        name: filename,
        keyvalues: {
          context: meta.context,
          ownerid: meta.ownerid
        }
      },
      pinataOptions: {
        cidVersion: 0
      }
    };

    try {
      var result = await pinata.pinFileToIPFS(stream, options);
      console.log("SUCCESS ", result);
      return result;
    } catch (e) {
      logger.logError(e);
      return null;
    }

    res.status(200).json({
      success: true,
      data: 'You got access'
    });
  });
};
So basically I'm creating the stream from the uploaded file buffer and sending it off to Pinata. Where do I go wrong?
const buffer = req.files[0].buffer;
The buffer property is only available if you use multer's memoryStorage. It is not available with diskStorage, because that saves the file to disk instead:
const storage = multer.memoryStorage()
const upload = multer({ storage: storage })
Also, I think it is not req.files[0] but:
const buffer = req.file.buffer;
After I get the buffer, I convert it to FormData using the form-data npm package:
import FormData from "form-data";

const formData = new FormData();
formData.append("file", buffer, {
  contentType,
  filename: fileName + "-" + uuidv4(),
});
Then you send a POST request to Pinata:
const url = `https://api.pinata.cloud/pinning/pinFileToIPFS`;
const fileRes = await axios.post(url, formData, {
  maxBodyLength: Infinity,
  headers: {
    // formData.getBoundary() is specific to the form-data npm package;
    // native JavaScript FormData does not have this method.
    "Content-Type": `multipart/form-data; boundary=${formData.getBoundary()}`,
    pinata_api_key: pinataApiKey,
    pinata_secret_api_key: pinataSecretApiKey,
  },
});

Serving zip file from s3 in aws lamba node.js - unable to extract after download

I'm trying to create a Lambda function that reads a zip file from S3 and serves it. But after downloading the file in the browser I can't unzip it; I get the error "Unable to extract, it is in an unsupported format". What could the problem be?
const file = await s3.getObject({
  Bucket: 'mybucket',
  Key: `file.zip`
}).promise();

return {
  statusCode: 200,
  isBase64Encoded: true,
  body: Buffer.from(file.Body).toString('base64'),
  headers: {
    'Content-Type': 'application/zip',
    'Content-Disposition': `attachment; filename="file.zip"`,
  },
};
Your file.Body should already be a Buffer, so Buffer.from(file.Body) should be unnecessary but harmless.
I think your problem is that you're doing toString('base64') there. The documentation says:
If body is a binary blob, you can encode it as a Base64-encoded string by setting isBase64Encoded to true and configuring */* as a Binary Media Type.
This makes me believe that it actually means that AWS will automatically convert your (non-base64) body into base64 in the response body. If that's the case, then because you're doing .toString('base64'), your body is being base64'd twice. You could un-base64 the resulting file.zip and see what you get.
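For what it's worth, a quick way to test that theory (a small sketch, assuming the broken download has been saved locally as file.zip):
// Decode the downloaded file one more time and check for the zip signature ("PK").
const fs = require('fs');

const downloaded = fs.readFileSync('file.zip');
const decodedOnce = Buffer.from(downloaded.toString(), 'base64');

console.log('zip signature after one extra decode:',
  decodedOnce.slice(0, 2).toString() === 'PK');
fs.writeFileSync('file.decoded.zip', decodedOnce); // try extracting this one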
The solution for me was to set 'Content-Encoding': 'base64' response header.
You can follow the code below:
"use strict";
const AWS = require("aws-sdk");
const awsOptions = {
region: "us-east-1",
httpOptions: {
timeout: 300000 // Matching Lambda function timeout
}
};
const s3 = new AWS.S3(awsOptions);
const archiver = require("archiver");
const stream = require("stream");
const request = require("request");
const streamTo = (bucket, key) => {
var passthrough = new stream.PassThrough();
s3.upload(
{
Bucket: bucket,
Key: key,
Body: passthrough,
ContentType: "application/zip",
ServerSideEncryption: "AES256"
},
(err, data) => {
if (err) throw err;
}
);
return passthrough;
};
// Kudos to this person on GitHub for this getStream solution
// https://github.com/aws/aws-sdk-js/issues/2087#issuecomment-474722151
const getStream = (bucket, key) => {
let streamCreated = false;
const passThroughStream = new stream.PassThrough();
passThroughStream.on("newListener", event => {
if (!streamCreated && event == "data") {
const s3Stream = s3
.getObject({ Bucket: bucket, Key: key })
.createReadStream();
s3Stream
.on("error", err => passThroughStream.emit("error", err))
.pipe(passThroughStream);
streamCreated = true;
}
});
return passThroughStream;
};
exports.handler = async (event, context, callback) => {
var bucket = event["bucket"];
var destinationKey = event["destination_key"];
var files = event["files"];
await new Promise(async (resolve, reject) => {
var zipStream = streamTo(bucket, destinationKey);
zipStream.on("close", resolve);
zipStream.on("end", resolve);
zipStream.on("error", reject);
var archive = archiver("zip");
archive.on("error", err => {
throw new Error(err);
});
archive.pipe(zipStream);
for (const file of files) {
if (file["type"] == "file") {
archive.append(getStream(bucket, file["uri"]), {
name: file["filename"]
});
} else if (file["type"] == "url") {
archive.append(request(file["uri"]), { name: file["filename"] });
}
}
archive.finalize();
}).catch(err => {
throw new Error(err);
});
callback(null, {
statusCode: 200,
body: { final_destination: destinationKey }
});
};
If you're not restricted to serving the file from the same URI as your API, you could also create a pre-signed URL and return it as a redirection result. However, this redirects to a different domain (the S3 domain), so it won't work out of the box if you have to serve from the same domain name (e.g., because of firewall restrictions).
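A minimal sketch of that redirect approach (assuming aws-sdk v2 and a Lambda proxy integration; the bucket and key are placeholders):
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

exports.handler = async () => {
  // Pre-signed GET URL, valid for 5 minutes.
  const url = s3.getSignedUrl('getObject', {
    Bucket: 'mybucket',
    Key: 'file.zip',
    Expires: 300,
  });

  // Redirect the browser to S3 instead of proxying the bytes through Lambda.
  return {
    statusCode: 302,
    headers: { Location: url },
    body: '',
  };
};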

NodeJS API to upload image to S3 not returning response

I am trying to upload an image to an AWS S3 bucket using Node.js. The issue I am facing is that while the image is getting saved, the API returns 404 (Not Found). Here is my controller code:
async UploadProfileImage(ctx) {
  try {
    var file = ctx.request.files.profileImage;
    if (file) {
      fs.readFile(file.path, (err, fileData) => {
        var resp = s3Utility.UploadProfileImageToS3(file.name, fileData);
        // Not reaching here, although the ETag is printed in the console.
        console.log(resp);
        ctx.status = 200;
        ctx.body = { response: 'file Uploaded!' };
      });
    } else {
      ctx.status = 400;
      ctx.body = { response: 'File not found!' };
    }
  } catch (error) {
    ctx.status = 500;
    ctx.body = { response: 'There was an error. Please try again later!' };
  }
}
The utility class I am using is:
const AWS = require('aws-sdk');
const crypto = require("crypto");
var fs = require('fs');
const mime = require('mime-types');

export class S3Utility {
  constructor() { }

  async UploadProfileImageToS3(fileName, data) {
    let randomId = crypto.randomBytes(16).toString("hex");
    AWS.config.update({ region: "Region", accessKeyId: "KeyID", secretAccessKey: "SecretAccessKey" });
    var s3 = new AWS.S3();
    var imageName = randomId + fileName;
    var params = {
      Bucket: "BucketName",
      Key: imageName,
      Body: data,
      ContentType: mime.lookup(fileName)
    };
    return new Promise((resolve, reject) => {
      s3.putObject(params, function (err, data) {
        if (err) {
          console.log('Error: ', err);
          reject(new Error(err.message));
        } else {
          console.log(data);
          resolve({
            response: data,
            uploadedFileName: imageName
          });
        }
      });
    });
  }
}

const s3Utility: S3Utility = new S3Utility();
export default s3Utility;
The code uploads the file to S3, but it does not return a proper response. When testing this endpoint in Postman, I get a "Not Found" message. However, I can see the ETag getting logged in the console. I don't know what is going wrong here; I am sure it has something to do with the promise. Can someone please point out or fix the mistake?
Edit:
Using async fs.readFile does the trick.
const fs = require('fs').promises;
const fileData = await fs.readFile(file.path, "binary");
var resp = await s3Utility.UploadProfileImageToS3(file.name, fileData);
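For reference, a sketch of the controller with that change applied (assuming the same Koa-style ctx and the s3Utility class shown above):
const fs = require('fs').promises;

// Same flow as the original controller, but awaiting both the file read and
// the upload so ctx is set before the handler returns.
async function UploadProfileImage(ctx) {
  try {
    const file = ctx.request.files.profileImage;
    if (!file) {
      ctx.status = 400;
      ctx.body = { response: 'File not found!' };
      return;
    }
    const fileData = await fs.readFile(file.path, 'binary');
    const resp = await s3Utility.UploadProfileImageToS3(file.name, fileData);
    console.log(resp);
    ctx.status = 200;
    ctx.body = { response: 'file Uploaded!' };
  } catch (error) {
    ctx.status = 500;
    ctx.body = { response: 'There was an error. Please try again later!' };
  }
}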

AWS Lambda - NodeJS CSV Data to S3 File

Experts,
I have a JSON object that I need to save to an S3 bucket as a CSV file. This is what I have managed so far, but unfortunately the file is not getting created in S3 and no error is reported.
const AWS = require('aws-sdk');
const converter = require('json-2-csv');

const s3 = new AWS.S3({
  accessKeyId: "Key",
  secretAccessKey: "Secret"
});

exports.handler = async (event, context, callback) => {
  context.callbackWaitsForEmptyEventLoop = false;
  const s3Bucket = "bucketname";
  const objectName = event.operation + new Date().getTime() + "_" + event.userId + ".";
  const objectData = (event.rawData);
  const objectType = "text/csv";

  converter.json2csv(objectData, async (err, csv) => {
    if (err) {
      console.log(" Error ", err);
      throw err;
    }
    console.log(csv);
    try {
      const params = {
        Bucket: s3Bucket,
        Key: objectName,
        Body: (csv),
        ContentType: objectType,
        ContentDisposition: 'attachment',
      };
      console.log('Writing to s3 bucket..');
      const result = await s3.putObject(params).promise();
      console.log('Finished Writing to s3 bucket..', objectName);
      return sendRes(200, `File uploaded successfully at https://` + s3Bucket + `.s3.amazonaws.com/` + objectName);
    } catch (error) {
      console.log(error);
      return sendRes(404, error);
    }
  });
};
const sendRes = (status, body) => {
  var response = {
    statusCode: status,
    headers: {
      "Content-Type": "application/json",
      "Access-Control-Allow-Headers": "Content-Type,X-Amz-Date,Authorization,X-Api-Key,X-Amz-Security-Token",
      "Access-Control-Allow-Methods": "OPTIONS,POST,PUT",
      "Access-Control-Allow-Credentials": true,
      "Access-Control-Allow-Origin": "*",
      "X-Requested-With": "*"
    },
    body: body
  };
  return response;
};
If I remove the JSON-to-CSV converter and save the JSON data as a JSON file (changing objectType to "application/json"), it works like a charm.
What am I doing wrong here when converting to CSV?
P.S.: The JSON-to-CSV conversion works fine, and this is the last printed log line:
Writing to s3 bucket..
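One thing worth checking (an observation from the code above, not a confirmed answer): the handler is async, but json2csv is invoked with a callback, so the handler can finish before the S3 write inside that callback completes. A sketch of the same handler that awaits the conversion instead (reusing the converter, s3, and sendRes definitions above):
exports.handler = async (event, context) => {
  context.callbackWaitsForEmptyEventLoop = false;
  const s3Bucket = "bucketname";
  const objectName = event.operation + new Date().getTime() + "_" + event.userId + ".";

  // Wrap the callback-style converter in a promise so the handler can await it.
  const csv = await new Promise((resolve, reject) => {
    converter.json2csv(event.rawData, (err, result) => (err ? reject(err) : resolve(result)));
  });

  const params = {
    Bucket: s3Bucket,
    Key: objectName,
    Body: csv,
    ContentType: "text/csv",
    ContentDisposition: "attachment",
  };

  console.log('Writing to s3 bucket..');
  await s3.putObject(params).promise();
  console.log('Finished Writing to s3 bucket..', objectName);
  return sendRes(200, `File uploaded successfully at https://${s3Bucket}.s3.amazonaws.com/${objectName}`);
};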
