Uploading and Deserializing protobuf data in AWS Lambda - node.js

I have a requirement to send protobuf data to an AWS Lambda written in Node.js.
I am experimenting with a "Hello World" example where I serialize and deserialize a Person message.
Example:
person.proto
syntax = "proto3";
message Person {
  int32 id = 1;
  string name = 2;
  optional string email = 3;
}
Using Node.js and the protobufjs package, I can generate code from the .proto file and serialize and deserialize a Person object to a file:
let person = personProtos.Person.create();
person.id = 42;
person.name = "Fred";
person.email = "fred@foo.com";
console.log("Person BEFORE Serialize=" + JSON.stringify(person, null, 2));
// Serialize
let buffer = personProtos.Person.encode(person).finish();
console.log(buffer);
fs.writeFileSync("person.pb", buffer, "binary");
// Deserialize
let bufferFromFile = fs.readFileSync("person.pb");
let decodedPerson = personProtos.Person.decode(bufferFromFile);
console.log("Decoded Person=\n" + JSON.stringify(decodedPerson, null, 2));
Output:
Person BEFORE Serialize={
  "id": 42,
  "name": "Fred",
  "email": "fred@foo.com"
}
<Buffer 08 2a 12 04 46 72 65 64 1a 0c 66 72 65 64 40 66 6f 6f 2e 63 6f 6d>
Decoded Person=
{
  "id": 42,
  "name": "Fred",
  "email": "fred@foo.com"
}
Using Postman, I want to upload the binary protobuf data to an AWS Lambda from the person.pb file and deserialize it in the Lambda.
When I specify the body as "binary" type and specify the person.pb file, the person data shows up in the Lambda event body as:
"body": "\b*\u0012\u0004Fred\u001a\ffred#foo.com"
It looks like it got transformed into Unicode and encoded?
How can I take the body string value and turn it back into the Node.js buffer:
<Buffer 08 2a 12 04 46 72 65 64 1a 0c 66 72 65 64 40 66 6f 6f 2e 63 6f 6d>
so that I can deserialize it back to the JSON object in my Lambda code?
I put the generated code from the .proto file into my Lambda so I can call:
let bufferFromEvent = event.body; // <== how do I get a Buffer from this?
let decodedPerson = personProtos.Person.decode(bufferFromEvent);
Thanks

The answer is what Daniel mentioned in the comments: you need to use the Buffer class provided by Node.
Your Lambda function will then look something like this:
const personProtos = require("./personProtos");

module.exports.handler = async event => {
  const buffer = Buffer.from(event.body);
  console.log(personProtos.Person.decode(buffer).toObject());
  return {
    statusCode: 200,
    body: "Success"
  };
};
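One thing to watch for: when API Gateway is configured for binary media types (or the client sends base64), the proxy event arrives with isBase64Encoded set, and the body has to be decoded accordingly. A minimal sketch of a handler covering both cases, assuming the same generated personProtos module:

const personProtos = require("./personProtos");

module.exports.handler = async event => {
  // API Gateway sets isBase64Encoded when it base64-encodes a binary body.
  const buffer = event.isBase64Encoded
    ? Buffer.from(event.body, "base64")
    : Buffer.from(event.body);
  const person = personProtos.Person.decode(buffer);
  return {
    statusCode: 200,
    body: JSON.stringify(person.toObject())
  };
};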

Related

Is there an easy way to do a simple file upload using Node/PostgreSQL for any type of file?

I want to store files in an existing PostgreSQL database by uploading them via an Express server.
The file comes into the POST endpoint like this:
{ name: 'New Data.xlsx',
data: <Buffer 50 4c 03 04 14 01 06 00 08 00 00 24 21 00 1f 0a 93 21 cf 02 00 00 4f 1f 00 00 13 00 08 02 5b 43 6f 6e 74 65 6e 74 ... >,
size: 6880975,
encoding: '7bit',
tempFilePath: '',
truncated: false,
mimetype: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
md5: '535c8576e1c94d169ea5a637487ee3b4',
mv: [Function: mv] }
This is a fairly large Excel document. Word docs, PDFs, simple CSVs, etc. also need to be uploadable.
I've tried both node-postgres and sequelize libraries in a similar way:
app.post('/upload', async (req, res) => {
  var { upload } = req.files;
  var data = upload.data;
  console.log(data);
  const x = await pool.query(`insert into attachment (data, file_name, category) values (${data}, '${upload.name}', 'test')`);
  // Deconstruct x to get response values
  res.send("OK");
});
Some files, like txt and plain CSV files, do upload successfully, but for Excel or Word files I receive errors such as:
error: invalid message format
I've done something like this before with MongoDB but I can't switch databases. Another idea I had was to simply store the files on the production server in an 'uploads' folder, but I'm not sure that's good practice.
Any advice?
Solved by using query parameters (a parameterized query).
app.post('/upload', async (req, res) => {
  var { upload } = req.files;
  var data = upload.data;
  const x = await pool.query(`insert into attachment (data, file_name, category) values ($1, '${upload.name}', 'test')`, [data]);
  res.send("OK");
});
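The file name (and category) can be parameterized as well, which avoids quoting problems if a name contains an apostrophe. A sketch of the same endpoint, assuming the data column is bytea:

app.post('/upload', async (req, res) => {
  const { upload } = req.files;
  // $1 is the file contents (a Buffer, stored as bytea), $2 the file name, $3 the category.
  await pool.query(
    'insert into attachment (data, file_name, category) values ($1, $2, $3)',
    [upload.data, upload.name, 'test']
  );
  res.send("OK");
});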

child_process.spawn returning buffer object

const cp = require("child_process");
const ls = cp.spawn("node", ["./scripts/test.js"]);
ls.stdout.on("data", (data) => {
  console.log(`stdout: ${data}`);
  const result = data.toString();
});
In ls.stdout.on I am getting data as a Buffer. If I call data.toString() it gives me a result like "{evenNumberSum :8, oddNumberSum:6}", but I want the result as a JSON object, and I am not even able to parse this kind of string. Can anyone give me a better way to get the result from the buffer?
Current Output:
<Buffer 7b 3a 6e 61 6d 65 3d 3e 22 4a 6f 68 6e 22 2c 20 3a 61 67 65 3d 3e 33 30 2c 20 3a 63 69 74 79 3d 3e 22 4e 65 77 20 59 6f 72 6b 22 7d 0a>
Required Output:
{evenNumberSum :8, oddNumberSum:6}
test.js
let result = {};
function add(val1, val2) {
  return val1 + val2;
}
result.evenNumbersSum = add(2, 4);
result.oddNumbersSum = add(1, 3);
result.mixNumbersSum = add(1, 2);
console.log(result);
I am not sure what test.js looks like, but I was not able to reproduce this issue.
I am using the following code.
--EDIT/DISCLAIMER-- People are most likely downvoting this because the solution I provided uses eval. To summarize:
YOU SHOULD USE eval AS LITTLE AS POSSIBLE (in a perfect world, eval would never be used)
ONLY USE IT ON *TRUSTED* strings (aka DO NOT USE eval ON USER SUPPLIED DATA!!! EVER.)
Considering the disclaimer above, the only way I could get this to work was:
main.js:
const cp = require('child_process');
const ls = cp.spawn('node', ['./test.js']);
ls.stdout.on('data', (data) => {
  const dataString = data.toString();
  const dataJson = eval(`(${dataString})`);
  console.log('Data as JSON =', dataJson);
  console.log('evenNumberSum =', dataJson.evenNumberSum);
  console.log('oddNumberSum =', dataJson.oddNumberSum);
});
test.js:
console.log({ evenNumberSum: 8, oddNumberSum: 6 });
Which produces the following output:
Data as JSON = { evenNumberSum: 8, oddNumberSum: 6 }
evenNumberSum = 8
oddNumberSum = 6
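If you control test.js, you can avoid eval entirely by printing JSON and parsing it in the parent with JSON.parse. A sketch (not the answerer's code) that also buffers stdout in case the output arrives in more than one chunk:

main.js:
const cp = require('child_process');
const ls = cp.spawn('node', ['./test.js']);

let output = '';
ls.stdout.on('data', (chunk) => {
  // stdout may arrive in several chunks, so collect them first
  output += chunk;
});
ls.on('close', () => {
  const result = JSON.parse(output);
  console.log('evenNumberSum =', result.evenNumberSum);
  console.log('oddNumberSum =', result.oddNumberSum);
});

test.js:
console.log(JSON.stringify({ evenNumberSum: 8, oddNumberSum: 6 }));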

Nodejs - websocket set encoding for receiving data?

I have a WebSocket-to-TCP proxy set up, but the data I receive is a hex Buffer. How can I convert it to a readable string format? I think I have to set the encoding to utf-8, but I don't see an option for that in the WebSocket.
Example of data received:
Received: <Buffer 3c 63 72 6f 73 73 2d 61 69 6e 272 6f 2a 27 ... 46 more bytes>
Client code:
const ws = new WebSocket('ws://example.com:1211');
ws.onmessage = message => {
  console.log('Received: ', message.data)
};
Try this:
const convert = (from, to) => hexMessage => Buffer.from(hexMessage, from).toString(to);
const hexToUtf8 = convert('hex', 'utf8');
hexToUtf8('your hex msg here')
Also check out this post: Hex to String & String to Hex conversion in nodejs
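Note that if message.data is already a Node Buffer (which the <Buffer ...> log suggests), no hex round trip is needed; calling toString with the encoding you want is enough. A small sketch against the same client code:

const ws = new WebSocket('ws://example.com:1211');
ws.onmessage = message => {
  // Decode the Buffer payload directly as UTF-8 text.
  const text = Buffer.isBuffer(message.data)
    ? message.data.toString('utf8')
    : message.data;
  console.log('Received: ', text);
};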

Using gridfs to store uploaded file with its metadata in Node / Express

I know there are a few threads about this, but I couldn't find my answer exactly. Using POST, I have managed to get this file object to the server side:
{ fileToUpload:
{ name: 'resume.pdf',
data: <Buffer 25 50 44 46 2d 31 2e 33 0a 25 c4 e5 f2 e5 eb a7 f3 a0 d0 c4 c6 0a 34 20 30 20 6f 62 6a 0a 3c 3c 20 2f 4c 65 6e 67 74 68 20 35 20 30 20 52 20 2f 46 69 ... >,
encoding: '7bit',
mimetype: 'application/pdf',
mv: [Function] } }
How do I save this along with the metadata using Mongoose & GridFS? In most threads I've looked at so far, gridfs-stream was used with a temporary path to the file, which I don't have. Could someone help me save this file by streaming the data along with its metadata, and give an example of how I would retrieve it and send it back to the client side?
I must've been tired. I was using express-fileupload as middleware but not using it to save the file, which is done with the mv function in the object. The code below saves the file locally and then streams it to Mongo using gridfs-stream:
var file = req.files.fileToUpload;
file.mv('./uploads/' + file.name, function (err) {
  if (err) {
    res.status(500).send(err);
  }
  else {
    res.send('File uploaded!');
    var gfs = Grid(conn.db);
    // streaming to gridfs
    // filename to store in mongodb
    var writestream = gfs.createWriteStream({
      filename: file.name
    });
    fs.createReadStream('./uploads/' + file.name).pipe(writestream);
    writestream.on('close', function (file) {
      // do something with `file`
      console.log(file.filename + ' Written To DB');
    });
  }
});
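Since express-fileupload already exposes the contents in memory (file.data is a Buffer), the temp file isn't strictly necessary; the buffer can be wrapped in a readable stream and piped straight into GridFS, with the mimetype kept alongside it. A rough sketch, assuming the same gfs object and that gridfs-stream accepts content_type and metadata options here:

const { Readable } = require('stream');

var file = req.files.fileToUpload;
var writestream = gfs.createWriteStream({
  filename: file.name,
  content_type: file.mimetype,            // stored as the file's contentType
  metadata: { encoding: file.encoding }   // any extra metadata to keep with it
});

// Wrap the in-memory Buffer in a readable stream (single chunk) and pipe it into GridFS.
Readable.from([file.data]).pipe(writestream);

writestream.on('close', function (storedFile) {
  console.log(storedFile.filename + ' written to DB');
});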

NodeJS TypeError argument should be a Buffer only on Heroku

I am trying to upload an image to store on MongoDB through Mongoose.
I am using multiparty to get the uploaded file.
The code works 100% perfectly on my local machine, but when I deploy it on Heroku, it gives the error:
TypeError: argument should be a Buffer
Here is my code:
exports.create = function (req, res) {
  'use strict';
  var form = new multiparty.Form();
  form.parse(req, function (err, fields, files) {
    var file = files.file[0],
      contentType = file.headers['content-type'],
      body = {};
    _.forEach(fields, function (n, key) {
      var parsedField = Qs.parse(n)['0'];
      try {
        parsedField = JSON.parse(parsedField);
      } catch (err) {}
      body[key] = parsedField;
    });
    console.log(file.path);
    console.log(fs.readFileSync(file.path));
    var news = new News(body);
    news.thumbnail = {
      data: new Buffer(fs.readFileSync(file.path)),
      contentType: contentType
    };
    news.save(function (err) {
      if (err) {
        return handleError(res, err);
      }
      return res.status(201);
    });
  });
};
These are the console logs from the above code on HEROKU:
Sep 26 17:37:23 csgowin app/web.1: /tmp/OlvQLn87yfr7O8MURXFoMyYv.gif
Sep 26 17:37:23 csgowin app/web.1: <Buffer 47 49 46 38 39 61 10 00 10 00 80 00 00 ff ff ff cc cc cc 21 f9 04 00 00 00 00 00 2c 00 00 00 00 10 00 10 00 00 02 1f 8c 6f a0 ab 88 cc dc 81 4b 26 0a ... >
These are the console logs on my LOCAL MACHINE:
C:\Users\DOLAN~1.ADS\AppData\Local\Temp\TsfwadjjTbJ8iT-OZ3Y1_z3L.gif
<Buffer 47 49 46 38 39 61 5c 00 16 00 d5 36 00 bc d8 e4 fe fe ff ae cf df dc ea f1 fd fe fe db e9 f1 ad ce de 46 5a 71 2b 38 50 90 b8 cc 4a 5f 76 9a c3 d7 8f ... >
Does Heroku need any settings or configurations or something?
It sounds like the object passed is not a Buffer when
data: new Buffer(fs.readFileSync(file.path)) is executed. It's probably a difference in how your local environment handles file writes, or it could be how multiparty handles streams.
This code works flawlessly for me:
news.thumbnail = {
  media: fs.createReadStream(fileLocation),
  contentType: contentType
};
But you also have to make sure your file has been fully saved from the upload before you can use it in the createReadStream call above. Things are inconsistent with Node; sometimes this happens synchronously and sometimes not. I've used Busboy to handle the file upload since it handles streams and fires an event when the file stream is complete. Sorry, based on the above I cannot tell you where your issue is, so I've included two solutions for you to try :))
Busboy: https://www.npmjs.com/package/busboy
I've used this after the file has been uploaded to the temp directory in Busboy:
// Handles file upload and stores it to a more permanent location.
// This handles streams.
// request is given by express.
var busboy = new Busboy({ headers: request.headers });
var writeStream;
busboy.on('file', function (fieldname, file, filename, encoding, mimetype) {
  // Pipe the incoming file stream straight to disk.
  writeStream = file.pipe(fs.createWriteStream(saveTo));
  writeStream.on('close', function () {
    // use the file
  });
})
.on('finish', function () {
  // busboy has finished parsing the entire request
});
request.pipe(busboy); // feed the incoming request into busboy
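Once that close event fires, the file is fully on disk and the original Mongoose save should behave the same on Heroku as it does locally. A rough sketch of that last step, reusing News, body, and contentType from the question and the hypothetical saveTo path:

writeStream.on('close', function () {
  var news = new News(body);
  news.thumbnail = {
    data: fs.readFileSync(saveTo),   // readFileSync already returns a Buffer
    contentType: contentType
  };
  news.save(function (err) {
    if (err) {
      return handleError(res, err);
    }
    return res.status(201).end();    // end() actually sends the response
  });
});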
