Node.js: Create zip from byte string

I have to make a POST request to a server that returns me this, and I have to write a ZIP file from it.
How do I get the bytes from that string to generate my zip file?

You just need to create a Buffer from the API response and write the zip file using that buffer.
var fs = require('fs');
// wrap the raw bytes from the API response in a Buffer
var buff = Buffer.from(response_from_api);
fs.writeFile("./test1.zip", buff, function (err) {
  if (err) throw err;
  // do something
});
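As a fuller sketch, here is one way to make the POST request with Node's built-in https module and write the binary response body straight to a zip file. The endpoint URL, payload, and file name are placeholders, and this assumes the server returns the raw zip bytes in the response body:
const https = require('https');
const fs = require('fs');
// hypothetical endpoint and payload -- adjust for the real API
const payload = JSON.stringify({ id: 123 });
const req = https.request(
  'https://example.com/export',
  { method: 'POST', headers: { 'Content-Type': 'application/json' } },
  function (res) {
    const chunks = [];
    // collect the response as raw Buffer chunks (no string conversion)
    res.on('data', function (chunk) { chunks.push(chunk); });
    res.on('end', function () {
      fs.writeFile('./test1.zip', Buffer.concat(chunks), function (err) {
        if (err) throw err;
        console.log('zip written');
      });
    });
  }
);
req.write(payload);
req.end();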

Related

How to save returned protobuf object in nodejs?

In my code, a function returns a protobuf object and I want to save it in a file xyz.pb.
When I try to save it using fs.writeFileSync, nothing gets written.
The object is circular in nature, so I tried saving it with the circular-json module to confirm there is data inside it, and there is.
But because I used circular-json, the output isn't properly formatted protobuf data, so it is of no use.
How can I save this protobuf object to a file using Node.js?
Thanks!
You can try using streams, as mentioned in the documentation, like this:
const crypto = require('crypto');
const fs = require('fs');
const wstream = fs.createWriteStream('fileWithBufferInside');
// creates random Buffer of 100 bytes
const buffer = crypto.randomBytes(100);
wstream.write(buffer);
wstream.end();
Or you can convert the buffer to JSON and save it to a file:
const crypto = require('crypto');
const fs = require('fs');
const wstream = fs.createWriteStream('myBinaryFile');
// creates random Buffer of 100 bytes
const buffer = crypto.randomBytes(100);
wstream.write(JSON.stringify(buffer));
wstream.end();
Also, if your application logic doesn't require synchronous writes, you should avoid writeFileSync, because it blocks your code until the write finishes, so be careful.
Try using writeFile or streams instead; they are more convenient.
The purpose of Protocol Buffers is to serialize strongly typed messages to binary format and back into messages. If you want to write a message from memory into a file, first serialize the message into binary and then write binary to a file.
NodeJS Buffer docs
NodeJS write binary buffer into a file
Protocol Buffers JavaScript SDK Docs
It should look something like this:
const buffer = messageInstance.serializeBinary()
fs.writeFile("filename.pb", buffer, "binary", callback)
I found how to easily save a protobuf object in a file: convert the protobuf object into a buffer and then save it.
const protobuf = somefunction(); // returning protobuf object
const buffer = protobuf.toBuffer();
fs.writeFileSync("filename.pb", buffer);

Extract WAV header on javascript frontend (ReactJS)

I'm trying to analyze a file I'll be uploading from React; I need to know whether it can be uploaded based on several factors.
I found https://github.com/TooTallNate/node-wav
It works great on Node.js and I'm trying to use it in React. The sample creates a readable stream and pipes it to the wav reader.
var fs = require('fs');
var wav = require('wav');
var file = fs.createReadStream('track01.wav');
var reader = new wav.Reader();
// the "format" event gets emitted at the end of the WAVE header
reader.on('format', function (format) {
  // Format of the file
  console.log(format);
});
file.pipe(reader);
Using the FilePond controller I'm able to get a base64 string of the file, but I can't figure out how to pass it to the reader.
This is what I have so far in ReactJS:
const { Readable } = require('stream')
var reader = new wav.Reader();
reader.on('format', function (format) {
  // Format of the file
  console.log('format', format);
});
// decode the base64 string into a Buffer and feed it through a Readable stream
const buffer = Buffer.from(base64String, 'base64')
const readable = new Readable()
readable._read = () => { }
readable.push(buffer)
readable.push(null)
readable.pipe(reader)
But I get Error: bad "chunk id": expected "RIFF" or "RIFX", got "u+Zj"
Since this file works on Node.js with the same library, it's obvious I'm doing something wrong.
EDIT:
This was a problem with my base64 string; the method above works if anyone needs to analyze a WAV on the frontend.

Read files from AngularJS, convert to base64, and push to GitLab

I need to read multiple zip files and display them in AngularJS, then convert those files to base64 in Node.js and push them to GitLab. Please suggest whether this is possible in Node.js. Is there any blog available for reference?
Use the fs module of Node.js to read the files from the directory:
const testFolder = './tests/';
const fs = require('fs');
fs.readdirSync(testFolder).forEach(file => {
  console.log(file);
});
Once you get the files, you can convert them to base64:
function base64_encode(file) {
  // read binary data
  var bitmap = fs.readFileSync(file);
  // convert binary data to a base64 encoded string
  return Buffer.from(bitmap).toString('base64');
}
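Putting the two pieces together, a minimal sketch that walks the directory and base64-encodes each file; path.join is used so the encoder gets the full path, and pushing the result to GitLab is left out:
const fs = require('fs');
const path = require('path');
const testFolder = './tests/';
function base64_encode(file) {
  // read binary data and return it as a base64 encoded string
  return fs.readFileSync(file).toString('base64');
}
fs.readdirSync(testFolder).forEach(file => {
  const encoded = base64_encode(path.join(testFolder, file));
  console.log(file, encoded.length, 'base64 characters');
});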

Can't write/append to JSON file in Node Webkit

I want to have persistent memory (store the user's progress) in a .json file in %AppData%. I tried doing this according to this post, but it doesn't work. For testing purposes I'm only working with storing one object.
The code below doesn't work at all. If I use fs.open(filePath, "w", function(err, data) { ... instead of readFile(..., it does create a JSON file in %AppData%, but then nothing gets written to it; it's always 0 bytes.
var nw = require('nw.gui');
var fs = require('fs');
var path = require('path');
var file = "userdata.json";
var filePath = path.join(nw.App.dataPath, file);
console.log(filePath); // <- This shows correct path in Application Data.
fs.readFile(filePath, function (err, data) {
    var idVar = "1";
    var json = JSON.parse(data);
    json.push("id :" + idVar);
    fs.writeFile(filePath, JSON.stringify(json));
});
If anyone has any idea where I'm messing this up, I'd be grateful.
EDIT:
Solved, thanks to kailniris.
I was simply trying to parse an empty file
There is no JSON in the file you are trying to read. Before parsing the data, check whether the file is empty. If it is, create an empty JSON array, push the new data into it, and write it to the file; otherwise, parse the JSON already in the file.
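A minimal sketch of that advice, assuming the file holds a JSON array and filePath is defined as in the question:
fs.readFile(filePath, 'utf8', function (err, data) {
    var idVar = "1";
    // start from an empty array when the file is missing or empty,
    // otherwise parse the JSON that is already there
    var json = (err || !data || data.trim() === '') ? [] : JSON.parse(data);
    json.push("id :" + idVar);
    fs.writeFile(filePath, JSON.stringify(json), function (writeErr) {
        if (writeErr) console.error(writeErr);
    });
});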

How can I stream an image from node-webshot to filepicker

Node webshot is used to take a picture of an external website. The node webshot API is:
var webshot = require('webshot');
var fs = require('fs');
webshot('google.com', function (err, renderStream) {
  var file = fs.createWriteStream('google.png', { encoding: 'binary' });
  renderStream.on('data', function (data) {
    file.write(data.toString('binary'), 'binary');
  });
});
I am confused about file.write. Is the file being stored in the file object?
I want to be able to use filepickers rest API to upload the image like so:
curl -X POST -F fileUpload=#filename.txt https://www.filepicker.io/api/store/S3?key=MY_API_KEY
But I am confused about how to integrate webshot's renderStream with Filepicker without saving the file to disk first. When the file is in memory I want to send it to Filepicker immediately and then release it from memory.
Is this possible? Thanks!
I'm not sure about filepicker, but here is an example that streams the file to S3.
https://github.com/brenden/node-webshot/issues/90
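If you want to try the Filepicker endpoint directly, here is a rough sketch using the form-data package to stream the screenshot as a multipart upload, mirroring the curl command above. The field name, filename, and whether that endpoint accepts a streamed multipart body are assumptions, not something I have verified:
var webshot = require('webshot');
var FormData = require('form-data');
webshot('google.com', function (err, renderStream) {
  if (err) throw err;
  var form = new FormData();
  // stream the screenshot into the multipart body without touching disk
  form.append('fileUpload', renderStream, { filename: 'google.png', contentType: 'image/png' });
  form.submit('https://www.filepicker.io/api/store/S3?key=MY_API_KEY', function (submitErr, res) {
    if (submitErr) throw submitErr;
    console.log('upload finished with status', res.statusCode);
    res.resume(); // drain the response so the socket can close
  });
});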
