How to read remote image into memory? - node.js

I need to read an image file into memory in a Firebase Cloud Function so I can convert it to base64, but I get this error:
Error: ENOENT: no such file or directory
// Read the file into memory.
const fs = require('fs');
const imageFile = fs.readFileSync('https://upload.wikimedia.org/wikipedia/commons/f/f7/Lower_Manhattan_skyline_-_June_2017.jpg');
// Convert the image data to a Buffer and base64 encode it.
const encoded = Buffer.from(imageFile).toString('base64');
How can I read a remote image file into memory?
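fs.readFileSync only reads local paths, which is why a URL produces ENOENT; the file has to be downloaded over HTTP first. A minimal sketch using Node's built-in https module (the helper name fetchImageAsBase64 is hypothetical):
const https = require('https');
// Download a remote file into memory and base64-encode it.
function fetchImageAsBase64(url) {
  return new Promise((resolve, reject) => {
    https.get(url, (res) => {
      const chunks = [];
      res.on('data', (chunk) => chunks.push(chunk));
      res.on('end', () => resolve(Buffer.concat(chunks).toString('base64')));
    }).on('error', reject);
  });
}
fetchImageAsBase64('https://upload.wikimedia.org/wikipedia/commons/f/f7/Lower_Manhattan_skyline_-_June_2017.jpg')
  .then((encoded) => console.log(encoded.slice(0, 60)));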

Related

Get contents of in-memory zip archive without saving the zip

I'm getting a ZIP archive from S3 using the aws s3 node SDK.
In this zip file there is a single .json file where I want to get the contents from. I don't want to save this file to storage, but only get the contents of this zip file.
Example:
File.zip contains a single file:
file.json with contents {"value":"abcd"}
I currently have:
const { S3Client, GetObjectCommand } = require("@aws-sdk/client-s3");
const s3Client = new S3Client({ region: 'eu-central-1'});
const file = await s3Client.send(new GetObjectCommand({Bucket:'MyBucket', Key:'file.zip'}));
file.Body now contains a Readable stream with the contents of the zip file. I now want to turn this Readable stream into {"value":"abcd"}.
Is there a library or piece of code that can help me do this and produce the result without having to save the file to disk?
You could use the package archiver or the built-in zlib module (zlib ships with Node.js; note that zlib decompresses gzip/deflate data, so a true .zip archive needs an archive-aware library, as sketched after this answer).
A snippet from part of my project looks like this:
import { unzipSync } from 'zlib';
// Fetch the data and get it as a Buffer
const res = await fetch('url')
const zipBuffer = Buffer.from(await res.arrayBuffer())
// Decompress the data (unzipSync is synchronous, so no await is needed)
// and convert it to utf8
const unzippedBuffer = unzipSync(zipBuffer)
const fileData = unzippedBuffer.toString('utf8')
fileData now contains the content of your zipped file as a string; you can use JSON.parse(fileData) to get it as JSON and work with it.
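For an actual .zip archive like the one in the question, a minimal sketch using the adm-zip package on top of the AWS SDK v3 call above (adm-zip and the streamToBuffer helper are assumptions, not part of the original answer):
const AdmZip = require('adm-zip');
// Collect a Readable stream into a single in-memory Buffer.
function streamToBuffer(stream) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    stream.on('data', (chunk) => chunks.push(chunk));
    stream.on('end', () => resolve(Buffer.concat(chunks)));
    stream.on('error', reject);
  });
}
const file = await s3Client.send(new GetObjectCommand({ Bucket: 'MyBucket', Key: 'file.zip' }));
const zipBuffer = await streamToBuffer(file.Body);
// Read a single entry from the archive without touching the disk.
const zip = new AdmZip(zipBuffer);
const contents = JSON.parse(zip.readAsText('file.json')); // { value: 'abcd' }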

How to save returned protobuf object in nodejs?

In my code, a function returns a protobuf object and I want to save it in a file xyz.pb.
When I try to save it using fs.writeFileSync, nothing is saved.
The object contains circular references, so I tried saving it with the circular-json module just to confirm there is data inside it, and there is.
But because I used circular-json, the output doesn't contain the proper information (it is not properly formatted) and is of no use.
How can I save this protobuf to a file using Node.js?
Thanks!
You can try using streams, as mentioned in the documentation, like this:
const crypto = require('crypto');
const fs = require('fs');
const wstream = fs.createWriteStream('fileWithBufferInside');
// creates random Buffer of 100 bytes
const buffer = crypto.randomBytes(100);
wstream.write(buffer);
wstream.end();
Or you can convert the buffer to JSON and save it in a file like this:
const crypto = require('crypto');
const fs = require('fs');
const wstream = fs.createWriteStream('myBinaryFile');
// creates random Buffer of 100 bytes
const buffer = crypto.randomBytes(100);
wstream.write(JSON.stringify(buffer));
wstream.end();
If your application logic doesn't require synchronous behavior, you should not use writeFileSync, because it blocks your code until the write finishes, so be careful.
Try writeFile or streams instead; they are more convenient.
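A minimal sketch of the asynchronous variant, reusing the buffer from the snippets above:
const fs = require('fs');
// writeFile queues the write and invokes the callback when it completes,
// without blocking the event loop.
fs.writeFile('fileWithBufferInside', buffer, (err) => {
  if (err) throw err;
});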
The purpose of Protocol Buffers is to serialize strongly typed messages to a binary format and back into messages. If you want to write a message from memory into a file, first serialize the message into binary and then write the binary to a file.
NodeJS Buffer docs
NodeJS write binary buffer into a file
Protocol Buffers JavaScript SDK Docs
It should look something like this:
const buffer = messageInstance.serializeBinary()
fs.writeFile("filename.pb", buffer, "binary", callback)
I found an easy way to save a protobuf object to a file: convert the protobuf object into a buffer and then save it.
const protobuf = somefunction(); // returning protobuf object
const buffer = protobuf.toBuffer();
fs.writeFileSync("filename.pb", buffer);

How do you add a header to wav file?

I am sending audio data stored as a blob to my backend (Node/Express). When I save the file as .wav and attempt to use it with the SpeechRecognition package in Python, it throws an error saying the "file does not start with RIFF id". So how can I add the headers to my blob file before I save it so that it is a correctly formatted .wav file? I can provide the code if necessary.
node.js file
var express = require('express');
var multer = require('multer');
var fs = require('fs'); // use the file system so we can save files
var uniqid = require('uniqid');
var spawn = require('child_process').spawn;

var router = express.Router();
const storage = multer.memoryStorage();
var upload = multer({ storage: storage });

router.post('/api/test', upload.single('upl'), function (req, res) {
  console.log(req.file);
  console.log(req.file.buffer);
  var id = uniqid();
  // Write the raw buffer to the server as a .wav file.
  fs.writeFileSync(id + '.wav', Buffer.from(new Uint8Array(req.file.buffer)));
  const scriptPath = 'handleAudio.py';
  // Throws an error about the header in the .wav file.
  const process = spawn('python3', [__dirname + '/../' + scriptPath, '/home/bitnami/projects/sample/' + id + '.wav', req.file.originalname, 'True']);
});
Also, I had this same example working with a PHP endpoint that just saved the blob to a file with a .wav extension, and the Python script accepted it. What could be different between move_uploaded_file in PHP and what I am doing above in Node?
Every .wav file needs a header specified by the WAVE file format, available here. While it's fine for you to build the header yourself, it's much easier to use a proper library to do the work for you.
One example is node-wav, which has a nice API to write WAVE files from raw PCM data (what you have at the moment). Example code is provided in the node-wav documentation.
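For instance, a minimal sketch using node-wav, assuming the uploaded blob is raw 16-bit mono PCM at 16 kHz (the sample rate and bit depth are assumptions to adjust; req.file.buffer and id come from the question's code):
const wav = require('node-wav');
const fs = require('fs');
// View the raw bytes as 16-bit samples, then scale them to the [-1, 1]
// Float32Array channel data that node-wav expects.
const bytes = req.file.buffer;
const pcm = new Int16Array(bytes.buffer, bytes.byteOffset, bytes.length / 2);
const channelData = Float32Array.from(pcm, (s) => s / 32768);
// encode() prepends the RIFF/WAVE header for us.
const wavBuffer = wav.encode([channelData], { sampleRate: 16000, float: false, bitDepth: 16 });
fs.writeFileSync(id + '.wav', wavBuffer);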

Save binary image to file

I make an API request which returns a binary image. How can I save it to a file like photo.png on my machine? Doing some research, I've tried the following, but when I open the image, my machine says it's damaged:
const buffer = new Buffer(imageBinary);
const b64 = buffer.toString("base64");
const path = `temp/${userId}`;
const url = path + "/photo.png";
if (!fs.existsSync(path)) fs.mkdirSync(path);
if (fs.existsSync(url)) fs.unlinkSync(url)
fs.createWriteStream(url).write(b64);
return url;
Edit: Here is the binary data FYI: https://gist.github.com/AskYous/1fd26dc0eb02b4ec1672dcf5c61a34df
You do not need to re-encode the buffer as base64. Just write the binary buffer as is:
fs.createWriteStream(url).write(imageBinary);
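A slightly fuller sketch of the same idea; the Buffer.isBuffer guard is an addition for the case where the API returned a binary string rather than a Buffer:
const fs = require('fs');
// A binary string must be decoded as 'binary' (latin1); the default
// 'utf8' would mangle the image bytes.
const data = Buffer.isBuffer(imageBinary)
  ? imageBinary
  : Buffer.from(imageBinary, 'binary');
fs.writeFileSync(url, data);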

Read files from AngularJS, convert to base64, and push into GitLab

I need to read multiple zip files and display them in AngularJS, then convert those files to base64 in Node.js and push them into GitLab. Please suggest whether this is possible in Node.js. Is there any blog available for reference?
Use the fs module of Node.js to read the files from a directory:
const testFolder = './tests/';
const fs = require('fs');
fs.readdirSync(testFolder).forEach(file => {
  console.log(file);
});
Once you get the files, you can convert them to base64:
function base64_encode(file) {
  // read binary data (readFileSync returns a Buffer)
  var bitmap = fs.readFileSync(file);
  // convert the binary data to a base64-encoded string
  return bitmap.toString('base64');
}
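Putting the two snippets together, a hypothetical usage (the encodedFiles shape is an assumption; pushing to GitLab would then go through its repository files API with encoding set to base64):
const encodedFiles = fs.readdirSync(testFolder).map(function (file) {
  return { name: file, content: base64_encode(testFolder + file) };
});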
