NodeJS: Unable to convert stream/buffer to base64 string

I need to create a base64 string that I have to send to a third-party API. I have the stream and the buffer. From the stream I am able to create an image, so there is no way the stream is corrupted. Here are the two variables:
var fs = require('fs');
let Duplex = require('stream').Duplex;

var newJpeg = Buffer.from(newData, 'binary'); // new Buffer(...) is deprecated
let _updatedFileStream = new Duplex();
_updatedFileStream.push(newJpeg);
_updatedFileStream.push(null);
No matter what I try, I cannot convert either of them to a base64 string.
_updatedFileStream.toString('base64');
Buffer(newJpeg, 'base64');
Buffer(newData, 'base64');
None of the above works. Sometimes I get Uint8Array[arraySize] or a gibberish string. What am I doing wrong?

You cannot call toString('base64') on the stream itself; collect the stream's chunks into a single Buffer first, then encode that. Example using promises (but this could easily be adapted to other approaches):
return new Promise((resolve, reject) => {
    let buffers = [];
    let myStream = <...>;
    myStream.on('data', (chunk) => { buffers.push(chunk); });
    myStream.once('end', () => {
        let buffer = Buffer.concat(buffers);
        resolve(buffer.toString('base64'));
    });
    myStream.once('error', (err) => {
        reject(err);
    });
});
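For instance, the snippet above can be wrapped in a reusable helper. A minimal sketch, where streamToBase64 and the file name are hypothetical:

const fs = require('fs');

function streamToBase64(myStream) {
    return new Promise((resolve, reject) => {
        const buffers = [];
        myStream.on('data', (chunk) => { buffers.push(chunk); });
        myStream.once('end', () => {
            // Concatenate all chunks once, then encode the whole Buffer.
            resolve(Buffer.concat(buffers).toString('base64'));
        });
        myStream.once('error', reject);
    });
}

// Usage: encode a file read as a stream.
streamToBase64(fs.createReadStream('image.jpg'))
    .then((base64) => console.log(base64))
    .catch((err) => console.error(err));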

Related

Extracting initVect from node stream and deciphering the rest

I have a readableStream with an encrypted file and I would like to decrypt it. So far I was able to create two readableStreams: with one I extracted the IV, and the second was used to decrypt the rest of the data. But I would like to just pipe it into one stream that does both the IV extraction AND the decryption. Something like this:
function getDecryptionStream(password, initVectLength = 16) {
    const cipherKey = getCipherKey(password);
    const unzip = zlib.createUnzip();
    return new Transform({
        transform(chunk, encoding, callback) {
            if (!this.initVect) {
                this.initVect = chunk.subarray(0, initVectLength);
                chunk = chunk.subarray(initVectLength, chunk.length);
            }
            const decipher = crypto.createDecipheriv(algorithm, cipherKey, this.initVect);
            callback(null, chunk.pipe(decipher).pipe(unzip));
        }
    });
}

function decrypt({ file, newFile, passphrase }) {
    const readStream = fs.createReadStream(file);
    const writeStream = fs.createWriteStream(newFile);
    const decryptStream = getDecryptionStream(passphrase, IV_LENGTH);
    readStream
        .pipe(decryptStream)
        .pipe(writeStream);
}
However, I cannot figure out how to process the chunk, as chunk.pipe(decipher) throws TypeError: chunk.pipe is not a function, because chunk is a Buffer, not a stream.
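A minimal sketch of one way around this, assuming algorithm and getCipherKey from the question: create the decipher once when the first chunk arrives, call decipher.update() on each Buffer instead of piping, and move the unzip step into the outer pipeline. (If the first chunk could be shorter than the IV, the slicing below would need extra buffering.)

const crypto = require('crypto');
const zlib = require('zlib');
const fs = require('fs');
const { Transform } = require('stream');

function getDecryptionStream(password, initVectLength = 16) {
    const cipherKey = getCipherKey(password);
    let decipher = null;
    return new Transform({
        transform(chunk, encoding, callback) {
            if (!decipher) {
                // First chunk: consume the IV and create the decipher once.
                const initVect = chunk.subarray(0, initVectLength);
                decipher = crypto.createDecipheriv(algorithm, cipherKey, initVect);
                chunk = chunk.subarray(initVectLength);
            }
            // Buffers are written to the decipher, not piped.
            callback(null, decipher.update(chunk));
        },
        flush(callback) {
            // Flush any remaining plaintext when the input ends.
            callback(null, decipher ? decipher.final() : null);
        }
    });
}

function decrypt({ file, newFile, passphrase }) {
    fs.createReadStream(file)
        .pipe(getDecryptionStream(passphrase, IV_LENGTH))
        .pipe(zlib.createUnzip())
        .pipe(fs.createWriteStream(newFile));
}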

Download a file to a base64 string

I'm running Node.js code on a read-only file system, and I would like to download a file and convert it directly to a base64 string, without writing the file to disk.
Now I have the following:
let file = fs.createWriteStream(`file.jpg`);
request({
    uri: fileUrl
})
    .pipe(file).on('finish', () => {
        let buff = fs.readFileSync(file);
        let base64data = buff.toString('base64');
    })
But this solution writes to disk, so it is not possible for me.
I would like to do the same but without the temp file on disk. Is it possible?
You don't pipe() into a variable. You collect the data off the stream into a variable as the data arrives. I think you can do something like this:
const Base64Encode = require('base64-stream').Base64Encode;
const request = require('request');

let base64Data = "";
request({
    uri: fileUrl
}).pipe(new Base64Encode()).on('data', data => {
    base64Data += data;
}).on('end', () => {
    // 'end' fires once all of the encoded data has been consumed
    console.log(base64Data);
}).on('error', err => {
    console.log(err);
});
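If pulling in base64-stream is not desirable, here is a sketch of the same idea with plain Buffers. Note that you cannot safely call toString('base64') on each raw chunk, because chunk boundaries rarely fall on 3-byte multiples, so collect everything first and encode once:

const request = require('request');

const chunks = [];
request({
    uri: fileUrl
}).on('data', (chunk) => {
    chunks.push(chunk);
}).on('end', () => {
    // Encode the complete download in one go.
    const base64Data = Buffer.concat(chunks).toString('base64');
    console.log(base64Data);
}).on('error', (err) => {
    console.log(err);
});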

Node.js copy a stream into a file without consuming it

Given a function that parses incoming streams:
async onData(stream, callback) {
    const parsed = await simpleParser(stream)
    // Code handling parsed stream here
    // ...
    return callback()
}
I'm looking for a simple and safe way to 'clone' that stream, so I can save it to a file for debugging purposes, without affecting the code. Is this possible?
The same question in fake code: I'm trying to do something like this. Obviously, this is a made-up example and doesn't work.
const fs = require('fs')
const wstream = fs.createWriteStream('debug.log')

async onData(stream, callback) {
    const debugStream = stream.clone(stream) // Fake code
    wstream.write(debugStream)
    const parsed = await simpleParser(stream)
    // Code handling parsed stream here
    // ...
    wstream.end()
    return callback()
}
No, you can't clone a readable stream without consuming it. However, you can pipe it twice: once for creating the file and once for the 'clone'.
Code is below:
let Readable = require('stream').Readable;
var stream = require('stream')

var s = new Readable()
s.push('beep')
s.push(null)

var stream1 = s.pipe(new stream.PassThrough())
var stream2 = s.pipe(new stream.PassThrough())
// here use stream1 for creating the file, and use stream2 as s's clone
// I just print them out for a quick show
stream1.pipe(process.stdout)
stream2.pipe(process.stdout)
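Applied to the original onData, a rough sketch (simpleParser and wstream are taken from the question):

const { PassThrough } = require('stream')
const fs = require('fs')
const wstream = fs.createWriteStream('debug.log')

async onData(stream, callback) {
    // Fork the incoming stream before anything consumes it.
    const debugStream = stream.pipe(new PassThrough())
    const parseStream = stream.pipe(new PassThrough())
    debugStream.pipe(wstream)                      // one copy goes to the debug file
    const parsed = await simpleParser(parseStream) // the parser gets the other copy
    // Code handling parsed stream here
    // ...
    return callback()
}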
I've tried to implement the solution provided by @jiajianrong but was struggling to get it to work with a createReadStream, because the Readable throws an error when I try to push the createReadStream directly. Like:
s.push(createReadStream())
To solve this issue I have used a helper function to transform the stream into a buffer.
function streamToBuffer (stream: any) {
    const chunks: Buffer[] = []
    return new Promise((resolve, reject) => {
        stream.on('data', (chunk: any) => chunks.push(Buffer.from(chunk)))
        stream.on('error', (err: any) => reject(err))
        stream.on('end', () => resolve(Buffer.concat(chunks)))
    })
}
Below is the solution I have found, using one pipe to generate a hash of the stream and the other pipe to upload the stream to cloud storage.
import stream from 'stream'
const Readable = require('stream').Readable

const s = new Readable()
s.push(await streamToBuffer(createReadStream()))
s.push(null)

const fileStreamForHash = s.pipe(new stream.PassThrough())
const fileStreamForUpload = s.pipe(new stream.PassThrough())

// Generating file hash
const fileHash = await getHashFromStream(fileStreamForHash)
// Uploading stream to cloud storage
await BlobStorage.upload(fileName, fileStreamForUpload)
My answer is mostly based on jiajianrong's answer.

Get PDFKit as base64 string

I'm searching for a way to get the base64 string representation of a PDFKit document. I can't find the right way to do it...
Something like this would be extremely convenient.
var doc = new PDFDocument();
doc.addPage();
doc.outputBase64(function (err, pdfAsText) {
    console.log('Base64 PDF representation', pdfAsText);
});
I already tried the blob-stream lib, but it doesn't work on a Node server (it says that Blob doesn't exist).
Thanks for your help!
I was in a similar predicament, wanting to generate PDFs on the fly without having temporary files lying around. My context is a NodeJS API layer (using Express) that is accessed from a React frontend.
Ironically, a similar discussion for Meteor helped me get to where I needed to be. Based on that, my solution resembles:
const PDFDocument = require('pdfkit');
const { Base64Encode } = require('base64-stream');
// ...
var doc = new PDFDocument();
// write to PDF
var finalString = ''; // contains the base64 string
var stream = doc.pipe(new Base64Encode());

doc.end(); // will trigger the stream to end

stream.on('data', function(chunk) {
    finalString += chunk;
});

stream.on('end', function() {
    // the stream is at its end, so push the resulting base64 string to the response
    res.json(finalString);
});
A synchronous option, not (yet) present in the documentation:
const doc = new PDFDocument();
doc.text("Sample text", 100, 100);
doc.end();
// After end(), the buffered document can be read synchronously.
const data = doc.read();
console.log(data.toString("base64"));
I just made a module for this that you could probably use: js-base64-file.
const Base64File = require('js-base64-file');
const b64PDF = new Base64File;

const file = 'yourPDF.pdf';
const path = `${__dirname}/path/to/pdf/`;

const doc = new PDFDocument();
doc.addPage();
// save your PDF using the filename and path

// this will load and convert
const data = b64PDF.loadSync(path, file);
console.log('Base64 PDF representation', data);

// you could also save a copy as base64 if you wanted, like so:
b64PDF.save(data, path, `copy-b64-${file}`);
It's a new module so my documentation isn't complete yet, but there is also an async method.
// this will load and convert if needed, asynchronously
b64PDF.load(
    path,
    file,
    function (err, base64) {
        if (err) {
            // handle error here
            process.exit(1);
        }
        console.log('ASYNC: you could send this PDF via ws or http to the browser now\n');
        // or, as above, save it here
        b64PDF.save(base64, path, `copy-async-${file}`);
    }
);
I suppose I could add a convert-from-memory method too. If this doesn't suit your needs, you could submit a request on the js-base64-file repo.
Following Grant's answer, here is an alternative that returns a promise instead of writing to the Node response (to ease calling it outside of a router):
const PDFDocument = require('pdfkit');
const { Base64Encode } = require('base64-stream');

const toBase64 = doc => {
    return new Promise((resolve, reject) => {
        try {
            const stream = doc.pipe(new Base64Encode());
            let base64Value = '';
            stream.on('data', chunk => {
                base64Value += chunk;
            });
            stream.on('end', () => {
                resolve(base64Value);
            });
            // reject on stream errors too; the try/catch cannot see async errors
            stream.on('error', reject);
        } catch (e) {
            reject(e);
        }
    });
};
The caller should invoke doc.end() before or after calling this async method.
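A usage sketch (the text content is made up):

const doc = new PDFDocument();
doc.text('Hello world');

const base64Promise = toBase64(doc); // attach the encoder
doc.end();                           // then finish the document

base64Promise.then(base64 => {
    console.log('Base64 PDF representation', base64);
});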

Node.JS Convert Base64 String into Binary and write to MongoDB GridFS

I have a Base64 string that I am converting to binary like this:
var b64string = req.body.image.substr(23); // strip the data-URL prefix ("data:image/...;base64,")
var buf = new Buffer(b64string, 'base64');
I need to insert this into MongoDB GridFS. The problem I am having is that createReadStream requires a file path, whereas I already have the file in memory.
This is what I am trying, but it does not work:
var grid = new gfs(db, mongo, 'files');
grid.createWriteStream(options, function (err, ws) {
    fs.createReadStream(buf, {autoClose: true})
        .pipe(ws)
        .on('close', function (f) {
            console.log(f._id)
            res.send(f._id)
        })
        .on('error', function (err) {
            console.log(err)
        })
})
But as I described, it wants a path, whereas I have buf in memory.
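(For reference, a Buffer can also be wrapped in a Readable and piped, reusing the push pattern from the stream-cloning answer above; ws is the GridFS write stream from the snippet. A sketch:)

var Readable = require('stream').Readable;

var bufStream = new Readable();
bufStream.push(buf);   // the Buffer decoded from base64
bufStream.push(null);  // signal end of data
bufStream.pipe(ws)
    .on('close', function (f) {
        console.log(f._id);
        res.send(f._id);
    });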
UPDATE ---
I was overthinking it...
This works:
var b64string = req.body.image.substr(23);
var buf = new Buffer(b64string, 'base64');

var grid = new Grid(db, 'files');
grid.put(buf, {}, function (err, file) {})
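A slightly fuller sketch of the same update, wiring in the error handling and response from the earlier attempt (Buffer.from is the modern replacement for the deprecated new Buffer):

var b64string = req.body.image.substr(23);     // strip the data-URL prefix
var buf = Buffer.from(b64string, 'base64');

var grid = new Grid(db, 'files');
grid.put(buf, {}, function (err, file) {
    if (err) {
        console.log(err);
        return res.status(500).send(err);
    }
    res.send(file._id);
});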
