insert binary img as html to pdf with node html-pdf - node.js

I am trying to insert a binary image into HTML to generate a PDF from an HTML document with the node module html-pdf.
Based on other questions, I tried the following code:
const pictureHtml = `<img src="data:image/png;base64","${binaryPicture}">`;
The picture is stored in MongoDB with the datatype Binary.
If this is not possible with html-pdf, can you suggest a different module?

The img src must be a base64 string, so you need to convert binaryPicture to base64 first. Note that the comma belongs inside the data URI, with no extra quotes around the interpolated value:
var base64data = Buffer.from(binaryPicture, 'binary').toString('base64');
const pictureHtml = `<img src="data:image/png;base64,${base64data}">`;
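As a self-contained sketch of this conversion (the four PNG magic bytes below stand in for the real image data a MongoDB Binary field would supply):

```javascript
// Stand-in for the binary picture read from MongoDB; a real Binary
// field would supply the full image bytes.
const binaryPicture = Buffer.from([0x89, 0x50, 0x4e, 0x47]); // PNG magic bytes

// Encode the raw bytes as base64...
const base64data = Buffer.from(binaryPicture).toString('base64');

// ...and interpolate after the comma, with no extra quotes inside the data URI.
const pictureHtml = `<img src="data:image/png;base64,${base64data}">`;

console.log(pictureHtml); // <img src="data:image/png;base64,iVBORw==">
```

The resulting pictureHtml can then be embedded in the HTML string you hand to html-pdf's pdf.create(html).toFile(...).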

Related

How to upload video or image file to database

I am working on a project in which I need to create an API using Node.js to upload a video or image file to a Postgres database.
How can I do this?
Thanks a lot
You can convert an image or video to base64, and then upload the base64 encoded string to your database.
const fs = require('fs');
function encode_base64(file) {
  const bitmap = fs.readFileSync(file); // readFileSync already returns a Buffer
  return Buffer.from(bitmap).toString('base64'); // new Buffer() is deprecated
}
const image = encode_base64('image.jpg');
const video = encode_base64('video.mp4');

Save binary image to file

I make an API request which returns a binary image. How can I save it to a file like photo.png on my machine? After some research I tried the following, but when I open the resulting image, my machine says it's damaged:
const buffer = new Buffer(imageBinary);
const b64 = buffer.toString("base64");
const path = `temp/${userId}`;
const url = path + "/photo.png";
if (!fs.existsSync(path)) fs.mkdirSync(path);
if (fs.existsSync(url)) fs.unlinkSync(url)
fs.createWriteStream(url).write(b64);
return url;
Edit: Here is the binary data FYI: https://gist.github.com/AskYous/1fd26dc0eb02b4ec1672dcf5c61a34df
You do not need to re-encode the buffer as base64. Just write the binary buffer as is:
fs.createWriteStream(url).write(imageBinary);

Node.js pipe image into memory and display it

I am making a Node.js/Electron application that downloads and displays images. I am downloading an image from the internet using request. I want to keep this image in memory and display it without saving the file to the local hard drive. I am aware I could accomplish what I am asking here by inserting <img src="url"> into the DOM, but this is a heavily simplified version of my code, and what I am trying to accomplish cannot be done that way.
var image = fs.createWriteStream(?); // Is there anyway to save the image to memory?
request(url).pipe(image);
$('img').exampleScriptToSetImage(image); // And is there any way to display that image in a element without saving it to the local disk and loading its path?
Indeed! You can pipe your requested image data into a concat stream, convert the buffer to base64, and set the base64 encoding as your image source.
var path = require('path')
var request = require('request') // or hyperquest, simple-get, etc...
var concat = require('concat-stream')
var image = request(url)
image.pipe(concat({ encoding: 'buffer' }, function (buf) {
  var ext = path.extname(file.name).slice(1) // assuming you have the file name
  if (ext === 'svg') ext = 'svg+xml'
  var src = 'data:image/' + ext + ';base64,' + buf.toString('base64')
  var img = document.createElement('img')
  img.src = src
  document.body.appendChild(img)
}))

How to create dynamic template in fs.createWriteStream?

This is my code. I am using pdfkit.
So, instead of sending a text I want to send an HTML template with dynamic data.
Right now I am using doc.text('my text11111').
Can we replace it with a template?
var fs = require('fs');
var PDFDocument = require('pdfkit');
var doc = new PDFDocument({
  size: 'letter'
});
doc.pipe(fs.createWriteStream('will.pdf'));
doc.text('my text11111');
doc.end();
To render raw HTML I suggest the pdfkitjs npm module, which is inspired by pdfkit.
Its usage looks like this:
new PDFKit('html', '<h1>Hello</h1>')
Please see the module here

How to load large json model into three.js project

I am trying to load building models into three.js. The models are JSON files generated by RvtVa3c, an add-in for Revit that produces JSON output. I then used THREE.ObjectLoader() to load a model into three.js, just like this example: json loader three.js.
Everything is fine with models below 100MB. When I tried to load a 200MB model, Chrome threw its "Aw, snap" error page, and Firefox threw "allocation size overflow". THREE.ObjectLoader() uses XHR to read the whole JSON file into a String in one go, and I suspect that String is too large for JavaScript: its length is already over 200,000,000 with a 100MB JSON file.
So I am seeking a way to load the JSON file as a stream. JSONStream in Node.js can handle the 200MB JSON; a code example is shown below.
var fs = require('fs'),
    JSONStream = require('JSONStream'),
    es = require('event-stream');

var getStream = function () {
  var jsonData = 'buildingModel.js',
      stream = fs.createReadStream(jsonData, { encoding: 'utf8' }),
      parser = JSONStream.parse('*');
  return stream.pipe(parser);
};

getStream()
  .pipe(es.mapSync(function (data) {
    console.log(data);
  }));
Since the browser cannot use require('module'), I tried using browserify to bundle JSONStream into my code. However, fs cannot be browserified.
Here are my questions:
Is using a stream the best way to load an extremely large JSON model in three.js? If not, what is a better solution?
Can fs be browserified? Or can I create a readable stream in the browser some other way?
Thank you for answering my questions.
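As a sketch of the second question: in the browser (and in Node 18+), the Streams API gives you readable streams without fs. A fetch() response body is itself a ReadableStream that can be consumed chunk by chunk and fed to an incremental JSON parser, so the 200MB file never has to live in one giant string. The helper below is illustrative, with an in-memory stream standing in for (await fetch(url)).body:

```javascript
// Consume a ReadableStream chunk by chunk, never holding the whole
// payload in one string; returns the number of chunks seen.
async function readInChunks(stream) {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let chunks = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    const text = decoder.decode(value, { stream: true });
    // feed `text` to an incremental JSON parser here
    chunks++;
  }
  return chunks;
}

// In-memory stand-in for a fetch() response body.
const body = new ReadableStream({
  start(controller) {
    controller.enqueue(new TextEncoder().encode('{"a":'));
    controller.enqueue(new TextEncoder().encode('1}'));
    controller.close();
  }
});

readInChunks(body).then(n => console.log(n)); // 2
```

In a real loader, each decoded chunk would go to a streaming JSON parser rather than being discarded, and the parsed objects handed to three.js as they arrive.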
