Saving Data URI as Image? - node.js

On a Node server I would like to save an uploaded data URI as an image. To do this I've tried decoding the content of this PNG:
data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAMAAAACCAIAAAFlEcHbAAAAB3RJTUUH1gMWFjk7nUWcXQAAAAlwSFlzAABOIAAATiABFn2Z3gAAAARnQU1BAACxjwv8YQUAAAAeSURBVHjaY7h79y7DhAkTGIA04/Tp0xkYGJ49ewYAgYwLV/R7bDQAAAAASUVORK5CYII=
and saving it with a .png extension. It looks like there is more to it than that. How do I decode the data URI and save it as a file?

I've created a library to be used with Node.js that helps with encoding and decoding of data URI schemes. I believe it can help you, check:
https://github.com/DiegoZoracKy/image-data-uri
Using this library, in your case, the code would be:
'use strict';
const ImageDataURI = require('image-data-uri');
const dataURI = 'data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAMAAAACCAIAAAFlEcHbAAAAB3RJTUUH1gMWFjk7nUWcXQAAAAlwSFlzAABOIAAATiABFn2Z3gAAAARnQU1BAACxjwv8YQUAAAAeSURBVHjaY7h79y7DhAkTGIA04/Tp0xkYGJ49ewYAgYwLV/R7bDQAAAAASUVORK5CYII=';
const fileName = 'decoded-image.png';
ImageDataURI.outputFile(dataURI, fileName);

I was trying to decode the data using atob and save it as a PNG file. Instead, I keep it base64 encoded and specify the encoding when building the write buffer (here, data is the base64 payload after the comma, without the data:image/png;base64, prefix):
fs.writeFileSync('tmp/myfile.png', Buffer.from(data, 'base64'));
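For completeness, a minimal sketch of the whole round trip, using the data URI from the question and the same output path (the tmp directory is assumed to exist):

const fs = require('fs');

const dataURI = 'data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAMAAAACCAIAAAFlEcHbAAAAB3RJTUUH1gMWFjk7nUWcXQAAAAlwSFlzAABOIAAATiABFn2Z3gAAAARnQU1BAACxjwv8YQUAAAAeSURBVHjaY7h79y7DhAkTGIA04/Tp0xkYGJ49ewYAgYwLV/R7bDQAAAAASUVORK5CYII=';

// Everything after the first comma is the base64 payload
const base64Payload = dataURI.split(',')[1];

// Decode and write the bytes out as a PNG
fs.writeFileSync('tmp/myfile.png', Buffer.from(base64Payload, 'base64'));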

You can convert your data URI to a Blob using the code below:
function dataURItoBlob(dataURI) {
  var byteStr;
  if (dataURI.split(',')[0].indexOf('base64') >= 0)
    byteStr = atob(dataURI.split(',')[1]);
  else
    byteStr = unescape(dataURI.split(',')[1]);

  var mimeStr = dataURI.split(',')[0].split(':')[1].split(';')[0];

  var arr = new Uint8Array(byteStr.length);
  for (var i = 0; i < byteStr.length; i++) {
    arr[i] = byteStr.charCodeAt(i);
  }

  return new Blob([arr], { type: mimeStr });
}
and then you can append this blob to your form data and upload it as a file:
var blob = dataURItoBlob(dataURI);
var fd = new FormData(document.forms[0]);
fd.append("image", blob);


How to convert image.png to binary in NodeJS?

I am trying to consume the Azure Forms Recognizer API, where I have to provide the body in the form of "[Binary PNG data]" as stated here.
The connection seems to be working fine, however I am getting this response:
{"error":{"code":"InvalidImage","innerError":{"requestId":"73c86dc3-51a3-48d8-853b-b6411f54c51e"},"message":"The input data is not a valid image or password protected."}}
I am using a PNG that is in my local directory and I've tried converting it in many different ways, including:
fs.readFile('test.png', function(err, data) {
  if (err) throw err;
  // Encode to base64
  let encodedImage = new Buffer(data, 'binary').toString('base64');
  // Decode from base64
  var decodedImage = new Buffer(encodedImage, 'base64').toString('binary');
});
or
let data_string = fs.createReadStream('test.png');
and many others. None of them seem to work and I always get the same response from my POST request.
I would appreciate it if anyone could share how to convert this PNG into the correct format. Thank you in advance.
To base64:
const file = fs.readFileSync('/some/place/image.png')
const base64String = Buffer.from(file).toString('base64')
Then pass the base64String to Azure.
If you want just a blob, i.e. a binary file, you can do this:
const file = fs.readFileSync('/some/place/image.png')
const blob = Buffer.from(file)
// This variant reads a File object in the browser via FileReader
const processFile = (file: any) => {
  const reader = new FileReader();
  reader.readAsArrayBuffer(file);
  reader.onload = function () {
    const binaryData = Buffer.from(reader.result as string, 'binary');
    console.log(binaryData);
  };
};
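If the goal is ultimately to send the raw PNG bytes to the Forms Recognizer endpoint, a rough sketch from Node (assuming Node 18+ for the global fetch; the endpoint path and key below are placeholders, not the exact API route) could look like this:

const fs = require('fs');

async function analyzePng() {
  // Raw binary Buffer, no base64 step needed
  const body = fs.readFileSync('test.png');

  // Placeholder endpoint and key; substitute your own resource values
  const res = await fetch('https://<your-resource>.cognitiveservices.azure.com/<analyze-path>', {
    method: 'POST',
    headers: {
      'Content-Type': 'image/png',
      'Ocp-Apim-Subscription-Key': '<your-key>'
    },
    body
  });

  console.log(res.status, await res.text());
}

analyzePng().catch(console.error);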

Cannot get back the DICOM metadata after decoding from a base64 string

I am sending DICOM images to my API by encoding them as base64 on the frontend, which is in Angular CLI. I also have a REST API that receives those encoded DICOM images and decodes them back before doing some processing with them. But after decoding the DICOM image into the memory stream, the metadata of the DICOM image is lost. A better solution would be appreciated. Please find my code below.
//Angular code
var file = event.dataTransfer ? event.dataTransfer.files[i] : event.target.files[0];
//var pattern = /.dcm/;
var reader = new FileReader();
reader.onload = this._handleReaderLoaded.bind(this);
reader.readAsDataURL(file);
//Web API Code
[HttpPost("UploadFile/{Id}")]
public async Task<IActionResult> UploadFile(int Id, [FromBody] DICOMFiles dicomfiles)
{
    String base64Encoded = encodedImage;
    string output = encodedImage.Substring(encodedImage.IndexOf(',') + 1);
    byte[] data = Convert.FromBase64String(output);
    MemoryStream stream = new MemoryStream(data);

    client.UploadFile(stream, "Projects/test_images/Test.dcm");
}
At last, I found a solution for this. The problem is not with decoding from base64. The actual problem is with the client.UploadFile() method call.
Before calling client.UploadFile(), we need to make sure that the memory stream is positioned at 0. This lets client.UploadFile() read and write the full content of the byte[] array from the start. We can do this as shown below:
stream.Position = 0;

NodeJS: Updating Exif data and saving image using PIEXIF

I need to update the orientation tag (EXIF data) for an uploaded image. I am using "PIEXIF" for this. I am not using Express but Swagger. The code I've written is:
//Get the uploaded buffer
var _originalBuffer = req.swagger.params.uploadedFile.value.buffer;
let Duplex = require('stream').Duplex;
//Create stream from buffer. This stream is required later to send to cloud.
let _uploadedFileStream = new Duplex();
_uploadedFileStream.push(_originalBuffer);
_uploadedFileStream.push(null);
//Create base 64 string so that "PIEXIF" can read exif data from it.
const jpegData = "data:image/jpeg;base64, " + createStringFromBuffer(_originalBuffer, 'base64');
//Read exif data.
var _exifData = piexif.load(jpegData);
//Create a copy of exif data. Will be used to create a new image with updated orientation tag.
var _exifDataCopied = {};
for (var key in _exifData) {
    _exifDataCopied[key] = _exifData[key];
}
//Update orientation tag.
if (_exifDataCopied["0th"][piexif.ImageIFD.Orientation])
    _exifDataCopied["0th"][piexif.ImageIFD.Orientation] = 1;
//Example taken from https://www.npmjs.com/package/piexifjs
//From here onwards, there seems to be an issue.
var exifbytes = piexif.dump(_exifDataCopied);
var newData = piexif.insert(exifbytes, createStringFromBuffer(_originalBuffer, 'binary'));
var newJpeg = new Buffer(newData);
//Create a new stream and save it as image back.
let _updatedFileStream = new Duplex();
_updatedFileStream.push(newJpeg);
_updatedFileStream.push(null);
var fs = require('fs');
var writeStream = fs.createWriteStream("./uploads/" + "Whatever.jpg")
The issue is that no error is thrown by the code. The image also gets saved in the directory, but it is corrupted and I cannot preview it. Since the code does not break anywhere, I am confused about what the issue could be. The function to convert a buffer to a string with different encodings (since I need it a lot) is:
var createStringFromBuffer = function(_buffer, _encoding) {
    return Buffer.from(_buffer).toString(_encoding);
}
Can someone point out where I am going wrong? I am using the example given here.
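For reference, a minimal sketch of the round trip roughly following the example on the piexifjs npm page, writing the result straight to disk instead of going through the stream plumbing, and using the 'binary' encoding when converting back to a Buffer (file names are just examples):

const fs = require('fs');
const piexif = require('piexifjs');

// Read the JPEG as a binary string so piexif can parse it
const jpegBinary = fs.readFileSync('./uploads/original.jpg').toString('binary');

// Load the EXIF data, update the orientation tag and dump it back out
const exifData = piexif.load(jpegBinary);
exifData['0th'][piexif.ImageIFD.Orientation] = 1;
const exifBytes = piexif.dump(exifData);

// Insert the updated EXIF into the image and write it out
const updatedBinary = piexif.insert(exifBytes, jpegBinary);
fs.writeFileSync('./uploads/updated.jpg', Buffer.from(updatedBinary, 'binary'));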

Gzip and upload to Azure blob

I am trying to gzip and upload a static js file to an Azure blob, but I end up with a 0 byte blob. Can someone tell me what I'm doing wrong?
var filePath = "C:\test.js";

using (var compressed = new MemoryStream()) {
    using (var gzip = new GZipStream(compressed, CompressionMode.Compress)) {
        var bytes = File.ReadAllBytes(filePath);
        gzip.Write(bytes, 0, bytes.Length);

        var account = CloudStorageAccount.Parse("...");
        var blobClient = new CloudBlobClient(account.BlobEndpoint, account.Credentials);
        var blobContainer = blobClient.GetContainerReference("temp");
        var blob = blobContainer.GetBlockBlobReference(Path.GetFileName(filePath));
        blob.Properties.ContentEncoding = "gzip";
        blob.Properties.ContentType = "text/javascript";
        blob.Properties.CacheControl = "public, max-age=3600";
        blob.UploadFromStream(compressed);
    }
}
You need to reset the stream position to zero on the compressed stream. You've written data into the stream, but when you upload from the stream on the last line, the position of the stream is at the end of the data you have written, so blob.UploadFromStream starts from the current position in the stream, which has nothing after it.
Add the following before you perform the upload:
compressed.Position = 0;
That should upload the full contents of the stream.
This isn't necessarily an Azure thing; most code that works with streams operates on the current position of the stream. I've been burned by it several times.

How do I parse a data URL in Node?

I've got a data URL like this:
data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAA...
What's the easiest way to get this as binary data (say, a Buffer) so I can write it to a file?
Put the data into a Buffer using the 'base64' encoding, then write this to a file:
var fs = require('fs');
var string = "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAUAAAAFCAYAAACNbyblAAAAHElEQVQI12P4//8/w38GIAXDIBKE0DHxgljNBAAO9TXL0Y4OHwAAAABJRU5ErkJggg==";
var regex = /^data:.+\/(.+);base64,(.*)$/;
var matches = string.match(regex);
var ext = matches[1];
var data = matches[2];
var buffer = Buffer.from(data, 'base64');
fs.writeFileSync('data.' + ext, buffer);
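Wrapped up as a small reusable helper with a guard for input that is not a base64 data URL (a sketch along the same lines, not part of the original answer; the helper name is made up):

function dataUrlToFile(dataUrl, basename) {
  var matches = dataUrl.match(/^data:.+\/(.+);base64,(.*)$/);
  if (!matches) {
    throw new Error('Expected a base64-encoded data URL');
  }
  var filename = basename + '.' + matches[1];
  fs.writeFileSync(filename, Buffer.from(matches[2], 'base64'));
  return filename;
}

dataUrlToFile(string, 'data'); // writes data.png for the example above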
Try this
const dataUrl = "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAUAAAAFCAYAAACNbyblAAAAHElEQVQI12P4//8/w38GIAXDIBKE0DHxgljNBAAO9TXL0Y4OHwAAAABJRU5ErkJggg==";
const buffer = Buffer.from(dataUrl.split(",")[1], 'base64');
I also ran into this (parsing and validating data URLs) recently and found the following workaround: https://gist.github.com/bgrins/6194623
I created two packages to make working with data URLs easier in code. Here they are:
https://github.com/killmenot/valid-data-url
https://github.com/killmenot/parse-data-url
Check out the examples.
I was looking into the sources of Node.js and stumbled upon this code that decodes a data URL into a Buffer. Although the function is not public and is intended exclusively for parsing encoded ES modules, it sheds light on aspects of data URLs that are apparently not considered by some other answers: the content of a data URL does not have to be base64 encoded, it may be URL encoded, and it may even be unencoded.
Essentially, the Node.js logic boils down to something like the code below plus error handling:
const parsed = new URL(url);
const match = /^[^/]+\/[^,;]+(?:[^,]*?)(;base64)?,([\s\S]*)$/.exec(parsed.pathname);
const { 1: base64, 2: body } = match;
const buffer = Buffer.from(decodeURIComponent(body), base64 ? 'base64' : 'utf8');
This will correctly handle different encodings of a JavaScript file with the content console.log("Node.js");:
data:text/javascript,console.log("Node.js");
data:text/javascript,console.log(%22Node.js%22)%3B
data:text/javascript;base64,Y29uc29sZS5sb2coIk5vZGUuanMiKTs=
The resulting buffer can be converted into a string if required with buffer.toString().
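As a quick check, wrapping the snippet above in a helper and running it over those three URLs should print the same source each time (a sketch; the function name is mine):

function dataUrlToBuffer(url) {
  const parsed = new URL(url);
  const match = /^[^/]+\/[^,;]+(?:[^,]*?)(;base64)?,([\s\S]*)$/.exec(parsed.pathname);
  const { 1: base64, 2: body } = match;
  return Buffer.from(decodeURIComponent(body), base64 ? 'base64' : 'utf8');
}

[
  'data:text/javascript,console.log("Node.js");',
  'data:text/javascript,console.log(%22Node.js%22)%3B',
  'data:text/javascript;base64,Y29uc29sZS5sb2coIk5vZGUuanMiKTs='
].forEach((url) => console.log(dataUrlToBuffer(url).toString()));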
This method works for me:
function dataURItoBlob(dataURI) {
  // decode the base64 payload into a Buffer
  var data = dataURI.split(',')[1];
  var byteString = Buffer.from(data, "base64");

  // separate out the mime component
  var mimeString = dataURI.split(",")[0].split(":")[1].split(";")[0];

  // wrap the bytes in a Blob, and you're done
  var blob = new Blob([byteString], { type: mimeString });
  return blob;
}
To use it:
var uri = 'data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAUAAAAFCAYAAACNbyblAAAAHElEQVQI12P4//8/w38GIAXDIBKE0DHxgljNBAAO9TXL0Y4OHwAAAABJRU5ErkJggg==';
dataURItoBlob(uri)
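If you then need the blob back on disk (this assumes Node 18+, where Blob is available globally), a quick sketch:

const fs = require('fs');

// Blob.arrayBuffer() resolves to the raw bytes, which Buffer can wrap for writing
dataURItoBlob(uri).arrayBuffer().then((bytes) => {
  fs.writeFileSync('decoded.png', Buffer.from(bytes));
});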
