Having problems handling UIImageJPEGRepresentation or UIImagePNGRepresentation in Node.js - node.js

TL;DR: How do you write UIImageRepresentation data into the actual file format in a Node.js server? (or anywhere outside of Swift, for that matter)
So I'm in a bit of a predicament here...
I wanted to send a UIImageJPEGRepresentation (or any form of data encoded imagery) through Alamofire, over to a node server, to be saved there. I used Busboy to handle MultipartFormData...
busboy.on('field', function(fieldname, val, fieldnameTruncated, valTruncated, encoding, mimetype) {
  var datamail = './storage/' + fieldname;
  var stream = fs.createWriteStream(datamail);
  console.log('Field [' + fieldname + ']: value: ' + util.inspect(val));
  console.log('Storing letter in ' + datamail);
  stream.write(val);
});
and save it through a write stream. I originally wanted to read a UIImage I had put in, but I wasn't sure how the server would respond to an object like that, so I went and used UIImageJPEGRepresentation. It read the UIImageJPEGRepresentation object right...
Field [test.png]: value:
'�PNG\r\n\u001a\n\u0000\u0000\u0000\rIHDR\u0000\u0000\u0000d\u0000\u0000\u0000d\b\u0002\u0000\u0000\u0000��\u0002\u0003\u0000\u0000\u0000\u0001sRGB\u0000��\u001c�...' [several kilobytes of PNG bytes rendered as a mangled string, truncated]
and was successfully able to save it into a .jpeg, only to find that the result was a corrupted image I couldn't use. My app extracts frames out of the camera on the fly and converts them with UIImage(JPEG/PNG)Representation (see gist). The next best option seemed to be a direct upload (since that worked in Postman), but Alamofire only supports direct Data objects or URL-encoded payloads, and I don't think a UIImageJPEGRepresentation can be sent directly. I really just want to know how to handle these objects.
Thanks in advance.

Fixed this one quicker than I thought I would. I placed a logger in my server code to check for the MIME type:
busboy.on('field', function(fieldname, val, fieldnameTruncated, valTruncated, encoding, mimetype) {
  // var datamail = /*path.join('.', 'storage', fieldname);*/ './storage/' + fieldname;
  // var stream = fs.createWriteStream(datamail)
  console.log('MIMETYPE: ' + mimetype);
});
...and discovered it was text/plain. I went to Alamofire and changed the params,
from this:
Alamofire.upload(
    multipartFormData: { multipartFormData in
        multipartFormData.append(compresso!, withName: "test.png")
    },
to this:
Alamofire.upload(
    multipartFormData: { multipartFormData in
        // notice the MIME change
        multipartFormData.append(compresso!, withName: "test.png", fileName: "file.jpg", mimeType: "image/png")
    },
And it worked! It was able to safely go through processing and do its thing. They should really change this in the Alamofire examples, as they use a PNG representation and send it without the extra params (see Uploading Data to a Server). It's stuff like that that could keep a dev up at night...
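For what it's worth, the reason the extra parameters matter is visible in the multipart format itself: a part that has only a name is a plain text field, while a part with a filename and its own Content-Type is a file, which is what makes Busboy emit 'file' (a binary stream) instead of 'field' (a string). A small illustrative sketch, using a hypothetical partHeader helper that is not part of Busboy or Alamofire:

```javascript
// Hypothetical helper (not a real Busboy/Alamofire API): renders the header
// block of a single multipart/form-data part, to show what each call sends.
function partHeader(opts) {
  var h = 'Content-Disposition: form-data; name="' + opts.name + '"';
  if (opts.fileName) h += '; filename="' + opts.fileName + '"';
  if (opts.mimeType) h += '\r\nContent-Type: ' + opts.mimeType;
  return h;
}

// What the original append(compresso!, withName:) produced: a text field,
// so Busboy fired 'field' and delivered the PNG bytes as a mangled string.
console.log(partHeader({ name: 'test.png' }));

// What the fixed call produces: a filename plus a Content-Type makes it a
// file part, which multipart parsers hand over as a binary stream.
console.log(partHeader({ name: 'test.png', fileName: 'file.jpg', mimeType: 'image/png' }));
```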

Related

Sending data from Screen over Socket IO?

I have been googling around but cannot find a clear answer to this.
I am making a chrome extension which records tabs. The idea is to stream getUserMedia to the backend using Websockets (specifically Socket.io) where the backend writes to a file until a specific condition (a value in the backend) is set.
The problem is, I do not know how I would call the backend with a specific ID and also, how would I correctly write to the file without corrupting it?
You are sending the output from MediaRecorder, via a websocket, to your back end.
Presumably you do this from within MediaRecorder's ondataavailable handler.
It's quite easy to stuff that data into a websocket:
function ondataavailable(event) {
  event.data.arrayBuffer().then(buf => { socket.emit('media', buf) })
}
In the backend you must concatenate all the data you receive, in the order of reception, into a media file; this is easy to do by appending each received data item to the file. The media file will likely have a video/webm MIME type; if you give it the .webm filename extension, most media players will handle it correctly. Note that the file will be useless if it doesn't contain all the Blobs in order: the first few Blobs contain metadata needed to make sense of the stream.
Server-side you can use the socket.id attribute to make up a file name; that gives you a unique file for each socket connection. Something like this sleazy poorly-performing non-debugged not-production-ready code will do that.
io.on("connection", (socket) => {
  if (!socket.filename)
    socket.filename = path.join(__dirname, 'media', socket.id + '.webm')
  socket.on('filename', (name) => {
    socket.filename = path.join(__dirname, 'media', name + '.webm')
  })
  socket.on('media', (buf) => {
    fs.appendFile(socket.filename, buf, (err) => { if (err) console.error(err) })
  })
})
On the client side you could set the filename with this.
socket.emit('filename', 'myFavoriteScreencast')

Upload a binary encoded audio file via ajax and save

I have an audio file saved locally that I want to read, upload to a server via ajax and then store on the server. Somewhere along this process the file gets corrupted such that the file that's saved on the server cannot be played.
I'll list simplified bits of code that show the process I'm going through so hopefully it'll be evident where I'm going wrong.
1) After audio is recorded (using getUserMedia and MediaRecorder), a local file is saved:
var audioData = new Blob(chunks, { type: 'audio/webm' });
var fileReader = new FileReader();
fileReader.onloadend = function() {
  var buffer = this.result,
      uint8Array = new Uint8Array(buffer);
  fs.writeFile('path/to/file.webm', uint8Array, { flags: 'w' });
};
fileReader.readAsArrayBuffer(audioData);
2) Later this local file is read and sent to a server (using the library axios to send the ajax request)
fs.readFile('path/to/file.webm', 'binary', (err, data) => {
  var formData = new FormData();
  formData.append('file', new Blob([data], { type: 'audio/webm' }), 'file.webm');
  axios.put('/upload', formData);
});
3) The server then handles this request and saves the file
[HttpPut]
public IActionResult Upload(IFormFile file)
{
    using (var fileStream = new FileStream("path/to/file.webm", FileMode.Create))
    {
        file.CopyTo(fileStream);
    }
    return Ok();
}
The local audio file can be played successfully however the audio file on the server does not play.
I'm not sure if this is helpful information, but when I open the local file and the server copy in a text editor (Notepad++), the first few lines look kinda the same... but different. I've tried encoding a myriad of different ways but everything seems to fail. Fingers crossed someone can point me in the right direction here.
The problem was with how I was passing through the file contents from fs.readFile. If I passed a base64 encoded raw buffer from fs.readFile via json, converted that to a byte array on the server and saved that, then I can successfully play it on the server.
fs.readFile('path/to/file.webm', (err, data) => {
  axios.put('/upload', { audioData: data.toString('base64') });
});
[HttpPut]
public async Task<IActionResult> Upload([FromBody]UploadViewModel upload)
{
    var audioDataBytes = Convert.FromBase64String(upload.AudioData);
    using (var memoryStream = new MemoryStream(audioDataBytes))
    using (var fileStream = new FileStream("path/to/file.webm", FileMode.Create))
    {
        await memoryStream.CopyToAsync(fileStream);
    }
    return Ok();
}
Actually, this is a problem of character encoding. You are probably mixing UTF-8 and ISO-8859 which causes the file to be corrupted.
You should probably set the charset in the HTML page to the one expected on the server. Or perform preliminary checks on the server if you do not know the charset of the data you will receive.
Converting to base64 will solve the issue because then it will only use characters in the ASCII range.
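The effect is easy to demonstrate in Node itself: decoding raw bytes as UTF-8 destroys byte sequences that aren't valid UTF-8, while base64 stays within the ASCII range and round-trips losslessly. A self-contained sketch:

```javascript
// Why text encodings corrupt binary data but base64 doesn't.
// Bytes above 0x7F (common in media files) often aren't valid UTF-8.
const original = Buffer.from([0x1a, 0x45, 0xdf, 0xa3, 0x9f, 0x42, 0x86, 0x81]);

// Round-tripping through a UTF-8 string replaces invalid sequences
// with the U+FFFD replacement character, so the bytes change.
const viaUtf8 = Buffer.from(original.toString('utf8'), 'utf8');
console.log(viaUtf8.equals(original)); // false

// Base64 only uses ASCII characters, so nothing is lost.
const viaBase64 = Buffer.from(original.toString('base64'), 'base64');
console.log(viaBase64.equals(original)); // true
```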

Fabric.js loadFromJSON sometimes fails in Node.js if string contains images

I have a problem with PNG image generation on the server side, using Fabric.js + Node.js. I'm surprised that no one in the forums seems to have hit a similar problem. I am in total despair; it puts the use of Fabric.js in our project at risk.
PNG image generation in my Fabric.js Node.js service fails unpredictably. I cannot determine why it sometimes works and sometimes doesn't.
I need to generate PNG at server side. I’ve developed a small Node.js webservice based on samples here and here.
I’ve developed also a custom Fabric.js image class “RemoteImage”, based on Kangax sample here.
To minimize JSON string size, I store a dataless JSON in my database, and images are supposed to be loaded via the link provided in the "src" attribute of the Fabric.js Image element. As a result, I need to load the following JSON, containing 3 images, into the canvas:
{"objects":[{"type":"remote-image","originX":"left","originY":"top","left":44,"top":29,"width":976,"height":544,"fill":"rgb(0,0,0)","stroke":null,"strokeWidth":1,"strokeDashArray":null,"strokeLineCap":"butt","strokeLineJoin":"miter","strokeMiterLimit":10,"scaleX":0.5,"scaleY":0.5,"angle":0,"flipX":false,"flipY":false,"opacity":1,"shadow":null,"visible":true,"clipTo":null,"backgroundColor":"","fillRule":"nonzero","globalCompositeOperation":"source-over","localId":"222c0a8b-46ac-4c01-9c5c-79753937bc24","layerName":"productCanvas","itemName":"mainCanvas","src":"http://localhost:41075/en/RemoteStorage/GetRemoteItemImage/222c0a8b-46ac-4c01-9c5c-79753937bc24","filters":[],"crossOrigin":"use-credentials","alignX":"none","alignY":"none","meetOrSlice":"meet","remoteSrc":"http://localhost:41075/en/RemoteStorage/GetRemoteItemImage/222c0a8b-46ac-4c01-9c5c-79753937bc24","lockUniScaling":true},
{"type":"remote-image","originX":"left","originY":"top","left":382.5,"top":152.25,"width":292,"height":291,"fill":"rgb(0,0,0)","stroke":null,"strokeWidth":1,"strokeDashArray":null,"strokeLineCap":"butt","strokeLineJoin":"miter","strokeMiterLimit":10,"scaleX":0.43,"scaleY":0.43,"angle":0,"flipX":false,"flipY":false,"opacity":1,"shadow":null,"visible":true,"clipTo":null,"backgroundColor":"","fillRule":"nonzero","globalCompositeOperation":"source-over","localId":"8d97050e-eae8-4e95-b50b-f934f0df2d4c","itemName":"BestDeal.png","src":"http://localhost:41075/en/RemoteStorage/GetRemoteItemImage/8d97050e-eae8-4e95-b50b-f934f0df2d4c","filters":[],"crossOrigin":"use-credentials","alignX":"none","alignY":"none","meetOrSlice":"meet","remoteSrc":"http://localhost:41075/en/RemoteStorage/GetRemoteItemImage/8d97050e-eae8-4e95-b50b-f934f0df2d4c","lockUniScaling":true},
{"type":"remote-image","originX":"left","originY":"top","left":38,"top":38.5,"width":678,"height":370,"fill":"rgb(0,0,0)","stroke":null,"strokeWidth":1,"strokeDashArray":null,"strokeLineCap":"butt","strokeLineJoin":"miter","strokeMiterLimit":10,"scaleX":0.21,"scaleY":0.21,"angle":0,"flipX":false,"flipY":false,"opacity":1,"shadow":null,"visible":true,"clipTo":null,"backgroundColor":"","fillRule":"nonzero","globalCompositeOperation":"source-over","localId":"42dc0e49-e45f-4aa7-80cf-72d362deebb7","itemName":"simple_car.png","src":"http://localhost:41075/en/RemoteStorage/GetRemoteItemImage/42dc0e49-e45f-4aa7-80cf-72d362deebb7","filters":[],"crossOrigin":"use-credentials","alignX":"none","alignY":"none","meetOrSlice":"meet","remoteSrc":"http://localhost:41075/en/RemoteStorage/GetRemoteItemImage/42dc0e49-e45f-4aa7-80cf-72d362deebb7","lockUniScaling":true}],"background":""}
At Node.js server side I use the following code. I am transferring JSON string in base64 encoding to avoid some special-character problems:
var fabric = require('fabric').fabric;

function generatePNG(response, postData) {
  var canvas = fabric.createCanvasForNode(1500, 800);
  var decodedData = new Buffer(postData, 'base64').toString('utf8');
  response.writeHead(200, "OK", { 'Content-Type': 'image/png' });
  console.log("decodedData data: " + JSON.stringify(decodedData));
  console.log("prepare to load");
  canvas.loadFromJSON(decodedData, function () {
    console.log("loaded");
    canvas.renderAll();
    console.log("rendered");
    var stream = canvas.createPNGStream();
    stream.on('data', function (chunk) {
      response.write(chunk);
    });
    stream.on('end', function () {
      response.end();
    });
  });
}
In the console I see the message "prepare to load" appear, but "loaded" never does. I am not an expert in Node.js, and this is the only way I can tell that the error happens during the loadFromJSON call. But I do not understand where the problem is.
I am using fabric v1.5.0 and node-canvas v1.1.6 on the server side.
The Node.js + Fabric.js service runs on a Windows 8 machine, and I am making a POST request to it from a .NET MVC application.
Remark: maybe I should have omitted my comment about base64 encoding, as it is confusing. I also tried a plain JSON string, with the same result.
If the images referenced in the JSON are on the NodeJS server, try changing the file path to the directory path on the server as opposed to a web URL.
I'm not sure I fully understand how you are using the base64 image, but base64 images do require some character corrections. I don't recall the specifics and don't have the code where I do this handy, but a Google search should set you in the right direction.
I hope those ideas help.
It turned out the problem was related to the way the fabric.util.loadImage method works. For external images, loadImage makes an HTTP request assuming that no error can happen. The method used for requesting external images simply logs any error and ends, instead of returning the error through the callback back to loadImage. At that point the image-loading routine falls apart in an erroneous state and without any feedback; it just terminates, crashing the whole Node.js process.
It took me 3 days to finally discover that it was actually my image-supplying webservice responding with status code 500, making the Node.js request fail. Using the image-supplying webservice through a browser worked correctly, so at first I did not consider that the error was related to the request itself.
As the result I rewrote fromObject method of my custom Fabric.js object. Now it works in more safe fashion and in case of error I can get more feedback. Here is the implementation of my fromObject method. For http request I use module "request".
fabric.RemoteImage.fromObject = function (object, callback) {
  var requestUrl = object.remoteSrc;
  request({
    url: object.remoteSrc,
    encoding: null
  },
  function (error, response, body) {
    if (error || response.statusCode !== 200) {
      var errorMessage = "Error retrieving image " + requestUrl;
      if (error) {
        // On a transport error there is no response object to inspect
        errorMessage += "\n" + error.name + " with message: \n" + error.message;
        console.log(error.stack);
      } else {
        errorMessage += "\nResponse for the image returned status code " + response.statusCode;
      }
      console.log(errorMessage);
      callback && callback(null, new Error(errorMessage));
    } else {
      var img = new Image();
      var buff = new Buffer(body, 'binary');
      img.src = buff;
      var fabrImg = new fabric.RemoteImage(img, object);
      callback && callback(fabrImg);
    }
  });
};

Node.js stream upload directly to Google Cloud Storage

I have a Node.js app running on a Google Compute VM instance that receives file uploads directly from POST requests (not via the browser) and streams the incoming data to Google Cloud Storage (GCS).
I'm using Restify b/c I don't need the extra functionality of Express and because it makes it easy to stream the incoming data.
I create a random filename for the file, take the incoming req and toss it to a neat little Node wrapper for GCS (found here: https://github.com/bsphere/node-gcs) which makes a PUT request to GCS. The documentation for GCS using PUT can be found here: https://developers.google.com/storage/docs/reference-methods#putobject ... it says Content-Length is not necessary if using chunked transfer encoding.
Good news: the file is being created inside the appropriate GCS storage "bucket"!
Bad News:
I haven't figured out how to get the incoming file's extension from Restify (notice I'm setting '.jpg' and the content-type manually).
The file is experiencing slight corruption (almost certainly due to something I'm doing wrong with the PUT request). If I download the POSTed file from Google, OS X tells me it's damaged... BUT, if I use Photoshop, it opens and looks just fine.
Update / Solution
As pointed out by vkurchatkin, I needed to parse the request object instead of just piping the whole thing to GCS. After trying out the lighter busboy module, I decided it was just a lot easier to use multiparty. For dynamically setting the Content-Type, I simply used Mimer (https://github.com/heldr/mimer), referencing the file extension of the incoming file. It's important to note that since we're piping the part object, the part.headers must be cleared out. Otherwise, unintended info, specifically content-type, will be passed along and can/will conflict with the content-type we're trying to set explicitly.
Here's the applicable, modified code:
var restify = require('restify'),
    server = restify.createServer(),
    GAPI = require('node-gcs').gapitoken,
    GCS = require('node-gcs'),
    multiparty = require('multiparty'),
    Mimer = require('mimer');

server.post('/upload', function(req, res) {
  var form = new multiparty.Form();
  form.on('part', function(part) {
    var fileType = '.' + part.filename.split('.').pop().toLowerCase();
    var fileName = Math.random().toString(36).slice(2) + fileType;
    // clear out the part's headers to prevent conflicting data being passed to GCS
    part.headers = null;
    var gapi = new GAPI({
      iss: '-- your -- #developer.gserviceaccount.com',
      scope: 'https://www.googleapis.com/auth/devstorage.full_control',
      keyFile: './key.pem'
    },
    function(err) {
      if (err) { console.log('google cloud authorization error: ' + err); }
      var headers = {
        'Content-Type': Mimer(fileType),
        'Transfer-Encoding': 'Chunked',
        'x-goog-acl': 'public-read'
      };
      var gcs = new GCS(gapi);
      gcs.putStream(part, myBucket, '/' + fileName, headers, function(gerr, gres) {
        console.log('file should be there!');
      });
    });
  });
  form.parse(req);
});
You can't use the raw req stream since it yields the whole request body, which is multipart. You need to parse the request with something like multiparty, which gives you a readable stream and all the metadata you need.

Serving out saved Buffer from Mongo

I'm trying to serve out images that I have stored in a Mongo document. I'm using express, express-resource and mongoose.
The data, which is a JPG, is stored in a Buffer field in my schema. Seems like it's getting there correctly as I can read the data using the cli.
Then I run a find, grab the field and attempt sending it. See code:
res.contentType('jpg');
res.send(img);
I don't think it's a storage issue because I'm performing the same action here:
var img = fs.readFileSync(
__dirname + '/../../img/small.jpg'
);
res.contentType('jpg');
res.send(img);
In the browser the image appears (as a broken icon).
I'm wondering if it's an issue with express-resource because I have the format set to json, however I am indeed overriding the content type before sending the data.
scratches head
I managed to solve this myself. Seems like I was using the right method to send the data from express, but wasn't storing it properly (tricky!).
For future reference to anyone handling image downloads and managing them in Buffers, here is some sample code using the request package:
request(
  {
    uri: uri,
    encoding: 'binary'
  },
  function (err, response, body) {
    if (!err && response.statusCode == 200) {
      var imgData = new Buffer(body.toString(), 'binary').toString('base64');
      callback(null, new Buffer(imgData, 'base64'));
    }
  }
);
Within Mongo you need to set up a document property with type Buffer to store it successfully. Seems like this issue was due to how I was saving it into Mongo.
Hopefully that saves someone time in the future. =)
