Sending data from Screen over Socket IO? - node.js

I have been googling around but cannot find a clear answer to this.
I am making a Chrome extension that records tabs. The idea is to stream getUserMedia output to the backend over WebSockets (specifically Socket.io), where the backend writes it to a file until a specific condition (a value in the backend) is set.
The problem is, I do not know how I would call the backend with a specific ID, and also, how would I correctly write to the file without corrupting it?

You are sending the output from MediaRecorder, via a websocket, to your back end.
Presumably you do this from within MediaRecorder's ondataavailable handler.
It's quite easy to stuff that data into a websocket:
function ondataavailable(event) {
    event.data.arrayBuffer().then(buf => { socket.emit('media', buf) })
}
In the backend you must concatenate all the data you receive, in the order of reception, to a media file. This media file likely has a video/webm MIME type; if you give it the .webm filename extension most media players will handle it correctly. Notice that the media file will be useless if it doesn't contain all the Blobs in order. The first few Blobs contain metadata needed to make sense of the stream. This is easy to do by appending each received data item to the file.
Server-side you can use the socket.id attribute to make up a file name; that gives you a unique file for each socket connection. Something like this sleazy poorly-performing non-debugged not-production-ready code will do that.
io.on("connection", (socket) => {
    if (!socket.filename)
        socket.filename = path.join(__dirname, 'media', socket.id + '.webm')
    socket.on('filename', (name) => {
        socket.filename = path.join(__dirname, 'media', name + '.webm')
    })
    socket.on('media', (buf) => {
        fs.appendFile(socket.filename, buf, (err) => {
            if (err) console.error(err)
        })
    })
})
On the client side you could set the filename with this.
socket.emit('filename', 'myFavoriteScreencast')

Related

Respond with two files data on readFile - Node

I'm using Node to respond to clients with two files. For now, I'm using an endpoint for each file, because I can't figure out how to pass more than one at a time.
Here's the function that responds with the file:
exports.chartBySHA1 = function (req, res, next, id) {
    var dir = './curvas/' + id + '/curva.txt'; // id = 1e4cf04ad583e483c27b40750e6d1e0302aff058
    fs.readFile(dir, function read(err, data) {
        if (err) {
            return res.status(400).send("Não foi possível buscar a curva.");
        }
        res.status(200).send(data);
    });
};
Besides that, I need to change the default name of the file: when I reach that endpoint, the downloaded file is named 1e4cf04ad583e483c27b40750e6d1e0302aff058, but I'm passing the content of 'curva.txt'.
Does anyone have any tips?
Q: How do I pass the contents of more than one file back to a user without having to create individual endpoints?
A: There are a few ways you can do this.
If the content of each file is not huge then the easiest way out is to read in all of the contents and then transmit them back as a javascript key-value object. E.g.
let data = {
    file1: "This is some text from file 1",
    file2: "Text for second file"
}
res.send(data); // send() ends the response; no separate res.end() is needed
If the content is particularly large, then you can stream the data across to the client; while doing so you could add some metadata or hints to tell the client what it is going to receive next and where each file ends.
There are probably libraries that can do the latter for you already, so I would suggest you shop around on GitHub before designing/writing your own.
The former method is the easiest.

NodeJS - Live Stream Audio to Specific URL with mp3 audio "chunks"

I'm working on developing an application that will capture audio from the browser in 5 second "chunks" (these are full audio files and not simply partial files), send these 5 second chunks to the server, convert it from webm to mp3 on the server, and then broadcast the mp3 file to clients connected via a websocket or a static url.
I've successfully managed to do parts 1 and 2; however, I'm not quite sure of the best approach to transmit this created mp3 audio file to the user. My thinking was to generate a single URL for clients to listen in to, e.g. http://localhost/livestream.mp3 (a live-stream URL that would automatically update itself with the latest audio data), or to emit the audio files to the clients over a websocket and attempt to play these sequenced audio files seamlessly without any noticeable gaps as they switch out.
Here's a snippet of my [typescript] code where I create the mp3 file, and I've pointed out the area in which I would perform the writestream and from there I would expect to pipe this to users when they make an HTTP req.
private createAudioFile(audioObj: StreamObject, socket: SocketIO.Socket): void {
    const directory: string = `${__dirname}/streams/live`;
    fs.writeFile(`${directory}/audio_${audioObj.time}.webm`, audioObj.stream, (err: NodeJS.ErrnoException) => {
        if (err) logger.default.info(err.toString());
        try {
            const process: childprocess.ChildProcess = childprocess.spawn('ffmpeg', ['-i', `${directory}/audio_${audioObj.time}.webm`, `${directory}/audio_${audioObj.time}.mp3`]);
            process.on('exit', () => {
                // Ideally, this is where I would be broadcasting the audio from
                // the static URL by adding the new stream data to it, or by
                // emitting it out to all clients connected to my websocket
                // const wso = fs.createWriteStream(`${directory}/live.mp3`);
                // const rso = fs.createReadStream(`${directory}/audio_${audioObj.time}.mp3`);
                // rso.pipe(wso);
                if (audioObj.last == true) {
                    this.archiveAudio(directory, audioObj.streamName);
                }
            });
        } catch (e) {
            logger.default.error('CRITICAL ERROR: Exception occurred when converting file to mp3:');
            logger.default.error(e);
        }
    });
}
I've seen a number of questions out there that ask for a similar concept, but not quite the final goal that I'm looking for. Is there a way to make this work?
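One pattern that fits the static-URL idea described above is to hold every connected HTTP response open and write each newly converted chunk to all of them. This is only an illustrative sketch, not tested production code; handleLiveStream and broadcastChunk are hypothetical names.

```javascript
// Keep every connected listener's response open; push each new mp3
// chunk to all of them. Listeners joining mid-stream start receiving
// from the next chunk onward.
const clients = new Set();

function handleLiveStream(req, res) {
  res.writeHead(200, { 'Content-Type': 'audio/mpeg' });
  clients.add(res);
  // Drop listeners that disconnect so we stop writing to dead sockets.
  req.on('close', () => clients.delete(res));
}

function broadcastChunk(chunk) {
  // Would be called after ffmpeg finishes converting each 5-second chunk.
  for (const res of clients) res.write(chunk);
}

// app.get('/livestream.mp3', handleLiveStream);
```

Because mp3 is frame-delimited, a listener that joins mid-stream can usually start decoding at the next frame boundary, which is why this kind of endless-URL broadcast tends to work better for mp3 than for container formats like webm.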

How do you read a stream in a middleware and keep it streamable for the next middleware

I'm using a proxy middleware to forward multipart data to a different endpoint. I would like to get some information from the stream in a previous middleware, and still have the stream readable for the proxy middleware that follows. Is there a stream pattern that allows me to do this?
function preMiddleware(req, res, next) {
    req.rawBody = '';
    req.on('data', function (chunk) {
        req.rawBody += chunk;
    });
    req.on('end', () => {
        next();
    })
}

function proxyMiddleware(req, res, next) {
    console.log(req.rawBody)
    console.log(req.readable) // false
}
app.use('/cfs', preMiddleware, proxyMiddleware)
I want to access the name value of <input name="fee" type='file' /> before sending the streamed data to the external endpoint. I think I need to do this because the endpoint parses fee into the final url, and I would like to have a handle for doing some post processing. I'm open to alternative patterns to resolve this.
I don't think there is any mechanism for peeking into a stream without actually permanently removing data from the stream or any mechanism for "unreading" data from a stream to put it back into the stream.
As such, I can think of a few possible ideas:
Read the data you want from the stream and then send the data to the final endpoint manually (not using your proxy code that expects the readable stream).
Read the stream, get the data you want out of it, then create a new readable stream, put the data you read into that readable stream, and pass that readable stream on to the proxy. Exactly how to hand it to the proxy will need some looking into the proxy code. You might have to make a new req object that is the new stream.
Create a stream transform that lets you read the stream (potentially even modifying it) while creating a new stream that can be fed to the proxy.
Register your own data event handler, then pause the stream (registering a data event handler automatically triggers the stream to flow and you don't want it to flow yet), then call next() right away. I think this will allow you to "see" a copy of all the data as it goes by when the proxy middleware reads the stream, as there will just be multiple data event handlers, one for your middleware and one for the proxy middleware. This is a theoretical idea - I haven't yet tried it.
You would need to be able to send a single stream in two different directions, which is not going to be easy if you try it on your own - luckily I wrote a helpful module back in the day, rereadable-stream, that you could use, and I'll use scramjet for finding the data you're interested in.
I assume your data will be multipart with standard form-data boundaries:
const {StringStream} = require('scramjet');
const {ReReadable} = require("rereadable-stream");

// I will use a single middleware, since express does not allow to pass an altered request object to next()
app.use('/cfs', (req, res, next) => {
    const buffered = req.pipe(new ReReadable());   // buffer the request so it can be rewound later
    let file = '';
    buffered.pipe(new StringStream)                // pipe to a StringStream
        .lines('\n')                               // split request by line
        .filter(x => x.startsWith('Content-Disposition: form-data;'))
                                                   // find form-data lines
        .parse(x => x.split(/;\s*/).reduce((a, y) => { // split values
            const z = y.split(/:\s*/);             // split value name from value
            a[z[0]] = JSON.parse(z[1]);            // assign to accumulator (values are quoted)
            return a;
        }, {}))
        .until(x => x.name === 'fee' && (file = x.filename, 1))
                                                   // run the stream until the filename is found
        .run()
        .then(() => uploadFileToProxy(file, buffered.rewind(), res, next))
                                                   // upload the file using your method
});
You'll probably need to adapt this a little to make it work in real world scenario. Let me know if you get stuck or there's something to fix in the above answer.

Having problems handling UIImageJPEGRepresentation or UIImagePNGRepresentation in Node.js

TL;DR: How do you write UIImageRepresentation data into the actual file format in a Node.js server? (or any place outside of Swift at that)
So I'm in a bit of a predicament here...
I wanted to send a UIImageJPEGRepresentation (or any form of data encoded imagery) through Alamofire, over to a node server, to be saved there. I used Busboy to handle MultipartFormData...
busboy.on('field', function(fieldname, val, fieldnameTruncated, valTruncated, encoding, mimetype) {
    var datamail = './storage/' + fieldname;
    var stream = fs.createWriteStream(datamail)
    console.log('Field [' + fieldname + ']: value: ' + util.inspect(val));
    console.log('Storing letter in ' + datamail);
    stream.write(val);
});
and save it through a write stream. I originally wanted to read a UIImage I had put in, but I wasn't sure how the server would respond to an object like that, so I went and used UIImageJPEGRepresentation. It read the UIImageJPEGRepresentation object right...
Field [test.png]: value:
'�PNG\r\n\u001a\n\u0000\u0000\u0000\rIHDR\u0000\u0000\u0000d\u0000\u0000\u0000d\b\u0002\u0000\u0000\u0000…' (several kilobytes of raw PNG bytes, escaped as a string, omitted here)
was successfully able to save it into a .jpeg...
SO won't let me add a picture because my rep isn't high enough, but on opening the saved file I found that it was a corrupted image I couldn't use. My app extracts frames out of the camera on the fly and converts them to UIImage(JPEG/PNG)Representation (see gist). The next best option seemed to be a direct upload (since it worked in Postman), but Alamofire only supports direct Data objects or URL-encoded things, and I don't think UIImageJPEGRepresentations can be directly sent. I really just want to know how to handle these objects.
Thanks in advance.
Fixed this one quicker than I thought I would. I placed a logger in my server code to check for the MIME type:
busboy.on('field', function(fieldname, val, fieldnameTruncated, valTruncated, encoding, mimetype) {
    // var datamail = /*path.join('.', 'storage', fieldname);*/ './storage/' + fieldname;
    // var stream = fs.createWriteStream(datamail)
    console.log('MIMETYPE: ' + mimetype);
});
...discovering a text/plain one. I went into my Alamofire call and changed the params,
from this:
Alamofire.upload(
multipartFormData: { multipartFormData in
multipartFormData.append(compresso!, withName: "test.png")
},
to this:
Alamofire.upload(
multipartFormData: { multipartFormData in
// notice the MIME change
multipartFormData.append(compresso!, withName: "test.png", fileName: "file.jpg", mimeType: "image/png")
},
And it worked! It was able to safely go through processing and do its thing. They should really change this in the Alamofire examples, as they use a PNG representation and send it without the extra params (see Uploading Data to a Server). It is stuff like that that could potentially keep a dev up at night...

How to make the client download a very large file that is generated on the fly

I have an export function that read the entire database and create a .xls file with all the records. Then the file is sent to the client.
Of course, exporting the full database requires a lot of time and the request will soon end in a timeout error.
What is the best solution to handle this case?
I heard something about making a queue with Redis, for example, but this would require two requests: one to start the job that generates the file and a second to download the generated file.
Is this possible with a single request from the client?
Excel Export:
Use Streams. Following is a rough idea of what might be done:
Use the exceljs module, because it has a streaming API aimed at this exact problem.
var Excel = require('exceljs')
Since we are trying to initiate a download, write appropriate headers to the response.
res.status(200);
res.setHeader('Content-disposition', 'attachment; filename=db_dump.xls');
res.setHeader('Content-type', 'application/vnd.ms-excel');
Create a workbook backed by the streaming Excel writer. The stream given to the writer is the server response.
var options = {
    stream: res, // write to server response
    useStyles: false,
    useSharedStrings: false
};
var workbook = new Excel.stream.xlsx.WorkbookWriter(options);
Now, the output streaming flow is all set up. For the input streaming, prefer a DB driver that gives query results/cursor as a stream.
Define an async function that dumps 1 table to 1 worksheet.
var tableToSheet = function (name, done) {
    var str = dbDriver.query('SELECT * FROM ' + name).stream();
    var sheet = workbook.addWorksheet(name);
    str.on('data', function (d) {
        sheet.addRow(d).commit(); // format object if required
    });
    str.on('end', function () {
        sheet.commit();
        done();
    });
    str.on('error', function (err) {
        done(err);
    });
}
Now, let's export some DB tables, using the async module's mapSeries:
async.mapSeries(['cars', 'planes', 'trucks'], tableToSheet, function (err) {
    if (err) {
        // log error
    }
    res.end();
})
CSV Export:
For CSV export of a single table/collection module fast-csv can be used:
// response headers as usual
res.status(200);
res.setHeader('Content-disposition', 'attachment; filename=mytable_dump.csv');
res.setHeader('Content-type', 'text/csv');
// create csv stream
var csv = require('fast-csv');
var csvStr = csv.createWriteStream({headers: true});
// open database stream
var dbStr = dbDriver.query('SELECT * from mytable').stream();
// connect the streams
dbStr.pipe(csvStr).pipe(res);
You are now streaming data from DB to HTTP response, converting it into xls/csv format on the fly. No need to buffer or store the entire data in memory or in a file.
You do not have to send the whole file at once; you can send it in chunks (line by line, for example) - just use res.write(chunk) and res.end() at the finish to mark it as completed.
You can either send the file information as a stream, sending each individual chunk as it gets created via res.write(chunk), or, if sending the file chunk by chunk is not an option and you have to wait for the entire file before sending any information, you can keep the connection open by setting the timeout duration to Infinity or any value you think will be high enough to allow the file to be created. Then set up a function that creates the .xls file and either:
1) accepts a callback that receives the data output as an argument once it is ready, sends that data, and then closes the connection, or;
2) returns a promise that resolves with the data output once it is ready, allowing you to send the resolved value and close the connection just like with the callback version.
It would look something like this:
function xlsRouteHandler(req, res) {
    res.setTimeout(Infinity) || res.socket.setTimeout(Infinity)

    // callback version
    createXLSFile(...fileCreationArguments, function (finishedFile) {
        res.end(finishedFile)
    })

    // promise version
    createXLSFile(...fileCreationArguments)
        .then(finishedFile => res.end(finishedFile))
}
If you still find yourself concerned about timing out, you can always set an interval timer to dispatch an occasional res.write() message to prevent a timeout on the server connection and then cancel that interval once the final file content is ready to be sent.
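The interval idea can be sketched as below. startKeepAlive is a hypothetical helper; note that the filler bytes become part of the response body, so this only suits payloads that tolerate leading whitespace.

```javascript
// Periodically write a filler byte so the connection is not judged
// idle, then stop once the real payload is ready. The interval length
// is arbitrary here.
function startKeepAlive(res, ms = 15000) {
  const timer = setInterval(() => res.write(' '), ms);
  // The caller invokes the returned function right before res.end().
  return function stop() {
    clearInterval(timer);
  };
}

// Hypothetical usage:
// const stop = startKeepAlive(res);
// createXLSFile(args, (finishedFile) => { stop(); res.end(finishedFile); });
```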
Refer to this link, which uses Jedis (a Redis Java client).
The key to this is the RPOPLPUSH command:
https://blog.logentries.com/2016/05/queuing-tasks-with-redis/
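The reliable-queue trick behind RPOPLPUSH is that a job moves from the pending list to a processing list in a single atomic step, so a crashed worker's job is never lost, only parked. Here is a toy in-memory model of that pattern, not a real Redis client:

```javascript
// Toy model of RPOPLPUSH: take a job from the tail of `source` and
// place it at the head of `destination` in one step. A job is removed
// from `destination` only after the worker finishes it.
function rpoplpush(source, destination) {
  if (source.length === 0) return null;
  const job = source.pop();    // RPOP: take from the tail
  destination.unshift(job);    // LPUSH: put at the head
  return job;
}

const pending = ['job1', 'job2', 'job3']; // head..tail
const processing = [];

const job = rpoplpush(pending, processing);
// If the worker dies now, the job still sits in `processing` and can
// be re-queued instead of vanishing.
```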
