Socket.io write file to user disk - node.js

I have a Node.js web app that has a button. When this button is pressed, it gathers data and sends the data to a socket.io function. Here is the client code:
$("body").on('click', ".download", function() {
var data = [1,2,3,4,5];
socket.emit('download_data', data);
}
On the server side I have a socket.io function that receives the data:
socket.on('download_data', function(data) {
    // create csv and download
});
I want the socket function to create a CSV file with the generated data and write the file to the user's disk. Is this possible?

On the server, generate the content of the CSV file:
let csv = '';
data.forEach(row => { // assuming an array of arrays
    csv += row.join(';') + '\n';
});
Return the content to the client:
socket.emit('csv_content', csv);
On the client, receive the content and ask the user to download or open it. I am using the file-saver package here.
import { saveAs } from 'file-saver';

socket.on('csv_content', content => {
    const blob = new Blob([content], { type: "text/csv;charset=utf-8" });
    saveAs(blob, "data.csv");
});
Not sure why you are using socket.io for this, as you can do the same with an ordinary HTTP request.
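For comparison, a minimal sketch of the plain-HTTP approach (the route name and placeholder data are illustrative, assuming an Express app):

const express = require('express');
const app = express();

// Hypothetical endpoint that builds the CSV and lets the browser download it.
app.get('/download-csv', (req, res) => {
    const data = [[1, 2, 3], [4, 5, 6]]; // placeholder data
    const csv = data.map(row => row.join(';')).join('\n');
    res.setHeader('Content-Type', 'text/csv; charset=utf-8');
    res.setHeader('Content-Disposition', 'attachment; filename=data.csv');
    res.send(csv);
});

With this the browser handles the download natively, so no file-saver is needed on the client.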

Related

Node / Express generate calendar URL

I have a database with a bunch of dates and an online overview where you can view them. I know I can copy a URL from my Google Agenda and import it into other calendar clients so I can view the events there.
I want to create an Express endpoint that fetches every event each time it is called and returns them in a format that other calendar clients can import. With packages like iCal-generator I could generate, read, and return the file whenever a user requests the URL, but it feels redundant to write a file to my storage only to read it, return it, and delete it on every request.
What is the most efficient way to go about this?
Instead of generating the file/calendar data on every request, you could implement a simple caching mechanism. That is, upon start of your Node app you generate the calendar data and put it in your cache with a corresponding time-to-live (TTL) value. Once the data has expired or new entries are inserted into your DB, you invalidate the cache, re-generate the data, and cache it again.
Here's a very simple example for an in-memory cache that uses the node-cache library:
const NodeCache = require('node-cache');
const cacheService = new NodeCache();

// ...

const calendarDataCacheKey = 'calendar-data';

// at the start of your app, generate the calendar data and cache it with a TTL of 30 min
cacheCalendarData(generateCalendarData());

function cacheCalendarData(calendarData) {
    cacheService.set(calendarDataCacheKey, calendarData, 1800);
}

// in your Express handler, first try to get the value from the cache;
// if it is not there, generate it and cache it
app.get('/calendar-data', (req, res) => {
    let calendarData = cacheService.get(calendarDataCacheKey);
    if (calendarData === undefined) {
        calendarData = generateCalendarData();
        cacheCalendarData(calendarData);
    }
    res.send(calendarData);
});
If your app is scaled horizontally, you should consider using Redis as a shared cache instead.
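A rough sketch of the same idea with Redis (using the node-redis v4 client; the key name and TTL mirror the example above):

const { createClient } = require('redis');

async function cacheCalendarDataInRedis() {
    const redisClient = createClient();
    await redisClient.connect();
    // cache the generated calendar data for 30 minutes, mirroring the TTL above
    await redisClient.set('calendar-data', generateCalendarData(), { EX: 1800 });
    return redisClient;
}

// in the handler: try the shared cache first, regenerate on a miss
async function getCalendarData(redisClient) {
    let calendarData = await redisClient.get('calendar-data');
    if (calendarData === null) {
        calendarData = generateCalendarData();
        await redisClient.set('calendar-data', calendarData, { EX: 1800 });
    }
    return calendarData;
}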
100% untested, but I have code similar to this that exports to a .csv from a db query, and it might get you close:
const { Readable } = require('stream');

async function getCalendar(req, res) {
    const events = await db.getCalendarEvents();
    const filename = 'some_file.ics';
    res.set({
        'Content-Type': 'text/calendar',
        'Content-Disposition': `attachment; filename=${filename}`,
    });
    const input = new Readable({
        objectMode: true,
        read() {}, // no-op; we push manually below
    });
    input.pipe(res)
        .on('error', (err) => {
            console.error('SOME ERROR', err);
            res.status(500).end();
        });
    // each chunk must be a string/Buffer by the time it reaches res,
    // so serialize the event objects as you push them
    events.forEach(e => input.push(String(e)));
    input.push(null);
}
If you were going to use the iCal-generator package, you would do your transforms within the forEach loop before pushing to the stream, as in the sketch below.
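For example, a rough sketch with ical-generator (classic CommonJS usage assumed; the event field names depend on your DB schema, and events and input come from the snippet above):

const ical = require('ical-generator'); // v2-style CommonJS API assumed

const calendar = ical({ name: 'my calendar' });
events.forEach(e => {
    calendar.createEvent({
        start: e.start,   // illustrative field names
        end: e.end,
        summary: e.title,
    });
});
input.push(calendar.toString()); // push one serialized calendar instead of raw events
input.push(null);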

How to send File through Websocket along with additional info?

I'm developing a Web application to send images, videos, etc. to two monitors from an admin interface. I'm using ws in Node.js for the server side. I've implemented selecting images available on the server and external URLs and sending them to the clients, but I also wanted to be able to directly send images selected from the device with a file input. I managed to do it using base64 but I think it's pretty inefficient.
Currently I send a stringified JSON object containing the client to which the resource has to be sent, the kind of resource and the resource itself, parse it in the server and send it to the appropriate client. I know I can set the Websocket binaryType to blob and just send the File object, but then I'd have no way to tell the server which client it has to send it to. I tried using typeson and BSON to accomplish this, but it didn't work.
Are there any other ways to do it?
You can send raw binary data through the WebSocket.
It's quite easy to manage.
One option is to prepend a "magic byte" (an identifier that marks the message as non-JSON). For example, prepend binary messages with the B character.
All the server has to do is test the first character before collecting the binary data (if the magic byte isn't there, it's probably the normal JSON message).
A more serious implementation will attach a header after the magic byte (i.e., file name, total length, position of the data being sent, etc.).
This allows the upload to be resumed after disconnections (send just the parts that weren't acknowledged as received).
Your server will need to split the data into magic byte, header, and binary data before processing, but it's easy enough to accomplish.
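A minimal sketch of that framing (all names are illustrative; the client uses the browser WebSocket API, the server uses ws, where messages arrive as Buffers):

// Client: prepend the magic byte 'B', a 4-byte header length, the JSON header,
// then the raw file bytes.
function frameBinaryMessage(file, meta) {
    const headerBytes = new TextEncoder().encode(JSON.stringify(meta));
    const lengthPrefix = new Uint8Array(4);
    new DataView(lengthPrefix.buffer).setUint32(0, headerBytes.length);
    return new Blob([new Uint8Array([0x42]), lengthPrefix, headerBytes, file]); // 0x42 = 'B'
}
ws.send(frameBinaryMessage(file, { client: 'monitor-1', name: file.name }));

// Server (ws): test the first byte before deciding how to parse.
ws.on('message', (data) => {
    if (data[0] === 0x42) { // magic byte present: binary frame
        const headerLength = data.readUInt32BE(1);
        const header = JSON.parse(data.slice(5, 5 + headerLength).toString());
        const fileData = data.slice(5 + headerLength);
        // route fileData to the client named in header.client ...
    } else {
        const message = JSON.parse(data.toString()); // the normal JSON message
    }
});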
Hope this helps someone.
According to the socket.io documentation, you can send a string, a Buffer, or a mix of both.
On the client side:
function uploadFile(e, socket, to) {
    let file = e.target.files[0];
    if (!file) {
        return;
    }
    if (file.size > 10000000) {
        alert('File should be smaller than 10MB');
        return;
    }
    var reader = new FileReader();
    reader.onload = function (e) {
        var rawData = e.target.result;
        socket.emit("send_message", {
            type: 'attachment',
            data: rawData
        }, (result) => {
            alert("Server has received the file!");
        });
    };
    reader.readAsArrayBuffer(file);
}
On the server side:
socket.on('send_message', async (data, cb) => {
    if (data.type == 'attachment') {
        console.log('Found binary data');
        cb("Received file successfully.");
        return;
    }
    // Process other business...
});
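To actually persist the received attachment, the handler above could write the payload to disk; a sketch (the ArrayBuffer arrives server-side as a Buffer, which socket.io delivers for binary payloads; the path is illustrative):

const fs = require('fs');

socket.on('send_message', (data, cb) => {
    if (data.type === 'attachment') {
        // data.data is a Buffer here; write it straight to disk
        fs.writeFile('./uploads/attachment.bin', data.data, (err) => {
            cb(err ? "Failed to save file." : "Received file successfully.");
        });
    }
});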
I am using plain WebSocket without socket.io, where you cannot mix content in one message: it is either a string or binary. My working solution is like this:
CLIENT:
import { serialize } from 'bson';
import { Buffer } from 'buffer';

const reader = new FileReader();
ws = new WebSocket(...)
reader.onload = (e) => {
    const rawData = e.target.result;
    const bufferData = Buffer.from(rawData);
    const bsonData = serialize({ // whatever js Object you need
        file: bufferData,
        route: 'TRANSFER',
        action: 'FILE_UPLOAD',
    });
    ws.send(bsonData);
};
reader.readAsArrayBuffer(file); // file obtained from an <input type="file">, as above
Then on the Node server side, the message is caught and parsed like this:
const { deserialize } = require('bson');
const fs = require('fs');
const path = require('path');

const dataFromClient = deserialize(wsMessage, { promoteBuffers: true });
fs.writeFile(
    path.join('../server', 'yourfiles', 'yourfile.txt'),
    dataFromClient.file,
    'binary',
    (err) => {
        if (err) console.error('ERROR!!!!', err);
    }
);
The key is the promoteBuffers option in the deserialize function.

Cannot get node to pipe data to a download file correctly

I'm fairly new to Node and streaming, and I am having an issue when attempting to stream a large amount of data to a file on the client browser.
For example, if on the server I have a large file, test.txt, I can easily stream it to the client browser by setting the attachment header and piping the file to the response, as follows:
res.setHeader('Content-Type', 'text/csv');
res.setHeader('Content-disposition', 'attachment;filename=myfile.text');
fs.createReadStream('./test.txt')
.pipe(res);
When the user clicks the button, the download begins, and we see the data getting streamed to the download file. The stream takes several minutes, but during this time the client is not blocked and they can continue to do other things while the file is downloaded by the browser.
However, my data is not stored in a file; I need to retrieve it one string at a time from another server. So I'm attempting to create my own read stream and push my data chunk by chunk, but it does not work. When I do something like this:
var s = new Readable();
s.pipe(res);
for (let i = 0; i <= total; i++) {
    dataString = // code here to get next string needed to push
    s.push(dataString);
}
s.push(null);
With this code, when the user requests the download, once it begins the client is blocked and cannot do any other actions until the download is completed. Also, if the data takes more than 30 seconds to stream, we hit the server timeout and the download fails. With the file stream this is not an issue.
How do I get this to act like a file stream and not block the client from making other requests while it downloads? Any recommendations on the best way to implement this would be appreciated.
I was able to resolve this issue by doing something similar to what is described here:
How to call an asynchronous function inside a node.js readable stream
My basic code is as follows, and it neither blocks the client nor times out, as the data is continuously piped to the file download on the client side.
res.setHeader('Content-Type', 'text/csv');
res.setHeader('Content-Encoding', 'gzip'); // without this, the client saves the raw gzip bytes
res.setHeader('Content-disposition', 'attachment;filename=myfile.text');

function MyStream() {
    var rs = new Readable();
    var hitsadded = 0;
    rs._read = function() {}; // needed to avoid "Not implemented" exception

    getResults(queryString, function getMoreUntilDone(err, result) {
        if (err) {
            logger.logError(err);
        }
        rs.push(result.data);
        hitsadded += result.records;
        if (result.recordsTotal > hitsadded) {
            getNextPage(query, getMoreUntilDone);
        } else {
            rs.push(null);
        }
    });
    return rs;
}

MyStream().pipe(zlib.createGzip()).pipe(res);
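On newer Node versions (12+), Readable.from with an async generator gives the same behavior with backpressure handled for you. A sketch, where getResultsAsync and getNextPageAsync are hypothetical promisified versions of the callback APIs above:

const { Readable } = require('stream');
const zlib = require('zlib');

async function* fetchAllPages(queryString) {
    let page = await getResultsAsync(queryString); // hypothetical promisified fetch
    let hitsAdded = 0;
    while (true) {
        yield page.data;
        hitsAdded += page.records;
        if (hitsAdded >= page.recordsTotal) break;
        page = await getNextPageAsync(queryString);
    }
}

Readable.from(fetchAllPages(queryString))
    .pipe(zlib.createGzip())
    .pipe(res);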

How to make the client download a very large file that is generated on the fly

I have an export function that reads the entire database and creates a .xls file with all the records. Then the file is sent to the client.
Of course, exporting the full database takes a lot of time and the request will soon end in a timeout error.
What is the best solution to handle this case?
I heard something about making a queue with Redis, for example, but this would require two requests: one to start the job that generates the file and a second to download the generated file.
Is this possible with a single request from the client?
Excel Export:
Use streams. The following is a rough idea of what might be done.
Use the exceljs module, because it has a streaming API aimed at this exact problem.
var Excel = require('exceljs')
Since we are trying to initiate a download, write the appropriate headers to the response:
res.status(200);
res.setHeader('Content-disposition', 'attachment; filename=db_dump.xls');
res.setHeader('Content-type', 'application/vnd.ms-excel');
Create a workbook backed by the streaming Excel writer. The stream given to the writer is the server response:
var options = {
stream: res, // write to server response
useStyles: false,
useSharedStrings: false
};
var workbook = new Excel.stream.xlsx.WorkbookWriter(options);
Now the output streaming flow is all set up. For the input streaming, prefer a DB driver that gives query results/cursor as a stream.
Define an async function that dumps one table to one worksheet:
var tableToSheet = function (name, done) {
    var str = dbDriver.query('SELECT * FROM ' + name).stream();
    var sheet = workbook.addWorksheet(name);

    str.on('data', function (d) {
        sheet.addRow(d).commit(); // format object if required
    });
    str.on('end', function () {
        sheet.commit();
        done();
    });
    str.on('error', function (err) {
        done(err);
    });
};
Now, let's export some DB tables, using the async module's mapSeries:
async.mapSeries(['cars', 'planes', 'trucks'], tableToSheet, function (err) {
    if (err) {
        // log error
    }
    res.end();
});
CSV Export:
For CSV export of a single table/collection, the fast-csv module can be used:
// response headers as usual
res.status(200);
res.setHeader('Content-disposition', 'attachment; filename=mytable_dump.csv');
res.setHeader('Content-type', 'text/csv');
// create csv stream
var csv = require('fast-csv');
var csvStr = csv.createWriteStream({headers: true});
// open database stream
var dbStr = dbDriver.query('SELECT * from mytable').stream();
// connect the streams
dbStr.pipe(csvStr).pipe(res);
You are now streaming data from DB to HTTP response, converting it into xls/csv format on the fly. No need to buffer or store the entire data in memory or in a file.
You do not have to send the whole file at once; you can send it in chunks (line by line, for example). Just use res.write(chunk) and then res.end() to mark the response as completed, as in the sketch below.
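A minimal sketch of that idea (fetchRows is a hypothetical async iterator over your records):

async function csvRouteHandler(req, res) {
    res.setHeader('Content-Type', 'text/csv');
    res.setHeader('Content-disposition', 'attachment; filename=dump.csv');
    for await (const row of fetchRows()) { // hypothetical async row source
        res.write(row.join(',') + '\n');   // one chunk per line
    }
    res.end(); // marks the download as complete
}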
You can either send the file as a stream, writing each individual chunk as it gets created via res.write(chunk), or, if sending the file chunk by chunk is not an option and you have to wait for the entire file before sending anything, you can keep the connection open by setting the timeout duration to Infinity or any value you think is high enough to allow the file to be created. Then set up a function that creates the .xls file and either:
1) accepts a callback that receives the data output as an argument once it is ready, sends that data, and then closes the connection, or
2) returns a promise that resolves with the data output once it is ready, allowing you to send the resolved value and close the connection just like with the callback version.
It would look something like this:
function xlsRouteHandler(req, res) {
    res.setTimeout(Infinity) || res.socket.setTimeout(Infinity); // either works

    // callback version
    createXLSFile(...fileCreationArguments, function (finishedFile) {
        res.end(finishedFile);
    });

    // promise version
    createXLSFile(...fileCreationArguments)
        .then(finishedFile => res.end(finishedFile));
}
If you still find yourself concerned about timing out, you can always set an interval timer to dispatch an occasional res.write() message to prevent a timeout on the server connection, and then cancel that interval once the final file content is ready to be sent, as sketched below.
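A rough sketch of that keep-alive idea, reusing the hypothetical createXLSFile from above (note the caveat in the comments):

function xlsRouteHandlerWithKeepAlive(req, res) {
    // Caveat: these padding bytes end up in the downloaded file, so this
    // trick only suits formats that tolerate leading whitespace.
    const keepAlive = setInterval(() => res.write('\n'), 15000);

    createXLSFile(...fileCreationArguments) // hypothetical, as above
        .then(finishedFile => {
            clearInterval(keepAlive); // stop padding before the real payload
            res.end(finishedFile);
        });
}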
Refer to this link, which uses Jedis (a Redis Java client).
The key to this is the RPOPLPUSH command.
https://blog.logentries.com/2016/05/queuing-tasks-with-redis/
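The same pattern translates to Node, very roughly (node-redis v4 client; queue names and the export function are illustrative, and LMOVE is the modern replacement for RPOPLPUSH):

const { createClient } = require('redis');

async function runQueueDemo() {
    const client = createClient();
    await client.connect();

    // request handler: enqueue the export job and return immediately
    await client.lPush('export:pending', JSON.stringify({ userId: 42 }));

    // worker: atomically move a job to the in-progress list and process it
    const job = await client.lMove('export:pending', 'export:processing', 'RIGHT', 'LEFT');
    if (job) {
        await generateExportFile(JSON.parse(job)); // hypothetical export function
        await client.lRem('export:processing', 1, job); // done; remove from in-progress
    }
}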

Upload a binary encoded audio file via ajax and save

I have an audio file saved locally that I want to read, upload to a server via ajax and then store on the server. Somewhere along this process the file gets corrupted such that the file that's saved on the server cannot be played.
I'll list simplified bits of code that show the process I'm going through so hopefully it'll be evident where I'm going wrong.
1) After audio is recorded (using getUserMedia and MediaRecorder), a local file is saved:
var audioData = new Blob(chunks, { type: 'audio/webm' });

var fileReader = new FileReader();
fileReader.onloadend = function() {
    var buffer = this.result,
        uint8Array = new Uint8Array(buffer);
    fs.writeFile('path/to/file.webm', uint8Array, { flags: 'w' });
};
fileReader.readAsArrayBuffer(audioData);
2) Later this local file is read and sent to a server (using the library axios to send the ajax request)
fs.readFile('path/to/file.webm', 'binary', (err, data) => {
    var formData = new FormData();
    formData.append('file', new Blob([data], { type: 'audio/webm' }), 'file.webm');
    axios.put('/upload', formData);
});
3) The server then handles this request and saves the file
[HttpPut]
public IActionResult Upload(IFormFile file)
{
    using (var fileStream = new FileStream("path/to/file.webm", FileMode.Create))
    {
        file.CopyTo(fileStream);
    }
    return Ok();
}
The local audio file can be played successfully however the audio file on the server does not play.
I'm not sure if this is helpful information, but I compared the first few lines of the local file and the server copy in a text editor (Notepad++); they look kinda the same... but different. (Screenshots of both files omitted.) I've tried encoding a myriad of different ways but everything seems to fail. Fingers crossed someone can point me in the right direction here.
The problem was with how I was passing the file contents through from fs.readFile. If I passed a base64-encoded raw buffer from fs.readFile via JSON, converted that to a byte array on the server, and saved that, then I could successfully play it on the server.
fs.readFile('path/to/file.webm', (err, data) => {
    axios.put('/upload', { audioData: data.toString('base64') });
});
[HttpPut]
public async Task<IActionResult> Upload([FromBody]UploadViewModel upload)
{
    var audioDataBytes = Convert.FromBase64String(upload.AudioData);

    using (var memoryStream = new MemoryStream(audioDataBytes))
    using (var fileStream = new FileStream("path/to/file.webm", FileMode.Create))
    {
        await memoryStream.CopyToAsync(fileStream);
    }
    return Ok();
}
Actually, this is a problem of character encoding. You are probably mixing UTF-8 and ISO-8859-1, which causes the file to be corrupted.
You should probably set the charset in the HTML page to the one expected on the server, or perform preliminary checks on the server if you do not know the charset of the data you will receive.
Converting to base64 solves the issue because base64 output only uses characters in the ASCII range.
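For completeness, a minimal sketch of the difference on the Node side (paths are illustrative):

const fs = require('fs');

// Reading with the 'binary' (latin1) encoding yields a JS string; when that
// string is later re-encoded as UTF-8 (which new Blob([...]) does), every
// byte >= 0x80 turns into two bytes, corrupting the audio data.
fs.readFile('path/to/file.webm', 'binary', (err, str) => {
    const corrupted = Buffer.from(str, 'utf8'); // longer than the original file
});

// Reading without an encoding yields a Buffer that preserves the raw bytes,
// and base64 is a safe way to carry those bytes inside JSON.
fs.readFile('path/to/file.webm', (err, buf) => {
    const base64 = buf.toString('base64');
});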
