Setup: I am reading data from a database and emailing the dataset as a CSV file. The data are read in chunks of 500 rows, and I'm using ya-csv to write the records as CSV to a stream. I then want to use mailgun-js to email the file as an attachment.
Option 1 (what I don't want to do):
create temp file;
create write stream to that file;
write all CSV records;
read all of it back into memory to attach to an email;
Option 2 (what I want to do but don't quite know how to):
create a writable stream;
create a readable stream;
somehow pipe the writes from the writable stream into the readable stream;
pass the writable stream to ya-csv;
pass the readable stream to mailgun;
fetch data and write to the write stream until there's no more data;
end the write stream, thus ending the read stream and sending the email.
I've been reading https://github.com/substack/stream-handbook and https://nodejs.org/api/stream.html, and the problem is that I can't simply call writable.pipe(readable).
I have tried using a Duplex stream (i.e. making both the write and read streams just Duplex streams), but this doesn't work, as Duplex is an abstract class and I'd have to implement several of the linking parts myself.
Question: how do I use streams to link up this writing of CSV records to streaming an attachment to mailgun?
Don't overthink it: mailgun-js can take a stream as an attachment, so it can be as easy as:
var csv = require('csv');
// mailgun-js exports a factory; instantiate it with your credentials
var mailgun = require('mailgun-js')({ apiKey: 'YOUR_API_KEY', domain: 'YOUR_DOMAIN' });

// this will stream some csv (myDbStream is a readable stream of rows from your database)
var file = myDbStream.pipe(csv.stringify());

var data = {
  from: 'Excited User <me@samples.mailgun.org>',
  to: 'serobnic@mail.ru',
  subject: 'Hello',
  text: 'Testing some Mailgun awesomness!',
  attachment: file // attach it to your message, mailgun should deal with it
};

mailgun.messages().send(data, function (error, body) {
  if (error) return console.error(error);
  console.log(body);
});
I don't know what your Db is; maybe the driver already has support for streams, or you'll have to feed the csv stream manually (which can be done very easily, e.g. with event-stream).
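If you do end up feeding it manually, here is a minimal sketch of that, assuming a hypothetical fetchChunk(offset, callback) helper that returns your 500-row pages as arrays of column values:

var csv = require('csv');

var stringifier = csv.stringify(); // a transform stream: write records in, read csv text out

function pump(offset) {
  // fetchChunk stands in for however your driver pages through 500 rows at a time
  fetchChunk(offset, function (err, rows) {
    if (err) throw err;
    if (rows.length === 0) return stringifier.end(); // no more data: end the stream
    rows.forEach(function (row) {
      stringifier.write(row); // each row is an array of column values
    });
    pump(offset + rows.length);
  });
}
pump(0);

// stringifier is now the readable stream to hand to mailgun as the attachment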
Edit
ya-csv doesn't seem to be easily pipeable; csv works better.
Related
I'm building a file sharing application with WebRTC and Node.js. It is a command line application, so there will be no HTML involved. I'm reading the file as a stream and sending it, then at the receiver's side I'll download the file. Here's how I'll be writing the sender's code:
// code taken from https://github.com/coding-with-chaim/file-transfer-final/blob/master/client/src/routes/Room.js
const reader = stream.getReader();
reader.read().then(obj => {
  handlereading(obj.done, obj.value);
});
// recursive function for sending out chunks of stream
function handlereading(done, value) {
  if (done) {
    peer.write(JSON.stringify({ done: true, fileName: file.name }));
    return;
  }
  peer.write(value);
  reader.read().then(obj => {
    handlereading(obj.done, obj.value);
  });
}
On the receiver's side I'll be converting the incoming file (stream) to a Blob, but people online are saying that there will be an issue of backpressure if the file is too large. How should I write the file downloading code to avoid backpressure, so that it doesn't crash the receiver's side due to buffer overflow? Or should there be another approach to downloading the file?
You want to listen for the bufferedamountlow event (via onbufferedamountlow) after setting bufferedAmountLowThreshold.
You will want to put all your logic on the sender side; the receiver doesn't have any control. I think MDN is your best resource; I didn't find any good single article on this.
I do have an example in Pion here, but that is in Go. It's the same concept though, so hopefully it's helpful!
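For what it's worth, a minimal JavaScript sketch of that sender-side throttling, assuming channel is the underlying RTCDataChannel (how you get at it depends on your WebRTC wrapper):

// fire the bufferedamountlow event once the queue drains below 64 KB
channel.bufferedAmountLowThreshold = 64 * 1024;
const HIGH_WATER_MARK = 1024 * 1024; // stop writing above 1 MB queued

function sendChunk(value, next) {
  if (channel.bufferedAmount > HIGH_WATER_MARK) {
    // too much queued: wait for the buffer to drain before writing more
    channel.onbufferedamountlow = () => {
      channel.onbufferedamountlow = null;
      channel.send(value);
      next();
    };
  } else {
    channel.send(value);
    next();
  }
}

In the question's handlereading, you would then call sendChunk(value, () => reader.read().then(obj => handlereading(obj.done, obj.value))) instead of writing unconditionally.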
I set up my Discord bot using Node.js. For convenience, I would like to store some data in an external file, but I don't seem to be able to access it from my index.js file (the main bot file).
I've tried keeping a static array in external .js/.json files, but I can only retrieve undefined/empty values. Additionally, when I tried with a .txt file, once I retrieved the content I was unable to call functions such as string.split() on it.
Did I miss something in the package contents, perhaps?
Assuming the data you are storing is in UTF-8 encoding:
var fs = require('fs');
fs.readFile('path/to/file', 'utf8', function(err, contents) {
  // code using file data
});
Assuming no errors, contents will be a string of the data that is inside that file. Because you passed 'utf8', it's a string rather than a Buffer, so string methods such as split() will work on it.
https://code-maven.com/reading-a-file-with-nodejs
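For example, a quick sketch (the path and file contents are hypothetical):

var fs = require('fs');

fs.readFile('path/to/data.txt', 'utf8', function (err, contents) {
  if (err) throw err;
  // contents is a string here, so split() works
  var lines = contents.split('\n');
  console.log(lines);
});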
I'm trying to create a Read Stream from a remote file without writing it to disc.
var file = fs.createWriteStream('Video.mp4');
var request = http.get('http://url.tld/video.mp4', function(response){
  response.pipe(file);
});
Can I create a Read Stream directly from an HTTP response without writing it to disc? Maybe by buffering it in chunks and converting that to a readable stream?
Seems like you can use the request module.
Have a look at 7zark7's answer here: https://stackoverflow.com/a/14552721/7189461
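For instance, a minimal sketch (the URL is a placeholder): request() itself returns a readable stream, so you can pipe it anywhere without touching the disc, e.g. straight through an HTTP server:

var http = require('http');
var request = require('request');

http.createServer(function (req, res) {
  // proxy the remote file; nothing is written to disc
  request('http://url.tld/video.mp4').pipe(res);
}).listen(3000);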
I'm trying to download a large XML file and parse it using xml-stream library. I'm using request to download the file and it's capable of streaming the content of the file. Ideally, I'd like to pass that stream directly to xml-stream and have it parsed. But I can't figure out how to connect those two.
Here is the code I have so far:
request('http://example.com/data.xml').pipe(fs.createWriteStream('data.xml'));
...
var stream = fs.createReadStream('data.xml');
var xml = new XmlStream(stream);
Is it possible to connect them directly without a temp data.xml file?
request() returns a Readable stream, so just pass that return value to XmlStream():
var xml = new XmlStream(request('http://example.com/data.xml'));
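Put together, a minimal sketch (the URL and the 'item' element name are placeholders for your actual feed):

var request = require('request');
var XmlStream = require('xml-stream');

// parse the XML as it downloads; no temp file needed
var xml = new XmlStream(request('http://example.com/data.xml'));

xml.on('endElement: item', function (item) {
  console.log(item); // fires once per closing </item> tag as it streams in
});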
I've never used streams in Node.js, so I apologize in advance if this is trivial.
I'm using the ya-csv library to create a CSV. I use a line like this:
csvwriter = csv.createCsvStreamWriter(process.stdout)
As I understand it, this takes a writable Stream and writes to it when I add a record.
I need to use this CSV as an email attachment.
From nodemailer's docs, here is how to do that:
attachments: [
  { // stream as an attachment
    fileName: "text4.txt",
    streamSource: fs.createReadStream("file.txt")
  }
]
As I understand it, this takes a readable Stream and reads from it.
Therein lies the problem. I need a readable Stream, I need a writable Stream, but at no point do I have a Stream.
It would be nice if ya-csv had a:
csvwriter = csv.createReadableCsvStream()
But it doesn't. Is there some built-in stream that makes available for reading whatever is written to it? I've looked for a library with no success (though there are a few things that could work but seem like overkill).
You can use a PassThrough stream for that:
var PassThrough = require('stream').PassThrough;

var stream = new PassThrough();
var csvwriter = csv.createCsvStreamWriter(stream);
Now you can read from stream whatever is written to it.
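To close the loop with the nodemailer snippet from the question, a minimal sketch (the file name and record are placeholders):

var PassThrough = require('stream').PassThrough;
var csv = require('ya-csv');

var stream = new PassThrough();
var csvwriter = csv.createCsvStreamWriter(stream);

var mailOptions = {
  // ...from, to, subject, etc.
  attachments: [
    {
      fileName: 'report.csv',
      streamSource: stream // nodemailer reads whatever ya-csv writes
    }
  ]
};

// write your records, then end the stream so the attachment completes
csvwriter.addRecord(['col1', 'col2']);
stream.end();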