How can you pipe a readable stream to gulp.dest()?

I've got the following sample code which attempts to pipe a stream to gulp.dest():
var gulp = require('gulp');
var stream = require('stream');
var readable = new stream.Readable;
readable.push('Hello, world!');
readable.push(null);
readable
.pipe(gulp.dest('./test.txt'));
This code produces the following error:
path.js:146
throw new TypeError('Arguments to path.resolve must be strings');
^
TypeError: Arguments to path.resolve must be strings
at Object.win32.resolve (path.js:146:13)
at DestroyableTransform.saveFile [as _transform] (C:\paylocity\expense\node_modules\gulp\node_modules\vinyl-fs\lib\dest\index.js:36:26)
at DestroyableTransform.Transform._read (C:\paylocity\expense\node_modules\gulp\node_modules\vinyl-fs\node_modules\through2\node_modules\readable-stream\lib\_stream_transform.js:184:10)
at DestroyableTransform.Transform._write (C:\paylocity\expense\node_modules\gulp\node_modules\vinyl-fs\node_modules\through2\node_modules\readable-stream\lib\_stream_transform.js:172:12)
at doWrite (C:\paylocity\expense\node_modules\gulp\node_modules\vinyl-fs\node_modules\through2\node_modules\readable-stream\lib\_stream_writable.js:237:10)
at writeOrBuffer (C:\paylocity\expense\node_modules\gulp\node_modules\vinyl-fs\node_modules\through2\node_modules\readable-stream\lib\_stream_writable.js:227:5)
at DestroyableTransform.Writable.write (C:\paylocity\expense\node_modules\gulp\node_modules\vinyl-fs\node_modules\through2\node_modules\readable-stream\lib\_stream_writable.js:194:11)
at Readable.ondata (_stream_readable.js:540:20)
at Readable.emit (events.js:107:17)
at Readable.read (_stream_readable.js:373:10)
I can't, however, see what exactly is wrong with the code. If I replace gulp.dest() with process.stdout then it works and the gulp.dest() works within the context of other calls. What is a viable way of doing this?

Gulp works with Vinyl streams:
var gulp = require('gulp'),
    stream = require('stream'),
    source = require('vinyl-source-stream');
var readable = new stream.Readable();
readable.push('Hello, world!');
readable.push(null);
readable
    .pipe(source('test.txt'))
    .pipe(gulp.dest('.'));
A Gulp stream normally begins with a file or files as its source, so you need to wrap your readable stream in a Vinyl stream. That lets Gulp and any gulp plugin read file metadata from it, such as the filename (faked here, obviously), which avoids the error you're seeing.
So Gulp streams are streams of files; just check the source...

Related

Redirect Readable object stdout process to file in node

I use an NPM library to parse markdown to HTML like this:
var Markdown = require('markdown-to-html').Markdown;
var md = new Markdown();
...
md.render('./test', opts, function(err) {
md.pipe(process.stdout)
});
This outputs the result to my terminal as intended.
However, I need the result inside my node program while it runs. I thought about writing the output stream to a file and reading it back in later, but I can't figure out how to write the output to a file instead.
I tried playing around with var file = fs.createWriteStream('./test.html');, but Node.js streams give me more headaches than results.
I've also looked into the library's repo and Markdown inherits from Readable via util like this:
var util = require('util');
var Readable = require('stream').Readable;
util.inherits(Markdown, Readable);
Any resources or advice would be highly appreciated. (I would also take another library for parsing the markdown, but this gave me the best results so far)
Actually, creating a writable file stream and piping the markdown into it should work just fine. Try it with:
const writeStream = fs.createWriteStream('./output.html');
md.render('./test', opts, function(err) {
  if (err) throw err;
  md.pipe(writeStream);
});
// in case of errors you should handle them
writeStream.on('error', function (err) {
  console.log(err);
});

What does calling a function inside pipe, return in gulp?

I know the pipe function in Node.js. A readable stream calls the pipe function with the first argument being the writable stream. Something like:
readable.pipe(fs.createWriteStream('file.txt'));
This will pipe all the output to file.txt. But I have not understood this in the context of gulp.
What does a call to a pipe function like:
gulp.src('./assets/styles/**/*.scss')
.pipe(sass());
mean? Here is the full snippet:
var gulp = require('gulp');
var gutil = require('gulp-util');
// require sass
var sass = require('gulp-ruby-sass');
gulp.task('sass', function () {
gulp.src('./assets/styles/**/*.scss')
.pipe(sass())
.pipe(gulp.dest('./assets/styles'));
});
gulp.task('default', ['sass']);
I understand the dest part. But do not understand the pipe(sass()) part. What does it do? What stream does each of these functions return?
Note: I have taken the sample example from a blog
The pipe in gulp is exactly the same as the pipe in Node.
This flow streams the source files from .src(), creates a new stream, and pipes it through the sass plugin; the sass plugin then compiles all the Sass files to CSS and streams them on to the destination path as a new stream.

How can process.stdin be used as the start point for a gulp task?

I'm using gulp to convert SCSS into CSS code with the gulp-sass plugin. This is all working fine, but I also want to use gulp to receive input (SCSS code) from a Unix pipe (i.e. read process.stdin) and consume this and stream the output to process.stdout.
From reading around, process.stdin is a ReadableStream, and vinyl seems like it could wrap stdin and then be used onwards in a gulp task, e.g.
gulp.task('stdin-sass', function () {
process.stdin.setEncoding('utf8');
var file = new File({contents: process.stdin, path: './test.scss'});
file.pipe(convert_sass_to_css())
.pipe(gulp.dest('.'));
});
However, when I do this I get an error:
TypeError: file.isNull is not a function
This makes me think that stdin is somehow special, but the official documentation for node.js states that it is a true ReadableStream.
So I got this to work by processing process.stdin and writing to process.stdout:
var buffer = require('vinyl-buffer');
var source = require('vinyl-source-stream');
var through = require('through2');
gulp.task('stdio-sass', function () {
  process.stdin.setEncoding('utf8');
  process.stdin.pipe(source('input.scss'))
    .pipe(buffer())
    .pipe(convert_sass_to_css())
    .pipe(stdout_stream());
});
var stdout_stream = function () {
  process.stdout.setEncoding('utf8');
  return through.obj(function (file, enc, complete) {
    process.stdout.write(file.contents.toString());
    this.push(file);
    complete();
  });
};

node.js zlib returning 'Check Headers' error when gunzipping stream

So I am working on the nodeschool.io stream-adventure tutorial track and I'm having trouble with the last problem. The instructions say:
An encrypted, gzipped tar file will be piped in on process.stdin. To beat this
challenge, for each file in the tar input, print a hex-encoded md5 hash of the
file contents followed by a single space followed by the filename, then a
newline.
You will receive the cipher name as process.argv[2] and the cipher passphrase as
process.argv[3]. You can pass these arguments directly through to
`crypto.createDecipher()`.
The built-in zlib library you get when you `require('zlib')` has a
`zlib.createGunzip()` that returns a stream for gunzipping.
The `tar` module from npm has a `tar.Parse()` function that emits `'entry'`
events for each file in the tar input. Each `entry` object is a readable stream
of the file contents from the archive and:
`entry.type` is the kind of file ('File', 'Directory', etc)
`entry.path` is the file path
Using the tar module looks like:
var tar = require('tar');
var parser = tar.Parse();
parser.on('entry', function (e) {
console.dir(e);
});
var fs = require('fs');
fs.createReadStream('file.tar').pipe(parser);
Use `crypto.createHash('md5', { encoding: 'hex' })` to generate a stream that
outputs a hex md5 hash for the content written to it.
This is my attempt so far to work on it:
var tar = require('tar');
var crypto = require('crypto');
var zlib = require('zlib');
var map = require('through2-map');
var cipherAlg = process.argv[2];
var passphrase = process.argv[3];
var cryptoStream = crypto.createDecipher(cipherAlg, passphrase);
var parser = tar.Parse(); //emits 'entry' events per file in tar input
var gunzip = zlib.createGunzip();
parser.on('entry', function(e) {
e.pipe(cryptoStream).pipe(map(function(chunk) {
console.log(chunk.toString());
}));
});
process.stdin
.pipe(gunzip)
.pipe(parser);
I know it's not complete yet, but my issue is that when I try to run this, the input never gets piped to the tar file parsing part. It seems to hang up on the piping to gunzip. This is my exact error:
events.js:72
throw er; // Unhandled 'error' event
^
Error: incorrect header check
at Zlib._binding.onerror (zlib.js:295:17)
I'm totally stumped because the node documentation for Zlib has no mention of headers except for when it has examples with the http/request modules. There are a number of other questions regarding this error with node, but most use buffers rather than streams, so I couldn't find a relevant answer to my problem. All help is greatly appreciated
I actually figured it out: I was supposed to decrypt the stream before unzipping it.
So instead of:
process.stdin
.pipe(gunzip)
.pipe(parser);
it should be:
process.stdin
.pipe(cryptoStream)
.pipe(gunzip)
.pipe(parser);

node.js tar module, 'entry' readable stream

How should I use the 'entry' readable stream from the tar module to pipe its content without getting a stream error in pipe?
This is to get a hint for the last stream-adventure exercise.
I'm not looking for an answer, but a hint or advice.
Here is my code:
var zlib = require('zlib');
var tar = require('tar');
var crypto = require('crypto');
var through = require('through');
var unzip = zlib.createGunzip();
var parser = tar.Parse();
var stream = process.stdin.pipe(crypto.createDecipher(process.argv[2], process.argv[3])).pipe(unzip);
var md5 = crypto.createHash('md5', { encoding: 'hex' });
parser.on('entry', function(entry) {
if (entry.type === 'File') {
entry.pipe(md5).pipe(process.stdout);
console.log(entry.path);
}
});
unzip.pipe(parser);
here is the output:
$> stream-adventure run app
97911dcc607865d621029f6f927c7851
stream.js:94
throw er; // Unhandled stream error in pipe.
^
Error: write after end
at writeAfterEnd (_stream_writable.js:130:12)
at Hash.Writable.write (_stream_writable.js:178:5)
at Entry.ondata (stream.js:51:26)
at Entry.EventEmitter.emit (events.js:117:20)
at Entry._read (/home/n0t/stream-adventure/secretz/node_modules/tar/lib/entry.js:111:10)
at Entry.write (/home/n0t/stream-adventure/secretz/node_modules/tar/lib/entry.js:68:8)
at Parse._process (/home/n0t/stream-adventure/secretz/node_modules/tar/lib/parse.js:104:11)
at BlockStream.<anonymous> (/home/n0t/stream-adventure/secretz/node_modules/tar/lib/parse.js:46:8)
at BlockStream.EventEmitter.emit (events.js:95:17)
at BlockStream._emitChunk (/home/n0t/stream-adventure/secretz/node_modules/tar/node_modules/block-stream/block-stream.js:145:10)
and with the verify:
$> stream-adventure verify app
ACTUAL: "97911dcc607865d621029f6f927c7851"
EXPECTED: "97911dcc607865d621029f6f927c7851 secretz/METADATA.TXT"
ACTUAL: null
EXPECTED: "2cdcfa9f8bbefb82fb7a894964b5c199 secretz/SPYING.TXT"
ACTUAL: null
EXPECTED: ""
# FAIL
You get this error because entry writes into the md5 stream after it has been closed. Once a stream is closed, you can't write into it again: for md5 this is easy to understand, because the internal buffers would have to be reset, otherwise the hash would be skewed.
In your example, for each file in the tar archive, you pipe the file stream into the same md5 stream. You just have to pipe each file stream into a new MD5 stream; here is how you can do it properly:
parser.on('entry', function(entry) {
  if (entry.type === 'File') {
    var md5 = crypto.createHash('md5', { encoding: 'hex' });
    entry.pipe(md5).pipe(process.stdout);
    console.log(entry.path);
  }
});
