I'm trying to build an extension that uses Chrome native messaging to communicate with youtube-dl through a Node.js host script. I can successfully parse stdin from the extension and run a simple child process (e.g. touch file.dat), but when I try to exec/spawn youtube-dl, the host hangs on the command. The host script works fine when I run it independently of Chrome's native input. I suspect the problem has something to do with the 1 MB limit on Chrome native messaging buffers. Is there a way around it?
#! /usr/bin/env node
"use strict";
const fs = require('fs');
const exec = require('child_process').exec; // exec, not execSync: a callback is passed below

const dlPath = '/home/toughluck/Music';

let first = true;
let buffers = [];

process.stdin.on('readable', () => {
  let chunk = process.stdin.read();
  if (chunk !== null) {
    if (first) {
      chunk = chunk.slice(4); // drop the 4-byte native-messaging length header
      first = false;
    }
    buffers.push(chunk);
  }
});

process.stdin.on('end', () => {
  const res = Buffer.concat(buffers);
  const url = JSON.parse(res).url;
  const outTemplate = `${dlPath}/%(title)s.%(ext)s`;
  const cmdOptions = {
    shell: '/bin/bash'
  };
  const cmd = `youtube-dl --extract-audio --audio-format mp3 -o "${outTemplate}" ${url}`;
  // const args = ['--extract-audio', '--audio-format', 'mp3', '-o', outTemplate, url];
  // const cmd2 = 'youtube-dl';
  process.stderr.write('Suck it chrome');
  process.stderr.write('stderr doesnt stop host');
  exec(cmd, cmdOptions, (err, stdout, stderr) => {
    if (err) throw err;
    process.stderr.write(stdout);
    process.stderr.write(stderr);
  });
  process.stderr.write('\n Okay....');
});
The full codebase can be found at https://github.com/wrleskovec/chrome-youtube-mp3-dl
So I was right about what was causing the problem: the 1 MB limit on a single message from the host to Chrome. You can avoid it by redirecting stdout/stderr to a file:
const cmd = `youtube-dl --extract-audio --audio-format mp3 -o "${outTemplate}" ${url} &> d.txt`;
This worked for me. To be honest, I'm not entirely sure why the message is considered to be over 1 MB; if someone can give a better explanation, that would be great.
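For context: everything the host writes to stdout is interpreted by Chrome as native-messaging data, and a single message from a native host to Chrome may not exceed 1 MB, so any stray output on that channel can break it. If the host does need to reply to the extension, the reply must be framed with the same 4-byte length header that the script above strips from the incoming message. A minimal sketch of such a reply, assuming a little-endian platform and a hypothetical { status } payload:

const sendMessage = (msg) => {
  const payload = Buffer.from(JSON.stringify(msg));
  const header = Buffer.alloc(4);
  header.writeUInt32LE(payload.length, 0); // 4-byte length prefix
  process.stdout.write(header);
  process.stdout.write(payload);
};

sendMessage({ status: 'done' });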
When I run a small Node program and break its output through a pipe, subsequent terminal output remains hidden and I'm forced to run reset (which works every time). How should I recover correctly after a broken pipe so I can avoid reset?
The program:
const { unmarshall } = require("@aws-sdk/util-dynamodb");
const fs = require('fs');

(async () => {
  const input = fs.readFileSync(process.argv[2], 'utf-8');
  const records = JSON.parse(input);
  if (records.Items) {
    records.Items = records.Items.map((a) => unmarshall(a));
  }
  process.stdout.on('error', function (err) {
    if (err.code === 'EPIPE') {
      process.exit(0);
    }
  });
  process.stdout.write(JSON.stringify(records, undefined, 2));
})();
When I run the program like this and quit less with the q keystroke (exiting the JS program and breaking the pipe), subsequent terminal output is hidden until I run reset:
$ node example.js dynamo_output.json | less
# no terminal output is visible
$ reset
# output is restored
This appears to work:
const fs = require('fs');

const writeStdoutSync = (str) => {
  fs.writeSync(process.stdout.fd, str);
};
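The final write in the program then becomes the following; fs.writeSync fails synchronously, so a broken pipe surfaces as a catchable exception rather than an asynchronous 'error' event. The try/catch shape is a sketch, not part of the original answer:

try {
  writeStdoutSync(JSON.stringify(records, undefined, 2));
} catch (err) {
  if (err.code !== 'EPIPE') throw err; // a broken pipe is expected here; exit quietly
}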
I'm writing a CLI in Node, and I want to open the user's $EDITOR to edit data that is read from a stream (an HTTP response IncomingMessage). How can I send the data to a file descriptor?
In bash I could write this:
$EDITOR <(curl $url)
or
$DIFF <(curl $url_1) <(curl $url_2)
<(curl $url) expands to something like /proc/self/fd/11
echo <(curl $url)
/proc/self/fd/11
But how would I write it in JavaScript?
import cp from 'child_process'

const first = request(...);
const second = request(...);
const first_fd = ???;
const second_fd = ???;
const proc = cp.spawn(process.env.DIFF, [first_fd, second_fd], { stdio: 'inherit' });
Okay, if the stream is backed by a socket or fd you can pass it to options.stdio, but what if it isn't? What if it's a transform stream? The docs for options.stdio say:
Stream object: Share a readable or writable stream that refers to a tty, file, socket, or a pipe with the child process. The stream's underlying file descriptor is duplicated in the child process to the fd that corresponds to the index in the stdio array. The stream must have an underlying descriptor (file streams do not until the 'open' event has occurred).
Yes, I could create a temp file, but can I do it without one?
You can stream downloaded content into the vim text editor in your terminal with the following Node.js code:
const { spawn } = require('child_process');
const request = require('request');

request({
  url: 'https://google.com'
}, function (err, res, body) {
  const vi = spawn('vi', ['-'], { stdio: ['pipe', process.stdout, process.stderr] });
  vi.stdin.write(body);
  vi.stdin.end();
});
Then, if you execute this code from your terminal, it will download Google's HTML and let you edit it in vim, where you can save it to a file with :w myfile.txt.
Further reading on this matter: https://2ality.com/2018/05/child-process-streams.html
const { spawnSync } = require('child_process');

const string_1 = 'foo';
const string_2 = 'foobar';

const command = 'diff';
const args = [
  '--unified',
  `<(echo "${string_1}")`,
  `<(echo "${string_2}")`,
];
const options = {
  'shell': '/bin/bash',
};

const diff = spawnSync(command, args, options);
console.log(diff.stdout.toString());
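For the transform-stream case raised in the question, one approach on Linux/macOS is to hand the child extra pipes and re-expose them as paths. A caveat: Node's extra stdio 'pipe' entries are typically socket pairs on Unix, which many tools cannot reopen via /dev/fd directly, so the hypothetical sketch below dups them through bash process substitution, which is backed by real pipes:

const cp = require('child_process');

// Hypothetical sketch: expose two extra pipes to the child as fds 3 and 4,
// then let bash process substitution turn them into /dev/fd paths that a
// tool like diff can open. Works with any Readable, including transforms.
function diffStreams(streamA, streamB) {
  const child = cp.spawn(
    'bash',
    ['-c', `${process.env.DIFF || 'diff'} <(cat <&3) <(cat <&4)`],
    { stdio: ['inherit', 'inherit', 'inherit', 'pipe', 'pipe'] }
  );
  streamA.pipe(child.stdio[3]);
  streamB.pipe(child.stdio[4]);
  return child;
}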
I am trying to program a converter that can take any video source and convert it to mp3. The mp3 should be saved on my hard drive, or in a buffer to be sent via Telegram. It works well so far; the only problem is that it can only take one video at a time, and I don't know why.
// IMPORTS
var fs = require('fs');
var https = require('https');
var child_process = require('child_process');

// EVENTEMITTER (not used so far)
var util = require('util');
var EventEmitter = require('events').EventEmitter;

// STREAMHANDLER
var StreamHandler = function(url, name){
  // VARIABLES
  self = this;
  this.url = url;
  this.name = name;

  // CREATE FFMPEG PROCESS
  var spawn = child_process.spawn;
  var args = ['-i', 'pipe:0', '-f', 'mp3', '-ac', '2', '-ab', '128k', '-acodec', 'libmp3lame', 'pipe:1'];
  this.ffmpeg = spawn('ffmpeg', args);

  // GRAB STREAM
  https.get(url, function(res) {
    res.pipe(self.ffmpeg.stdin);
  });

  // WRITE TO FILE
  this.ffmpeg.stdout.pipe(fs.createWriteStream(name));

  // DEBUG
  this.ffmpeg.stdout.on("data", function (data) {
    console.error(self.name);
  });
}
util.inherits(StreamHandler, EventEmitter);

// TESTING (vidUrl is assumed to be defined elsewhere)
var test1 = new StreamHandler(vidUrl, "test1.mp3");
test1.ffmpeg.on("exit", function (code, name, signal) {
  console.log("Finished: " + test1.name);
});
var test2 = new StreamHandler(vidUrl, "test2.mp3");
test2.ffmpeg.on("exit", function (code, name, signal) {
  console.log("Finished: " + test2.name);
});
It skips test1.mp3 and only converts test2.mp3, even though two ffmpeg processes were created. After test2.mp3 is converted, the other ffmpeg process stays open but does nothing, and the Node program gets stuck waiting (I guess) for it to send something.
I hope someone can help me :)
Using your code, I had the same problem: it would hang at the end and only output data for the test2.mp3 file. I'm not exactly sure what caused the problem, but I changed it a bit and this works for me:
// IMPORTS
var fs = require('fs');
//var https = require('https');
var http = require('http');
var child_process = require('child_process');

// EVENTEMITTER (not used so far)
var util = require('util');
var EventEmitter = require('events').EventEmitter;

// These never change...
var spawn = child_process.spawn;
var args = ['-i', 'pipe:0', '-f', 'mp3', '-ac', '2', '-ab', '128k', '-acodec', 'libmp3lame', 'pipe:1'];

// STREAMHANDLER
var StreamHandler = function(url, name){
  // CREATE FFMPEG PROCESS
  var ffmpeg = spawn('ffmpeg', args);

  // GRAB STREAM
  http.get(url, function(res) {
    res.pipe(ffmpeg.stdin);
  });

  // WRITE TO FILE
  ffmpeg.stdout.pipe(fs.createWriteStream(name));
  ffmpeg.on("exit", function() {
    console.log("Finished:", name);
  });

  // DEBUG
  ffmpeg.stdout.on("data", function(data) {
    console.error(name, "received data");
  });
}
util.inherits(StreamHandler, EventEmitter);

// TESTING
var vidUrl = 'http://www.sample-videos.com/video/mp4/720/big_buck_bunny_720p_1mb.mp4';
var test1 = new StreamHandler(vidUrl, "test1.mp3");
var test2 = new StreamHandler(vidUrl, "test2.mp3");
I am using http instead of https because I didn't have a sample video at an https URL available; it shouldn't make a difference. I moved the spawn and args variables out of the constructor because they never change. I also don't use this to store the local variables; I just use a normal closure instead. Finally, I moved the exit event handling code inside the constructor. I think it's better to group all that stuff together, plus it's only declared once rather than once per process you create.
Running this gives me the following output (I saved the script as ffmpeg.js):
$ node ffmpeg.js
test2.mp3 received data
Finished: test2.mp3
test1.mp3 received data
Finished: test1.mp3
Also, just a tip: if you want to use the this object inside StreamHandler, I would recommend using arrow functions if your version of Node supports them. This code also works:
var StreamHandler = function(url, name){
  // CREATE FFMPEG PROCESS
  this.ffmpeg = spawn('ffmpeg', args);

  // GRAB STREAM
  http.get(url, (res) => {
    res.pipe(this.ffmpeg.stdin);
  });

  // WRITE TO FILE
  this.ffmpeg.stdout.pipe(fs.createWriteStream(name));
  this.ffmpeg.on("exit", () => {
    console.log("Finished:", name);
  });

  // DEBUG
  this.ffmpeg.stdout.on("data", (data) => {
    console.error(name, "received data");
  });
}
Notice that with arrow functions I don't have to use var self = this;. Avoiding that is pretty much the reason arrow functions were added to JavaScript.
Hope this helps!
-- EDIT --
Ok, I figured it out. The problem in your code is this line:
self = this;
It should be:
var self = this;
Without the var specifier, you are creating a global variable. So the second time you call new StreamHandler, you overwrite the self variable. That's why the test1.mp3 file hangs and test2.mp3 is the only one that finishes. With var added, your original script now works for me.
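A stripped-down illustration of the same bug, as a hypothetical example:

function Handler(name) {
  self = this; // no `var`: assigns to a global shared by every call
  this.name = name;
  setTimeout(function () {
    console.log(self.name); // reads whichever instance was constructed last
  }, 100);
}

new Handler('first');
new Handler('second'); // both timeouts print "second"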
I am using wkhtmltopdf to generate PDFs in Node.js. Below is my sample code to generate a PDF:
var wkhtmltopdf = require('wkhtmltopdf')
  , createWriteStream = require('fs').createWriteStream;

var r = wkhtmltopdf('http://www.google.com', { pageSize: 'letter' })
  .pipe(createWriteStream('C:/MYUSERNAME/demo.pdf'));
r.on('close', function(){
  mycallback();
});
The above code generates corrupt PDFs and I could not figure out the issue. PDFs generated from the command prompt come out correctly; for example, running the following in the Windows command prompt produces a correct PDF:
wkhtmltopdf http://www.google.com demo.pdf
Sadly, when I try to generate a PDF in the Node environment, it comes out corrupt. In case it helps, I'm using wkhtmltopdf 0.11.0 rc2.
Thanks in advance.
The wkhtmltopdf wrapper for Node has a bug on Windows, so you can write a replacement yourself. Like this:
function wkhtmltopdf(input, pageSize) {
  var spawn = require('child_process').spawn;
  var html;
  var isUrl = /^(https?|file):\/\//.test(input);
  if (!isUrl) {
    html = input;
    input = '-';
  }
  var args = ['wkhtmltopdf', '--quiet', '--page-size', pageSize, input, '-'];
  if (process.platform === 'win32') {
    var child = spawn(args[0], args.slice(1));
  } else {
    // piping through cat is a known workaround for wkhtmltopdf's stdout issues on unix
    var child = spawn('/bin/sh', ['-c', args.join(' ') + ' | cat']);
  }
  if (!isUrl) {
    child.stdin.end(html);
  }
  return child.stdout;
}
// usage:
var createWriteStream = require('fs').createWriteStream;

wkhtmltopdf('http://google.com/', 'letter')
  .pipe(createWriteStream('demo1.pdf'));
wkhtmltopdf('<body>hello world!</body>', 'letter')
  .pipe(createWriteStream('demo2.pdf'));
Note: the parameter is now 'letter', not { pageSize: 'letter' }.
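If you need a completion callback like r.on('close', ...) from the original snippet, one option is to listen for the destination write stream's 'finish' event (hypothetical usage of the function above):

wkhtmltopdf('http://google.com/', 'letter')
  .pipe(createWriteStream('demo3.pdf'))
  .on('finish', function () {
    console.log('PDF written');
  });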
I would like to run:
C:\>ACommandThatGetsData > save.txt
but instead of running it in the console and saving the data to a file, I would like to do the above with Node.js. How do I execute a shell command with Node.js?
Use process.execPath():
process.execPath('/path/to/executable');
Update
I should have read the documentation better: process.execPath is just a string holding the path of the Node executable, not a function. What you actually want is the Child Process module, which allows you to execute a child process. You will need either child_process.exec, child_process.execFile, or child_process.spawn. All of these are similar in use, but each has its own advantages; which of them to use depends on your needs.
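For the original question, a minimal sketch using child_process.exec could look like this, with ACommandThatGetsData standing in for whatever command produces the data:

const { exec } = require('child_process');
const fs = require('fs');

// Run the command, then write its captured stdout to save.txt,
// mirroring `ACommandThatGetsData > save.txt`.
exec('ACommandThatGetsData', (err, stdout, stderr) => {
  if (err) throw err;
  fs.writeFileSync('save.txt', stdout);
});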
You could also try the node-cmd package:
const nodeCmd = require('node-cmd');
nodeCmd.get('dir', (err, data, stderr) => console.log(data));
On newer versions of the package, the syntax changed a little:
const nodeCmd = require('node-cmd');
nodeCmd.run('dir', (err, data, stderr) => console.log(data));
I know this question is old, but it helped me get to my solution using promises:
const util = require('util');
const exec = util.promisify(require('child_process').exec);

async function runCommand(command) {
  // the promise rejects if the command fails, so only stdout and stderr remain here
  const { stdout, stderr } = await exec(command);
  if (stderr) { console.error('stderr:', stderr); }
  return stdout;
}

async function myFunction () {
  // your code here building the command you wish to execute ...
  const command = 'dir';
  const result = await runCommand(command);
  console.log("_result", result);
  // your code here processing the result ...
}

// just calling myFunction() here so it runs when the file is loaded
myFunction();
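Since the promisified exec rejects when the command fails, callers that need the failure details can wrap the await in try/catch; the rejection error carries the captured output as stdout and stderr properties. A hypothetical wrapper:

async function safeRun(command) {
  try {
    return await runCommand(command);
  } catch (err) {
    // on failure, err.stdout and err.stderr hold the captured output
    console.error('command failed:', err.message);
    return null;
  }
}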