nodejs exec ffmpeg, callback is fired before ffmpeg finishes - node.js

I run an ffmpeg command via a Node.js script:
var exec = require('child_process').exec;
exec('ffmpeg -with -some -options somefile.mp4', function(error, stdout, stderr) {
  // some processing
});
I've noticed that sometimes the callback is fired even before ffmpeg finishes. How can I make sure the callback processing runs only after ffmpeg finishes? And what may have caused the callback to fire before the process was done?

Instead of exec, use child_process.spawn. You can then attach callbacks for different events.
const { spawn } = require('child_process');

this.ffmpeg = spawn("ffmpeg",
  "-f dshow -some -other -options somefile.mp4".split(" "),
  { shell: true });
this.ffmpeg.on('error', (error) => {
  console.log("ffmpeg error: " + error);
});
this.ffmpeg.stderr.on('data', (data) => {
  console.error(`stderr: ${data}`);
});
this.ffmpeg.on('exit', (code, signal) => {
  console.log("ffmpeg shutdown: code " + code + ", signal " + signal);
});

This is my approach:
const cp = require('child_process');

downloadHLS = function (uri, opath) {
  var self = this;
  // defaults
  var loglevel = self.logger.isDebug() ? 'debug' : 'warning';
  return new Promise((resolve, reject) => {
    const args = [
      '-y',
      '-loglevel', loglevel,
      '-i', uri,
      opath
    ];
    const opts = {
      cwd: self._options.tempDir
    };
    const child = cp.spawn('ffmpeg', args, opts);
    child.stderr.on('data', (data) => {
      self.logger.warn(`stderr: ${data}`);
    });
    child.on('message', msg => self.logger.info(msg));
    child.on('error', reject);
    child.on('exit', (code, signal) => {
      self.logger.info("downloadHLS [exit] code:%s signal:%s", code, signal);
    });
    child.on('close', () => {
      self.logger.info("downloadHLS [close]");
      return resolve();
    });
  });
}//downloadHLS
The event that signals ffmpeg is fully done is close: it fires once the process has ended and its stdio streams have closed. The exit event fires when the process itself terminates, possibly while output is still buffered, so resolve on close.
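As a minimal sketch of the difference (file names are placeholders):
const { spawn } = require('child_process');

const child = spawn('ffmpeg', ['-i', 'in.mp4', 'out.mp4']);
// 'exit': the process terminated, but stdio may still hold buffered data
child.on('exit', (code, signal) => console.log('exit:', code, signal));
// 'close': the stdio streams have closed too; this is the safe point to resolve
child.on('close', (code) => console.log('close:', code));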

Related

How to use promisify() with the spawn() function for the 'child_process'?

I have the following code example, and I have no idea how to resolve this using util.promisify() ONLY! Not Promise!
const child_process = require('child_process');

const spawn = child_process.spawn('docker', ['--version']);
spawn.stdout.on('data', (data) => {
  process.stdout.write(data);
});
spawn.on('error', (error) => {
  process.stderr.write(String(error));
  process.exit(1);
});
The second code example works fine.
const { promisify } = require('util');
const child_process = require('child_process');

const promisifiedExecFile = promisify(child_process.execFile);

async function test() {
  const version = await promisifiedExecFile('docker', ['--version']);
  console.log(version);
}
test();
I couldn't quickly find out why the promisify function does not work properly with the spawn function. But you can create your own promisify wrapper like this:
TS
import { spawn as spwn } from 'child_process';

const spawn = (
  cmd: string,
  args: ReadonlyArray<string>,
) => new Promise<string>((resolve, reject) => {
  const cp = spwn(cmd, args);
  const error: string[] = [];
  const stdout: string[] = [];
  cp.stdout.on('data', (data) => {
    stdout.push(data.toString());
  });
  cp.on('error', (e) => {
    error.push(e.toString());
  });
  cp.on('close', () => {
    if (error.length) reject(error.join(''));
    else resolve(stdout.join(''));
  });
});

(async () => {
  try {
    const stdOut = await spawn('docker', ['--version']);
    console.log('stdOut: ', stdOut);
  } catch (error) {
    console.log('error:', error);
    process.exit(1);
  }
})();
JS
const { spawn: spwn } = require('child_process');

const spawn = (
  cmd,
  args,
) => new Promise((resolve, reject) => {
  const cp = spwn(cmd, args);
  const error = [];
  const stdout = [];
  cp.stdout.on('data', (data) => {
    stdout.push(data.toString());
  });
  cp.on('error', (e) => {
    error.push(e.toString());
  });
  cp.on('close', () => {
    if (error.length) reject(error.join(''));
    else resolve(stdout.join(''));
  });
});

(async () => {
  try {
    const stdOut = await spawn('docker', ['--version']);
    console.log('stdOut: ', stdOut);
  } catch (error) {
    console.log('error: ', error);
    process.exit(1);
  }
})();
Node.js' built-in util package has a promisify() function that converts callback-based functions to promise-based functions. This lets you use promise chaining and async/await with callback-based APIs.
I don't think we can use promisify() with the spawn() function.
For example, we can use promisify() with execFile() instead of spawn():
// inside a class, with `util` and `child_process` required at the top of the module
async asyncExecFile(tool) {
  const execFile = util.promisify(child_process.execFile);
  return await execFile(tool, ['--version'])
    .catch(() => {
      this.printError(`The "${tool}" doesn't exist in the current environment.\n`);
      process.exit(0);
    });
}
It is not possible because there is nothing to promisify: promisify works on functions that take a callback, and spawn does not take one.
You use spawn by taking the returned ChildProcess and adding listeners to its streams (stdout, stderr, stdio...).
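That said, Node does let a function carry its own promisified form via the util.promisify.custom symbol, so if util.promisify(spawn) absolutely must be used you can attach one first. A minimal sketch; resolving with the collected stdout on 'close' is my own choice, not anything spawn prescribes:
const util = require('util');
const child_process = require('child_process');

// give spawn a custom promisified form, then promisify it as usual
child_process.spawn[util.promisify.custom] = (cmd, args) =>
  new Promise((resolve, reject) => {
    const cp = child_process.spawn(cmd, args);
    let stdout = '';
    cp.stdout.on('data', (data) => { stdout += data; });
    cp.on('error', reject);
    cp.on('close', () => resolve(stdout));
  });

const spawnAsync = util.promisify(child_process.spawn);
// spawnAsync('docker', ['--version']).then(console.log);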
Omar Omeiri's answer is similar to how execFile works inside Node itself, so you can just use the promisified execFile instead (if you need an unlimited buffer, pass maxBuffer: Infinity in the options).
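For instance, a short sketch reusing the question's docker example:
const util = require('util');
const { execFile } = require('child_process');
const execFileAsync = util.promisify(execFile);

(async () => {
  // maxBuffer: Infinity lifts the default cap on captured output
  const { stdout } = await execFileAsync('docker', ['--version'], { maxBuffer: Infinity });
  console.log(stdout);
})();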

How to correctly handle child_process `close` event on try catch block? [duplicate]

I'm using the Bluebird promise library under Node.js, and it's great! But I have a question:
If you take a look at the documentation of Node's child_process.exec and child_process.execFile you can see that both of these functions are returning a ChildProcess object.
So what's the recommended way to promisify such functions?
Note that the following works (I get a Promise object):
var Promise = require('bluebird');
var execAsync = Promise.promisify(require('child_process').exec);
var execFileAsync = Promise.promisify(require('child_process').execFile);
But how can one get access to the original return value of the original Node.js functions? (In these cases I would need to be able to access the originally returned ChildProcess objects.)
Any suggestion would be appreciated!
EDIT:
Here is some example code which uses the return value of the child_process.exec function:
var exec = require('child_process').exec;
var child = exec('node ./commands/server.js');
child.stdout.on('data', function(data) {
  console.log('stdout: ' + data);
});
child.stderr.on('data', function(data) {
  console.log('stderr: ' + data);
});
child.on('close', function(code) {
  console.log('closing code: ' + code);
});
But if I were to use the promisified version of the exec function (execAsync from above), then the return value would be a promise, not a ChildProcess object. This is the real problem I am talking about.
I would recommend using standard JS promises built into the language over an additional library dependency like Bluebird.
If you're using Node 10+, the Node.js docs recommend using util.promisify which returns a Promise<{ stdout, stderr }> object. See an example below:
const util = require('util');
const exec = util.promisify(require('child_process').exec);

async function lsExample() {
  try {
    const { stdout, stderr } = await exec('ls');
    console.log('stdout:', stdout);
    console.log('stderr:', stderr);
  } catch (e) {
    console.error(e); // should contain code (exit code) and signal (that caused the termination).
  }
}
lsExample();
When the command fails, the promise rejects: the error object carries the exit code and signal, and Node also attaches the captured stdout and stderr to it, so check e.stderr first when handling errors.
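For instance (a sketch; the failing command is arbitrary):
const util = require('util');
const exec = util.promisify(require('child_process').exec);

async function failingExample() {
  try {
    await exec('ls ./does-not-exist');
  } catch (e) {
    console.error(e.code);   // exit code of the command
    console.error(e.stderr); // captured error output
  }
}
failingExample();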
It sounds like you'd like to return two things from the call:
the ChildProcess
a promise that resolves when the ChildProcess completes
So "the recommended way to promisify such functions"? Don't.
You're outside the convention. Promise returning functions are expected to return a promise, and that's it. You could return an object with two members (the ChildProcess & the promise), but that'll just confuse people.
I'd suggest calling the unpromisified function, and creating a promise based off the returned childProcess. (Maybe wrap that into a helper function)
This way, it's quite explicit for the next person who reads the code.
Something like:
var Promise = require('bluebird');
var exec = require('child_process').execFile;

function promiseFromChildProcess(child) {
  return new Promise(function (resolve, reject) {
    child.addListener("error", reject);
    child.addListener("exit", resolve);
  });
}

var child = exec('ls');
promiseFromChildProcess(child).then(function (result) {
  console.log('promise complete: ' + result);
}, function (err) {
  console.log('promise rejected: ' + err);
});

child.stdout.on('data', function (data) {
  console.log('stdout: ' + data);
});
child.stderr.on('data', function (data) {
  console.log('stderr: ' + data);
});
child.on('close', function (code) {
  console.log('closing code: ' + code);
});
If you just want to promisify child_process.exec() or child_process.execFile() specifically, in recent Node versions there is a better answer here.
Since Node v12 the built-in util.promisify allows access to the ChildProcess object in the returned Promise for built-in functions where it would have been returned by the un-promisified call. From the docs:
The returned ChildProcess instance is attached to the Promise as a child property.
This correctly and simply satisfies the need to access ChildProcess in the original question and makes other answers out of date providing that Node v12+ can be used.
Adapting the example (and concise style) provided by the questioner, access to the ChildProcess can be achieved like this:
const util = require('util');
const exec = util.promisify(require('child_process').exec);

const promise = exec('node ./commands/server.js');
const child = promise.child;
child.stdout.on('data', function(data) {
  console.log('stdout: ' + data);
});
child.stderr.on('data', function(data) {
  console.log('stderr: ' + data);
});
child.on('close', function(code) {
  console.log('closing code: ' + code);
});
// i.e. you can then await the promisified exec call to complete
const { stdout, stderr } = await promise;
Here's another way:
const { exec } = require('child_process');

function execPromise(command) {
  return new Promise(function(resolve, reject) {
    exec(command, (error, stdout, stderr) => {
      if (error) {
        reject(error);
        return;
      }
      resolve(stdout.trim());
    });
  });
}
Use the function:
execPromise(command).then(function(result) {
  console.log(result);
}).catch(function(e) {
  console.error(e.message);
});
Or with async/await:
try {
  var result = await execPromise(command);
} catch (e) {
  console.error(e.message);
}
There's probably no way to do this nicely that covers all use cases, but for limited cases you can do something like this:
const { exec } = require('child_process');

/**
 * Promisified child_process.exec
 *
 * @param cmd
 * @param opts See child_process.exec node docs
 * @param {stream.Writable} opts.stdout If defined, child process stdout will be piped to it.
 * @param {stream.Writable} opts.stderr If defined, child process stderr will be piped to it.
 *
 * @returns {Promise<{ stdout: string, stderr: string }>}
 */
function execp(cmd, opts) {
  opts || (opts = {});
  return new Promise((resolve, reject) => {
    const child = exec(cmd, opts,
      (err, stdout, stderr) => err ? reject(err) : resolve({
        stdout: stdout,
        stderr: stderr
      }));
    if (opts.stdout) {
      child.stdout.pipe(opts.stdout);
    }
    if (opts.stderr) {
      child.stderr.pipe(opts.stderr);
    }
  });
}
This accepts opts.stdout and opts.stderr arguments, so that stdio can be captured from the child process.
For example:
const stream = require('stream');

execp('ls ./', {
  stdout: new stream.Writable({
    write: (chunk, enc, next) => {
      console.log(chunk.toString(enc));
      next();
    }
  }),
  stderr: new stream.Writable({
    write: (chunk, enc, next) => {
      console.error(chunk.toString(enc));
      next();
    }
  })
}).then(() => console.log('done!'));
Or simply:
execp('ls ./', {
  stdout: process.stdout,
  stderr: process.stderr
}).then(() => console.log('done!'));
Just want to mention that there's a nice tool that will solve your problem completely:
https://www.npmjs.com/package/core-worker
This package makes it a lot easier to handle processes.
import { process } from "core-worker";

// resolves once the process prints "Server is ready." (with a 1000 ms timeout)
const result = await process("node Server.js", "Server is ready.").ready(1000);
// or resolves when the process finishes
const copyResult = await process("cp path/to/file /newLocation/newFile").death();
or combine these functions:
import { process } from "core-worker";

const simpleChat = process("node chat.js", "Chat ready");
setTimeout(() => simpleChat.kill(), 3600000); // wait an hour and close the chat
simpleChat.ready(500)
  .then(console.log.bind(console, "You are now able to send messages."))
  .then(() => simpleChat.death())
  .then(console.log.bind(console, "Chat closed"))
  .catch(() => { /* handle err */ });
Here are my two cents. This uses spawn, which streams the output, writing it to stdout and stderr as it arrives; error and standard output are captured in buffers and returned or rejected.
This is written in TypeScript; feel free to remove the typings if you're using JavaScript:
import { spawn, SpawnOptionsWithoutStdio } from 'child_process';

const spawnAsync = (
  command: string,
  options?: SpawnOptionsWithoutStdio
) =>
  new Promise<Buffer>((resolve, reject) => {
    // naive split; quoted arguments containing spaces are not supported
    const [spawnCommand, ...args] = command.split(/\s+/);
    const spawnProcess = spawn(spawnCommand, args, options);
    const chunks: Buffer[] = [];
    const errorChunks: Buffer[] = [];
    spawnProcess.stdout.on("data", (data) => {
      process.stdout.write(data.toString());
      chunks.push(data);
    });
    spawnProcess.stderr.on("data", (data) => {
      process.stderr.write(data.toString());
      errorChunks.push(data);
    });
    spawnProcess.on("error", (error) => {
      reject(error);
    });
    spawnProcess.on("close", (code) => {
      if (code !== 0) {
        reject(Buffer.concat(errorChunks).toString());
        return;
      }
      resolve(Buffer.concat(chunks));
    });
  });
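A possible usage (the command is just an example):
(async () => {
  const out = await spawnAsync('docker --version');
  console.log('captured:', out.toString());
})();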
Just another example: you might run into issues when running multiple commands if you destructure into the same const names; you can rename the destructured variables like this.
const util = require('util');
const exec = util.promisify(require('child_process').exec);

async function runCommands() {
  try {
    const { stdout, stderr } = await exec('ls');
    console.log('stdout:', stdout);
    console.log('stderr:', stderr);
    const { stdout: stdoutTwo, stderr: stderrTwo } = await exec('ls');
    console.log('stdoutTwo:', stdoutTwo);
    console.log('stderrTwo:', stderrTwo);
    const { stdout: stdoutThree, stderr: stderrThree } = await exec('ls');
    console.log('stdoutThree:', stdoutThree);
    console.log('stderrThree:', stderrThree);
  } catch (e) {
    console.error(e); // should contain code (exit code) and signal (that caused the termination).
  }
}
runCommands();
Here's mine. It doesn't deal with stdin or stdout, so if you need those then use one of the other answers on this page. :)
// promisify `child_process`
// This is a very nice trick :-)
this.promiseFromChildProcess = function (child) {
  return new Promise((resolve, reject) => {
    // the 'error' event delivers an Error object (e.g. the spawn itself failed)
    child.addListener('error', (err) => {
      console.log('ChildProcess error', err);
      reject(err);
    });
    child.addListener('exit', (code, signal) => {
      if (code === 0) {
        resolve(code);
      } else {
        console.log('ChildProcess error', code, signal);
        reject(code);
      }
    });
  });
};
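A possible usage, assuming promiseFromChildProcess is reachable in scope (the command is arbitrary):
const { spawn } = require('child_process');

const child = spawn('ls', ['-la']);
promiseFromChildProcess(child)
  .then((code) => console.log('exited cleanly with code', code))
  .catch((err) => console.log('failed:', err));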

How to catch error while using fork child process in node?

I am using the fork method to spawn a child process in my Electron app. My code looks like this:
'use strict'
const childProcess = require('child_process');
const fixPath = require('fix-path');

let func = () => {
  fixPath();
  const child = childProcess.fork('node /src/script.js --someFlags', {
    detached: true,
    stdio: 'ignore',
  });
  child.on('error', (err) => {
    console.log("\n\t\tERROR: spawn failed! (" + err + ")");
  });
  child.stderr.on('data', function(data) {
    console.log('stdout: ' + data);
  });
  child.on('exit', (code, signal) => {
    console.log(code);
    console.log(signal);
  });
  child.unref();
};
But my child process exits immediately with exit code 1 and a signal. Is there a way I can catch this error? When I use the childProcess.exec method I can catch errors using stdout.on('error', ...). Is there a similar thing for the fork method? If not, any suggestions on how I can work around this?
Setting the option silent: true and then using stderr.on() event handlers, we can catch the error, if any. Please check the sample code below:
let func = () => {
  const child = childProcess.fork(path, args, {
    silent: true,
    detached: true,
  });
  child.on('error', (err) => {
    console.log("\n\t\tERROR: spawn failed! (" + err + ")");
  });
  child.stderr.on('data', function(data) {
    console.log('stderr: ' + data);
  });
  child.on('exit', (code, signal) => {
    console.log(code);
    console.log(signal);
  });
  child.unref();
};
Note that the explicit stdio option from the question has to be dropped: when stdio is provided it overrides silent, and with stdio: 'ignore' the child.stderr stream is null.

child_process.spawn doesn't emit any events

I'm trying to run ripgrep from my Node app and am seeing a strange behavior with child_process.spawn: none of the events fire and the app never finishes (it's stuck somewhere inside the spawn call):
import { spawn } from 'child_process';

async function run() {
  await spawnWrapper('rg', ['-F', '"demo"'], { cwd: __dirname });
}

export function spawnWrapper(command, args, options) {
  return new Promise((resolve, reject) => {
    let stdout = '';
    let stderr = '';
    const child = spawn(command, args, options);
    console.log('spawn wrapper');
    child.on('close', (code, signal) => {
      console.log('close');
      resolve({ code, signal, stdout, stderr });
    });
    child.on('error', (error) => {
      console.log('error');
      (error as any).stderr = stderr;
      reject(error);
    });
    child.on('exit', (code, signal) => {
      console.log('exit');
      resolve({ code, signal, stdout, stderr });
    });
    child.stdout.setEncoding('utf8');
    child.stderr.setEncoding('utf8');
    child.stdout.on('data', (data) => {
      console.log('stdout data');
      stdout += data;
    });
    child.stderr.on('data', (data) => {
      console.log('stderr data');
      stderr += data;
    });
  });
}
I only get "spawn wrapper" in the console, no other events. I've never seen this behavior with other binaries; maybe it's something with ripgrep, but still, shouldn't I be getting at least some hints from Node? Any suggestions on how to debug this?
It was caused by ripgrep waiting for input, which was not obvious to me (on the command line it just executes straight away): with no path argument, rg falls back to reading from the spawned process's piped stdin and blocks until that stream is closed. Details here: https://github.com/BurntSushi/ripgrep/issues/410
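Two possible fixes under that diagnosis (a sketch; the pattern and path are placeholders): pass an explicit search path so rg never falls back to stdin, or close the child's stdin right after spawning.
const { spawn } = require('child_process');

// 1) give rg an explicit path to search
const child = spawn('rg', ['-F', 'demo', '.'], { cwd: __dirname });

// 2) or keep the original arguments and close stdin yourself
// const child = spawn('rg', ['-F', 'demo'], { cwd: __dirname });
// child.stdin.end();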

Differentiate between error and standard terminal log with ffmpeg - nodejs

I'm using ffmpeg in Node.js. Both the standard terminal output and the errors seem to be sent to the same place, so I don't know how to differentiate between error and success... Here's my code:
var child_process = require('child_process');

var convertToMp3 = function(filePath) {
  var ffmpeg = child_process.spawn('ffmpeg', ['-i', filePath, '-y', 'output.mp3']);
  var err = '';
  ffmpeg.stderr
    .on('data', function(c) { err += c; })
    .on('end', function() { console.log('stderr:', err); });
  var d = '';
  ffmpeg.stdout
    .on('data', function(c) { d += c; })
    .on('end', function() { console.log('stdout', d); });
}
Whether the conversion succeeds or fails, stdout is empty and stderr contains what I'd get if I ran the corresponding command in the terminal.
FFmpeg outputs all of its logging data to stderr, to leave stdout free for piping the output data to some other program or another ffmpeg instance.
When running ffmpeg as an automated process it's often useful to give it the option
-loglevel error
which makes it completely silent in the normal scenario and only outputs the error data (to stderr), which is what you would normally expect from a command-line program.
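A minimal sketch of that setup (file names are placeholders): with -loglevel error, anything that does show up on stderr can be treated as an actual error.
const { spawn } = require('child_process');

const ffmpeg = spawn('ffmpeg', ['-loglevel', 'error', '-i', 'input.mp4', '-y', 'output.mp3']);
let err = '';
ffmpeg.stderr.on('data', (c) => { err += c; });
ffmpeg.on('close', (code) => {
  if (code !== 0 || err) console.error('conversion failed:', err);
  else console.log('conversion succeeded');
});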
Inspired by @aergistal's comment, here is a solution that works for me, where the callback is executed at the end of the task, with the usual signature function(error, success).
var child_process = require('child_process');

var convertToMp3 = function(filePath, callback) {
  var ffmpeg = child_process.spawn('ffmpeg', ['-i', filePath, '-y', 'output.mp3']);
  var err = '';
  ffmpeg.stderr
    .on('data', function(c) { err += c; })
    .on('end', function() { console.log('stderr:', err); });
  ffmpeg.on('exit', function(code) {
    if (code) {
      callback({ code: code, message: err });
    } else {
      callback(null, { success: true, message: err });
    }
  });
}
or in the new JS style:
const convertToMp3 = (filePath) => new Promise((resolve, reject) => {
  const ffmpeg = child_process.spawn('ffmpeg', ['-i', filePath, '-y', 'output.mp3']);
  let output = '';
  ffmpeg.stderr.on('data', c => { output += c; });
  ffmpeg.on('exit', code => {
    if (code) {
      reject({ code: code, message: output });
    } else {
      resolve(output);
    }
  });
});

const ffmpegOutput = await convertToMp3('./video.mp4');
...
