I'm streaming video on a Raspberry Pi with raspivid and ffmpeg in a Node app. Run from the terminal (without Node), they will stream for hours, but when I use them in Node child processes (I spawn two, one for each command) it works great for a little over 3 minutes and then the stream stops. The child processes are still running and I'm not seeing any errors.
The gist of my code:
const { spawn } = require('child_process')

let camera = spawn('raspivid', args)
let ffmpeg = spawn('ffmpeg', args)

// Forward each chunk from raspivid into ffmpeg's stdin
camera.stdout.on('data', (data) => {
  ffmpeg.stdin.write(data)
})
Any ideas why it is stopping after 3 minutes? Thanks!
Use the .mkv container format.
It happened to me with mp4, and when I switched to mkv it worked.
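For illustration, a minimal sketch of what that switch might look like on the ffmpeg side (the surrounding flags are assumptions; '-f h264' matches raspivid's default raw H.264 output, and only the output container really changes):

let ffmpegArgs = [
  '-f', 'h264',    // raspivid emits raw H.264 on its stdout
  '-i', 'pipe:0',  // read that stream from ffmpeg's stdin
  '-c:v', 'copy',  // remux without re-encoding
  'video.mkv'      // Matroska output instead of .mp4
]
let ffmpeg = spawn('ffmpeg', ffmpegArgs)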
I used to do this with omxplayer on the Raspberry Pi, and it worked fine:
const ChildProcess = require("child_process");

// Run the full command string in a shell; log any error
ChildProcess.exec(command, function (err) {
  if (err !== null) {
    console.log("" + err);
  }
});
I could pass any omxplayer-related arguments within the command string, without issues.
With VLC on the other hand, executing cvlc [pathToAudioFile] doesn't do anything.
Oh, and I tried the exact same command, i.e. using the same file path, from the CLI and it played the audio file perfectly.
So how do I start VLC (using cvlc) from within NodeJS and pass arguments to it as well?
The reason the cvlc [pathToFile] command didn't produce any output was incorrect permissions on the media file. chmod-ing that file to 777 solved the problem.
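For the original question of how to start cvlc with arguments from Node, a minimal sketch (pathToAudioFile is a placeholder, the --play-and-exit flag is optional, and capturing stderr is what would have surfaced the permission error):

const { spawn } = require("child_process");

// Hypothetical invocation; any cvlc flags can go in the args array
const vlc = spawn("cvlc", ["--play-and-exit", pathToAudioFile]);

// Without this, errors from VLC disappear silently
vlc.stderr.on("data", (chunk) => console.error(chunk.toString()));
vlc.on("close", (code) => console.log("cvlc exited with code", code));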
I'm running the following code using Node and I always get only about 80% of what stdout is supposed to return when the file is past 12 KB or so. If the file is 240 KB it will still output only 80% of it. If it's under 12 KB it will output completely.
When I open a cmd and run the command manually I always get the full output.
I tried exec and execFile, and I tried increasing the max buffer or changing the encoding, but that's not the issue. I tried adding the options {shell: true, detached: true}, but in vain. In fact, when I run it as detached it seems to run completely fine, as it does open an actual console, but then I'm not able to retrieve the stdout when the process is completed.
const spawn = require('child_process').spawn;
const process = spawn(
  'C:\\Users\\jeanphilipped\\Desktop\\unrtf\\test\\unrtf.exe',
  ['--html', 'C:\\Users\\jeanphilipped\\Desktop\\unrtf\\test\\tempaf45.rtf'],
);
let chunks = [];
process.stdout.on('data', function (msg) {
  chunks = [...chunks, ...msg];
});
process.on('exit', function (msg) {
  const buffer = Buffer.from(chunks);
  console.log(buffer.toString());
});
Any clues? It seems to be Node, since when I run the command manually everything works fine.
According to the Node.js documentation, all of the child_process commands are asynchronous, so when you try to access your chunks variable there is no guarantee that your command has finished; it may still be running. It is recommended that you wrap your entire child_process call so it can be awaited with async/await.
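A minimal sketch of that wrapping, reusing the unrtf.exe paths from the question; it resolves on the 'close' event, which fires only after the stdio streams have finished:

const { spawn } = require('child_process');

function run(cmd, args) {
  return new Promise((resolve, reject) => {
    const child = spawn(cmd, args);
    const chunks = [];
    child.stdout.on('data', (msg) => chunks.push(msg));
    child.on('error', reject);
    // 'close' fires after stdout has ended, unlike 'exit'
    child.on('close', () => resolve(Buffer.concat(chunks).toString()));
  });
}

// Usage
run('C:\\Users\\jeanphilipped\\Desktop\\unrtf\\test\\unrtf.exe',
    ['--html', 'C:\\Users\\jeanphilipped\\Desktop\\unrtf\\test\\tempaf45.rtf'])
  .then((html) => console.log(html));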
I'm building an Electron application (Node.js) which needs to spawn gcloud app deploy from the application with realtime feedback (stdin/stdout/stderr).
I quickly switched from child_process to execa because I had some issues on Mac OS X with the child_process buffer, which is limited to 200 KB (and gcloud app deploy sends some big chunks of output > 200 KB, which crashes the command).
Now, with execa everything seems to work normally on OSX but not on Windows.
The code looks something like this:
const which = require('which')  // https://github.com/npm/node-which
const execa = require('execa')

let bin = `gcloud${/^win/.test(process.platform) ? '.cmd' : ''}`
which(bin, (err, fullpath) => {
  let proc = execa(fullpath, ['app', 'deploy'], {
    cwd: appPath
  })
  proc.stdout.on('data', data => {
    parseDeploy(data.toString())
  })
  proc.stderr.on('data', data => {
    parseDeploy(data.toString())
  })
  proc.then(() => {
    ...
  }).catch(e => {
    ...
  })
})
This code works perfectly on Mac OS X, but I don't get the same result on Windows.
I have tried lots of things:
execa()
execa.shell()
the option shell: true
setting maxBuffer to 1 GB (just in case)
detached: true, which works, BUT I can't read stdout/stderr in realtime in the application, as it opens a new cmd.exe that has no interaction with the Node.js application
lots of child_process variants.
I have made a GIST to show the responses I get for some tests I have done on Windows with basic Child Process scripts:
https://gist.github.com/thyb/9b53b65c25cd964bbe962d8a9754e31f
I also opened an issue on execa repository: https://github.com/sindresorhus/execa/issues/97
Has anyone else run into this issue? I've searched around and found nothing promising except this reddit thread, which doesn't solve the problem.
Behind the scenes, gcloud.cmd runs a Python script. After reading tons of Node.js issues involving child_process, Python and Windows, I came across this thread: https://github.com/nodejs/node-v0.x-archive/issues/8298
There are some known issues with running Python scripts from a Node.js child process.
They talk in this comment about an unbuffered option for Python. After updating the shell script in gcloud.cmd to add the -u option, I noticed everything was working as expected.
This page explains how to set that option as an environment variable instead, so the Windows shell script doesn't have to be modified directly: https://docs.python.org/2/using/cmdline.html#envvar-PYTHONUNBUFFERED
So adding PYTHONUNBUFFERED to the environment variables fixes this issue!
execa(fullpath, ['app', 'deploy'], {
  cwd: appPath,
  // Keep the existing environment and force Python's output to be unbuffered
  env: Object.assign({}, process.env, {
    PYTHONUNBUFFERED: true
  })
})
I am writing a CLI tool for a node.js app. Some of the commands have to run npm and show the results. This is what I have so far:
import {spawn} from 'child_process';
let projectRoot = '...';
let npm = (process.platform === "win32" ? "npm.cmd" : "npm"),
    childProcess = spawn(npm, ["install"], { cwd: projectRoot });
childProcess.stdout.pipe(process.stdout);
childProcess.stderr.pipe(process.stderr);
childProcess.on('close', (code) => {
  // Continue the remaining operations
});
The command does run fine and outputs the results (or errors). However, it doesn't give me a live feed with the progress bar, etc. It waits until the entire operation is over and then dumps the output into the console.
I've tried different variations of the spawn configuration but I can't get it to show me the live feed.
I am on Windows 10 and use node.js 4 and npm 3.
As discussed in the comments: Run the spawn with { stdio: 'inherit' }.
However, a good question is why the 'manual piping' doesn't do the same. I think that's because npm uses a 'fancy progress bar'. It probably deals with stdout in some special way that doesn't play well with piping to process.stdout. If you try some other long-running command (such as find ./), your way of piping works fine.
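A minimal sketch of the suggested change, using the same spawn call as in the question with only the stdio option added; with 'inherit' the child writes straight to the parent's terminal, so there is nothing to pipe manually:

import { spawn } from 'child_process';

let npm = (process.platform === "win32" ? "npm.cmd" : "npm");
let childProcess = spawn(npm, ["install"], {
  cwd: projectRoot,
  stdio: 'inherit'   // child shares the parent's stdin/stdout/stderr
});

childProcess.on('close', (code) => {
  // Continue the remaining operations
});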
I'm trying to play raw PCM data, which comes from socket.io (using Node.js), with ALSA. I recently used node-speaker, which solved my problem, but I can't install it on my target device. Now I'm trying to do it with the Node.js "fs" module:
...
var pcm = fs.createWriteStream("audio.pcm");
socket.on('stream', function (data) {
  console.log(data);
  pcm.write(data);
});
....
Afterwards, I run the aplay command immediately:
aplay -t raw -c 1 -r 48000 -f S16_LE audio.pcm
I'm able to hear my data with a delay (2-3 seconds, depending on how quickly I ran the above command), but it crashes after 5-10 seconds without any message. I guess this is not the right way to play live PCM. What is the best way to do that?
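For reference, here is the same playback step expressed in Node rather than run manually; this is only a sketch of the setup described above (the args mirror the aplay command shown), not a fix for the crash:

const { spawn } = require('child_process');

// Same invocation as the manual command above, started from the Node app
const aplay = spawn('aplay', [
  '-t', 'raw', '-c', '1', '-r', '48000', '-f', 'S16_LE', 'audio.pcm'
]);

aplay.stderr.on('data', (msg) => console.error(msg.toString()));
aplay.on('close', (code) => console.log('aplay exited with code', code));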