I have a Node.js script that plays an mp3 file with mpg321, triggered by an HTTP request, on my RasPi 3B, and I want it to keep running even after the Pi reboots.
I'm able to play an mp3 file as a background job with the forever start command, and I can also run a simple script that doesn't involve mp3 after rebooting via a crontab entry. However, even though everything else works fine, the mp3 sound is always missing after a reboot.
Does anyone know a way to get around this issue?
Node.js script:
var mpg321 = require('mpg321');
var filepath = "./audio/beep-01a.mp3";
var player = mpg321().remote();
// Infinite loop: replay the file every time it finishes
player.play(filepath);
player.on('end', function () {
    console.log('end');
    player.play(filepath);
});
Crontab settings:
@reboot /usr/bin/forever start /home/pi/Documents/nodejs/index.js
I found the cause, which is interesting.
A relative file path doesn't resolve correctly when the script is started at boot, even though it works perfectly when you start the script yourself from a terminal, because the working directory at boot time is not the script's directory. So every path used in your script needs to be an absolute path.
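One way to do that without hard-coding the full path (a minimal sketch, assuming the audio folder sits next to the script; this is not from the original post) is to build the absolute path from __dirname:
var path = require('path');
var mpg321 = require('mpg321');

// Resolve the mp3 relative to the script's own directory, so it works
// no matter what working directory cron/forever starts the process in.
var filepath = path.join(__dirname, 'audio', 'beep-01a.mp3');
var player = mpg321().remote();

player.play(filepath);
player.on('end', function () {
    player.play(filepath);
});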
Hope it helps someone who runs into the same problem in the future.
I have a Node.js program running as a background process with nssm (I build an executable and then have it managed by nssm).
My program needs to open a desktop app when the user requests it. The app is launched fine, but the problem is that it is launched as a background process as well!
I want the app to open on the desktop because the user has to interact with it.
Here is the code that launches the program (I tried running the exec directly as well, to no avail):
const batchfile = `
start "PROGRAM" "${this.EXEC_PROGRAM_PATH + '\\' + this.EXEC_NAME}" ${this.ARGS.join(' ')}
`
fs.writeFileSync('launch.bat', batchfile);
// execute the batch file
const program = child.execFile('launch.bat');
If I run the Node.js program directly (not through nssm) it works, but I need the program to run in the background as it needs to be ready at startup.
Thank you for your help
I'm trying to create a process where, when it detects my smart watch connecting, my Raspberry Pi (4b 2GB) automatically downloads new podcast episodes and then transfers them to my watch. I have managed to create a udev rule to execute a shell script on detection, but it seems to skip the most vital commands (upodder and mtp-sendfile). Code below:
#!/usr/bin/env sh
PATH=$PATH:/home/pi/.local/bin:/home/pi/.vscode-server/bin/485c41f9460bdb830c4da12c102daff275415b53/bin:/home/pi/.local/bin:/home/pi/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/games:/usr/games
echo "Watch connected at $(date)" >>/tmp/scripts.log
# Download new podcasts
/home/pi/.local/bin/upodder
# Transfer podcasts
/usr/bin/mtp-sendfile ~/Downloads/podcasts/WFH_Beckham_Sweden_and_Bango.mp3 /Podcasts
echo "Upodder should've run by now">>/tmp/scripts.log
I know that the file runs because /tmp/scripts.log is created and updated, but the podcasts aren't downloaded or transferred. upodder is a Linux podcast downloader and mtp-tools is a way to transfer files over MTP. It's pretty clear that the issue is with these two commands, but I can't seem to find the solution anywhere. Thanks in advance.
Edit: Iruvar's suggestion has helped, but now I get the error:
Unable to determine user's home directory, MTPZ disabled.
Device 0 (VID=091e and PID=4c05) is UNKNOWN in libmtp v1.1.16.
Please report this VID/PID and the device model to the libmtp development team
~/Downloads/podcasts/WFH_Beckham_Sweden_and_Bango.mp3: stat: No such file or directory
Seems the issue is with the home directory.
My requirement is to embed a terminal in my React-Electron app, so that any command I can run from bash can also be run in the embedded terminal.
Suppose I want to run 'npm install'; I want that to be possible through my embedded terminal too. Could anyone suggest possible solutions?
I'm not exactly sure, but I bet you can create an interface with a text input, get the content from it, and use some Node.js function to run that content (which should be a command). Then just print the result on the screen.
You can use the exec function from the "child_process" module, like this:
const { exec } = require("child_process");
exec("ls");
For more details, you can check here: https://nodejs.org/api/child_process.html#child_process_child_process_exec_command_options_callback
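To get the command's output back so you can print it on the screen, a minimal sketch could look like this (the callback signature is the standard child_process exec API; runCommand and the console.log calls are illustrative — in an Electron app you would forward the text to your renderer instead):
const { exec } = require("child_process");

// Run a shell command and report its output.
function runCommand(command) {
    exec(command, (error, stdout, stderr) => {
        if (error) {
            console.error('Command failed: ' + error.message);
            return;
        }
        if (stderr) console.error(stderr);
        console.log(stdout);
    });
}

runCommand("npm install");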
I am working on a BBB device running an Angstrom image, and the display panel is a 7-inch LCD cape. I want to start an application as soon as the log-in prompt appears, meaning the application should start automatically right after booting. To achieve this I tried:
1) Putting my script files in /etc/init.d, linking the script into /etc/rc5.d as S99myscript, and then updating rc.d.
But it was not a successful attempt.
2) I changed the /etc/issue file a little bit, and to invoke the script I appended my script file at the end like this:
. /home/root/myscript
// tried it like this also
sh . /home/root/myscript
But this time also I couldn't get my desired result.
What am I missing? Or how can I get the result?
[Screenshot: the service file]
[Screenshot: status of the service file]
I have a gulp.js process using the gulp-phantom plugin that works perfectly on my dev setup (Mac OS X 10.10). However, on my test/prod environment (EC2 Amazon Linux) it doesn't work at all, and it isn't giving any error message or other helpful output; the task just starts and finishes again almost straight away:
Dev environment output:
$ gulp crawlSite
[17:39:19] Using gulpfile ~/Documents/dev/mysite.co.uk/gulpfile.js
[17:39:19] Starting 'crawlSite'...
[17:40:15] Finished 'crawlSite' after 57 s
Test environment output:
$ gulp crawlSite
[17:34:27] Using gulpfile /var/www/html/mysite.co.uk/gulpfile.js
[17:34:27] Starting 'crawlSite'...
[17:34:27] Finished 'crawlSite' after 715 ms
As you can see, on the dev environment the process takes 57 seconds, whereas on test it takes only 715 milliseconds and does not create the files that my phantom script should be creating. My gulp task is very simple:
gulp.task('crawlSite', function() {
return gulp.src("phantom-crawl-website.js")
.pipe(phantom());
});
and my phantom script "phantom-crawl-website.js" file is in the same directory as the gulpfile.js file.
I have checked that all the node modules are installed and that PhantomJS is installed globally on the test environment, and everything checks out OK. If I run:
$ phantomjs phantom-crawl-website.js
from the command prompt on the test environment that works fine and it crawls the site and creates the files.
I have tried using the gulp-phantom "debug" option, but I can never seem to see any output from it. I have tried using gulp-debug as well, as follows:
gulp.task('crawlSite', function() {
return gulp.src("phantom-crawl-website.js")
.pipe(phantom({debug: true}))
.pipe(debug());
});
However, all this does is give me the gulp-phantom output filename ("phantom-crawl-website.txt"). I have also tried writing the gulp-phantom output file in the following way:
gulp.task('crawlSite', function() {
return gulp.src("phantom-crawl-website.js")
.pipe(phantom({debug:true}))
.pipe(gulp.dest("./phantomOutput/"));
});
But all I get from this is a blank file created in the "phantomOutput" directory called "phantom-crawl-website.txt".
Can anyone advise what I am doing wrong and how I can see the PhantomJS debug output so I can work out what the problem is?
Thanks so much in advance.
UPDATE
I've managed to get some output from the gulp-phantom process by adding the following to the gulp-phantom index.js file:
program.stderr.on('data', function (data) {
    console.log('stderr: ' + data);
});
Once this was added I'm now getting the following error message:
stderr: Can't open '/dev/stdin'
But still no luck actually getting it to work.
Found the issue. In the gulp-phantom module there appears to be an error where it passes /dev/stdin to phantomjs in place of the phantom script filename. On Mac OS X /dev/stdin contains the contents of the file, but on Linux phantomjs is denied permission to read it.
To fix it I removed the line that pushes '/dev/stdin' onto the arguments stack and added one a bit further down, in the "through" function call, to pass the full path and filename to the phantomjs process instead.
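Conceptually the change means spawning phantomjs with the script's real path rather than handing it /dev/stdin. A minimal, self-contained sketch of that idea (not the exact gulp-phantom source; the script path and logging are illustrative):
const { spawn } = require('child_process');
const path = require('path');

// Launch phantomjs with the real script path instead of '/dev/stdin',
// which Linux refuses to let the child process read.
const scriptPath = path.resolve('phantom-crawl-website.js');
const program = spawn('phantomjs', [scriptPath]);

program.stdout.on('data', (data) => console.log('stdout: ' + data));
program.stderr.on('data', (data) => console.log('stderr: ' + data));
program.on('close', (code) => console.log('phantomjs exited with code ' + code));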
I will issue a pull request to the gulp-phantom module creator and see if they accept this as fix for the issue.