panel launcher
mate-terminal -e pactl.sh
pactl.sh
a PulseAudio hardware switching utility
calls a test video file; also run at system startup
runs fine, but the called script's background process terminates
thx.sh (the called script)
mplayer -quiet -noborder /home/caltrop/.videos/THX.mp4 > /dev/null 2>&1 &
disown
this runs fine everywhere except when run with an app launcher
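For what it's worth, a minimal sketch of one way thx.sh might be detached from the launcher's short-lived terminal, assuming setsid is available (the path and mplayer options are copied from the question; the stdin redirect is an addition):
#!/bin/sh
# thx.sh - start the test video in its own session so it no longer belongs
# to the process group of the terminal that the panel launcher spawns.
setsid mplayer -quiet -noborder /home/caltrop/.videos/THX.mp4 > /dev/null 2>&1 < /dev/null &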
Related
I want to do a bash script with these requirements:
REQUIREMENTS:
The script:
launches, in parallel, the execution of several processes
waits for the processes to finish execution before continuing its own execution.
Each process:
takes a relative long time to finish execution
should output its logs both to its own child gnome-terminal window and to its own log file, separate from the terminal windows and log files of the other processes and from the terminal window used by the parent script.
should not finish even if the user closes the corresponding child gnome-terminal window. That means it should keep running and outputting logs to its log file even if the user closes the child gnome-terminal window or presses Ctrl+C in it.
MY TRY:
I have tried to implement this with the code below:
nohup gnome-terminal --wait -- bash -c "cd /home; find . -name 'foo*' | tee /home/$USER/log-home.txt" &
nohup gnome-terminal --wait -- bash -c "cd /home/$USER; find . -name 'foo*' | tee /home/$USER/log-$USER.txt" &
wait;
echo "Finished";
The code above does not meet the requirements, though:
If I close a child gnome-terminal window, the process running within it stops outputting logs to the corresponding log file. I guess this process just finishes running.
If I close all the child gnome-terminal windows launched from the parent script, the parent script continues running the script lines after the wait command.
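A minimal sketch of one arrangement that might satisfy the requirements, assuming the workers are decoupled from the terminals entirely: each worker runs as a background child of the parent script and writes only to its log, each gnome-terminal merely tails that log, and the parent waits on the worker PIDs rather than on the terminals (log paths and find commands are taken from the attempt above; the tail-based viewers are an addition):
#!/bin/bash
# Each worker is a plain background child of this script and writes only to
# its own log file, so closing a viewing window cannot kill it.
bash -c 'cd /home && find . -name "foo*"' > "/home/$USER/log-home.txt" 2>&1 &
pid1=$!
bash -c "cd /home/$USER && find . -name 'foo*'" > "/home/$USER/log-$USER.txt" 2>&1 &
pid2=$!
# One terminal per worker that only follows the log; closing the window or
# pressing Ctrl+C there kills the tail, not the worker.
gnome-terminal -- bash -c "tail -f '/home/$USER/log-home.txt'" &
gnome-terminal -- bash -c "tail -f '/home/$USER/log-$USER.txt'" &
# Wait for the workers themselves, not for the terminals.
wait "$pid1" "$pid2"
echo "Finished"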
I am building a REST API with Java. When the endpoint gets called, I spawn a shell with Runtime.getRuntime().exec("My_script.sh").
My script then proceeds to ssh into a server, create some files, and most importantly... it needs to run a script that sits on that server to process the files.
I run the command with
nohup myscript.sh > /dev/null 2>&1 &
I also added sleep 5 because I read that nohup may take a second to get things started.
The script runs until the shell disconnects and dies.
So as of now
java application launches a shell
the bash script initiates an ssh connection and creates a subshell that runs a secondary script
while in the subshell, with our second script, we create some files and then use nohup to start the processing script
subshell exits and terminates
main shell exits and terminates
script we called disconnects and never finishes running
Some things I have tried
I have tried calling the script from the ssh connection,
pseudo example
ssh user@host "nohup script.sh > /dev/null 2>&1 &"
also tried with different quoting
ssh user@host "nohup script.sh > /dev/null" 2>&1 &
Another thing I could potentially do is keep the initial shell open, so that the script won't die when it disconnects... because it never will.
I have looked into other options like screen, but I don't think it will be useful if the machine that initiated the script gets completely terminated after it starts.
Things to note
I don't have control over the Linux box I ssh into, so I cannot install packages on that machine. I can, however, install packages on my end.
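One approach that might work under those constraints, assuming nothing needs to be installed on the remote box: have the remote side detach the script itself, and keep ssh's stdin out of the way so the connection can close cleanly (host, script path, and log location are placeholders):
# -n keeps ssh's stdin away from the remote command; on the remote side nohup
# plus redirecting stdin, stdout and stderr detaches the script from the SSH
# session, so it keeps running after the connection (and the Java-spawned
# shell) goes away.
ssh -n user@host "nohup /path/to/script.sh > /tmp/script.log 2>&1 < /dev/null &"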
I want to run "screen" on a debian linode server, starting up over a ssh terminal window. I'd like a shell script to start and detach a screen, so that a process can continue when I log off. I'd also like the logging file screenlog.0 to be produced, so that there's a record if the process crashes.
But there's a problem in getting the log file to write. Locally, on a mac terminal window,
% screen -dm -L sh -c 'echo hello'
works fine, "hello" gets written to screenlog.0. But the same command issued in a ssh window to the server executes, but nothing gets written.
However, if in that window I go into screen,
% screen -L
and then do some stuff, the activity is written to screenlog.0 (on the server).
What am I missing?
It turns out that the screen command can have problems. The above command sends no output to screenlog.0 under Debian GNU/Linux 9 (stretch), while Ubuntu 14.04.1 LTS writes the odd message "error: could not start server! try running as root!" to screenlog.0, even when running as root. Linux Mint 18.1 and macOS run it correctly.
I was advised to use the venerable Unix command nohup to solve my problem of detaching a process and logging its output even when the ssh connection is closed. Ordinarily, when you close a terminal window, the signal SIGHUP is sent to any processes that were started there. But
% nohup myprog > logfile.txt &
works perfectly. Old way, good way.
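A slightly fuller sketch along the same lines, assuming the aim is still a detached process plus a crash record; the program name, log path, and PID file are placeholders:
# Detach the long-running program from the SSH session and capture both
# stdout and stderr so there is a record if it crashes.
nohup myprog > ~/logfile.txt 2>&1 &
echo $! > ~/myprog.pid   # remember the PID so the job can be checked or stopped later
# After logging back in, follow the log with:
tail -f ~/logfile.txt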
I'm pretty new to Linux / Raspberry Pi.
I want to run a command from a shell script in a new shell window, since commands like "cvlc music.mp3" (VLC player) would block the shell until playback has finished.
Therefore it would be nice to hand the playback command off to another shell.
Is this correct?
gnome-terminal and lxterminal don't seem to be an option for the distribution.
For testing purposes I created two dummy shell scripts:
[start.sh]
#!/bin/sh
lxterminal \
  --title="MyScriptWindow" \
  -e "bash -c './exe.sh; bash'"
[exe.sh]
#!/bin/sh
echo "Hello World"
[output]
root@raspberrypi:/home/pi# ./start.sh
(lxterminal:1315): Gtk-WARNING **: cannot open display:
If I've understood correctly, you are doing all this only because you want the shell to be released while your cvlc command executes.
You only need to detach it from the shell's standard output and run it as a background process:
nohup cvlc music.mp3 &
Is this enough?
You could also run the program in the background:
$> ./test.sh &
Or use screen.
Using these commands you won't block your shell.
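A small sketch combining the two suggestions, assuming the only goal is to keep the shell free while VLC plays (file name, log path, and PID file are illustrative):
#!/bin/sh
# Play the file in the background, detached from the shell, and keep the PID
# so playback can be stopped later.
nohup cvlc music.mp3 > /tmp/cvlc.log 2>&1 &
echo $! > /tmp/cvlc.pid
# The shell is free immediately; stop playback later with:
#   kill "$(cat /tmp/cvlc.pid)"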
I'm trying to execute a firmware upgrade while my program is running from inittab. My program runs two commands: one to extract the installer script from the tarball and the other to execute the installer script. In my code I'm using the system() function call. The two command strings are below:
system ( "tar zvxf tarball.tar.gz -C / installer.sh 2>&1" );
system( "nohup installer.sh tarball >/dev/null 2>&1 &" );
The installer script requires the tarball as an argument. I've tried using sudo but I still have the same problem. I've tried nohup with no success. The installer script has to kill my program when doing the firmware upgrade, but the installer script needs to stay alive.
If my program is run from the command line or rc.local, on my target device, my upgrade works fine, i.e. when my program is killed my installer script continues.
But I need to run my program from /etc/inittab so it can respawn if it dies. To stop my program, the installer script comments out (hashes out) its inittab entry and executes "telinit q". This is where my program dies (which is what I want), but it also kills my installer script.
Does anyone know why this is happening and what can I do to solve it?
Thanks in advance.
My guess as to what happens here is that init sends the SIGTERM/SIGKILL not only to the process but to the whole process group. It does this to ensure that all children of a process are properly cleaned up. When your program calls system(), it internally does a fork()/exec(). This newly forked process is in the same process group as your program, so it also gets killed.
You could try to run your installer script in a new session by doing a
system( "setsid nohup installer.sh tarball >/dev/null 2>&1 &" );
If your system doesn't provide the setsid commandline utility you can simply write your own. setsid is just a small wrapper around the setsid() system call.
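A quick way to check the diagnosis, assuming a standard procps ps: compare process group and session IDs with and without setsid ("myprogram" is a placeholder for the respawned program's name). Without setsid the installer shares the program's PGID, so the group-wide signal triggered by telinit q takes it down too; with setsid it gets its own session and survives.
# Show PID, parent, process group and session ID for the respawned program
# and the installer it launched.
ps -eo pid,ppid,pgid,sid,comm | grep -E 'myprogram|installer'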