I tried many things today to get ssh to start a screen session which executes a command. The goal is to run a command on a remote machine, see its output, and be able to detach and reattach later. I want to do this from within a script, without any interaction except detaching the screen session to close it. No satisfying solution so far.
ssh -t ${host} "\
source ~/.bashrc; \
echo \"done.\"; \
cd \"$exedir\"; \
if [ \$? -ne 0 ]; then \
echo \"could not cd into directory\"; \
exit 1; \
fi; \
echo \"executing remotexe.sh ...\"; \
screen -S "remotexe" -t "remotexe" -R "nice -n$prio ./remotexe.sh ${exeparams[#]}";"
Some of the problems I encountered are related to the strange ways commands are passed to screen/ssh/bash, which interfere with arguments and options (I don't quite understand why they don't use -- to treat whatever follows as a command with its arguments). The version above almost works. The remaining difficulty is that commands in remotexe.sh (in particular make) obviously miss exports and definitions from .bashrc. This is why I tried to include source ~/.bashrc. I tried adding similar commands or explicit exports to remotexe.sh, but it behaves as if it were executed by /bin/sh. If I log in with a conventional ssh session I can immediately run the remotexe.sh script without error. I also tried adding shell -$SHELL to my .screenrc.
Where is the mistake in this solution? How can I correct it?
I haven't tested your code at all, and will not vouch for the sanity of this, but you definitely have a quoting error. Try:
ssh -t ${host} "
source ~/.bashrc;
echo done.;
cd \"$exedir\" || exit 1;
echo executing remotexe.sh ...;
screen -S remotexe -t remotexe -R nice -n$prio ./remotexe.sh ${exeparams[@]};"
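One thing that may help with the missing ~/.bashrc exports described in the question (untested against your setup, so treat it as a sketch): have screen start bash explicitly and force it to read ~/.bashrc with -i. The array parameters are left out here for brevity:
# sketch: inside the ssh command string, let screen run an interactive bash
# so that ~/.bashrc (and its exports) is sourced before remotexe.sh starts
screen -S remotexe -t remotexe -R bash -ic "nice -n$prio ./remotexe.sh"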
I ran into an issue last week that is driving me crazy. I wrote a BASH script which makes a remote ssh connection to Akamai and then performs a simple 'ls'. I want to redirect the 'ls' stdout output to a given file.
While the script itself works like a charm when run manually, it does not when run via cron. The cron job runs as root and each command works as expected except the ssh command. My system is Gentoo Linux and cron is the old-but-gold vixie-cron.
Rather than paste all 200 LOC, I have put the basics here, which alone (as a single script) are enough to demonstrate the problem.
#!/bin/bash
PATH='/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/opt/bin'
#set -x
shopt -s lastpipe
exec 2>log.out
(ssh -i <path to key> -o HostKeyAlgorithms=+ssh-dss -o StrictHostKeyChecking=no <account@example.com> 'ls -r <path>') > '/root/listing.txt'
Even in ssh's -vvv debug mode I can see that everything works... except that I get no stdout output.
Then I tried something else that I found in another posting on the internet:
#!/bin/bash
PATH='/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/opt/bin'
#set -x
shopt -s lastpipe
exec 2>log.out
(ssh -T -i <path to key> -o HostKeyAlgorithms=+ssh-dss -o StrictHostKeyChecking=no <account@example.com> 'ls -r <path>' </dev/zero) > '/root/listing.txt'
The drawback here is that I start an ssh session that I can't close, and I guess that's due to /dev/zero.
Another approach was to tee-pipe the sub-shell of the ssh command... this worked for a short time (and why not anymore?!).
Now I'm clueless and need help. Cron has its PATH, uses BASH, etc. Curiously, my boss did this successfully with Java (and he hates BASH...).
Any explanation and helpful tips are very welcome.
I have the same issue: I made a script for cron which gets output from a remote SSH host.
If I run the script manually, it works as it should. But when cron runs it, I get only part of the remote output.
I can't figure out why this is happening.
#!/bin/sh
pass=123
filelist=$(sshpass -p "$pass" ssh -q -tt -o StrictHostKeyChecking=no user@"10.10.10.10" "list")
filestring=$(echo "$filelist" | grep -Po "(\S+\s\S+\s+\d+\s\d{2}:\d{2}:\d{2}\s\d{4})\slist0\.lst")
filedate=${filestring% list0.lst}
echo $filedate
filestamp=$(date -d "$filedate" +"%s")
echo $filestamp
When I get the echoes into a file via cron, the date is from 0:00:00 and the field with the date (echo $filedate) is empty. But when I run it manually, I get a normal date with the time...
It really bothers me.
Help?
I found the solution: add "-tt" to the ssh command and all of the output goes into the variable.
filelist=$(sshpass -p "$pass" ssh -q -tt -o StrictHostKeyChecking=no user@"10.10.10.10" "list")
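For what it's worth, an alternative to forcing a pseudo-terminal is to detach ssh's stdin explicitly. That also avoids the hanging session seen with the </dev/zero attempt further up; a sketch using the same placeholders as the question:
# -n redirects ssh's stdin from /dev/null (same effect as appending </dev/null),
# so the command runs cleanly under cron and the session closes on its own
ssh -n -i <path to key> -o HostKeyAlgorithms=+ssh-dss -o StrictHostKeyChecking=no <account@example.com> 'ls -r <path>' > '/root/listing.txt'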
Apologies for the confusing question title. I am trying to launch an interactive bash shell from a shell script (say shel2.sh) which has been launched by a parent script (shel1.sh) in a sudo-ed environment. (I am creating a guided deployment script for my software which needs to be installed as super-user, hence the sudo, but it may need the user to access the shell.)
Here's shel1.sh
#!/bin/bash
set -x
sudo bash << EOF
echo $?
./shel2.sh
EOF
echo shel1 done
And here's shel2.sh
#!/bin/bash
set -x
bash --norc --verbose --noprofile -i
echo $?
echo done
I expected this to launch an interactive bash shell which waits for my input before returning to shel1.sh. This is what I see:
+ ./shel1.sh
+ sudo bash
0
+ bash --norc --verbose --noprofile -i
bash-4.3# exit
+ echo 0
0
+ echo done
done
+ echo shel1 done
shel1 done
The bash-4.3# prompt receives an exit automatically and quits. Interestingly, if I invoke the bash shell with -l (or --login), the automatic entry is logout!
Can someone explain what is happening here ?
When you use a here document, you tie the shell's standard input (and that of its spawned child processes) to the here document.
You can avoid using a here document in many situations. For example, replace the here document with a single-quoted string.
#!/bin/bash
set -x
sudo bash -c '
# Aside: How is this actually useful?
echo $?
# Spawned script inherits the stdin of "sudo bash"
./shel2.sh'
echo shel1 done
Without more details, it's hard to see where exactly you want to go with this, but most modern Linux platforms have package managers which allow all kinds of hooks for installation, so that you would typically not need to do this sort of thing. Have you looked into that?
I have a PHP script with a web interface which is used to provide input into a process I am automating. As part of this process, I am attempting to use SSH and screen to run some commands (all sent through PHP's exec).
I am currently using the following, which writes the echo sent from script to the screen session but doesn't execute it. I've tried adding the -ne option to echo and adding \n or ^M to the end. I've also tried changing around the types of quotes I'm using, but I'm having trouble getting the code to execute (i.e. sending Enter).
ssh -t -t myUser@myDomain.com 'screen -r -d -S -X myScreen stuff "echo -ne sent from script"' 2>&1
How do I go about getting that code to execute?
This took much longer to figure out than I would like to admit. The key was adding a $ before the string sent to stuff and then using a \n to send the Enter press.
I added an extra screen -list; to demonstrate how to send multiple commands before the one that addresses the screen session. I also added an extra echo to demonstrate how to send multiple commands to the screen.
ssh -t -t myUser@myDomain.com "screen -list; screen -r -d -X -S myScreen stuff $'echo here; echo here\n'" 2>&1
I have this script:
#!/bin/sh
while [ true ] ; do
    urlfile=$( ls /root/wget/wget-download-link.txt | head -n 1 )
    dir=$( cat /root/wget/wget-dir.txt )
    if [ "$urlfile" = "" ] ; then
        sleep 30
        continue
    fi
    url=$( head -n 1 $urlfile )
    if [ "$url" = "" ] ; then
        mv $urlfile $urlfile.invalid
        continue
    fi
    mv $urlfile $urlfile.busy
    wget -b $url -P $dir -o /www/wget.log -c -t 100 -nc
    mv $urlfile.busy $urlfile.done
done
The script basically checks wget-download-link.txt for new URLs every 30 seconds, and if there is a new URL it downloads it with wget. The problem is that when I try to run this script from PuTTY like this:
/root/wget/wget_download.sh --daemon
it still runs in the foreground and I can still see the terminal output. How do I make it run in the background?
In OpenWRT there is neither nohup nor screen available by default, so a solution using only built-in commands is to start a subshell with parentheses and put it in the background with &:
(/root/wget/wget_download.sh >/dev/null 2>&1 )&
You can test this structure easily on your desktop, for example with
(notify-send one && sleep 15 && notify-send two)&
... and then close your console before those 15 seconds are over; you will see that the commands in the parentheses continue executing after the console is closed.
The following command will also work:
((/root/wget/wget_download.sh)&)&
This way you don't have to install the nohup command in the tight memory space of the router running OpenWrt.
I found this somewhere several years ago. It works.
The & at the end of the script should be enough. If you still see output from the script, it means that stdout and/or stderr is not closed or not redirected to /dev/null.
You can use this answer:
How to redirect all output to /dev/null
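For example, with the script from this question that boils down to (assuming you don't need to see the output at all):
# discard both stdout and stderr, then put the script in the background
/root/wget/wget_download.sh >/dev/null 2>&1 &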
I am using OpenWRT Merlin and the only way I got it working was with the cru cron manager [1]. nohup and screen are not available as solutions.
cru a pinggw "0 * * * * /bin/ping -c 10 -q 192.168.2.254"
Works like a charm.
[1] https://www.cyberciti.biz/faq/how-to-add-cron-job-on-asuswrt-merlin-wifi-router/
https://openwrt.org/packages/pkgdata/coreutils-nohup
opkg update
opkg install coreutils-nohup
nohup yourscript.sh &
You can use nohup.
nohup yourscript.sh
or
nohup yourscript.sh &
Your script will keep running even if you close your PuTTY session, and all the output will be written to a file called nohup.out in the same directory.
nohup is often used in combination with the nice command to run processes at a lower priority.
nohup nice yourscript.sh &
See: http://en.wikipedia.org/wiki/Nohup
For busybox on an OpenWRT Merlin system, I have a better solution which combines the cru and date commands:
cru a YOUR_UNIQUE_CRON_NAME "`date -D '%s' +'%M %H %d %m *' -d $(( \`date +%s\`+2*60 ))` YOUR_CMD_HERE"
which adds a cron job that runs 2 minutes later, and only runs once.
Inspired by PlagTag's idea.
Another way to do it is with this code:
ssh admin@192.168.1.1 "/jffs/your_script.sh &"
Simple, and without any extra programs like nohup or screen...
(BTW: this worked on Asus-Merlin firmware.)
Try this:
nohup /root/wget/wget_download.sh >/dev/null 2>&1 &
It will go to the background, so when you close your PuTTY session it will still be running, and it won't send messages to the terminal.
How do you run a shell script in a new terminal window in Linux from a terminal, like "start test.bat" does in Windows? It should also work in console mode.
Here's a simple example to get you started:
To write a shell script, do this on your command prompt:
echo -e '#!/bin/sh\necho "hello world"' > abc.sh
This writes:
#!/bin/sh
echo "hello world"
To a file called abc.sh
Next, you want to set it to executable by:
chmod +x abc.sh
Now, you can run it by:
./abc.sh
And you should see:
hello world
On your terminal.
To run it in a new terminal, you can do:
gnome-terminal -x ./abc.sh
or, if it's xterm:
xterm -e ./abc.sh
Here's a list of different terminal emulators.
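For instance, a couple of other common emulators accept a similar flag (assuming they are installed; exact options vary by version):
konsole -e ./abc.sh
xfce4-terminal -e ./abc.sh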
Alternatively, you just run it in your current terminal, but background it instead by:
./abc.sh &
I came here wanting to figure out how to make a script spawn a terminal and run itself in it, so for those who want to do that, I figured out this solution:
if [ ! -t 0 ]; then # script is executed outside the terminal?
    # execute the script inside a terminal window with the same arguments
    x-terminal-emulator -e "$0" "$@"
    # and abort running the rest of it
    exit 0
fi
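As a usage sketch, the check goes at the top of the script, here combined with the abc.sh example from the answer above (x-terminal-emulator is a Debian-style alias; substitute your own emulator if it is not available):
#!/bin/bash
# if stdin is not a terminal, re-launch this script inside a terminal emulator
if [ ! -t 0 ]; then
    x-terminal-emulator -e "$0" "$@"
    exit 0
fi
echo "hello world"
# keep the window open so the output can be read before it closes
read -r -p "Press Enter to close..."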
For GNOME, try this.
Replace ls with the command you want to run:
gnome-terminal -x sh -c "ls|less"
I hope this is what you want
As of January 2020, the -e and -x options in gnome-terminal still work but print the following warnings:
For -e:
# Option “-e” is deprecated and might be removed in a later version of gnome-terminal.
# Use “-- ” to terminate the options and put the command line to execute after it.
For -x:
# Option “-x” is deprecated and might be removed in a later version of gnome-terminal.
# Use “-- ” to terminate the options and put the command line to execute after it.
Based on the information above, I confirmed that you can run the following two commands without receiving any warning messages:
gnome-terminal -- /bin/sh -c '<your command>'
gnome-terminal -- ./<your script>.sh
I hope this helps anyone else presently having this issue :)