Remote ls output is not redirecting to file - linux

When I run the code below, I get this error:
bash: /var/out.txt: No such file or directory
#!/usr/bin/expect
set timeout -1
spawn ssh user@10.103.234.1 'ls -t /var/backups/archives/' > /var/outp.log
expect "user@10.103.234.1's password:"
send "Password\n"
expect eof
if [catch wait] {
    puts "failed"
    exit 1
}
exit 0

Expect/Tcl does not understand the redirection (>) character. Try this:
spawn bash -c "ssh user@10.103.234.1 ls -t /var/backups/archives/ > /var/outp.log"
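Applied to the script from the question, the whole thing might look like this (a sketch; host, paths, prompt, and password are copied from the question, and the redirection now happens on the local machine because bash parses it before ssh runs):
#!/usr/bin/expect
set timeout -1
# a local bash handles the > redirection; expect only drives ssh
spawn bash -c "ssh user@10.103.234.1 ls -t /var/backups/archives/ > /var/outp.log"
expect "user@10.103.234.1's password:"
send "Password\r"
expect eof
if [catch wait] {
    puts "failed"
    exit 1
}
exit 0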

Alternatively, use tee. Tcl does not treat single quotes specially, so use braces to pass the pipeline as one argument; note that tee then runs on the remote host, so /var/outp.log is written there rather than locally:
spawn ssh user@10.103.234.1 {ls -t /var/backups/archives/ | tee -a /var/outp.log}

How to wait for first rsync process to complete before running next command in shell/bash script

Below is the script I have. Basically I just want to copy files from the other server by calling this script. Some files are large, and what happens is that the first rsync command gets killed before it completes and the script proceeds to the next. I tried to use the screen command, but I'm not sure how to code Ctrl+a d (to detach) in a shell/bash script.
HFDIR=/var/opt/ubkp/data/local/prework/hotfixes
RODIR=/var/opt/ubkp/data/local/prework/rollouts
THFDIR=$(ls -t /var/opt/ubkp/data/local | grep hotfix | head -1)
TRODIR=$(ls -t /var/opt/ubkp/data/local | grep rollout | grep -v check | head -1)
user=$(/usr/seos/bin/sewhoami)
if [ "$user" = "root" ]; then
echo "This script should not be run as the TRUE root user"
echo "Log in so that \"sewhoami\" does not display \"root\" and then execute this script."
exit
else
#list of ROs and HFs
list=/tmp/list.txt
echo -n "Enter Password: "
read -s PWD
# first rsync command
/usr/bin/expect<<EOD
spawn rsync -a $user@server:$HFDIR/* /var/opt/ubkp/data/local/$THFDIR
expect "assword"
send "$PWD\r"
wait $!
expect eof
EOD
# second rsync command
/usr/bin/expect<<EOD
spawn rsync -a $user@server:$RODIR/* /var/opt/ubkp/data/local/$TRODIR
expect "assword"
send "$PWD\r"
expect eof
EOD
fi
exit
Your second rsync will be killed after 10 seconds, as that is the default timeout for expect eof. You should add a wait after the send, to wait forever until the process ends.
Also, you should remove the $! in the wait. It is a shell variable, not an expect variable. Fortunately, in this case $! is empty, because you have not run any shell commands in the background with &.
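Putting both fixes together, the first heredoc might look like this (a sketch; set timeout -1, as in the first question above, disables the 10-second default so expect eof waits indefinitely, and the plain wait then reaps the rsync process):
/usr/bin/expect <<EOD
spawn rsync -a $user@server:$HFDIR/* /var/opt/ubkp/data/local/$THFDIR
expect "assword"
send "$PWD\r"
set timeout -1
expect eof
wait
EOD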

Bash+Expect script not running properly in crontab

I have a bash script that is supposed to run periodically. The script should connect to a remote SFTP server and get a file from there.
Since this is an SFTP server, I had to use expect within the bash script.
The script runs well when I run it manually but fails when run via crontab.
The problematic function is get_JSON_file().
Please advise.
This is the code:
#!/bin/bash
export xxxxx
export xxxxx
export PATH=xxxxx
check_if_file_is_open(){
while :
do
if ! [[ `lsof | grep file.txt` ]]
then
break
fi
sleep 1
done
}
get_JSON_file(){
/usr/bin/expect -f <(cat << EOF
spawn sftp -P port user@ip
expect "Password:"
send "password\r"
expect "$ "
send "get path/to/file/file.json\r"
send "exit\r"
interact
EOF
)
}
get_JSON_file
check_if_file_is_open
cp file.txt /path/to/destination/folder
Expect's interact works only when stdin is on a tty/pty, but a cron job does not run on a tty/pty. So replace interact with expect eof (or expect -timeout 12345 eof if necessary).
That's a very awkward way to pass expect commands to the expect interpreter. Use a (quoted) heredoc instead, and drop the -f option for expect:
get_JSON_file(){
/usr/bin/expect <<'EOF'
spawn sftp -P port user@ip
expect "Password:"
send "password\r"
expect "$ "
send "get path/to/file/file.json\r"
send "exit\r"
expect eof
EOF
}
The most important tip for debugging expect scripts is to turn on expect's debug output. While you're working out the kinks, use
expect -d <<'EOF'
and in the crontab, redirect stderr to stdout so you get the debugging output:
* * * * * /path/to/script.sh 2>&1
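With only 2>&1, cron mails the combined output to the crontab owner. To capture it in a file instead, something like this works (the log path is illustrative):
* * * * * /path/to/script.sh >> /tmp/script.log 2>&1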
To call a function within a shell script, use its name alone; no parentheses should be used.
Your code then becomes:
#!/bin/bash
export xxxxx
export xxxxx
export PATH=xxxxx
function check_if_file_is_open(){
while :
do
if ! [[ `lsof | grep file.txt` ]]
then
break
fi
sleep 1
done
}
function get_JSON_file(){
/usr/bin/expect -f <(cat << EOF
spawn sftp -P port user@ip
expect "Password:"
send "password\r"
expect "$ "
send "get path/to/file/file.json\r"
send "exit\r"
interact
EOF
)
}
get_JSON_file
check_if_file_is_open
cp file.txt /path/to/destination/folder
Create a new script that runs your script inside screen, and add it to the crontab:
new_script.sh
#!/bin/bash
cd script_path
screen -dm -S screen_name ./your_script.sh
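The crontab entry then points at the wrapper (a sketch; the schedule and path are illustrative):
*/5 * * * * /path/to/new_script.sh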

How to use a value that is calculated inside ssh

I have a Linux script like the one below:
sshpass -p "pwd" ssh -tt user << 'EOF'
cd /directory
file=$(ls -1t | head -1)
exit
EOF
How can I use the file variable outside the ssh session, that is, after the EOF statement?
I think you have to capture the output of the SSH command into a local variable.
This could be a viable solution (tried locally with obviously different parameters, on Ubuntu 17.04):
CMD=`cat <<EOF
cd /directory
ls -1t | head -1
EOF`
FILE=`sshpass -p "pass" ssh -t user@host -o LogLevel=QUIET "$CMD"`
echo "$FILE"
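The -o LogLevel=QUIET option keeps ssh's diagnostic messages out of the captured output, so $FILE should contain only the file name printed by the remote head.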

Run bash script inside the expect script

I'm trying to run my bash script inside an expect script but am getting errors.
/usr/bin/expect <<EOD
spawn ssh nginubud@10.123.25.83 $(< try1.sh)
expect "assword:"
send "$reg\r"
expect eof
EOD
I'm trying to do the equivalent of ssh nginubud@10.123.25.83 "$(< try1.sh)" in expect. That command works, but I need a way to run it in an automated fashion. I don't want to use RSA keys.
The error I encountered:
spawn ssh nginubud@10.123.25.83 #tats script
invalid command name "echo"
    while executing
"echo "Enter Year:""
Also, I can run my expect ssh script, but when I include $(< try1.sh) and try to run it, I get "no variable" errors.
You can use ssh user@host bash -c .... For example:
[bash] % cat foo.sh
export CMD=$( printf '%q' "$(< try.sh)" )
expect << EOF
spawn ssh foo@localhost bash -c \$::env(CMD)
expect -nocase password:
send bar\r
expect eof
EOF
[bash] % cat try.sh
echo hello world | tr a-z A-Z
[bash] % bash foo.sh
spawn ssh foo@localhost bash -c echo\ hello\ world\ \|\ tr\ a-z\ A-Z
foo@localhost's password:
HELLO WORLD
[bash] %
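Here printf '%q' shell-quotes the script body so that the remote shell sees it as a single argument to bash -c, and passing it through the environment as $::env(CMD) spares it another round of Tcl substitution.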

shell script for remote connection to other system and execute bunch of command in it

I need a shell script that can log in to a remote system and execute a bunch of commands there.
I made a script, and it actually works:
#!/bin/bash
USERNAME=KRUNAL
IP=10.61.162.241
ssh -l ${USERNAME} ${IP} "pwd "
ssh -l ${USERNAME} ${IP} "ls -la"
ssh -l ${USERNAME} ${IP} ./a.out
The problem is that if I write
ssh -l ${USERNAME} ${IP} "pwd " # this executes on the remote system
ls -la # this executes on the current system
then every command needs its own ssh call to run on the remote system.
Is there any way I can run a bunch of commands on the remote system with a single login?
You can send as many commands to ssh as you want, provided that you separate them with ; or line breaks. So this should work:
ssh -l ${USERNAME} ${IP} "pwd; ls -la"
@Joao's suggestion works fine; however, it is impractical when writing many lines.
In that case you can do:
ssh -l ${USERNAME} ${IP} bash << 'EOF'
cd /some/directory
./a.out
who am i
for i in `seq 1 10`
do
echo $i
done
EOF
Anything between the << 'EOF' and the final EOF will be executed on the server side.
You can also replace bash with csh or python and write code for that interpreter instead.
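For instance, a minimal sketch with python (assuming a python interpreter exists on the remote host):
ssh -l ${USERNAME} ${IP} python << 'EOF'
# this runs in the remote python interpreter
print("hello from the remote host")
EOF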
If you want the output of the ssh session to be stored in a file (say session.log), then replace
ssh -l ${USERNAME} ${IP} bash << 'EOF'
with
ssh -l ${USERNAME} ${IP} bash << 'EOF' > 'session.log'
The rest remains unchanged.
