I have a script something like the one below:
sshpass -p "pwd" ssh -tt user@host << EOF
cd /directory
file=$(ls -1t| head -1)
exit
EOF
cd /directory changes the directory successfully inside the shell, but ls returns results from outside the shell. The result of ls is the same as when it is executed outside ssh. Please help with this.
The $(...) part is being evaluated by the outer shell. You can disable this by quoting 'EOF' so that $(...) is passed to the remote shell. It's akin to using single quotes instead of double quotes with regular strings.
sshpass -p "pwd" ssh -tt user@host << 'EOF'
cd /directory
file=$(ls -1t| head -1)
exit
EOF
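To see the difference in isolation, here is a small local demonstration (hypothetical, not part of the original answer) of how quoting the delimiter changes where $(...) gets expanded:
# Unquoted delimiter: the local shell expands $(hostname) before anything is sent
cat << EOF
host: $(hostname)
EOF
# Quoted delimiter: the text is passed through literally, so the receiving shell would expand it
cat << 'EOF'
host: $(hostname)
EOF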
I have a Linux script like the one below:
sshpass -p "pwd" ssh -tt user << 'EOF'
cd /directory
file=$(ls -1t | head -1)
exit
EOF
How can I use the file variable outside ssh, that is, after the EOF statement?
I think that you have to work with the output of the SSH command to capture it into a local variable.
This could be a viable solution (tried with obviously different parameters locally, OS Ubuntu 17.04):
CMD=`cat <<EOF
cd /directory
ls -1t | head -1
EOF`
FILE=`sshpass -p "pass" ssh -t user@host -o LogLevel=QUIET "$CMD"`
echo "$FILE"
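If you prefer to avoid the intermediate CMD variable, the same idea should also work (an untested sketch, reusing the question's directory and credentials) by feeding a quoted heredoc straight to a remote bash inside a command substitution:
FILE=$(sshpass -p "pass" ssh -T user@host -o LogLevel=QUIET bash -s << 'EOF'
cd /directory
ls -1t | head -1
EOF
)
echo "$FILE"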
#!/bin/ksh
CTN=1
ssh -q user@host 'exec bash -s' << 'ENDSSH'
cd abc/def
./scriptname \$CTN
ENDSSH
exit;
However, on the remote server the value of the variable CTN is not getting passed.
Please help.
It should be:
CTN=1
ssh -q user@host 'exec bash -s' << ENDSSH
cd abc/def
./scriptname "$CTN"
ENDSSH
Since you want $CTN to be expanded locally, you must not escape the $ and must not put ENDSSH between single quotes.
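If you need to mix both, you can expand some variables locally and still defer others to the remote side by escaping only those. A sketch using the names from the question:
CTN=1
ssh -q user@host 'exec bash -s' << ENDSSH
cd abc/def
./scriptname "$CTN"          # expanded locally before the heredoc is sent
echo "remote home: \$HOME"   # escaped, so expanded on the remote host
ENDSSH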
How can I pass $line to the cmd command?
#!/bin/bash
while read line
do
sshpass -p "..." ssh -o StrictHostKeyChecking=no -tt windows@172.... -p 333 cmd /c "cd C:\ & download_give_id.exe '$#$line' "
done <apps.txt
Basically, if you want to interpolate a variable into a bash string you need to use double quotes instead of single quotes:
str="${var} foo bar" # works
str='${var} foo bar' # does not work
In the special case of running ssh commands inside a while loop, I strongly recommend passing /dev/tty explicitly as input to the command, because if the remote command reads from stdin for whatever reason, it will otherwise slurp the stdin of the while loop:
while read line ; do
ssh ... -- "${line} ..." < /dev/tty # Pass tty to stdin
done < input.txt
Note: The above command will work only if the process has been started in a terminal. If the process is not running in a terminal, you need to pass something else as stdin for the inner ssh command.
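An alternative to redirecting /dev/tty, when the remote command does not need stdin at all, is ssh's -n option, which redirects ssh's stdin from /dev/null so it cannot eat the loop's input (host and command below are placeholders):
while read -r line ; do
    ssh -n user@host -- "some_command ${line}"
done < input.txt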
I need a shell script that can log in to a remote system so that I can execute a bunch of commands on that system.
I made a script and actually it's working:
#!/bin/bash
USERNAME=KRUNAL
IP=10.61.162.241
ssh -l ${USERNAME} ${IP} "pwd "
ssh -l ${USERNAME} ${IP} "ls -la"
ssh -l ${USERNAME} ${IP} ./a.out
My problem is that if I write the script like this:
ssh -l ${USERNAME} ${IP} "pwd " # this executes on the remote system
ls -la # this executes on the current system
then every time I need an ssh command to execute something on the remote system.
Is there any way I can run a bunch of commands on the remote system with a single login?
You can send as many commands to ssh as you want, provided that you separate them with ; or linebreaks. So this should work:
ssh -l ${USERNAME} ${IP} "pwd; ls -la"
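Since linebreaks work as separators too, a longer command list can simply be written as a multi-line quoted string (a sketch, reusing the variables from the question):
ssh -l ${USERNAME} ${IP} "
pwd
ls -la
./a.out
"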
@Joao's suggestion works fine; however, it's impractical when writing many lines.
If this is the case you can do
ssh -l ${USERNAME} ${IP} bash << 'EOF'
cd /some/directory
./a.out
who am i
for i in `seq 1 10`
do
echo $i
done
EOF
Anything between 'EOF' and the final EOF will be executed on the server side.
You can also replace bash with csh or python and write code for that interpreter instead
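For example, assuming python3 is available on the remote host, the same heredoc pattern can feed it a script (an illustrative sketch, not part of the original answer):
ssh -l ${USERNAME} ${IP} python3 << 'EOF'
import os, platform
print("running on", platform.node())
print(os.listdir(os.path.expanduser("~")))
EOF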
If you want the output of the ssh session to be stored in a file (say session.log), then replace
ssh -l ${USERNAME} ${IP} bash << 'EOF'
with
ssh -l ${USERNAME} ${IP} bash << 'EOF' > 'session.log'
The rest remains unchanged.
I'm trying to execute commands on a remote machine via ssh, and I need the script to wait until the ssh password is provided (if necessary).
This is my code snippet:
ssh -T ${remote} << EOF
if [ ! -d $HOME/.ssh ]; then
mkdir $HOME/.ssh
touch $HOME/.ssh/authorized_keys
chmod 0600 $HOME/.ssh/authorized_keys
fi;
EOF
The problem is, commands between EOFs start executing on the local machine without waiting for the pass to be provided. Is there any way to wait for the pass before continuing with the script?
It's as simple as:
ssh -T ${remote} << 'EOF'
if [ ! -d $HOME/.ssh ]; then
`mkdir $HOME/.ssh`
`touch $HOME/.ssh/authorized_keys`
`chmod 0600 $HOME/.ssh/authorized_keys`
fi
EOF
Note the single quotes (') around EOF.
But I recommend using the $( ) form instead of backticks: the backquote (`)
is used in old-style command substitution, e.g.
foo=`command`
The
foo=$(command)
syntax is recommended instead. Backslash handling inside $() is less surprising, and $() is easier to nest. See http://mywiki.wooledge.org/BashFAQ/082
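The nesting point is easy to see side by side (a quick illustration):
inner=`basename \`pwd\``     # backticks must be escaped to nest
inner=$(basename $(pwd))     # $() nests without any escaping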
If you need to pass variables:
var=42
ssh -T ${remote} << EOF
if [ ! -d \$HOME/.ssh ]; then
\`mkdir \$HOME/.ssh\`
\`touch \$HOME/.ssh/authorized_keys\`
\`chmod 0600 \$HOME/.ssh/authorized_keys\`
else
echo $var
fi
EOF
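For reference, here is a cleaned-up sketch of the same snippet with the backticks dropped, as the $( ) recommendation above suggests; around plain commands like mkdir they only capture empty output and try to execute it, which does nothing useful:
var=42
ssh -T ${remote} << EOF
if [ ! -d \$HOME/.ssh ]; then
    mkdir \$HOME/.ssh
    touch \$HOME/.ssh/authorized_keys
    chmod 0600 \$HOME/.ssh/authorized_keys
else
    echo $var
fi
EOF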