Execute a remote command over SSH with remote evaluation - linux

I'm trying to run a command over SSH and want the evaluation of an expression in the command to happen on the remote machine.
I'm trying to run this:
ssh -A username@ip "sudo docker exec -it "$(docker ps | grep 'some' | awk '{ print $1 }')" python manage.py shell"
but the expression $(docker ps | grep 'some' | awk '{ print $1 }') is evaluated on my local machine rather than on the remote machine when I run it through ssh.
To confirm: if I first ssh into the remote machine and then run sudo docker exec -it "$(docker ps | grep 'some' | awk '{ print $1 }')" python manage.py shell, it does evaluate correctly and gives me a shell. I just cannot make it work directly from my local machine as part of an argument to the ssh command.
What can I do to make it work as a part of the ssh command?
The problem with the command below is that I receive a "Pseudo-terminal will not be allocated because stdin is not a terminal." message from my terminal (iTerm) and do not get a shell as I expect after running it.
ssh -A username@ip <<'EOL'
name="$(docker ps | grep 'some' | awk '{ print $1 }')"
docker exec -it $name python manage.py shell
EOL

You need to escape every character that should be interpreted by the remote shell rather than the local one, like so:
ssh -A username@ip "sudo docker exec -it \"\$(docker ps | grep 'some' | awk '{ print \$1 }')\" python manage.py shell"
This way the quotes around the -it argument and the $ sign are sent unchanged, and it is the remote shell that expands and executes them.
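To see the local-versus-remote difference in isolation, here is a minimal sketch (user@host is a placeholder; HOSTNAME is just an illustrative variable):
# $HOSTNAME is expanded by the local shell before ssh runs,
# so the remote end just echoes the local machine's name.
ssh user@host "echo $HOSTNAME"
# \$HOSTNAME reaches the remote shell as a literal $HOSTNAME,
# so it is expanded there and prints the remote machine's name.
ssh user@host "echo \$HOSTNAME"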

Related

Retrieving the result of a bash command run from ssh into a variable

The following command runs well when I run it locally on the desired machine:
app_name=*some_proccess_name*
pid=`pgrep $app_name | tail -n 1`
But when I run it in the following way from a remote PC using ssh, it doesn't work:
pid=$(ssh $USER_NAME@$HOST_NAME "echo `pgrep $app_name | tail -n 1`")
The value of pid afterwards is just blank. I am not sure what's wrong (just to clarify, I have tried several process names that are all running on the target PC; that's not the problem).
P.S. When I run the command without echo, like this, I just get stuck inside a shell on the remote PC and have to type exit to return to my local machine:
pid=$(ssh tester@mir1 "`pgrep indicator-apple | tail -n 1`")
Less is more
pid=$(ssh tester@mir1 pgrep indicator-apple \| tail -n 1)
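An equivalent form, using the same hostname and process name as in the question, single-quotes the pipeline so the pipe needs no backslash:
# Single quotes stop the local shell from touching the pipe, so the whole
# pipeline runs on the remote host and only its output is captured locally.
pid=$(ssh tester@mir1 'pgrep indicator-apple | tail -n 1')
echo "remote pid: $pid"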

Bash for loops on a remote server

I am attempting to run multiple commands via a bash script on a remote server. Specifically, the for loop to be run on the remote server is giving me trouble. I suspect it is because I don't know how to properly escape characters or use $().
Below is the code.
ssh (user)@(server) <<EOF
sudo su - (username)
whoami
'for e in $(`ls -lrt /usr/jboss/jbosseap | awk '{print $9}' | grep multichannel`);
do
echo "$e";
done'
EOF
I'm removing the user and server names for obvious reasons. Just concentrate on the for loop. When I run that for loop on the command line (without the $()), it works fine. I'm just not sure how to nest it in a remote call.
Thanks very much for any and all help!
If you've got a complex script that you're trying to run over ssh, you're better off putting that script in a file and piping the file into ssh, like:
cat remote_script.sh | ssh user@host
or, to run it as another user:
cat remote_script.sh | ssh user@host sudo -u username bash
And now you don't have to worry about N levels of escaping.
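As a rough sketch, remote_script.sh could simply contain the question's loop written plainly, since it runs entirely on the remote host (the path and grep pattern are taken from the question; parsing ls output remains as fragile as in the original):
#!/bin/bash
# remote_script.sh -- executed on the remote host, so nothing needs escaping.
for e in $(ls -lrt /usr/jboss/jbosseap | awk '{print $9}' | grep multichannel); do
    echo "$e"
done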
You can also run it as below.
Here the file "list" contains your list of nodes, and the script must be present on all of them:
for i in $(cat list); do ssh -o StrictHostKeyChecking=no $i "/path/your_script"; done
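An alternative sketch of the same loop reads the node list line by line; note the -n, since ssh would otherwise swallow the remaining lines of the list from stdin:
# Read the node list line by line; -n keeps ssh from consuming the
# rest of "list" through the loop's stdin.
while read -r node; do
    ssh -n -o StrictHostKeyChecking=no "$node" "/path/your_script"
done < list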

Problems with fish shell and ssh remote commands

I use fish shell on my desktop.
We run many servers with nginx inside Docker. I've tried to create a function so I can ssh to a server and then log into the Docker container.
The problem is that fish complains about the $ in the command, even though the command is meant to be executed on the remote server (running bash), not on my machine running fish. I've simplified the script to make it easier to read.
config.fish snippet
function ssh-docker-nginx
ssh -t sysadmin#10.10.10.10 "sudo bash && docker exec -it $(docker ps | grep -i nginx | awk '{print $1}') bash"
end
Fish error:
$(...) is not supported. In fish, please use '(docker)'.
~/.config/fish/config.fish (line 59): ssh -t sysadmin@10.10.10.10 "sudo bash && docker exec -it $(docker ps | grep -i nginx | awk '{print $1}') bash"
^
from sourcing file ~/.config/fish/config.fish
called during startup
Is there a way to get fish to ignore this?
You'll want to single-quote that argument.
In double-quotes (") fish will try to expand everything that starts with a $, so it will see that $( and then print the error for it. But it will also see the $1 in your arguments to awk and expand that.
And when you want single-quotes to go to the called command (like here, where you want the argument to awk to be single-quoted because this'll go through bash's expansion), you need to escape the quotes with \.
Try
ssh -t sysadmin@10.10.10.10 'sudo bash && docker exec -it $(docker ps | grep -i nginx | awk \'{print $1}\') bash'
Thanks for the great advice and the tip above about single vs. double quotes. Unfortunately the escaped quotes around the awk argument did not play nicely when passed through ssh.
After trying various options, I settled on this approach (which needed a forced tty):
function ssh-docker-nginx
cat docker-bash.sh | ssh -t -t sysadmin@10.10.10.10
end
# docker-bash.sh
#!/bin/bash
sudo chmod 777 /var/run/docker.sock
sudo docker exec -it $(docker ps | grep -i nginx | awk '{print $1}') bash

remote ssh command not working properly

The following local command on host xyz produces the correct output:
taskset -p `ps -ef | grep ripit | grep -v grep | awk '{print \$2}'`
pid 21352's current affinity mask: 1
When I run the following command over ssh to host xyz, I also get the correct output:
ssh xyz "ps -ef | grep ripit | grep -v grep |awk '{print \$2}'"
21352
However, when I add the taskset command and run it remotely on host xyz, I get this incorrect output:
ssh xyz "taskset -p `ps -ef | grep ripit | grep -v grep | awk '{print \$2}'`"
sched_getaffinity: No such process
failed to get pid 27599's affinity
bash: line 1: 32127: command not found
I tried many different single and double quote combinations and used escape characters all over the place, to no avail. Can anyone help?
Thanks
I haven't tested with your exact commands, but
ssh host 'lsof -p $(pgrep program)'
worked for me
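Applied to the question's case (assuming pgrep is available on xyz and ripit is the target process name), single quotes keep the command substitution for the remote shell:
# Single quotes stop the local shell from expanding $(...), so pgrep and
# taskset both run on the remote host xyz.
ssh xyz 'taskset -p $(pgrep ripit | tail -n 1)'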
For running commands remotely:
#!/bin/bash
SCRIPT='
#Your commands
'
sshpass -p<pass> ssh -o 'StrictHostKeyChecking no' -p <port> user@host "$SCRIPT"
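As a hypothetical fill-in for the question's case (shown without sshpass), the single quotes around SCRIPT keep $(...) literal on the local side so it expands on the remote host:
#!/bin/bash
# Hypothetical example: the single-quoted SCRIPT keeps $(...) unexpanded here,
# so both pgrep and taskset run on the remote host xyz.
SCRIPT='
taskset -p $(pgrep ripit | tail -n 1)
'
ssh xyz "$SCRIPT"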
When I add the taskset command and run remotely on host xyz host
ssh xyz "taskset -p `ps -ef | grep ripit | grep -v grep | awk '{print \$2}'`"
Here, the command substitution between the backquotes is executed on the local host and yields a local process ID, so it is no wonder that there is No such process on the remote host. If you escape the backquotes like this:
ssh xyz "taskset -p \`ps -ef | grep ripit | grep -v grep | awk '{print \$2}'\`"
the command substitution is executed on the remote host and yields the correct process ID.
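A quick way to see the difference (hostname is just an illustrative command):
# Unescaped backticks expand locally, so the remote host merely echoes
# the local machine's name.
ssh xyz "echo `hostname`"
# Escaped backticks are passed through and expanded by the remote shell,
# printing the remote machine's name.
ssh xyz "echo \`hostname\`"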

find files on remote machine

Hi, I am trying to execute the following command on a remote machine. How do I do this?
ssh login.com find /nfs/repo/ -name ".user_repo.log" | \
xargs cat | awk '{$NF=""; print $0}' | \
sed "1i Owner RepoName CreatedDate" | column -t
I get the following error messages:
cat: /nfs/repo/new1/info/.user_repo.log: No such file or directory
cat: /nfs/repo/new2/info/.user_repo.log: No such file or directory
The cat command is trying to find the files on the local system, while they are present on the remote machine. How do I handle this?
If you do:
ssh host command1 | command2
Then the shell breaks the line at the pipe, so "ssh host command1" runs as one command (i.e., remotely) and "command2" runs as another command (i.e., locally). You can force all the commands to run remotely by enclosing them in quotes:
ssh host "command1 | command2"
Note that since you already have quotes in the command, you might have to get creative with the escaping.
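As a rough sketch, the question's pipeline could be quoted so every stage runs on the remote host; the awk dollars and quotes are escaped to survive the local double quotes:
# Everything inside the double quotes is executed by the remote shell;
# \$ and \" survive the local shell and reach awk intact.
ssh login.com "find /nfs/repo/ -name '.user_repo.log' | xargs cat \
    | awk '{\$NF=\"\"; print \$0}' | sed '1i Owner RepoName CreatedDate' | column -t"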
Or, you might put all those commands in a short shell script and then just run the script:
ssh host myscript.sh
