Jenkins | ssh not working - linux

I am SSHing from the Jenkins server to the servers listed in /tmp/sand.txt (passwordless authentication is set up for them) by running a shell script.
jenkins@pc4mobjen01:/tmp> cat check.sh
#!/bin/bash
for i in `cat /tmp/sand.txt`
do
ssh -q mpdevops@"$i"
bash /app/home/mpdevops/sand.sh
done
jenkins@pc4mobjen01:/tmp> bash check.sh
Last login: Wed Apr 19 09:20:03 2017 from 10.4.70.42
Powered by Monsoon (Version 2.2.1519) Platform: suse 11.3
Hostname : mo-97df9aafa.dc19.saas.sap.cor Name : PC19MOBDEVOPS01
Organization : saas_prod Project : dc19_production
Url : https://monsoon.mo.sap.corp/instances/mo-97df9aafa
mo-97df9aafa[PC19MOBDEVOPS01]:~ # logout
bash: /app/home/mpdevops/sand.sh: No such file or directory
But sand.sh is present.
mo-97df9aafa[PC19MOBDEVOPS01]:~ # cat sand.sh
for j in `cat sand.txt`
do
ssh -q mpdevops@"$j"
sudo python /tmp/test.py
done
Please help.

Try putting the command to be executed on the remote client on the same line as the ssh command itself. As written, the bash command is being executed on the local host as opposed to the client.

You should run sand.sh on the same line as the ssh command. check.sh should be:
#!/bin/bash
for i in `cat /tmp/sand.txt`
do
ssh -q mpdevops@"$i" "bash /app/home/mpdevops/sand.sh"
done
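The inner script sand.sh has the same bug and needs the same treatment. A sketch of the corrected version (assuming the second hop to the hosts in sand.txt is intended; if sudo on those hosts requires a TTY, add -t to the ssh call):
#!/bin/bash
# Put the remote command on the same line as ssh so it runs on the
# target host instead of the local one.
for j in `cat sand.txt`
do
ssh -q mpdevops@"$j" "sudo python /tmp/test.py"
done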

Related

ssh-add on command argument to su

I want to start a Docker container that adds an SSH key at startup.
My entrypoint looks like this:
#!/bin/bash
set -e
service ssh start
su anotherUser -s /bin/bash -c "eval \"$(ssh-agent)\" && ssh-add /Keys/id_rsa"
I've seen many posts that use sudo, but I do not have sudo available. I found this solution, but at startup it shows me:
[ ok ] Starting OpenBSD Secure Shell server: sshd.
Agent pid 36
Error connecting to agent: Permission denied
But when I execute the same lines at the prompt, everything is OK:
xxx# su anotherUser
anotherUser@xxx:~$ eval $(ssh-agent)
Agent pid 47
anotherUser@xxx:~$ ssh-add /keys/id_rsa
Identity added: /keys/id_rsa (yyy@yyy-HP-EliteBook-850-G4)
You are running ssh-agent before su runs. The $ needs to be escaped (or the whole command single-quoted, as below) so that the literal command substitution is passed to anotherUser's bash for execution.
su anotherUser -s /bin/bash -c 'eval $(ssh-agent) && ssh-add /Keys/id_rsa'
(Untested; probably needs more details about how the container is run and why ssh-add needs to be run as a different user.)
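To make the quoting difference concrete, a side-by-side sketch:
# Double quotes: the outer (root) shell expands $(ssh-agent) first, so the
# agent starts as root and anotherUser gets "Permission denied" on its socket.
su anotherUser -s /bin/bash -c "eval \"$(ssh-agent)\" && ssh-add /Keys/id_rsa"
# Single quotes: the literal $(ssh-agent) reaches anotherUser's shell, so the
# agent itself starts as anotherUser and ssh-add can connect to it.
su anotherUser -s /bin/bash -c 'eval $(ssh-agent) && ssh-add /Keys/id_rsa'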
It may be simpler, though, to run your entry point with ssh-agent. For example,
# In the Dockerfile...
ENTRYPOINT ["ssh-agent", "entry.sh"]
Inside entry.sh, your environment will already have access to the agent.
#!/bin/bash
set -e
service ssh start
su anotherUser -s /bin/bash -c "ssh-add /Keys/id_rsa"
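A quick sanity check (an untested sketch) that could go near the top of entry.sh to confirm the wrapping agent is visible:
# The wrapping ssh-agent exports SSH_AUTH_SOCK into entry.sh's environment,
# so it should already be set here.
echo "agent socket: ${SSH_AUTH_SOCK:-<not set>}"
ssh-add -l || true # non-zero exit is expected while no identities are loaded yet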

Running Linux commands inside a bash script throws a permission denied error

We have a Linux script in our environment which SSHes to a remote machine as a common user and copies a script from the base machine to the remote machine through scp.
Script Test_RunFromBaseVM.sh
#!/bin/bash
machines=$1
for machine in $machines
do
ssh -tt -o StrictHostKeyChecking=no ${machine} "mkdir -p -m 700 ~/test"
scp -r bin conf.d ${machine}:~/test
ssh -tt ${machine} "cd ~/test; sudo bash bin/RunFromRemotevm.sh"
done
Script RunFromRemotevm.sh
#!/bin/bash
echo "$(date +"%Y/%m/%d %H:%M:%S")"
Before running the Test_RunFromBaseVM.sh script on the base VM, we run the two commands below.
eval $(ssh-agent)
ssh-add
Executing ./Test_RunFromBaseVM.sh "<list_of_machine_hosts>" gives a permission denied error:
[remote-vm-1] bin/RunFromRemotevm.sh: line 2: /bin/date: Permission denied
Any clue or insight on this error would be of great help.
Thanks.
I believe the problem is the presence of the NOEXEC: tag in the sudoers file, on the entry corresponding to the user (or group) that's executing the "cd ~/test; sudo bash bin/RunFromRemotevm.sh" command. This tag causes any further execv(), execve() and fexecve() calls to be refused; in this case the refused call is the one that would run /bin/date.
The solution is to remove the NOEXEC: tag from the main /etc/sudoers file or from the relevant file under /etc/sudoers.d, wherever it is defined, as sketched below.
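For illustration, a hypothetical sudoers entry with the problematic tag and its fixed counterpart (the devops user and /bin/bash command are assumptions, not taken from the question):
# Before: NOEXEC prevents bash from exec()ing children such as /bin/date
devops ALL=(ALL) NOPASSWD: NOEXEC: /bin/bash
# After: dropping the tag restores normal exec behaviour
devops ALL=(ALL) NOPASSWD: /bin/bash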

FTP script not working

I have an FTP script running on Linux and it is failing. Here is the script:
/usr/bin/ftp -v -i -n my.ftp.server << cmd
user ftpuser password
binary
ls
<some other commands here>
quit
cmd
It returns an error:
421 Service not available, remote server has closed connection
Not connected.
The weird thing here is that if I just typed this in the command line:
/usr/bin/ftp my.ftp.server
It asks for a username and password, and after I supply them, I am able to connect!
At the ftp> prompt I can type ls and see the files on the FTP server.
What is wrong with my script?
Also, I don't have PuTTY access to the FTP server, so I can't see the logs from there. Any ideas?
Thanks!
Here is an example of a correct Linux ftp script. Note that the here-document delimiter (END_SCRIPT below) must appear alone on its own line, and that comments inside the here-document would be sent to ftp as if they were commands, so keep them outside it:
#!/bin/sh
HOST='ftp.server.com'
USER='user'
PASSWD='pw'
FILE='file.txt' # sample file to upload

# Add further ftp commands between put and quit as needed.
ftp -n $HOST <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
put $FILE
quit
END_SCRIPT
exit 0

Assign contents of file to variable over remote ssh from a script running in Jenkins

I have opened a remote ssh session from a script, and on the remote server there is a file containing version information.
I am trying to assign that version to a variable and move the current version's contents to a folder named after the version.
The main script is running in Jenkins.
I am doing something like this:
ssh -i /home/user/.ssh/id_rsa -t -t remoteServer<<EOF
cd $WEB_DIR
VERSION=$(cat $WEB_DIR/version.info)
mv -f $WEB_DIR $BACKUP_DIR/$VERSION
exit
EOF
My VERSION variable is always empty. When I run the same commands locally on that server, I get the version value. Something behaves differently over a remote ssh session within a script.
Actually, I found a way to do it in two steps.
WEB_DIR is set as a local variable in the main script:
WEB_DIR="/usr/local/tomcat/webapps/ROOT"
OLD_VERSION=$(ssh -i /home/user/.ssh/id_rsa -tt user@remoteServer "cat $WEB_DIR/version.info")
ssh -i /home/user/.ssh/id_rsa -t -t user@remoteServer <<EOF
cd $WEB_DIR
mv -f $WEB_DIR $BACKUP_DIR/$OLD_VERSION
# I am executing more commands in here
exit
EOF
The double quotes around the remote command in the first step are a must if you want to use a local variable: they let the local shell expand $WEB_DIR before the command is sent to the remote server.
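If you'd rather keep it to a single ssh call, an untested sketch: escape the $ so the command substitution runs on the remote side, while the unescaped local variables are still expanded before the here-document is sent:
ssh -i /home/user/.ssh/id_rsa -t -t user@remoteServer <<EOF
# \$ defers expansion to the remote shell; $WEB_DIR and $BACKUP_DIR expand locally
VERSION=\$(cat $WEB_DIR/version.info)
mv -f $WEB_DIR $BACKUP_DIR/\$VERSION
exit
EOF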

How to cd on remote server via SSH bash script?

I have a bash script I'm using to connect to a remote server via ssh. That works; however, I want the script to run cd /some/dir immediately after connecting. That doesn't seem to be working. Here's my code:
#!/bin/bash
echo "SSHing.."
ssh -i ~/.ssh/some-site.pem xxx@yyy.com
cd /some/dir
read
How can I have the cd command executed right after the SSH connection is established?
There are two easy ways to execute commands via SSH from inside the script:
1) ssh user@host 'command'
2)
ssh user@host <<EOF
command1
command2
<...>
commandn
EOF
Normally you'd just edit your ~/.profile on the remote machine.
If that is not an option, you could do something like this:
ssh -t theserver.com 'cd /some/dir && bash -i'
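For the ~/.profile route mentioned above, a minimal sketch (assumes the remote login shell reads ~/.profile), run once on the remote machine:
# Append the cd so every login lands in the target directory.
echo 'cd /some/dir' >> ~/.profile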
You can use the following command
ssh user@watevr <the_cmd_to_be_executed>
You can try this:
ssh abc@hostname :/pathto/specific directory
