Jenkins rsync excludes do not exclude any files - linux

I've got a Jenkins server running with some pipelines. I wanted to move from copying a tar file to a web server (and extracting it there) to syncing the files with rsync. This works fine, except that if I define an excluded file or folder, it is still synced.
I have this line in my Jenkinsfile:
sh "rsync -av --dry-run --exclude={build.sh,run.sh} . ${SERVER_USER}@${SERVER}:${SERVER_DIRECTORY}"
For reference, these are the contents of ".":
$ tree src/
src/
├── build.sh
├── foo.bar
├── run.sh
└── var
1 directory, 3 files
And the contents of SERVER_DIRECTORY:
$ tree .
.
└── public
└── index.php
1 directory, 1 file
If I run the command from above manually, then I get an incremental file list without the files build.sh and run.sh. On Jenkins they both appear in this file list:
[Pipeline] sh
+ rsync -av --dry-run --exclude={build.sh,run.sh} . deploy_user@webserver.local:/srv/apache/projects/test
sending incremental file list
./
build.sh
foo.bar
run.sh
var/
What I tried so far:
using a trailing slash on the source, i.e. "./" instead of ".", since the man page states that the trailing slash affects whether an extra directory level is created at the destination
using a trailing slash at the end of SERVER_DIRECTORY
comparing rsync versions, the paths to the rsync binary, and the environments; the shells are the same, and so are the versions and paths
// in Jenkinsfile:
sh "echo $SHELL"
// Output on Jenkins:
[Pipeline] sh
+ echo /bin/bash
/bin/bash
doing the whole rsync process manually: logging in to the Jenkins machine as the jenkins user, going to the build folder, and running the rsync command (this worked as expected)
I tried it with "single excludes", meaning "--exclude build.sh --exclude run.sh", which worked, but I'm curious why the other variant only works when run manually and not via Jenkins. Is this some kind of escaping issue, or what am I missing here?
I hope that someone can help me on this and I'm looking forward to every answer! :)
Edit #1: I got the same behaviour when I tried mkdir {foo,bar}: Jenkins creates a single folder named {foo,bar}:
[Pipeline] sh
+ mkdir {foo,bar}
[Pipeline] sh
+ ls -la
total 1504
drwxr-xr-x 2 jenkins jenkins 4096 Sep 5 14:26 {foo,bar}

The x={a,b} syntax is not specific to rsync; it is brace expansion, a Bash feature.
For example, if you execute the following in Bash:
echo --exclude={run.sh,deploy.sh}
You will see the following output.
--exclude=run.sh --exclude=deploy.sh
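To see the difference directly, run the same thing under both shells; a quick demonstration (assuming a system such as Debian/Ubuntu, where /bin/sh is dash and no brace expansion happens):
$ bash -c 'echo --exclude={build.sh,run.sh}'
--exclude=build.sh --exclude=run.sh
$ sh -c 'echo --exclude={build.sh,run.sh}'
--exclude={build.sh,run.sh}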
Jenkins executes shell steps with the default shell, effectively sh -c ..., so Bash-specific features like brace expansion will not work. (This is also why the echo $SHELL check above is misleading: $SHELL holds the user's login shell, not the shell that is executing the step.) To get around this, point the shebang line at bash; it must be the very first thing in the script, with no leading newline or indentation, for Jenkins to honor it:
sh"""
#!/bin/bash
rsync -av --dry-run --exclude={build.sh,run.sh} . ${SERVER_USER}#${SERVER}:${SERVER_DIRECTORY}
"""

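Alternatively, spelling out each exclude (the variant the question already found to work) avoids relying on Bash entirely and behaves the same under any POSIX shell:
sh "rsync -av --dry-run --exclude build.sh --exclude run.sh . ${SERVER_USER}@${SERVER}:${SERVER_DIRECTORY}"
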
I solved it now by setting Jenkins > Manage Jenkins > Configure System > Shell executable to /bin/bash, as in this post.
I'm not sure why the answer of @ycr didn't work for me, since that solution obviously works for other people.

Related

How to make Ubuntu bash script wait on password input when using scp command

I want to run a script that deletes files on my computer and copies over another file from a connected host using the scp command.
Here is the script:
#!/bin/bash
echo "Moving Production Folder Over"
cd ~
sudo rm -r Production
scp -r host@192.168.123.456:/home/user1/Documents/Production/file1 /home/user2/Production
I would want to cd into the Production directory after it is copied over. How can I go about this? Thanks!
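For what it's worth, a minimal sketch of one way this could look, reusing the question's placeholder host and paths: scp prompts for the password on its own and blocks until the transfer completes, so a cd placed after it runs once the copy is done. Note that a cd inside a script only changes the script's own working directory; to land in the folder interactively, run the script with source.
#!/bin/bash
echo "Moving Production Folder Over"
cd ~ || exit 1
sudo rm -r Production
# scp asks for the password itself and blocks until the copy has finished
scp -r host@192.168.123.456:/home/user1/Documents/Production/file1 /home/user2/Production
# only enter the directory if the copy succeeded
cd /home/user2/Production && pwd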

Problems running first shellscript

I'm trying to create my first shell script in bash. I've created the code and I've managed to save the script in my home directory, but it won't run. At first I tried running it from the home directory with ./testscript.sh, with "permission denied" as a response; I then tried sudo ./testscript.sh, and then the command was not found.
This is my script:
#!/bin/bash
mkdir -p/home/filer
touch /home/filer/fil1
touch /home/filer/fil2
touch /home/filer/fil3
tar-zcvf file.tar.gz /home/filer
So I've tried creating a script that will create a directory called "filer" in my home directory, use touch to create 3 separate files within the "filer" directory, and then create a tar archive out of the whole "filer" directory. I think the script is correct; I could just use a hand running it.
Other than a couple of typos (mkdir -p/path -> mkdir -p /path, tar-zcvf ... -> tar -zcvf ...), you should refer to your home directory using the $HOME environment variable. /home/filer is an absolute directory path which, I am assuming, is not your actual home directory.
#!/bin/bash
mkdir -p "$HOME/filer"
touch "$HOME/filer/fil1"
touch "$HOME/filer/fil2"
touch "$HOME/filer/fil3"
tar -zcvf file.tar.gz "$HOME/filer"
You can execute the script either as bash testscript.sh or as ./testscript.sh.
In the second case, the script needs execute permission; chmod +x ./testscript.sh grants it.
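For example, assuming testscript.sh is in the current directory:
$ bash testscript.sh      # works without the execute bit
$ chmod +x testscript.sh  # grant execute permission once
$ ./testscript.sh         # from then on it runs directly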

crontab bash script not running

I updated the script with the absolute paths. Also here is my current cronjob entry.
I went and fixed the ssh key issue, so I know it works now, but I might still need to tell rsync which key to use.
The script runs fine when called manually by the user. It looks like not even the rm commands are being executed by the cron job.
UPDATE
I updated my script, but basically it's the same as the one below. Below I have a new cron time and added error output.
I get nothing. It looks like the script doesn't even run.
crontab -e
35 0 * * * /bin/bash /x/y/z/s/script.sh 2>1 > /tmp/tc.log
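As an aside (separate from the eventual fix below): in 2>1 > /tmp/tc.log, the 2>1 part redirects stderr to a file literally named 1; duplicating a file descriptor needs 2>&1, placed after the stdout redirection. To capture both stdout and stderr in the log, the entry would be:
35 0 * * * /bin/bash /x/y/z/s/script.sh > /tmp/tc.log 2>&1
The script itself: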
#!/bin/bash
# Clean up
/bin/rm -rf /z/y/z/a/b/current/*
cd /z/y/z/a/to/
/bin/rm -rf ?s??/D????
cd /z/y/z/s/
# Find the latest file
FILE=`/usr/bin/ssh user@server /bin/ls -ht /x/y/z/t/a/ | /usr/bin/head -n 1`
# Copy over the latest archive and place it in the proper directory
/usr/bin/rsync -avz -e /usr/bin/ssh user@server:"/x/y/z/t/a/$FILE" /x/y/z/t/a/
# Unzip the zip file and place it in the proper directory
/usr/bin/unzip -o /x/y/z/t/a/$FILE -d /x/y/z/t/a/current/
# Run Dev's script
cd /x/y/z/t/
./old.py a/current/ t/ 5
Thanks for the help.
I figured it out: I'm used to working in CST, and the server was in GMT.
Thanks everybody for the help.
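For anyone debugging the same mismatch, a few quick checks (timedatectl assumes a systemd system, and the syslog path is the Debian/Ubuntu default):
date                       # the system time and timezone that cron schedules against
timedatectl                # local time, universal time, and the configured time zone
grep CRON /var/log/syslog  # shows whether cron actually fired the job, and when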

bash: cd: No such file or directory

I'm writing a bash function to jump into my last edited folder.
In my example, the last edited folder is named 'daniel'.
The bash function looks fine.
>>:~$ echo $(ls -d -1dt -- */ | head -n 1)
daniel/
And I can manually cd into the directory.
>>:~$ cd daniel
>>:~/daniel$
But I can't use the bash function to cd into the directory.
>>:~$ cd $(ls -d -1dt -- */ | head -n 1)
bash: cd: daniel/: No such file or directory
Turns out someone added alias ls='ls --color' to the bashrc of this server. My function works once the alias is removed. – Daniel Tan
This error is usually thrown when you enter a path that does not exist. See -bash: cd: Desktop: No such file or directory.
But the output of $(ls -d -1dt -- */ | head -n 1) is not wrong. Thus the reason must be a difference between how sh and bash are used at that moment.
In my case, I had a docker container that gave this error when I accessed the folder with bash. The container was broken since I had force-closed it after a docker-compose up that did not work. After that, on the existing containers, I could only use sh, not bash. I found this out because of OCI runtime exec failed: exec failed: container_linux.go:348: starting container process caused "exec: "bash": executable file not found in $PATH": unknown. I guess that bash is loaded later than sh, and that after an early error at the start of the container, only sh gets loaded.
That would fit, since you are in sh, which can be seen from the >> prompt. Using sh, everything will work as expected. But the expression gets resolved by bash, which is probably not loaded for whatever reason.
In docker, using docker-compose, I also had a similar error saying sh: 1: cd: can't cd to /root/MYPROJECT. That could be solved by mounting the needed volumes in the services using
services:
  host:
    volumes:
      - ~/MYPROJECT:/MYPROJECT # ~/path/on/host:/path/on/container
See Mount a volume in docker-compose. How is it done? and How to mount a host directory with docker-compose? or the official docs.
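Given the accepted cause (an ls alias adding --color, which makes ls embed escape sequences in piped output, so cd receives a name that only looks like daniel/), here is a sketch of a more alias-proof version as a hypothetical last_dir helper: command bypasses alias expansion, and the quotes protect directory names containing spaces.
last_dir() {
    # 'command ls' ignores any ls alias; -t sorts by modification time, newest first
    cd "$(command ls -1dt -- */ | head -n 1)" || return
}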

Jenkins adding single quotes to bash shell script

My shell script looks like this:
#!/bin/bash
USER=$1
sudo rm -rf /home/$USER/system/logs/*
exit 0
It's checked into cvs in a shell folder, and Jenkins is configured to execute it on a Linux machine via a job with an 'Execute Shell' build step:
bash -ex shell/clear-logs.sh myuser
But Jenkins is wrapping the whole sudo line in single quotes, which results in my log files not being deleted (although the Jenkins job passes successfully):
[workspace] $ /bin/sh -xe /tmp/hudson7785398405733321556.sh
+ bash -ex shell/clear-logs.sh myuser
+ USER=myuser
+ sudo rm -rf '/home/myuser/system/logs/*'
+ exit 0
Any ideas why Jenkins is doing this? If I call the script from the Jenkins workspace location as the root user, then it works fine.
EDIT:
I have the same shell script, in different cvs modules, being executed by Jenkins on the same Linux server. I have created a new job, either as freestyle or by copying an existing job where this works, but it makes no difference.
Okay, I seem to have resolved this by adding the 'jenkins' user to the 'myuser' group and restarting the Jenkins service. If the logs directory is empty, then the Jenkins console output does report the path in single quotes, as no files are found. But run the job a second time when there are files, and there are no single quotes: the files are correctly deleted.
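For reference, one way to make that group change (a sketch assuming usermod and a systemd-managed Jenkins; adjust to your init system):
sudo usermod -a -G myuser jenkins  # add the jenkins user to the myuser group
sudo systemctl restart jenkins     # group membership is picked up after a restart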
Jenkins is not doing anything with your quotation marks, such as changing double to single; you are seeing the output of set -x. Try this in your shell:
set -x
ls "some string with spaces"
Output will be something like:
+ ls --color=auto 'some string with spaces'
bash is just showing you debug output of its interpretation and tokenization of your command.
Adapt the permissions of /home/$USER/... I got the following in the Console Output at first:
+ USER=geri
+ rm -rf '/home/geri/so-30802898/*'
rm: cannot remove ‘/home/geri/so-30802898/*’: Permission denied
Build step 'Execute shell' marked build as failure
After adapting the permissions the build/deletion succeeded.
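As a sketch of what adapting the permissions can look like (the exact ownership and modes depend on your setup; the path is the one from the console output above):
# give the jenkins user permission to delete files in the directory
sudo chown -R jenkins:jenkins /home/geri/so-30802898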
