I have a bash script that runs pg_dumpall and uploads the dump to my FTP server.
#!/bin/bash
timeslot=$(date +%Y-%m-%d-%H:%M:%S)
backup_dir="/var/lib/backup"
name="recycle_backup_$timeslot.sql.gz"
set -e
find /var/lib/backup/*.sql.gz -mtime +7 -type f -exec rm -rvf {} \; -exec curl -p --insecure ftp://aaa.bbb/backup/{} --user user:password -Q 'DELE backup/'{} \;
echo $timeslot "Running docker container to backup Postgresql"
sudo docker run --rm \
-t \
--name postgres-backup \
-v "/var/lib/backup":"/mnt" \
-w "/mnt" \
-e PGPASSWORD="password" \
--entrypoint=/usr/bin/pg_dumpall \
postgres:9 \
-h 1.2.3.4 -U postgres | gzip -9 > "$backup_dir/$name"
echo "Your backup is successfully stored in $backup_dir"
curl -T "/var/lib/backup/$name" ftp://aaa.bbb/backup/ --user user:password
It cleans up all old .sql.gz files (older than 7 days).
After that, the script starts a Docker container with Postgres, runs the backup commands, and saves the dump locally to /var/lib/backup/filename.sql.gz.
The problem is that I can't clean up the files on my FTP server using the arguments that find returns.
find /var/lib/backup/*.sql.gz -mtime +7 -type f -exec rm -rvf {} \; -exec curl -p --insecure ftp://aaa.bbb/backup/{} --user user:password -Q 'DELE backup/'{} \;
How can I pass the {} argument from the find command into this curl request? -Q 'DELE backup/'{} \;
Thanks for the support.
Solved the problem with a second script, which connects to FTP server and deletes outdated files in a loop from this topic Linux shell script for delete old files from ftp
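For reference, a minimal sketch of such a cleanup loop, assuming the same hypothetical host (aaa.bbb) and credentials as above. It only echoes the curl command as a dry run; drop the echo once the output looks right.

```shell
#!/bin/bash
# Dry-run sketch of the second cleanup script (hypothetical host and
# credentials). find prints full local paths, but curl's DELE wants a
# path relative to the FTP root, so keep only the file name.
backup_dir="/var/lib/backup"

delete_remote() {
    local file
    file=$(basename "$1")
    echo curl --insecure --user user:password \
        -Q "DELE backup/$file" ftp://aaa.bbb/
}

find "$backup_dir" -name '*.sql.gz' -mtime +7 -type f -print0 |
while IFS= read -r -d '' path; do
    delete_remote "$path"
done
```

The -print0 / read -r -d '' pairing keeps file names with spaces or newlines intact.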
That's not going to work as written: with find's -exec ... + form, {} can appear only once (and must come last), and the curl command needs the bare file name rather than the full local path that find produces.
Maybe try this approach? Remove the echo if this looks sane ...
find /var/lib/backup/*.sql.gz -mtime +7 -type f -exec rm -rvf {} + | awk -F"'" '{print $2}' | while read -r path
do
echo curl -p --insecure ftp://aaa.bbb/ --user user:password -Q "DELE backup/${path##*/}"
done
I'm trying to write a line in my bash script that will take all the subdirectories that exist in the working directory that are older than 7 days and zip them up into one zip file, then delete those subdirectories.
Any guidance would be much appreciated!
You should rewrite your command as:
find "$WORK_DIR" \
-type d \
-mtime +7 \
-exec bash -c "zip -q -m -j -J $WORK_DIR/$NEWZIP.zip {} && rm -rf {}" \;
Where {} is the file (directory) name placeholder.
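If directory names may contain spaces, the same idea is safer with the name passed as a positional argument instead of splicing {} into the shell string. A self-contained sketch on a throwaway tree (printf stands in for the real zip && rm pair; GNU touch -d assumed):

```shell
# Same -exec bash -c pattern, but the directory name arrives as "$1",
# so names with spaces survive word splitting. printf stands in for
# "zip ... && rm -rf ..." while checking the selection.
work_dir=$(mktemp -d)
mkdir "$work_dir/old stuff" "$work_dir/fresh"
touch -d '10 days ago' "$work_dir/old stuff"   # age one directory

find "$work_dir" -mindepth 1 -maxdepth 1 -type d -mtime +7 \
    -exec bash -c 'printf "would zip: %s\n" "$1"' _ {} \;
```

Only "old stuff" is printed; "fresh" was just created and fails -mtime +7.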
I have a one-liner command that pulls all the git repos in a directory.
But at the end I would like some sort of summary displaying all errors.
I've already gotten quite far:
cd ~/bitbucket; \
find . \
-maxdepth 2 -type d \
-name ".git" \
-execdir python3 -c 'import os; print("\33[33m---------------------------------------------- " + os.path.abspath(".") + " ----------------------------------------------\33[0m")' \; \
-execdir git checkout development \; \
-execdir git branch \; \
-execdir git pull \; \
-execdir git status \; \
-execdir echo \; \
-execdir echo \; \
| grep -i -e "^" -e "error"
This will go into the right dir and for each git repo it will do all the git stuff and highlight the word error.
Two issues remain: the output coloring from git is gone (for example from the git branch command), and the errors aren't displayed at the end.
Is there a solution to this?
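One possible approach, sketched under the assumption of GNU tools: git suppresses color when its output is piped, so force it per invocation with -c color.ui=always, and tee the whole run into a log file so the errors can be grepped out at the end.

```shell
# Sketch: keep git's color through the pipe and print an error
# summary at the end. The repo root defaults to a throwaway
# directory here; pass your real directory as the first argument.
root=${1:-$(mktemp -d)}
log=$(mktemp)

find "$root" -maxdepth 2 -type d -name ".git" \
    -execdir git -c color.ui=always pull \; 2>&1 | tee "$log"

echo "---------------- summary ----------------"
grep -i error "$log" || echo "no errors"
```

The same -c color.ui=always trick works for the branch and status invocations as well.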
I am trying to run this command
find . -name "new_impl.jar" | xargs -I '{}' sh -c 'java -jar jd-cli.jar --skipResources -n -g ALL '{}';'
It is not working, and the error suggests it is not able to pick up the value of '{}'. When I remove the sh -c part, which is what lets me run multiple commands:
find . -name "new_impl.jar" | xargs -I '{}' java -jar jd-cli.jar --skipResources -n -g ALL '{}'
This command works fine. I am using Oracle Linux 7. Can someone explain the reason behind this, and is there another way to run multiple commands?
I'd recommend passing the file name as an argument to sh.
find . -name "new_impl.jar" |
xargs -I '{}' \
sh -c 'java -jar jd-cli.jar --skipResources -n -g ALL "$1";rm "$1";mv *.jar "$1";unzip "$1" -d "$1".bk/;rm "$1"' _ {}
Note this will also work for using -exec from find instead.
find -name "new_impl.jar" \
    -exec sh -c 'java -jar jd-cli.jar --skipResources -n -g ALL "$1";rm "$1";mv *.jar "$1";unzip "$1" -d "$1".bk/;rm "$1"' _ {} \;
I have the following script that works fine when called from the command line:
#!/bin/sh
/usr/bin/mysqldump -u root -ppassword redmine > /home/administrateur/backup/backup_$(date +%Y-%m-%d-%H.%M.%S).sql
find /home/administrateur/backup/* -mtime +15 -exec rm {} \;
rsync -e 'ssh -p 22' -avzp /home/administrateur/backup is-uber-1:/home/administrateur/backup
But this script omits the rsync line when called from cron.
Does anyone know why?
Basically you need to run your script as administrateur, so that rsync runs with that user's environment and SSH keys. You can use sudo for it:
/usr/bin/sudo -H -u administrateur -- /bin/sh /path/to/your/script.sh
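In cron that would look something like this (the schedule and the script path are placeholders):

```crontab
# root's crontab: run the backup as administrateur so rsync uses that
# user's SSH keys (cron jobs don't inherit your login environment)
0 2 * * * /usr/bin/sudo -H -u administrateur -- /bin/sh /path/to/your/script.sh
```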
I'm trying to copy certain files from one directory to another. Using this command
find "$HOME" -name '*.txt' -type f -print0 | xargs -0 cp -t $HOME/newdir
I get a warning message saying
cp: '/home/me/newdir/logfile.txt' and '/home/me/newdir/logfile.txt'
are the same file
How to avoid this warning message?
The problem is that you try to copy a file to itself. You can avoid it by excluding the destination directory from the results of the find command like this:
find "$HOME" -name '*.txt' -type f -not -path "$HOME/newdir/*" -print0 | xargs -0 cp -t "$HOME/newdir"
Try using install instead; it replaces the target by removing the file first.
install -v target/release/dynnsd-client target/
removed 'target/dynnsd-client'
'target/release/dynnsd-client' -> 'target/dynnsd-client'
and then remove the source files
Make it unique in the process. But this requires sorting:
find "$HOME" -name '*.txt' -type f -print0 | sort -zu | xargs -0 cp -t "$HOME/newdir"
Or, if it's not about generated files, try the -u option of cp.
find "$HOME" -name '*.txt' -type f -print0 | xargs -0 cp -ut "$HOME/newdir"
-u copy only when the SOURCE file is newer than the destination file or when
the destination file is missing
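A quick throwaway demonstration of that -u behavior (GNU cp and touch -d assumed):

```shell
# cp -u copies only when the source is newer than the destination
# (or the destination is missing), shown here on temp files.
src=$(mktemp -d); dst=$(mktemp -d)
echo new > "$src/a.txt"
echo old > "$dst/a.txt"
touch -d '1 hour ago' "$dst/a.txt"   # make the destination older

cp -uv "$src/a.txt" "$dst/"          # copies: source is newer
cat "$dst/a.txt"                     # → new
```

If the destination were newer than the source, cp -u would silently skip it.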
install worked perfectly in a Makefile context with Docker - thanks!
copy:
	#echo ''
	bash -c 'install -v ./docker/shell .'
	bash -c 'install -v ./docker/docker-compose.yml .'
	bash -c 'install -v ./docker/statoshi .'
	bash -c 'install -v ./docker/gui .'
	bash -c 'install -v ./docker/$(DOCKERFILE) .'
	bash -c 'install -v ./docker/$(DOCKERFILE_SLIM) .'
	bash -c 'install -v ./docker/$(DOCKERFILE_GUI) .'
	bash -c 'install -v ./docker/$(DOCKERFILE_EXTRACT) .'
	#echo ''

build-shell: copy
	docker-compose build shell
Try using rsync instead of cp:
find "$HOME" -name "*.txt" -exec rsync {} "$HOME"/newdir \;