Shell Scripting for archiving and encrypting - linux

I need to create a script which receives from the CLI the name of a file with the extension .tar.gz and a folder (e.g. ./archivator.sh box.tar.gz MyFolder). The script will archive the files from the folder (only the files WITHIN the folder, and without any compression) and they will be moved into the archive received as a parameter. The archive will then be encrypted (using aescrypt) with the password 'apple'.
OS: Debian 6
Note: The final encrypted archive will have the same name as the first given parameter.
What I have tried so far is this:
tar -cvf $1 $2/* | aescrypt -e -p apple - > $1.aes | mv $1.aes $1
And this is what I receive when I am trying to check my script:
tar: This does not look like a tar archive
tar: Exiting with a failure status due to previous errors

Try doing this:
tar cf - $2/* | aescrypt -e -p apple - > $1
The - passed to aescrypt means STDIN (tar writes the archive to its STDOUT, and aescrypt reads it from there).
Works well on Linux (archlinux) with GNU tar 1.26
If it doesn't work, run the script in debug mode:
bash -x script.sh
then come back and post the output.

After a little research, here is the solution:
pushd "$2"
tar cvf "$1" .    # archive only the folder's contents, without compression
openssl aes-256-cbc -in "$1" -out "$1.enc" -pass pass:apple
mv "$1.enc" "$1"  # the encrypted archive keeps the original name
popd
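For reference, if aescrypt is required as in the original question, a minimal archivator.sh built around the one-liner answer above might look like the sketch below; the quoting, the -C option, and the assumption that aescrypt is on the PATH are additions, not part of the answers:
#!/bin/bash
# Usage: ./archivator.sh box.tar.gz MyFolder
archive="$1"
folder="$2"
# tar writes an uncompressed archive of the folder's contents to stdout,
# and aescrypt encrypts that stream from stdin with the password 'apple'.
tar cf - -C "$folder" . | aescrypt -e -p apple - > "$archive"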

Your error seems to signal that tar is receiving a file which is not a tar archive, yet it expects one. Have you checked to make sure the file you're providing is a tar archive?

Related

Creating compressed tar file, with only subset of files, remotely over SSH

I've successfully managed to transfer a tar file over SSH on stdout from a remote system, creating a compressed file locally, by doing something like this:
read -s sudopass
ssh me@remote "echo $sudopass | sudo -S tar cf - '/dir'" 2>/dev/null | XZ_OPT='-6 -T0 -v' xz > dir.tar.xz
As expected this gets me a dir.tar.xz locally which is all of the remote /dir compressed.
I've also managed to figure out how to locally only compress a subset of files, by passing a filelist to tar with -T on STDIN:
find '/dir' -name '*.log' | XZ_OPT='-6 -T0 -v' tar cJvf /root/logs.txz -T -
My main question is: how would I go about doing the first thing (transfer a plain tar remotely, then compress locally) while at the same time telling tar that I only want to do it on a specific subset of files?
When I try combining the two:
ssh me@remote "echo $sudopass | sudo -S find '/dir' -name '*.log' | tar cf -T -" | XZ_OPT='-6 -T0 -v' xz > cypress_logs.tar.xz
I get errors like:
tar: -: Cannot stat: No such file or directory
I feel like tar isn't liking the fact that I'm both passing it something on STDIN as well as expecting it to output to STDOUT. Adding another - didn't seem to help either.
Also, as a bonus question, if anyone has a better idea on how to pass $sudopass above that would be great, since this method -- while avoiding having the password in the bash history -- makes the sudo password show up in the process list while it's running.
Remember that the f option requires an argument, so when you write cf -T -, I suspect that the -T is getting consumed as the argument to f, which throws off the rest of the command line.
This works for me:
ssh me@remote "echo $password | sudo -S find /tmp/dir -name '*.log' | tar -cf- -T-"
You could also write it like this:
ssh me@remote "echo $password | sudo -S find /tmp/dir -name '*.log' | tar cf - -T-"
But I prefer to always use - for options, rather than legacy tar's weird options without any prefix.
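Putting the corrected tar invocation back into the original pipeline, the whole command might look something like this (a sketch that reuses the question's host, paths, and XZ_OPT settings):
# the remote find feeds the file list to tar via -T -, tar streams the archive
# to stdout, and xz compresses it locally
ssh me@remote "echo $sudopass | sudo -S find '/dir' -name '*.log' | tar -cf- -T-" | XZ_OPT='-6 -T0 -v' xz > cypress_logs.tar.xz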

How to exclude a specific file in scp linux shell command?

I am trying to execute the scp command in such a way that it copies .csv files from source to sink, except for a few specific CSV files.
For example, the source folder contains four files:
file1.csv, file2.csv, file3.csv, file4.csv
Out of those four files, I want to copy all files, except file4.csv, to the sink location.
When I use the scp command below:
scp /tmp/source/*.csv /tmp/sink/
it copies all four CSV files to the sink location.
How can I achieve the same by using the scp command or through writing a shell script?
You can use rsync with the --exclude switch, e.g.
rsync /tmp/source/*.csv /tmp/sink/ --exclude file4.csv
Bash has an extended globbing feature which allows for this. On many installations, you have to separately enable this feature with
shopt -s extglob
With that in place, you can
scp /tmp/source/!(fnord*).csv /tmp/sink/
to copy all *.csv files except fnord.csv.
This is a shell feature; the shell will expand the glob to a list of matching files - scp will have no idea how that argument list was generated.
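Applied to the files from the question, that might look like this (assuming extglob has been enabled as above):
shopt -s extglob
scp /tmp/source/!(file4).csv /tmp/sink/    # copies file1.csv, file2.csv and file3.csv, but not file4.csv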
As mentioned in your comment, rsync is not an option for you. The solution presented by tripleee works only if the source is on the client side. Here I present a solution using ssh and tar. tar does have the --exclude flag, which allows us to exclude patterns:
from server to client:
$ ssh user@server 'tar -cf - --exclude "file4.csv" /path/to/dir/*csv' \
| tar -xf - --transform='s#.*/##' -C /path/to/destination
This essentially creates a tarball which is sent over /dev/stdout, which we pipe into a tar extract. To mimic scp we need to remove the full path using --transform (see U&L). Optionally you can add the destination directory.
from client to server:
We do essentially the same, but reverse the roles:
$ tar -cf - --exclude "file4.csv" /path/to/dir/*csv \
| ssh user@server 'tar -xf - --transform="s#.*/##" -C /path/to/destination'
You could use a bash array to collect your larger set, then remove the items you don't want. For example:
files=( /tmp/src/*.csv )
for i in "${!files[@]}"; do
[[ ${files[$i]} = *file4.csv ]] && unset files[$i]
done
scp "${files[@]}" host:/tmp/sink/
Note that our for loop steps through array indices rather than values, so that we'll have the right input for the unset command if we need it.

How do I get the files from SFTP server and move them to another folder in bash script?

How do I get files one by one from an SFTP server and move them to another folder in an Ubuntu bash script?
#!bin/sh
FOLDER=/home/SFTP/Folder1/
sftp SFTP@ip_address
cd /home/FSTP/Folder1/
for file in "$FOLDER"*
<<EOF
cd /home/local/Folder1
get $file
EOF
mv $file /home/SFTP/Done
done
I know it's not right, but I've tried my best, and if anyone can help me, I would appreciate it. Thanks in advance.
OpenSSH sftp is not a very powerful client for such tasks. You would have to run it twice: first to collect the list of files, then use the list to generate a list of commands and execute those in a second run.
Something like this:
# Collect list of files
files=`sftp -b - user@example.com <<EOF
cd /source/folder
ls
EOF`
files=`echo $files|sed "s/.*sftp> ls//"`
# Use the list to generate list of commands for the second run
(
echo cd /source/folder
for file in $files; do
echo get $file
echo rename $file /backup/folder/$file
done
) | sftp -b - user@example.com
Before you run the script on production files, I suggest you first output the generated command list to a file to check whether the results are as expected.
Just replace the last line with:
) > commands.txt
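You can then review that file and, once it looks right, feed it back to sftp in batch mode (a sketch reusing the same user@example.com placeholder as above):
cat commands.txt                          # inspect the generated get/rename commands
sftp -b commands.txt user@example.com     # run them in a second pass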
Maybe use the SFTP internal get command.
sftp get -r $remote_path $local_path
OR with the -f option to flush files to disk
sftp get -rf $remote_path $local_path

Tar Error In First Compress With Command Sed And "Argument list too long" When I Set 100.000 Distance In Sed

I have 200,000 rows in the file not_found_test1.txt.
I am running the command below, but I get an error in the first result:
tar czvf /home/bukanadmin/test.tar.gz -T $(sed -n 1,10p /home/bukanadmin/not_found_test1.txt)
This is the error I got:
tar: RT #StCecilias_PE\: Sara McBay y10 finished an impressive 4th in the JG 75m hurdles final. Sara only took up the hurdles a few months ago! #dedicated #workshard: Cannot stat: No such file or directory
tar: By stcecilias_re on 11-May-2018 17\:49: Cannot stat: No such file or directory
tar: at http\://twitter.com/stcecilias_re/statuses/994892363523874816: Cannot stat: No such file or directory
tar: : Cannot stat: No such file or directory
2018/05/2018-05-11/TWITTER.DATA_POST/abfeda55a6f5b9ad1622f5484c7452f1.txt
2018/05/2018-05-11/TWITTER.DATA_POST/73a38258c9e91110065c3973b90fc841.txt
2018/05/2018-05-11/TWITTER.DATA_POST/240ae384d7e1e1d2f5f4fa1f70e7f0e8.txt
2018/05/2018-05-11/TWITTER.DATA_POST/e5a6f6c8bccc3c1d0ed9f11eb543c0a2.txt
2018/05/2018-05-11/TWITTER.DATA_POST/23a051f72192affbe2e57e91df62e372.txt
2018/05/2018-05-11/TWITTER.DATA_POST/f629b60d212a04dc4d42695f348446f3.txt
2018/05/2018-05-11/TWITTER.DATA_POST/c7037ea6e3912496fc546b7135a763f3.txt
2018/05/2018-05-11/TWITTER.DATA_POST/93675eeb45dbd6385cbf37b0d9d39341.txt
2018/05/2018-05-11/TWITTER.DATA_POST/ded62f41db4a069bd4fd36e83661cdd2.txt
tar: Exiting with failure status due to previous errors
And when I remove sed from the tar command, there is no issue:
tar czvf /home/bukanadmin/test.tar.gz -T /home/bukanadmin/not_found_test1.txt
When I try another command with tar, like head, I get the same issue.
Can someone help me and explain, please?
**NEW ISSUE :)**
The last issue is solved; this works:
tar czvf /home/bukanadmin/test.tar.gz $(sed -n 1,10p /home/bukanadmin/not_found_test1.txt)
Now I get an error when I change my code to:
tar czvf /home/bukanadmin/test.tar.gz $(sed -n 100000,200000p /home/bukanadmin/not_found_test1.txt)
This is the error:
-bash: /usr/bin/tar: Argument list too long
Using -T is meant to read the entire file, so trying to grab just the first ten lines won't work.
You can likely eliminate -T altogether and simply do:
tar czvf file.tar.gz $( sed -n 1,10p file )
...or using head...
tar czvf file.tar.gz $( head -10 file )
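If you do want to keep -T, for instance to avoid the "Argument list too long" error on a 100,000-line range, you can feed the selected lines to tar on stdin instead of expanding them on the command line. A sketch using the question's paths:
# sed selects the range of lines; tar reads the file list from stdin via -T -
sed -n 100000,200000p /home/bukanadmin/not_found_test1.txt | tar czvf /home/bukanadmin/test.tar.gz -T -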

MySQL Dump to tar.gz from remote without shell access

I'm trying to get a dump from MySQL to my local client. This is what I currently have:
mysqldump -u $MyUSER -h $MyHOST -p$MyPASS $db | gzip -9 > $FILE
What I want though is a .tar.gz instead of a plain gzip archive. I have shell access on the local client but not on the server, so I can't do a remote tar and copy it here. Is there a way of piping the gzip output into a tar.gz? (Currently, the .gz does not get recognized as a tar archive.)
Thanks.
If you are issuing the above command on the client side, the compression is done on the client side. mysqldump connects to the remote server and downloads the data without any compression.
mysqldump -u $MyUSER -h $MyHOST -p$MyPASS $db > filename
tar cfz filename.tar.gz filename
rm filename
Probably some unix gurus will have a one liner to do it.
No. The files (yes, plural, since tar is usually used for more than one file) are first placed in a tar archive, and then that is compressed. If you are trying to use the tar command line tool then you will need to save the result in a temporary file and then tar that.
Personally though, I'd rather hit the other side with a cluebat.
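A sketch of that temporary-file approach, reusing the variables from the question (mktemp and the file names inside the temporary directory are assumptions):
tmp=$(mktemp -d) || exit 1
mysqldump -u "$MyUSER" -h "$MyHOST" -p"$MyPASS" "$db" > "$tmp/$db.sql"
tar czf "$FILE" -C "$tmp" "$db.sql"    # $FILE is the desired .tar.gz name
rm -rf "$tmp"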
mysqldump -u $MyUSER -h $MyHOST -p$MyPASS $db | tar -zcvf $FILE -
Where $FILE is your filename.tar.gz
Archived backup and renamed by time and date:
/usr/bin/mysqldump -u $MyUSER -h $MyHOST -p$MyPASS $db | gzip -c > /home/backup_`/bin/date +"\%Y-\%m-\%d_\%H:\%M"`.gz
