How to exclude a specific file in the scp linux shell command?

I am trying to execute the scp command in such a way that it copies .csv files from source to sink, except for a few specific CSV files.
For example, in the source folder I have four files:
file1.csv, file2.csv, file3.csv, file4.csv
Out of those four, I want to copy all files except file4.csv to the sink location.
When I use the scp command below:
scp /tmp/source/*.csv /tmp/sink/
it copies all four CSV files to the sink location.
How can I achieve this using the scp command, or by writing a shell script?

You can use rsync with the --exclude switch, e.g.
rsync /tmp/source/*.csv /tmp/sink/ --exclude file4.csv
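If the sink is actually on a remote host (the usual scp situation), the same idea works over ssh; a rough sketch, where user@remotehost is a placeholder and rsync's filter rules do the selection:
# Exclude file4.csv, keep the other .csv files, skip everything else.
# Filter rules are evaluated left to right; the first match wins.
rsync -av --exclude='file4.csv' --include='*.csv' --exclude='*' /tmp/source/ user@remotehost:/tmp/sink/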

Bash has an extended globbing feature which allows for this. On many installations, you have to separately enable this feature with
shopt -s extglob
With that in place, you can
scp /tmp/source/!(fnord*).csv /tmp/sink/
to copy all *.csv files except fnord.csv (or anything else starting with fnord).
This is a shell feature; the shell will expand the glob to a list of matching files - scp will have no idea how that argument list was generated.
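Applied to the original file names, a minimal sketch might look like this (user@remotehost is a placeholder; the glob is expanded locally, so the files must be on the local side of the copy):
shopt -s extglob                               # enable extended globbing in this shell
# !(file4).csv expands to every .csv in /tmp/source except file4.csv
scp /tmp/source/!(file4).csv user@remotehost:/tmp/sink/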

As mentioned in your comment, rsync is not an option for you. The solution presented by tripleee works only if the source is on the client side. Here I present a solution using ssh and tar. tar does have the --exclude flag, which allows us to exclude patterns:
from server to client:
$ ssh user@server 'tar -cf - --exclude "file4.csv" /path/to/dir/*csv' \
| tar -xf - --transform='s#.*/##' -C /path/to/destination
This essentially creates a tarball which is sent over /dev/stdout and which we pipe into a tar extract. To mimic scp we need to remove the full path using --transform (see U&L). Optionally you can add the destination directory.
from client to server:
We do essentially the same, but reverse the roles:
$ tar -cf - --exclude "file4.csv" /path/to/dir/*csv \
| ssh user@server 'tar -xf - --transform="s#.*/##" -C /path/to/destination'

You could use a bash array to collect your larger set, then remove the items you don't want. For example:
files=( /tmp/src/*.csv )
for i in "${!files[#]}"; do
[[ ${files[$i]} = *file4.csv ]] && unset files[$i]
done
scp "${files[#]}" host:/tmp/sink/
Note that our for loop steps through array indices rather than values, so that we'll have the right input for the unset command if we need it.
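If you want to verify the selection before anything is copied, you can print the surviving array elements first, e.g.:
printf '%s\n' "${files[@]}"    # file4.csv should no longer be listed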

Related

How do I get the files from SFTP server and move them to another folder in bash script?

How do I get files one by one from an SFTP server and move them to another folder in an Ubuntu bash script?
#!bin/sh
FOLDER=/home/SFTP/Folder1/
sftp SFTP@ip_address
cd /home/FSTP/Folder1/
for file in "$FOLDER"*
<<EOF
cd /home/local/Folder1
get $file
EOF
mv $file /home/SFTP/Done
done
I know it's not right, but I've tried my best, and if anyone can help me I will appreciate it. Thanks in advance.
OpenSSH sftp is not a very powerful client for such tasks. You would have to run it twice: first to collect a list of files, then use that list to generate a list of commands and execute those in a second run.
Something like this:
# Collect list of files
files=`sftp -b - user@example.com <<EOF
cd /source/folder
ls
EOF`
files=`echo $files|sed "s/.*sftp> ls//"`
# Use the list to generate list of commands for the second run
(
echo cd /source/folder
for file in $files; do
echo get $file
echo rename $file /backup/folder/$file
done
) | sftp -b - user@example.com
Before you run the script on production files, I suggest you first output the generated command list to a file to check whether the results are as expected.
Just replace the last line with:
) > commands.txt
Maybe use sftp's internal get command (run within an sftp session):
get -r $remote_path $local_path
or, with the -f option to flush files to disk:
get -rf $remote_path $local_path

scp multiple files with different names from source and destination

I am trying to scp multiple files from source to destination. The scenario is that the source file name is different from the destination file name.
Here is the scp command I am trying:
scp /u07/retail/Bundle_de.properties rgbu_fc@<fc_host>:/u01/projects/MultiSolutionBundle_de.properties
Basically I have more than 7 files that I am currently transferring with separate scp commands, so I want to combine them into a single scp to transfer all the files.
A few of the scp commands I am trying:
$ scp /u07/retail/Bundle_de.properties rgbu_fc@<fc_host>:/u01/projects/MultiSolutionBundle_de.properties
$ scp /u07/retail/Bundle_as.properties rgbu_fc@<fc_host>:/u01/projects/MultiSolutionBundle_as.properties
$ scp /u07/retail/Bundle_pt.properties rgbu_fc@<fc_host>:/u01/projects/MultiSolutionBundle_pt.properties
$ scp /u07/retail/Bundle_op.properties rgbu_fc@<fc_host>:/u01/projects/MultiSolutionBundle_op.properties
I am looking for a solution by which I can transfer the above 4 files in a single scp command.
Looks like a straightforward loop in any standard POSIX shell:
for i in de as pt op
do scp "/u07/retail/Bundle_$i.properties" "rgbu_fc#<fc_host>:/u01/projects/MultiSolutionBundle_$i.properties"
done
Alternatively, you could give the files new names locally (copy, link, or move), and then transfer them with a wildcard:
dir=$(mktemp -d)
for i in de as pt op
do cp "/u07/retail/Bundle_$i.properties" "$dir/MultiSolutionBundle_$i.properties"
done
scp "$dir"/* "rgbu_fc#<fc_host>:/u01/projects/"
rm -rf "$dir"
With GNU tar, ssh and bash:
tar -C /u07/retail/ -c Bundle_{de,as,pt,op}.properties | ssh user@remote_host tar -C /u01/projects/ --transform 's/.*/MultiSolution\&/' --show-transformed-names -xv
If you want to use globbing (*) with filenames:
cd /u07/retail/ && tar -c Bundle_*.properties | ssh user@remote_host tar -C /u01/projects/ --transform 's/.*/MultiSolution\&/' --show-transformed-names -xv
-C: change to directory
-c: create a new archive
Bundle_{de,as,pt,op}.properties: bash expands this to Bundle_de.properties Bundle_as.properties Bundle_pt.properties Bundle_op.properties before executing the tar command
--transform 's/.*/MultiSolution\&/': prepend MultiSolution to filenames
--show-transformed-names: show filenames after transformation
-xv: extract files and verbosely list files processed
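Since --transform takes a sed-style s/// expression (where & in the replacement stands for the whole matched name), you can preview what such a substitution does to a single filename with sed itself, for example:
$ echo Bundle_de.properties | sed 's/.*/MultiSolution&/'
MultiSolutionBundle_de.properties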

Automate SCP copying of files from multiple directories (in brackets) to appropriate directories

I have a bash script used to copy some files from different directories on a remote host. All of them have the same parent, so I put them into a list:
LIST={ADIR,BDIR,CDIR}
and I use the scp command
sshpass -p $2 scp -o LogLevel=debug -r $1@192.168.121.1:/$PATH/$LIST/*.txt /home/test/test
That command lets me copy all of the .txt files from ADIR, BDIR, CDIR to my test directory. Is there any option that can put the .txt files into the appropriate directory, like /home/test/test/ADIR or /home/test/test/BDIR ...?
Have you considered using rsync?
You could try something along these lines:
# Rsync Options
# -a, --archive archive mode; equals -rlptgoD (no -H,-A,-X)
# -D same as --devices --specials
# -g, --group preserve group
# -l, --links copy symlinks as symlinks
# -o, --owner preserve owner (super-user only)
# -O, --omit-dir-times omit directories from --times
# -p, --perms preserve permissions
# -r, --recursive recurse into directories
# -t, --times preserve modification times
# -u, --update skip files that are newer on the receiver
# -v, --verbose increase verbosity
# -z, --compress compress file data during the transfer
for DIR in 'ADIR' 'BDIR' 'CDIR'
do
rsync -zavu --rsh="ssh -l {username}" 192.168.121.1:/$PATH/$DIR /home/test/test/
done
Finally my working code:
SOURCE='/usr/.../'
DEST='/home/test/test'
DIRS_EXCLUDED='test/ADIR test/BDIR'
EXTENSIONS_EXCLUDED='*.NTX *.EXE'
EXCLUDED_STRING=''
for DIR in $DIRS_EXCLUDED
do
EXCLUDED_STRING=$EXCLUDED_STRING'--exclude '"$DIR"' '
done
for EXTENSION in $EXTENSIONS_EXCLUDED
do
EXCLUDED_STRING=$EXCLUDED_STRING'--exclude '"$EXTENSION"' '
done
rsync -zavu $EXCLUDED_STRING --rsh="sshpass -p $2 ssh -l $1" 192.168.xxx.xxx:$SOURCE $DEST

Send multiple files to multiple locations using scp

I need to send multiple files to multiple locations, but can't find the proper way.
E.g. I need to send file1 to location1 and file2 to location2. This is what I am doing:
scp file1 file2 root@192.168.1.114:/location1 /location2
But this is not working. Any suggestion?
It's not possible to send to multiple remote locations with a single scp command. It can accommodate multiple source files but only one destination. Using just the scp command itself, it must be run twice.
scp file1 file2 user@192.168.1.114:/location1
scp file1 file2 user@192.168.1.114:/location2
It is possible to turn this into a "one-liner" by running a for loop. In bash for example:
for REMOTE in "user@192.168.1.114:/location1" "user@192.168.1.114:/location2"; do scp file1 file2 $REMOTE; done
scp still runs multiple times but the loop takes care of the iteration. That said, I find it easier to run the command once, hit the Up Arrow (which brings the original command back up in most environments) and just change the remote location and resend.
You can put your scp commands in a .sh file and execute the scp commands with the user name and password in one line:
vim ~/fileForSCP.sh
#!/bin/sh
sshpass -p {password} scp file1 root@192.168.1.114:/location1
sshpass -p {password} scp file2 root@192.168.1.114:/location2
sshpass -p {password} scp file3 root@192.168.1.114:/location3
...
sshpass -p {password} scp file{n} root@192.168.1.114:/location{n}
and then:
chmod 777 ~/fileForSCP.sh
~/fileForSCP.sh
You cannot do it with a single scp command. Just use scp twice:
scp file1 root@192.168.1.114:/location1
scp file2 root@192.168.1.114:/location2
You can build the desired remote tree structure using symlinks, for example:
ln -s file1 target/location1/file1
... etc
After this you can use a dereferencing copy to push these files:
rsync -rL target -e ssh user@host:/tmp
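Putting the two steps together, a rough sketch (file names, directories and user@host are placeholders):
# Build a local tree that mirrors the desired remote layout
mkdir -p target/location1 target/location2
ln -s "$PWD/file1" target/location1/file1    # absolute link targets so -L can resolve them
ln -s "$PWD/file2" target/location2/file2
# -L dereferences the symlinks, so the real file contents are sent
rsync -rL target -e ssh user@host:/tmp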
There is a hack to go about this, since scp targets only a single destination; be warned that this is a bit naive, but it gets the work done.
Use the && operator between two scp calls.
I use it like this:
scp fileToSend userName1@IPAddress_1:path/to/folder && scp fileToSend userName2@IPAddress_2:path/to/folder
This will send the data to both destinations, one after the other.
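Note that && only runs the second scp if the first one succeeds; if both transfers should be attempted regardless, separate the commands with ; instead:
scp fileToSend userName1@IPAddress_1:path/to/folder ; scp fileToSend userName2@IPAddress_2:path/to/folder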

rsync with --remove-sent-files option and open files

Every minute I need to copy recorded files from 3 servers to one data storage. I don't need to save original files - data processing is outside of all of them.
But when I use the --remove-sent-files option, rsync sends and removes unfinished (not yet closed) files.
I've tried to prevent sending these open files with lsof and --exclude-from, but it seems that rsync does not understand full paths in the exclude list:
--exclude-from=FILE    read exclude patterns from FILE
lsof | grep /projects/recordings/.\\+\\.\\S\\+ -o | sort | uniq
/projects/recordings/<uid>/<path>/2012-07-16 13:24:32.646970-<id>.WAV
So, the script looks like:
# get open files in src dir and put them into rsync.exclude file
lsof | grep /projects/recordings/.\\+\\.\\S\\+ -o | sort | uniq > /tmp/rsync.exclude
# sync without these files
/usr/bin/rsync -raz --progress --size-only --remove-sent-files --exclude-files=/tmp/rsync.excldude /projects/recordings/ site.com:/var/www/storage/recordings/
# change owner
ssh storage#site.com chown -hR storage:storage /var/www/storage/recordings
So, maybe I should try another tool? Or why does rsync not listen to excludes?
I'm not sure if this helps you, but here's my solution to only rsync files which are not currently being written to. I use it for tshark captures, writing to a new file every N seconds with the -a flag (e.g. tshark -i eth0 -a duration:30 -w /foo/bar/caps). Watch out for that tricky rsync, the order of the include and exclude is important, and if we want sub-directories we need to include "*/".
save_path=/foo/bar/
delay_between_syncs=30
while true;
do
sleep $delay_between_syncs
# Calculate which files are currently open (i.e. the ones currently being written to)
# and avoid uploading it. This is to ensure that when we process files on the server, they
# are complete.
echo "" > /tmp/include_list.txt
for i in `find $save_path/ -type f`
do
op=`fuser $i`
if [ "$op" == "" ]
then
#echo [+] $i is good for upload, will add it list.
c=`echo $i | sed 's/.*\///g'`
echo $c >> /tmp/include_list.txt
fi
done
echo [+] Syncing...
rsync -rzt --include-from=/tmp/include_list.txt --include="*/" --exclude \* $save_path user#server:/home/backup/foo/
echo [+] Sunk...
done
Rsync the files, then remove the ones that have been rsync'd: capture the list of transferred files, and then remove only those transferred files that are not currently open. rsync figures out what files to transfer when it gets to the directory, so your solution was bound to fail eventually even if it worked at first, as soon as a file opened after rsync started was missing from the exclude list.
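A minimal sketch of that idea, assuming GNU rsync's --out-format to record the names of transferred files and fuser to test whether a file is still open (paths and host taken from the question):
SRC=/projects/recordings/
DST=site.com:/var/www/storage/recordings/
# Transfer first and record which files were actually sent
rsync -raz --size-only --out-format='%n' "$SRC" "$DST" > /tmp/transferred.txt
# Then delete only the transferred files that nothing has open anymore
while IFS= read -r name; do
    path="$SRC$name"
    [ -f "$path" ] || continue                # skip directories and blank lines
    if ! fuser "$path" >/dev/null 2>&1; then  # fuser exits non-zero when no process holds the file
        rm -- "$path"
    fi
done < /tmp/transferred.txt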
An alternate approach would be to do a
find dir -type f -name pattern -mmin +10 | xargs -i rsync -aP {} dest:/path/to/backups
