scp multiple files with different names from source and destination - linux

I am trying to scp multiple files from source to destination. The scenario is that the source file name is different from the destination file name.
Here is the scp command I am trying:
scp /u07/retail/Bundle_de.properties rgbu_fc@<fc_host>:/u01/projects/MultiSolutionBundle_de.properties
Basically I have more than 7 files which I am transferring with separate scp commands, so I want to club them into a single scp to transfer all the files.
A few of the scp commands I am trying here -
$ scp /u07/retail/Bundle_de.properties rgbu_fc@<fc_host>:/u01/projects/MultiSolutionBundle_de.properties
$ scp /u07/retail/Bundle_as.properties rgbu_fc@<fc_host>:/u01/projects/MultiSolutionBundle_as.properties
$ scp /u07/retail/Bundle_pt.properties rgbu_fc@<fc_host>:/u01/projects/MultiSolutionBundle_pt.properties
$ scp /u07/retail/Bundle_op.properties rgbu_fc@<fc_host>:/u01/projects/MultiSolutionBundle_op.properties
I am looking for a solution by which I can transfer the above 4 files in a single scp command.

Looks like a straightforward loop in any standard POSIX shell:
for i in de as pt op
do scp "/u07/retail/Bundle_$i.properties" "rgbu_fc@<fc_host>:/u01/projects/MultiSolutionBundle_$i.properties"
done
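If what you actually want is a single connection rather than a single command line, one option is to drive sftp in batch mode and upload each file under its new name. This is an untested sketch assuming the same host and paths as above:
# One sftp session; "put local remote" renames on upload.
sftp rgbu_fc@<fc_host> <<EOF
cd /u01/projects
put /u07/retail/Bundle_de.properties MultiSolutionBundle_de.properties
put /u07/retail/Bundle_as.properties MultiSolutionBundle_as.properties
put /u07/retail/Bundle_pt.properties MultiSolutionBundle_pt.properties
put /u07/retail/Bundle_op.properties MultiSolutionBundle_op.properties
EOF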
Alternatively, you could give the files new names locally (copy, link, or move), and then transfer them with a wildcard:
dir=$(mktemp -d)
for i in de as pt op
do cp "/u07/retail/Bundle_$i.properties" "$dir/MultiSolutionBundle_$i.properties"
done
scp "$dir"/* "rgbu_fc#<fc_host>:/u01/projects/"
rm -rf "$dir"

With GNU tar, ssh and bash:
tar -C /u07/retail/ -c Bundle_{de,as,pt,op}.properties | ssh user@remote_host tar -C /u01/projects/ --transform 's/.*/MultiSolution\&/' --show-transformed-names -xv
If you want to use globbing (*) with filenames:
cd /u07/retail/ && tar -c Bundle_*.properties | ssh user@remote_host tar -C /u01/projects/ --transform 's/.*/MultiSolution\&/' --show-transformed-names -xv
-C: change to directory
-c: create a new archive
Bundle_{de,as,pt,op}.properties: bash expands this to Bundle_de.properties Bundle_as.properties Bundle_pt.properties Bundle_op.properties before executing the tar command
--transform 's/.*/MultiSolution\&/': prepend MultiSolution to filenames
--show-transformed-names: show filenames after transformation
-xv: extract files and verbosely list files processed
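The same pipeline should also work in the reverse direction (pulling from the remote host and renaming locally), with the roles of the two tar invocations swapped. An untested sketch; note that the transform no longer needs the \& escape, because the local single quotes already protect the & from the shell:
ssh user@remote_host 'cd /u07/retail/ && tar -c Bundle_*.properties' \
| tar -C /u01/projects/ --transform 's/.*/MultiSolution&/' --show-transformed-names -xv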

Related

How to exclude a specific file in scp linux shell command?

I am trying to execute the scp command in such a way that it can copy .csv files from source to sink, except a few specific CSV files.
For example, in the source folder I have four files:
file1.csv, file2.csv, file3.csv, file4.csv
Out of those four files, I want to copy all files, except file4.csv, to the sink location.
When I use the below scp command:
scp /tmp/source/*.csv /tmp/sink/
it copies all four CSV files to the sink location.
How can I achieve the same using the scp command, or by writing a shell script?
You can use rsync with the --exclude switch, e.g.
rsync /tmp/source/*.csv /tmp/sink/ --exclude file4.csv
Bash has an extended globbing feature which allows for this. On many installations, you have to separately enable this feature with
shopt -s extglob
With that in place, you can
scp /tmp/source/!(fnord).csv /tmp/sink/
to copy all *.csv files except fnord.csv.
This is a shell feature; the shell will expand the glob to a list of matching files - scp will have no idea how that argument list was generated.
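Since the expansion happens in the shell, you can preview exactly which files scp would receive before copying anything. A quick check along these lines, using the question's file names:
shopt -s extglob                 # enable extended globbing, if not already on
echo /tmp/source/!(file4).csv    # prints the matches, i.e. everything except file4.csv
scp /tmp/source/!(file4).csv /tmp/sink/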
As mentioned in your comment, rsync is not an option for you. The solution presented by tripleee works only if the source is on the client side. Here I present a solution using ssh and tar. tar does have the --exclude flag, which allows us to exclude patterns:
from server to client:
$ ssh user@server 'tar -cf - --exclude "file4.csv" /path/to/dir/*csv' \
| tar -xf - --transform='s#.*/##' -C /path/to/destination
This essentially creates a tarball which is sent over /dev/stdout and piped into a tar extract. To mimic scp, we need to remove the full path using --transform (see U&L). Optionally, you can add the destination directory.
from client to server:
We do essentially the same, but reverse the roles:
$ tar -cf - --exclude "file4.csv" /path/to/dir/*csv \
| ssh user@server 'tar -xf - --transform="s#.*/##" -C /path/to/destination'
You could use a bash array to collect your larger set, then remove the items you don't want. For example:
files=( /tmp/src/*.csv )
for i in "${!files[#]}"; do
[[ ${files[$i]} = *file4.csv ]] && unset files[$i]
done
scp "${files[#]}" host:/tmp/sink/
Note that our for loop steps through array indices rather than values, so that we'll have the right input for the unset command if we need it.
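Because the shell builds the final argument list, you can sanity-check it before copying; for example:
printf '%s\n' "${files[@]}"   # preview the surviving files before running scp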

How do I get the files from SFTP server and move them to another folder in bash script?

How do I get files one by one from an SFTP server and move them to another folder in an Ubuntu bash script?
#!bin/sh
FOLDER=/home/SFTP/Folder1/
sftp SFTP@ip_address
cd /home/FSTP/Folder1/
for file in "$FOLDER"*
<<EOF
cd /home/local/Folder1
get $file
EOF
mv $file /home/SFTP/Done
done
I know it's not right, but I've tried my best, and if anyone can help me I will appreciate it. Thanks in advance.
OpenSSH sftp is not a very powerful client for such tasks. You would have to run it twice: first to collect the list of files, then use that list to generate a list of commands and execute those in a second run.
Something like this:
# Collect list of files
files=`sftp -b - user@example.com <<EOF
cd /source/folder
ls
EOF`
files=`echo $files|sed "s/.*sftp> ls//"`
# Use the list to generate list of commands for the second run
(
echo cd /source/folder
for file in $files; do
echo get $file
echo rename $file /backup/folder/$file
done
) | sftp -b - user@example.com
Before you run the script on production files, I suggest you first output the generated command list to a file to check whether the results are as expected.
Just replace the last line with:
) > commands.txt
Maybe use sftp's internal get command. Inside an sftp session:
get -r $remote_path $local_path
Or with the -f option to flush files to disk after transfer:
get -rf $remote_path $local_path
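Since get is an internal command, one non-interactive way to run it is a batch-mode here-document. A hedged sketch with placeholder host and paths:
sftp user@example.com <<EOF
get -r /remote/path /local/path
EOF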

Executing same command for several files in same repository in linux

I'd like to execute the following command for several files in same repository in linux:
../../../../../openSMILE-2.1.0/SMILExtract -C ../../../../../openSMILE-2.1.0/config/IS13_ComParE.conf -I inputfilename.wav -D outputfilename.csv
there are several files (named 1.wav, 2.wav, 3.wav) in the directory and if I execute
../../../../../openSMILE-2.1.0/SMILExtract -C ../../../../../openSMILE-2.1.0/config/IS13_ComParE.conf -nologfile 1 -noconsoleoutput 1 -I 1.wav -D 1.csv
it outputs 1.csv.
How can I create 1.csv, 2.csv, 3.csv, .. by executing just one single command in linux? (or do I have to make .sh file?)
It's probably cleaner to put the following into a script, but you can type it directly into the bash command line as well:
#! /bin/bash
for file in *.wav ; do
prefix=${file%.wav} # Remove from the right.
../../../../../openSMILE-2.1.0/SMILExtract \
-C ../../../../../openSMILE-2.1.0/config/IS13_ComParE.conf \
-I "$file" -D "$prefix".csv
done
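Typed directly at the prompt, the same loop collapses to one line (same assumed relative paths as above):
for file in *.wav; do ../../../../../openSMILE-2.1.0/SMILExtract -C ../../../../../openSMILE-2.1.0/config/IS13_ComParE.conf -I "$file" -D "${file%.wav}.csv"; done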

ntfsundelete sh shell script failure

I am trying to write a script to undelete a lot of files from a windows partition. I was able to pull the inode numbers for all of the files and their names by using the scan function of ntfsundelete.
I then took this huge list and made a file like this:
#!/bin/sh
ntfsundelete /dev/sda2 -u -i 50365 -o file1.doc -d .
ntfsundelete /dev/sda2 -u -i 58234 -o file1.doc -d .
I did chmod +x script.sh and ran it with sh ./script.sh.
I get the error "Couldn't create output file: No such file or directory".
If I run those commands individually, it works, but if I run the script, it fails. I have 1200+ files.
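For what it's worth, a list of 1200+ commands like the one above is easier to generate than to type. A hypothetical sketch, assuming you have already reduced the ntfsundelete scan output to inode/filename pairs in scan.txt (the real scan listing has more columns, so adjust the extraction to match):
#!/bin/sh
# scan.txt is assumed to hold lines of the form: <inode> <filename>
while read -r inode name; do
    printf 'ntfsundelete /dev/sda2 -u -i %s -o "%s" -d .\n' "$inode" "$name"
done < scan.txt > script.sh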

shell sftp download remote file

How to download the latest file via sftp on the command line?
This connects to the server and lists the current directory. But how can I find the last file sorted by filename and download it?
sshpass -p $pass sftp root@$host << EOF
cd /var/www/bak/db
dir
quit
EOF
update
#!/bin/sh
pass="pwd"
host="ftps://host:22"
mkdir /ftp
cd /ftp
curlftpfs $host /ftp -o user=root:$pass
ls
error
Error connecting to ftp: gnutls_handshake() failed: An unexpected TLS packet was received.
Maybe this:
Get the latest file and save the command to a batch file:
ssh user@server "find /path/to/dir -type f -printf 'get %p\n' | sort -n | tail -1" > batchfile
And get file:
sftp -b batchfile user@server:/
I checked and it works!
It may be more convenient to use CurlFtpFS to mount the remote folder. The tutorial on "using curlftpfs to mount a FTP folder" explains the details.
And then use standard commands to achieve what you want to do.
Or for sshfs follow tutorial on how to Mount a SFTP Folder (SSH + FTP).
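For instance, with sshfs mounted, picking the newest file by name reduces to ordinary shell commands. An untested sketch; host and paths are placeholders based on the question:
mkdir -p /mnt/bak
sshfs root@host:/var/www/bak/db /mnt/bak
latest=$(ls /mnt/bak | sort | tail -n 1)   # last file sorted by filename
cp "/mnt/bak/$latest" .
fusermount -u /mnt/bak                     # unmount when done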
Not sure which sftp you mean.
