Copying all files in directory based on filename pattern match - linux

I am new to Linux commands that use pattern matching. In my local folder, I have many .txt files with certain endings:
foobar_type01.txt
foobar_type02.txt
foobar_type03.txt
baz_type01.txt
baz_type02.txt
baz_type03.txt
My goal is to scp all of the files in my directory that end with ...type03.txt to a server, so in this case only the following would be copied:
foobar_type03.txt
baz_type03.txt
What is the correct command to scp files that match a filename pattern rather than just an extension? All I have been able to do is match on the extension:
scp -C -r /my_folder/*.txt server@10.10.55.28:

The glob can match any part of the filename, not just the extension:
scp -C -r /my_folder/*type03.txt server@10.10.55.28:
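Since it is your local shell that expands the glob before scp ever runs, you can preview exactly which files will be copied by handing the same pattern to ls first:
ls -1 /my_folder/*type03.txt
If ls lists only the ...type03.txt files, the scp command above will copy exactly those.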

Related

Is there a way to download files matching a pattern through SFTP in a shell script?

I'm trying to download multiple files through SFTP on a Linux server using
sftp -o IdentityFile=key <user>@<server> <<END
get -r folder
exit
END
which will download all contents of a folder. It appears that find and grep are not valid commands inside sftp, and neither are for loops.
I need to download files whose names contain a given string, e.g.
test_0.txt
test_1.txt
but not file.txt
Do you really need the -r switch? Are there really any subdirectories in the folder? You do not mention that.
If there are no subdirectories, you can use a simple get with a file mask:
cd folder
get *test*
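Putting that together with the batch-mode invocation from the question, a minimal sketch (keeping the original <user> and <server> placeholders) would be:
sftp -o IdentityFile=key <user>@<server> <<END
cd folder
get *test*
exit
END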
Are you required to use sftp? A tool like rsync that operates over ssh has flexible include/exclude options. For example:
rsync -a <user>@<server>:folder/ folder/ \
--include='test_*.txt' --exclude='*.txt'
This requires rsync to be installed on the remote system, but that's very common these days. If rsync isn't available, you could do something similar using tar:
ssh <user>@<server> tar -cf - folder/ | tar -xvf - --wildcards '*/test_*.txt'
This tars up all the files remotely, but then only extracts files matching your target pattern on the receiving side.
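If the remote folder does contain subdirectories, note that rsync applies its filter rules in order, first match wins. A common idiom (a sketch, not part of the original answer) is to include directories so rsync can recurse, include the files you want, and exclude everything else:
# Keep directories (for recursion) and matching files; drop the rest
rsync -a <user>@<server>:folder/ folder/ \
--include='*/' --include='test_*.txt' --exclude='*'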

SCP a file that matches a particular pattern and extension

I am trying to SCP a file from a remote host onto the local host.
The file on the remote host would be, KMST_DataFile_[MMDDYY]T[HHMM].kms
I have come up with two SCP commands, but I was wondering if there's a way to combine them, to only SCP files that match both the filename pattern above and the extension .kms:
scp -v user@remotehost:/location/KMST_DataFile_*
scp -v user@remotehost:/location/{*.kms}
This will do your job:
scp -v user@remotehost:/location/KMST_DataFile_*.kms
As @manu mentioned in the comments, on Ubuntu or Mac you may need to escape the asterisk:
scp -v user@remotehost:/location/KMST_DataFile_\*.kms
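The escape matters because otherwise your local shell may try to expand the asterisk before scp ever contacts the remote host. Quoting the whole remote path achieves the same effect (here with . as the local destination):
# Quoting stops local glob expansion; the remote shell expands the pattern instead
scp -v 'user@remotehost:/location/KMST_DataFile_*.kms' .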
The main thing here is to use recursive mode -r, even if you are copying files and not directories. It works.
If you want to copy files that start with "val" and also contain the string "v2", then use:
scp -r makis@server.gr:/media/Data/results/val*v2* /Users/makis/Desktop/
Here, val*v2* will expand to only the files that start with val and also contain the string v2.
Similarly, if the files end with .png, for example, use:
scp -r makis@server.gr:/media/Data/results/val*.png /Users/makis/Desktop/
You should use \* instead of *:
scp -v user@remotehost:/location/KMST_DataFile_\*
ssh user@host 'tar cf - /location/KMST_DataFile_* /location/*.kms' | tar tvpf -
Note that these tar options only give you a table of contents. You'll want to check the listing before you extract, and almost certainly remove the absolute path.
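Once the listing looks right, switch the t (list) flag to x (extract). A sketch using the combined pattern from the earlier answer, assuming GNU tar on the receiving side, which strips the leading / from member names by default:
# Extract into the current directory; GNU tar removes the leading / itself
ssh user@host 'tar cf - /location/KMST_DataFile_*.kms' | tar xvpf -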

How to download all files from a Linux server using SCP which contain a given string

I need to download all 492 files from a Linux directory which contain a given string within the file. From my searching so far I haven't managed to find a command that can accomplish this. Could anybody help me out?
Cheers.
Use grep -l to find the files that contain the given string, then loop over them and scp each one:
for file in $(grep -l <some-pattern> <directory>/*); do scp "$file" <remote>; done
If you also need to include files in the subdirectories of the directory, add the -R option to grep.
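A more robust variant (a sketch, assuming GNU grep and xargs) uses NUL separators so that file names containing spaces survive the pipeline:
# -l: list matching file names only; -R: recurse; -Z: NUL-terminate for xargs -0
grep -lRZ '<some-pattern>' <directory> | xargs -0 -I{} scp {} <remote>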

Recursively copy contents of directory to all target directories

I have a directory containing a set of subdirectories and files. I need to recursively copy all of its content into each of the subdirectories of another directory.
How do I achieve this, preferably without using a script and only with the cp command?
You can write this in a script but you don't have to. Just write it line by line in the terminal:
# $TARGET is the directory containing subdirectories where you want to STORE the copies
# $SOURCE is the directory containing the subdirectories you want to COPY
for dir in "$TARGET"/*/; do
    cp -r "$SOURCE"/* "$dir"
done
Only uses cp and runs on both bash and zsh.
You can't. cp can copy multiple sources but will only copy to a single destination. You need to arrange to invoke cp multiple times - once per destination - for what you want to do; using, as you say, a loop or some other tool.
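If you'd rather not write the loop yourself, find can arrange the one-cp-per-destination invocations for you. A sketch equivalent to the loop above:
# Run cp once for each immediate subdirectory of $TARGET;
# "$SOURCE"/. copies the directory's contents, including dotfiles
find "$TARGET" -mindepth 1 -maxdepth 1 -type d -exec cp -r "$SOURCE"/. {} \;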
tar cf - * | ( cd /target; tar xfp - )
The first part of the command, before the pipe, instructs tar to create an archive of everything in the current directory and write it to standard output (the - in place of a file name frequently indicates stdout).
The commands within parentheses run in a subshell: the shell changes to the target directory and untars the data from standard input there, so the directory change does not affect your current shell.
The -p option in the tar extraction command directs tar to preserve permission and ownership information, if possible given the user executing the command. If you are running the command as superuser, this option is turned on by default and can be omitted.
You can also use the following command, but it tends to be quite a bit slower than tar:
cp -a * /target

Scp bulk files from current directory to another directory

I need to transfer a bunch of files from a production host to my local machine. I'm already in the directory that I need to transfer the files from. I know the names of the log files that I need to transfer to my local machine. They are log.timestamp.hostname, and these names tend to be long. How can I transfer them in bulk using scp? Is there an easier way than typing out the long file names? Can I match them with a filename pattern?
Use wildcards:
scp log.* user@host:/target/directory
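If a single wildcard matches too much, brace expansion can also save typing when the names share a prefix (the timestamp and hostname here are made up for illustration):
# Expands to log.20230101.host1 and log.20230102.host1 before scp runs
scp log.2023010{1,2}.host1 user@host:/target/directory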
If you don't want to copy over all of the files in the current directory (which would just be ./*), you can have find match the files whose names fit the log.timestamp.hostname pattern with a regular expression and hand the result to scp. Something along the lines of:
scp $(find . -regextype sed -regex ".*/log\.[a-z0-9-]*\.[a-z0-9-]*") user@remote:~/
You will probably want to tweak the regex to match your exact timestamps and hostnames.
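A simple way to do that is to run the find command by itself first, and only wrap it in scp once it prints the files you expect:
# Dry run: check what the regex actually matches before copying anything
find . -regextype sed -regex ".*/log\.[a-z0-9-]*\.[a-z0-9-]*"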
This command-line approach helped solve my issue of transferring a subset of files. As AIX does not provide the -regextype option with find, I used grep instead in order to retrieve the files tab1.msg to tab9.msg:
scp $(find . -name "*" | grep 'tab.\.msg') user@host:/tmp
