scp files using wildcards, where the destination contains the wildcard result - linux

How can a user download many text files at once from a remote host using scp from the terminal with wildcards, and use the wildcard match to save each file into the local directory of the same name (assume it already exists)? The remote directories also contain other files with different names. For instance:
4 files in remote host:
[Remote-host]:1Dir/File.txt -> [Local-host]:1Dir/File.txt
[Remote-host]:2Dir/File.txt -> [Local-host]:2Dir/File.txt
[Remote-host]:3Dir/File.txt -> [Local-host]:3Dir/File.txt
[Remote-host]:4Dir/File.txt -> [Local-host]:4Dir/File.txt
I have tried the following, to no avail. Please assist:
scp [remote-host]:'*Dir/File.txt' '*Dir/'

Try the following to retrieve your files:
scp user@host:~"/*Dir/*.txt" .
Or you can try:
scp user@host:"~/*Dir/*.txt" .
It really depends on how your user account is mapped in your environment.
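Note that both commands drop every file into the current directory, which loses the per-directory mapping the question asks for. If you must stick with scp, a small loop is one workaround (a sketch; it assumes the local 1Dir through 4Dir directories already exist, as stated in the question):
for d in 1Dir 2Dir 3Dir 4Dir; do
    scp "user@host:$d/File.txt" "$d/"   # each file lands in the matching local directory
done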

Thanks @thatotherguy for the great answer.
For anyone else that's interested, the following rsync command works:
rsync -a --include '*Dir/' --include 'File.txt' --exclude '*' [Remote-host]: '*Dir'
This means: include all directories matching '*Dir' and files named 'File.txt', and exclude everything else. Note that this creates a new directory called *Dir in which all of 1Dir, 2Dir, 3Dir, etc. are contained.
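If you would rather have 1Dir, 2Dir, etc. created directly in the current directory instead of inside a literal *Dir folder, pointing the destination at . should do it (an untested variant of the same command):
rsync -a --include '*Dir/' --include 'File.txt' --exclude '*' [Remote-host]: .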

Related

rsync multiple files from multiple directories in linux

I have multiple directories named by date (e.g. 2017-09-05), and inside those directories are multiple log.gz files from BRO IDS. I am trying to enter each directory, get only specific log.gz files by name, and send those to a remote system using rsync. A log file looks like this: app_stats.00:00:00-01:00:00.log.gz. I am attempting to use wildcards to accomplish this.
Ex: rsync -avh --ignore-existing -e ssh -r /home/data/logs/2017-09-* {dns,http}.*.log.gz / root@10.23.xx.xx:/home/pnlogs/
This is close, but it's just copying all the files in each folder and ignoring my attempt at getting just the http and dns logs, as seen in the example. Is this possible to do in one line? Would I need multiple?
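A hedged sketch of a one-line approach, borrowing the --include/--exclude technique from the rsync answer above (host and paths taken from the question; untested):
rsync -avh --ignore-existing -e ssh --include='*/' --include='dns.*.log.gz' --include='http.*.log.gz' --exclude='*' /home/data/logs/2017-09-* root@10.23.xx.xx:/home/pnlogs/
Here --include='*/' keeps the date directories themselves, the two file patterns match only the dns and http logs, and --exclude='*' drops everything else.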

copy directory from another computer on Linux

On a computer with IP address like 10.11.12.123, I have a folder document. I want to copy that folder to my local folder /home/my-pc/doc/ using the shell.
I tried like this:
scp -r smb:10.11.12.123/other-pc/document /home/my-pc/doc/
but it's not working.
You can use the following command to copy your files:
scp -r <source> <destination>
(-r: Recursively copy entire directories)
e.g.:
scp -r user@10.11.12.123:/other-pc/document /home/my-pc/doc
To identify your current location you can use the pwd command, e.g.:
kasun@kasunr:~/Downloads$ pwd
/home/kasun/Downloads
If you are logged into B and want to copy from B to A:
scp /source username@a:/destination
If you are logged into A and want to copy from B to A:
scp username@b:/source /destination
In addition to the comment, when you look at your host-to-host copy options on Linux today, rsync is by far the most robust solution around. It is brought to you by the SAMBA team[1] and continues to enjoy active development. Most distributions include the rsync package by default (if not, you should find an easily installable package for your distro, or you can download it from rsync.samba.org).
The basic use of rsync for host-to-host directory copy is:
$ rsync -uav srchost:/path/to/dir /path/to/dest
-uav combines -u (update: skip files that are newer on the destination), -a (archive: recurse and preserve file and directory times and permissions) and -v (verbose output). You will be prompted for the username/password on 10.11.12.123 unless you have set up ssh keys for public/private key authentication (see ssh-keygen for key generation).
If you notice, the syntax is basically the same as that for scp, with a slight difference in the options (e.g. scp -rv srchost:/path/to/dir /path/to/dest). rsync will use ssh for secure transport by default, so you will want to ensure sshd is running on your srchost (10.11.12.123 in your case). If you have name resolution working (or a simple entry in /etc/hosts for 10.11.12.123) you can use the hostname for the remote host instead of the remote IP. Regardless, you can always transfer the files you are interested in with:
$ rsync -uav 10.11.12.123:/other-pc/document /home/my-pc/doc/
Note: do NOT include a trailing / after document if you want to copy the directory itself. If you do include a trailing / after document (i.e. 10.11.12.123:/other-pc/document/), you are telling rsync to copy the contents (i.e. the files and directories under document) into /home/my-pc/doc/ without also creating the document directory itself.
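To make the difference concrete (same hosts and paths as above):
$ rsync -uav 10.11.12.123:/other-pc/document /home/my-pc/doc/    # creates /home/my-pc/doc/document
$ rsync -uav 10.11.12.123:/other-pc/document/ /home/my-pc/doc/   # copies document's contents directly into /home/my-pc/doc/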
The reason rsync is far superior to other copy apps is it provides options to truly synchronize filesystems and directory trees both locally and between your local machine and remote host. Meaning, in your case, if you have used rsync to transfer files to /home/my-pc/doc/ and then make changes to the files or delete files on 10.11.12.123, you can simply call rsync again and have the changes/deletions reflected in /home/my-pc/doc/. (look at the several flavors of the --delete option for details in rsync --help or in man 1 rsync)
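A sketch of such a follow-up run (--delete removes files from the destination that no longer exist on the source, so use it with care):
$ rsync -uav --delete 10.11.12.123:/other-pc/document /home/my-pc/doc/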
For these, and many more reasons, it is well worth the time to familiarize yourself with rsync. It is an invaluable tool in any Linux user's hip pocket. Hopefully this will solve your problem and get you started.
Footnotes
[1] the same folks that "Opened Windows to a Wider World", allowing seamless connections between Windows/Linux hosts via the native Windows Server Message Block (SMB) protocol. samba.org
If the two directories (document and /home/my-pc/doc/) you mentioned are on the same 10.11.12.123 machine,
then:
cp -ai document /home/my-pc/doc/
else:
scp -r document/ root@10.11.12.123:/home/my-pc/doc/

SFTP move files within remote dir

I need to move files between remote directories. It will always be multiple files and there is no naming convention to work with. Is there any way to use the rename command with a wildcard?
For example:
rename /dir1/dir2/* /dir1/dir2/history/
This does not work, it returns the following error:
Couldn't rename file "/dir1/dir2/*" to "/dir1/dir2/history": No such file or directory
Suggestions are highly appreciated.
I don't know rename; is that an SFTP command?
Anyway, you don't have to use SFTP. You can use SSH like this:
ssh user@fqdn "mv /dir1/dir2/* /dir1/dir2/history/"
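If shell access is not available and you must stay within SFTP, one workaround (a sketch; sftp's rename takes exactly one source and one target, so the wildcard has to be expanded beforehand) is to put one rename per file in a batch file and run it with sftp -b:
# moves.txt (hypothetical batch file), one line per file:
#   rename /dir1/dir2/file1 /dir1/dir2/history/file1
#   rename /dir1/dir2/file2 /dir1/dir2/history/file2
sftp -b moves.txt user@fqdn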

How can I upload an entire folder, that contains other folders, using sftp on linux?

I have tried put -r directory/*, which only uploaded the files and not the folders, and gave me the error: Couldn't canonicalise.
Any help would be greatly appreciated.
For people actually wanting a direct answer to this question (instead of being told to use something other than sftp)...
put -r local/path/to/directoryName
The uploaded directory must already exist in the working directory on the server, so you might need to create it first.
mkdir directoryName
Here you can find a detailed explanation of how to copy a directory using scp. In your case, it would be something like:
$ scp -r foo your_username#remotehost.edu:/some/remote/directory/bar
This will copy the directory "foo" from the local host to a remote host's directory "bar".
Here -r means recursively copy entire directories.
You can also use rcp with similar syntax. The only difference between them is that scp uses the secure shell (ssh) and rcp uses the remote shell (rsh).
BTW The "Couldn't canonicalise" error you mentioned appear when sftp server is unable to access the file/directory mentioned in the command.
UPDATE: For users who want to use put specifically, please refer to Ben Thielker's answer here.
sftp> mkdir source
sftp> put -r source
Uploading source/ to /home/myself/source
Entering source/
source/file1
source/file2
If you have issues using sftp, you can use ncftp.
For CentOS:
yum install ncftp
To copy a whole directory recursively
ncftpput -R -v -u username -P 21 ftp.server.dev /remote-path/ /localdirectory
Use scp instead. It uses SSH too and can easily handle recursion.

rsync not synchronizing .htaccess file

I am trying to rsync directory A of server1 with directory B of server2.
Sitting in the directory A of server1, I ran the following commands.
rsync -av * server2::sharename/B
but the interesting thing is, it synchronizes all files and directories except .htaccess or any hidden file in the directory A. Any hidden files within subdirectories get synchronized.
I also tried the following command:
rsync -av --include=".htaccess" * server2::sharename/B
but the results are the same.
Any ideas why the hidden files of directory A are not getting synchronized, and how to fix it? I am running as the root user.
thanks
This is due to the fact that * is by default expanded to all files in the current working directory except the files whose name starts with a dot. Thus, rsync never receives these files as arguments.
You can pass . denoting current working directory to rsync:
rsync -av . server2::sharename/B
This way rsync will look for files to transfer in the current working directory as opposed to looking for them in what * expands to.
Alternatively, you can use the following command to make * expand to all files including those which start with a dot:
shopt -s dotglob
See also the shopt man page.
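A quick way to see the difference before running rsync (sketch):
$ echo *             # hidden files such as .htaccess are missing
$ shopt -s dotglob
$ echo *             # now .htaccess is included
$ rsync -av * server2::sharename/B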
For anyone who's just trying to sync directories between servers (including all hidden files) -- e.g., syncing somedirA on source-server to somedirB on a destination server -- try this:
rsync -avz -e ssh --progress user#source-server:/somedirA/ somedirB/
Note the slashes at the end of both paths. Any other syntax may lead to unexpected results!
Also, for me it's easiest to perform rsync commands from the destination server, because it's easier to make sure I've got proper write access (i.e., I might need to add sudo to the command above).
Probably goes without saying, but obviously your remote user also needs read access to somedirA on your source server. :)
I had the same issue.
For me, when I ran the following command, the hidden files did not get rsync'ed:
rsync -av /home/user1 server02:/home/user1
But when I added the slashes at the end of the paths, the hidden files were rsync'ed.
rsync -av /home/user1/ server02:/home/user1/
Note the slashes at the end of the paths, as Brian Lacy said the slashes are the key. I don't have the reputation to comment on his post or I would have done that.
I think the problem is due to shell wildcard expansion. Use . instead of star.
Consider the following example directory content
$ ls -a .
. .. .htaccess a.html z.js
The shell's wildcard expansion translates the argument list that the rsync program gets from
-av * server2::sharename/B
into
-av a.html z.js server2::sharename/B
before the command starts getting executed.
The * causes the shell to omit hidden files, so rsync never syncs them. You should omit it and use . instead.
On a related note, in case anyone comes in from Google etc. trying to figure out why rsync is not copying hidden subfolders, I found one additional reason this can happen and figured I'd pay it forward for the next person running into the same thing: the -C option (obviously --exclude would do it too, but I figure that one's a bit easier to spot).
In my case, I had a script that was copying several folders across computers, including a directory with several git projects, and I noticed that I couldn't run any of the normal git commands in the copied repos (yes, normally one should use git clone, but this was part of a larger backup that included other things). Looking at the script, I found that it was calling rsync with 7 or 8 options.
After googling didn't turn up any obvious answers, I started going through the switches one by one. After dropping the -C option, it worked correctly. In the case of the script, the -C flag appears to have been added by mistake, likely because sftp was originally used and -C is a compression option under that tool.
Per man rsync, the option is described as:
--cvs-exclude, -C auto-ignore files in the same way CVS does
Since CVS is an older version control system, and given the man page description, it makes perfect sense that it would behave this way.
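A quick way to see exactly what -C filters out is to compare two dry runs (a sketch; -n is rsync's --dry-run flag, and the paths are hypothetical):
rsync -avn -C repo/ copy/   # dry run with CVS-style ignore rules applied
rsync -avn repo/ copy/      # dry run transferring everything
Diffing the two listings shows which files -C would silently skip.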
