Using scp to copy multiple files and rename each one - linux

I need to fetch many image files from a remote server, each in a different directory (dir_A, dir_B... etc.) but all with the same name, image.png. I therefore need to rename each file as I copy it. Can I achieve this with a single command?

You can use the following bash script to achieve this:
while true; do
    # prompt for the next remote file, then copy it under a unique-ish local name
    echo "Remote File Path: "
    read -r path
    scp "$path" "image-$RANDOM.png"
done
The path should be a fully qualified remote location, similar to server.com:/home/user/image.png
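If the directory names are known up front, a short non-interactive loop does the same job without prompting (a sketch: server.com, the /home/user prefix, and the directory list are placeholders to adapt):
# copy each directory's image.png under a name derived from the directory
for d in dir_A dir_B dir_C; do
    scp "server.com:/home/user/$d/image.png" "image-$d.png"
done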

Related

rsync multiple files from multiple directories in linux

I have multiple directories named by date (ex: 2017-09-05), and inside those directories are multiple log.gz files from BRO IDS. I am trying to enter each directory, get only specific log.gz files by name, and send those to a remote system using rsync. A log file looks like this: app_stats.00:00:00-01:00:00.log.gz. I am attempting to use wildcards to accomplish this.
Ex: rsync -avh --ignore-existing -e ssh -r /home/data/logs/2017-09-* {dns,http}.*.log.gz / root@10.23.xx.xx:/home/pnlogs/
This is close, but it's just copying all the files in each folder and ignoring my attempt at getting just the http and dns logs, as seen in the example. Is this possible to do in one line? Would I need multiple?
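A sketch of one way this is often handled (not necessarily the thread's accepted answer): let rsync's own include/exclude filters do the name matching instead of shell globs, since the filters apply as rsync walks each date directory:
# descend into every directory, keep only dns.* and http.* logs, skip the rest
rsync -avh --ignore-existing -e ssh \
    --include='*/' --include='dns.*.log.gz' --include='http.*.log.gz' \
    --exclude='*' \
    /home/data/logs/2017-09-* root@10.23.xx.xx:/home/pnlogs/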

scp files using wildcard. Destination contains result of wildcard

How can a user download many text files at once from a remote host using scp from the terminal, using wildcards, with the result of the wildcard used to save each file into the local directory of the same name (let's assume it already exists)? The remote directories also contain other files with different names. For instance:
Four files on the remote host:
[Remote-host]:1Dir/File.txt -> [Local-host]:1Dir/File.txt
[Remote-host]:2Dir/File.txt -> [Local-host]:2Dir/File.txt
[Remote-host]:3Dir/File.txt -> [Local-host]:3Dir/File.txt
[Remote-host]:4Dir/File.txt -> [Local-host]:4Dir/File.txt
I have tried using the following, to no avail. Please assist:
scp [remote-host]:'*Dir/File.txt' '*Dir/'
Try the following to retrieve your files:
scp user@host:~"/*Dir/*.txt" .
Or you can try:
scp user@host:"~/*Dir/*.txt" .
It really depends on how your user account is mapped in your environment.
Thanks @thatotherguy for the great answer.
For anyone else that's interested, the following rsync command works:
rsync -a --include '*Dir/' --include 'File.txt' --exclude '*' [Remote-host]:'\*Dir' .
This means: include all directories matching '*Dir/' and all files called 'File.txt', and exclude everything else. Note that this creates a new local directory called *Dir, in which 1Dir, 2Dir, 3Dir, etc. are contained.
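To also land each File.txt in its same-named local directory, as the question asked, rsync's --relative flag is one possibility (a sketch, assuming the *Dir directories sit directly under the remote home):
# -R/--relative keeps the source path, so 1Dir/File.txt arrives as ./1Dir/File.txt
rsync -a --relative '[Remote-host]:*Dir/File.txt' .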

Shell Script - SFTP -> If copied, remove?

I am trying to copy text files with a shell script over SFTP.
I already wrote a script that does the job:
#!/bin/bash
HOST='Servername'
USER='Username'
sftp -b - ${USER}@${HOST} << EOFFTP
get /files/*.txt /tmp/ftpfiles/
rm /files/*.txt
quit
EOFFTP
Before I remove all the text files from the server, I want to make sure I copied them all without errors. How can I do this? I use SSH keys for login.
The task is:
Copy all text files over and over, but make sure they're not the same ones (that's why I remove them...).
Maybe I could move them on the server instead? Like copy, then move to /files/copied?
Actually, rsync is ideal for this:
rsync --remove-source-files "${USER}@${HOST}:/files/*.txt" /tmp/ftpfiles/
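--remove-source-files only deletes files that rsync has fully and successfully transferred, and the overall exit status can be checked on top of that (a sketch reusing the variables from the script above):
if rsync --remove-source-files "${USER}@${HOST}:/files/*.txt" /tmp/ftpfiles/; then
    echo "all files transferred; remote copies removed"
else
    # non-zero exit: files that failed to copy stay on the server for the next run
    echo "rsync reported an error" >&2
fi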

SFTP move files within remote dir

I need to move files between remote directories. It will always be multiple files and there is no naming convention to work with. Is there any way to use the rename command with a wildcard?
For example:
rename /dir1/dir2/* /dir1/dir2/history/
This does not work, it returns the following error:
Couldn't rename file "/dir1/dir2/*" to "/dir1/dir2/history": No such file or directory
Suggestions are highly appreciated.
I don't know rename; is this an SFTP command?
Anyway, you don't have to use SFTP. You can use SSH like this:
ssh user@fqdn "mv /dir1/dir2/* /dir1/dir2/history/"
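If the account is restricted to SFTP only (no shell access), the rename has to be issued once per file, since sftp's rename command takes no wildcards. A sketch with OpenSSH's sftp in batch mode, where file1.log and file2.log are stand-in names for the actual listing:
sftp -b - user@fqdn <<'EOF'
rename /dir1/dir2/file1.log /dir1/dir2/history/file1.log
rename /dir1/dir2/file2.log /dir1/dir2/history/file2.log
EOF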

Scp bulk files from current directory to another directory

I need to transfer a bunch of files from a production host to my local machine. I'm already in the directory that I need to transfer the files from. I know the names of the log files I need to transfer; they are named log.timestamp.hostname, and these names tend to be long. How can I transfer them in bulk using scp? Is there an easier way than typing out the long file names? Can I select them by a pattern in the filename?
Use wildcards:
scp log.* user@host:/target/directory
If you don't want to copy over every file in the current directory (which would just be ./*), you can have find match the log.timestamp.hostname pattern with a regular expression and pass the results to scp. Something along the lines of:
scp $(find . -regextype sed -regex ".*/log\.[a-z0-9-]\+\.[a-z0-9-]\+") user@remote:~/
You will probably want to adjust the regex to match your actual file names.
This command-line option helped solve my issue of transferring a subset of files. As find on AIX does not provide the -regextype option, I used grep instead to retrieve the files tab1.msg to tab9.msg:
scp $(find . | grep 'tab[1-9]\.msg') user@host:/tmp
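Note that the $(find ...) substitution in both of these breaks on file names containing whitespace. A whitespace-safe variant (a sketch; the pattern and remote target are placeholders) lets find invoke scp itself:
# one scp invocation per matching file; slower, but safe with unusual names
find . -name 'log.*' -exec scp {} user@remote:~/ \;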
