SFTP move files within remote dir - linux

I need to move files between remote directories. It will always be multiple files and there is no naming convention to work with. Is there any way to use the rename command with a wildcard?
For example:
rename /dir1/dir2/* /dir1/dir2/history/
This does not work; it returns the following error:
Couldn't rename file "/dir1/dir2/*" to "/dir1/dir2/history": No such file or directory
Suggestions are highly appreciated.

I don't know rename; is this an SFTP command?
Anyway, you don't have to use SFTP. You can use SSH like this:
ssh user@fqdn "mv /dir1/dir2/* /dir1/dir2/history/"
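If you are restricted to SFTP, one workaround is to expand the wildcard yourself and issue one rename per file, since SFTP's rename takes exactly one source and one destination. A minimal sketch, assuming an OpenSSH sftp client (the exact batch-mode output format may vary by version):
#!/bin/bash
HOST='user@fqdn'
# List the matching remote files; 'tail -n +2' drops sftp's echo of the
# command itself, and grep keeps the history directory out of the matches.
sftp -b - "$HOST" <<< 'ls -1 /dir1/dir2/*' | tail -n +2 | grep -v '/history$' |
while read -r f; do
    printf 'rename "%s" "/dir1/dir2/history/%s"\n' "$f" "$(basename "$f")"
done | sftp -b - "$HOST"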

Related

rsync multiple files from multiple directories in linux

I have multiple directories named by date (ex: 2017-09-05) and inside those directories multiple log.gz files from BRO IDS. I am trying to enter each directory, get only specific log.gz files by name, and send those to a remote system using rsync. A log file looks like this: app_stats.00:00:00-01:00:00.log.gz. I am attempting to use wildcards to accomplish this.
Ex: rsync -avh --ignore-existing -e ssh -r /home/data/logs/2017-09-* {dns,http}.*.log.gz root@10.23.xx.xx:/home/pnlogs/
This is close, but it's just copying all the files in each folder and ignoring my attempt at getting only the http and dns logs, as seen in the example. Is this possible to do in one line? Would I need multiple?
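One hedged approach (an untested sketch): let rsync's own filter rules do the matching, since the {dns,http} braces above are expanded by the local shell as a separate argument rather than inside each date directory. The directory include lets rsync descend into the date folders, the two file includes match the wanted names at any depth, and the final exclude drops everything else:
rsync -avh --ignore-existing -e ssh \
    --include='2017-09-*/' \
    --include='dns.*.log.gz' --include='http.*.log.gz' \
    --exclude='*' \
    /home/data/logs/ root@10.23.xx.xx:/home/pnlogs/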

Using scp to copy multiple files and rename each one

I need many image files from a remote server, each in a different directory dir_A, dir_B... etc but all with the same name image.png. I therefore need to rename each file as I copy it. Can I achieve this with a single command?
You can use the following bash script to achieve this:
while true; do
    echo "Remote File Path: "
    read path
    # $RANDOM gives each downloaded copy a distinct local name
    scp "$path" "image-$RANDOM.png"
done
The path should be a fully qualified resource locator similar to server.com:/home/user/image.png
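If the directory names are known up front, a non-interactive sketch is also possible (server.com, the remote user, and the directory list are placeholders taken from the question):
for d in dir_A dir_B dir_C; do
    scp "user@server.com:/home/user/$d/image.png" "image-$d.png"
done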

How to use the zip command in Linux so the archive contains short paths?

I used the zip command on Linux (RedHat); this is my command:
zip -r /home/username/folder/compress/zip.zip /home/username/folder/compressed/*
Then, when I open zip.zip, I see the full folder path reproduced inside the archive.
I want the zip to contain only the list of *.txt files.
Because I use this command in a crontab script, I can't cd to the folder before running zip.
Please help me
I skimmed the zip man page and this is what I have found. There is no option to archive files relative to a different directory. The closest I have found is zip -j, which strips the entire path and stores the files directly in the zip rather than in subdirectories. I do not know what happens in the case of file name conflicts, such as if /home/username/folder/compressed/a.txt and /home/username/folder/compressed/subdir/a.txt both exist. If this is not a problem for you, you can use this option, but I am concerned because you did specify the -r option, indicating that you expect zip to traverse subfolders.
I also thought of the possibility that your script could somehow call zip with a different working directory, but I took a look at this Unix Stack Exchange page and it looks like their options use cd.
I have to admit I do not understand why you cannot use cd, and I am very curious about it. You said something about using crontab, but I have never heard of anything wrong with changing directories in a crontab script.
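In case cd turns out to be usable after all, here is a minimal sketch: a subshell confines the directory change to the single zip invocation, which should be safe even under crontab.
# cd only inside the subshell, so the archive stores short relative paths
(cd /home/username/folder/compressed && zip /home/username/folder/compress/zip.zip ./*.txt)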
I used the -j option with zip:
zip -jr /home/username/folder/compress/zip.zip /home/username/folder/compressed/*
and that solved my problem, thanks

scp files using wildcard. Destination contains result of wildcard

How can a user download many text files at once from a remote host using scp from the terminal, using wildcards? In addition, the result of the wildcard should be used to save each file in the local directory of the same name (let's assume it already exists). The remote directories also contain other files with different names. For instance:
4 files in remote host:
[Remote-host]:1Dir/File.txt -> [Local-host]:1Dir/File.txt
[Remote-host]:2Dir/File.txt -> [Local-host]:2Dir/File.txt
[Remote-host]:3Dir/File.txt -> [Local-host]:3Dir/File.txt
[Remote-host]:4Dir/File.txt -> [Local-host]:4Dir/File.txt
I have tried using the following, to no avail. Please assist:
scp [remote-host]:'*Dir/File.txt' '*Dir/'
Try the following to retrieve your files:
scp user@host:~"/*Dir/*.txt" .
Or you can try:
scp user#host:"~/*Dir/*.txt" .
It really depends on how your user account is mapped in your environment.
Thanks @thatotherguy for the great answer.
For anyone else that's interested, the following rsync command works:
rsync -a --include '*Dir/' --include 'File.txt' --exclude '*' [Remote-host]: '*Dir'
This means: include all directories matching '*Dir' and files called 'File.txt', and exclude everything else. Note that this creates a new local directory literally named *Dir, in which all the 1Dir, 2Dir, 3Dir, etc. are contained.
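If that extra *Dir wrapper directory is unwanted, a small variation (untested sketch) is to make the current directory the destination, so 1Dir, 2Dir, etc. land directly in it:
rsync -a --include '*Dir/' --include 'File.txt' --exclude '*' [Remote-host]: .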

Shell Script - SFTP -> If copied, remove?

I am trying to copy text files with a shell script over SFTP.
I already wrote a script that does the job.
#!/bin/bash
HOST='Servername'
USER='Username'
sftp -b - ${USER}@${HOST} << EOFFTP
get /files/*.txt /tmp/ftpfiles/
rm /files/*.txt
quit
EOFFTP
Before I remove all the text files on the server, I want to make sure I copied all of them without errors. How can I do this? I use SSH keys for login.
The task is:
Copy all text files over and over, but make sure they are not the same ones (that's why I remove them).
Maybe I could move them on the server instead? Like copy, and then move them to /files/copied?
Actually, rsync is ideal for this:
rsync --remove-source-files "${USER}@${HOST}:/files/*.txt" /tmp/ftpfiles/
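If you prefer the move-to-/files/copied idea from the question over deleting, a rough sketch with plain SFTP (the rename is issued per file because SFTP's rename does not accept wildcards, and /files/copied must already exist on the server):
sftp -b - "${USER}@${HOST}" <<EOF
get /files/*.txt /tmp/ftpfiles/
EOF
# The leading '-' makes sftp batch mode ignore failures, e.g. for files
# that were already moved on a previous run.
for f in /tmp/ftpfiles/*.txt; do
    echo "-rename /files/$(basename "$f") /files/copied/$(basename "$f")"
done | sftp -b - "${USER}@${HOST}"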
