This question already has answers here:
SFTP: return number of files in remote directory?
(4 answers)
Closed 6 years ago.
I am writing a bash script and I need to count how many files starting with ddd there are in a remote directory using SFTP. After that it downloads each file, so that I can compare how many files there were in the remote directory with how many files were downloaded, and check whether they match.
I was doing something like this:
echo ls -l | sftp "user@123.45.67.8:/home/user/datafolder/ddd*" | wc -l
The one above works, but when I run it, it downloads all the files to my local folder, which I do not want.
How can I count the number of files without downloading them? I want to download them in another part of the code.
As was said in the comments, the best way to do this is using ssh. So this outputs what I wanted:
ssh user@123.45.67.8 'ls /home/user/datafolder/ddd*' | wc -l
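For the comparison step described in the question, a minimal sketch might look like the following (assuming the same host and path as above, key-based ssh authentication, and that the files get downloaded into the current local directory elsewhere in the script):
#!/bin/bash
# Count the matching files on the remote host.
remote_count=$(ssh user@123.45.67.8 'ls /home/user/datafolder/ddd* 2>/dev/null | wc -l')
# ... the files are downloaded in another part of the script ...
# Count what actually arrived locally and compare.
local_count=$(ls ddd* 2>/dev/null | wc -l)
if [ "$remote_count" -eq "$local_count" ]; then
    echo "All $remote_count files downloaded."
else
    echo "Mismatch: $remote_count remote vs $local_count local." >&2
fi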
rsync --list-only provides a succinct way to list the files in a remote directory. Simply passing the result to wc -l takes care of the count (excluding the . and .. (dot) files), e.g.
rsync --list-only server:/path/to/dir/ | wc -l
(note the trailing '/' to count the contents rather than the directory itself. Add -r for a recursive count. You have all rsync options available to tailor the files counted, e.g. --exclude="stuff", etc.)
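Applied to the original question, a hedged sketch for counting only the ddd* files (placeholder server and path) is to filter the listing before counting:
rsync --list-only server:/home/user/datafolder/ | awk '{print $NF}' | grep -c '^ddd'
(awk '{print $NF}' keeps only the filename column, so names containing spaces would need a different approach.)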
Related
I'm trying to download multiple files through SFTP on a Linux server using
sftp -o IdentityFile=key <user>@<server> <<END
get -r folder
exit
END
which downloads all the contents of a folder. It appears that find and grep are invalid commands in sftp, and so are for loops.
I need to download files having a name containing a string e.g.
test_0.txt
test_1.txt
but no file.txt
Do you really need the -r switch? Are there really any subdirectories in the folder? You do not mention that.
If there are no subdirectories, you can use a simple get with a file mask:
cd folder
get *test*
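Put together with the heredoc from the question, that could look something like this (same placeholder key, user, and server as above):
sftp -o IdentityFile=key <user>@<server> <<END
cd folder
get *test*
exit
END
The *test* glob is expanded by sftp on the remote side; the local shell does not expand globs inside a heredoc.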
Are you required to use sftp? A tool like rsync that operates over ssh has flexible include/exclude options. For example:
rsync -a <user>@<server>:folder/ folder/ \
    --include='test_*.txt' --exclude='*.txt'
This requires rsync to be installed on the remote system, but that's very common these days. If rsync isn't available, you could do something similar using tar:
ssh <user>@<server> tar -cf- folder/ | tar -xvf- --wildcards '*/test_*.txt'
This tars up all the files remotely, but then only extracts files matching your target pattern on the receiving side.
This question already has answers here:
Rename multiple files based on pattern in Unix
(24 answers)
Closed 2 years ago.
I have a folder containing a sequence of files whose names bear the form filename-white.png. e.g.
images
arrow-down-white.png
arrow-down-right-white.png
...
bullets-white.png
...
...
video-white.png
I want to strip out the -white bit so the names are simply filename.png. I have played around with the Linux rename command, doing dry runs with -n. However, my knowledge of regexes is rather limited, so I have been unable to find the right way to do this.
If you are in the directory above images, the command is
rename "s/-white.png/.png/" images/*
If your current directory is images, then run rename "s/-white.png/.png/" ./* instead. To do a dry run, just attach a -n like you said:
rename -n "s/-white.png/.png/" images/*
or
rename -n "s/-white.png/.png/" ./*
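Note that the s/// syntax requires the Perl-based rename (sometimes packaged as prename or perl-rename). If your system only has the util-linux rename, a plain bash loop with parameter expansion does the same job; a sketch, run from inside the images directory:
for f in *-white.png; do
    mv -- "$f" "${f%-white.png}.png"
done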
I need to download all 492 files from a Linux directory that contain a given string within the file. From my searching so far, I can't quite manage to find a command that accomplishes this. Could anybody help me out?
Cheers.
Use grep -l to list the files containing the given string, and loop over them to scp like this:
for file in $(grep -l <some-pattern> <directory>/*); do scp "$file" <remote>; done
If you also need to include the files in subdirectories of the directory, add the -r option and pass the directory itself to grep: grep -rl <some-pattern> <directory>.
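If any of the filenames contain spaces, the unquoted $(...) loop will split them; a null-delimited variant is safer (a sketch with placeholder pattern, directory, and destination):
grep -rlZ 'some-pattern' /path/to/directory | xargs -0 -I{} scp {} user@remote:/destination/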
This question already has answers here:
How can I recursively find all files in current and subfolders based on wildcard matching?
(19 answers)
Closed 1 year ago.
I'm on Ubuntu, and I'd like to find all files in the current directory and subdirectories whose name contains the string "John". I know that grep can match the content of the files, but I have no idea how to use it with file names.
Use the find command:
find . -type f -name "*John*"
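If you don't care about case, -iname does a case-insensitive match in the same way:
find . -type f -iname "*john*"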
The find command can take a long time because it scans the actual files in the file system.
The quickest way is to use the locate command, which returns results immediately:
locate "John"
If the command is not found, you need to install the mlocate package and run the updatedb command once to build the search database.
More detail here: https://medium.com/@thucnc/the-fastest-way-to-find-files-by-filename-mlocate-locate-commands-55bf40b297ab
This is a very simple solution using the tree command in the directory you want to search. -f prints the full file path, and the output of tree is piped to grep to find the files whose names contain the string filename.
tree -f | grep filename
Use ack, it's simple.
Just type ack -g <string to be searched>; the -g option matches against file names rather than file contents.
Hi, as a continuation of my previous question (ls command error via SFTP in Linux shell script), I have a question:
How can I get the name of (or enter) the latest created directory over an SFTP connection?
As I was told there, ls -tr | tail -1 won't work here, as parameters like -tr are not recognized in SFTP.
For example, the script after the SFTP connection:
cd temp_dir
?????????
Assuming that temp_dir contains several directories, I need to enter the last created directory in it (in order to download the files from it).
How can I do that?
Thanks.
Since sftp uses ssh, the better solution is to ssh to the server and run:
cd $(ls -t | sed q)
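To then pull the files from that newest directory, one sketch (assuming ssh/scp access to the same host, a placeholder user@host, and that the newest entry in temp_dir really is a directory) is:
latest=$(ssh user@host 'cd temp_dir && ls -t | sed q')
scp -r "user@host:temp_dir/$latest" .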
Your previous question contains the essential fact that you use lftp; with lftp, using cls instead of ls will help.
cls -1t | sed -n 1s/^/cd\\ /p > /tmp/cd
source /tmp/cd
Beware, this uses the file /tmp/cd and is not suited for concurrent operation.