I have a file structure as following:
/home/myhome/me/staging/15/1234/my_stats/
/home/myhome/me/staging/16/5678/my_stats/
/home/myhome/me/staging/17/7890/my_stats/
/home/myhome/me/staging/18/3456/my_stats/
I need to get into each "my_stats" dir and run a command there to find files. There are multiple dirs in "staging", and I need to go into every one of them and check whether a 'my_stats' dir exists. If it exists, I need to run a command in that "my_stats" dir.
The dir structure will always be in the following format:
/home/myhome/me/staging/<2 digit name>/<4 digit name>/my_stats/
I have tried iterating through the structure using a nested for loop and checking all dirs in 'staging', which is proving to be slow. Is there a way to use the 'find' command with 'depth' to do the same?
Or can we implement this with pattern matching?
Appreciate the help. Thanks!
Found the answer!
We can use * for it.
/home/myhome/me/staging/*/**/my_stats/*
I'll try to find a better solution, maybe one that uses the length of the dir names to differentiate them better.
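For reference, here is a sketch of a plain glob loop along those lines (ls -l is just a stand-in for whatever query you actually need to run; the character classes pin the two-digit and four-digit levels, so no ** is needed):

for d in /home/myhome/me/staging/[0-9][0-9]/[0-9][0-9][0-9][0-9]/my_stats/; do
    ( cd "$d" && ls -l )   # subshell, so the cd doesn't leak into the loop
done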
Try this one:
find . -type f -path "./[0-9][0-9]/[0-9][0-9][0-9][0-9]/my_stats/*"
You can replace the . with your own path.
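If you also need to run your query inside each matching my_stats directory rather than just list files, here is a hedged sketch building on the same pattern (assumes GNU find; ls -l again stands in for the real command):

find /home/myhome/me/staging -mindepth 3 -maxdepth 3 -type d \
    -path '*/[0-9][0-9]/[0-9][0-9][0-9][0-9]/my_stats' \
    -exec sh -c 'cd "$1" && ls -l' _ {} \;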
I'm sure I had a working oneliner that allowed me to search a directory (or .) for files whose names match names in a txt file and to copy those files to a new directory.
Somehow I cannot get it to work - any help please.
Sorry if this is a duplicate - I have really searched for an answer (here and elsewhere), but cannot find a solution.
foo/movehere/sample.txt file:
141516
141619
Files I want to find and move, e.g.:
foo/folder/folder2/141516_S2_R1.fastq.gz
foo/folder/folder2/141619_S2_R1.fastq.gz
Where I want to move them:
foo/movehere/
my current (nonfunctioning) oneliner:
while read -r FILE; do find . -name "$FILE*.fastq.gz" -type f -exec cp {} /foo/movehere/ \;;done </foo/movehere/sample.txt
There are some errors in the oneliner. It still does not work.
You can use eval in your code:
SEARCH="-name '$FILE*.fastq.gz'"
eval "find . $SEARCH -type f -exec cp '{}' /foo/movehere/ \;"
Security note: do not put user-supplied data into eval.
Not sure if I should delete the post, but I'll leave my solution here in case anyone else encounters the exact same problem.
Still not 100% sure I understand why it failed, but I got the oneliner working by copying all the sample names from the txt file to an unedited file with no suffix.
I guess some hidden "\r" characters in the txt file messed up "$FILE", so that it searched for something like this:
151617*fastq.gz\r
Perhaps someone with a better understanding of terminal scripts may confirm this.
EDIT 190128: I happened across my old question, and just in case anyone struggles with something similar: make sure your files use Unix-style line endings; my txt files had Windows line endings.
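If Windows line endings are indeed the culprit, stripping the carriage returns before the loop reads the file avoids the whole problem. A minimal sketch with tr (dos2unix works too; the output filename here is arbitrary):

tr -d '\r' < /foo/movehere/sample.txt > /foo/movehere/sample_unix.txt

Then point the while loop at sample_unix.txt instead.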
In my directory there are the files:
file1.txt fix.log fixRRRRRR.log fixXXXX.log output.txt
To understand the find command I tried a lot of things, among others using two wildcards. The goal was to find files that start with an f and have an extension starting with an l.
$ find . f*.l*
./file1.txt
./fix.log
./fixRRRRRR.log
./output.txt
./fixXXXX.log
fix.log
fixRRRRRR.log
fixXXXX.log
I read in a forum answer to use quotation marks with find: find . "f*.l*". The result:
./file1.txt
./fix.log
./fixRRRRRR.log
./output.txt
./fixXXXX.log
followed by: find: ‘f*.l*’: No such file or directory
What am I doing wrong? Where is my error in reasoning?
Thanks for an answer.
find doesn't work like that. In general, a find invocation looks like:
find [entry1] [entry2] ... [expressions ...]
Where an entry is a starting point where find starts the search for files.
In your case, you haven't actually supplied any expressions.
In the first command (without quotes), the shell expands the wildcards to a list of matching files in the current directory, then passes the list to find as arguments. So find . f*.l* is essentially equivalent to find . fix.log fixRRRRRR.log fixXXXX.log. As a result, find treats all of those arguments as directories/files to search (not patterns to search for), and lists all files under . (everything), then all files under fix.log (it's not a directory, so that's just the file itself), then all files under fixRRRRRR.log, and finally all files under fixXXXX.log.
In the second one (with quotes), it lists all files beneath the current directory (.) and then tries the same for a file literally called "f*.l*", which does not exist; hence the error.
What you are actually looking for is the -name test, which is used like this:
find . -name "f*.l*"
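Checked against the files listed in the question, the quoted pattern reaches -name intact and only the matching files are printed:

$ find . -name "f*.l*"
./fix.log
./fixRRRRRR.log
./fixXXXX.log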
I need to rename hundreds of files in Linux to change the unique identifier of each from the command line. For the sake of example, I have a file containing:
old_name1 new_name1
old_name2 new_name2
and need to change the names from the old to the new IDs. The file names contain the IDs but have extra characters as well. My plan is therefore to end up with:
abcd_old_name1_1234.txt ==> abcd_new_name1_1234.txt
abcd_old_name2_1234.txt ==> abcd_new_name2_1234.txt
Use of rename is obviously fairly helpful here, but I am struggling to work out how to iterate through the file of desired name changes and pass them as input to rename.
Edit: To clarify, I am looking to make hundreds of different rename commands, the different changes that need to be made are listed in a text file.
Apologies if this is already answered; I've had a good hunt but can't find a similar case.
rename 's/^(abcd_)old_name(\d+_1234\.txt)$/$1new_name$2/' *.txt
Should work, depending on whether you have that package installed. Also have a look at qmv (renameutils).
If you want more options, use e.g.
shopt -s globstar
rename 's/^(abcd_)old_name(\d+_1234\.txt)$/$1new_name$2/' folder/**/*.txt
(finds all txt files in subdirectories of folder), or, to do the same using GNU find:
find folder -type f -iname '*.txt' -exec rename 's/^(abcd_)old_name(\d+_1234\.txt)$/$1new_name$2/' {} +
while read -r old_name new_name; do
    rename "s/$old_name/$new_name/" *"$old_name"*.txt
done < file_with_names
In this way, you read the IDs from file_with_names and rename the files replacing $old_name with $new_name leaving the rest of the filename untouched.
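If your rename is the Perl-based one, its -n flag prints what would be renamed without actually touching anything, which is worth doing as a dry run before letting it loose on hundreds of files:

while read -r old_name new_name; do
    rename -n "s/$old_name/$new_name/" *"$old_name"*.txt
done < file_with_names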
I was about to write a PHP function to do this for myself, but I came upon a faster method:
ls the directory and copy & paste its contents from the terminal window into Excel. (You may need an online line-break removal or addition tool.) Assuming your file names are in column A, use the following formula in another column:
="mv "&A1&" prefix"&A1&"suffix"
or
="mv "&A1&" "&substitute(A1,"jpeg","jpg")&"suffix"
or
="mv olddirectory/"&A1&" newdirectory/"&A1
Back in Linux, create a new file with
nano rename.txt and paste in the values from Excel. They should look something like this:
mv oldname1.jpg newname1.jpg
mv oldname2.jpg newname2.jpg
Then close out of nano and run the following command:
bash rename.txt. Bash just runs every line in the file as if you had typed it.
and you are done! This method gives verbose output on errors, which is handy.
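If you would rather skip the spreadsheet, something like awk can generate the same mv lines in one step. A sketch of the jpeg-to-jpg variant (it assumes filenames without spaces, since bash re-parses the generated lines):

ls *.jpeg | awk '{ new = $0; sub(/\.jpeg$/, ".jpg", new); print "mv " $0 " " new }' > rename.txt
bash rename.txt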
I have a list of files that I need to copy. I want to recursively search a drive and copy those files to a set location if the filename exists in the list. The list is a text file.
The text file looks something like this:
A/ART-FHKFX1.jpg
B/BIG-085M.jpg
B/BIG-085XL.jpg
L/LL-CJFK.jpg
N/NRT-56808EA.jpg
P/PFE-25.10.jpg
P/PFE-7/60.jpg
P/PFE-7L.20.jpg
P/PFE-8.25.jpg
P/PFE-9.15.jpg
P/PFE-D11.1.tiff
P/PFE-D11.1.tiff
P/PFE-D12.2.tiff
P/PFE-D12.2.tiff
Using find will take a lot of time; try to use locate if possible.
What will happen when there are several matches? Like searching for foo.bar and having both a/foo.bar and b/foo.bar: what would you do in that case?
Your csv seems to include a path. Given the above, I'll assume those paths are actually valid from where the script is run, so in that case just do this:
#!/bin/bash
# Read one path per line from stdin and copy each file to the
# destination directory given as the first argument.
while IFS= read -r path; do
    cp "$path" "$1"
done
Then call it like this:
teh_script /path/to/destination < csv-file.csv
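If the listed paths are not valid relative to where you run it, here is a hedged variant that searches a whole tree for each basename instead (note the caveat above: if several files share a name, each match is copied and later ones overwrite earlier ones at the destination):

#!/bin/bash
# Usage: script /search/root /destination < list.txt
# Searches the tree under $1 for each listed basename and copies matches to $2.
while IFS= read -r entry; do
    find "$1" -type f -name "$(basename "$entry")" -exec cp {} "$2" \;
done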
I am using Linux (Ubuntu) and trying to find some files, but it is not working properly.
I have created some files in my directory structure, for example: World/India/Maharashtra/Pune/filename.xml
When I use the find command like:
find /home/lokesh/Desktop/Testing_India2/Test/World/India/Maharashtra/ -name filename*.xml -mmin -3000
It is giving the result perfectly.
But, when I am using the same command at "World" or "India" level:
find /home/lokesh/Desktop/Testing_India2/Test/World/ -name filename*.xml -mmin -3000
it does not give any result.
I have lots of directories at the "India" level as well as at the "Maharashtra" level, and maybe some directories within Maharashtra's inner directories. I have to find every file created in all of these directories.
Also, the folders are mounted from different machines (some states come from one machine, some from another).
If someone knows how to solve this problem, please reply as soon as possible.
Double quote your search string, and add -L to make find follow symbolic links:
find -L /home/lokesh/Desktop/Testing_India2/Test/World/ -name "filename*.xml" -mmin -30000
This is something I ran into earlier today, actually, when using the * wildcard: I couldn't get it to continually traverse the subdirectories unless I escaped the * with a backslash.
Give this a try:
find -L /home/lokesh/Desktop/Testing_India2/Test/World/ -name filename\*.xml -mmin -30000
Yes, as mentioned you have to double quote your -name argument or use a backslash before the *. The reason it does not work from one directory but works fine in others is that the * character is used for filename generation by your shell. This of course happens before the find command is executed. Therefore, if you have a file that matches the filename*.xml pattern in your current directory, it will be substituted before find is executed, which is not what you want. On the other hand, if there is no pattern match in the current directory, the * character is passed on to find unmodified. By quoting you protect the string from shell filename generation.
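You can watch the expansion happen with echo (assuming a file matching the pattern, say filename1.xml, exists in the current directory):

$ echo find . -name filename*.xml
find . -name filename1.xml
$ echo find . -name "filename*.xml"
find . -name filename*.xml

The first echo shows that the shell already replaced the pattern before find would ever see it; quoting in the second preserves it.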
Regards