Tar'ing using wildcards where one type may not exist - linux

I have a shell script that automates the creation of separate tar files for several directories; it cd's to each one and runs the command:
tar cf package1.tar *.csv *.fmt
Most directories contain both .fmt and .csv files, but I need a solution for the case where no *.csv exists while a *.fmt does, so a tar file is still required. I haven't found an 'ignore wildcard if not found' option; does one exist?
Thank you in advance.

Use find in combination with xargs:
find . \( -name '*.csv' -or -name '*.fmt' \) -print0 | xargs -0 tar cf package1.tar
The -print0 and -0 options use null separators instead of whitespace; without them, the pipeline will choke on filenames with spaces in them.
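For the 'ignore wildcard if not found' part specifically, bash offers the nullglob option, which makes an unmatched wildcard expand to nothing instead of being passed through literally. A minimal sketch, assuming the script runs under bash:
shopt -s nullglob                      # unmatched globs now expand to nothing
files=(*.csv *.fmt)                    # collect whichever files actually exist
if (( ${#files[@]} )); then
    tar cf package1.tar "${files[@]}"  # only create the tar if something matched
fi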

Related

Bash - Find directory containing specific log files

I've created a script to quickly analyze some logs and automatically provide advice for solving problems based on the errors found.
All works as expected.
However, it appears that the folder structure containing these logs can change (depending on system configuration), and then my script no longer works.
I would like a way to find the directory containing specific files, such as the logs or the appinfo.txt file.
Once obtained, I could use it as a variable and finally solve my problem.
Here is an example:
AppLogDirectory='Your_Special_Command_You_Will_HelpMe_To_Find'
grep -i "Error" $AppLogDirectory/esl*.log
The log format is: ESL.randomValue.log
Files analyzed: appinfo.txt, system.txt, etc.
As suggested in the comment section, I've edited my original post with more detail to clarify the context. Below is an example:
Log files (esl.xxx.tt.ss.log) can be in a random directory, like:
/var/log/ApplicationName/logs/
/opt/ApplicationName/logs/
/var/data/ApplicationName/Extended/logs/
Because the directory is not fixed, I need a way to print the directory names of the files that match the esl*.log pattern (without the esl filename).
Use find and pass the output to xargs with grep, like so. This runs grep on multiple files and prints each match together with the name of the file where the pattern was found:
find /path/to/files /another/path/to/other/files \( -name 'appinfo.txt' -o -name 'system.txt' -o -name 'esl*.log' \) -print0 | xargs -0 grep -i 'Error'
Or simply use -exec ... {} +, which gives the same effect without the need for xargs:
find /path/to/files /another/path/to/other/files \( -name 'appinfo.txt' -o -name 'system.txt' -o -name 'esl*.log' \) -exec grep -i 'Error' {} +
To find the directories that contain the files matching the desired pattern, use grep -l to print file names only (not the matching lines), and pipe the results to xargs dirname to print the directory names. If you need unique directory names, pipe further to sort -u:
find /path/to/files /another/path/to/other/files \( -name 'appinfo.txt' -o -name 'system.txt' -o -name 'esl*.log' \) -exec grep -il 'Error' {} + | xargs dirname | sort -u
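If you have GNU find, you can also skip dirname entirely: the -printf '%h\n' directive prints just the leading directory of each matched file. A sketch under that assumption, reusing the placeholder path:
find /path/to/files -name 'esl*.log' -printf '%h\n' | sort -u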
SEE ALSO:
GNU find manual
To search for files based on their contents
xargs
Solution found thanks to you. Thank you again!
# Ask for the extracted tar.gz folder
read -p "Where did you extract the tar.gz file? " r1
# Directory path where the esl files are located
logpath=$(find "$r1" -name "esl*.log" | xargs dirname | sort -u)
# Search for the value (here "Error") in all esl*.log files
grep 'Error' $logpath/esl*.log | awk '{print $8}'
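One caveat with the script above: if logs exist under more than one directory, $logpath will hold several paths and the unquoted expansion becomes fragile. A hedged variant that loops over each matched directory instead (reusing $r1; the awk column is the asker's and is not verified here):
find "$r1" -name "esl*.log" | xargs dirname | sort -u | while IFS= read -r dir; do
    grep 'Error' "$dir"/esl*.log | awk '{print $8}'
done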

How to find specific file types and tar them?

It seems I've got a problem. I've got some different file types in my current directory, and I want to just tar the .png files. I started with this:
find -name "*.png" | tar -cvf backupp.tar
It wouldn't work because I didn't specify which files, so after looking at how others did it, I added xargs:
find -name "*.png" | xargs tar -cvf backupp.tar
It did work this time, and the backupp.tar file was created, but here is the problem: I can't seem to extract it. Whenever I type:
tar -xvf backupp.tar
Nothing happens. I've tried changing permissions with chmod and using sudo, but nothing works.
So, did I type the wrong command completely, or is there something I just missed?
tar expects a list of names as arguments. Your use of xargs can be improved by adding the -print0 option to find and the -0 option to xargs, to ensure that find emits filenames separated by a NUL character and that xargs processes the list the same way. This prevents whitespace or other stray characters in filenames from causing problems, e.g.
find dir -type f -name "*.png" -print0 | xargs -0 tar -cf tarfile.tar
The above will find all files in or below dir whose names match "*.png" and provide the list of filenames, separated by NUL characters, to xargs for use by tar. You can list the files contained in the resulting archive with:
tar -tf tarfile.tar
Consider using compression (if wanted) by adding z (gzip), j (bzip2), or J (xz) and the appropriate extension to reduce your archive size, e.g.
... | xargs -0 tar -czf tarfile.tar.gz
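One caveat worth knowing with very long file lists: xargs may split them across several tar invocations, and each plain -cf run overwrites the archive written by the previous one. If GNU tar is available, tar can read the NUL-separated list itself, avoiding xargs entirely; a sketch under that assumption:
find dir -type f -name "*.png" -print0 | tar --null -czf tarfile.tar.gz -T -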

Error exit delayed from previous error

I am trying to find the files in a directory and then tar and gzip them.
The script:
find /home -type f -name "*.log" -newer /home/path/start_date \
! -newer /home/path/end_date | xargs -0 tar -cvzf files.tar.gz
The tar is still created, but I am getting some errors:
tar: /home/path/filename.log\n: Cannot stat: No such file or directory
tar: Error exit delayed from previous errors.
Can someone explain what these errors are? Thanks.
You forgot -print0.
-print0
True; print the full file name on the standard output, followed by a
null character (instead of the newline character that -print uses).
This allows file names that contain newlines or other types of white
space to be correctly interpreted by programs that process the
find output. This option corresponds to the -0 option of xargs.
Also quote your exclamation mark to prevent history expansion just in case:
find /home -type f -name "*.log" -newer /home/path/start_date \! -newer /home/path/end_date -print0 | xargs -0 tar -cvzf files.tar.gz
It's not POSIX, but if -not is available, you can use -not instead:
... -not -newer ...
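As an aside, GNU find can also express the date window with literal timestamps via -newermt, instead of maintaining the start_date/end_date reference files. A sketch under that assumption (the dates shown are placeholders):
find /home -type f -name "*.log" -newermt '2015-01-01' \! -newermt '2015-02-01' -print0 | xargs -0 tar -cvzf files.tar.gz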

search through files and put into array

cat `find . -name '*.css'`
This will open any CSS file. I now want to do two things.
1) How do I add *.js to this as well? I want to look inside all CSS and JavaScript files.
2) I want to look for any CSS or image files referenced within those (css or js) files and push those into an array. So I guess: look for .png, .jpg, .gif, .tif, .css and put everything before that, up to the preceding double or single quote, into an array. I want an array because this command will go into a shell script; after I get all the names of the files I need, I will loop through them and download those files later.
Any help would be appreciated.
Extra hackery, in case someone needs it:
find ./ -name "*.css" | xargs grep -o -h -E '[A-Za-z0-9:./_-]+\.(png|jpg|gif|tif|css)' | sed -e 's/\.\./{{url here}}/g' | xargs wget
This will download every missing resource.
Run this command:
find ./ -name "*.css" -or -name "*.js" > fileNames.txt
Then read each line of fileNames.txt in a loop and download them.
Or if you are going to use wget to download the images you could do:
find ./ -name "*.css" -or -name "*.js" | xargs grep '\.png' | xargs wget
It may need a little refinement, like a cut after the grep, but you get the idea.
1) Simple answer: you can add the names of all .js files to your cat command by instructing find to find more files:
cat `find . -name '*.css' -or -name '*.js'`
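If any filename contains whitespace, the backtick expansion above will split it. A safer variant keeps the filename handling inside find itself via -exec; a sketch:
find . \( -name '*.css' -o -name '*.js' \) -exec cat {} +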
2) a text-searching tool such as grep is probably what you're after:
find . -name '*.css' -or -name '*.js' | xargs grep -o -h -E '[A-Za-z0-9:./_-]+\.(png|jpg|gif|tif|css)'
Note: my grep pattern isn't universal or perfect, but it's a starting example. It matches any string of alphanumeric, colon, dot, slash, underscore, or hyphen characters, followed by any one of the given extensions.
The -o option causes grep to output only the parts of the .css/.js files that match the pattern (i.e. only the apparent filenames).
If you want to download them you could add | xargs wget -v to the command, which would instruct wget to fetch all those filenames.
NOTE: this won't work for relative filenames; some other magic will be required (i.e. you'll have to resolve them with respect to the grepped file's location). Perhaps some extra hackery, such as sed or awk.
Also: How often do you see references to TIFFs in your CSS/JS?
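For the asker's array requirement, here is a minimal bash sketch that captures the grep output into an array with mapfile; the variable name assets is hypothetical, and the pattern is the one from this answer:
mapfile -t assets < <(
    find . \( -name '*.css' -o -name '*.js' \) -print0 |
        xargs -0 grep -o -h -E '[A-Za-z0-9:./_-]+\.(png|jpg|gif|tif|css)' |
        sort -u
)
echo "found ${#assets[@]} file references"
for a in "${assets[@]}"; do
    wget -v "$a"    # relative paths would still need resolving, as noted above
done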

Linux command for removing all ~ files

What command can I use in Linux to check whether there is a file in a given directory (or its subdirectories) that has a ~ at the end of its name?
For example, if I'm at a directory called t which contains many subdirectories, etc, I would like to remove all files that end with a ~.
Watch out for filenames with spaces in them!
find ./ -name "*~" -type f -print0 | xargs -0 rm
With GNU find:
find /path -type f -name "*~" -exec rm {} +
or
find /path -type f -name "*~" -delete
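A caution that applies to -delete (a documented find gotcha): it is evaluated like any other expression term, so it must come after the tests, or find will remove everything it visits:
find /path -delete -type f -name "*~"    # WRONG: deletes everything under /path
find /path -type f -name "*~" -delete    # correct: the tests filter first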
find ./ -name '*~' -print0 | xargs -0 rm -f
Here find will search the directory ./ and all subdirectories, filtering for filenames that match the glob '*~' and printing them (with proper quoting, courtesy of alberge). The results are passed to xargs, which appends them to rm -f and runs the resulting command. You can use multiple paths, and there are many other filters available (just read man find).
You can use a find, grep, rm combination, something like the following (note that this matches a ~ anywhere in the path and will break on filenames containing whitespace):
find | grep "~" | xargs rm -f
Probably others have better ideas :)
