Error exit delayed from previous error - linux

I am trying to find the files in a directory and then tar and gzip them.
The script:
find /home -type f -name "*.log" -newer /home/path/start_date \
! -newer /home/path/end_date | xargs -0 tar -cvzf files.tar.gz
The tar is still created, but I am getting some errors:
tar: /home/path/filename.log\n: Cannot stat: No such file or directory
tar: Error exit delayed from previous errors.
Can someone explain what these errors are? Thanks.

You forgot -print0.
-print0
True; print the full file name on the standard output, followed by a
null character (instead of the newline character that -print uses).
This allows file names that contain newlines or other types of white
space to be correctly interpreted by programs that process the
find output. This option corresponds to the -0 option of xargs.
Also quote your exclamation mark to prevent history expansion just in case:
find /home -type f -name "*.log" -newer /home/path/start_date \! -newer /home/path/end_date -print0 | xargs -0 tar -cvzf files.tar.gz
It's not POSIX, but if your find supports -not, use -not instead:
... -not -newer ...
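For example, the corrected command written out with -not would look like this (a sketch only; -not is a non-POSIX but widely supported extension, and the paths are the ones from the question):
find /home -type f -name "*.log" -newer /home/path/start_date -not -newer /home/path/end_date -print0 | xargs -0 tar -cvzf files.tar.gz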

Write a script that deletes all the regular files (not the directories) with a .js extension that are present in the current directory and all its subdirectories [duplicate]

I'm trying to work out a command which deletes sql files older than 15 days.
The find part is working but not the rm.
rm -f | find -L /usr/www2/bar/htdocs/foo/rsync/httpdocs/db_backups -type f \( -name '*.sql' \) -mtime +15
It kicks out a list of exactly the files I want deleted but is not deleting them. The paths are correct.
usage: rm [-f | -i] [-dIPRrvW] file ...
unlink file
/usr/www2/bar/htdocs/foo/rsync/httpdocs/db_backups/20120601.backup.sql
...
/usr/www2/bar/htdocs/foo/rsync/httpdocs/db_backups/20120610.backup.sql
What am I doing wrong?
You are actually piping rm's output to the input of find. What you want is to use the output of find as arguments to rm:
find -type f -name '*.sql' -mtime +15 | xargs rm
xargs is the command that "converts" its standard input into arguments of another program, or, as they more accurately put it on the man page,
build and execute command lines from standard input
Note that if file names can contain whitespace characters, you should correct for that:
find -type f -name '*.sql' -mtime +15 -print0 | xargs -0 rm
But actually, find has a shortcut for this: the -delete option:
find -type f -name '*.sql' -mtime +15 -delete
Please be aware of the following warnings in man find:
Warnings: Don't forget that the find command line is evaluated
as an expression, so putting -delete first will make find try to
delete everything below the starting points you specified. When
testing a find command line that you later intend to use with
-delete, you should explicitly specify -depth in order to avoid
later surprises. Because -delete implies -depth, you cannot
usefully use -prune and -delete together.
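Following that advice, a dry-run sketch for the path in this question would be to add -depth and -print first, and only switch to -delete once the listed files look right:
find /usr/www2/bar/htdocs/foo/rsync/httpdocs/db_backups -depth -type f -name '*.sql' -mtime +15 -print
find /usr/www2/bar/htdocs/foo/rsync/httpdocs/db_backups -depth -type f -name '*.sql' -mtime +15 -delete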
P.S. Note that piping directly to rm isn't an option, because rm doesn't expect filenames on standard input. What you are currently doing is piping them backwards.
find /usr/www/bar/htdocs -mtime +15 -exec rm {} \;
Will select files in /usr/www/bar/htdocs older than 15 days and remove them.
Another, simpler method is to use the locate command, then pipe the result to xargs.
For example,
locate file | xargs rm
Use xargs to pass the arguments, with -r (don't run rm if the input is empty) and -d '\n' (use newlines as the delimiter, so spaces in names are not treated as separators):
"${command}" | xargs -rd '\n' rm
Include --force if you also want to remove read-only files.
Assuming you aren't in the directory containing the *.sql backup files:
find /usr/www2/bar/htdocs/foo/rsync/httpdocs/db_backups/*.sql -mtime +15 -exec rm -v {} \;
The -v option above is handy: it verbosely outputs which files are being deleted as they are removed.
I like to list the files that will be deleted first to be sure. E.g:
find /usr/www2/bar/htdocs/foo/rsync/httpdocs/db_backups/*.sql -mtime +15 -exec ls -lrth {} \;

I want to get an output of the find command in shell script

I am trying to write a script that finds the files older than 10 hours in the sub-directories listed in "HS_client_list", and sends the output to a file "find.log".
#!/bin/bash
while IFS= read -r line; do
echo Executing cd /moveit/$line
cd /moveit/$line
# Find files more than 600 minutes (10 hours) old.
find $PWD -type f -iname "*.enc" -mmin +600 -execdir basename '{}' ';' | xargs ls > /home/infa91punv/find.log
done < HS_client_list
However, the script is able to cd to the folders from HS_client_list (this file contains the names of the subdirectories), but the find command (find $PWD -type f -iname "*.enc" -mmin +600 -execdir basename '{}' ';' | xargs ls > /home/infa91punv/find.log) is not working: the output file is empty. When I run the same find command directly on the command line it works, but from the script it doesn't.
You are overwriting the file in each iteration.
You can use xargs to perform find on multiple directories; but you have to use an alternate delimiter to avoid having xargs populate the {} in the -execdir command.
sed 's%^%/moveit/%' HS_client_list |
xargs -I '<>' find '<>' -type f -iname "*.enc" -mmin +600 -execdir basename {} \; > /home/infa91punv/find.log
The xargs ls did not seem to perform any useful functionality, so I took it out. Generally, don't use ls in scripts.
With GNU find, you could avoid the call to an external utility, and use the -printf predicate to print just the part of the path name that you care about.
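A minimal sketch of that idea, assuming GNU find (its %f format prints just the file name with the leading directories removed):
sed 's%^%/moveit/%' HS_client_list |
xargs -I '<>' find '<>' -type f -iname "*.enc" -mmin +600 -printf '%f\n' > /home/infa91punv/find.log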
For added efficiency, you could invoke a shell to collect the arguments:
sed 's%^%/moveit/%' HS_client_list |
xargs sh -c 'find "$@" -type f -iname "*.enc" -mmin +600 -execdir basename {} \;' _ >/home/infa91punv/find.log
This will run as many directories as possible in a single find invocation.
If you want to keep your loop, the solution is to put the redirection after done. I would still factor out the cd, and take care to quote the variable interpolation.
while IFS= read -r line; do
find /moveit/"$line" -type f -iname "*.enc" -mmin +600 -execdir basename '{}' ';'
done < HS_client_list >/home/infa91punv/find.log

cronjob to remove files older than N days with special characters

I'm trying to create a job to delete files on a linux box older than X days. Pretty straightforward with:
find /path/to/files -mtime +X -exec rm {} \;
The problem is that all my files have special characters because they are pictures from a webcam; most contain parentheses, so the above command fails with "no such file or directory".
Have you tried this:
find /path/to/files -mtime +X -exec rm '{}' \;
Or perhaps:
rm $(find /path/to/files -mtime +X);
Or even this method using xargs instead of -exec:
find /path/to/files -mtime +X | xargs rm -f;
Another twist on xargs is to use -print0, which helps the script differentiate between spaces within filenames and the separators between entries in the returned list by using the ASCII NUL character as the file separator:
find /path/to/files -mtime +X -print0 | xargs -0 rm -f;
Or as man find explains under -print0:
This primary always evaluates to true. It prints the pathname of
the current file to standard output, followed by an ASCII NUL
character (character code 0).
I would also recommend adding the -maxdepth and -type flags to better control what the script does. So I would use this for a dry-run test:
find /path/to/files -maxdepth 1 -type f -mtime +1 -exec echo '{}' \;
The -maxdepth flag controls how many directory levels deep find will descend, and -type f limits the search to regular files, so the script is focused on files only. This will simply echo the results. Then, when you are comfortable with it, change the echo to rm.
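Once the dry run prints the expected files, the live command would be along these lines (a sketch that simply swaps echo for rm, keeping the quoted braces from above):
find /path/to/files -maxdepth 1 -type f -mtime +X -exec rm '{}' \;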
Does
find /path/to/files -mtime +X -print | tr '()' '?' | xargs rm -f
work?

How to tar certain file types in all subdirectories?

I want to tar all .php and .html files in a directory and its subdirectories. If I use
tar -cf my_archive *
it tars all the files, which I don't want. If I use
tar -cf my_archive *.php *.html
it ignores subdirectories. How can I make it tar recursively but include only two types of files?
find ./someDir -name "*.php" -o -name "*.html" | tar -cf my_archive -T -
If you're using bash version > 4.0, you can exploit shopt -s globstar to make short work of this:
shopt -s globstar; tar -czvf deploy.tar.gz **/Alice*.yml **/Bob*.json
This will add all .yml files that start with Alice from any sub-directory and all .json files that start with Bob from any sub-directory.
One method is:
tar -cf my_archive.tar $( find -name "*.php" -or -name "*.html" )
There are some caveats with this method however:
It will fail if there are any files or directories with spaces in their names, and
it will fail if there are so many files that the maximum command-line length is exceeded.
A workaround for both could be to write the output of the find command to a file, and then use the "-T, --files-from FILE" option of tar.
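A minimal sketch of that workaround (the list file name here is just an example; file names containing newlines would still be a problem):
find . \( -name "*.php" -o -name "*.html" \) > file-list.txt
tar -cf my_archive.tar -T file-list.txt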
This will handle paths with spaces (the parentheses group the two -name tests so that -type f and -exec apply to both):
find ./ -type f \( -name "*.php" -o -name "*.html" \) -exec tar uvf myarchives.tar {} +
If you want to produce a zipped tar file (.tgz) and want to avoid problems with spaces in filenames:
find . \( -name \*.php -o -name \*.html \) -print0 | xargs -0 tar -cvzf my_archive.tgz
The -print0 “primary” of find separates output filenames using the NUL (\0) byte, thus playing well with the -0 option of xargs, which appends its (NUL-separated, in this case) input as arguments to the command that follows.
The parentheses around the two -name primaries are needed, because otherwise the -print0 would only output the filenames of the second -name (there is no implied printing if -print or -print0 is present, and these only have an effect if they are evaluated).
If you need to skip some filenames or directories (e.g., the node_modules directory if you work with Node.js), prepend one or more -prune primaries like this:
find . -name skipThisName -prune -o \
-name skipThisOtherName -prune -o \
\( -name \*.php -o -name \*.html \) -print0 | xargs -0 tar -cvzf my_archive.tgz
Put them in a file
find . \( -name "*.php" -o -name "*.html" \) -print > files.txt
Then use the file as input to tar; use -I or -T depending on the version of tar you have.
Use h to follow symbolic links (archive the files they point to):
tar cfh my.tar -I files.txt
Easy with zsh:
tar cvzf foo.tar.gz **/*.(php|html)
With multi-core compression (GNU tar calling pigz via -I):
find ./ -type f \( -name "*.php" -o -name "*.html" \) -printf '%P\n' | xargs tar -I 'pigz -9' -cf target.tgz
or, for just one core:
find ./ -type f \( -name "*.php" -o -name "*.html" \) -printf '%P\n' | xargs tar -czf target.tgz

How to copy all the files with the same suffix to another directory? - Unix

I have a directory with an unknown number of subdirectories and unknown levels of sub-directories within them. How do I copy all the files with the same suffix to a new directory?
E.g. from this directory:
> some-dir
>> foo-subdir
>>> bar-sudsubdir
>>>> file-adx.txt
>> foobar-subdir
>>> file-kiv.txt
Copy all the *.txt files to:
> new-dir
>> file-adx.txt
>> file-kiv.txt
One option is to use find:
find some-dir -type f -name "*.txt" -exec cp \{\} new-dir \;
find some-dir -type f -name "*.txt" finds the *.txt files under the directory some-dir. The -exec option builds and runs a command line (e.g. cp some-dir/foobar-subdir/file-kiv.txt new-dir) for every matching file, which is substituted for the {}.
Use find with xargs as shown below:
find some-dir -type f -name "*.txt" -print0 | xargs -0 cp --target-directory=new-dir
For a large number of files, this xargs version is more efficient than using find some-dir -type f -name "*.txt" -exec cp {} new-dir \; because xargs will pass multiple files at a time to cp, instead of calling cp once per file. So there will be fewer fork/exec calls with the xargs version.
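A similar batching effect is available from find alone; a minimal sketch, assuming GNU coreutils cp (whose -t option names the target directory first, so all the matched files can be appended after it):
find some-dir -type f -name "*.txt" -exec cp -t new-dir {} +
Like the xargs version, this passes many files to each cp invocation instead of calling cp once per file.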
