Error using find command to find, compress and delete files - linux

I need to find files, compress and delete them.
Example: In the current directory, I have files "log.0", "log.1" and "log.2". If I run:
find . -type f -exec tar -zcvf "logs.tar.gz" "{}" \;
and decompress the file logs.tar.gz, I will get only the file log.0.
How can I compress all of the files found by the find command?

Try this:
find . -type f -print0 |
tar -zcvf logs.tar.gz --null --files-from - --remove-files
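The original command fails because each `-exec ... \;` invocation re-creates logs.tar.gz from scratch with a single file, so only the last one written survives. A single tar invocation fed the whole NUL-separated list fixes this. A minimal sketch (the /tmp path and log names are invented for the demo; --remove-files is omitted so the originals survive):

```shell
# Throwaway directory with three sample log files
mkdir -p /tmp/tar-demo && cd /tmp/tar-demo
rm -f logs.tar.gz log.*
touch log.0 log.1 log.2

# One tar invocation reads every NUL-separated name from find;
# the -name filter keeps the new archive itself out of the list
find . -type f -name 'log.*' -print0 |
  tar -zcvf logs.tar.gz --null --files-from -

# All three files are now members of the archive
tar -tzf logs.tar.gz
```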

Related

Create an empty tar file and then store its name in a variable

I am writing a shell script in which a tar file with today's date and time will be created.
tar -zvcf "log_grabber$(date '+%y-%m-%d_%H%M').tar.gz" --files-from /dev/null
Now I want to add more files to this tar file after running the find command. How can I get the name of the tar file and use it in the output of the find command?
find . -type f -name 'local*' -newermt "$user_date" -exec tar -rvf <variable tar file> {} \;
Any help would be very much appreciated.
Instead of
tar -zvcf "log_grabber$(date '+%y-%m-%d_%H%M').tar.gz" --files-from /dev/null
Create a variable with the name first and use that:
name="log_grabber$(date '+%y-%m-%d_%H%M').tar.gz"
tar -zvcf "$name" --files-from /dev/null
And then:
find . -type f -name 'local*' -newermt "$user_date" -exec tar -rvf "$name" {} +
Note that I changed \; to + so that tar gets multiple files in one invocation, rather than one tar invocation per file.
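One caveat worth knowing: GNU tar refuses to append (-r) to a compressed archive, so creating the file with -z and later running -rvf against it fails with "Cannot update compressed archives". A sketch of the workaround, building the archive uncompressed and gzipping once at the end (the /tmp directory and local* file names are made up for illustration):

```shell
# Throwaway directory with sample files standing in for the real logs
mkdir -p /tmp/grab-demo && cd /tmp/grab-demo
touch local_a.log local_b.log

# Create the archive uncompressed (no -z), append with -r, gzip last
name="log_grabber$(date '+%y-%m-%d_%H%M').tar"
tar -cvf "$name" --files-from /dev/null
find . -type f -name 'local*' -exec tar -rvf "$name" {} +
gzip "$name"    # final file: "$name.gz"
```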

script to tar up multiple log files separately

I'm on a RedHat Linux 6 machine, running Elasticsearch and Logstash. I have a bunch of log files that were rotated daily from back in June until August. I am trying to figure out the best way to tar them up to save some disk space, without manually tarring up each one. I'm a bit of a newbie at scripting, so I was wondering if someone could help me out? The files have the name elasticsearch-cluster.log.datestamp. Ideally they would all be in their individual tar files, so that it'd be easier to go back and take a look at a particular day's logs if needed.
You could use a loop :
for file in elasticsearch-cluster.log.*
do
tar zcvf "$file".tar.gz "$file"
done
Or if you prefer a one-liner (this is recursive):
find . -name 'elasticsearch-cluster.log.*' -print0 | xargs -0 -I {} tar zcvf {}.tar.gz {}
or, as @chepner mentions, with the -exec option:
find . -name 'elasticsearch-cluster.log.*' -exec tar zcvf {}.tar.gz {} \;
or if you want to exclude already zipped files:
find . -name 'elasticsearch-cluster.log.*' -not -name '*.tar.gz' -exec tar zcvf {}.tar.gz {} \;
If you don't mind all the files being in a single tar.gz file, you can do:
tar zcvf backups.tar.gz elasticsearch-cluster.log.*
All these commands leave the original files in place. After you validate the tar.gz files, you can delete the originals manually.
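If you do want the originals removed automatically, a hedged variant of the loop above deletes each file only after its new archive passes an integrity check (the sample file names are invented):

```shell
# Throwaway directory with sample rotated-log names
mkdir -p /tmp/es-logs && cd /tmp/es-logs
touch elasticsearch-cluster.log.2014-06-01 elasticsearch-cluster.log.2014-06-02

for file in elasticsearch-cluster.log.*; do
  tar zcvf "$file".tar.gz "$file" &&
    gzip -t "$file".tar.gz &&        # sanity-check the new archive
    rm -- "$file"                    # remove the original only on success
done
```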

Trying to FIND and then TAR.GZ found files

I found what I thought was a solution in this forum for finding my specific LOG files and then creating a TAR.GZ backup of them. However, when I execute the command I get an error. The command prior to the pipe works great and finds the files I need, but the backup-creation step blows up. Any suggestions/direction would be appreciated. Thanks.
Here is the command:
find /var/log/provenir -type f -name "*2014-09-08.log" | tar -cvzf backupProvLogFiles_20140908.tar.gz
Here is the error I'm getting:
find /var/log/provenir -type f -name "*2014-09-08.log" | tar -czvf backupProvLogFiles_20140908.tar.gz --null -T -
tar: Removing leading `/' from member names
tar: /var/log/provenir/BureauDE32014-09-08.log\n/var/log/provenir/DE_HTTP2014-09-08.log\n/var/log/provenir/BureauDE22014-09-08.log\n/var/log/provenir/DE_HTTP22014-09-08.log\n: Cannot stat: No such file or directory
tar: Exiting with failure status due to previous errors
You can also create the tar archive first and compress it with gzip afterwards:
find /var/log/provenir -type f -name "*2014-09-08.log" | tar -cvf backupProvLogFiles_20140908.tar -T -
gzip backupProvLogFiles_20140908.tar
EDIT
A better solution would be to use command substitution:
tar -cvzf backupProvLogFiles_20140908.tar $(find /var/log/provenir -type f -name "*2014-09-08.log")
I think you mean something like this:
find . -name "*XYZ*" -type f -print | tar -cvz -T - -f SomeFile.tgz
I was finally able to find a solution; in case someone else is looking for another option to answer this question:
find /var/log/provenir -type f -name "*2014-09-08.log" -print0 | tar -czvf /var/log/provenir/barchive/backupProvLogFile_20140908.tar.gz --null -T -
This worked great. The answer came from this post: Find files and tar them (with spaces)
Thanks again for the help I received.
Regards.
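The key point in the working command is that -print0 and --null/-T - must be used as a pair: both sides of the pipe then agree on NUL as the separator, which is what lets names containing spaces or newlines survive. A small self-contained sketch (the directory and file names are made up):

```shell
# Throwaway directory whose file names contain spaces
mkdir -p '/tmp/prov demo' && cd '/tmp/prov demo'
touch 'Bureau DE3 2014-09-08.log' 'DE_HTTP 2014-09-08.log'

# NUL separators on the find side (-print0) and the tar side (--null)
find . -type f -name '*2014-09-08.log' -print0 |
  tar -czvf backup_20140908.tar.gz --null -T -
```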

Find and tar for each file in Linux

I have a list of files with different modification times: 1_raw, 2_raw, 3_raw... I want to find files that were modified more than 10 days ago and compress them to free up disk space. However, the command:
find . -mtime +10 |xargs tar -cvzf backup.tar.gz
will create a new file backup.tar.gz
What I want is to create a tarball for each file, so that I can easily unzip each of them when needed. After the command, my files should become: 1_raw.tar.gz, 2_raw.tar.gz, 3_raw.tar.gz...
Is there any way to do this? Thanks!
Something like this is what you are after:
find . -mtime +10 -type f -print0 | while IFS= read -r -d '' file; do
tar -cvzf "${file}.tar.gz" "$file"
done
The -type f was added so that it doesn't also process directories, just files.
This adds a compressed archive of each file that was modified more than 10 days ago, in all subdirectories, and places the compressed archive next to its respective unarchived version (in the same folder). I assume this is what you wanted.
If you didn't need to handle whitespaces in the path, you could do with simply:
for f in $(find . -mtime +10 -type f) ; do
tar -cvzf "${f}.tar.gz" "$f"
done
Simply try this:
$ find . -mtime +10 | xargs -I {} tar czvf {}.tar.gz {}
Here, {} indicates replace-str
-I replace-str
Replace occurrences of replace-str in the initial-arguments with names read from standard input. Also, unquoted blanks do not terminate input items; instead the separator is the newline character. Implies -x and -L 1.
https://linux.die.net/man/1/xargs
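If you prefer xargs but also need whitespace-safe behavior, the two approaches above can be combined with -print0/-0 (the /tmp directory and file names are illustrative; touch -d is GNU-specific and used here only to age the sample files):

```shell
# Sample files aged past the 10-day threshold, one name with a space
mkdir -p /tmp/raw-demo && cd /tmp/raw-demo
touch -d '20 days ago' 1_raw 2_raw '3 raw'

# NUL-separated names make the pipeline safe for spaces and newlines
find . -mtime +10 -type f -print0 |
  xargs -0 -I {} tar czvf {}.tar.gz {}
```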

How to zip 90days old files and move it to a specific folder using bash in linux

I have lots of files in my FILES folder. I want to zip the files that are 90 days old, remove them from the FILES folder, and move them to the ARCHIVES folder using bash in Linux.
This is my folder structure:
root#user:/var/FILES
root#user:/var/ARCHIVES
I have created a script to zip a file but don't know how to specify the age of the file
zip -r zipped.zip *.*
So I coded something like:
FILE=find *.* -mtime +90
zip -r zipped.zip $FILE
but it only returns an error. Thanks.
You can use:
find . -mtime +90 -exec zip zipped.zip '{}' +
EDIT If you want move zipped file to an archive folder then you can do:
find . -mtime +90 -exec zip zipped.zip '{}' + && mv zipped.zip /var/ARCHIVES
You can try find:
find /var/FILES/ -type f -mtime +90 -exec zip -r zipped.zip {} \; -exec mv {} /var/ARCHIVES \;
Not sure whether I understand you correctly. If you want to save zipped.zip in /var/ARCHIVES, just use this:
find /var/FILES/ -type f -mtime +90 -exec zip -r /var/ARCHIVES/zipped.zip {} \;
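Putting the pieces together, a sketch of the full zip-then-move flow under a throwaway /tmp layout (the paths, and the interpretation of moving both the zip and the zipped originals, are assumptions; adjust to taste):

```shell
# Hypothetical stand-ins for /var/FILES and /var/ARCHIVES
mkdir -p /tmp/FILES /tmp/ARCHIVES
cd /tmp/FILES
touch new.txt
touch -d '100 days ago' old.txt   # age one file past the 90-day mark (GNU touch)

# zip the old files, move the zip to ARCHIVES, then move the originals too
find . -type f -mtime +90 -exec zip zipped.zip {} + &&
  mv zipped.zip /tmp/ARCHIVES &&
  find . -type f -mtime +90 -exec mv {} /tmp/ARCHIVES \;
```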
