Gzip or tar logs that are older than 7 days [closed] - linux

Need to create a single script that would gzip or tar logs older than 7 days in multiple (3) paths, /home/temp and home/logs, and then confirm that home/var/lib/mongo is over 50% capacity. This is what I have so far, but I can't think of how to combine these:
find . -mtime +7 -print -exec gzip {} \; for all 3, but then:
find /tmp/log/ -mtime +7 -type f -exec sh -c \
'tar -czvPf /tmp/older_log_$(basename "$0")_$(date +%F).tar.gz "$0"' {} \;
# create a single tar file for each archive +7 days old on one mount

If I understand you correctly, you would like to archive multiple old logs in one tar file:
find /tmp/log /home/temp /home/logs -mtime +7 -type f -print0 \
| xargs -0 tar -czf "/tmp/older_log_$(date +%F).tar.gz" --remove-files
For easy reading I put it on two lines:
find searches for all the files you want to archive
xargs passes the found files as arguments to tar
tar creates the new archive and adds all the files
with the last option (from GNU tar), tar additionally removes the original files
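
A minimal sketch combining both tasks from the question, assuming the intended log paths are /home/temp, /home/logs and /tmp/log (the question mixes the two), that "over 50% capacity" means the filesystem holding /var/lib/mongo as reported by df, and GNU find/tar/coreutils:
#!/bin/bash
# Sketch only: adjust the paths and the threshold for the real environment.
log_dirs="/home/temp /home/logs /tmp/log"   # assumes no spaces in these paths
mongo_path="/var/lib/mongo"

# Archive (and then remove) all files older than 7 days in one tarball.
# --null / -T - make tar read the NUL-delimited file list from find.
find $log_dirs -mtime +7 -type f -print0 \
  | tar --null -czf "/tmp/older_log_$(date +%F).tar.gz" --remove-files -T -

# Warn if the filesystem holding the mongo data is over 50% full.
usage=$(df --output=pcent "$mongo_path" | tail -1 | tr -dc '0-9')
if [ "$usage" -gt 50 ]; then
    echo "WARNING: $mongo_path is at ${usage}% capacity" >&2
fi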

Related

Copy files from one folder to another folder with filters [closed]

I have these files in my source folder
source_path/date=20191230/somefile.txt
source_path/date=20191231/somefile.txt
source_path/date=20200101/somefile.txt
source_path/date=20200102/somefile.txt
If I do the below command, all files will be copied to my dest_path folder:
cp --recursive source_path/ dest_path/
I just want to copy the folders whose dates are in 2020.
I just need these two files from 2020:
source_path/date=20200101/somefile.txt
source_path/date=20200102/somefile.txt
How can I add filters with the cp command?
This question is not suitable for Stack Overflow, but this is the answer:
cp --recursive source_path/date=20200* dest_path/
Or does dest_path not exist? Then you would write
mkdir -p dest_path && cp --recursive source_path/date=20200* dest_path/
You can use find with the -name, -type and -exec flags, so if the source directory was /home/foo and the destination directory /tmp:
find /home/foo -type d -name "date=2020*" -exec cp -r '{}' /tmp \;
-type d means we are only searching for directories, -name is the name pattern we are searching for, and -exec is the command we execute on each result.
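If you would rather stay with cp and plain globbing, here is a minimal sketch (assuming the source_path/dest_path names from the question); nullglob makes the loop simply do nothing when no directory matches:
#!/bin/bash
# With nullglob, an unmatched pattern expands to nothing instead of itself,
# so cp is never run on a literal "date=2020*" name.
shopt -s nullglob
mkdir -p dest_path
for dir in source_path/date=2020*/; do
    cp --recursive "$dir" dest_path/
done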

No such file or directory find command on linux [closed]

I've created a script which deletes old backup files from a directory. This command worked fine until a week ago; nothing has changed in the script or packages, but I'm still getting the error below:
root#:# find /var/backups/abc/* -type d -mtime +6
/var/backups/abc/2016-03-09_0321
root#:~# find /var/backups/abc/* -type d -mtime +6 -exec rm -rf {} \;
find: `/var/backups/abc/2016-03-08_0321': No such file or directory
The problem is that this script runs every day from cron, and I get a mail like "find: `/var/backups/abc/2016-03-08_0321': No such file or directory". The files are deleted, but root keeps receiving these mails.
find /var/backups/abc/* -type d -mtime +6 -prune -exec rm -rf {} \;
Here, we use -prune on the directories that we're about to delete, so find will then not try to read their contents.
This is because, after having returned your directory, find will try to look inside it (to continue its recursive search), and will fail because you just removed it.
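
An alternative sketch: -depth is the other classic fix here, since it makes find process a directory's contents before the directory itself, so everything has already been read by the time rm deletes it:
# -depth: visit contents first, then the directory itself, so find never
# tries to descend into a directory that rm has just removed.
find /var/backups/abc/* -depth -type d -mtime +6 -exec rm -rf {} \;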

Find empty files, if found update files with [closed]

Can someone with more Linux knowledge answer this for me?
On our web server, we host and run a LOT of web scripts.
We control these via datestamp files, so a script is not over-run, or run more than once.
A lot of the files are 0 KB. I wanted to know if there is a quick way in Linux to locate these files and update them.
I have located the files using:
find /var/www/vhosts/DOMAINNAME.co.uk/httpdocs -name "datestamp.*" -type f -empty
I have a long list of files. Can I update these with a simple datestamp format:
i.e.
20150923114046
You can use the -exec option of find (passing the file name to bash as a positional parameter keeps unusual file names safe):
find /var/www/vhosts/DOMAINNAME.co.uk/httpdocs -name "datestamp.*" -type f -empty \
-exec bash -c 'echo 20150923114046 > "$1"' _ {} \;
To get the timestamp dynamically, use date:
bash -c 'echo "$(date +%Y%m%d%H%M%S)" > "$1"' _ {}
To use the file's last modified timestamp, use the -r option of date:
bash -c 'echo "$(date +%Y%m%d%H%M%S -r "$1")" > "$1"' _ {}
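
A sketch of the same update done in one pass with a shell loop, assuming every empty datestamp.* file should receive the current timestamp:
# Read NUL-delimited names so paths with spaces survive; write the
# current timestamp (YYYYmmddHHMMSS) into each empty datestamp file.
stamp=$(date +%Y%m%d%H%M%S)
find /var/www/vhosts/DOMAINNAME.co.uk/httpdocs -name "datestamp.*" -type f -empty -print0 \
  | while IFS= read -r -d '' f; do
        printf '%s\n' "$stamp" > "$f"
    done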

linux pipe argument list too long [closed]

I use the following bash script to remove files older than $days.
find /home/xxx/conf_* -maxdepth 0 -mindepth 0 -type d -ctime +5 -exec rm -rf {} \;
However, if there are more than 32,000 files, I get
/usr/bin/find: Argument list too long
how do I trim the list down to like 20000 only?
From comment to answer:
Your problem is the glob expansion, but you are already using a tool that can perfectly well handle an arbitrary number of found results, namely find. As such you should not use a glob at all. Instead you should let find do all the work.
Something like:
find /home/xxx -maxdepth 1 -name 'conf_*' -type d -ctime +5 -exec rm -rf {} \;
Also, if your find supports -exec {} +, you should probably use this instead:
find /home/xxx -maxdepth 1 -name 'conf_*' -type d -ctime +5 -exec rm -rf {} \+
For such a large number of matching directories, the greatly reduced number of rm executions should be significantly more efficient.
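
An equivalent sketch with xargs, for find builds without -exec {} + (GNU find and xargs assumed); -print0/-0 keeps unusual file names safe, and xargs batches the arguments into as few rm invocations as possible:
find /home/xxx -maxdepth 1 -name 'conf_*' -type d -ctime +5 -print0 \
  | xargs -0 --no-run-if-empty rm -rf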

find -mtime returns wrong listing [closed]

When I run this command:
root:/home/mws 0$ ls -lrt `find /home/data/ll_misc_logs/ -mtime +20`
and there are no files meeting the mtime setting (20 days), it lists the contents of the current directory, /home/mws.
Why?
Is there a way to just return nothing or a message?
When there are no files meeting the mtime setting, the output of find .... expands to ... nothing. In which case, your command becomes ls -lrt, which will always list the current directory.
If there aren't too many files on a typical run, this might work better:
find /home/data/ll_misc_logs -mtime +20 -print0 | xargs -0 -r ls -ltr
But, if you get so many files that xargs decides to split it into multiple invocations, it probably won't do exactly what you want, either.
Which leads me to... What exactly are you trying to do? On the surface, it looks like "show me the old files, in order by modification time", but it's likely part of something bigger that might be solved in a more efficient (and less error-prone) manner...
If you just want a list of files older than 20 days sorted by oldest first:
find /home/data/ll_misc_logs -mtime +20 -exec ls -l --time-style=+%s {} \; | sort -n -k 6
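
With GNU find you can also skip ls entirely; -printf '%T@ %p\n' prints each file's modification time as an epoch timestamp, which sorts cleanly (a sketch, oldest first):
find /home/data/ll_misc_logs -mtime +20 -printf '%T@ %p\n' | sort -n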
