I'm developing in Android Studio with gradle.
My .gradle/daemon/ takes up to 50GB of space.
It consists of about 50 files between 1MB and 3GB in size.
How can I reduce the size of that folder, and how do I clean it up safely?
You could, for example, regularly delete the *.out.log files in the .gradle folder in your home directory with the following command:
find ~/.gradle -name '*.out.log' -exec rm {} \;
You can add it via crontab -e so it runs periodically at some appropriate time.
In this example, the cleanup would run at 11am every Monday:
$ crontab -l
0 11 * * 1 /usr/bin/find ~/.gradle -name '*.out.log' -exec rm {} \;
This will work on any Linux or macOS system.
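If the bulk of the space is old daemon logs, a slightly more conservative sketch (assuming GNU or BSD find with -delete, and that no build is currently running) would be to stop the daemons first and only prune logs older than a week:

gradle --stop   # or ./gradlew --stop from a project directory
find ~/.gradle/daemon -name '*.out.log' -mtime +7 -delete

Replacing -delete with -print first shows what would be removed without touching anything.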
Related
I'm setting up a web server with Plesk on Ubuntu 18.04 and I would like to use part of the available space to store security footage. I succeeded in automatically uploading the photos and videos to the correct folder, but the problem is that they are not automatically removed, so the server fills up with security images. I upload the footage to a folder on the server that is also reachable from the internet (secured). I searched the internet for a cron job that automatically deletes files older than 7 days and found this:
find /var/www/vhosts/path to files/* -mtime +7 -exec rm -f {} \;
I also found that you can put it in a file named, for example, delete-files, which can then be run from crontab -e. (Yes, I made it executable ;-)
I added this cron job to run every hour and enabled notifications from cron. Now, however, I get the following output: find: missing argument to `-exec'
Is there anything else that I need to share? Like logs?
Change
find /var/www/vhosts/path to files/* -mtime +7 -exec rm -f {} \;
to
find /var/www/vhosts/path to files/ -mtime +7 -exec rm -f {} \;
The * is unnecessary in the path.
Can you try this as well?
find /var/www/vhosts/path to files/ -mtime +7 | xargs rm -f
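As an aside, if your real path contains spaces (as the placeholder "path to files" suggests), quote it, otherwise the shell hands find several separate arguments. A sketch with a made-up path:

find "/var/www/vhosts/example.com/path to files/" -type f -mtime +7 -exec rm -f {} \;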
I have a folder with large files on CentOS. How do I delete the files older than 30 minutes?
Please suggest your ideas and snippets.
It's simple: just use find and add the following line in crontab:
*/30 * * * * find /path/to/dir -type f -mmin +30 -exec rm -f {} \;
The above entry will run every 30 minutes and delete ONLY files older than 30 minutes from the directory /path/to/dir.
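Before letting cron delete anything, it can help to do a dry run by hand; the same find expression with -print instead of the rm lists what would be removed:

find /path/to/dir -type f -mmin +30 -print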
I have the following directory containing multiple Backup from $(date +"%d.%m.%Y at %H:%M:%S").zip files:
/opt/
/opt/files/
/opt/files/private/*
/opt/files/backup.sh
/opt/files/backup.txt
/opt/files/Backup from $(date +"%d.%m.%Y at %H:%M:%S").zip
With a daily cronjob 0 0 * * * cd /opt/files/ && ./backup.sh > /opt/files/backup.txt I am currently managing my backups.
As you can imagine, this directory gets bigger and bigger over time. I would now like to create another script (or cronjob, if it works with one command) that deletes the oldest /opt/files/Backup from $(date +"%d.%m.%Y at %H:%M:%S").zip files after 14 days, so that I always have the 14 most recent backups.
It would be great if you could explain your answer.
find /opt/files -name 'Backup*.zip' -mtime +14 -ls
If you are satisfied that the files being matched are the ones to delete, replace -ls with -exec rm {} \;
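Put together as a single daily cron entry, a sketch might look like this (assuming the zips live directly in /opt/files and nothing else there matches Backup*.zip):

0 1 * * * find /opt/files -name 'Backup*.zip' -mtime +14 -exec rm {} \;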
I am using a combination of the find and cp commands in my backup script.
It is used on a fairly large amount of data:
first, out of 25 files it needs to find all the files older than 60 minutes,
then copy these files to a temp directory; each of these files is 1.5GB to 2GB,
and one of these 25 files has data being appended to it continuously.
I have learnt from googling that a tar operation will fail if the file being tarred is updated while it runs; is it the same with find and copy?
I am trying something like this:
/usr/bin/find $logPath -mmin +60 -type f -exec /bin/cp {} $logPath/$bkpDirectoryName \;
After this I have a step where I tar the files copied to the temp directory mentioned above ($bkpDirectoryName); for that I use the following:
/bin/tar -czf $bkpDir/$bkpDirectoryName.tgz $logPath/$bkpDirectoryName
and this also fails.
The same backup script has been running for many days and it has suddenly started failing and causing me headaches! Can someone please help me with this?
Can you try these steps please:
Instead of copying files older than 60 minutes, move them.
Run the tar on the moved files.
If you do the above, the file which is continuously appended to will not be moved.
In case any of your other 24 files might be updated after 60 minutes, you can do the following:
Once you move a file, touch a file with the same name in case there are async updates which are not continuous.
When tarring the files, give the tar file a timestamped name. This way you have a rolling tar of your logs (see the sketch after this list).
If nothing works due to some custom requirement on your side, try doing an rsync and then do the same operations on the rsynced files (i.e. find and tar, or just tar).
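A rough sketch of the move-then-tar idea, where the paths and variable values are assumptions rather than your actual script:

#!/bin/bash
logPath=/var/log/myapp          # assumed location of the log files
stageDir="$logPath/staging"     # assumed staging directory for files to archive
bkpDir=/backups                 # assumed destination for the tarballs

mkdir -p "$stageDir" "$bkpDir"

# Move (not copy) files older than 60 minutes; the file still being appended to stays behind
find "$logPath" -maxdepth 1 -mmin +60 -type f -exec mv {} "$stageDir"/ \;

# Tar the staged files under a timestamped name, then empty the staging directory
stamp=$(date +%Y%m%d-%H%M%S)
tar -czf "$bkpDir/logs-$stamp.tgz" -C "$stageDir" .
rm -f "$stageDir"/*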
Try this:
output=`find $logPath -mmin 60 -type f`
if [ "temp$output" != "temp" ]; then
    cp -rf $output $other_than_logPath/$bkpDirectoryName/
else
    echo sorry
fi
I think the problem is that you are using +60 instead of 60.
I would also like to know at what interval your script gets called.
#!/bin/bash
# Copy every file modified 60 minutes ago into the chosen directory
for file in `find / -name "*" -mmin 60`
do
    cp "$file" /   ## Choose directory
done
That's basically what you need, just change the directory, I guess.
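Note that a for loop over find output breaks on file names containing spaces; a safer sketch of the same idea (both paths are placeholders) lets find call cp directly:

find /path/to/logs -maxdepth 1 -type f -mmin 60 -exec cp {} /path/to/destination/ \;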
So I have looked at every single script on here regarding deleting directories older than 14 days. The script I wrote works with files, but for some reason it is not deleting the directories. So here is my script.
#!/bin/bash
find /TBD/* -mtime +1 | xargs rm -rf
So this code successfully deleted the FILES inside /TBD but it left two directories. I checked the timestamp on them and it has been at least 2 days since the last modification, specifically Dec 16 16:10, so I can't figure this out. The crontab entry I have running this runs every minute and logs, and the log only shows:
+ /scripts/deletebackups.sh: :2:BASH_XTRACEFD=3xargs rm -rf
+ /scripts/deletebackups.sh: :2: BASH_XTRACEFD=3find /TBD/contents TBD/contents -mtime +1
I used 'contents' as a placeholder, since the actual contents are people's names on our PXE server. I checked every file and folder INSIDE these two directories and their timestamps are the same as the parent directory, as they should be, but it's still not deleting them.
Could it be a permissions thing? I wrote the script using sudo nano deletebackups.sh
When I type ls under /TBD, on the far left it shows
drwxr-xr-x 3 hscadministrator root 4096 Dec 16 16:10 for each of the two directories that won't delete.
I'm not overly familiar with what all those letters mean.
Other iterations of this code I have already attempted are
find /TBD/* -mtime +1 rm -r {} \;
To delete directories in /TBD older than 1 day:
find /TBD -mtime +1 -type d | xargs rm -f -r
Add -exec and -f to your find:
find /TBD/* -mtime +1 -exec rm -rf {} \;
Note, if you're looking to delete files older than 14 days, you need to change mtime:
-mtime +14
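For the directory case specifically, a sketch that targets only the top-level directories under /TBD and copes with names containing spaces (swap +1 for +14 as noted above) could be:

find /TBD -mindepth 1 -maxdepth 1 -type d -mtime +1 -exec rm -rf {} +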