Cron job email: find: missing argument to `-exec' - Linux

I'm setting up a web server with Plesk on Ubuntu 18.04, and I would like to use part of the available space to store security footage. I succeeded in automatically uploading the photos and videos to the correct folder, but they are not automatically removed, so the server is filling up with security images. I upload the footage to a folder on the server that is also reachable from the internet (secured). While researching a cron job that automatically deletes files older than 7 days, I found this:
find /var/www/vhosts/path to files/* -mtime +7 -exec rm -f {} \;
I also found that you can put this command in a file, for example delete-files, and schedule it with crontab -e. (Yes, I made it executable. ;-)
I added this cron job to run every hour and enabled email notifications from cron. Now, however, I get the following output: find: missing argument to `-exec '
Is there anything else that I need to share? Like logs?

Change
find /var/www/vhosts/path to files/* -mtime +7 -exec rm -f {} \;
to
find /var/www/vhosts/path to files/ -mtime +7 -exec rm -f {} \;
The * is unnecessary in the path; find recurses into the directory on its own.
Can you try this as well?
find /var/www/vhosts/path to files/ -mtime +7 | xargs rm -f
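If the command is being pasted straight into crontab, a hedged sketch of the hourly entry might look like the line below; the directory is quoted in case the real path contains spaces, and -type f is added so only files (not directories) are matched. The path is just the placeholder from the question.
0 * * * * find "/var/www/vhosts/path to files/" -type f -mtime +7 -exec rm -f {} \;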

Related

Why does deleting vsftpd log files with find not release disk space?

Why does deleting the vsftpd log files not release disk space, and is there a good way to resolve that?
First, my shell script is as follows:
find /var/log -mtime +7 -name "vsftpd.log-*" -exec rm -rf {} \;
Running lsof | grep delete shows that the deleted log files are still held open by a process, and I don't want to kill that process manually.
So I modified the script as follows:
find /var/log -mtime +7 -name "vsftpd.log-*" -exec echo ""> {} \;
but that failed; it just created a file named {}.
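(A note on why that happens: the shell performs the > {} redirection before find ever runs, which is why a literal file named {} appears. A minimal sketch of two alternatives that truncate each matching file in place, so the space is freed even while a process still holds the file open:
find /var/log -mtime +7 -name "vsftpd.log-*" -exec sh -c ': > "$1"' _ {} \;
find /var/log -mtime +7 -name "vsftpd.log-*" -exec truncate -s 0 {} \;
Both keep the same path and tests as the script above; truncate is part of GNU coreutils.)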

Linux -mtime not working as supposed to

In RedHat I have a script with the following line:
find /oracle/app/oracle/diag/tnslsnr/listener/alert -name 'log_*' -mtime +5 -exec rm -f {} \;
It is supposed to remove the 'log_*' files that are older than 5 days, but it also removes files that are only one day old.
I think the command is fine; what else could I check that might be causing this unexpected behaviour?
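A low-risk way to debug this kind of thing is to swap the -exec rm action for -ls and compare the output against the files' actual timestamps before anything is deleted; a sketch using the same path and tests as the script above:
find /oracle/app/oracle/diag/tnslsnr/listener/alert -name 'log_*' -mtime +5 -ls
Note that -mtime counts whole 24-hour periods, so +5 should only match files at least six days old; if one-day-old files are disappearing, something other than this test is likely responsible.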

Linux cron job remove .zip files from folder every 12 hours

I'm looking for help to set up a cron job that will delete all .zip files from a subfolder in my HostGator hosting account every 12 hours.
http://prntscr.com/7rr22d
After some Google research I tried this command, but nothing seems to happen:
find /home/username/domain.com -type f -name "*.zip" |xargs rm
What should I put in the "Command:" field?
Maybe try:
find /home/username/domain.com -name "*.zip" -exec rm -rf {} \;
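If the panel expects a full crontab entry rather than just the command, a hedged sketch of an every-12-hours schedule (using the path from the question, with -type f and plain -f so only regular .zip files are removed) would be:
0 */12 * * * find /home/username/domain.com -type f -name "*.zip" -exec rm -f {} \;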

Deleting files that are older than one day [duplicate]

This question already has answers here:
find files older than X days in bash and delete
(3 answers)
Closed 7 years ago.
I have a server which creates several log files in the log directory. This logging mechanism eats up a lot of disk space on my server. I want to write a script that deletes all the files that are older than one day and keeps the latest ones.
I am able to list the files in sorted order using the ls -trl command, but I cannot figure out how to remove the old ones. Please help.
You can use the following command:
/usr/bin/find <Your Log Directory> -mtime +1 | xargs rm -f
mtime - tests the file's modification time.
+1 - matches an age, counted in whole 24-hour periods, greater than one (so in practice files at least two days old).
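One caveat with the pipe-to-xargs form: filenames containing spaces or newlines can break it. Assuming GNU find and xargs (standard on most Linux distributions), a more robust variant of the same command uses null-delimited output:
/usr/bin/find <Your Log Directory> -mtime +1 -print0 | xargs -0 rm -f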
Try using rm with find like this:
find . -mmin +$((60*24)) -exec rm {} \;
You don't want ls, you want find.
It has a neat argument, -mtime, that limits the results to a specific time delta, and -exec which allows you to provide a command to run on the results.
So for example,
find -mtime +10 -name "*tmp*" -exec rm {} \;
Does an rm on all files older than 10 days, with tmp in the name.
Oh, and be careful.
Very careful.
find . -mtime +1 -exec rm {} \;
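If the find in use is GNU find, it also has a built-in -delete action, which avoids spawning rm for every file; a sketch restricted to regular files (the path here is a placeholder, and -delete must come after the tests):
find /path/to/logs -type f -mtime +1 -delete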

In linux shell, How to cp/rm files by time?

In the Linux shell, when I run
ls -al -t
it shows the times of the files. How do I cp/rm files by time, for example copy all the files that were created today or yesterday? Thanks a lot.
Depending on what you actually want to do, find provides -[acm]time options for finding files by accessed, changed or modified dates, along with -newer and the minute-based -[acm]min. You can combine them with -exec to copy, delete, or whatever you want to do. For example:
find -maxdepth 1 -mtime +1 -type f -exec cp '{}' backup \;
Will copy all the regular files in the current directory more than 1 day old to the directory backup (assuming the directory backup exists).
Simple example:
find /path/to/folder/ -mtime 1 -exec rm {} \;   # deletes all files modified one whole day ago (roughly: yesterday)
For more examples, search for "bash find time".
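If the goal is specifically "copy everything from today or yesterday", GNU find's -newermt test accepts a date string, so a sketch (backup/ is a placeholder destination assumed to exist) could be:
find . -maxdepth 1 -type f -newermt 'yesterday 00:00' -exec cp {} backup/ \;
That copies every regular file in the current directory modified since the start of yesterday.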
