Plesk panel cron job to delete folders older than x days - cron

I'm trying to set up a cron job in the Plesk panel to remove folders in the directory /uploads/temp_files.
I'm using this command:
find /uploads/temp_files/* -type d -ctime +30 -exec rm -rf {} \;
but I get an error from Plesk: -: find: command not found
What can I do?
Thanks!

You should use the full path: instead of find, use /bin/find. Depending on your Linux distro, the location might be different. In an SSH shell, run:
which find
The output will show you the exact location of find. Then use that full path in your cron job!
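For example, if which find prints /usr/bin/find (a common location, but verify on your own server), the cron command from the question becomes:

/usr/bin/find /uploads/temp_files/* -type d -ctime +30 -exec rm -rf {} \;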

For security reasons, hosting providers use a chrooted shell.
If your subscription has a chrooted shell, you have limited access to server commands, and there is no find binary in Plesk's default chrooted shell.
You can check this by browsing to /var/www/vhosts/example.com/bin/ in File Manager.
In this case you can ask your hosting provider to add find to your subscription or to the common chroot template, following this KB article: https://support.plesk.com/hc/en-us/articles/213909545--HOWTO-How-to-add-new-programs-to-a-chrooted-shell-environment-template
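You can also check this from a shell; a quick sketch, using the vhost path from above (from inside the chrooted shell itself, the same directory simply appears as /bin):

ls -l /var/www/vhosts/example.com/bin/

If find does not appear in that listing, it is not part of your chroot environment.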

Related

Cron job Linux Shell command SiteGround - What Directory does it run in?

New to SiteGround hosting, cron jobs, and Linux.
I have a directory where I want to remove files older than 30 days on my WordPress website.
The Linux shell command is
find path/to/files/* -mtime +30 -exec rm {} \;
The question is about the path to use; I don't know what directory this cron job runs from.
If I create a cron job with just
find *
The log file produces the following results...
tmp
tmp/somefile1
tmp/somefile2
etc...
Where is tmp? And what path should I use in my command to reach the directory I want?
If I FTP to my site using FileZilla, this is the path to my directory in question...
/mydomain.com/public_html/sr
So, if I edit my Cron job to execute the following command...
find /mydomain.com/public_html/sr/*
I get
find: '/mydomain.com/public_html/sr/*': No such file or directory
So I'm just not sure how to specify the path for my cron job so that it operates on the desired directory.
Found the answer after much searching.
/home/customer/www/yourdomain.com/public_html/sr
Hope this helps others. The support was no help at all.
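Putting that together with the command from the question, the cron job becomes (substitute your own domain):

find /home/customer/www/yourdomain.com/public_html/sr/* -mtime +30 -exec rm {} \;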

Find command not working as expected in CentOS

I am using CentOS Linux release 7.0.1406 in VirtualBox. I am trying to find files using the find command.
This find command is not giving any output:
find . -name "orm.properties"
My current working directory is /eserver6. The file orm.properties is present in /eserver6/share/system/config/cluster, but find is not able to find it.
I have tried other combinations like
find . -name "orm.*"
find . -name 'orm*'
These find a few files starting with orm, but not all of the matching files under the current working directory.
The command line looks correct and it should find the file. Some reasons why it might fail:
You don't have permission to enter one of the folders in the path to /eserver6/share/system/config/cluster.
You made a typo
The file system is remote and the remote file system behaves oddly
There is a symlink somewhere in the path. By default, find doesn't follow symlinks to avoid recursive loops. Use find -L /eserver6 ... to tell find to look at the target of the link and follow it if it's a folder.
The command
find /eserver6 -name "orm.properties"
should definitely find the file, no matter where you are. If it doesn't, look at the -D debugoptions section in the manpage. You probably want -D stat to see which files find examines and what it sees.
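For example (GNU find; the -D option must come before the search path):

find -D stat /eserver6 -name "orm.properties"

The debug output goes to stderr, so you can redirect it with 2>debug.log if there is a lot of it.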
If your user has an entry in the sudoers file, you can run
sudo find / -name "orm.properties"
Otherwise, ask your admin to add an entry for your user to the sudoers file, then run the same command and it will work.

Access forbidden to website after scp transfer

I used scp2 to transfer a folder from Windows to Ubuntu.
I executed the scp2 process as part of a gulp run.
My project was successfully transferred to the server, but when I tried to navigate to the site in the browser I got a 403 Forbidden message.
The problem is that the scp2 process didn't set permissions on the newly created folders and files.
When I execute the following lines on the server, it works fine:
find ProjFolder -type d -exec chmod 755 {} \;
find ProjFolder -type f -exec chmod 644 {} \;
My question is: how can I transfer my project from my local machine to the server without having to run these permission commands every time?
To preserve permissions, try using rsync; it has a lot more benefits besides keeping ownership and permissions, such as incremental copies:
rsync -av source 192.0.2.1:/dest/ination
EDIT [according to comments]:
This works well for transfers between two Linux systems, but doesn't seem to work for Windows -> Linux transfers. Apparently PuTTY works best for transfers involving Windows on one side and Linux on the other.
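If the goal is the 755/644 scheme from the question rather than preserving the source permissions exactly, rsync can also set permissions during the transfer; a sketch using its --chmod option:

rsync -av --chmod=D755,F644 source 192.0.2.1:/dest/ination

Here D755 applies to directories and F644 to files.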

Migrate Perforce depot from Windows to Linux

I am trying to migrate a Perforce depot from Windows to Linux and am having issues accessing files on Linux after the migration. These are the steps I followed to migrate:
On Windows, I ran the following commands:
p4d -r P4ROOT -xv
p4d -r P4ROOT -jc finalcheckpoint
Then I copied the depot onto Ubuntu and ran the following commands:
p4d -r P4ROOT -jr finalcheckpoint
p4d -r . -p localhost:1666
p4 verify -q //...
I didn't get any errors while running p4 verify, but when I try to check out files I get the error: Path not found.
Am I missing a step here? If anyone has migrated from Windows to Linux, could you please share the steps you took to migrate?
Thanks,
Vijay
There is a big difference when moving from Windows to Linux: you are typically moving from a case-insensitive platform to a case-sensitive one. There is a very good and detailed knowledge base article covering this on the main perforce.com site: http://kb.perforce.com/article/75/cross-platform-perforce-server-migration
Your steps look more or less correct, but you never updated the internal line endings of the versioned files. The KB article recommends this short shell + Perl one-liner:
find . -type f -name '*,v' -print -exec perl -p -i -e 's/\r\n/\n/' {} \;
Your 'Path not found' error is likely a mismatch in your client workspace definition. I would suggest creating a new workspace to test with, and make sure the paths you use are specified exactly, paying special attention to upper- and lowercase characters.
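A minimal test could look like this (the workspace name and depot path here are hypothetical; adjust them to your depot layout):

p4 -p localhost:1666 client linux-test
(fill in the Root and View fields in the editor that opens)
p4 -p localhost:1666 -c linux-test sync //depot/...

If the sync succeeds with the exact-case depot path, the old workspace definition is the culprit.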
You will need to use the p4migrate tool:
ftp://ftp.perforce.com/perforce/tools/p4-migrate/p4migrate.html
Check the section "Migrating from Windows to Unix".

Is there a way to diff chown/chmod between two servers' directories?

Platform: CentOS 5.6 x86_64
I have a production server and a development server. I want to debug file ownership and permissions across a large directory structure, which is almost identical on both machines, give or take a few ephemeral files in temporary caches.
Does anyone know if this is possible? Manually checking file-for-file would not be practical, given the size of the directory tree.
Thanks in advance.
http://linuxconfig.org/backup-permissions-in-linux
This is the BEST script to back up and restore the permissions of directories. Once you have the directory permissions listing from both servers, just run a diff on them (you might want to make some modifications before that).
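The core of that approach is getfacl/setfacl (a sketch of the same idea; the linked script may differ in details, and the file names here are made up):

getfacl -R /path/to/directory > perms_prod.txt
diff perms_prod.txt perms_dev.txt
setfacl --restore=perms_prod.txt

Run getfacl from the same parent directory on both servers so the relative paths in the dumps line up.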
Just use find on both servers with the -ls flag, like:
find directory_a -not \( test_for_ephemeral_files \) -ls > listing_a
find directory_b -not \( test_for_ephemeral_files \) -ls > listing_b
diff listing_a listing_b
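Note that -ls also prints inode numbers, sizes, and timestamps, which will legitimately differ between the two servers and add noise to the diff. A variant that captures only permissions, owner, group, and path (GNU find; the cache-exclusion pattern is a made-up example):

find directory_a -not \( -path '*/cache/*' \) -printf '%M %u %g %p\n' | sort > listing_a
find directory_b -not \( -path '*/cache/*' \) -printf '%M %u %g %p\n' | sort > listing_b
diff listing_a listing_b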
