I have been thinking about using cron jobs recently. My site has css, js and images folders in its setup, which probably isn't relevant but might be worth knowing.
I know how to write a cron job, but I'm unsure where to put the file in my project so that it always runs every day.
So where should I put the cron job file? Should I create a new folder for it, and what should the file extension be?
Log in to your system via SSH, then enter:
crontab -e
If this is your first time editing, it may ask you what editor you would like to use.
Then start editing.
*/1 * * * * /var/www/mysite/public/cron/script.php
This will run script.php every minute (provided the script is executable with a proper shebang line; otherwise put the path to the php interpreter in front of it).
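Since the question asks for a daily run rather than every minute, an entry along these lines would do it (the 3 a.m. time is only an example, and /usr/bin/php is assumed to be where the PHP interpreter lives):
0 3 * * * /usr/bin/php /var/www/mysite/public/cron/script.php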
The cron I have installed on my Mythbuntu system keeps its daily cron scripts in /etc/cron.daily/.
File extensions don't matter on *nix. The file just needs to have executable permissions (and should have a shebang line at the top to state what program it should be run with).
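For example, assuming a PHP script that should run from cron (path hypothetical), its first line would be a shebang such as:
#!/usr/bin/env php
and it would be marked executable with:
chmod +x /var/www/mysite/cron/script.php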
Anywhere, but I recommend somewhere outside the web root. The file extension should match the file type.
It does not matter where you put it, as long as you reference all included files by absolute paths to avoid confusion. I've run into situations where
include '../../start.php';
had issues when run with the php command (/usr/bin/php, I think it was), probably because relative paths resolve against the current working directory, which is different from the one Apache serves the script from. So when including files I would use $_SERVER['DOCUMENT_ROOT'] as a reference point.
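As a rough sketch (file name hypothetical), that looks like:
include $_SERVER['DOCUMENT_ROOT'] . '/start.php';
Note that $_SERVER['DOCUMENT_ROOT'] is only populated when the script is served by the web server (for example via the wget approach below); for command-line runs, PHP's __DIR__ constant gives a similarly stable reference point.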
Alternatively, you can always use the wget command to run the script as if you were requesting it from your own browser. Here's what I use:
wget http://www.mydomain.ca/cron/cron_whatever.php
The timing can be set using the cPanel cron option, or you can write the crontab entry yourself.
And always keep email notifications turned on so the output is mailed to you and you can see if there are any issues.
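Put together, the crontab entries might look like this (the address and the 4:30 a.m. schedule are only examples; wget -O - prints the page body so cron can mail it to you):
MAILTO=you@example.com
30 4 * * * wget -q -O - http://www.mydomain.ca/cron/cron_whatever.php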
You don't edit the cron table directly; instead you run crontab -e, which lets you edit it and then saves it into a system area.
You can use SSH as described in the other answers, but some hosting providers use cPanel, which lets you create those cron jobs through an easy-to-use web interface and also makes it easy to set the correct run times.
Good day!
There are PHP scripts, classes and configs, and all of this is interconnected. I need to give a person access to the server so that he can work with these scripts (which run under root), changing only the config files, while not being able to view the source code.
I've researched various free obfuscators, which convert the code into something like this:
<?php include(base64_decode('Li4vY29uZmlnLnBocA=='));include(base64_decode('cHJpdmF0ZS92ZW5kb3IvYXV0b2xvYWQucGhw'));$krc_5bf7f45b=[];foreach($bhi_6f9322e1 as $xol_e8b7be43){$xol_e8b7be43=explode(base64_decode('Og=='),$xol_e8b7be43);try{$uic_c59361f8=new \xee_d9cb1642\cko_659fc60();$uic_c59361f8->ldc_aa08cb10($xol_e8b7be43[0],$xol_e8b7be43[1]);$krc_5bf7f45b[]=$uic_c59361f8;}catch(Exception $wky_efda7a5a)
But what if the config files refer to variables by name, and after obfuscating the main working code those variables end up with different names? I don't want to force the user to run the config through the obfuscator to correct it every time, yet so far that seems like the only option.
Is it possible, on an Ubuntu server, to somehow limit the ability to copy, view or download certain files, or to protect and hide them in some other way, while still keeping the ability to run the code? I thought about hiding the code deep in the file system under randomly named folders and running it through symlinks or something like that. Is that possible?
Another option is not to provide root access to the server at all, but to launch the scripts via the browser and give FTP access only for uploading the config to a separate folder. But there are complications: the scripts run for up to a week and must be executed as root. How can this be solved?
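For what it's worth, the usual way to express "can run it but cannot read or copy it" on Linux is restrictive file permissions plus a narrow sudo rule; a minimal sketch (user name and paths are hypothetical) would be:
chown -R root:root /opt/private/scripts
chmod -R 700 /opt/private/scripts
echo 'worker ALL=(root) NOPASSWD: /usr/bin/php /opt/private/scripts/run.php' > /etc/sudoers.d/worker-run
chmod 440 /etc/sudoers.d/worker-run
The worker account can then start the script with sudo /usr/bin/php /opt/private/scripts/run.php, but cannot open or download the source, while the config folder can stay readable and writable for that account.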
I am organizing a movie library where many users add movie files, and I then manually run a PHP script to update a database so that users can see the list of movies in the library from a URL.
I know I can add a crontab entry on Linux so that the PHP script runs every 10 hours to update the database. However, I'd like to know whether there is a way to run the PHP file automatically when a user copies a movie file into the library - some kind of notifier that invokes the PHP script so the database is updated in real time.
I am using Linux Mint.
You can use incron, which is an inotify-based crontab.
For instance, here is a sample you can use with incrontab -e:
/home/moviedb/download IN_CLOSE_WRITE /home/moviedb/classify-script.php $#
which will watch the /home/moviedb/download folder for any finished download (a file closed after writing) and run classify-script.php with the filename from the event.
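The script itself just needs to be executable and to take the filename as its first argument; a minimal sketch of classify-script.php (the actual database update is left out) might be:
#!/usr/bin/env php
<?php
// incron passes the file name matched by $# as the first argument
if ($argc < 2) {
    exit(1);
}
$file = $argv[1];
// ... update the movie database entry for /home/moviedb/download/$file here ...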
I'm using OpenCart on my website.
I want to run a cron job for these URLs:
http://www.mywebsite.com/index.php?route=feed/google_sitemap
http://www.mywebsite/index.php?route=feed/google_sitemap/mainIndexXml
How do I specify these URLs in the cron manager?
The simplest way would be to use wget, depending on your privileges. That said, I don't understand exactly why you would be running these; I'm guessing you want their caches updated periodically?
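Assuming wget is available on the server, the crontab entries could look like this (the 2 a.m. schedule is only an example; the URLs are quoted so the shell doesn't touch the query string, and the output is discarded):
0 2 * * * wget -q -O /dev/null "http://www.mywebsite.com/index.php?route=feed/google_sitemap"
0 2 * * * wget -q -O /dev/null "http://www.mywebsite/index.php?route=feed/google_sitemap/mainIndexXml"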
I need to audit a directory and call a script with the file path as a parameter whenever a file is created there. Reading the auditctl man page, I can't find a way to do it.
There are references on the web to inotify or iwatch, which should do what I need, but I'd rather use the standard auditd functionality than install extra software.
If it's really not possible to use auditd to track file creation and call the script for that file, a short sample of an iwatch/inotify command that does the trick will be appreciated and accepted.
For the CentOS environment, the pyinotify module was used; it handles directory watching pretty well and triggers the desired scripts.
Unfortunately, I wasn't able to find a solution using pure auditd.
A list of examples showing how to use pyinotify can be found in the project's documentation.
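For reference, a minimal pyinotify sketch of this kind of watcher (the watched directory and the called script are hypothetical) looks like this:
import subprocess
import pyinotify

WATCH_DIR = '/data/incoming'                     # directory to watch (hypothetical)
HANDLER = '/usr/local/bin/handle-new-file.sh'    # script to call (hypothetical)

class OnNewFile(pyinotify.ProcessEvent):
    def process_IN_CLOSE_WRITE(self, event):
        # fires once a new file has been fully written and closed
        subprocess.call([HANDLER, event.pathname])

wm = pyinotify.WatchManager()
wm.add_watch(WATCH_DIR, pyinotify.IN_CLOSE_WRITE)
pyinotify.Notifier(wm, OnNewFile()).loop()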
I need to launch two applications on system startup.
A .desktop file is created to start the first one. Unfortunately, I could not find any way to launch both applications from a single .desktop file. If there is a way, please write it in the comments.
As a workaround, I created a second .desktop file to launch the second app.
Now I need to know the order in which *.desktop files are invoked at autostart.
Is that done in alphabetical order?
Regards,
Levon
You can always run a script which then runs your applications. For example:
#!/bin/bash
# launch both applications; background the first so the second starts without waiting for it to exit
app1 &
app2
Or, if you don't want to keep a separate script, adding the line Exec=app1;app2 to your .desktop file may do the trick (though the Exec key is not run through a shell, so this is less reliable than the wrapper script).
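For the first option, the autostart .desktop entry then simply points at the wrapper script; a minimal example (script path hypothetical) placed in ~/.config/autostart would be:
[Desktop Entry]
Type=Application
Name=Start my apps
Exec=/home/levon/bin/start-apps.sh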