I want to set up a cron job in cPanel for Yii2. I have set up the cron job, but it is not working. My controller is named Cron, and when I hit its URL in the browser it works fine.
However, when I set its path in cron, it does not run. My controller is in the frontend/web folder.
I have set the cron command like this:
/usr/local/bin/php /home/raanet/public_html/frontend/web/ cron
but it is not working for me. Can anyone please help me resolve this issue?
I think you are trying to call a web controller from the console. There are two ways to resolve this. You can make a console controller, if that is possible in your case; read about console controllers here. The other thing you can do is set up a curl request in the crontab instead; here is a list of examples. A rough sketch of both options follows. Please let me know how it goes and whether there are any other issues.
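For the console approach, a minimal controller could look like this (the class, action, and file locations are assumptions based on the Yii2 advanced template, where console controllers live under console/controllers and are run through the yii entry script in the project root):

<?php
// console/controllers/CronController.php (assumed location, Yii2 advanced template)
namespace console\controllers;

use yii\console\Controller;
use yii\console\ExitCode;

class CronController extends Controller
{
    // Invoked from the command line as: php /path/to/project/yii cron/run
    public function actionRun()
    {
        // ... move the work your web Cron controller currently does here ...
        return ExitCode::OK;
    }
}

The crontab entry then calls the console entry script instead of the frontend/web folder, for example (assuming the project root is /home/raanet/public_html):

0 * * * * /usr/local/bin/php /home/raanet/public_html/yii cron/run

For the curl approach, the crontab entry simply requests the web action you already have, for example (the URL is an assumption; use whatever URL works in your browser):

0 * * * * curl -s "https://your-domain.com/index.php?r=cron/index" > /dev/null 2>&1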
Related
I have an API built with Express JS and deployed on cPanel. The API has a script; let's say the endpoint looks like this:
/api/v1/cron
When the URL is hit, an SQL query runs and some data is inserted into the database; that part works fine.
What I want is to automate the process with a cron job: the URL should be hit once every hour so the query executes and pushes data to the database.
I have tried the basic settings in cPanel with a command like this, but it didn't work:
/usr/local/bin/php -q /home2/{domain}/api/v1/cron
Please note: the API is on a subdomain, like node-api.google.com.
I have also tried the node-cron package, but couldn't find a way to run the script with it.
Either solution would work for me.
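In other words, what I'm after is the equivalent of an hourly crontab entry that just hits the URL. A rough sketch (assuming the endpoint is publicly reachable on the example subdomain above) would be:

# every hour; the URL below is the example subdomain endpoint from this question
0 * * * * curl -s "https://node-api.google.com/api/v1/cron" > /dev/null 2>&1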
I have a pipeline that I run with Nextflow, which is a workflow framework.
It has an option for viewing real-time logs via an HTTP server.
The command to do this is like so:
nextflow run script.nf --with-weblog http://localhost:8891
But I don't see anything when I open my web browser. I have port-forwarded while logging into the Ubuntu instance, and the Python HTTP server seems to work fine.
I need help understanding how to set this up so I can view the logs generated by my script at the URL provided.
Thanks in advance!
In Nextflow you need to be careful with the leading dashes of command-line parameters. Everything starting with two dashes, like --input, is forwarded to your workflow/processes (e.g. as params.input), while parameters with a single leading dash, like -entry, are interpreted as options by Nextflow itself.
It might be a typo in your question, but to make this work you have to use -with-weblog <url> (note that I used only a single dash here).
See the corresponding docs for further information on this:
Nextflow is able to send detailed workflow execution metadata and runtime statistics to a HTTP endpoint. To enable this feature use the -with-weblog as shown below:
nextflow run <pipeline name> -with-weblog [url]
However, this might not be the only problem you are encountering with your setup. You will also have to store or process the webhooks that Nextflow sends to the server.
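Purely as an illustration of what "processing the webhooks" means (this is not Nextflow tooling, and the file names here are made up): -with-weblog sends HTTP POST requests with JSON payloads to the URL, so the endpoint has to accept those requests and persist them somewhere. A throwaway handler could be as small as the sketch below; a tiny Flask or Express handler would work just as well.

<?php
// weblog-receiver.php (illustrative sketch only; names and paths are assumptions)
// Appends each JSON message POSTed by Nextflow's -with-weblog to a log file.
$payload = file_get_contents('php://input');
file_put_contents(__DIR__ . '/weblog.jsonl', $payload . PHP_EOL, FILE_APPEND);
http_response_code(200);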
P.S.: since this is already an old question, how did you proceed? Have you solved the issue yourself in the meantime, or did you give up on it?
I am still new to scheduled tasks. My problem is: where should I put the PHP script I will make?
When creating a Scheduled Task I need to fill in the following:
Specify the full path to the script. Example: /tmp/script.php
How can I get the full path? I have already created a web user on my domain.
For example, on my domain I will put my script inside sample_website, so will my full path be like this?
/usr/bin/php -q /home/my_domain.ph/public_html/sample_website/cron_script.php
Please help me, guys. I am still new to this. Thanks.
Can you provide a step-by-step process for this?
To determine your script's full path on the server, you can always create a PHP script with this content:
<?php
echo(__FILE__);
and place it in your web root, like /httpdocs/script.php
Then you access it via a browser, like http://you-domain.name/script.php
It will show the script's path on the server; it should be something like:
/var/www/vhosts/you-domain.name/httpdocs/script.php
Now you know that your files are placed under /var/www/vhosts/you-domain.name/httpdocs/ and can use that path to call scripts from scheduled tasks.
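For example (illustrative only; your actual vhost directory and PHP binary location may differ), the scheduled-task command from your question would then become something like:

/usr/bin/php -q /var/www/vhosts/you-domain.name/httpdocs/sample_website/cron_script.php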
I'm working on a webshop in which it should be possible to subscribe to a service for a monthly $20 fee.
I've made a script in a C# user control which will be called monthly by an Umbraco scheduled task command:
<task log="true" alias="test60" interval="60" url="http://mysite/umbraco/subscriptionPayment.aspx"/>
I have two questions about that.
1) Where should I put the user control in order to make it inaccessible to the public but accessible to the Umbraco task command? It is very important that the script can only be accessed by local server calls.
2) I would like the script to write to a log file each time a transaction is made. I'm using the following code:
File.AppendAllText("paymentlog.txt",
    "Transaction " + transactionNumber.ToString() + " successfully executed at " + DateTime.Now.ToString() + Environment.NewLine);
I just don't know which path I should give for the paymentlog.txt file, since handling real paths in Umbraco seems kind of obscure to me. I would like the paymentlog.txt file to be placed in the Umbraco root folder. How do I do that?
Thanks in advance (I'm running umbraco 4.8 and uCommerce 2.6.1).
Best regards,
Brinck10
The task scheduler in Umbraco isn't very reliable, and this sounds crucial to your business, so I would probably use the built-in Windows Task Scheduler on your server to do the processing.
That being said, to make the task-scheduler approach a little more secure, you could make the URL you are calling accessible only on localhost/127.0.0.1 and use the localhost address in your task configuration.
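One way to do that (a sketch only, assuming IIS 7+ with the "IP and Domain Restrictions" feature installed; note that the ipSecurity section is locked at the server level by default, so you may need to unlock it or configure the restriction through IIS Manager instead) is an IP restriction on the page the task calls:

<!-- web.config: only allow requests from the local machine to the task page.
     The path is an assumption based on the task URL in the question. -->
<location path="umbraco/subscriptionPayment.aspx">
  <system.webServer>
    <security>
      <ipSecurity allowUnlisted="false">
        <add ipAddress="127.0.0.1" allowed="true" />
      </ipSecurity>
    </security>
  </system.webServer>
</location>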
You can also make a call to the authentication service to verify that the caller is logged in. The following code will achieve that:
using UCommerce.Infrastructure;
using UCommerce.Security;

// Resolve uCommerce's authentication service from its ObjectFactory container
var authService = ObjectFactory.Instance.Resolve<IAuthenticationService>();
if (authService.IsAuthenticated())
{
    // do secure code here
}
I have a WordPress website that automatically gets some information from an RSS feed, posts it, and then, with the help of a built-in WordPress function, sets a custom field for that post with a name and a value. The problem is that this custom field only gets set when someone visits the published post. So I have to visit every single new post for the custom field to be applied, or wait for a visitor to do so.
I was looking to create a bot, web crawler, or spider that just visits all my new web pages once an hour or so, so the custom field gets applied automatically when the post is published.
Is there any way of creating this with PHP or another web-based language? I'm on a Mac, so I don't think Visual Basic is a solution, but I could try installing it.
You could, for instance, write a shell script that invokes wget (or, if you don't have it, plain curl) and have it scheduled to run every hour, e.g. using cron.
It can be as simple as the following script:
#!/bin/sh
# fetch the site and discard the output
curl -s http://mysite.com/ > /dev/null
Assuming it's called visitor.sh and is set to be executable, you can then edit your crontab by typing crontab -e to schedule it. Here is a link that explains how to do that second part. You will essentially need to add this line to your crontab:
0 * * * * /path/to/.../visitor.sh
(It means: run the script located at /path/to/.../visitor.sh at the top of every hour.)
Note that the script would run from your computer, so it will only run when the computer is running.
crontab is a good point; you can also use curl or lynx to fetch the pages. They are pretty lightweight.