Why doesn't my Google cron job run automatically? - python-3.x

I have the following Google cron job scheduled to run every 3 minutes:
cron:
description: South32-FTP-Push
url: /
schedule: every 3 minutes
It is supposed to run my main.py and download some files through FTP.
The cron job works fine when I test it through the cron jobs web interface, but it doesn't run automatically every 3 minutes.
Any ideas?

It might be because the cron.yaml content you show isn't actually correct: cron should be a list of one or more cron jobs, so it should look more like this:
cron:
- description: South32-FTP-Push
  url: /
  schedule: every 3 minutes

Solved! I deleted the file and created a new one; it was probably a caching problem.

Related

API Log Pull via Crontab

How can I get Cloudflare WAF logs using a Logpull API request and send the output to a specific file? I also want this to be automated using crontab so it runs every 5 minutes. Does anyone have a crontab script at hand?
Thanks.
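A minimal sketch of such a pull in Python, assuming the zone has access to the Logpull API (the `logs/received` endpoint); the zone ID, API token, and the paths in the crontab line are all placeholders, not working values:

```python
#!/usr/bin/env python3
"""Sketch: pull the last 5 minutes of Cloudflare logs via the Logpull API.
The zone ID and API token are placeholders."""
import urllib.request
from datetime import datetime, timedelta, timezone

API_BASE = "https://api.cloudflare.com/client/v4"

def logpull_url(zone_id: str, minutes: int = 5) -> str:
    """Build the logs/received URL for a window ending one minute ago
    (Logpull requires the end time to lie in the past)."""
    end = datetime.now(timezone.utc).replace(microsecond=0) - timedelta(minutes=1)
    start = end - timedelta(minutes=minutes)
    fmt = "%Y-%m-%dT%H:%M:%SZ"
    return (f"{API_BASE}/zones/{zone_id}/logs/received"
            f"?start={start.strftime(fmt)}&end={end.strftime(fmt)}")

def fetch(zone_id: str, token: str, out_path: str) -> None:
    """Fetch one window of logs and append them to out_path."""
    req = urllib.request.Request(logpull_url(zone_id),
                                 headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp, open(out_path, "ab") as out:
        out.write(resp.read())

# Crontab entry to run the script every 5 minutes (paths are placeholders):
#   */5 * * * * /usr/bin/python3 /path/to/pull_logs.py
```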

Python script executed from cron doesn't send mail attachments

I have a Python script that takes photos with the Raspberry Pi camera and emails them. From the command line everything works: the script starts, does its job, and finishes without errors.
I set up cron to execute the script every hour.
Every hour I get the mail, but without attachments.
Where or how can I check why the script doesn't attach the photos when executed from cron?
When running under cron, check whether your code uses any relative paths; either change them to absolute paths or change the working directory inside the script. Only then will the cron job find the attachment, or anything else the script needs to include.
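The relative-path point can be sketched like this; the helper names and file names below are illustrative placeholders, not the asker's actual script:

```python
import os
from email.message import EmailMessage

# cron runs scripts with an arbitrary working directory (often $HOME or /),
# so a relative path like "photos/latest.jpg" that resolves fine when you
# launch the script from its own directory will not resolve under cron.
# Anchor paths to the script's own location instead.

def absolute_photo_path(script_file: str, photo_name: str) -> str:
    """Resolve photo_name relative to the directory containing script_file."""
    return os.path.join(os.path.dirname(os.path.abspath(script_file)), photo_name)

def attach_photo(msg: EmailMessage, photo_path: str) -> None:
    """Attach the photo at an absolute path, so cron's cwd doesn't matter."""
    with open(photo_path, "rb") as f:
        msg.add_attachment(f.read(), maintype="image", subtype="jpeg",
                           filename=os.path.basename(photo_path))

# Typical use inside the script:
#   photo = absolute_photo_path(__file__, "latest.jpg")
#   attach_photo(msg, photo)
```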

Create cron job to run with MySQL

I've got a code that will delete a WordPress post from my database if it contains a certain text:
DELETE FROM wp_posts WHERE post_excerpt LIKE "%neuroscience%"
I want to get this to run every hour, but I don't know if I should initiate this within the MySQL platform, or via cPanel. I would prefer the latter so all my cron jobs would be in one place. But the truth is I don't know how to code either!
Why not just CREATE TRIGGER in MySQL itself?
It will delete such posts whenever they occur, and if you use a BEFORE INSERT trigger they won't even get into the database.
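If the goal is specifically an hourly run rather than instant deletion, the schedule can also live inside MySQL itself via the event scheduler. A sketch, assuming the scheduler is enabled (`SET GLOBAL event_scheduler = ON;`); the event name is made up:

```sql
CREATE EVENT purge_neuroscience_posts
ON SCHEDULE EVERY 1 HOUR
DO
  DELETE FROM wp_posts WHERE post_excerpt LIKE '%neuroscience%';
```

This keeps the job next to the data it touches, at the cost of not appearing in cPanel's cron job list.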

Azure Scheduled Job Fails to Execute

I had two scheduled jobs running in my Developer Program Benefit subscription. I had set them up originally using the classic portal, and set them to run every hour since I am on the Free tier. Everything was working fine until I had to redeploy the code (console applications).
First problem: I could not find a way to easily update the code for the scheduled jobs, so I deleted and recreated them. I tried to recreate them from the portal, first by creating the WebJob and then by going to the scheduler job collection and creating a schedule for the WebJob.
However, each time it runs, it fails with the following error:
Http Action - Response from host 'mysite.scm.azurewebsites.net':
'Unauthorized' Response Headers: Date: Thu, 16 Mar 2017 04:07:00 GMT
Server: Microsoft-IIS/8.0
WWW-Authenticate: Basic realm="site"
Body:
401 - Unauthorized: Access is denied due to invalid
credentials.....
And some other html stuff unrelated to the error
I also tried to deploy the job directly from Visual Studio 2015 (latest update), but the result is the same: running the job fails with the same error.
It is my understanding that even on free tier I should be able to run a scheduled job (5 of them) every hour.
Why is it failing and complaining about credentials?
EDIT: The job runs fine from App Service > WebJobs, so there's nothing wrong with the job itself; the code executes correctly. I just can't get it to run from the Scheduler.
As far as I know, the "Access is denied due to invalid credentials" error happens when you don't set the authentication information in your Scheduler job's action settings.
I suggest you first find your WebJob's user name and password; you can find them in the WebJob's properties.
Then set that user name and password in the action's settings, and it should work.
Would you be able to run your schedule off of a cron expression instead? You can use a settings.job file deployed alongside the WebJob to provide the schedule it executes on, with no Scheduler (and no Scheduler credentials) involved.
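For reference, a settings.job file placed in the WebJob's root takes a six-field NCRONTAB expression (seconds first); an hourly schedule would look like this:

```json
{
  "schedule": "0 0 * * * *"
}
```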

Create a bot that just visits my website

I have a WordPress website that automatically gets some information from an RSS feed, posts it, and then, with the help of a built-in WordPress function, sets a custom field for that post with a name and a value. The problem is that this custom field only gets set when someone visits the published post. So I have to visit every single new post for the custom field to be applied, or wait for a visitor to do so.
I would like to create a bot, web crawler, or spider that just visits all my new pages once an hour or so, so the custom field gets applied automatically when a post is published.
Is there any way of creating this with PHP or another web-based language? I'm on a Mac, so I don't think Visual Basic is a solution, but I could try installing it.
You could, for instance, write a shell script that invokes wget (or curl, if you don't have wget) and have it scheduled to run every hour, e.g. using cron.
It can be as simple as the following script:
#!/bin/sh
curl -s mysite.com > /dev/null
Assuming it's called visitor.sh and is set to be executable, you can then schedule it by editing your crontab: type crontab -e and add this line:
0 * * * * /path/to/.../visitor.sh
(It means: run the script located at /path/to/.../visitor.sh every round hour.)
Note that the script would run from your computer, so it will only run when the computer is running.
crontab is a good approach; to fetch the pages you can also use curl or lynx, which are pretty lightweight.
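To visit all new posts rather than just the front page, one option is a small Python script that reads the site's RSS feed and fetches every post link it lists; this is a sketch, and the feed URL and crontab path are placeholders:

```python
#!/usr/bin/env python3
"""Sketch: fetch every post URL listed in the site's RSS feed, so WordPress
runs its on-view logic for each post. The feed URL is a placeholder."""
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://mysite.com/feed/"  # placeholder

def post_links(feed_xml: bytes) -> list:
    """Extract the <link> of every <item> in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    return [item.findtext("link") for item in root.iter("item")]

def visit_all() -> None:
    """Fetch the feed, then GET each post URL; a plain GET is enough."""
    with urllib.request.urlopen(FEED_URL) as resp:
        links = post_links(resp.read())
    for url in links:
        urllib.request.urlopen(url).read()

# Run visit_all() from cron, e.g. hourly (path is a placeholder):
#   0 * * * * /usr/bin/python3 /path/to/visit_posts.py
```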
