Python script executed from cron doesn't send mail - python-3.x

I have a Python script that takes photos with the Raspberry Pi camera and sends an email with those photos. From the command line everything works fine: the script runs, does its job and finishes without errors.
I set up cron to execute the script every hour.
Every hour I get the mail, but without the attachments.
Where or how can I check why the script doesn't send attachments when it is executed from cron?

When the job runs from cron, check whether your code uses any relative paths; either change them to absolute paths or change the working directory inside the script. Only then will the cron job find the attachment, or anything else your script needs to include.
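For instance, a minimal sketch of a crontab entry that works around both problems at once: it changes into the script's directory first so relative paths resolve, and it redirects output to a log file so any error cron would otherwise swallow becomes visible. The directory /home/pi/camera, the script name take_photos_and_mail.py and the log file are placeholders, not taken from the original question:
# Hypothetical crontab entry (edit with crontab -e); adjust paths to your setup.
# Run hourly from the script's own directory and keep a log of stdout/stderr.
0 * * * * cd /home/pi/camera && /usr/bin/python3 take_photos_and_mail.py >> /home/pi/camera/cron.log 2>&1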

Related

API Log Pull via Crontab

How can I get Cloudflare WAF logs using a Logpull API request and send the output to a specific file?
I also want this to be automated using crontab so it runs every 5 minutes.
Does anyone have the crontab script in hand?
Thanks,
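A hedged sketch of one common approach: a small wrapper script that calls the Cloudflare Logpull endpoint with curl and appends the output to a file, scheduled from crontab every 5 minutes. The endpoint, the Authorization header, the zone ID, the GNU-date time-window handling and the output path are all assumptions to verify against Cloudflare's Logpull documentation:
#!/bin/sh
# pull_waf_logs.sh -- hypothetical wrapper; zone ID, token and output path are placeholders.
ZONE_ID="your_zone_id"
API_TOKEN="your_api_token"
# Ask for roughly the last five minutes of logs; the Logpull API requires the
# window to lie somewhat in the past (check the docs for exact limits).
# GNU date syntax is assumed here.
START=$(date -u -d '-6 minutes' +%Y-%m-%dT%H:%M:%SZ)
END=$(date -u -d '-1 minute' +%Y-%m-%dT%H:%M:%SZ)
curl -s "https://api.cloudflare.com/client/v4/zones/${ZONE_ID}/logs/received?start=${START}&end=${END}" \
  -H "Authorization: Bearer ${API_TOKEN}" >> /var/log/cloudflare_waf.log
And the crontab entry to run it every 5 minutes:
*/5 * * * * /path/to/pull_waf_logs.sh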

How can I automate logging into a Linux server and running git fetch/pull on a website?

Here's what I want to do:
I have a hosted website on a Linux server.
This site is pointed to a GitHub repository.
I want to be able to push changes to the repository, then be able to log into my website and click a button to have the site pull the new code in order to update itself.
Here's how I do it manually:
I created a file on the Linux server called update_site
I log into my Linux server via SSH and type ./update_site, which goes to the site's directory and executes a fetch and pull
the first time it asked me to enter my login and password, which I did
but since I had set git config --global credential.helper store, it never asks me again
Here's how I want to automate it:
I created a PHP file which executes the update_site file
However, since the PHP website apparently executes code as another user, it doesn't have my credentials for GitHub
Here's my question:
How can I automate the process so that when the website executes the update_site file, my GitHub login and password are sent as well? And needless to say, how can I do this as securely as possible, i.e. without saving my GitHub password somewhere as plain text?
One possible way to do this automation is to use cron. Edit your crontab (with the crontab -e command) and add a line like this:
*/5 * * * * /path/to/update_site
In the line above, */5 means every 5 minutes.
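A hedged sketch of how the pieces could fit together, assuming the repository's working copy lives at /var/www/mysite and that the crontab entry is installed for the same user whose credentials were saved by credential.helper store; all names and paths are placeholders:
#!/bin/sh
# update_site -- hypothetical version of the script; the repository path is a placeholder.
cd /var/www/mysite || exit 1
git fetch
git pull
And the crontab entry (crontab -e as that same user):
*/5 * * * * /home/youruser/update_site >> /home/youruser/update_site.log 2>&1
Note that credential.helper store keeps the password in plain text in ~/.git-credentials; if that is a concern, a read-only SSH deploy key for the repository is a common alternative.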

Why doesn't my Google cron job work automatically?

I have the following Google cron job scheduled to run every 3 minutes:
cron:
description: South32-FTP-Push
url: /
schedule: every 3 minutes
It is supposed to run my main.py and download some files through FTP.
The cron job works fine when I test it through the cron jobs web interface, but it doesn't run automatically every 3 minutes.
Any ideas?
It might be because the cron.yaml content you show isn't actually correct: cron should be a list of one or more cron jobs, so it should look more like this:
cron:
- description: South32-FTP-Push
  url: /
  schedule: every 3 minutes
Solved! I deleted the file and created a new one; it was probably a caching problem.
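As a related check (an assumption on my part, assuming the app uses the standard gcloud tooling): edits to cron.yaml only take effect after the file is deployed, typically with:
# Deploy the updated cron configuration to App Engine
gcloud app deploy cron.yaml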

How to determine site web root on Shared Hosting and set up a Scheduled Task?

I am still new to scheduled tasks. My problem is: where should I put the PHP script I will make?
When creating a Scheduled Task I need to fill in this:
Specify the full path to the script. Example: /tmp/script.php
How can I get the full path? I already created a web user in my domain.
For example, in my domain I will put my script inside sample_website, so my full path will be like this?
/usr/bin/php -q /home/my_domain.ph/public_html/sample_website/cron_script.php
Please help me, guys. I am still new to this. Thanks.
Can you provide a step-by-step process for this?
To determine your script's full path on the server, you can always create a PHP script with this content:
<?php
echo(__FILE__);
and place it in your web root, like /httpdocs/script.php
Then you access it in a browser, like http://you-domain.name/script.php
It will show the script's path on the server; it should be something like:
/var/www/vhosts/you-domain.name/httpdocs/script.php
Now you know that your files are placed under /var/www/vhosts/you-domain.name/httpdocs/ and you can use that path to call scripts from scheduled tasks.
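Tying that back to the question, the Scheduled Task command would then look something like the line below; the vhost path, the sample_website folder and the PHP binary location are placeholders that depend on the actual host:
# Hypothetical scheduled-task command, using the web root discovered above
/usr/bin/php -q /var/www/vhosts/you-domain.name/httpdocs/sample_website/cron_script.php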

Create a bot that just visits my website

I have a WordPress website that automatically gets some information from an RSS feed, posts it and then, with the help of a built-in WordPress function, sets a custom field for that post with a name and a value. The problem is that this custom field only gets set when someone visits the published post. So I have to visit every single new post for the custom field to be applied, or wait for a visitor to do so.
I would like to create a bot, web crawler or spider that just visits all my new web pages once an hour or so, so that the custom field gets applied automatically when a post is published.
Is there any way of creating this with PHP or another web-based language? I'm on a Mac, so I don't think Visual Basic is a solution, but I could try installing it.
You could for instance write a shell script that invokes wget (or, if you don't have it, curl instead) and have it scheduled to run every hour, e.g. using cron.
It can be as simple as the following script:
#!/bin/sh
# Request the home page; the output is discarded since only the visit itself matters.
curl -s -o /dev/null mysite.com
Assuming it's called visitor.sh and is set to be executable, you can then edit your crontab by typing crontab -e to schedule it. You will essentially need to add this line to your crontab:
0 * * * * /path/to/.../visitor.sh
(It means: run the script located at /path/to/.../visitor.sh at the top of every hour.)
Note that the script would run from your computer, so it will only run when the computer is running.
crontab is a good approach; you can also use curl or lynx to fetch the pages. They are pretty lightweight.
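If hitting only the home page turns out not to trigger the custom-field code for every new post, a slightly longer sketch is to read the post URLs out of the site's RSS feed and request each one. Everything here is an assumption: the feed URL, the fact that the feed puts each <link> element on its own line, and that a plain GET is enough to trigger the WordPress hook:
#!/bin/sh
# visit_new_posts.sh -- hypothetical: request every URL listed in the RSS feed
# so WordPress runs its per-visit code for each new post.
curl -s http://mysite.com/feed/ \
  | grep -o '<link>[^<]*</link>' \
  | sed -e 's#<link>##' -e 's#</link>##' \
  | while read -r url; do
      curl -s -o /dev/null "$url"
    done
This can be scheduled from crontab the same way as visitor.sh above.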
