API Log Pull via Crontab - security

How can I get Cloudflare WAF logs with a Logpull API request and send the output to a specific file?
I also want this to be automated with crontab so it runs every 5 minutes.
Does anyone have such a crontab script at hand?
Thanks,
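No ready-made script in hand, but here is a minimal sketch of the idea, assuming a GNU/Linux host (GNU date) and an API token created with the "Logs: Read" permission; as far as I know, Logpull is only available on Enterprise zones. ZONE_ID, API_TOKEN, the output path, and the script's location are all placeholders you must fill in for your own account:

```shell
#!/bin/sh
# pull_waf_logs.sh -- sketch only, not verified against a live zone.
# Crontab entry (crontab -e) to run it every 5 minutes:
#   */5 * * * * /path/to/pull_waf_logs.sh run
ZONE_ID="${ZONE_ID:-your_zone_id}"
API_TOKEN="${API_TOKEN:-your_api_token}"
OUT="${OUT:-./waf.log}"

# Logpull rejects windows ending less than about a minute in the past,
# so request the 5-minute window that ended 5 minutes ago.
logpull_url() {
    start=$(date -u -d '10 minutes ago' +%Y-%m-%dT%H:%M:%SZ)
    end=$(date -u -d '5 minutes ago' +%Y-%m-%dT%H:%M:%SZ)
    echo "https://api.cloudflare.com/client/v4/zones/${ZONE_ID}/logs/received?start=${start}&end=${end}"
}

# Only hit the API when invoked as: pull_waf_logs.sh run
if [ "${1:-}" = run ]; then
    curl -s -H "Authorization: Bearer ${API_TOKEN}" "$(logpull_url)" >> "$OUT"
fi
```

Each run appends the fetched window to the output file; shifting the window 5 minutes into the past keeps every run inside Logpull's allowed time range.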

Related

Running Cron Job on cPanel for Node.js Project

I have an API built with Express JS and deployed on cPanel. The API has a script, let's say the endpoint looks like this:
/api/v1/cron
When the URL is hit, an SQL query runs and some data is inserted into the database; that part works fine.
What I want is to automate the process through Cron job. The URL should be hit once every hour so the query will execute and push data to the database.
I have tried the basic settings with a command like this on cPanel, but it didn't work:
/usr/local/bin/php -q /home2/{domain}/api/v1/cron
Please note: The API is in subdomain, like: node-api.google.com
I have also tried the node-cron package, but couldn't figure out how to run a script with it.
Either solution will work for me greatly.
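Since the endpoint is an Express route rather than a PHP file on disk, pointing the PHP binary at a path won't work; the usual cPanel approach is to have cron request the URL itself. A sketch, using the endpoint from the question:

```
# crontab entry: request the endpoint at minute 0 of every hour.
# node-api.google.com/api/v1/cron is the questioner's endpoint; adjust to yours.
0 * * * * /usr/bin/curl -s https://node-api.google.com/api/v1/cron > /dev/null 2>&1
```

The output is discarded because only the side effect (the SQL insert) matters; drop the redirection while debugging so cron mails you the response.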

How can I automate logging into Linux server and running git fetch/pull on a website?

Here's what I want to do:
I have a hosted website on a Linux server.
This site is pointed to a GitHub repository.
I want to be able to push changes to the repository, then be able to log into my website and click a button to have the site pull the new code in order to update itself.
Here's how I do it manually:
I created a file on the Linux server called update_site
I log into my Linux server via ssh and type ./update_site, which goes to the site's directory and executes a fetch and pull
the first time it asked me to enter my login and password which I did
but since I had set git config --global credential.helper store, it never asks me again
Here's how I want to automate it:
I created a PHP file which executes the file update_site
However, since the PHP website is apparently executing code as another user, it doesn't have my credentials for GitHub
Here's my question:
How can I automate the process so that when the website executes the update_site file, my GitHub login and password are supplied as well? And, needless to say, how can I do this as securely as possible, i.e. without saving my GitHub password somewhere in plain text?
One possible way to do this automation is to use cron. Edit your crontab (with the crontab -e command) and add a line like this:
*/5 * * * * /path/to/update_site
In the line above, */5 means the script runs every 5 minutes.
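For completeness, the update_site script itself can be sketched like this; the checkout path is passed as an argument (e.g. /var/www/mysite, a placeholder) so the script doesn't depend on cron's working directory. Run the cron job as the same user whose stored git credentials already work, or better, switch the remote to SSH with a deploy key so no password is stored at all:

```shell
#!/bin/sh
# update_site -- minimal sketch of the script the question describes.
# Pass the site's checkout directory as the first argument.
update_site() {
    cd "$1" || return 1
    git fetch --all --quiet
    git pull --ff-only --quiet
}

# When executed directly (not sourced), update the given directory.
if [ -n "${1:-}" ]; then
    update_site "$1"
fi
```

A matching crontab entry would be: */5 * * * * /path/to/update_site /var/www/mysite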

Python script executed from cron doesn't send mail

I have a Python script that takes photos with the Raspberry Pi camera and sends an email with those photos. From the command line everything works: the script does its job and finishes without errors.
I set up cron to execute the script every hour.
Every hour I get the mail, but without attachments.
Where or how can I check why the script, when executed from cron, doesn't send the attachments?
When running under cron, check whether your code uses any relative paths; either change them to absolute paths or change the working directory inside the script. Only then will the cron job find the attachment, or anything else the script needs to include.
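The fix above can be sketched as a crontab entry that changes into the script's directory before running it, so relative paths resolve the same way they do in an interactive shell (the path /home/pi/camera and the script name send_photos.py are hypothetical examples):

```
# Change into the script's directory first so relative paths
# (e.g. the saved photos) resolve correctly under cron.
0 * * * * cd /home/pi/camera && /usr/bin/python3 send_photos.py
```

Alternatively, call os.chdir(os.path.dirname(os.path.abspath(__file__))) at the top of the script itself.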

Create cron job to run with MySQL

I've got a query that will delete a WordPress post from my database if it contains a certain text:
DELETE FROM wp_posts WHERE post_excerpt LIKE "%neuroscience%"
I want to get this to run every hour, but I don't know if I should initiate this within the MySQL platform, or via cPanel. I would prefer the latter so all my cron jobs would be in one place. But the truth is I don't know how to code either!
Why not just CREATE TRIGGER in MySQL itself?
It will delete such posts whenever they occur, and with a BEFORE INSERT trigger the row never even makes it into the database.
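If you prefer keeping everything in cPanel's cron, a sketch: run the query hourly with the mysql client, reading credentials from ~/.my.cnf so the password never appears in the crontab. The database name wp_db is a placeholder:

```
# Hourly cron job running the DELETE; credentials come from ~/.my.cnf
# (a [client] section with user= and password= lines).
0 * * * * /usr/bin/mysql wp_db -e 'DELETE FROM wp_posts WHERE post_excerpt LIKE "%neuroscience%"'
```

Note that deleting rows directly bypasses WordPress hooks, so related metadata (wp_postmeta rows) may be left orphaned.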

Create a bot that just visits my website

I have a WordPress website that automatically gets some information from an RSS feed, posts it, and then, with the help of a built-in WordPress function, sets a custom field for that post with a name and a value. The problem is that this custom field only gets set when someone visits the published post. So I have to visit every single new post for the custom field to be applied, or wait for a visitor to do so.
I'd like to create a bot, web crawler, or spider that just visits all my new pages once an hour or so, so the custom field gets applied automatically when a post is published.
Is there any way of doing this with PHP or another web-based language? I'm on a Mac, so I don't think Visual Basic is a solution, but I could try installing it.
You could, for instance, write a shell script that invokes wget (or, if you don't have it, curl) and have it scheduled to run every hour, e.g. using cron.
It can be as simple as the following script:
#!/bin/sh
# Fetch the page; the output is discarded -- we only need the request to reach the site.
curl -s mysite.com > /dev/null
Assuming it's called visitor.sh and is set to be executable, you can then edit your crontab by typing crontab -e to schedule it. You will essentially need to add this line to your crontab:
0 * * * * /path/to/.../visitor.sh
(It means: run the script located at /path/to/.../visitor.sh at the top of every hour.)
Note that the script would run from your computer, so it will only run when the computer is running.
cron is a good approach; you can also use curl or lynx to fetch the pages. Both are pretty lightweight.
