Running a Cron Job on cPanel for a Node.js Project

I have an API built with Express.js and deployed on cPanel. The API has a script; let's say the endpoint looks like this:
/api/v1/cron
When the URL is hit, an SQL query runs and some data is inserted into the database; that part works fine.
What I want is to automate the process with a cron job: the URL should be hit once every hour so the query executes and pushes data to the database.
I have tried the basic settings in cPanel with a command like this, but it didn't work:
/usr/local/bin/php -q /home2/{domain}/api/v1/cron
Please note: the API is on a subdomain, like node-api.google.com.
I have also tried the node-cron package, but couldn't find a way to run the script with it.
Either solution would work for me.
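For what it's worth, one common approach on cPanel is to have the cron command request the URL itself rather than invoke PHP (the endpoint is served by the Node process, not by a PHP file), for example (URL is illustrative):
curl -s https://node-api.example.com/api/v1/cron
Alternatively, node-cron can schedule the same work inside the Express app itself. A minimal sketch, assuming the insert logic can be pulled out into a function; runCronQuery and ./cron-task are hypothetical names:
const cron = require("node-cron");
const { runCronQuery } = require("./cron-task"); // hypothetical module holding the SQL insert logic

// Run at minute 0 of every hour, in the same process as the Express app.
cron.schedule("0 * * * *", async () => {
  try {
    await runCronQuery(); // same code the /api/v1/cron route handler calls
    console.log("Hourly cron query finished");
  } catch (err) {
    console.error("Hourly cron query failed:", err);
  }
});
If the Node process is not guaranteed to stay running on the shared host, the curl-based cron entry is the safer option.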

Related

How to execute shell scripts on the backend of a Node.js web app?

I have a full-stack MERN application running on a Google Compute Engine instance, along with a local MongoDB running on the same server. It's a CRUD application; however, I have a script stored on the server that I would like triggered every time a user presses a button on the front end. (A use case would be: a user enters some input that is logged into the database, and once it's logged I would like a script triggered on the backend that creates a JSON file out of the MongoDB collection and uploads it to GitHub / emails it out.)
I'm not sure where to start learning this; a few Google searches have led me to AJAX and child_process. Am I going in the right direction? Any resources or pointers would be great. Thank you.
If I understood the question correctly, then you want to accomplish the following things:
1. Export JSON data from a local MongoDB instance.
2. Then send that data to GitHub or email it somewhere.
In that case I would recommend using a child process (exec, spawn, execFile, or fork) to run the mongoexport command and get the .json files.
But I don't recommend using a shell script to upload that data to GitHub or email it.
Use the GitHub API for GitHub, and use Nodemailer to email the data.
To learn more about child processes, read the Node.js child_process docs (Node.js v14.x).
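A minimal sketch of that approach, assuming mongoexport is installed on the server; the database name, collection name, and output path below are placeholders:
const { execFile } = require("child_process");

// Export a collection to a JSON file with mongoexport, then hand the file path to a callback.
function exportCollection(callback) {
  const outFile = "/tmp/entries.json"; // placeholder output path
  execFile(
    "mongoexport",
    ["--db", "mydb", "--collection", "entries", "--jsonArray", "--out", outFile],
    (err, stdout, stderr) => {
      if (err) return callback(err);
      callback(null, outFile); // upload via the GitHub API or attach with Nodemailer from here
    }
  );
}
This would be triggered from the Express route handler behind the front-end button, and the resulting file can then be pushed through the GitHub API or attached with Nodemailer as suggested above.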

Node.js API architecture: cron job

I'm building a REST API with Node.js, with MongoDB as the database.
This API only has GET routes to retrieve data. The data is aggregated from other sources via a cron job.
I'm not sure how to implement this correctly or what the best practices are.
Do I need to create a POST/PUT route to put data in the database?
Or do I just put data directly into the database?
Edit (more information):
Only the cron jobs would use the POST route.
The cron jobs get data from other REST APIs and some web scraping.
Is it a good idea to have my cron jobs in the same application as the API, or should I make another application to manage my cron jobs and populate my database?
I suggest creating an API endpoint that can be called with an access key for updating the data, because you would not want to write your MongoDB username and password in a shell file.
But if the cron job is a Kubernetes CronJob, or just a program that can access the database securely and is hosted on the same network, then you can go ahead and write to the database directly from the cron job.
If you'd like to control the data flow entirely through the API at this point, creating a POST route would be the way to go. I am not sure how your GET routes are secured; if they aren't at all, consider implementing at least some sort of security for routes that modify your data (OAuth2 or similar).
If you can access the database directly and that's acceptable, you can just insert/update the data directly.
The second option would probably be quicker, but the first one leaves more room for expansion in the future and could be more useful overall.
So in the end, both options are valid, it's up to preference and your use case.
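If you go with a protected POST route, a minimal Express sketch of the access-key idea mentioned above might look like this. The header name, route path, and CRON_ACCESS_KEY environment variable are placeholders, and a real deployment might use something stronger such as OAuth2:
const express = require("express");
const app = express();
app.use(express.json());

// Simple shared-secret check for routes that modify data.
function requireAccessKey(req, res, next) {
  if (req.get("x-access-key") !== process.env.CRON_ACCESS_KEY) {
    return res.status(401).json({ error: "invalid access key" });
  }
  next();
}

// Only the cron job calls this route to push aggregated data.
app.post("/internal/aggregates", requireAccessKey, async (req, res) => {
  // insert req.body into MongoDB here
  res.status(201).json({ ok: true });
});

app.listen(3000);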

Executing PHP scripts without opening the browser

I want to execute a PHP file located on my Apache server (localhost or remote) from Processing, but without opening the browser. How can I do this? The PHP script just adds data to a MySQL database using values obtained from a GET request. Is this possible? I tried using link("/receiver.php?a=1&b=2"), but it opens a web page containing the PHP output.
Ideally such scripts should be generic so they can be used as a utility from both the web and shell scripts. If you cannot change/modify the script, then I would suggest using curl from a terminal or shell script to make the HTTP request to the given link.
Refer to this solution for how to make a request with curl.
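For example, from a terminal or a cron entry, something like this hits the script with no browser involved (URL and parameters are just illustrative):
curl -s "http://localhost/receiver.php?a=1&b=2"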

Create cron job to run with MySQL

I've got a query that will delete a WordPress post from my database if it contains a certain text:
DELETE FROM wp_posts WHERE post_excerpt LIKE "%neuroscience%"
I want to get this to run every hour, but I don't know if I should initiate this within MySQL itself or via cPanel. I would prefer the latter so all my cron jobs would be in one place. But the truth is I don't know how to code either!
Why not just use CREATE TRIGGER in MySQL itself?
It will catch the posts whenever they occur, and with a BEFORE INSERT trigger the offending row won't even get into the database.
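A sketch of what such a trigger could look like. Note that a MySQL trigger cannot silently discard a row; the usual pattern is to reject the INSERT with SIGNAL so the row never lands in wp_posts (trigger name and error message here are just placeholders):
DELIMITER $$
CREATE TRIGGER block_neuroscience_posts
BEFORE INSERT ON wp_posts
FOR EACH ROW
BEGIN
  -- Reject any new post whose excerpt contains the blocked term.
  IF NEW.post_excerpt LIKE '%neuroscience%' THEN
    SIGNAL SQLSTATE '45000' SET MESSAGE_TEXT = 'post contains a blocked term';
  END IF;
END$$
DELIMITER ;
If you do want the hourly-cleanup behaviour from the question instead, the same DELETE can be run from a cPanel cron entry with the mysql client, or via MySQL's Event Scheduler.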

Standard way of setting up a webserver deploy using webhooks

I am working on code for a webserver.
I am trying to use webhooks to do the following tasks after each push to the repository:
1. Update the code on the webserver.
2. Restart the server so my changes take effect.
I know how to make the revision control run the webhook.
Regardless of the specifics of which revision control system I am using, I would like to know the standard way to create a listener for the POST call from the webhook on Linux.
I am not completely clueless: I know how to make an HTTP server in Python and have it run the appropriate bash commands, but that seems cumbersome. Is there a more straightforward way?
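For reference, the listener described above can be quite small. A sketch in Node.js rather than Python, to match the rest of this digest; the repository path and the restart command are assumptions for your own setup:
const http = require("http");
const { execFile } = require("child_process");

// Minimal webhook listener: on any POST, pull the latest code and restart the app.
const server = http.createServer((req, res) => {
  if (req.method !== "POST") {
    res.statusCode = 405;
    return res.end();
  }
  execFile("git", ["-C", "/var/www/mysite", "pull"], (pullErr) => {
    if (pullErr) {
      res.statusCode = 500;
      return res.end("pull failed");
    }
    // e.g. systemctl restart, pm2 restart, etc.
    execFile("systemctl", ["restart", "mysite"], (restartErr) => {
      res.statusCode = restartErr ? 500 : 200;
      res.end(restartErr ? "restart failed" : "deployed");
    });
  });
});

server.listen(9000);
In practice you would also verify the webhook's secret or signature before acting on the request.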
Set up a script to receive the POST request (a PHP script would be enough).
Save the request into the database and mark it as "not yet finished".
Run a cron job that checks the database for "not yet finished" tasks, and do whatever you want with the information you saved.
This is definitely not the best solution, but it works.
You could use IronWorker, http://www.iron.io, to SSH in and perform your tasks on every commit. To kick off the IronWorker task you can use its webhook support. Here's a blog post that shows how to use IronWorker's webhook functionality; the post already has half of what you want (it starts a task based on a GitHub commit): http://blog.iron.io/2012/04/one-webhook-to-rule-them-all-one-url.html
