OpenCart google_sitemap URL - cron

I'm using OpenCart on my website.
I want to run a cron job for these URLs:
http://www.mywebsite.com/index.php?route=feed/google_sitemap
http://www.mywebsite.com/index.php?route=feed/google_sitemap/mainIndexXml
How do I specify these URLs in the cron manager?

The simplest way would be to use wget, depending on your privileges. That said, I don't understand exactly why you would be running these; I'm guessing you want the caches for them updated periodically?
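If wget is available to your cron jobs, the crontab entries could look something like the lines below. This is just a sketch: the run times are arbitrary, and the quotes are there so the shell doesn't interpret the ? in the query strings.
0 2 * * * wget -qO /dev/null "http://www.mywebsite.com/index.php?route=feed/google_sitemap"
5 2 * * * wget -qO /dev/null "http://www.mywebsite.com/index.php?route=feed/google_sitemap/mainIndexXml"
The -qO /dev/null part discards the fetched XML instead of saving it; drop it while testing if you want to see the output.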

cPanel is not giving me an option to change the root directory for one of my domains

I need to change the root directory for one of my domains, but in my cPanel I can't find any option for it, just core system settings, and I have only very basic knowledge of systems and servers.
How can I change that directory as easily as possible? I need to change it because I'm going to install Laravel, and I want to point the document root from public_html to Laravel's public folder.
I was looking for the file that holds the Apache config, but it says something like "the current config doesn't need to be changed or updated, because it can be overridden", so I thought cPanel might give me an option for this.
By the way, I have a VPS, not shared hosting, running CentOS 7.9.
Thanks and good night ^^
In cPanel, you can't change the main domain's directory/document root. If you want to change the document root, change the main domain to another/random domain, then add the domain whose root directory you want to change as an addon domain.
Overriding the Apache config directly is not recommended; it may break your system. WHM/cPanel exists so you can manage domains without system admin knowledge.
Have you tried changing this from the console on CentOS?
It may be better to use the console and edit the file that contains the document root.
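If you do try the console route on a cPanel/WHM box, the document root is normally recorded in the account's userdata file rather than directly in httpd.conf. A rough sketch, where username and example.com are placeholders and the exact paths may vary with your cPanel version:
grep -n documentroot /var/cpanel/userdata/username/example.com
# edit the documentroot: line to point at Laravel's public folder, then rebuild and restart Apache:
/scripts/rebuildhttpdconf
/scripts/restartsrv_httpd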

How to back up a Joomla website when developing on MAMP?

I am a new web developer. I have been using Joomla with a 3rd-party template, developing it on a local MAMP server. The template is somewhat unstable and breaks easily, so I would like to back up my work on a daily basis.
I'm assuming all the files and the database need to be backed up? Is there a best practice for this?
Thanks very much!
Using Akeeba Backup (https://www.akeebabackup.com/products/akeeba-backup.html) is indeed a good idea. You can schedule a command-line task to run every day and create a backup of the site. To restore such a backup you can use Akeeba Kickstart (https://www.akeebabackup.com/products/akeeba-kickstart.html). Very easy, very comfortable.
But this only works as long as you haven't broken your Joomla installation, which your question implies can happen. To do a manual backup you can simply zip the folder containing your Joomla installation and create a database dump; you can do both every day with a command-line script.
Creating the dump: https://dev.mysql.com/doc/refman/5.7/en/mysqldump-sql-format.html
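A daily backup script along those lines might look like the sketch below. The site path, backup directory, and database name are placeholders, and it assumes MAMP's bundled mysqldump with the default root/root credentials, so adjust everything to your setup.
#!/bin/sh
# Dump the database, then zip the Joomla files, stamped with today's date.
STAMP=$(date +%Y-%m-%d)
BACKUP_DIR="$HOME/joomla-backups"
SITE_DIR="/Applications/MAMP/htdocs/mysite"
mkdir -p "$BACKUP_DIR"
/Applications/MAMP/Library/bin/mysqldump -u root -proot joomla_db > "$BACKUP_DIR/db-$STAMP.sql"
zip -qr "$BACKUP_DIR/files-$STAMP.zip" "$SITE_DIR"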
Using Git might work as well. Instead of zipping your folder, you simply commit it to your Git repository; a sketch follows after the links below. Don't forget to add the database dump to your repo as well.
https://techjoomla.com/developers-blogs/joomla-development/deploying-joomla-projects-using-git.html
http://joomlaablog.blogspot.de/2010/11/how-to-track-your-joomla-project-with.html
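A daily Git snapshot could be as simple as the following, using the same placeholder paths and database name as the script above:
cd /Applications/MAMP/htdocs/mysite
git init    # only needed the first time
/Applications/MAMP/Library/bin/mysqldump -u root -proot joomla_db > db-dump.sql
git add -A
git commit -m "daily snapshot $(date +%Y-%m-%d)"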
The best and easiest method is to do it through Akeeba Backup: https://www.akeebabackup.com/download.html. Install this component in the backend, run it, and take a backup whenever you want. It backs up both the files and the database. You can even download the archive and extract it to run the site on another web server; to extract it you can use their Akeeba eXtract tool. This is all free.

Automating the download of Red Hat for kickstart

I'm working on a project to automate kickstart images, but I'm stuck on my first subtask.
The download links for Red Hat downloads look something like the one below:
https://access.cdn.redhat.com//content/origin/files/sha256/12/mkwosis89j9f8ef53ad7365f2997d42d4f83ccuwodjsl/rhel-server-7.3-x86_64-dvd.iso?auth=148102836_3974432975fa9f10e716c4a38928db
This is a problem because I can't know what the SHA and the auth code are going to be beforehand, so I can't just construct this URL in bash; I need a way of going to the latest downloads page and following the link.
Does anybody know what I can use to achieve this?
Thanks.
It seems like you will waste a lot of bandwidth on each installation. Have you considered creating a local repository of ISOs? You would only need to add the latest ISO when it is released. Check out this link on creating ISO repositories:
https://www.cyberciti.biz/tips/redhat-centos-fedora-linux-setup-repo.html
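To illustrate, turning a downloaded DVD ISO into a local yum repository can be as simple as the following; the ISO path and mount point are placeholders, and it relies on the fact that RHEL DVD ISOs already ship with repodata, so no createrepo step is needed:
mkdir -p /mnt/rhel-iso
mount -o loop /path/to/rhel-server-7.3-x86_64-dvd.iso /mnt/rhel-iso
cat > /etc/yum.repos.d/rhel-local.repo <<'EOF'
[rhel-local]
name=RHEL local ISO repo
baseurl=file:///mnt/rhel-iso
enabled=1
gpgcheck=0
EOF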

Where to store Cron Jobs, and will they always run?

I have been thinking about using cron jobs recently. My site has css, js and images folders in its setup, which isn't very relevant, but might be needed.
I know how to write a cron job, but am unsure where to put it in my files so that it always runs every day.
So where should I put the cron job file, should I create a new folder for it, and what should the file extension be?
Log in to your system via SSH, and then enter:
crontab -e
If this is your first time editing, it may ask which editor you would like to use.
Then start editing:
*/1 * * * * /var/www/mysite/public/cron/script.php
This will run script.php every minute (provided the script is executable and starts with a shebang line; otherwise, prefix the path with your PHP binary, e.g. /usr/bin/php).
The cron I have installed on my Mythbuntu system keeps its daily cron scripts in /etc/cron.daily/.
File extensions don't matter on *nix. The file just needs to have executable permissions (and should have a shebang line at the top to state what program it should be run with).
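For example, making the script from the earlier answer directly runnable by cron takes just those two things, the shebang and the executable bit:
# the first line of script.php should be a shebang such as:  #!/usr/bin/php
chmod +x /var/www/mysite/public/cron/script.php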
Anywhere, but I recommend outside the web root. Use a file extension that matches the file type.
It does not matter where you put it, as long as you reference all included files by absolute paths to avoid confusion. I've run into situations where
include '../../start.php';
had issues when run with the php command (/usr/bin/php, I think it was), probably because relative paths resolve against the directory the command is run from, not the script's own folder. So when including files I would use $_SERVER['DOCUMENT_ROOT'] as a reference point.
Alternatively, you can always use the wget command to run it as if you were requesting it from your own browser. Here's what I use:
wget http://www.mydomain.ca/cron/cron_whatever.php
The timing can be set using the cPanel cron option, or you can write the schedule out yourself.
And always have email notifications turned on, so the output gets mailed to you and you can see if there are any issues.
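Put together as a crontab entry, that might look like the lines below; the address and URL are placeholders, and -qO - makes wget print the page body to stdout so cron's MAILTO mails it to you:
MAILTO=you@mydomain.ca
0 6 * * * wget -qO - http://www.mydomain.ca/cron/cron_whatever.php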
You don't edit the cron file directly; rather, you run crontab -e, which lets you edit it and then saves it into a system area.
You can use SSH as described by the others, but some hosting providers use cPanel, which lets you create those cron jobs easily via an easy-to-use web-based interface, where you can also easily set the correct run times.

Where to put SVN repository directory in Linux?

I am setting up a new SVN server on Ubuntu Linux. Where is a good place (best practice) to put the repositories? Should I create a new user? The server will be accessed via http://, so there's no need to create user accounts etc. (as was the case for svn://).
Many thanks in advance!
I like putting things under /srv, as it seems to match the definition in the FHS.
The new location for service data according to the FHS is /srv, so under there would probably be best.
I've always used /var/svn or /var/lib/svn. While it doesn't quite line up with the FHS, it matches more closely what other apps actually do (on RHEL5, Apache uses /var/www; PostgreSQL uses /var/lib/pgsql). As suggested, /srv/svn looks like another good spot, and you get to say "Look, I'm following the standard!"
Using either /usr/svn or /usr/local/svn would probably be considered bad form, and all your Linux friends will laugh at you behind your back :-)
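Whichever directory you pick, creating the repository is the same few commands. A sketch, assuming /srv/svn and Apache running as www-data (the Ubuntu default), with the repository name as a placeholder:
mkdir -p /srv/svn
svnadmin create /srv/svn/myproject
chown -R www-data:www-data /srv/svn/myproject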
I host my SVN via the Apache module, so I usually put it under my Apache user, at the same level as my htdocs (not under htdocs, just the same level), and set up separate authentication just for SVN users.
If you have a lot of projects, dedicate another volume to SVN, since it will grow.
I guess I'm kind of old school, but I like to put things (Apache, Tomcat, etc.) in /usr/local. So I will usually create repositories in /usr/local/svn and have the Apache module reference that path in httpd.conf.
/home/username/Dropbox
This way you back up the SVN repository and can access it on a Windows machine as well.
