So I've got Cloudflare and Prestashop running together to improve load times; however, a side effect of the CDN is that my cronjobs can only run for a maximum of 90 seconds. Any longer and Cloudflare sends out a 524 error and the cronjob is not properly launched.
According to Cloudflare there are two ways to get around this problem. I can either reduce the size of the cronjob process so it fits within the 90 second window (which is not an option), or I can run the cronjobs on a separate subdomain that Cloudflare has no effect on. The problem with the second option, however, is that Prestashop has a built-in redirect so that regardless of which subdomain you use to visit the site, it sends you to the main domain.
Does anyone have experience with this kind of issue, and if so, what are the best ways of getting around the problem? Thanks!
All hosting has this issue to a greater or lesser degree. The problem is that a web server can't give unlimited processing time to a single thread, because at some point it could consume all of the server's resources.
The first thing is to make sure your cron script is able to continue where the last execution left off. You can do that by periodically saving how far the cron has progressed, so that when the process stops and starts again it can resume from that point. For example, if you are processing products you can save the last processed product ID (in your DB or a file); when your script runs again it continues from that ID.
The second point is that you must execute the cron several times to make sure the whole process finishes. You can work out how many runs you need from how long the complete script takes. For example, a 30-minute job split into 90-second chunks needs about 20 executions, so schedule the cron every 100 seconds or so to leave a small buffer.
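As an illustration, here is a minimal sketch of that pattern in plain PHP; the database credentials, the ps_product table and the checkpoint file are placeholders for this example, not something Prestashop gives you out of the box:
<?php
// cron-products.php - a resumable batch job (illustrative sketch only).
$checkpointFile = __DIR__ . '/last_product_id.txt';
$lastId = is_file($checkpointFile) ? (int) file_get_contents($checkpointFile) : 0;

$maxRunTime = 80;       // stay safely under the 90-second limit mentioned above
$start      = time();
$batchSize  = 500;

$db   = new PDO('mysql:host=localhost;dbname=shop', 'user', 'password');
$stmt = $db->prepare('SELECT id_product FROM ps_product WHERE id_product > ? ORDER BY id_product LIMIT ' . $batchSize);
$stmt->execute([$lastId]);
$batch = $stmt->fetchAll(PDO::FETCH_COLUMN);

foreach ($batch as $idProduct) {
    // ... do the real work for this product here ...

    // Save progress so the next run can resume from this point.
    file_put_contents($checkpointFile, $idProduct);

    if (time() - $start > $maxRunTime) {
        exit; // out of time; the next scheduled run continues from the checkpoint
    }
}

if (count($batch) < $batchSize) {
    // Fewer rows than the batch size means we reached the end: start over next time.
    file_put_contents($checkpointFile, 0);
}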
To execute the cron you have several options:
- Your hosting admin panel's scheduler.
- The operating system's scheduler (e.g. crontab).
- Prestashop's free cronjobs module.
- A third-party cron service.
I hope this can help you.
Good luck.
The domain redirection is not done in the admin context. Each controller in the FrontOffice checks the current URL and redirects to the canonical one if needed (for example, if the domain is not the default one), but this mechanism doesn't exist in the BackOffice.
So if you have, say, a main domain www.example.com and a subdomain cron.example.com, accessing cron.example.com/ will redirect you to the default domain because you are hitting the FrontOffice. But if you access the cronjobs module in the BackOffice it works without redirection, e.g. http://cron.example.com/admin-1f5zef1/index.php?controller=AdminCronJobs&token=7498b7d228cc3e630ee2fe6b34bd1638.
Tested and working on my website.
So after a fair bit of time I managed to resolve this issue. I had to modify a few of Prestashop's core files to add an exception for my particular subdomain. I needed to amend an if statement on line 370 of shop.php (classes/shop/Shop.php) so that it looks like this (make sure you replace "exemption.myshop.com" with your own subdomain):
if ((!$id_shop && defined('_PS_ADMIN_DIR_')) || Tools::isPHPCLI() || in_array($http_host, $all_media) || $http_host == 'exemption.myshop.com') {
I then had to modify two functions in frontController.php (classes/controllers/frontController.php). I needed to add the following piece of code at the very top of both the sslRedirection and canonicalRedirection functions, above everything else inside each of them.
if (Tools::getHttpHost() == 'exemption.myshop.com'){
return;
}
Then I deleted class_index.php from the cache folder in the main directory and the changes took effect. You can test whether it has worked by visiting the subdomain; it should load the page without changing the URL.
Related
We are trying to use StormCrawler to grab the index page of every site whose domain we know - politely, and skipping any where robots.txt tells us not to. We have a database of domains - around 250m of them - and we are using that as a start. The idea is that we will crawl these once a week.
We have had a number of warnings from our server provider.
Currently our crawls attempt to go to a domain name - e.g. abc123.com - and when the domain does not resolve, this gets 'flagged'. Obviously there are MANY domains that don't resolve or that point to the same IP address, and we think that trying to access a large number of these non-working domains is what causes our provider to send us an alert.
Our plan is that after the first crawl we will identify the domains that do not work and only crawl those on a monthly basis to see if they have come live. Apologies for being a bit naive; any help/guidance would be appreciated.
The alerts from your server provider are probably triggered during the DNS resolution. What DNS servers are used on your machines? They are probably the ones from your provider, have you tried using different ones e.g. OpenDNS or Google's? They might even be faster than the ones you are currently using. I'd also recommend using a DNS cache on your servers.
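On the plan above of identifying the domains that do not work: this is not StormCrawler configuration, but as a rough idea, a pre-filtering pass over the domain list could be sketched like this in PHP (the file names are examples, and a real run over 250m domains would need batching and parallelism):
<?php
// prefilter-domains.php - split a domain list into resolving / non-resolving (sketch only).
$in   = fopen('domains.txt', 'r');        // example input: one domain per line
$live = fopen('domains_live.txt', 'w');
$dead = fopen('domains_dead.txt', 'w');

while (($domain = fgets($in)) !== false) {
    $domain = trim($domain);
    if ($domain === '') {
        continue;
    }
    // checkdnsrr() performs a DNS lookup; 'A' checks for an address record.
    $target = checkdnsrr($domain, 'A') ? $live : $dead;
    fwrite($target, $domain . "\n");
}

fclose($in);
fclose($live);
fclose($dead);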
Okay I searched and I can't find the solution.
I have a folder on my website where I keep some scripts I use for something. My server has to access this folder all the time via a cron job. But if someone goes to the folder's URL they can see the index page, and I would like to prevent that. I want it to throw a 403 or a 404 or a 500 or any error page; I don't care which.
I tried this in htaccess:
deny from all
But this also blocks my own server and breaks the Cron job. I tried a few other solutions I found here and there, but they didn't work.
One of the things I saw is that you can block everyone but allow your server access by IP. But I'm on HostGator shared hosting, so my IP isn't static (as far as I know). I don't want to have to worry that at some point my server's IP may change and break my cron setup.
Isn't there a nice, simple, elegant and permanent solution for this? Block access to the folder for everyone, but allow my own server/cron to access it at will.
Thank you :)
It seems you want to call a script that is stored on the same server.
You don't need to use curl for that; you can just run the script directly by giving its path to your PHP installation.
You can prevent outside access to the script simply by not saving it in your public_html directory.
Your cronjob command will look something like this:
/opt/php70/bin/php /home/<USERNAME>/cron.php
The exact directory structure differs depending on your webhost.
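If it helps, here is a minimal sketch of what that cron.php entry point might look like; the private_scripts folder and do_work.php are hypothetical names for scripts kept outside public_html:
<?php
// /home/<USERNAME>/cron.php - entry point run by cron, not reachable over HTTP.
// Refuse to run if it is somehow requested through the web server anyway.
if (PHP_SAPI !== 'cli') {
    http_response_code(403);
    exit('Forbidden');
}

// The actual worker scripts also live outside public_html (hypothetical path).
require __DIR__ . '/private_scripts/do_work.php';
Because nothing outside public_html is served by the web server, no .htaccess rules are needed for these files at all.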
I have 2 applications (both working well) providing APIs; the old one is based on CodeIgniter 2.1 and the new one on the Yii ("Yes It Is") framework. I need to redirect some actions from the old API to the new one. The routing should also filter request methods such as GET, POST, PUT, DELETE, etc.
Folder structure looks like this:
ci
yii
router
At first I wanted to redirect all traffic to router/index.php where, depending on the URI, the appropriate app was loaded and started. It worked well with YII, but CI couldn't find its controllers/models/actions.
The second idea was to use .htaccess, but I couldn't make CI or YII work that way either. They start, but neither of them can find its controllers/models/actions. No errors are printed or logged in the Apache logs.
When those 2 apps are run "normally", everything works properly.
I've tried changing configuration paths (to absolute ones) and still nothing. I don't want to change those applications much; small fixes would be much better.
Also, there should be no way to reach an app without the URI being checked against the "routes".
In the end I stuck with the PHP router described in the question. It turned out that CodeIgniter couldn't find its methods because the predefined variable $_SERVER['SCRIPT_NAME'] contained the wrong path, since the script was started from another directory. I overrode it in the router file and everything seems to work properly now.
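For anyone facing the same thing, here is a minimal sketch of such a router/index.php; the /api/v2/ prefix, the relative paths to the two front controllers and the SCRIPT_NAME value are assumptions that would need adjusting to the real folder layout:
<?php
// router/index.php - hypothetical dispatcher between the two apps (sketch only).
$uri = $_SERVER['REQUEST_URI'];

// Request-method filtering could also be added here, e.g. on $_SERVER['REQUEST_METHOD'].

if (strpos($uri, '/api/v2/') === 0) {
    // New API: hand the request to the YII entry script.
    $dir = __DIR__ . '/../yii';
} else {
    // Old API: hand the request to the CodeIgniter entry script.
    $dir = __DIR__ . '/../ci';
}

// CodeIgniter derives its paths from SCRIPT_NAME, so point it at the front
// controller we are about to include rather than at router/index.php.
$_SERVER['SCRIPT_NAME'] = '/' . basename($dir) . '/index.php';

chdir($dir);
require $dir . '/index.php';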
We have just built a new webstore for Grayle (www.grayle.com) in Magento 1.8.1. The webshop consists of multiple stores (including one for another company).
After switching hosting our online websites finally worked. www.grayle.com -> redirects to www.grayle.nl (as it should). www.eviax.nl -> has its own website.
Now the problem:
At first, when we went to www.eviax.nl we ended up at www.eviax.nl; then, seemingly at random, someone else tried and got redirected to www.grayle.nl. About 10 minutes later we tried the same thing and got eviax.nl again.
What is the problem here? Is it a cache problem in our Magento set-up, or a problem on the host's server? Maybe there's something stuck at our provider?
Edit:
My colleague enabled cache cleaning in Magento. This stopped the unwanted redirects, but now we are not able to update/edit static blocks, and the catalogue view in the back-end has stopped working.
Edit:
This has been solved; the "cache cleaning" script was somehow blocking all edits.
It seems you have a redirection loop. Try using a cookie so the redirection is only applied once.
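A minimal sketch of that idea in plain PHP, just to illustrate it (the cookie name and the redirect target are examples; in a real Magento setup this would go wherever the redirect is currently being issued):
<?php
// Only redirect visitors who have not been redirected before (illustrative example).
if (!isset($_COOKIE['redirected_once'])) {
    // Remember for 30 days that this browser has already been redirected.
    setcookie('redirected_once', '1', time() + 30 * 24 * 3600, '/');
    header('Location: http://www.grayle.nl/', true, 302);
    exit;
}
// Otherwise fall through and serve the requested store normally.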
I need to know how to prevent repetitive file downloads using .htaccess, or, if not via .htaccess, then by some other method.
A site I maintain had over 9,000 hits on a single PDF file, accounting for over 80% of the site's total bandwidth usage, and I believe that most of the hits were from the same IP address. I've banned the IP, but that's obviously not an effective solution because there are always proxies and besides, I can only do that after the fact.
So what I want to do is cap the number of times a single IP can attempt to download a designated file or file type over a given period of time. Can this be done with .htaccess? If not, what other options do I have?
EDIT: The suggestion that I redirect requests to a server-side script that would track requests by IP via database sounds like a good option. Can anyone recommend an existing script or library?
If you have some server side code stream out your files, you have the opportunity to control what's being sent. I'm not aware of a .htaccess solution (which doesn't mean there isn't one).
My background is in Microsoft products, so I'd write a bit of ASP.NET code that would accept the filename as a parameter & stream back the result. Since it would be my code, I could easily access a database to track which IP I was serving, how often the file was sent, etc.
This could easily be done using any server side tech - PHP, etc.
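Since the thread mentions PHP, here is a rough sketch of that approach; the file whitelist, the daily limit, the SQLite tracking database and the files/ directory are all assumptions, and requests for the PDF would have to be rewritten to this script (e.g. from .htaccess) for it to take effect:
<?php
// download.php?file=brochure.pdf - hypothetical rate-limited download gate.
$allowed = ['brochure.pdf'];   // whitelist of downloadable files (example)
$limit   = 5;                  // max downloads per IP per day (example)
$file    = basename($_GET['file'] ?? '');

if (!in_array($file, $allowed, true)) {
    http_response_code(404);
    exit;
}

// Track downloads per IP and day in a small SQLite database.
$db = new PDO('sqlite:' . __DIR__ . '/downloads.sqlite');
$db->exec('CREATE TABLE IF NOT EXISTS hits (ip TEXT, file TEXT, day TEXT, count INTEGER, PRIMARY KEY (ip, file, day))');

$ip  = $_SERVER['REMOTE_ADDR'];
$day = date('Y-m-d');

$stmt = $db->prepare('SELECT count FROM hits WHERE ip = ? AND file = ? AND day = ?');
$stmt->execute([$ip, $file, $day]);
$count = (int) $stmt->fetchColumn();

if ($count >= $limit) {
    http_response_code(429);   // this IP has hit the daily limit
    exit('Download limit reached.');
}

if ($count === 0) {
    $db->prepare('INSERT INTO hits (ip, file, day, count) VALUES (?, ?, ?, 1)')->execute([$ip, $file, $day]);
} else {
    $db->prepare('UPDATE hits SET count = count + 1 WHERE ip = ? AND file = ? AND day = ?')->execute([$ip, $file, $day]);
}

// Stream the file back to the client.
header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="' . $file . '"');
readfile(__DIR__ . '/files/' . $file);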