I've adjusted the expiration headers on many of my website's files (with .htaccess). As you know, this can create a small problem: a user's browser won't show the most up-to-date version of the site until its cached copy of a file expires.
I began looking for a way to avoid this, and then I remembered using CloudFlare some time ago. CloudFlare makes it possible to adjust these same headers, but it also has a purge feature. What exactly does 'purge' do?
Purge tells CloudFlare to discard the copies of your files that it has cached. You can do this for the whole site (Purge Cache) or for individual files.
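If you would rather not click through the dashboard every time, the same purge can be triggered over CloudFlare's HTTP API. Below is a rough sketch against the current v4 API; the zone ID, API token, and file URL are placeholders you would substitute with your own values.

    # Purge a single cached file (ZONE_ID, API_TOKEN and the URL are placeholders)
    curl -X POST "https://api.cloudflare.com/client/v4/zones/ZONE_ID/purge_cache" \
         -H "Authorization: Bearer API_TOKEN" \
         -H "Content-Type: application/json" \
         --data '{"files":["https://example.com/css/style.css"]}'

    # Or purge everything CloudFlare has cached for the zone
    curl -X POST "https://api.cloudflare.com/client/v4/zones/ZONE_ID/purge_cache" \
         -H "Authorization: Bearer API_TOKEN" \
         -H "Content-Type: application/json" \
         --data '{"purge_everything":true}'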
I've used a boilerplate to create a website, and I've tested it on https://testmysite.thinkwithgoogle.com/, which said I have to leverage my browser cache. After doing some research, it seems I have to add a tweak to .htaccess. The issue is that this tweak already exists in the boilerplate's .htaccess, so I'm not really sure why the speed checker says I should be leveraging my browser cache.
Any insight into why it gives this error?
Update: I've also tested it on GTmetrix, which says I have to leverage browser caching for a bunch of image, JS, and CSS files. I've checked the .htaccess file and the expires rules for all of the files mentioned above are there. I still have no clue why it keeps giving that note.
After doing extensive research, I found that GitHub Pages doesn't allow any server configuration, so the .htaccess file is ignored and can't be used to leverage browser caching. I'm hoping GitHub will solve this at some point.
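For reference, the boilerplate's caching rules are a mod_expires block along these lines (the types and lifetimes here are illustrative, not the exact boilerplate values). They only take effect on an Apache host with mod_expires enabled, which is exactly why they are silently ignored on GitHub Pages.

    # Illustrative mod_expires rules, similar in spirit to the boilerplate defaults.
    # Only honoured by an Apache server that has mod_expires enabled.
    <IfModule mod_expires.c>
      ExpiresActive On
      ExpiresDefault                        "access plus 1 month"
      ExpiresByType text/html               "access plus 0 seconds"
      ExpiresByType text/css                "access plus 1 year"
      ExpiresByType application/javascript  "access plus 1 year"
      ExpiresByType image/jpeg              "access plus 1 month"
      ExpiresByType image/png               "access plus 1 month"
    </IfModule>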
So I have tested my page via Google's PageSpeed Insights,
and it is currently telling me to:
Leverage browser caching for the following cacheable resources:
http://maps.google.com/maps/api/js?sensor=false&language=en (30 minutes)
It's rather ironic, as it's a Google resource served from a Google server. But it's always good to know how to do these things. I've tried to read up on this via a link Google provided on the test page, but it didn't really give an example of how to cache this external resource. I've read as much as I can and added bits to my .htaccess file, but nothing seems to work.
So I guess my first question is: is it even possible to cache this resource via the .htaccess file?
And if so, what code would I need to put in there to get it to cache the resource?
Thank you in advance for any help.
You can't control the caching of resources served by a third party. .htaccess only controls caching for resources served from your own boxes.
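For the resources you do serve yourself, a short block in .htaccess is enough. A minimal sketch, assuming Apache with mod_headers enabled (the extensions and 30-day lifetime are illustrative):

    # Cache static assets served from this server for 30 days.
    # This has no effect on anything served from maps.google.com.
    <IfModule mod_headers.c>
      <FilesMatch "\.(js|css|png|jpe?g|gif|ico)$">
        Header set Cache-Control "max-age=2592000, public"
      </FilesMatch>
    </IfModule>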
I have set up a brand new Orchard CMS 1.5.1 site using the Web Platform Installer on Windows Server 2008. I wanted to test out the Performance settings, so I configured the following Warmup entries, one per line:
/
/blog
...and checked the following options:
[x] Generate warmup pages periodically, every 90 minutes
[x] Generate warmup pages any time some content is published
When I visit the site, performance is still a bit slow. The Performance Warmup settings show each page with a status of zero and a red "down arrow" icon next to it.
Is there anything else I need to enable? Is there anything I am missing in the configuration, like permissions, etc.?
UPDATE:
I have noticed that my site does not have a folder to store the warm-up pages. I added that folder manually, but it still didn't fix the problem. Are there permissions I would need to set on that folder?
UPDATE 2:
After talking with Sebastien Ros, I think I understand what is wrong but still don't know how to fix it. The base URL setting in Orchard is set to "www.mydomain.com", as it should be, but networking-wise my server does not allow the site to go out to the internet and query itself at that address in order to generate the warm-up pages. To make matters worse, I have several sites hosted on the same IP address, using host headers to distinguish between them. This prevents me from even being able to configure the base URL as a local IP address (which would cause issues with other modules anyway).
Not sure what alternatives I have now.
Thanks,
Brian
Make sure that the base URL on the general settings page points to your site's public URL, e.g. http://mywebsite.com.
It may be pointing to localhost by default.
I confirmed with a network engineer at my server host that there was a networking restriction preventing outgoing requests from coming back in to the web site. As a result, the performance module could not query www.mydomain.com and get an answer. Once the network restriction was removed, I was able to see the warm-ups create the cache pages with a status of 200.
Alternatively, it was suggested that I create entries in my hosts file for each of my Orchard sites. I did not try this, but I see no reason why it would not work, even with the host-header scenario that I have.
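For anyone in the same situation, the hosts-file workaround would look something like this on the web server itself (the second host name below is made up to illustrate the multi-site case):

    # %SystemRoot%\System32\drivers\etc\hosts on the Orchard server
    # Point each site's public host name back at the local machine so the
    # warmup module can request it without leaving the box. IIS still sees
    # the Host header, so host-header routing keeps working.
    127.0.0.1    www.mydomain.com
    127.0.0.1    www.myothersite.com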
Brian
Our website was recently hacked (Joomla 1.5, hosted on a VPS). The attacker added a few PHP scripts that redirected visitors to some ad sites. We have cleaned everything up (or at least we think we have), and now everything works as it should.
However, links on Google (or Yahoo) that point to our website still try to include these PHP scripts (and return 404, as they have been deleted). Direct links typed into the browser work as they should.
We cleaned the site 10 days ago, so I do not think anything is still cached on Google's servers; re-indexing should be done by now.
To reproduce this behavior:
Go to www.google.com
Type in "anitex socks"
Click any PHP link that starts with "anitexsocks.com"
You will get "The requested URL /wp-includes/client.php was not found on this server" plus a 404 error
Refresh the page and everything works without issues
Why are only the Google links causing trouble?
Any help is welcome. Thanks!
As for the reason why this is happening: I installed a Firefox add-on that blocks my browser's Referer header and then followed a Google link to your site, and it worked fine. Then I disabled the add-on and the problem started occurring again.
This shows that there is still some malicious code running on your website which checks incoming HTTP requests to see if they come from Google (by inspecting the HTTP Referer header) and redirects them to /wp-includes/client.php if they do.
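You can reproduce the same check from the command line without a browser. The sketch below uses curl's -e option to send a fake Referer header (the search URL is just an example):

    # No Referer header - should come back clean
    curl -I http://anitexsocks.com/

    # Same request pretending we arrived from a Google search result;
    # if the malicious check is still in place, this is the request that breaks
    curl -I -e "http://www.google.com/search?q=anitex+socks" http://anitexsocks.com/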
To determine where this code may lie, try performing a recursive grep through all of the web files on your server, as well as your web server configuration files. Somewhere in there, there must still be a reference to that client.php script; hopefully you can find and eliminate it.
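Something along these lines, assuming a typical Linux layout (adjust the paths to match your VPS):

    # Look for any remaining reference to the injected script in the web root
    # and in the web server configuration
    grep -rn "client.php" /var/www/ /etc/apache2/

    # Obfuscated payloads are common in Joomla compromises, so also scan for
    # the usual suspects
    grep -rln --include="*.php" -e "base64_decode" -e "eval(" /var/www/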
That said, if it were my site and I knew a hacker had had free rein over my server to do whatever they wanted, I would not mess around with trying to undo the damage; I would instead restore the most recent backup from before the site was hacked. You only have to miss one back door the hacker left in place and they can re-enter your site. After restoring backups, you should also upgrade or reconfigure the software they used to gain access in the first place so they can't simply re-hack it in the same manner.
We're building a mobile-friendly site to work in tandem with our client's MOSS 2007 internet site. We need to be able to redirect users who hit the home page and are using a mobile device.
Our original intention was to add a custom control to the home page's page layout that would detect the current user's device and redirect to the mobile site accordingly. We quickly realised that this would not work, as we are using the Output Caching functionality provided by SharePoint/ASP.NET. This means the detection code only runs for the first visitor to the home page until the cache expires.
Our next idea was to build a custom HTTP module and perform the detection there. However, we are finding that the output caching does not allow that either: if the cache is set while a mobile device is visiting, all browsers are subsequently redirected to the mobile site (until the cache expires).
If we turn off output caching it works just fine, but we cannot turn output caching off, especially for the home page. We did investigate Substitution (Donut) Caching, but this is not working because we are already filtering the ASP.NET response within another HTTP module that tidies up the rendered HTML for XHTML compatibility reasons. I've also experimented with the output cache profile by setting its vary-by-header property to "User-Agent", but I am getting mixed results and am also concerned about the memory implications of caching multiple versions of each page (we already have memory issues now and then).
It's possible we could run the redirection code in JavaScript but then we risk not detecting a lot of devices that don't have JavaScript enabled. This is a government website so the usage of JavaScript has to abide by accessibility guidelines.
Does anyone have any other ideas as to how we can solve this issue? Has anyone done this before, perhaps in a different way?
Hope you can help, thanks.
p.s. I have also asked this question on SharePoint.SE but wanted to get as many eyes on this as possible.
I would suggest you try ISAPI filters.
I think I've actually solved this one. I pretty much followed this article: http://msdn.microsoft.com/en-us/library/ms550239.aspx. We updated the code in that article to build a cache key based on whether the current page is the home page, whether the current user is on a mobile device, and whether or not a cookie exists that forces the user to the full site. I will probably write this up as a blog post; when I do, I will update this answer with a link.
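In the meantime, here is a minimal sketch of the ASP.NET side of that approach. It is not the code from the article, just an illustration under a few assumptions: the output cache profile's "Vary by Custom Parameter" is set to a hypothetical "MobileHomePage" value, and "fullsite" is a made-up cookie name used to let mobile users opt back into the full site.

    // Global.asax.cs - sketch of a GetVaryByCustomString override.
    using System;
    using System.Web;

    public class Global : HttpApplication
    {
        public override string GetVaryByCustomString(HttpContext context, string custom)
        {
            // "MobileHomePage" must match the Vary by Custom Parameter
            // configured on the output cache profile (assumed name).
            if (string.Equals(custom, "MobileHomePage", StringComparison.OrdinalIgnoreCase))
            {
                bool isHomePage = context.Request.Path == "/";
                bool isMobile = context.Request.Browser != null
                                && context.Request.Browser.IsMobileDevice;
                bool forceFullSite = context.Request.Cookies["fullsite"] != null;

                // Each distinct string gets its own cached copy of the page, so
                // mobile and desktop visitors no longer share the same entry.
                return string.Format("home={0};mobile={1};full={2}",
                                     isHomePage, isMobile, forceFullSite);
            }

            return base.GetVaryByCustomString(context, custom);
        }
    }

In a SharePoint web application the Global.asax normally derives from SPHttpApplication rather than HttpApplication, and the actual redirect still happens elsewhere; the override above only ensures the output cache keeps separate copies per device type.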