I've used a boilerplate to create a website, and when I tested it on https://testmysite.thinkwithgoogle.com/ it said I have to leverage browser caching. After doing some research, it says I have to add a tweak to .htaccess. The issue is that this tweak already exists in the boilerplate's .htaccess, so I'm not really sure why the speed checker says I should be leveraging browser caching.
Any insight into why it gives this error?
Update: I've tested it on GTmetrix, and it says I have to leverage browser caching for a bunch of img, js, and css files. I've checked the .htaccess file, and the expires rules for all of the file types mentioned above are there. Still no clue why it keeps giving that note.
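For reference, the expires tweak in the boilerplate's .htaccess looks roughly like this (a trimmed sketch; the actual file covers more MIME types, and the lifetimes may differ):
<IfModule mod_expires.c>
  ExpiresActive On
  # Fallback for anything not listed below
  ExpiresDefault "access plus 1 month"
  # Images
  ExpiresByType image/png "access plus 1 year"
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType image/gif "access plus 1 year"
  # CSS and JavaScript
  ExpiresByType text/css "access plus 1 year"
  ExpiresByType application/javascript "access plus 1 year"
</IfModule>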
After doing extensive research, I found that GitHub Pages doesn't allow server configuration, so using .htaccess won't leverage my browser cache. Hoping that GitHub will solve this issue.
Related
While trying to perform well in Google's PageSpeed Insights, we've hit a snag.
We enabled mod_pagespeed and that worked. It did what it's supposed to do.
However, when looking at the results on GTmetrix and PageSpeed Insights with mod_pagespeed enabled, mod_expires and browser caching don't work.
Is this by design? If so, is there anything I can configure globally or via .htaccess to make browser caching happen?
I have the same problem; the CSS links that are moved to the footer by PageSpeed's prioritize_critical_css are also causing a render-blocking error.
It happens even if I set PageSpeed to ONLY do above-the-fold CSS optimization:
<IfModule pagespeed_module>
  ModPagespeed On
  # Disable the default filter set so only the filters listed below run
  ModPagespeedRewriteLevel PassThrough
  # Inline above-the-fold CSS and lazy-load the rest
  ModPagespeedEnableFilters prioritize_critical_css
</IfModule>
When I set ModPagespeed Off, the problem disappears... For now I see the browser-caching warning mostly for PNG images (I have mod_expires set to "access 1 year", but when I try to use extend_cache in PageSpeed instead of mod_expires, even more browser-caching warnings appear; see the sketch below).
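For context, the extend_cache experiment was along these lines (filter name as per the mod_pagespeed docs; the rest of my setup may differ):
<IfModule pagespeed_module>
  ModPagespeed On
  ModPagespeedRewriteLevel PassThrough
  # Rewrite resource URLs to hashed names so they can be served
  # with long cache lifetimes
  ModPagespeedEnableFilters extend_cache
</IfModule>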
After some Googling, I found that this removed the warning from Insights:
ModPagespeedInPlaceResourceOptimization off
Docs on In-Place Resource Optimization.
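In context, the directive sits alongside the other PageSpeed settings, something like this (a sketch):
<IfModule pagespeed_module>
  ModPagespeed On
  # In-place optimization can serve resources with shorter cache
  # lifetimes than mod_expires sets, which triggers the Insights warning
  ModPagespeedInPlaceResourceOptimization off
</IfModule>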
Here is an answer by Matthias Redl-Mann that I discovered in the Google Products forum:
So I could solve the problem: the Apache user had no access to the cache directory. Setting a different cache path via the ModPagespeedFileCachePath directive solved the problem. After setting a path the Apache user can access, everything worked.
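For illustration, the directive looks something like the following; /var/cache/mod_pagespeed/ is just an example path, and it must be writable by the Apache user:
<IfModule pagespeed_module>
  # Point PageSpeed's file cache at a directory the Apache user can write to
  ModPagespeedFileCachePath "/var/cache/mod_pagespeed/"
</IfModule>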
I am getting "Leverage browser caching" issues for images in the Google PageSpeed tool. I have searched all over the web for this issue but have not found any solution.
I have tried Total Cache, WP Super Cache, ZenCache, etc., and I have also tried editing my .htaccess file and defining an expiration for each image type, but I am still getting this issue. Can you help me out, please? Please tell me what the cause of this issue is so that I am aware of it next time. I also get the issue when I enable gzip compression from the .htaccess file (roughly the block below).
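For reference, the gzip tweak was along these lines (a minimal mod_deflate sketch; the real file lists more types):
<IfModule mod_deflate.c>
  # Compress text-based responses before sending them to the browser
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>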
Your suggestions would be valuable to me.
Thanks
So I have tested my page via Google's PageSpeed Insights, and it is currently telling me to:
Leverage browser caching for the following cacheable resources:
http://maps.google.com/maps/api/js?sensor=false&language=en (30 minutes)
It's rather ironic, as it's a Google resource from a Google server, but it's always good to know how to do things. I've tried to read about how to do this via a link Google provided on the test page, but it didn't really give an example of how to cache this external resource. I've tried reading as much as I can and adding bits to my .htaccess file, but nothing seems to work.
So I guess my first question is: is it even possible to cache this resource via the .htaccess file?
And if so, what code would I need to put in there to get it to cache the resource?
Thank you in advance for any help.
You can't control the caching of resources served by a third party; .htaccess controls caching only for resources served out of your own boxes.
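For resources served from your own host, the usual .htaccess approach is mod_expires or mod_headers, e.g. (a sketch; the one-year lifetime is only an example):
<IfModule mod_headers.c>
  # Long-lived caching for static assets served from this host;
  # third-party URLs such as maps.google.com are unaffected
  <FilesMatch "\.(js|css|png|jpe?g|gif)$">
    Header set Cache-Control "max-age=31536000, public"
  </FilesMatch>
</IfModule>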
We're building a mobile-friendly site to work in tandem with our client's MOSS 2007 internet site. We need to be able to redirect users who hit the home page and are using a mobile device.
Our original intention was to add a custom control to the home page's page layout that would detect the current user's device and redirect to the mobile site accordingly. We quickly realised that this would not work because we are using the Output Caching functionality provided by SharePoint/ASP.NET, which means the detection code will only run for the first visitor to the home page until the cache expires.
Our next idea was to build a custom HTTP Module and process the detection there. However, we are finding that the Output Caching is not allowing that either. If the cache is set while a mobile device is visiting, all browsers are subsequently redirected to the mobile site (until the cache expires).
If we turn off output caching it works just fine, but we cannot turn output caching off, especially for the home page. We did investigate Substitution (Donut) Caching, but this is not working because we are filtering the ASP.NET response within another HTTP Module that tidies up the rendered HTML for XHTML-compatibility reasons. I've also experimented with the output cache profile by setting its vary-by-header property to "User-Agent", but I am getting mixed results and am also concerned about the memory implications of caching multiple versions of pages (we already have memory issues now and then).
It's possible we could run the redirection code in JavaScript but then we risk not detecting a lot of devices that don't have JavaScript enabled. This is a government website so the usage of JavaScript has to abide by accessibility guidelines.
Does anyone have any other ideas as to how we can solve this issue? Has anyone done this before? Perhaps in a different way?
Hope you can help, thanks.
p.s. I have also asked this question on SharePoint.SE but wanted to get as many eyes on this as possible.
I would suggest you try ISAPI filters.
I've actually solved this one, I think. I've pretty much followed this article: http://msdn.microsoft.com/en-us/library/ms550239.aspx. We have updated the code in that article to build a cache key based on whether the current page is the home page, whether the current user is on a mobile device, and whether a cookie exists forcing the user to the full site. I will probably write this up as a blog post; when I do, I will update this answer with a link.
A couple sites of mine recently got "hacked". Someone was able to add a line of JavaScript to the bottom of every page on the site.
The server is a Windows Server 2003 machine with ColdFusion 8 and MySQL 5.x installed and running.
Looking into the code on each page shows that none of the pages were modified. The JavaScript is not in the code files themselves. This leads me to believe it is an IIS problem, but I am unsure and cannot find anything that would be able to do this within IIS.
The JavaScript being added redirects a user to another page only when they come from Google, or at least it appears to work this way.
Any help on how someone was able to accomplish this as well as removing it would be greatly appreciated.
Another way to word the question, thanks to Jeffrey Hantin:
How do you systematically modify output from IIS without modifying individual pages?
EDIT: A bit more testing has shown that only the .cfm pages add the extra JavaScript. I added a new .cfm page and the JS was there, but a .html page did not have it.
Edit 2: Turns out to have been a ColdFusion problem after all. Somehow OnRequestEnd.cfm pages were created on the sites and added that JS.
Looks like someone exploited some of the recent Adobe CF vulnerabilities.
Please see these blog posts for details and try to search for the symptoms on your server:
Image upload
FCKEditor bug + this post
Hope this helps.
Turns out to have been a ColdFusion problem after all. OnRequestEnd.cfm pages were created on the sites and added that JS.
If you only want to use IIS to modify output, an ISAPI filter is probably the best answer. If you would like to use ColdFusion, you can use Application.cfc to modify output during certain parts of the request cycle, or wrap all of your pages in a custom tag to consolidate the shared portions of your page templates.
I have used both. In cases where my page headers and footers are all the same, the custom tag is fast and easy to use; to make changes to all the pages, you edit one custom tag file. In cases where I have a more complicated web application, I'll use Application.cfc to store and insert common components where they are needed.
They might have guessed your password. You should change it immediately.
It's possible that an ISAPI filter is used to do this. I once used one myself to perform compression before IIS supported it natively.
In your specific situation, you may want to check for ISAPI filters you don't want installed. Of course, if your server has been compromised, you will likely be better off rebuilding from a known good image rather than trying to fix it in situ.