Fastly CDN - do not cache specific sections of a page - IIS

I am reading Fastly's guides on how to cache pages and how to use conditions to avoid caching specific pages. The site I'm working on has some dynamic elements, present on all pages of the site, which must not be cached.
The Fastly guides mention passing specific HTTP headers or using conditions to control caching, and this makes sense for full pages. However, I can't find a clear answer on how to tell the system not to cache parts of a given page while still caching the rest of the page.
Does anyone have an example of how I would do this, ideally using IIS / C#?
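For what it's worth, the usual pattern here (not something the guides spell out for IIS) is to split the dynamic parts into their own endpoints, let Fastly cache the page shell, and mark only the fragment endpoints as uncacheable, stitching them back together via AJAX or Fastly's Edge Side Includes support. Below is a minimal C# sketch of such a fragment endpoint; the handler name and markup are purely illustrative:

    using System.Web;

    // Illustrative ASP.NET handler for a dynamic page fragment. The containing
    // page stays cacheable at the edge; only this endpoint opts out of caching.
    public class DynamicFragmentHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            // Cache-Control: no-cache, no-store keeps both browsers and Fastly
            // from caching this response.
            context.Response.Cache.SetCacheability(HttpCacheability.NoCache);
            context.Response.Cache.SetNoStore();
            // Surrogate-Control is read (and stripped) by Fastly, so it can
            // control the edge without affecting browser caching.
            context.Response.AppendHeader("Surrogate-Control", "no-store");

            context.Response.ContentType = "text/html";
            context.Response.Write("<div>per-user dynamic content</div>");
        }

        public bool IsReusable
        {
            get { return true; }
        }
    }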

Related

How to Leverage Browser Caching using .htaccess? (Google Maps API)

So I have tested my page via Google's PageSpeed Insights, and it is currently telling me to:
Leverage browser caching for the following cacheable resources:
http://maps.google.com/maps/api/js?sensor=false&language=en (30 minutes)
It's rather ironic, as it's a Google resource served from a Google server, but it's always good to know how to do things. I've tried to read about how to do this via a link Google provided on the test page, but it didn't really give an example of how to cache this external resource. I've read as much as I can and added bits to my .htaccess file, but nothing seems to work.
So I guess my question is, firstly: is it even possible to cache this resource via the .htaccess file?
And if so, what code would I need to put in there to get it to cache the resource?
Thank you in advance for any help.
You can't control the caching of resources served by a third party. .htaccess controls caching only for resources served from your own boxes.
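For the resources you do serve yourself, a typical .htaccess approach (assuming your host has mod_expires enabled) looks something like this:

    # These rules only affect files served from your own server; they cannot
    # change the caching headers on the Google Maps script.
    <IfModule mod_expires.c>
      ExpiresActive On
      ExpiresByType text/css "access plus 1 week"
      ExpiresByType application/javascript "access plus 1 week"
      ExpiresByType image/png "access plus 1 month"
    </IfModule>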

Orchard CMS Warmup Status Zero

I have set up a brand new Orchard CMS 1.5.1 site using the Web Platform Installer on Windows Server 2008. I wanted to test the Performance settings, so I configured the following Warmup entries, one per line:
/
/blog
...and checked the following options:
[x] Generate warmup pages periodically, every 90 minutes
[x] Generate warmup pages any time some content is published
When I visit the site, performance is still a bit slow, and the Performance Warmup settings show each page with a status of zero and a red down-arrow icon next to it.
Is there anything else I need to enable? Is there anything I am missing in the configuration, like permissions, etc.?
UPDATE:
I have noticed that my site does not have a folder to store the warmup pages. I added that folder manually, but it still didn't fix the problem. Are there permissions I would need to set on that folder?
UPDATE 2:
After talking with Sebastien Ros, I think I understand what is wrong but still don't know how to fix it. The base URL setting in Orchard is set to "www.mydomain.com", as it should be, but networking-wise my server does not allow the site to go out to the internet and query itself by that address in order to generate the warmup pages. To make matters worse, I have several sites hosted on the same IP address, using host headers to distinguish between them. This prevents me from even being able to configure the base URL as a local IP address (which causes issues with other modules anyway).
I'm not sure what alternatives I have now.
Thanks,
Brian
Make sure that the general settings page is pointing to your base URL, e.g. http://mywebsite.com.
It may be pointing to the local host by default.
I confirmed with a network engineer at my server host that there was a networking restriction preventing outgoing requests from coming back in to the web site. So the Performance module could not query www.mydomain.com and get an answer. Once the network restriction was removed, I was able to see the warmups create the cached pages with a status of 200.
Alternatively, it was suggested that I create entries in my hosts file for each of my Orchard sites. I did not try this, but I see no reason why it would not work, even with the host-header scenario that I have; an example entry is shown below.
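For reference, such an entry just maps the public host name back to the local machine (the file path is the Windows standard; the host names below are placeholders). IIS host headers still work, because the Host header in the request is unchanged:

    # C:\Windows\System32\drivers\etc\hosts
    127.0.0.1    www.mydomain.com
    127.0.0.1    www.myotherdomain.com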
Brian

How to use locations.kml with sitemap.xml

I would like to make sure my website ranks as high as possible whenever my Google Places location ranks high.
I have seen references to creating a locations.kml file and putting it in the root directory of my site, then adding lines to the sitemap.xml file that point to this .kml file.
I get this from this statement on the geolocations page:
Google no longer supports the Geo extension to the Sitemap protocol. We recommend that you tell Google about geographically-based URLs by including them in a regular Web Sitemap.
There is a link to the Web Sitemap page
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=183668
I'm looking for examples of how to include Geo location information in the sitemap.xml file.
Would someone please point me to an example so that I can know how to code the reference?
I think the point is that you don't use any specific formatting in the sitemap. You make sure you include all your locally relevant pages in the sitemap as normal (i.e. you don't include any geolocation data in the sitemap).
Googlebot will use its normal methods for determining whether a page should be locally targeted.
(I think Google has found that the Sitemap protocol was being abused, and/or misunderstood, so they no longer need it to tell them so much about the page. Rather, it's just a way to find pages that might take a long time to discover through conventional means.)
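In other words, a location page is listed like any other URL. A sketch of such an entry, with a placeholder domain and no geo extension involved:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/locations/springfield</loc>
        <changefreq>monthly</changefreq>
      </url>
    </urlset>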

How to Bypass Output Cache in SharePoint 2007 Publishing Internet site

We're building a mobile-friendly site to work in tandem with our client's MOSS 2007 internet site. We need to be able to redirect users who hit the home page and are using a mobile device.
Our original intention was to add a custom control to the home page's page layout that would detect the current user's device and redirect to the mobile site accordingly. We quickly realised that this would not work because we are using the output caching functionality provided by SharePoint/ASP.NET, which means the detection code only runs for the first visitor to the home page and not again until the cache expires.
Our next idea was to build a custom HTTP module and perform the detection there. However, we are finding that output caching does not allow that either: if the cache is set while a mobile device is visiting, all browsers are subsequently redirected to the mobile site (until the cache expires).
If we turn off output caching, it works just fine, but we cannot turn output caching off, especially for the home page. We did investigate substitution (donut) caching, but it does not work for us because we are already filtering the ASP.NET response in another HTTP module that tidies up the rendered HTML for XHTML compatibility reasons. I've also experimented with setting the output cache profile's vary-by-header property to "User-Agent", but I am getting mixed results and am also concerned about the memory implications of caching multiple versions of pages (we already have memory issues now and then).
It's possible we could run the redirection code in JavaScript, but then we risk not detecting the many devices that don't have JavaScript enabled. This is a government website, so the use of JavaScript has to abide by accessibility guidelines.
Does anyone have any other ideas as to how we can solve this issue? Has anyone done this before, perhaps in a different way?
Hope you can help, thanks.
P.S. I have also asked this question on SharePoint.SE, but wanted to get as many eyes on it as possible.
I would suggest you try ISAPI filters.
I've actually solved this one, I think. I pretty much followed this article: http://msdn.microsoft.com/en-us/library/ms550239.aspx. We updated the code from that article to build a cache key based on whether the current page is the home page, whether the current user is on a mobile device, and whether a cookie exists forcing the user to the full site. I will probably write this up as a blog post; when I do, I will update this answer with a link.
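For anyone finding this later, here is a minimal sketch of the kind of cache-key override described above, using ASP.NET's GetVaryByCustomString hook in global.asax (the "DeviceClass" key and "FullSite" cookie name are my own placeholders, not from the article):

    using System;
    using System.Web;

    // Gives the output cache a different key for mobile and desktop visitors,
    // so each group gets its own cached copy of the page.
    public class Global : HttpApplication
    {
        public override string GetVaryByCustomString(HttpContext context, string custom)
        {
            if (string.Equals(custom, "DeviceClass", StringComparison.OrdinalIgnoreCase))
            {
                // Hypothetical cookie that lets a mobile user force the full site.
                bool forceFullSite = context.Request.Cookies["FullSite"] != null;
                bool isMobile = context.Request.Browser != null
                                && context.Request.Browser.IsMobileDevice;
                return (isMobile && !forceFullSite) ? "mobile" : "desktop";
            }
            return base.GetVaryByCustomString(context, custom);
        }
    }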

How do you globally modify page output sent from IIS without modifying the page source?

A couple of sites of mine recently got "hacked": someone was able to add a line of JavaScript to the bottom of every page on each site.
The server is Windows Server 2003, with ColdFusion 8 and MySQL 5.x installed and running.
Looking at the code of each page shows that none of the pages were modified; the JavaScript is not in the code files themselves. This leads me to believe it is an IIS problem, but I am unsure and cannot find anything within IIS that would be able to do this.
The JavaScript being added redirects a user to another page only when they come from Google, or at least it appears to work this way.
Any help on how someone was able to accomplish this as well as removing it would be greatly appreciated.
Another way to word the question, thanks to Jeffrey Hantin:
How do you systematically modify output from IIS without modifying individual pages?
EDIT: A bit more testing has shown that only the .cfm pages get the extra JavaScript. I added a new .cfm page and the JS was there, but a new .html page did not have it.
EDIT 2: It turns out to have been a ColdFusion problem after all. Somehow an OnRequestEnd.cfm page was created on each site, and it added that JS.
It looks like someone exploited some recent Adobe ColdFusion vulnerabilities.
Please see these blog posts for details, and search for the symptoms on your server:
Image upload
FCKEditor bug + this post
Hope this helps.
It turns out to have been a ColdFusion problem after all. An OnRequestEnd.cfm page was created on the sites, and it added that JS.
If you only want to use IIS to modify the output, an ISAPI filter is probably the best answer. If you would like to use ColdFusion, you could use Application.cfc to modify output during certain parts of the request cycle, or wrap all of your pages in a custom tag to consolidate the shared portions of your page templates.
I have used both. In cases where my page headers and footers are all the same, the custom tag is fast and easy to use: to make a change to all the pages, you edit one custom tag file. In cases where I have a more complicated web application, I'll use Application.cfc to store and insert common components where they are needed.
They might have guessed your password. You should change it immediately.
It's possible that an ISAPI filter was used to do this. I once used one myself to perform compression before IIS supported it natively.
In your specific situation, you may want to check for ISAPI filters you didn't install. Of course, if your server has been compromised, you will likely be better off rebuilding it from a known-good image rather than trying to fix it in situ.
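As a managed alternative to a native ISAPI filter (a different technique from the ones named above, sketched here on the assumption that requests are routed through ASP.NET), an IHttpModule can attach a response filter stream and rewrite every page IIS serves; the marker string being stripped is purely illustrative:

    using System;
    using System.IO;
    using System.Text;
    using System.Web;

    // Rewrites the response body of every request handled by ASP.NET.
    public class OutputRewriteModule : IHttpModule
    {
        public void Init(HttpApplication app)
        {
            app.BeginRequest += delegate(object sender, EventArgs e)
            {
                app.Response.Filter = new TagStripFilter(app.Response.Filter);
            };
        }

        public void Dispose() { }
    }

    // Minimal write-only filter stream. A production version must buffer,
    // since a tag can be split across Write calls.
    public class TagStripFilter : Stream
    {
        private readonly Stream _inner;
        public TagStripFilter(Stream inner) { _inner = inner; }

        public override void Write(byte[] buffer, int offset, int count)
        {
            string chunk = Encoding.UTF8.GetString(buffer, offset, count);
            // Illustrative transformation: strip an injected script reference.
            chunk = chunk.Replace("<script src=\"http://evil.example/x.js\"></script>", "");
            byte[] bytes = Encoding.UTF8.GetBytes(chunk);
            _inner.Write(bytes, 0, bytes.Length);
        }

        public override void Flush() { _inner.Flush(); }

        // Plumbing required by the Stream base class.
        public override bool CanRead { get { return false; } }
        public override bool CanSeek { get { return false; } }
        public override bool CanWrite { get { return true; } }
        public override long Length { get { return 0; } }
        public override long Position { get { return 0; } set { } }
        public override int Read(byte[] buffer, int offset, int count) { throw new NotSupportedException(); }
        public override long Seek(long offset, SeekOrigin origin) { throw new NotSupportedException(); }
        public override void SetLength(long value) { throw new NotSupportedException(); }
    }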
