Even if the image is changed, overwritten, or modified, IIS still serves the cached copy.
I am trying to upload an image taken from a webcam every 15 seconds. The image makes it onto the server, but when I refresh the browser, the image served from the server does not update.
IIS apparently caches the file for more than 2 minutes. I want this to be in real time. I've tried disabling caching everywhere I could think of, with no luck.
Embed your image as follows:
    <img src="WebCamImage.aspx?data={auto-generated guid}" ... >
And create a page (WebCamImage.aspx) that streams the static image file back to the browser while ignoring the "data" request parameter, which exists only to defeat caching. Make sure to set the response content type to "image/jpeg" (or whatever is appropriate) in the @ Page directive.
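The original answer targets an ASPX page, but the pattern is language-agnostic: read the current file from disk on every request, ignore the cache-busting parameter, and send no-cache headers. A minimal sketch of the same idea as a Node/Express handler (the route and file path are illustrative):

    import express from "express";
    import path from "path";

    const app = express();

    // Equivalent of WebCamImage.aspx: stream the latest snapshot on every hit.
    app.get("/WebCamImage", (_req, res) => {
      // The ?data={guid} cache-buster is simply ignored; its only job is to
      // make every URL unique so that nothing along the way caches it.
      res.set("Cache-Control", "no-store");
      res.type("image/jpeg");
      res.sendFile(path.join(__dirname, "webcam.jpg"));
    });

    app.listen(8080);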
Are you sure that the image is cached on the server and not on the client? Have you tried requesting the same image from a different client?
If this IS server-side caching, then this article has all the answers for you:
http://blogs.msdn.com/david.wang/archive/2005/07/07/HOWTO-Use-Kernel-Response-Cache-with-IIS-6.aspx
You are most likely affected by kernel-mode caching.
See that scavenger period?
Scavenger - 120 seconds by default, controlled by the registry key HKLM\SYSTEM\CurrentControlSet\Services\HTTP\Parameters\UriScavengerPeriod
That is probably what you are experiencing (the 2-minute caching).
Try turning kernel-mode caching off to see if it makes a difference (performance may suffer, but it will be no worse than IIS 5).
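If you first want to experiment with a shorter scavenger period instead of disabling the cache outright, the registry value can be changed from an elevated prompt; this is only a sketch, and HTTP.sys must be restarted before the new period (in seconds) takes effect:

    reg add HKLM\SYSTEM\CurrentControlSet\Services\HTTP\Parameters /v UriScavengerPeriod /t REG_DWORD /d 10
    net stop http /y
    net start w3svc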
Related
I have a load of images stored as blobs in an Azure container.
I am attempting to get the browser (Chrome) to cache these images to save bandwidth. I have read many posts stating that this can be achieved by setting the Cache-Control response header. I am using Microsoft Azure Storage Explorer to modify these headers, e.g.:
public, max-age=7776000
When loading this image (https://XXXXXXXX.blob.core.windows.net/XXXXXXXX/Themes/summer.jpg), Google Chrome's Developer Tools show the header being returned, but it makes no difference to the caching of the image. I have tried many different permutations of the allowed CacheControl attributes, and I don't see any caching going on at all. The status is always 200, but I was expecting 304 for a cache hit. Is this correct?
Whichever CacheControl string I provide is always displayed in Chrome's results; it just doesn't seem to make any difference to the caching aspect. I've tried variations of public, private, max-age, s-maxage, must-revalidate. And just to be complete, no-cache and no-store. No differences were observed.
The above image takes 900 ms+ to load for me. However, when saved locally, the same image takes 19 ms. I would expect that if the browser were caching the image, its timing would be comparable to the local time.
Other posts suggest using an Azure CDN. However, I don't want to go down this route, as the site that uses these images doesn't need it.
Am I missing a setting in Azure to allow caching? Loading the images directly in the browser or within a web page makes no difference either.
Can anyone provide assistance? Let me know if any other information is required.
The CacheControl settings should take effect.
When using Chrome, make sure you haven't ticked the "Disable cache" option in the Developer Tools Network panel.
I can see the expected behavior when setting max-age=xx for CacheControl in Chrome.
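For reference, the same header can also be set programmatically rather than through Storage Explorer; a minimal sketch using the @azure/storage-blob Node SDK (the connection string location is assumed, and the container name is elided as in the question):

    import { BlobServiceClient } from "@azure/storage-blob";

    async function setCacheControl(): Promise<void> {
      const service = BlobServiceClient.fromConnectionString(
        process.env.AZURE_STORAGE_CONNECTION_STRING!,
      );
      const blob = service
        .getContainerClient("XXXXXXXX") // container name as elided in the question
        .getBlockBlobClient("Themes/summer.jpg");
      // Note: setHTTPHeaders replaces ALL HTTP headers on the blob, so restate
      // the content type alongside the cache policy.
      await blob.setHTTPHeaders({
        blobCacheControl: "public, max-age=7776000",
        blobContentType: "image/jpeg",
      });
    }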
Somehow I think there's a problem with stat on Linux. To test, I compared a regular empty folder against an icon of less than 1000 bytes. The test was done with Apache 2.2 on a server located in eastern Canada.
Webpagetest results:
http://www.webpagetest.org/result/160202_KK_9HJ/1/details/
I'm curious as to why the time to first byte for the directory listing is a third higher than the time to first byte for the icon.
What settings do I use in Linux to fix this?
The time to first byte represents the time taken to 1) send the request to the server, 2) process the request and 3) return at least some of the results from the server.
For similarly sized resources 1) and 3) should be the same so let's concentrate on 2) for now.
When you request the directory, Apache has to check whether the directory contains an index.html file; if not, it reads the directory contents, constructs an HTML page with links to the parent directory and to each file/subdirectory, and then returns that page.
When you request the .ico file, Apache just has to pick up the file and return it - nice and simple.
So, as you can see, there is more work in the first case than in the second, and it isn't really a fair test. Compare a static index.html file to a static .ico file for a fairer test - timed as shown below, for example - and then you'll know whether you have an issue.
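Assuming curl is available, you can compare the two requests from the command line (URLs illustrative):

    curl -s -o /dev/null -w "TTFB: %{time_starttransfer}s  total: %{time_total}s\n" http://example.com/index.html
    curl -s -o /dev/null -w "TTFB: %{time_starttransfer}s  total: %{time_total}s\n" http://example.com/favicon.ico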
Additionally, depending on your MPM choice, settings, server load, and server history, there may already be a thread or process waiting to handle the first request (fast), or the first request may have to spawn one (slow). This is likely to be less of an issue for a second request, particularly with keep-alive enabled. See here for more details: https://serverfault.com/questions/383526/how-do-i-select-which-apache-mpm-to-use.
There is also TCP slow start, which particularly affects older OS and software versions, but that is unlikely to have an impact at the small loads you are talking about, and it should affect total download time rather than TTFB anyway. Still, it's yet another reason to ensure you're running up-to-date software.
And finally, your TTFB is mostly influenced by your hosting provider, the pipes to your server, and the number of hops before a request reaches Apache, so once you have chosen a hosting provider it is mostly out of your control. That influence will usually show up consistently across the board, though, rather than as the variance you see between these two requests.
I'm trying to limit data usage when serving images to ensure the user isn't loading bloated pages on mobile while still maintaining the ability to serve larger images on desktop.
I was looking at Twitter and noticed they append :large to the end of the URL,
e.g. https://pbs.twimg.com/media/CDX2lmOWMAIZPw9.jpg:large
I'm really just curious how this request is being handled; if you go directly to that link there are no scripts on the page, so I'm assuming it's done server-side.
Can this be done using something like Restify/Express on a Node instance? More than anything I'm really just curious how it is done.
Yes, it can be done using Express in Node. However, it can't be done using express.static(), since this is not a static request: the handler must parse the :large suffix (which is part of the path, not a querystring) in order to respond with the appropriate image.
Generally the images will have already been pre-generated during the user-upload phase in a set of sizes (e.g. small, medium, large, original), and the handler checks the requested size to determine which static file to respond with, as in the sketch below.
That is a much higher-performing solution than generating the appropriately-sized image server-side on every request from the original, though that dynamic approach is sometimes necessary if the server must produce a non-finite set of image sizes.
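A minimal Express sketch of that approach (the route, size names and folder layout are invented for illustration):

    import express from "express";
    import path from "path";

    const app = express();

    // Twitter-style URL: /media/CDX2lmOWMAIZPw9.jpg:large
    // The ":large" suffix is part of the last path segment, so split it off.
    app.get("/media/:file", (req, res) => {
      const [name, size = "medium"] = req.params.file.split(":");
      if (!["small", "medium", "large", "orig"].includes(size)) {
        return res.sendStatus(404);
      }
      // Variants were pre-generated at upload time into per-size folders,
      // e.g. images/large/CDX2lmOWMAIZPw9.jpg
      res.sendFile(path.join(__dirname, "images", size, name));
    });

    app.listen(3000);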
As the title says: is it possible to set caching on external resources with .htaccess?
I have some third-party stuff on my site, e.g. Google Web Elements and embedded YouTube clips.
I want my Google Page Speed score to get higher.
The error from Page Speed:
The following resources are missing a cache validator.
http://i.ytimg.com/vi/-MfM1fVSFnM/0.jpg
http://i.ytimg.com/vi/-PxVKNJmw4M/0.jpg
http://i.ytimg.com/vi/3nxENc_msc0/0.jpg
http://i.ytimg.com/vi/5Bra7rbGb7g/0.jpg
http://i.ytimg.com/vi/5P76PKybW5o/0.jpg
http://i.ytimg.com/vi/9l9BzKfI88o/0.jpg
http://i.ytimg.com/vi/E7hvBxMB4XI/0.jpg
http://i.ytimg.com/vi/IiocozLHFis/0.jpg
http://i.ytimg.com/vi/JIHohC8fydQ/0.jpg
http://i.ytimg.com/vi/P66uwFpmQSE/0.jpg
http://i.ytimg.com/vi/TXLTbARnRdU/0.jpg
http://i.ytimg.com/vi/bPBrRzckfEQ/0.jpg
http://i.ytimg.com/vi/dajcIH9YUuI/0.jpg
http://i.ytimg.com/vi/g4roerqw090/0.jpg
http://i.ytimg.com/vi/h1imBHP3DdA/0.jpg
http://i.ytimg.com/vi/hRvW5ndLLEk/0.jpg
http://i.ytimg.com/vi/kzahftbo6Qc/0.jpg
http://i.ytimg.com/vi/lta2U3hkC4k/0.jpg
http://i.ytimg.com/vi/n1o9bGF88HY/0.jpg
http://i.ytimg.com/vi/n3csJN0wXew/0.jpg
http://i.ytimg.com/vi/q0Xu-0moeew/0.jpg
http://i.ytimg.com/vi/tPCDPKirZBM/0.jpg
http://i.ytimg.com/vi/uLxsPImMJmg/0.jpg
http://i.ytimg.com/vi/x33B_iBn2_M/0.jpg
No, it's up to them to cache it.
The best you could do would be to download them onto your server and then serve them, but that would be slower anyway!
Nope, setting cache headers for third-party resources is not possible unless you start passing those resources through your own server as a proxy, which you usually don't want to do for reasons of speed and traffic.
As far as I can see, there's nothing you can do here.
You could delay your YouTube videos from loading until something like a holding image is clicked. This wouldn't cache the thumbnails when (or if) they are loaded, but they would no longer hurt your Page Speed, because they wouldn't be requested on page load any more.
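A rough sketch of that click-to-load idea (the class and data attribute names are made up):

    // Swap each placeholder thumbnail for the real YouTube iframe on click.
    document.querySelectorAll<HTMLElement>(".yt-placeholder").forEach((el) => {
      el.addEventListener("click", () => {
        const iframe = document.createElement("iframe");
        iframe.src = "https://www.youtube.com/embed/" + el.dataset.videoId;
        iframe.allowFullscreen = true;
        el.replaceWith(iframe);
      });
    });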
I'm writing my own webserver and I don't yet handle concurrent connections properly. I get massive page-loading lag due to mishandling concurrent connections (I respond to the SYN, but I somehow lose the GET packet; the browser retries after a while, but it takes 3 seconds!). I'm trying to figure out if there's a way to instruct the browser to stop loading things concurrently, because debugging this is taking a long time. The webserver is very stripped down, is not going to be public, and is not the main purpose of this application, which is why I'm willing to cut corners in this fashion.
It'd be nice to just limit the concurrent connections to 1, because modifying that parameter via a registry hack for IE and via about:config for Firefox makes both browsers work perfectly.
Any other workaround ideas would be useful, too. A couple I can think of:
1 - Instruct the browser to cache everything with no expiration, so the slow loads (.js, .css and image files) happen only once (see the header sketch after this list). I can append a checksum to the file URL (img src="/img/blah.png?12345678") to make sure that if I update a file, it gets reloaded properly.
2 - Inline the .js and .css in the .html files - but this still doesn't fix the image issue, and it's just plain ugly anyway.
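For idea 1, the static responses would just need far-future caching headers, something like this (one year shown, since HTTP has no literal "never expires"):

    HTTP/1.1 200 OK
    Content-Type: image/png
    Cache-Control: max-age=31536000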
I don't believe it's possible to tell a browser like Firefox not to load resources concurrently, at least not for your users via some HTTP header or the like.
So I never found a way to do this.
My underlying issue was that too many requests were coming in and overflowing my limited receive buffers in EMAC RAM; overflowed receive buffers mean discarded packets. The fix was to combine all the .js files into one and all the .css files into one to cut down the number of requests. I set all image, .js and .css responses to a one-year expiration; the HTML pages expire immediately. I wrote a Perl script to append MD5 checksums to file URLs so changed files are refetched. It works great now: pages load instantly once the first load has cached everything.
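The checksum-appending step could look roughly like this (the original script was Perl; this TypeScript version of the same idea is purely illustrative, with a deliberately simplistic regex):

    import { createHash } from "crypto";
    import { readFileSync } from "fs";

    // Rewrite references like src="/img/blah.png" to src="/img/blah.png?<md5>"
    // so a changed file gets a new URL and is refetched despite the long expiry.
    function appendChecksums(html: string, webroot: string): string {
      return html.replace(
        /\b(src|href)="(\/[^"?]+\.(?:js|css|png|jpg|gif))"/g,
        (_match, attr, file) => {
          const digest = createHash("md5")
            .update(readFileSync(webroot + file))
            .digest("hex");
          return `${attr}="${file}?${digest.slice(0, 8)}"`;
        },
      );
    }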