Microsoft JavaScript Add-in Clear Static website cache - ms-office

We have two Microsoft add-ins written using the Office JS framework. As we understand it, the static website (taskpane.html) is loaded whenever the pane is opened.
Changes to our plugin are mostly cosmetic, so we usually do not bump the plugin version; we just push new code to the bucket hosting the static website.
The issue we are facing is caching of the build bundle: unless we manually clear the cache using developer tools, we do not get the updated website inside the plugin pane.
We have disabled caching on the S3 side so that the Cache-Control header is returned as no-cache, but even after that I see HTTP status code 304 for the taskpane.html request when the plugin refreshes.
Are we supposed to distribute a new version of the plugin even for website updates?

Are we supposed to distribute a new version of the plugin even for website updates?
No, that is not required.
We have disabled caching at the S3 end to return the Cache-Control header value as no-cache
But the files already loaded by the web browser are cached on the end-user machine. To get them requested anew, the users need to clear their cache first. Only after that will new requests be made according to the HTTP headers set on the server. Note that the Cache-Control header field holds directives (instructions), in both requests and responses, that control caching in browsers and shared caches (e.g. proxies, CDNs).
It is not clear which directives are used for the Cache-Control HTTP header.
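If the files are uploaded to S3 from a script, the Cache-Control metadata can be set per object at upload time so that S3 returns it with every response. A minimal sketch, assuming the AWS SDK for JavaScript v3 and a made-up bucket name:

import { readFile } from "node:fs/promises";
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" }); // region is an assumption

async function uploadTaskpane(): Promise<void> {
  const body = await readFile("dist/taskpane.html");
  await s3.send(new PutObjectCommand({
    Bucket: "my-addin-bucket", // hypothetical bucket name
    Key: "taskpane.html",
    Body: body,
    ContentType: "text/html",
    // no-cache: the browser may store the file but must revalidate it
    // before reuse; an unchanged file answers 304, a new build answers 200.
    CacheControl: "no-cache",
  }));
}

uploadTaskpane().catch(console.error);

With no-cache in place, a 304 on refresh is expected and harmless: it means the browser revalidated and the server confirmed the file is unchanged. If a 304 still yields stale content, a shared cache in between (proxy or CDN) may be answering the revalidation and may need to be purged as well.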

Related

How to stop index.html being cached by Azure CDN

I am using Azure CDN to host a static website I am building.
It's great, other than the fact that when I update my web app the old page is cached and is still shown.
I have added the following cache rule in the rules engine to make it refresh every 60 seconds; however, this does nothing and I still get the old content. The only way to get the new content is to open an incognito browser window.
Anyone have any ideas? It's driving me crazy!
Here is a screenshot of the browser dev window when I hit the index.html page. I can't see any cache-control headers here; I would think that Azure CDN would/should be putting these on. Is that incorrect?
The rule you are modifying controls the "internal max age". If a file shows up correctly in incognito mode, this rule is working fine. You have to set "external max age" to control the Cache-Control header.
https://learn.microsoft.com/en-us/azure/cdn/cdn-verizon-premium-rules-engine-reference-features
Looks like it is not Azure CDN that is caching index.html; it is your browser. Ensure that the Cache-Control header is returned correctly by checking it in the developer tools.
https://learn.microsoft.com/en-us/azure/cdn/cdn-manage-expiration-of-cloud-service-content
https://learn.microsoft.com/en-us/azure/cdn/cdn-manage-expiration-of-blob-content
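As the second linked article explains, for a blob-hosted index.html the origin's Cache-Control comes from the blob's own properties. A minimal sketch, assuming the @azure/storage-blob SDK and placeholder container/blob names (the 60-second lifetime mirrors the rule above):

import { BlobServiceClient } from "@azure/storage-blob";

async function setIndexCacheControl(): Promise<void> {
  // Connection string, container and blob names are placeholders.
  const service = BlobServiceClient.fromConnectionString(
    process.env.AZURE_STORAGE_CONNECTION_STRING!
  );
  const blob = service.getContainerClient("$web").getBlockBlobClient("index.html");
  // Ask browsers and the CDN to revalidate index.html after 60 seconds.
  await blob.setHTTPHeaders({
    blobCacheControl: "public, max-age=60",
    blobContentType: "text/html",
  });
}

setIndexCacheControl().catch(console.error);

After changing the blob headers you typically also need to purge the CDN endpoint once so that the new header reaches the edge.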

How to Use eTag on IIS for text/html Pages

I have a website which sits on a non-public domain and is delivered via a proxy on a different domain. We're having some trouble with caching of content: this is an Umbraco site, and making changes updates the pages if you hit the domain directly, but not through the proxy.
I've been informed that the proxy honours response headers and that setting an ETag would fix the issue. Having looked into this, I can see that IIS sets the ETag by default, and I can see this working on static content, i.e. .js and .css files, like so:
However, if I visit a page on the site, for example /uk/products/product, I don't see the ETag header.
Is this expected behaviour? Should it only work with those static content files, or can I set it on the page to tell the proxy that it should re-cache?
The ETag HTTP response header is an identifier for a specific version of a resource. It lets caches be more efficient and saves bandwidth, as a web server does not need to resend a full response if the content has not changed. Additionally, ETags help prevent simultaneous updates of a resource from overwriting each other ("mid-air collisions").
If the resource at a given URL changes, a new ETag value must be generated.
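For illustration, a rough sketch of the revalidation exchange a proxy or browser performs with that ETag (the URL mirrors the /uk/products/product example and is not real):

async function checkEtag(): Promise<void> {
  // First request: remember the validator the server sends back.
  const first = await fetch("https://example.com/uk/products/product");
  const etag = first.headers.get("ETag");

  // Revalidation: present the stored ETag; 304 means "use your cached copy".
  const second = await fetch("https://example.com/uk/products/product", {
    headers: etag ? { "If-None-Match": etag } : {},
  });
  console.log(second.status); // 304 if unchanged, 200 with a fresh ETag otherwise
}

checkEtag().catch(console.error);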
Static content does not change from request to request. The content that gets returned to the Web browser is always the same. Examples of static content include HTML, JPG, or GIF files.
IIS automatically caches static content (such as HTML pages, images, and style sheets), since these types of content do not change from request to request. IIS also detects changes to the files when you make updates, and IIS flushes the cache as needed.
To enable caching in IIS you could use the IIS output caching feature:
1) Open IIS Manager and select the site.
2) Select the Output Caching feature from the middle pane.
3) Select Edit Feature Settings from the middle pane.
4) Check the Enable cache and Enable kernel cache boxes and click OK.
If you want to set the ETag to blank, you could also do so by adding the code below to the web.config file:
<httpProtocol>
  <customHeaders>
    <add name="ETag" value="" />
  </customHeaders>
</httpProtocol>
Refer to the articles below for more detail:
Caching
To use or not to use ETag, that is the question.
Configure IIS Output Caching
I've read that IIS 7 and later automatically enables ETags; however, I ran a Pingdom speed test and the report advised me to enable ETags. I'm not sure whether that report is accurate, or whether the information I read about IIS 7 and newer is incorrect.

Cache control header not working

I have set cache control in my response header as Cache-Control: public, max-age=86400. But when I refresh the page or open a new tab, it always hits my server. The response status I get is 200, the server log shows an entry for this request, and when I checked chrome://cache/ this request is not in the list. I have already looked at some similar SO questions, cache-control not working without etag and why cache-control:max-age don't work?, but still with no luck. Tested on Chrome 56.
Chrome disables the cache when DevTools is open, or at least it does in Chrome 59. Open DevTools, go to Network, and uncheck "Disable cache" at the top. Now you should be able to refresh the page and see it in chrome://cache.
Cache-Control tells your browser (and proxy servers like Squid) which resources it may not cache, but it does not force your browser to cache a resource.
I recommend checking the error logs to see whether you really reach the backend or stay in the browser.
In my case, the browser gives me 200 OK in the console logs, but according to the error_log I don't reach the back end...
The Cache-Control response header will not be honoured on a page refresh. Try making that request twice without refreshing the page; then you will see it being cached (the request won't reach your server).
To achieve what you want, you might have to cache the response yourself via localStorage (a sketch follows below), or cache it through a back-end caching library.
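For the localStorage route, a rough sketch (the 86400-second lifetime mirrors the max-age from the question; the storage key scheme is made up):

const MAX_AGE_MS = 86400 * 1000; // mirrors max-age=86400 from the question

async function cachedFetch(url: string): Promise<string> {
  const key = "cache:" + url;
  const raw = localStorage.getItem(key);
  if (raw) {
    const entry = JSON.parse(raw) as { time: number; body: string };
    // Serve from localStorage while the stored copy is still fresh.
    if (Date.now() - entry.time < MAX_AGE_MS) return entry.body;
  }
  const response = await fetch(url);
  const body = await response.text();
  localStorage.setItem(key, JSON.stringify({ time: Date.now(), body }));
  return body;
}

Calling cachedFetch(url) repeatedly within the lifetime then hits the network only once, regardless of page refreshes.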

Node js - Bundler for http2

I'm currently using Babel to transform ES6 code to ES5 and browserify to bundle it for use in the browser. Now I've begun using an HTTP/2 server (Nginx).
HTTP/2 is more effective when it can load multiple small files instead of one big bundle.
How can I best serve multiple JS files instead of one big bundle?
I know that SystemJS can load multiple files in development without bundling, and for production you can use a DepCache to define the dependency trees of the modules you are importing:
https://github.com/systemjs/systemjs/blob/master/docs/production-workflows.md
This approach would require you to ditch browserify and switch to SystemJS, since browserify only produces bundles.
I see that you haven't received an answer to your question so far, so I'll try to help even though HTTP/2 is new to me too (which explains the long text of my answer :-)).
Good information about HTTP/2 can be found at https://blog.cloudflare.com/http-2-for-web-developers/. To summarize briefly:
stop concatenating files
stop inlining assets
stop sharding domains
continue minifying CSS/JavaScript files
continue loading from CDNs
continue DNS prefetching via <link rel='dns-prefetch' href='...' /> included in <head>
...
I want to add two points about the importance of setting the Cache-Control and Link HTTP headers:
Think about setting Cache-Control HTTP headers (especially max-age), together with Expires and ETag, on all content of your page. See details below. I strongly recommend reading the Caching Tutorial.
Set the Link HTTP header to use the server push feature of HTTP/2.
Setting the Link: HTTP header is important for using the server push feature of HTTP/2 (see here, here). RFC 5988 and Section 19.6.1.2 of RFC 2068 describe the feature, which already existed in HTTP 1.1. Everybody knows Content-Type: application/json, but in the same way one can set the lesser-known Link: <...>; rel=prefetch, described here. For example, one can use
Link: </app/script.js>; rel=preload; as=script
Link: </fonts/font.woff>; rel=preload; as=font
Link: </app/style.css>; rel=preload; as=style
Such headers, set on an HTML page (like index.html), inform the HTTP server to push the resources together with the response for your HTML page. As a result you save unneeded round trips and later requests (after parsing the HTML file), and the resources are displayed immediately. You can consider setting Link headers for all images on your page to improve how quickly the page becomes visible. See here for additional information with nice pictures demonstrating the advantage of HTTP/2 server push. If you use PHP, then the code could be interesting for you. A small Node sketch of setting these headers follows below.
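Since the question is about Node.js, here is a rough sketch of setting such Link headers on the HTML response with the built-in http module (paths reuse the examples above); an HTTP/2 front end such as Nginx with http2_push_preload on can then turn the preload hints into pushes:

import { createServer } from "node:http";
import { readFileSync } from "node:fs";

const server = createServer((req, res) => {
  if (req.url === "/" || req.url === "/index.html") {
    // Preload hints; an HTTP/2-capable front end can push these
    // resources together with the HTML response.
    res.setHeader("Link", [
      "</app/script.js>; rel=preload; as=script",
      "</app/style.css>; rel=preload; as=style",
      "</fonts/font.woff>; rel=preload; as=font",
    ]);
    res.setHeader("Content-Type", "text/html");
    res.end(readFileSync("index.html"));
  } else {
    res.statusCode = 404;
    res.end();
  }
});

server.listen(8080);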
Most web developers carry out some optimization steps, directly or indirectly, either during the build process or by setting HTTP headers in HTTP responses. One has to review these processes, switch some off and add others. I'll try to summarize my results.
You can consider using webpack instead of browserify to exclude some dependencies from bundling. I don't know browserify well enough, but I know that webpack supports externals (see here), which allow some modules to be loaded from a CDN; a short sketch follows below. As a next step you can remove bundling altogether, but minify and set Cache-Control on all your modules.
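A minimal sketch of the externals idea (the jQuery mapping is only an example; any library loaded from a CDN via its own <script> tag can be excluded from the bundle this way):

// webpack.config.ts (sketch)
import type { Configuration } from "webpack";

const config: Configuration = {
  entry: "./src/index.ts",
  output: { filename: "bundle.js" },
  externals: {
    // import $ from "jquery" resolves to the global `jQuery`
    // provided by a CDN <script> tag instead of being bundled.
    jquery: "jQuery",
  },
};

export default config;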
It is strongly recommended to load CSS/JS/fonts that you use but did not develop yourself from a CDN. You should never merge such resources with your own JavaScript files (which is what you could probably do with browserify now). Loading Bootstrap CSS from your own server is not a good idea. One should rather follow the advice from here and use a CDN instead of downloading all the files locally.
The main reason for using a CDN is easy to understand if you examine the HTTP headers of the response from https://cdnjs.cloudflare.com/ajax/libs/jquery/2.2.1/jquery.min.js, for example. You will find something like cache-control: public, max-age=30672000 and expires: Mon, 06 Mar 2017 21:25:04 GMT. Chrome will typically show Status Code: 200 (from cache) and you will see no traffic over the wire. If you explicitly reload the page (by pressing F5), then you will see a response of 222 bytes with Status Code: 304. In other words, the file is typically not loaded at all. jQuery 2.2.1 stays forever the same; the next version will have another URL. The usage of HTTPS makes sure that the user really loads jQuery 2.2.1. If that's not enough, then you can use https://www.srihash.org/ to calculate the sha384 value and use the extended form of <link> or <script>:
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.2.1/jquery.min.js"
integrity="sha384-8C+3bW/ArbXinsJduAjm9O7WNnuOcO+Bok/VScRYikawtvz4ZPrpXtGfKIewM9dK"
crossorigin="anonymous"></script>
If the user opens your page with that link, then the sha384 hash will be recalculated and verified (by Chrome and Firefox). If the file is not yet in the local cache, then it will be loaded really quickly too. One short remark: loading the same file from https://code.jquery.com/jquery-2.2.1.min.js uses HTTP 1.1 today, but loading it from https://cdnjs.cloudflare.com/ajax/libs/jquery/2.2.1/jquery.min.js uses the HTTP/2 protocol. I recommend testing the protocol when choosing the CDN. You can find here the list of CDNs which now support HTTP/2. In the same way, loading Bootstrap from https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/css/bootstrap.min.css would use HTTP 1.1 today, but one would use HTTP/2 by loading the same data from https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/css/bootstrap.min.css.
I spent a lot of time on CDNs to make clear that the biggest advantage of a CDN is the setting of caching headers on the HTTP response and the usage of immutable URLs. You can do the same for your own modules too.
One should think about the caching duration of every piece of content returned from the server. You can use URLs for your modules that contain the version number of your component (like /script/mycomponent1.1.12341) and change the last part of the version number every time the module changes. You can then set a long enough max-age value in Cache-Control, and your components will be cached by the client's web browser; a sketch of this follows below.
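A rough sketch of that idea, assuming an Express server and the serve-static options maxAge and immutable (the directory and URL prefix are placeholders; on disk the files would carry the version in their names, e.g. modules/mycomponent1.1.12341.js):

import express from "express";

const app = express();
// Files under /script are versioned in their URLs, so the content behind a
// given URL never changes in place and can be cached aggressively.
app.use("/script", express.static("modules", { maxAge: "1y", immutable: true }));
app.listen(8080);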
Finally, I'd recommend verifying that you have installed the latest versions of OpenSSL and nginx. I also recommend checking your web site with http://www.webpagetest.org/ and https://www.ssllabs.com/ssltest/ to be sure that you haven't forgotten any simple steps.

How do I cache control?

How do I get my website to save images to the client's computer and use them, rather than redownloading them on every page reload?
I tried to send header("Cache-Control: max-age=3600"); but that had no effect.
You'll need to send caching headers for the image files, not for your HTML document. You can use the header function only if the files are actually served by a PHP script, not if they are static files handled by the web server. If they are static files, check the documentation for your web server of choice.
Also consider sending an Expires header, and disabling ETags. A sketch of the resulting headers is below.
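The question's stack is PHP, but purely to illustrate which headers the image responses should carry, here is a rough sketch with Express/serve-static (the one-hour lifetime mirrors max-age=3600 from the question; with static files served by Apache or Nginx the same values would go into the server configuration instead):

import express from "express";

const app = express();
// Freshness headers go on the image files themselves, not on the HTML page.
app.use("/images", express.static("images", {
  maxAge: "1h",  // sends Cache-Control: public, max-age=3600
  etag: false,   // disable ETags, as suggested above
  setHeaders: (res) => {
    res.setHeader("Expires", new Date(Date.now() + 3600 * 1000).toUTCString());
  },
}));
app.listen(8080);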
