What's the best way to troubleshoot Akamai headers these days?

Traditionally, I would inspect the Akamai headers by installing a Firefox extension called akamaiheaders.xpi. Unfortunately, I think the last version of Firefox to support this was 3.
As I understand it, this plugin would add special headers to all HTTP requests that Firefox made, which would prompt Akamai to add a bunch of headers to the response (telling me whether the file was cached, where it got it from, etc.). Then, using a tool like HTTPFox or Firebug, I could easily see which assets were cached and which ones were not.
I've searched all over, but I can't find anything as simple and easy to use as that. Does anyone know of anything out there that allows me to track all the Akamai headers for all the assets my browser loads that works in either FF, Chrome, or Safari?

You can use curl and/or wget for this:
curl -H "Pragma: akamai-x-cache-on, akamai-x-cache-remote-on, akamai-x-check-cacheable, akamai-x-get-cache-key, akamai-x-get-extracted-values, akamai-x-get-nonces, akamai-x-get-ssl-client-session-id, akamai-x-get-true-cache-key, akamai-x-serial-no" -IXGET http://www.oxfordpress.com/
or
wget -S -O /dev/null --header="Pragma: akamai-x-cache-on, akamai-x-cache-remote-on, akamai-x-check-cacheable, akamai-x-get-cache-key, akamai-x-get-extracted-values, akamai-x-get-nonces, akamai-x-get-ssl-client-session-id, akamai-x-get-true-cache-key, akamai-x-serial-no" http://www.oxfordpress.com/
If you want to test the staging environment, remember to send the Host header, e.g.:
curl -H "Host: www.oxfordpress.com" -H "Pragma: ..." -IXGET http://oxfordpress.com.edgesuite-staging.net/
One way or another, it always comes down to sending the proper Pragma headers and then reading the response headers.
A list of the Pragma headers, as well as explanations of the X-Cache response header, can be found here: http://webspherehelp.blogspot.com/2009/07/understanding-akamai-headers-to-debug.html.
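If you check URLs this way often, a small shell wrapper saves retyping the Pragma list (a hedged convenience sketch; the akcurl name and the grep filter are mine, and the exact set of X-* debug headers returned depends on how the property is configured):
# akcurl: request Akamai debug headers and print only the X-* diagnostics
akcurl() {
  curl -sI -H "Pragma: akamai-x-cache-on, akamai-x-cache-remote-on, akamai-x-check-cacheable, akamai-x-get-cache-key, akamai-x-get-true-cache-key, akamai-x-serial-no" "$1" \
    | grep -iE '^x-(cache|cache-key|true-cache-key|check-cacheable|serial|cache-remote)'
}
# Usage: akcurl http://www.oxfordpress.com/
# A cache hit typically shows up as something like: X-Cache: TCP_HIT from <edge server>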

I know this question is old, but since I came across it in my search today I thought I'd add an answer for the next person who comes along.
There are a couple of extensions in the Chrome store for this now:
Akamai debug headers, which simply adds the debug headers to the Network panel in the web inspector
Exceda Akamai Headers Extension, which also appears to support purging the cache.
Akamai debug headers is the one I chose and it's working well so far.

You can use a local proxy (e.g. Fiddler or Charles Proxy, my personal favorite) and add the following header to outgoing requests:
Pragma: akamai-x-cache-on, akamai-x-cache-remote-on, akamai-x-check-cacheable, akamai-x-get-cache-key, akamai-x-get-extracted-values, akamai-x-get-nonces, akamai-x-get-ssl-client-session-id, akamai-x-get-true-cache-key, akamai-x-serial-no

If you're using Chrome or Chromium, you can use the Header Hacker or Pragma Header extensions. With either one, you will have to add the Pragma headers manually.

If you can find the akamaiheaders.xpi file, you can just open it and change the maxVersion in install.rdf to 9.*
.xpi files are just ZIP files, and on most machines you can simply add .zip to the filename and double-click it.
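On macOS or Linux, the whole round trip is a couple of commands (a hedged sketch; the exact markup inside install.rdf may differ from the sed pattern shown here, in which case just edit the file by hand):
# Unpack the extension, bump the supported Firefox version, and repack it
mkdir akamaiheaders && cd akamaiheaders
unzip ../akamaiheaders.xpi
# Change <em:maxVersion>...</em:maxVersion> in install.rdf to 9.*
sed -i.bak 's|<em:maxVersion>[^<]*</em:maxVersion>|<em:maxVersion>9.*</em:maxVersion>|' install.rdf
zip -r ../akamaiheaders-patched.xpi . -x '*.bak'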

To debug Akamai headers in the Chrome browser, try the CDN Headers & Cookies extension (Chrome Web Store):
https://chrome.google.com/webstore/detail/cdn-headers-cookies/obldlamadkihjlkdjblncejeblbogmnb
Note: enable 'Load Akamai Headers' in the settings (click the Lego minifig head icon, click the gear, and check 'Load Akamai Headers').
It has also been suggested on the Akamai community:
https://community.akamai.com/community/web-performance/blog/2015/03/31/using-akamai-pragma-headers-to-investigate-or-troubleshoot-akamai-content-delivery

Akamai has a new version of the XPI, which you can download from Luna. There's also another plugin that adds a 'content source' pane to Firebug for a quick reference of what on the page was Akamaised.
As I say, to download both plugins you need to log in to Luna and look under 'Support' > 'More Tools' > 'Browser Extensions'. The XPI isn't publicly accessible.
YMMV, but as far as I recall from colleagues, the Exceda plugin duplicated HTTP requests, which can be a bit messy while debugging.
For Chrome, I find that ModHeader plus a profile that sends the Pragma headers works fine.

Related

Modify Response Header with Chrome - nosniff

I am trying to add the header X-Content-Type-Options: nosniff to my HTTP responses.
But I can't add this header, only modify it if it already exists (on www.google.de, for example).
I used Chrome and tried several plugins like HeadersModify, ModHeader or Requestly: Redirect Url, Modify Headers.
Most probably, you are seeing a Chrome developer tools bug/limitation:
Any modification in response headers is not visible in Chrome developer tools
In Requestly, you can set up a response header rule that adds X-Content-Type-Options: nosniff.
You can also apply the rule to selected requests only; leaving the URL field empty applies it to all requests. Adjust it according to your use case.
Feel free to reach out to requestly.extension#gmail.com for any issues.

Node.js - Bundler for HTTP/2

I'm currently using Babel to transform ES6 code to ES5 and browserify to bundle it for use in the browser. Now I've begun using an HTTP/2 server (nginx).
HTTP/2 is more effective when it can load multiple small files instead of one big bundle.
What is the best way to serve multiple JS files instead of one big bundle?
I know that SystemJS can load multiple files in development without bundling, and for production you can use a DepCache to define the dependency trees of the modules you are importing:
https://github.com/systemjs/systemjs/blob/master/docs/production-workflows.md
This approach would require you to ditch browserify and switch to SystemJS, since browserify only produces bundles.
I see that you haven't received an answer to your question yet, so I'll try to help, even though HTTP/2 is new to me too (which explains the length of this answer :-)).
Good information about HTTP/2 can be found at https://blog.cloudflare.com/http-2-for-web-developers/. To summarize briefly:
stop concatenating files
stop inlining assets
stop sharding domains
continue minifying CSS/JavaScript files
continue loading from CDNs
continue DNS prefetching via <link rel='dns-prefetch' href='...' /> included in <head>
...
I want to add two points about the importance of setting the Cache-Control and Link HTTP headers:
Think about setting caching headers (especially Cache-Control: max-age, Expires and ETag) on all content of your page. See details below. I strongly recommend reading the Caching Tutorial.
Set the Link HTTP header to use the server push feature of HTTP/2.
Setting Link: headers is important for using the server push feature of HTTP/2 (see here and here). RFC 5988 and Section 19.6.1.2 of RFC 2068 describe the feature, which already existed in HTTP/1.1. Everybody knows Content-Type: application/json, but in the same way one can set the less well-known Link: <...>; rel=prefetch header, described here. For example, one can use
Link: </app/script.js>; rel=preload; as=script
Link: </fonts/font.woff>; rel=preload; as=font
Link: </app/style.css>; rel=preload; as=style
Such headers, set on the response for an HTML page (like index.html), tell the HTTP server to push those resources together with the response for your HTML page. As a result you save unneeded round-trips and later requests (after parsing the HTML), and the resources are displayed immediately. You can consider setting Link headers for all images on your page to improve its perceived load time. See here for additional information, with nice pictures that demonstrate the advantage of HTTP/2 server push. If you use PHP, this code could be interesting for you.
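As a quick sanity check before relying on push (a hedged example; www.example.com is a placeholder for your own host), you can confirm that the Link headers are actually present in the response:
# Show any Link headers the server sends with the HTML page
curl -sI https://www.example.com/index.html | grep -i '^link:'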
Most web developers perform some of these optimization steps directly or indirectly. The steps happen either during the build process or by setting HTTP headers on responses. You have to review these processes, switch some off and add others. I'll try to summarize my results.
You can consider using webpack instead of browserify to exclude some dependencies from bundling. I don't know browserify well enough, but I know that webpack supports externals (see here), which allows some modules to be loaded from a CDN. As a next step you can remove bundling altogether, but minify and set Cache-Control on all your modules.
It's strongly recommended to load the CSS/JS/fonts that you use but didn't develop yourself from a CDN. You should never merge such resources into your own JavaScript files (which is probably what you do with browserify now). Loading Bootstrap CSS from your own server is not a good idea. You should rather follow the advice from here and use a CDN instead of downloading all the files locally.
The main reason for using a CDN is easy to understand if you examine the HTTP headers of the response from https://cdnjs.cloudflare.com/ajax/libs/jquery/2.2.1/jquery.min.js, for example. You will find something like cache-control: public, max-age=30672000 and expires: Mon, 06 Mar 2017 21:25:04 GMT. Chrome will typically show Status Code: 200 (from cache) and you will see no traffic over the wire. If you explicitly reload the page (by pressing F5), you will see a 222-byte response with Status Code: 304. In other words, the file is typically not loaded over the network at all. jQuery 2.2.1 stays the same forever; the next version will have another URL. The use of HTTPS makes sure that the user really loads jQuery 2.2.1. If that's not enough, you can use https://www.srihash.org/ to calculate the sha384 value and use the extended form of <link> or <script>:
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.2.1/jquery.min.js"
integrity="sha384-8C+3bW/ArbXinsJduAjm9O7WNnuOcO+Bok/VScRYikawtvz4ZPrpXtGfKIewM9dK"
crossorigin="anonymous"></script>
If the user opens your page with that link, the sha384 hash will be recalculated and verified (by Chrome and Firefox). If the file is not yet in the local cache, it will still be loaded really quickly. One short remark: loading the same file from https://code.jquery.com/jquery-2.2.1.min.js uses HTTP/1.1 today, but loading it from https://cdnjs.cloudflare.com/ajax/libs/jquery/2.2.1/jquery.min.js uses HTTP/2. I recommend testing the protocol when choosing a CDN; you can find here a list of CDNs that currently support HTTP/2. In the same way, loading Bootstrap from https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/css/bootstrap.min.css uses HTTP/1.1 today, whereas loading the same data from https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/css/bootstrap.min.css uses HTTP/2.
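To illustrate both points (a hedged sketch; the openssl pipeline is the standard way to compute an SRI hash, and the --http2 flag plus the %{http_version} write-out variable need a reasonably recent curl built with HTTP/2 support):
# Compute the sha384 integrity value yourself (same result srihash.org gives)
curl -s https://cdnjs.cloudflare.com/ajax/libs/jquery/2.2.1/jquery.min.js \
  | openssl dgst -sha384 -binary | openssl base64 -A
# Check which protocol version the CDN actually negotiates
curl -sI --http2 -o /dev/null -w '%{http_version}\n' https://cdnjs.cloudflare.com/ajax/libs/jquery/2.2.1/jquery.min.js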
I spent a lot of time on CDNs to make clear that their main advantage is the setting of caching headers on the HTTP response and the use of immutable URLs. You can do the same for your own modules.
You should think about how long every piece of content returned from the server can be cached. You can use URLs for your modules that contain a version number (like /script/mycomponent1.1.12341) and change the last part of the version number every time the module changes. Then you can set a long enough max-age value in Cache-Control, and your components will be cached by the client's web browser.
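A quick way to verify the caching headers on such a versioned URL (a hedged example; the host and path are placeholders following the naming scheme above):
curl -sI https://www.example.com/script/mycomponent1.1.12341 | grep -iE '^(cache-control|expires|etag):'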
Finally, I'd recommend verifying that you have installed the latest versions of OpenSSL and nginx. I also recommend checking your web site at http://www.webpagetest.org/ and https://www.ssllabs.com/ssltest/ to make sure you haven't forgotten any simple steps.

Chrome extension background XMLHttpRequest proxy server

I have a background task in a Chrome extension that performs some polling/checking. In error cases, I want to retry the check using a different CDN server to verify whether it's a site-wide problem or just one CDN node affected. The challenge is how to control which CDN node the request is sent to.
e.g. let's say I'm checking www.company.com and typically that will be served by server21.cdn.co
Now if that fails I want to check server5.cdn.co and server10.cdn2.co for the same content to see if there's a correlation.
These checks are done using XMLHttpRequest but I can't find a way to specify which host/proxy to use per-request.
I wouldn't want to "hijack" the entire browser's proxy server settings because it would cause all other pages/tabs to fail.
If it's load-balancing performed on the DNS level (which is most probable), you can't affect which server you actually contact at all.
To clarify, this can't be done because XMLHttpRequest forbids overriding the Host header in requests. See https://security.stackexchange.com/questions/46702/how-can-i-control-the-content-of-the-http-host-header-in-requests-issued-from-my/ for an explanation of why.
Conceptually/technically it's easy to do if it weren't for this security issue. For example with curl:
curl --verbose --header 'Host: www.example.com' 'http://10.1.1.36/the_url_to_test'
But as mentioned in the question, we're stuck inside a Chrome extension so curl is not an option!
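As an aside for anyone testing from a shell rather than from inside an extension (a hedged sketch; the hostname and IP are placeholders reusing the values above), curl's --resolve option pins a hostname to a chosen node for a single request, so the correct Host header is sent automatically:
# Send the request for www.example.com to the node at 10.1.1.36 only
curl --verbose --resolve www.example.com:80:10.1.1.36 'http://www.example.com/the_url_to_test'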

How do I know if IIS is really Compressing my HTML?

Our IIS server has Dynamic and Static HTML Compression enabled, but when I browse to our website and view the Response Headers in Fiddler, I only see the "Content-Encoding: gzip" header for one resource (a flash file).
Why would the other response types not have this header? Does it mean that compression is NOT working for the other responses?
The only way to be 100% sure that compression is active is to compare the size of the downloaded resource against the original file on the server. The network tab of the Firebug extension can help you here.
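From the command line, the same comparison can be scripted with curl (a hedged sketch; the URL is a placeholder, and the first command only receives gzip because we ask for it explicitly without letting curl decompress):
# Bytes on the wire when gzip is requested (curl does not decompress without --compressed)
curl -s -H 'Accept-Encoding: gzip' http://www.example.com/ | wc -c
# Bytes on the wire without compression
curl -s http://www.example.com/ | wc -c
# A large difference between the two numbers means compression is active for that resource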
It looks like our company network was actually stripping out the Content-Encoding header. (I have no idea why). When I browse from home the gzipping seems to work fine. This post on StackExchange.com helped me figure it out.

How to find out all files that my browser loads while accessing a webpage?

I can use Firebug and it will show lots of info about the files that are loaded, and even HTTP return codes, but it doesn't seem to show all of them.
For example, I visit a page that loads a Flash file. Firebug will show that the file was loaded, but if that SWF itself loads other SWFs and accesses other resources, those will not be shown in Firebug. The same goes for AJAX calls.
So I would like to know how I can monitor ALL activity while browsing a page: what files are loaded, from where, etc.
One of the tools I use for inspecting requests and responses is Fiddler. It works very well and it is free. From their homepage http://www.fiddlertool.com/fiddler/
Fiddler is a HTTP Debugging Proxy which logs all HTTP traffic between your computer and the Internet. Fiddler allows you to inspect all HTTP Traffic, set breakpoints, and "fiddle" with incoming or outgoing data. Fiddler includes a powerful event-based scripting subsystem, and can be extended using any .NET language.
I have also used IEWatch, however IEWatch is not free and only works for IE.
You could set up a simple local HTTP proxy and pass all your requests through that. Then monitor the proxy log file to see what was requested.
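For example (a hedged sketch; mitmdump is the console tool from the mitmproxy project, and HTTPS interception additionally requires installing its CA certificate in the browser):
# Start a logging proxy on port 8080 and save every flow to a file
mitmdump -p 8080 -w /tmp/flows
# Then point the browser's proxy settings at localhost:8080 and browse the page;
# every request, including those made by plugins that honour the proxy settings, is recorded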
I use this:
http://www.httpwatch.com/
There is a Firefox add-on called lori (life-of-request info) which does this: it displays the total number of bytes and other stats on the toolbar, and if you right-click it, it offers to copy the detailed stats, including the URLs themselves, to the clipboard. It works for AJAX requests; I am not sure about SWF, though.
Also, the resource inspector in Webkit browsers like Safari or Chrome will do the same for you.
Firebug does record AJAX requests. The Safari Web Inspector would be the next thing to try, but I don't think any browser tools will record data sent by Flash. For that, a packet recorder like Wireshark would be better.
