I've set up an Azure CDN on the Standard Akamai tier, with a Web App as the origin. I've configured the Cache-Control header in my web.config as follows:
<clientCache cacheControlMode="UseMaxAge" cacheControlCustom="public" cacheControlMaxAge="30.00:00:00" />
In my markup, I'm hitting my image file with the following code:
<img src="https://xxxx.azureedge.net/Content/Images/Turtle.jpg?v=1.0.0.27987">
When I do so, I get the following response headers:
cache-control:public, max-age=2591903
content-length:2321435
content-type:image/jpeg
date:Mon, 03 Apr 2017 19:34:23 GMT
etag:"2e7a1f1690a9d21:0"
last-modified:Thu, 30 Mar 2017 19:59:05 GMT
pragma:no-cache
server:Microsoft-IIS/8.0
status:200
vary:Accept-Encoding
x-powered-by:ASP.NET
Notice the pragma:no-cache. I have NO IDEA where that is coming from. It is definitely NOT in my origin's response that populates the cache. If I hit the origin directly, I see the following headers for the same image:
Accept-Ranges:bytes
Cache-Control:public,max-age=2592000
Content-Length:2321435
Content-Type:image/jpeg
Date:Mon, 03 Apr 2017 19:41:50 GMT
ETag:"2e7a1f1690a9d21:0"
Last-Modified:Thu, 30 Mar 2017 19:59:05 GMT
Server:Microsoft-IIS/8.0
X-Powered-By:ASP.NET
This means that when hitting the CDN, instead of the browser serving this image from its HTTP cache, it's revalidating with the ETag and I'm wasting an HTTP round trip. I'm assuming that the culprit is the pragma:no-cache header, which is overriding the cache-control header. My questions are:
Why is the Azure CDN adding this pragma:no-cache header when serving my image?
Is the pragma:no-cache the reason we're seeing the 304/ETag validation instead of the browser serving from its HTTP cache?
Thanks!
Edit: I've also tried removing the clientCache tag from the web.config. I'm still seeing an ETag revalidation for the image instead of it being served from the browser's HTTP cache.
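To reproduce the comparison above, the two sets of headers can be pulled with curl (the CDN hostname is the placeholder from the question; the azurewebsites.net origin hostname is my assumption about the Web App's address):

# Headers as served through the CDN endpoint
curl -sI "https://xxxx.azureedge.net/Content/Images/Turtle.jpg?v=1.0.0.27987"

# Headers as served directly by the Web App origin (assumed hostname)
curl -sI "https://xxxx.azurewebsites.net/Content/Images/Turtle.jpg?v=1.0.0.27987"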
I'm attempting to use the gitHubAutoDeployer function provided by:
https://github.com/GoogleCloudPlatform/community/blob/master/tutorials/cloud-functions-github-auto-deployer/index.md
I followed the tutorial, but upon deploying and trying to trigger the function, I discovered that the response to my webhook was a Google OAuth screen (visible in the GitHub project under Settings > Webhooks > Recent Deliveries). I went ahead and replayed the request interactively in a browser so I could provide access. After providing my credentials, I was redirected to the following:
<html><head>
<meta http-equiv="content-type" content="text/html;charset=utf-8">
<title>403 Forbidden</title>
</head>
<body text=#000000 bgcolor=#ffffff>
<h1>Error: Forbidden</h1>
<h2>Your client does not have permission to get URL <code>/gitHubAutoDeployer</code> from this server.</h2>
<h2></h2>
</body></html>
Now, Recent Deliveries in GitHub shows a 302 response:
Content-Length: 2
Content-Type: text/html
Date: Mon, 30 Mar 2020 15:02:27 GMT
Location: https://accounts.google.com/ServiceLogin?service=ah&passive=true&continue=https://appengine.google.com/_ah/conflogin%3Fcontinue%3Dhttps://us-central1-REDACTED.cloudfunctions.net/gitHubAutoDeployer
Server: Google Frontend
X-Cloud-Trace-Context: d3333e1490ee3ca522c37243673931ed
What am I doing wrong? Any thoughts?
UPDATE: I opened an issue on the project's GitHub - there's a little more information available over there:
https://github.com/GoogleCloudPlatform/community/issues/1202
I also followed the tutorial in the project you referenced and got the same result. However, I achieved the expected behavior using Cloud Build with continuous deployment instead.
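A minimal sketch of that setup, assuming a Cloud Build trigger on the repository and a Node.js function (the runtime, source path, and --allow-unauthenticated flag are my assumptions, not part of the tutorial): a cloudbuild.yaml at the repo root redeploys the function on every push:

steps:
# Redeploy the HTTP function from the current repo contents on each push
- name: 'gcr.io/cloud-builders/gcloud'
  args:
  - functions
  - deploy
  - gitHubAutoDeployer
  - --runtime=nodejs10          # assumed runtime
  - --trigger-http
  - --allow-unauthenticated     # lets the GitHub webhook invoke it anonymously
  - --source=.

The --allow-unauthenticated flag is also consistent with the 403 above: an HTTP function without public invoker permission answers anonymous callers with exactly that kind of login redirect.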
I am using ServiceStack with SharpPages to render dynamic content. For "reasons", I need to set the CORS headers Access-Control-Allow-Origin and Access-Control-Allow-Credentials, supporting multiple subdomains.
My SharpPages feature is enabled with:
var pagesFeature = new SharpPagesFeature()
{
    ScriptMethods = { new UrlScriptMethods(), new DbScriptsAsync() },
};
pagesFeature.Args[ServiceStack.Script.ScriptConstants.DefaultDateFormat] = "MM/dd/yyyy hh:mm";
pagesFeature.Args[ServiceStack.Script.ScriptConstants.DefaultDateTimeFormat] = "MM/dd/yyyy hh:mm";
Plugins.Add(pagesFeature);
I'm hosting on IIS, so I could use web.config as below, but I can only specify one domain this way. If I specify multiple, XMLHttpRequest calls complain that multiple values are set for that header.
<system.webServer>
  <httpProtocol>
    <customHeaders>
      <add name="Access-Control-Allow-Origin" value="https://subdomain.domain.com" />
    </customHeaders>
  </httpProtocol>
</system.webServer>
Likewise, I could have used the ServiceStack HostConfig property GlobalResponseHeaders, but it has the same single-value limitation.
I've even tried ServiceStack PreRequestFilters, but those aren't called unless a service method is called. Here is my filter:
this.PreRequestFilters.Add((httpReq, httpResp) =>
{
    var origin = httpReq.Headers.Get(HttpHeaders.Origin);
    if (!string.IsNullOrWhiteSpace(origin))
    {
        httpResp.AddHeader(HttpHeaders.AllowOrigin, origin);
        httpResp.AddHeader(HttpHeaders.AllowCredentials, "true");
    }
});
Finally, StaticFileHandler.ResponseFilter won't work, since I'm using a view engine and not static files.
So, how can I add custom response headers to View Pages (SharpPages in particular, possibly Razor pages as well) in ServiceStack?
The raw request is below. Interestingly, I'm requesting https://computer.domain but Firefox translates that to localhost. Regardless, the favicon.ico request DOES get trapped by the filter. The request below DOES NOT.
GET /forms/newsletter HTTP/1.1
Host: localhost:44308
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:67.0) Gecko/20100101 Firefox/67.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate, br
Connection: keep-alive
Cookie: ss-pid=wCR4INmjLXpBnbsBoe2n
Upgrade-Insecure-Requests: 1
Pragma: no-cache
Cache-Control: no-cache
The raw response is:
HTTP/2.0 200 OK
cache-control: private
content-type: text/html
content-encoding: gzip
vary: Accept-Encoding
server: Microsoft-IIS/10.0
x-aspnet-version: 4.0.30319
x-sourcefiles: =?UTF-8?B?QzpcVXNlcnNcamtsZW1tYWNrXFNvdXJjZVxSZXBvc1xPQlJDX0JNU1xCTVMuV2ViLkJvdHRsZURyb3BDZW50ZXJzXEJNUy5XZWIuQm90dGxlRHJvcENlbnRlcnNcZm9ybXNcbmV3c2xldHRlcg==?=
x-powered-by: ASP.NET
access-control-allow-origin: *
date: Tue, 11 Jun 2019 16:28:34 GMT
content-length: 862
X-Firefox-Spdy: h2
The PreRequestFilters should now be fired for all Razor and Sharp Pages requests from the latest v5.5.1+ that's now available on MyGet.
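With that fix, the filter from the question can be extended to whitelist multiple subdomains. A minimal sketch for inside AppHost.Configure(), assuming a hypothetical allowedOrigins set (the subdomain names are illustrative):

// Hypothetical whitelist of subdomains allowed to make credentialed calls
var allowedOrigins = new HashSet<string>(StringComparer.OrdinalIgnoreCase)
{
    "https://app.domain.com",
    "https://admin.domain.com",
};

this.PreRequestFilters.Add((httpReq, httpResp) =>
{
    var origin = httpReq.Headers.Get(HttpHeaders.Origin);
    if (!string.IsNullOrWhiteSpace(origin) && allowedOrigins.Contains(origin))
    {
        // Echo the specific origin back; '*' cannot be combined with credentials
        httpResp.AddHeader(HttpHeaders.AllowOrigin, origin);
        httpResp.AddHeader(HttpHeaders.AllowCredentials, "true");
    }
});

Echoing the matched origin rather than '*' is what makes Access-Control-Allow-Credentials: true acceptable to browsers.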
In the meantime, I've pieced together a hybrid workaround. For a particular sub-folder, I'm using web.config to allow all remote origins. For Service-based calls, I'm using a hand-rolled version of CorsFeature (the stock feature can't accommodate the small amount of custom logic I need).
<!-- Applies the ACAO header for specific view pages -->
<location path="views/subfolder">
  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <add name="Access-Control-Allow-Origin" value="*" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</location>
I need help with the SharePoint 2013 REST API and an Ajax call.
I am trying to read list items from a publishing site into a team site. The two sites are in different site collections.
The code below works fine in Internet Explorer but not in Google Chrome.
$(document).ready(function () {
    $.support.cors = true;
    $.ajax({
        url: "http://icon.heart.com/WorkTools/Organization/Claim/_api/web/lists/getByTitle('Claims Links')/items?$top=200",
        type: "GET",
        headers: { "accept": "application/json;odata=verbose" },
        dataType: "json",
        success: function (data) { alert("pass"); },
        error: function (data) { alert("Fail"); }
    });
});
The response has HTTP status code 401. The error from the $.ajax request is:
Failed to load resource: the server responded with a status of 401 (Unauthorized)
Error 2:
XMLHttpRequest cannot load 'url'. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'url' is therefore not allowed access.
I don't have access to the servers; I can only work from a Script Editor web part on a SharePoint 2013 page.
Most likely it occurs because Chrome refuses to set an Origin header for this CORS request; it won't even let you explicitly override the Origin header. Basically, this causes the server to see Origin: null, which results in a 403 in most cases. IE and Firefox apparently have no such constraint.
As a workaround in case of SharePoint On-Premises you could set a custom header in web.config:
<customHeaders>
  <add name="Access-Control-Allow-Origin" value="*" />
</customHeaders>
or specify the domain explicitly:
<customHeaders>
  <add name="Access-Control-Allow-Origin" value="http://anotherintra.contoso.com" />
</customHeaders>
Using OOB scripts, this will not be fixed; the changes need to be made on the server side, as specified by Vadim Gremyachev. Also, while it might work in IE8, IE10 will show a security pop-up asking for permission to access data from another domain.
headers: {
    "Accept": "application/json; odata=verbose",
    "X-RequestDigest": $("#__REQUESTDIGEST").val()
},
As explained in Work with __REQUESTDIGEST, some requests require adding the request digest. Even though this is a GET request and the explanation on the Microsoft pages is for "non-GET" requests, it also solved some unauthorized issues with my SharePoint API GET calls.
It is possible the reason IE works and Chrome does not is how the respective browsers handle your credentials. To provide your credentials in Chrome, add the following code to your $.ajax call:
xhrFields: {
    withCredentials: true
},
See:
Cross domain ajax call windows authentication working in chrome and not working in Firefox
Sending credentials with cross-domain posts?
Is it possible to cache dynamic pages, especially the home page?
I plan to reduce the load on the database.
Caching static files works perfectly.
Response headers:
Accept-Ranges:bytes
Age:0
Cache-Control:no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Connection:keep-alive
Content-Type:text/html; charset=utf-8
Date:Tue, 09 Dec 2014 17:07:13 GMT
Expires:Thu, 19 Nov 1981 08:52:00 GMT
Pragma:no-cache
Transfer-Encoding:chunked
Via:1.1 varnish-v4
x-Cache:uncached
X-Varnish:295421
My default VCL file: http://notepad.cc/vaokodde9
Your backend is screaming that it doesn't want anyone to cache the page:
Cache-Control:no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Expires:Thu, 19 Nov 1981 08:52:00 GMT
Pragma:no-cache
It's setting Cache-Control directives that disallow caching, an Expires header far in the past, and Pragma: no-cache. You either have to fix the backend to send headers that allow caching, or, if you are sure you won't break anything, start working around these headers in VCL, as sketched below.
I would choose the first option, and tackle the cookies issue later.
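For the VCL route, a minimal sketch of that workaround in VCL 4 (the 5-minute TTL is an arbitrary assumption, and forcing caching like this can break logged-in or personalized pages):

sub vcl_backend_response {
    # Strip the backend's anti-caching headers and force a TTL.
    # This caches pages the application asked not to cache - use with care.
    unset beresp.http.Cache-Control;
    unset beresp.http.Pragma;
    unset beresp.http.Expires;
    set beresp.ttl = 5m;
}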
I think Varnish by default does not cache pages with Cookies. Maybe that is your problem (it looks like you have a PHPSESSID and some other stuff)?
See the Varnish documentation: https://www.varnish-cache.org/trac/wiki/VCLExampleCacheCookies
Try configuring your webserver to not set any cookies, or configure Varnish to ignore them, as in the sketch below (note that this may not make sense and may break your website!).
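A minimal sketch of the latter, assuming no page on the site actually needs cookies (sessions and logins will stop working for cached pages):

sub vcl_recv {
    # Drop client cookies so Varnish treats requests as cacheable.
    unset req.http.Cookie;
}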
Thank you all. I managed to solve the problem by studying a little about how cookies behave in Varnish.
In IIS 7.5, you can add static HTTP Response headers, but I want to add an "Expires" header that always specifies a date that is 7 days in the future.
I'm running PHP 5.4, so I'd like a solution that does this by editing the web.config file rather than some C# code solution.
I know how to add the header using PHP, but that won't help with HTTP headers for static image files (jpg, gif, png, etc.).
The header should look something like this:
Expires: Thu, 31 May 2012 10:59:25 GMT
How can I make it dynamically always show a date and time 7 days in the future?
Edit:
Notice that I have the Expires header that I want on my PHP files:
http://web-sniffer.net/?url=http%3A%2F%2Fwww.bestds.com
However, I'm not able to specify a date 7 days ahead for the "Expires" key on PNG files (for example), so I'm having to use a static date far in the future:
http://web-sniffer.net/?url=http%3A%2F%2Fwww.bestds.com%2Fimage%2Ftlogo.png
This is a standard feature of IIS. The HTTP Response Headers module allows you to set this common header. This results in the following web.config:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <staticContent>
      <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>
You should do this only in the directories where you want this header to be sent, typically only directories with static content.
You can only add a dynamic Expires header using program code.
Source: the Microsoft IIS site.
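If you do need the dynamic Expires header, here is a minimal PHP sketch of emitting a stamp 7 days out (this only covers responses that pass through PHP, not static files served directly by IIS):

<?php
// Send an Expires header 7 days in the future, plus a matching max-age.
$sevenDays = 7 * 24 * 60 * 60;
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $sevenDays) . ' GMT');
header('Cache-Control: public, max-age=' . $sevenDays);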
You should use Cache-Control max-age instead, as suggested in the other answer.