Varnish 4 does not honor Cache-Control: must-revalidate - varnish

I'm trying to make Varnish work with Last-Modified headers, but no matter what I do my page is cached for 120s, and Varnish never revalidates with the backend.
My backend is sending these headers:
Cache-Control: must-revalidate, proxy-revalidate, public, stale-while-revalidate=0
Last-Modified: Fri, 22 Jan 2016 03:32:33 GMT
And when I log the TTL of the object on a hit, its value is always 120s.
I am using the default VCL config of Varnish 4.
Edit: after some searching, I found that 120s is the default TTL value of Varnish. But why is it ignoring Last-Modified?

Set the "s-maxage" or "max-age" attributes of the Cache-Control header:
beresp.ttl is initialized with the first value it finds among:
The s-maxage variable in the Cache-Control response header field
The max-age variable in the Cache-Control response header field
The Expires response header field
The default_ttl parameter.
See: http://book.varnish-software.com/4.0/chapters/VCL_Built_in_Subroutines.html#the-initial-value-of-beresp-ttl
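This explains the 120s observed in the question: the backend's Cache-Control carries neither s-maxage nor max-age, and there is no Expires header, so Varnish falls back to default_ttl. The precedence can be sketched as a hypothetical JavaScript helper (illustrative only; Varnish's real logic lives in its C core):

```javascript
// Hypothetical helper mirroring the beresp.ttl precedence list above.
const DEFAULT_TTL = 120; // Varnish's default_ttl parameter, in seconds

function initialTtl(cacheControl, secondsUntilExpires) {
  if (cacheControl) {
    // 1. s-maxage wins over everything else
    let m = cacheControl.match(/s-maxage=(\d+)/);
    if (m) return Number(m[1]);
    // 2. then max-age
    m = cacheControl.match(/max-age=(\d+)/);
    if (m) return Number(m[1]);
  }
  // 3. then the Expires header (expressed here as seconds from now)
  if (secondsUntilExpires != null) return secondsUntilExpires;
  // 4. finally the default_ttl parameter
  return DEFAULT_TTL;
}

// The header from the question matches none of the first three sources:
initialTtl("must-revalidate, proxy-revalidate, public, stale-while-revalidate=0"); // → 120
```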

I got an answer from the Varnish mailing list: to emulate the "must-revalidate" behavior, this piece of VCL must be added:
sub vcl_backend_response {
    if (beresp.http.cache-control ~ "must-revalidate") {
        set beresp.ttl = 1s;
        set beresp.grace = 0s;
        set beresp.keep = 1w;
    }
}
It only works on Varnish 4.
I quote the reason for the 1s TTL:
This way, you'd only cache for 1 second (don't set it to 0, or all the
requests for this object would be done sequentially), but will keep
the object for a week, revalidating it each time it is requested and
its ttl is expired.

Related

How to display the original page cache when there is a query parameter?

I think I solved the first part, not caching pages with query parameters, using:
if (bereq.url ~ "/\?.*$") {
    set beresp.uncacheable = true;
    set beresp.ttl = 120s;
    return (deliver);
}
However, that doesn't show the cached version of the original page when visiting the page with queries.
Any advice is appreciated. Thanks.
What should be added to the varnish-vcl configuration to make this possible?
Sorry, I'm not sure what you need. Your code is actually leveraging the hit-for-miss capabilities of Varnish, but you'd be better off just doing this:
sub vcl_recv {
    if (req.url ~ "\?") {
        return (pass);
    }
}
The subtle difference: in your version, Varnish creates a cache object (just the metadata, actually) for 120 seconds to remember that the resource shouldn't be cached. But since you know from the start that it shouldn't be cached, you can just return (pass) in vcl_recv.
Now, if what you want is stripping the query string so that all requests actually return the main page, you have to do this instead:
sub vcl_recv {
    set req.url = regsub(req.url, "\?.*", "");
}
This finds the first question mark and everything to the right of it and replaces it with nothing, effectively removing the query string.
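Outside VCL, the substitution behaves like this plain JavaScript sketch (for illustration only):

```javascript
// Same effect as the VCL regsub above: drop the first "?" and
// everything after it, leaving only the path.
function stripQuery(url) {
  return url.replace(/\?.*/, "");
}

stripQuery("/page?utm_source=x&ref=y"); // → "/page"
```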

Deactivate HTTP cache in Shopware 5 in a plugin

In a plugin I need to deactivate the Shopware HTTP cache for two categories. The manual says I should emit this event:
Shopware()->Events()->notify(
    'Shopware_Plugins_HttpCache_InvalidateCacheId',
    array(
        'cacheId' => 'a14',
    )
);
The a14 stands for the article with the ID 14. According to the manual, a c can be used to uncache category pages. So I put this in my plugin's Bootstrap.php to stop caching of the categories with the IDs 113 and 114:
public function afterInit()
{
    Shopware()->Events()->notify(
        'Shopware_Plugins_HttpCache_InvalidateCacheId',
        array(
            'cacheId' => 'c113',
            'cacheId' => 'c114',
        )
    );
}
I have emptied the cache manually on all levels, but nothing happens, neither good nor bad: no error is thrown, and the categories are not removed from the cache when it is rebuilt after emptying. Does anybody have a clue what I should change?
Here is the complete solution, thanks to Thomas' answer; everything is done in the Bootstrap.php:
First, subscribe to the PostDispatch_Frontend_Listing event:
public function install()
{
    $this->subscribeEvent('Enlight_Controller_Action_PostDispatch_Frontend_Listing', 'onPostDispatchListing');
    return true;
}
Second, create a function that sends a no-cache header under certain conditions:
public function onPostDispatchListing(Enlight_Event_EventArgs $arguments)
{
    $response = $arguments->getResponse();
    $categoryId = (int)Shopware()->Front()->Request()->sCategory;
    if ($categoryId === 113 || $categoryId === 114) {
        $response->setHeader('Cache-Control', 'private, no-cache');
    }
}
Third, install or reinstall the plugin so that the event subscription is persisted in the database.
I think the best way is to add a plugin which adds a Cache-Control: no-cache header to the response for the specified categories. When this header is set, the categories are not stored in the HTTP cache, and you don't need to invalidate them.
You can listen to the Enlight_Controller_Action_PostDispatch_Frontend_Listing event and check if the category id is the one you need and add the header to the response.
$response->setHeader('Cache-Control', 'private, no-cache');

Gmail API playground: Send method, converted MIME raw header not populating email fields on send

I'm using the Google OAuth 2.0 Playground and attempting to send an email. The authentication is working fine. Here is the very simple message I'm trying to send (email addresses changed to prevent spam):
to: 'Michael To' <FakeMichael#gmail.com>
from: 'John From' <JohnF#mydomain.com>
subject: 'Test Message'
htmlBody: '<b>HI!</b><br>Test Message'
I convert that to Base64 RFC 822 using VBA, which gets me this (I've tried swapping the "+" and the "-" per other Stack Overflow posts, but to no avail):
dG86ICdNaWNoYWVsIFRvJyA8RmFrZU1pY2hhZWxAZ21haWwuY29tPg1mcm9tOiAnSm9obiBGcm9tJyA8Sm9obkZAbXlkb21haW4uY29tPg1zdWJqZWN0OiAnVGVzdCBNZXNzYWdlJw1odG1sQm9keTogJzxiPkhJITwvYj48YnI-VGVzdCBNZXNzYWdlJw==
In the Playground my method is POST, and I've added two headers:
raw: the Base64 string above (no quotes or anything)
Content-Type: message/rfc822 <I added this because I kept getting a different error; adding it prevented that error>
Request URI (removed the https because SO won't let me post more than two links): //www.googleapis.com/upload/gmail/v1/users/me/messages/send
I click "send the request" and get an OK response:
Here is my request:
POST /upload/gmail/v1/users/me/messages/send HTTP/1.1
Host: www.googleapis.com
Raw: <string above>
Content-length: 0
Content-type: message/rfc822
Authorization: Bearer <my token>
Response:
HTTP/1.1 200 OK
Alternate-protocol: 443:quic,p=1
Content-length: 91
Expires: Fri, 01 Jan 1990 00:00:00 GMT
Vary: Origin, X-Origin
Server: UploadServer ("Built on Jun 6 2015 11:14:45 (1433614485)")
Etag: "YAnoF_dHYOakPARISZQhTvRsqto/nwevNUuzaUU_lB19L-UhrwaaUSM"
Pragma: no-cache
Cache-control: no-cache, no-store, max-age=0, must-revalidate
Date: Wed, 10 Jun 2015 15:59:36 GMT
Content-type: application/json; charset=UTF-8
{
    "labelIds": [
        "SENT"
    ],
    "id": "14dde32bc92c9XYZ",
    "threadId": "14dde32bc92c9XYZ"
}
However, when I go to my Gmail Sent Mail folder, the message is there but the To, Subject, and Body fields are all empty: see screenshot.
I have to imagine this is something simple, but as I'm new to the Google Gmail API, MIME, and dealing with Raw Base64 stuff, I'm not having much luck.
Thanks in advance for any assistance.
--- EDIT PER THOLLE'S Response ---
That helps! I removed the raw base64 string header and put:
From: 'John From' <JohnF#mydomain.com>
Subject: Test Message
To: 'Michael To' <FakeMichael#gmail.com>
Test Message
into the "Enter request body" and it sends, which is great!
Three follow-up questions:
Are there any security risks or limitations (max length? I see there might be a 2 MB limit, but that would be a lot of text) to sending it this way (in the body) as opposed to a raw Base64 string in the header?
(I'll dig more on this.) How do I make the message body HTML? Does the content type "Content-Type: message/rfc822" prevent me from sending HTML? Sending HTML is a requirement for this application, and I can't have two content types; is there an HTML parameter I can use, or am I out of luck?
(I'll do homework on this as well.) How do I include an attachment, say a PDF file, with the email?
Thanks again!
I think you are violating a few minor details of the RFC 822 standard:
It is recommended that, if present, headers be sent in the order "Return-Path", "Received", "Date", "From", "Subject", "Sender", "To", "cc", etc.
I can't find it for the life of me, but I also think that the headers have to have their first character capitalized. Try this:
From: John From <JohnF#mydomain.com>
Subject: Test Subject
To: Michael To <FakeMichael#gmail.com>
Test Message
You also don't want to send the base64-encoded mail if you choose message/rfc822 as your Content-Type. Just supply the example mail above as is.
POST /upload/gmail/v1/users/me/messages/send HTTP/1.1
Host: www.googleapis.com
Content-length: 108
Content-type: message/rfc822
Authorization: Bearer {YOUR_ACCESS_TOKEN}
From: John From <JohnF#mydomain.com>
Subject: Test Subject
To: Michael To <FakeMichael#gmail.com>
Test Message
If you want HTML, just modify your message to this:
From: John From <JohnF#mydomain.com>
Subject: Test Subject
To: Michael To <FakeMichael#gmail.com>
Content-Type: text/html; charset=utf-8
Content-Transfer-Encoding: quoted-printable
<b>Test Message</b>
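For completeness, the base64 route (the raw field of the JSON variant of messages.send) should also work once two problems visible in the decoded VBA output are fixed: lines were separated by a bare CR instead of CRLF, and htmlBody is not an RFC 822 header. A Node.js sketch under those assumptions (addresses keep the thread's anti-spam placeholders):

```javascript
// Build an RFC 822 message (headers, blank line, body) with CRLF line
// endings, then URL-safe base64-encode it for the "raw" field.
const rfc822 = [
  "From: John From <JohnF#mydomain.com>",      // placeholder from the thread
  "To: Michael To <FakeMichael#gmail.com>",    // placeholder from the thread
  "Subject: Test Subject",
  "Content-Type: text/html; charset=utf-8",
  "", // blank line separates headers from the body
  "<b>Test Message</b>",
].join("\r\n");

// Gmail expects URL-safe base64: "-" and "_" instead of "+" and "/"
const raw = Buffer.from(rfc822, "utf8").toString("base64url");
```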

Does CrossRef Search API support cross-domain requests?

The CrossRef Search API (docs here) provides citation information from DOI identifiers. I tried using it to get this info, but I'm oddly getting 404 responses.
The headers I set were
Content-type: application/json
Access-Control-Allow-Origin: *
Access-Control-Allow-Methods: GET, POST, OPTIONS
Access-Control-Allow-Headers: Content-Type
Access-Control-Max-Age: 86400
I get the same result from this appspot tester, so I don't think it's my code.
Can anyone advise how I could get it working? It works just fine from their own domain.
It's possible they don't allow cross-domain requests at all, but I'm not sure if/how I could check that.
Reproducible example:
function doiInfo(doi) {
    var doienc = encodeURIComponent(doi);
    var doiXHR = window.XMLHttpRequest ? new XMLHttpRequest() : new ActiveXObject("Microsoft.XMLHTTP");
    doiXHR.onreadystatechange = function () {
        if (doiXHR.readyState == 4 && doiXHR.status == 200) {
            console.log(doiXHR.responseText);
        } else if (doiXHR.readyState == 4) {
            // something went wrong
        }
    };
    doiXHR.open("GET", "http://search.crossref.org/dois?q=" + doienc, true);
    doiXHR.setRequestHeader("Content-type", "application/json;");
    doiXHR.setRequestHeader("Access-Control-Allow-Origin", "*");
    doiXHR.setRequestHeader("Access-Control-Allow-Methods", "GET, POST, OPTIONS");
    doiXHR.setRequestHeader("Access-Control-Allow-Headers", "Content-Type");
    doiXHR.setRequestHeader("Access-Control-Max-Age", "86400"); // cache for 1 day
    // doiXHR.withCredentials = "true";
    doiXHR.send();
}
doiInfo('10.1002/bies.201000071')
In the browser console from crossref.org I get
[
    {
        "doi": "http://dx.doi.org/10.1002/bies.201000071",
        "score": 18.623272,
        "normalizedScore": 100,
        "title": "The phage-host arms race: Shaping the evolution of microbes",
        "fullCitation": "Adi Stern, Rotem Sorek, 2010, 'The phage-host arms race: Shaping the evolution of microbes', <i>BioEssays</i>, vol. 33, no. 1, pp. 43-51",
        "coins": "ctx_ver=Z39.88-2004&rft_id=info%3Adoi%2Fhttp%3A%2F%2Fdx.doi.org%2F10.1002%2Fbies.201000071&rfr_id=info%3Asid%2Fcrossref.org%3Asearch&rft.atitle=The+phage-host+arms+race%3A+Shaping+the+evolution+of+microbes&rft.jtitle=BioEssays&rft.date=2010&rft.volume=33&rft.issue=1&rft.spage=43&rft.epage=51&rft.aufirst=Adi&rft.aulast=Stern&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.au=Adi+Stern&rft.au=+Rotem+Sorek",
        "year": "2010"
    }
]
Running it from my website (not https) I get
OPTIONS http://search.crossref.org/dois?q=10.1002%2Fbies.201000071 404 (Not Found)
XMLHttpRequest cannot load http://search.crossref.org/dois?q=10.1002%2Fbies.201000071. Invalid HTTP status code 404
The GET/OPTIONS issue aside, it definitely gets a 404 on the page, which doesn't seem right.
I think I could get around it with an iframe and window.postMessage(?), but that sounds messy.
Please comment if I can provide more details and I'll be happy to; it doesn't seem like anyone's done this before online. Hopefully that's not because it's impossible!
Answering the title of your question: yes, it allows cross-origin requests. A 404 indicates a wrong resource, not a cross-origin problem; a blocked cross-origin request would surface as a CORS error in the browser console.
The Access-Control-Allow-Origin: * response header indicates that the resource can be accessed from all origins. Take a look at my working example: http://pastebin.com/8W23P48Z
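One likely cause of the 404, offered as an assumption: the Access-Control-Allow-* fields are response headers that the server sends; setting them on the request makes it non-simple, so the browser first issues a preflight OPTIONS, which search.crossref.org apparently answers with 404. A header-free GET stays a "simple" request and skips the preflight entirely:

```javascript
// Builds the same query URL as the reproducible example above.
function doiUrl(doi) {
  return "http://search.crossref.org/dois?q=" + encodeURIComponent(doi);
}

// Access-Control-Allow-* belong on the *response*; sending them from
// the client forces a preflight OPTIONS request. No setRequestHeader
// calls are needed for a plain GET.
function doiInfo(doi, callback) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", doiUrl(doi), true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      callback(JSON.parse(xhr.responseText));
    }
  };
  xhr.send();
}
```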

ServiceStack AutoQuery, Implicit/Explicit Queries

I have the following Request DTO:
[Route("/processresults")]
public class FindProcessResults : QueryBase<ProcessResult, ProcessResultDto> {}
ProcessResult has a property named Id (Int32). I have two ProcessResults in my database, with Ids 1 and 2.
When I perform a GET to /processresults?Id=1 I get a single ProcessResult returned. Great.
However, when I POST this JSON, I get two ProcessResults returned; the query is not executing. When I add the property Id to FindProcessResults, the JSON call does work, even though I have not set EnableUntypedQueries to false.
PostData: {"Id":"1"}
What could be the issue here?
Bonus points: if I make a POST with form data, I get the following exception:
{
    "ResponseStatus": {
        "ErrorCode": "RequestBindingException",
        "Message": "Unable to bind request",
        "StackTrace": " at ServiceStack.Host.RestHandler.CreateRequest(IRequest httpReq, IRestPath restPath)\ \ at ServiceStack.Host.RestHandler.ProcessRequestAsync(IRequest httpReq, IResponse httpRes, String operationName)"
    }
}
However, if I do the same POST with x-www-form-urlencoded, the query works as intended (returns a single result).
Conclusion: whilst I can resolve this issue by adding the parameter I wish to query by (Id) to the typed request, this defeats the purpose of what I am trying to achieve: a generic query mechanism for my data store. The functionality already exists for the GET version of the request.
I believe it is to do with the implementation of AutoQueryServiceBase:
public virtual object Exec<From>(IQuery<From> dto)
{
    SqlExpression<From> q;
    using (Profiler.Current.Step("AutoQuery.CreateQuery"))
    {
        q = AutoQuery.CreateQuery(dto, Request.GetRequestParams());
    }
    using (Profiler.Current.Step("AutoQuery.Execute"))
    {
        return AutoQuery.Execute(dto, q);
    }
}
This uses Request.GetRequestParams(), which returns parameters from the query string or the form parameters, while a JSON request body is deserialized into the <From> DTO. The From type FindProcessResults has no Id property, so the value is never populated and passed to the query.
Requested HTTP Request/Response:
Request
POST /processresults HTTP/1.1
Host: localocl
Accept: application/json
Content-Type: application/json
Cache-Control: no-cache
Postman-Token: 36d4b37e-0407-a9b3-f2f2-5b024d7faf7f
{"Id":1}
Response
Cache-Control: private
Content-Length: 1580
Content-Type: application/json; charset=utf-8
Date: Mon, 03 Nov 2014 21:20:43 GMT
Server: Microsoft-IIS/8.5
Vary: Accept
X-AspNet-Version: 4.0.30319
X-Powered-By: ServiceStack/4.033 Win32NT/.NET, ASP.NET
{"Offset":0,"Total":2,"Results"....
You should strongly consider using GET requests for consuming AutoQuery services; GET is the more appropriate HTTP verb here, and it's also more cacheable and introspectable.
If you want to POST and you don't want to use an HTML form POST (i.e. the x-www-form-urlencoded Content-Type), you will need to formalize the parameters by adding them to the Request DTO:
[Route("/processresults")]
public class FindProcessResults : QueryBase<ProcessResult, ProcessResultDto>
{
    public int Id { get; set; }
}
Otherwise it will try to deserialize the JSON into an empty DTO, where any non-existing properties are ignored.
