Varnish stats only for pages - varnish

On my server I'm using Varnish 3.
If I run "varnishstat" I can view stats, but the "hits" and "misses" counters cover every resource served by Varnish.
Is there a way to obtain stats ONLY for "pages"?
Thanks!

No, varnishstat does not support that kind of filtering: its counters are global and cannot be broken down by URL or content type.
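If you only need hit/miss numbers for HTML pages, one workaround is to derive them from the shared memory log instead of the counters. A rough sketch, assuming varnishncsa's custom format support with the %{Varnish:hitmiss}x field (available in Varnish 3) and that a "page" is any URL without a static-file extension:

# Record hit/miss outcome and URL for a while, then stop logging.
varnishncsa -F '%{Varnish:hitmiss}x %U' > hitmiss.log &
NCSA_PID=$!
sleep 60
kill $NCSA_PID
# Count hits and misses, skipping typical static assets.
awk '$2 !~ /\.(css|js|png|jpe?g|gif|ico)$/ { n[$1]++ }
     END { for (k in n) print k, n[k] }' hitmiss.log

Adjust the extension list (or match a URL prefix instead) to whatever counts as a "page" on your site.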

How to extract value from URL and check cache to load data in varnish

I have a scenario where my URL will either contain a single value or a delimited list of values,
i.e. /api/parameters/XXXXXXXXXX?tables=x0 or tables=x0;x1;x2.
Based on this URL, I want Varnish to check whether the URL contains multiple table values; if so, split them out and pass each table name in a separate URL (/api/parameters/XXXXXXXXXX?tables=x0, /api/parameters/XXXXXXXXXX?tables=x1, /api/parameters/XXXXXXXXXX?tables=x2), serving each from the cache or, on a miss, from the backend server.
Then, based on the responses, the results need to be combined and returned to the client.
My questions here are:
How do I extract the values from the URL and pass a modified URL to the Varnish cache or backend?
After the results return, how do I combine them into a single JSON object, in the sequence originally requested (i.e. x0 result; x1 result; x2 result)?
It is possible to turn a single request into multiple subrequests in Varnish. Unfortunately this cannot be done with the open source version, only with the Enterprise version.
vmod_http
https://docs.varnish-software.com/varnish-cache-plus/vmods/http/ describes how you can perform HTTP calls from within Varnish using vmod_http.
By sending HTTP requests to other URLs through Varnish, you can get multiple objects out of the cache and aggregate them into a single response.
No looping
The fact that Varnish doesn't have loops makes matters a bit more complicated. You'll have to set an upper limit on the number of values the tables querystring parameter can have, and you'll have to check the values using individual if-statements.
Returning the combined JSON output
Once you have fetched the results from the various URLs, you can create a JSON string and return it via return(synth(200, req.http.json)), where req.http.json contains the JSON string.
This will create a synthetic response.
In Varnish Enterprise it is also possible to cache synthetic output. See https://docs.varnish-software.com/varnish-cache-plus/vmods/synthbackend/ to learn more about vmod_synthbackend.
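Putting those pieces together, here is a rough sketch of what the VCL could look like. It is not a drop-in implementation: it assumes the vmod_http API as documented at the link above (http.init(), http.req_copy_headers(), http.req_set_url(), http.req_send(), http.resp_wait(), http.resp_get_body(), http.varnish_url()), handles at most two table values, and the regular expressions are illustrative:

vcl 4.1;

import http;

sub vcl_recv {
    # Only intercept URLs that carry more than one table value.
    if (req.url ~ "^/api/parameters/.*tables=[^&;]+;") {
        # Subrequest for the first value (tables=x0).
        http.init(0);
        http.req_copy_headers(0);
        http.req_set_url(0, http.varnish_url(
            regsub(req.url, "(tables=)([^&;]+).*", "\1\2")));
        http.req_send(0);

        # Subrequest for the second value (tables=x1). Add more
        # blocks like this one up to your fixed maximum.
        http.init(1);
        http.req_copy_headers(1);
        http.req_set_url(1, http.varnish_url(
            regsub(req.url, "(tables=)[^&;]+;([^&;]+).*", "\1\2")));
        http.req_send(1);

        # Wait for both responses and combine them in request order.
        http.resp_wait(0);
        http.resp_wait(1);
        set req.http.json = "[" + http.resp_get_body(0) + ","
            + http.resp_get_body(1) + "]";
        return (synth(200, "OK"));
    }
}

sub vcl_synth {
    if (req.http.json) {
        set resp.http.Content-Type = "application/json";
        synthetic(req.http.json);
        return (deliver);
    }
}

Because the subrequests go back through http.varnish_url(), each single-table URL is served from the cache when possible and only hits the backend on a miss.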
Varnish Enterprise disclaimer
The solution I suggested in my answer uses Varnish Enterprise, the commercial version of Varnish. It extends Varnish's capabilities with additional VMODs and features. One easy way to try it out without upfront licensing payments, if you're interested, is to spin up an instance on cloud infrastructure:
Varnish Enterprise on AWS
Varnish Enterprise on Azure
Varnish Enterprise on GCP

How to set SSL versions in script when there are multiple URLs in a concurrent group of requests in a Single script?

There are 2 different URLs in a script that I have recorded, and each uses a different version of SSL. There is a concurrent group inside the script which has requests with both the URLs. How do I set the SSL version for them without removing the concurrency part?
I have tried using WinInet mode for replay which solved the issue. But I need to measure the response time for each URL and I cannot achieve it using WinInet mode as it doesn't generate the Web Page Diagnostics graph.
I've also tried creating automatic transactions but I couldn't see any of them in the results summary.
If you have access to the servers involved, enable the time-taken HTTP log field. If you are running IIS, which is a good chance given WinInet, the default log format for IIS will give you what you need.
At the conclusion of your tests, pull the logs. Use Microsoft Log Parser (staying with the Microsoft theme) to pull the min, max, and avg time-taken values, grouped by request and filtered on the IP addresses of your load generators.
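A hypothetical Log Parser query along those lines (the log path and generator IPs are placeholders for your environment):

LogParser.exe -i:IISW3C -o:CSV "SELECT cs-uri-stem, MIN(time-taken) AS min_ms, AVG(time-taken) AS avg_ms, MAX(time-taken) AS max_ms FROM C:\inetpub\logs\LogFiles\W3SVC1\u_ex*.log WHERE c-ip IN ('10.0.0.11'; '10.0.0.12') GROUP BY cs-uri-stem"

Note that time-taken is in milliseconds and, on IIS, includes network transfer time to the client.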
It would be interesting to know which SSL versions you have.
The following function lets you set the SSL version before the URL call:
web_set_sockets_option("SSL_VERSION", "put your TLS version here");
Accepted values are TLS1, TLS1.1, TLS1.2 and more.
See the Help by hitting F1 on the function to get more information.
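For example, a hypothetical snippet (the hostnames stand in for your two recorded URLs; whether the option takes effect per-request inside a web_concurrent_start block is worth verifying against your LoadRunner version):

// Negotiate TLS 1.2 for the first host.
web_set_sockets_option("SSL_VERSION", "TLS1.2");
web_url("host_a", "URL=https://host-a.example.com/", LAST);

// Switch to TLS 1.1 before calling the second host.
web_set_sockets_option("SSL_VERSION", "TLS1.1");
web_url("host_b", "URL=https://host-b.example.com/", LAST);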

Multiple URL purging Varnish issue

I have an issue with Varnish purging.
Our application is very dynamic, so an event on Object A will generate 10,000 purges, because Object A's info is present in all pages.
Object A is seller stats and the pages are ad pages.
We are managing this with asynchronous HTTP PURGE calls to Varnish from the PHP code using curl, so we end up with 10,000 HTTP calls.
The URLs cannot be calculated (so a regex is not an option).
I want to ask you guys: is there any possibility in Varnish to do some batch purging (HTTP interface)?
If not, what options have you tested that work in a very dynamic application where models and events affect a lot of your pages?
Thanks in advance
Nabil
Running the purges through varnishadm would be your best bet. You could either tunnel the commands over SSH (assuming you are dealing with a remote Varnish server) or allow remote access from your web server to the Varnish server.
You can easily write your own shell script to run a batch purge using varnishadm or you could take a look at Thinner, which is a Ruby based purger written to do exactly what you're looking for.
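A minimal batch sketch with varnishadm (the -T endpoint, secret file and URL list are placeholders for your setup):

# Issue one ban per line of urls.txt against a remote Varnish instance.
while read -r url; do
    varnishadm -T varnish.example.com:6082 -S /etc/varnish/secret \
        "ban req.url == $url"
done < urls.txt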
The obvious alternative, which you have most likely considered already, is to rewrite your app to include Object A in the URL or in a custom header (for example X-Object: A), so you can do the ban based on that header:
sub vcl_recv {
    if (req.request == "BAN") {
        ban("obj.http.x-object == " + req.http.x-object);
        # Keep the BAN request from being forwarded to the backend
        error 200 "Banned.";
    }
}
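With that in place, a single request invalidates every page that references Object A, for example (the hostname is a placeholder):

curl -X BAN -H "X-Object: A" http://your-varnish-host/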

Does the browser time out on a huge request?

Assume that I use a web browser to upload a huge file (maybe several GB); it may take hours for all the data to be transferred to the server. Assume the server has no limit on the size of file uploads and just keeps engulfing data. Will the browser work earnestly for hours until all the data is transferred? Will it show some error after a while? Or is the behaviour browser-specific?
A request will always time out at some point, regardless of the web server you are using (Apache, IIS, etc.). The defaults are usually a couple of minutes, so you will need to increase those. There are also other limits, such as maximum file size, that you would have to increase. An example of doing this with PHP/Apache can be found here:
Increase max_execution_time in PHP?
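For illustration, the PHP side of such an increase could look like this in php.ini (the values are arbitrary; pick limits that fit your setup):

; Allow long-running, very large uploads.
max_execution_time = 0      ; 0 = no script time limit
max_input_time = -1         ; -1 = no limit on parsing input
upload_max_filesize = 8G
post_max_size = 8G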
The browser will eventually give a "Request Timeout" error page.

Cache external files, e.g. i.ytimg.com/vi/#code#/0.jpg, with Apache .htaccess?

As the title says: is it possible to set caching on external resources with .htaccess?
I have some third-party stuff on my site, e.g. Google Web Elements and embedded YouTube clips.
I want my Google Page Speed score to get higher.
Error output from Page Speed:
The following resources are missing a cache validator.
http://i.ytimg.com/vi/-MfM1fVSFnM/0.jpg
http://i.ytimg.com/vi/-PxVKNJmw4M/0.jpg
http://i.ytimg.com/vi/3nxENc_msc0/0.jpg
http://i.ytimg.com/vi/5Bra7rbGb7g/0.jpg
http://i.ytimg.com/vi/5P76PKybW5o/0.jpg
http://i.ytimg.com/vi/9l9BzKfI88o/0.jpg
http://i.ytimg.com/vi/E7hvBxMB4XI/0.jpg
http://i.ytimg.com/vi/IiocozLHFis/0.jpg
http://i.ytimg.com/vi/JIHohC8fydQ/0.jpg
http://i.ytimg.com/vi/P66uwFpmQSE/0.jpg
http://i.ytimg.com/vi/TXLTbARnRdU/0.jpg
http://i.ytimg.com/vi/bPBrRzckfEQ/0.jpg
http://i.ytimg.com/vi/dajcIH9YUuI/0.jpg
http://i.ytimg.com/vi/g4roerqw090/0.jpg
http://i.ytimg.com/vi/h1imBHP3DdA/0.jpg
http://i.ytimg.com/vi/hRvW5ndLLEk/0.jpg
http://i.ytimg.com/vi/kzahftbo6Qc/0.jpg
http://i.ytimg.com/vi/lta2U3hkC4k/0.jpg
http://i.ytimg.com/vi/n1o9bGF88HY/0.jpg
http://i.ytimg.com/vi/n3csJN0wXew/0.jpg
http://i.ytimg.com/vi/q0Xu-0moeew/0.jpg
http://i.ytimg.com/vi/tPCDPKirZBM/0.jpg
http://i.ytimg.com/vi/uLxsPImMJmg/0.jpg
http://i.ytimg.com/vi/x33B_iBn2_M/0.jpg
No, it's up to them to cache it.
The best you could do would be to download them onto your server and then serve them, but that would be slower anyway!
Nope, setting cache headers for third-party resources is not possible unless you start passing those resources through your own server as a proxy, which you usually don't want for reasons of speed and traffic.
As far as I can see, there's nothing you can do here.
You could delay your YouTube videos from loading on the page until something like a holding image is clicked. This wouldn't cache these images when (or if) they are loaded, but they wouldn't detrimentally affect your Page Speed score because they wouldn't be loaded on page load any more; a minimal sketch follows below.
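A minimal click-to-load sketch (the markup is a placeholder; the video ID is one from the list above, and the embed, along with its ytimg thumbnail, only loads after the click):

<div class="yt-holder" data-id="-MfM1fVSFnM">Play video</div>
<script>
document.querySelectorAll('.yt-holder').forEach(function (el) {
  el.addEventListener('click', function () {
    // Swap the holder for the real YouTube embed on demand.
    var iframe = document.createElement('iframe');
    iframe.src = 'https://www.youtube.com/embed/' + el.dataset.id;
    iframe.allowFullscreen = true;
    el.replaceWith(iframe);
  });
});
</script>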
