By caching we basically mean replicating data for faster access. For example:
Store frequently used data from the DB in memory.
Store static contents of a web page in the client browser.
Cloud hosting already uses the closest data center (CDN) to serve content to the user. My question is: how does a caching service make it faster?
A CDN is used to improve delivery performance between your service datacenter and your customer by introducing a transparent proxy datacenter that is nearer to your customer. The CDN is typically set up to cache, so that requests from different customers can be serviced by the same "CDN answer" without calling the origin service datacenter. This configuration is predominantly used to offload requests for shared assets such as JPEGs, JavaScript, etc.
Azure Caching Service is employed behind your service, within your service datacenter. Unlike the built-in ASP.NET cache, Azure Cache runs as a separate service and can be shared between servers/services. Generally your service would use this to store cross-session or expensive-to-create information, e.g. query results from a database. You're trading:
the cost (time/money) of the memory used to cache the item
the cost (time/money) of creating the item
the number of times you'd expect to reuse the item
the required "freshness" of the information
For example, you might use the memory cache to reduce the number of times you query Azure Table storage, because you expect to reuse the same information multiple times, the latency to perform the query is high, and you can live with the information potentially being "stale". Doing so saves you money and improves the overall performance of your system.
You'd typically "layer" the out-of-process Azure Cache with on-machine/in-process cache, such that for frequent queries you pull information as follows:
best - look first in local/on-box cache
better - look in off-box Azure Service Cache, then load local cache with result
good - make a call/query to expensive resource, load Azure Cache and local cache with result
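A minimal sketch of that lookup order in Python, assuming a generic shared-cache client with get/set methods (the names here are illustrative, not the actual Azure Cache API):

```python
# Illustrative layered-cache lookup: local_cache is a plain dict standing
# in for the in-process cache; shared_cache stands for the off-box cache.
local_cache = {}

def get_cached(key, shared_cache, load_from_source, ttl_seconds=60):
    # Best: the value is already in the local/on-box cache.
    value = local_cache.get(key)
    if value is not None:
        return value

    # Better: fetch from the off-box shared cache, then warm the local cache.
    value = shared_cache.get(key)
    if value is not None:
        local_cache[key] = value
        return value

    # Good: call the expensive resource, then warm both caches.
    value = load_from_source(key)
    shared_cache.set(key, value, ttl_seconds)
    local_cache[key] = value
    return value
```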
Before saying anything, I wanted to point you to this (very similar) discussion:
Is it better to use Cache or CDN?
Having said that, this is how a CDN and caching can improve your website's performance.
CDN: This service helps you stay "close" to your end user. With a CDN, your website's content will be spread over a system of servers, each in its own location. Every server will hold a redundant copy of your site. When accessed by a visitor, the CDN system will identify his/her location and serve the content from the closest server (also called a POP or proxy).
For example: when visited from Australia, you'll be served by an Australian server. When visited from the US, you'll be served by a US server, etc.
A CDN will be most useful if your website operates outside of its immediate locale.
(i.e. a CDN will not help you if your website promotes a local locksmith service that only has visitors from your city, as long as your original servers are sitting nearby...)
Also, the overall coverage is unimportant. You just need to make sure that the network covers all locations relevant to your day-to-day operations.
Cache: Provides faster access to your static and/or commonly used content objects. For example, if you have an image on your home page, and that image is downloaded again and again (and again) by every visitor, you should cache it, so that a returning visitor will already have it stored on his/her PC (in the browser cache). This will save time, because local resources load faster, and also save you bandwidth, because the image will load from the visitor's computer and not from your server.
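As an illustration, a minimal sketch (standard library only; the file name and max-age are assumptions) of serving an image with a browser-caching header:

```python
# Minimal static-file handler that tells the browser to keep a local copy.
from http.server import BaseHTTPRequestHandler, HTTPServer

class ImageHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        with open("home.jpg", "rb") as f:  # hypothetical home-page image
            body = f.read()
        self.send_response(200)
        self.send_header("Content-Type", "image/jpeg")
        # Returning visitors reuse their cached copy for up to 24 hours,
        # saving load time for them and bandwidth for you.
        self.send_header("Cache-Control", "public, max-age=86400")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8000), ImageHandler).serve_forever()
```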
CDN and caching are often combined, because this setup allows you to store the cache on the CDN network.
This dual setup can also help improve caching efficiency; for example, it can help with dynamic caching by introducing smart algorithms into the "top" CDN layer.
Here is more information about dynamic caching (also a good introduction to HTTP caching directives).
As you might already know from reading the above-mentioned post, neither method is better on its own; they are at their best when combined.
Hope this answers it
GL
I'm trying to improve the latency of a CDN. The particular website will be accessed all over the world. Earlier the website wasn't served through a CDN, but because of the slowness we moved to a CDN; still, getting the JS file from the CDN is very slow for two people in the same part of the world. I am not talking about the first access; it happens randomly, very often.
If it happens randomly very often, it sounds like there's something wrong with your caching settings.
To confirm that a file is actually coming from the CDN, inspect the network traffic in your browser. If the file came from the CDN, it will have an "X-Cache: HIT" response header.
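If you'd rather check from a script than from the browser, a quick sketch (the URL is a placeholder):

```python
# Check whether a response came from the CDN edge cache.
import urllib.request

url = "https://example.azureedge.net/scripts/app.js"  # hypothetical CDN URL
with urllib.request.urlopen(url) as resp:
    # "HIT" means the CDN served a cached copy; "MISS" means it had to
    # go back to the origin server.
    print(resp.headers.get("X-Cache"))
```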
If you can't reproduce it yourself, click the "Manage" button in your CDN profile. It will open a management window for your CDN, where you can see the total "hits" and "misses" for your files. If there are a lot of misses, that indicates many files are not being served through the CDN, and you should investigate your caching settings.
You should also check whether your files come out of the CDN compressed. Depending on your CDN and server settings, it's possible that uncompressed files are being cached in the CDN.
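One way to check, again as a sketch with a placeholder URL:

```python
# Ask for a compressed copy and see what the CDN actually returns.
import urllib.request

req = urllib.request.Request(
    "https://example.azureedge.net/scripts/app.js",  # hypothetical CDN URL
    headers={"Accept-Encoding": "gzip"},
)
with urllib.request.urlopen(req) as resp:
    # "gzip" means a compressed copy was cached and served; None suggests
    # the CDN cached the file uncompressed.
    print(resp.headers.get("Content-Encoding"))
```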
If everything looks ok, a way to speed up Azure CDN is to use Dynamic Site Acceleration in addition to caching. It helps with potentially faster SSL negotiation and other network improvements.
https://learn.microsoft.com/en-us/azure/cdn/cdn-dynamic-site-acceleration
I'm currently running a file hosting site as a side project and I'm using Azure Storage to actually store and serve the files. The big issue for me is that I would like to be able to support relatively large files as well, which are really expensive to serve.
According to the pricing details for Azure outbound data transfers, it'll cost me $0.087 per GB to serve files to the user. This is okay for things like images, but if the user stores something like a 1 GB video, it'll cost me around 9 cents per person who wants to download the file. Even if I try to monetize the service, I cannot see how I can reasonably sustain these costs if the service ever becomes popular.
Does anyone have any suggestions or alternatives to reducing outbound data transfer costs?
Edit: As I come across helpful ways to reduce my costs, I'll update the list below:
Use a free CDN provider like Cloudflare. Specifically for me, I only enabled the CDN for files served through Azure Storage, because enabling it for my whole site would impose a 100MB file size upload restriction. One thing to note is that Cloudflare doesn't cache everything, so even though I'm covered for images, I'm still out of luck for many other media types that users might upload.
Compress uploaded files so that not as much bandwidth is used on outbound transfers.
If you're using cloud storage but host your website on a dedicated server with generous bandwidth, you can implement some kind of local cache and serve content directly from your cache, with the storage provider being a fallback on a cache miss (see the sketch after this list). Unfortunately this isn't viable for me, since I also host my site on Azure, and the outbound data transfer rates apply across their entire service stack.
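A rough sketch of that local-cache-with-fallback idea (fetch_from_storage is a stand-in for whatever client your storage provider offers, and the cache path is made up):

```python
import os

CACHE_DIR = "/var/cache/files"  # hypothetical cache location on the web server

def serve_file(name, fetch_from_storage):
    path = os.path.join(CACHE_DIR, name)
    if os.path.exists(path):
        # Cache hit: serve from local disk, no outbound transfer from storage.
        with open(path, "rb") as f:
            return f.read()
    # Cache miss: pay for one transfer from storage, then keep a local copy.
    data = fetch_from_storage(name)
    os.makedirs(CACHE_DIR, exist_ok=True)
    with open(path, "wb") as f:
        f.write(data)
    return data
```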
Are all of your assets available publicly, or do you have some kind of authentication in front of them? If they are available publicly, then maybe a CDN would be an option here.
You can try caching your content on the client. For scenarios where you are accessing static content such as photos or videos, having a cache set up could keep you from having to go to the server each time you need data.
We currently get web analytics for a WordPress site using WebTrends.
If we use a caching mechanism like Varnish, I would assume WebTrends would suddenly report a dramatic reduction in traffic.
Is this correct and, if so, can you avoid this problem and get the correct statistics reported by WebTrends?
In my experience, acceleration caches shouldn't interfere with your Analytics data capture, because the cached content should include all of the on-page data points (such as meta tags) as well as the WT base tag file, which the user's browser will then execute and which will then make the call to the WT data collection server.
By way of a disclaimer, I should add that I haven't got any specific experience with Varnish, but a cache that acts as a barrier to on-page JavaScript executing is basically broken, and I've personally never had a problem with one preventing analytics software from running.
The only conceivable problem I could foresee is if a cache went to the extent of scanning pages for linked resources (such as the "no javascript" image in the noscript tag), acquiring those resources in advance, and then rewriting the page being served to pull those resources from the cache rather than from third-party servers. In that case you might end up with spurious "no javascript" records in your data.
Just make sure that your Varnish config is not removing any WebTrends cookies and it should be perfectly OK. By default it does not, but if you use some ready-made WordPress VCL, you may need to exclude these cookies together with the WordPress-specific ones in the configuration.
My webhost is asking me to speed up my site and reduce the number of file calls.
OK, let me explain a little. My website is used 95% of the time as a bridge between my database (on the same hosting) and my Android applications (I have around 30 that need information from my DB). The information only goes one way (for now); the apps call a JSON string like this one on the site:
http://www.guiasitio.com/mantenimiento/applinks/prlinks.php
and this webpage, which is shown in a web view as a welcome message:
http://www.guiasitio.com/movilapp/test.php
This page has some images and jQuery, so I think these are the ones causing the heavy usage. They have told me to use some code to create a cache of those files in the person's browser to save memory (that is all Greek to me, since I don't understand it). Can someone give me an idea and point me to a tutorial on how to get this done? Can the WebView in an Android app keep caches of these files?
All your help is highly appreciated. Thanks!
Using a CDN (content delivery network) would be an easy solution if it worked well for you. Essentially, you are offloading the work of storing and serving static files (mainly images and CSS files) to another server. In addition to reducing the load on your current server, it will speed up your site because files will be served from the location closest to each site visitor.
There are many good CDN choices. Amazon CloudFront is one popular option, though in my opinion the prize for the easiest service to set up goes to CloudFlare... they offer a free plan; simply fill in the details, change the DNS settings on your domain to point to CloudFlare, and you will be up and running.
With some fine-tuning, you can expect to reduce the requests on your server by up to 80%
I use both Amazon and CloudFlare, with good results. I have found that the main thing to be cautious of is to carefully check all the scripts on your site and make sure they are working as expected. CloudFlare has a simple setting where you can specify the cache settings as well, so that's another detail on your list covered.
Good luck!
I have this problem: I have a web page with adult content, and for the past several months I've had PPC advertisements on it. I've noticed a big difference between the ad company's statistics for my page, the Google Analytics data, and the Awstats data on my server.
For example, the ad company tells me that I have 10K pageviews per day, Google Analytics tells me that I have 15K pageviews, and Awstats shows around 13K pageviews. Which system should I trust? Should I write my own (and reinvent the wheel again)? If so, how? :)
The funny thing is that I have another web page with "normal" content (an MMORPG fan site), and those numbers are more or less equal in all three systems (ad company, GA, Awstats). Do you think it's because it's not an adult-oriented page?
And a final question that is totally off-topic: do you know of an ad company that pays per impression and doesn't mind adult sites?
Thanks for the answers!
First, you should make sure not to mix up »hits«, »files«, »visits« and »unique visits«. They all have different meanings and are sometimes named differently. I recommend looking up some definitions if you are confused about the terms.
Awstats probably has the most correct statistics, because it has access to the access.log from the web server. Unfortunately, a cached site (cached by the browser, a proxy from an ISP, or your own caching server) might not produce a hit on the web server. Especially if your site is served with good caching hints which don't enforce a revalidation, and you are running your own web cache (e.g. Squid) in front of your site, the number will be considerably lower, because it only measures the work of the web server.
On the other hand, Google Analytics is only able to count requests from users who haven't blocked Google Analytics and have JavaScript enabled (but it will count pages served by a web cache). So, this count can be influenced by the user, but isn't affected by web caches.
The ad company is probably simply counting the number of requests which they get from your site (probably based on their access.log). So, to get counted there, the ad must not be cached and must not be blocked by the user.
So, as you can see, it's not that easy to get a single correct value. But as long as you use the measured values in comparison to those from the previous months, you should get at least a (nearly) correct rate of growth.
And your porn site probably serves a high amount of static content (e.g. images from disk), and most web servers are really good at serving caching hints automatically for static files. Your MMORPG site, on the other hand, might mostly consist of dynamic scripts (PHP?) which don't send any caching hints at all, and web servers aren't able to determine those caching headers for dynamic content automatically. That's at least my explanation, without knowing your application and server configuration :)
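To see this difference for yourself, a quick sketch (with placeholder URLs) that compares the caching headers a server sends for a static file versus a dynamic script:

```python
# Compare caching headers for a static file vs. a dynamic script.
import urllib.request

urls = [
    "http://example.com/images/logo.png",  # static: server usually adds hints
    "http://example.com/index.php",        # dynamic: often no caching hints
]
for url in urls:
    with urllib.request.urlopen(url) as resp:
        print(url)
        print("  Cache-Control:", resp.headers.get("Cache-Control"))
        print("  Last-Modified:", resp.headers.get("Last-Modified"))
```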