To try to get some resolution on web page compression, I'd like to pose the question to the 'gurus' here in the hope that I can arrive at some kind of clear answer.
The website in question: http://yoginiyogabahrain.com
I recently developed this site and am hosting it with Hostmonster in Utah.
My reason for building it as a one-page scrollable site was the amount of content that does not get updated - literally everything outside of the 'schedule', which is updated once a month. I realise that the 'departments' could have been displayed on separate pages, but I felt the content didn't warrant whole pages of their own, which would also have meant further server requests.
I have minimised the HTML, CSS and JS components of the site in accordance with the guidelines and recommendations from Google Page Speed and Yahoo YSlow. I have also added server and browser caching directives to the .htaccess file to address further recommendations.
Currently Pingdom Tools rates the site at 98/100, which pleases me. Google and Yahoo are hammering the site on the lack of GZIP compression and, in the case of Yahoo, the lack of CDN usage. I'm not so worried about the CDN, as this site simply doesn't warrant one. But the compression bothers me, because initially it was being applied.
For about a week the site was being gzipped, and then it stopped. I contacted Hostmonster about this, and they said that if the server determined there were not enough resources to serve a compressed version of the site, it would not do so. But that doesn't answer the question of whether it would resume if the resources allowed it. To date, the site has not been compressed again.
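For what it's worth, my understanding is that the usual way to request compression from within .htaccess on an Apache host is a mod_deflate block along these lines, although on shared hosting the provider can still disable or override it at their end, which appears to be what is happening in my case (so treat this as a sketch rather than a guaranteed fix):

    # Request gzip compression for text-based responses (mod_deflate).
    # A shared host may ignore or override these directives.
    <IfModule mod_deflate.c>
      AddOutputFilterByType DEFLATE text/html text/plain text/css
      AddOutputFilterByType DEFLATE application/javascript application/x-javascript
    </IfModule>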
Having done a lot of online research to find an answer about whether this is such a major issue, I have come across a plethora of differing opinions. Some say we should be compressing, and some say it's not worth the strain on resources to do so.
If Hostmonster have determined that the site doesn't warrant being compressed, why do Google and Yahoo nail it for the lack of compression? Why does Pingdom Tools not even take that aspect into account?
Forgive the lengthy post, but I wanted to be as clear as possible about what I'm trying to establish.
So, in summary: is the lack of compression on this site a major issue, or should I perhaps look at a hosting provider who will apply compression without question on a shared hosting plan?
Many thanks!
Related
My webhost is asking me to speed up my site and reduce the number of file calls.
OK, let me explain a little. My website is used about 95% as a bridge between my database (on the same hosting) and my Android applications (I have around 30 that need information from my db). The information only goes one way (for now): the app calls a JSON string like this one on the site:
http://www.guiasitio.com/mantenimiento/applinks/prlinks.php
and this web page, shown in a WebView as a welcome message:
http://www.guiasitio.com/movilapp/test.php
This page has some images and jQuery, so I think these are what's using a lot of resources. They have told me to use some code to cache those files in the visitor's browser to save memory (which is all Greek to me, since I don't understand it). Can someone give me an idea and point me to a tutorial on how to get this done? Can the WebView in an Android app keep a cache of these files?
All your help is highly appreciated. Thanks!
Using a CDN, or content delivery network, would be an easy solution if it works well for you. Essentially you are off-loading the work of storing and serving static files (mainly images and CSS files) to another server. In addition to reducing the load on your current server, it will speed up your site, because files will be served from the location closest to each site visitor.
There are many good CDN choices. Amazon CloudFront is one popular option, though in my opinion the prize for the easiest service to set up goes to CloudFlare ... they offer a free plan; simply fill in the details, change the DNS settings on your domain to point to CloudFlare, and you will be up and running.
With some fine-tuning, you can expect to reduce the requests on your server by up to 80%.
I use both Amazon and CloudFlare, with good results. I have found that the main thing to be cautious of is to carefully check all the scripts on your site and make sure they are working as expected. CloudFlare has a simple setting where you can specify the cache settings as well, so there's another detail on your list covered.
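If you'd rather handle the browser-caching side yourself instead of leaving it all to CloudFlare, on a typical Apache shared host it usually comes down to a few lines in your .htaccess file. A rough sketch, assuming mod_expires is enabled (adjust the file types and lifetimes to suit your content):

    # Tell browsers how long they may reuse static files
    # before asking the server for them again.
    <IfModule mod_expires.c>
      ExpiresActive On
      ExpiresByType image/jpeg "access plus 1 month"
      ExpiresByType image/png "access plus 1 month"
      ExpiresByType text/css "access plus 1 week"
      ExpiresByType application/javascript "access plus 1 week"
    </IfModule>

As far as I know, the WebView in an Android app will respect these headers too, as long as its cache hasn't been disabled in the app, but I would test that to be sure.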
Good luck!
I have this question: if I am designing a website that is expected to get high traffic, what are the things I should keep in mind?
Thanks
Be careful about your database management.
Build your database tables, and the links between them, keeping in mind that you do not want to search for or load things you don't need.
Once the site is working, I would optimise it as much as possible. Tools like YSlow or WebPageTest make it easy to analyse a page and pinpoint bottlenecks and places for improvements.
Also, for a high-volume site, I think you definitely want to use a content delivery network. There are lots of options, including Amazon CloudFront and CloudFlare. Using a CDN will reduce the load on your server by 60-80%, make the site faster, and cost you hardly anything.
Unless there is a specific reason why it's not a good fit for your site, you can't go wrong!
Good luck!
I am migrating the JavaScript of my site to YUI3 and am considering using the YUI files hosted on Yahoo's CDN.
As my site attracts a high amount of traffic, I wondered whether anybody had experience of using the CDN, and whether there were any problems they ran into or lessons they learned.
Ideally I would love to offload the bandwidth to Yahoo, but I am a little concerned about the risk of not being in control.
Any opinion appreciated.
It is totally rock solid, at least in my experience. The underlying platform is the same CDN that we (Flickr) use, as do all the other Yahoo sites.
I'm trying to find a solution to this: we have our LMS server, and content servers all across the US, so the user gets their content from the closest location.
I've come across a solution using SCO-Fetcher, mentioned and illustrated in the two links below, but I cannot find any information on how to implement something similar.
here: http://elearningrandomwalk.blogspot.com/2006/08/sco-fetcher.html
and here: http://www.adlnet.gov/Technologies/scorm/SCORMSDocuments/SCORM%20Resources/ADL_CrossDomainScripting_1_0.pdf
If anyone has any thoughts or information regarding this, it would be most appreciated.
I work for a content provider who has had to interface with a lot of different LMSs and cross-domain has always been a painful issue.
The document you linked to notwithstanding, SCORM doesn't really cater for cross-domain at all. My experience with cross-domain has been against the AICC standard. In the past we've used a signed Java applet to perform the cross-domain communication, but currently we are using a little hidden flash* SWF file which we talk to via JavaScript. This requires the LMS to have a crossdomain.xml file installed on their web server to allow the communication, which some of our customers balk at.
* Our product heavily uses flash already, so this was not an onerous requirement for us.
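For reference, the crossdomain.xml we ask the LMS to place at the root of their web server is essentially just this (the domain below is a placeholder for the content provider's domain):

    <?xml version="1.0"?>
    <!DOCTYPE cross-domain-policy SYSTEM "http://www.adobe.com/xml/dtds/cross-domain-policy.dtd">
    <cross-domain-policy>
      <allow-access-from domain="*.content-provider.example"/>
    </cross-domain-policy>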
The solution that we are seriously considering now is a variation on the "Run-time service on Content Server" as suggested in section 4.8 of the cross-domain scripting document.
The content server would run the courseware itself, as if it were an LMS, and proxy all the API calls through to the real LMS, so the calls flow from the course content to the run-time service on the content server, and from there to the real LMS.
Also, your launch URL wouldn't point directly to the content (e.g., http://abc.com/content/sco.html) but to the software application on the content server (e.g., http://abc.com/access.php?content=sco.html&permissions=OAUTH_ID), which would then serve the content as if it were an LMS itself.
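To make the idea a little more concrete, the run-time service page on the content server would expose a normal-looking SCORM API object to the course, with each call simply relayed to the real LMS on the server side. A very rough sketch (the /scorm-proxy.php endpoint is made up; the method names are the standard SCORM 1.2 ones):

    // Relay a SCORM call to a hypothetical server-side script, which would
    // forward it to the real LMS and return the LMS's response.
    function relay(method, args) {
      var xhr = new XMLHttpRequest();
      // Synchronous on purpose: the SCORM API is a synchronous interface.
      xhr.open('GET', '/scorm-proxy.php?method=' + encodeURIComponent(method) +
                      '&args=' + encodeURIComponent(args), false);
      xhr.send(null);
      return xhr.responseText;
    }

    // The course looks for window.API in its parent frames, so this object
    // lives in the frameset/launcher page served by the content server.
    window.API = {
      LMSInitialize:     function (a)     { return relay('LMSInitialize', a); },
      LMSFinish:         function (a)     { return relay('LMSFinish', a); },
      LMSGetValue:       function (el)    { return relay('LMSGetValue', el); },
      LMSSetValue:       function (el, v) { return relay('LMSSetValue', el + '=' + v); },
      LMSCommit:         function (a)     { return relay('LMSCommit', a); },
      LMSGetLastError:   function ()      { return relay('LMSGetLastError', ''); },
      LMSGetErrorString: function (c)     { return relay('LMSGetErrorString', c); },
      LMSGetDiagnostic:  function (c)     { return relay('LMSGetDiagnostic', c); }
    };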
Also, just came across this article on cross-domain communication which, while not SCORM specific, might provide some alternative ideas for implementation.
I know this is an old question, but thought I'd share: I had a similar situation a few years ago and settled on an iframe hack to get around the cross-domain restrictions. All it requires is a bit of JavaScript and HTML. It works on older browsers, including IE6.
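For anyone curious, a common variant of that hack is the fragment-identifier trick: the sending page rewrites the hash of a hidden iframe that points at a small receiver page on the other domain, and the receiver polls its own location.hash. Something along these lines (a sketch; all URLs and names below are placeholders):

    <!-- Sender page, hosted on the content domain -->
    <iframe id="xdframe" src="http://lms.example.com/xd-receiver.html" style="display:none"></iframe>
    <script>
      // Changing only the fragment doesn't reload the iframe, and it works
      // in old browsers (IE6 included); the timestamp forces a hash change.
      function sendToOtherDomain(message) {
        document.getElementById('xdframe').src =
          'http://lms.example.com/xd-receiver.html#' +
          encodeURIComponent(message) + '&t=' + new Date().getTime();
      }
    </script>

    <!-- xd-receiver.html, hosted on the LMS domain: polls its own hash -->
    <script>
      var lastHash = '';
      setInterval(function () {
        if (location.hash !== lastHash) {
          lastHash = location.hash;
          var message = decodeURIComponent(lastHash.substring(1).split('&')[0]);
          // ...hand the message to the LMS here; replies travel back the
          // same way, via a nested iframe pointing at the sending domain.
        }
      }, 100);
    </script>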
Chuck, what solution did you wind up using?
I have this problem: I have a web page with adult content, and for the past several months I have had PPC advertising on it. I've noticed a big difference between the ad company's statistics for my page, the Google Analytics data, and the AWStats data on my server.
For example, the ad company tells me that I have 10K pageviews per day, Google Analytics tells me I have 15K pageviews, and AWStats shows around 13K pageviews. Which system should I trust? Should I write my own (and reinvent the wheel again)? If so, how? :)
The funny thing is that I have another web page, with "normal" content (an MMORPG fan site), and those numbers are more or less equal across all three systems (ad company, GA, AWStats). Do you think it's because it's not an adult-oriented page?
And a final question that is totally off-topic: do you know of an ad company that pays per impression and doesn't mind adult sites?
Thanks for the answers!
First, you should make sure not to mix up »hits«, »files«, »visits« and »unique visits«. They all mean different things and are sometimes referred to by different names. I recommend looking up some definitions if you are confused about the terms.
AWStats probably has the most accurate statistics, because it has access to the access.log from the web server. Unfortunately, a cached page (perhaps cached by the browser, an ISP's proxy or your own caching server) might not produce a hit on the web server. Especially if your site is served with good caching hints that don't force a revalidation, and you are running your own web cache (e.g. Squid) in front of your site, the number will be considerably lower, because it only measures the work done by the web server.
Google Analytics, on the other hand, can only count requests from users who haven't blocked Google Analytics and have JavaScript enabled (but it will count pages served from a web cache). So this count can be influenced by the user, but isn't affected by web caches.
The ad company is probably simply counting the number of requests they get from your site (presumably based on their own access.log). So, to get counted there, the ad must not be cached and must not be blocked by the user.
So, as you can see, it's not that easy to get a single correct value. But as long as you compare the measured values with those from previous months, you should at least get a (nearly) correct rate of growth.
And your porn site probably serves a large amount of static content (e.g. images from disk), and most web servers are really good at adding caching hints automatically for static files. Your MMORPG site, on the other hand, might mostly consist of dynamic scripts (PHP?) which don't send any caching hints at all, and web servers aren't able to determine caching headers for dynamic content automatically. That's at least my explanation, without knowing your application and server configuration :)