What exactly is Page Response Time? - visual-studio-2012

I use Visual Studio 2012 for doing some load testing in our web application, and one of the metrics we rely on is "Page Response Time". I found this article, but it doesn't really answer the question. I also found a Stack Overflow question that gives some details, but isn't quite what I'm looking for.
The software I'm testing is a web interface that uses a relational database back-end. If I look at the Total Average Page Response Time for a specific test, what exactly is that telling me?
Does it take into account the time that the database spent processing the request? Or just the time that the web page took to be processed on the web server and client machine?
Ultimately, I'm looking for a metric that would tell me the total response time, including the time spent in SQL finding the information.

It is the time it takes the server to send all resources (or the time to send the last one) to the client browser, which is why it is bandwidth dependent. For example, using the Chrome developer tools (press F12) you can see the Stack Overflow response time for each request in the Network panel.
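If you want to reproduce that kind of measurement outside the browser, here is a minimal sketch using Python's standard library; it times both the arrival of the response headers and the arrival of the last byte of the body, which is roughly what a response-time metric reports. The URL is only an example; point it at your own application.

    # Minimal sketch: time a single HTTP response from sending the request
    # until the last byte of the body arrives.
    import time
    import urllib.request

    url = "https://stackoverflow.com/"   # example URL - use your own page

    start = time.perf_counter()
    with urllib.request.urlopen(url) as response:
        headers_done = time.perf_counter()   # response headers received (~ time to first byte)
        body = response.read()               # drain the rest of the body
    last_byte = time.perf_counter()

    print(f"time to first byte : {headers_done - start:.3f}s")
    print(f"time to last byte  : {last_byte - start:.3f}s  (the 'response time')")
    print(f"body size          : {len(body)} bytes")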

Related

How much bandwidth is needed for my website

I have a website that can be used by 50 users at the same time. Those users will be in the same room.
My problem is knowing how much bandwidth (in Mb/s) I need to rent for that room so that they can access my website comfortably (both upload and download speed).
The average page size of my website is 1 MB.
I searched for answers on the internet and all I found was bandwidth used per month (for servers).
Sorry if my question is "vague"; I did my best to make it clear.
Thank you in advance for your answers.
Using https://gtmetrix.com/ you can test your website's speed, page size, and load times.
There are several alternatives; you just have to do the research.
The more important issue you should focus on is why your page is 1 MB; resolving that should be your first priority, and tools like GTmetrix can help.
I recommend load testing your site to figure that out. If you're at all familiar with JMeter, you can use it to create a script that simulates a user navigating your site, then run multiple instances of that user (in your case, 50) to see how the site holds up under load.
You can learn more about JMeter here:
https://jmeter.apache.org/
If you're not familiar with creating JMeter scripts, you can record and auto-generate basic scripts using the Blazemeter Chrome Extension.
For low-load testing (50 users is pretty low), you can upload your JMeter script to Blazemeter, and with a free tier Blazemeter account, you can perform some basic tests to see how your site holds up. If you go that route, I recommend focusing on avg. response time and hits/second in order to determine what your bandwidth need truly is under load.
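Before (or alongside) a load test, you can also do the back-of-the-envelope arithmetic yourself. A rough sketch, using the 1 MB average page size from the question and an acceptable page load time that is purely my assumption:

    # Worst-case bandwidth estimate: every user requests a full page at the
    # same moment. Real traffic is burstier and benefits from caching, so a
    # load test will give a more realistic number.
    users = 50                 # concurrent users in the room (from the question)
    page_size_mb = 1.0         # average page weight in megabytes (from the question)
    target_load_seconds = 3.0  # assumed acceptable load time - pick your own

    page_size_mbit = page_size_mb * 8                      # megabytes -> megabits
    worst_case_mbps = users * page_size_mbit / target_load_seconds

    print(f"worst case: {worst_case_mbps:.0f} Mb/s")                      # ~133 Mb/s here
    print(f"if only 1 in 5 users clicks at once: {worst_case_mbps / 5:.0f} Mb/s")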

How important is it to measure the page render/load time in a web application [closed]

When we test the performance of a web application, what do people generally concentrate on? Is it the HTTP response time? Or is it the time the page takes to load/render completely in the client browser after it receives the response from the server?
What is generally measured across the industry? Do you have any recommendations on which should be done, and when?
Do you have any tool recommendations for the same?
Can I use Visual Studio Web Tests to measure performance in terms of web page load/render time after the client receives the response, or is it just the HTTP response time?
In three words: performance really matters!
My golden rule is pretty simple: you have to measure everything and optimize everything. It's not only a pure tech challenge; it also concerns your business team. Here are some classic examples from Velocity Conf.
Bing – A page that was 2 seconds slower resulted in a 4.3% drop in revenue/user.
Google – A 400 millisecond delay caused a 0.59% drop in searches/user.
Yahoo! – A 400 millisecond slowdown resulted in a 5-9% drop in full-page traffic.
Shopzilla – Speeding up their site by 5 seconds increased the conversion rate 7-12%, doubled the number of sessions from search engine marketing, and cut the number of required servers in half.
Mozilla – Shaving 2.2 seconds off their landing pages increased download conversions by 15.4%, which they estimate will result in 60 million more Firefox downloads per year.
Netflix – Adopting a single optimization, gzip compression, resulted in a 13-25% speedup and cut their outbound network traffic by 50%.
What is generally measured across the industry? Do you have any recommendations on which should be done, and when?
From Steve Souders, a pioneer in web performance optimization: "80-90% of the end-user response time is spent on the frontend." Start here first: too many requests, non-optimized images, un-minified content (JS/CSS), and not serving static assets through a CDN are common errors.
On the other hand, do not forget your backend, because that part really depends on load and activity. Some sites pay their largest performance tax due to backend issues. Since page generation time increases with user load, you have to find the throughput peak of your app and check that it meets your own SLA.
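To put a number on that throughput peak, one rough approach is to ramp up concurrency in steps and watch the point where throughput stops growing while response times keep climbing. A minimal sketch of the idea (Python standard library only; the URL and step sizes are placeholders, and a dedicated tool such as JMeter or the Visual Studio load tests will do this far more faithfully):

    # Rough concurrency ramp: for each step, fire N simultaneous requests and
    # record the average response time and the achieved throughput.
    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    URL = "http://localhost/your-page"   # placeholder - test environment only, never production

    def fetch(_):
        start = time.perf_counter()
        with urllib.request.urlopen(URL) as resp:
            resp.read()
        return time.perf_counter() - start

    for concurrency in (1, 5, 10, 25, 50):
        started = time.perf_counter()
        with ThreadPoolExecutor(max_workers=concurrency) as pool:
            durations = list(pool.map(fetch, range(concurrency)))
        elapsed = time.perf_counter() - started
        avg = sum(durations) / len(durations)
        print(f"{concurrency:>3} users: avg {avg:.2f}s, ~{concurrency / elapsed:.1f} req/s")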
Do you have any tool recommendations for the same?
There is no magic tool that covers all topics, but there are many great tools that each help with a specific part of your app.
Page rendering: Google Chrome SpeedTracer or the IE 11 UI Responsiveness tool
Frontend: PageSpeed, YSlow, WebPageTest.org (online), GTmetrix (online), Pingdom (online)
Backend: ASP.NET MiniProfiler, Glimpse, Visual Studio Profiler & Visual Studio Web/Load Tests
Google Analytics for RUM (Real User Monitoring)
Can I use Visual Studio Web Tests to measure performance in terms of web page load/render time after the client receives the response, or is it just the HTTP response time?
No, Visual Studio Web & Load Tests focus only on HTTP requests. JavaScript is not executed and virtual users are not virtual browsers: it's impossible to measure page load/render time. In my company, we use them only for integration tests and load testing.
If you want to read more, you can look at this post (disclaimer: I am the author).
Another interesting link is from Jeff Atwood (co-founder of Stack Overflow): Performance is a Feature.
Performance is a vast topic, and I have only covered a small part of it here, but this should give you a good starting point.

Android Market Developer Console Statistics

I would like to get some stats programmatically about my app; for example: total downloads, active installs...
I wish Google would provide an API for doing this. What do people do?
There is currently no API. It is possible to get these statistics, but it's quite complicated: you have to get an AuthSub token for the Market developer service, then make the right requests and parse a number of GWT-RPC encoded responses. These responses change every time a new version of the Android Market dashboard is launched, so be prepared to check frequently whether that has happened.
In a recent chat with one of the Android folks from Google I heard that they are aware of this issue and know that it's a very popular demand by developers, so hopefully there will be a better way soon.

Recently my site has been slow and I've been getting timeout messages in the browser - what's wrong?

How can I diagnose timeout problems and slow page loading on my site? I have the YSlow plugin in Firefox and it shows grade A/B for most pages, so I would expect pages to load quickly. Should I contact my hosting company? The company I bought my domain name from? There is not much load on the server at present, and I am using a very fast connection to the internet.
Where's a good place to start? How can I monitor this when we start seeing more traffic? Should the hosting company be doing this?
The first step is to establish whether the problem is client-side or server-side.
A good YSlow grade indicates the problem probably isn't client-side. YSlow checks that you don't have too many objects on the page, that you have minified your JavaScript/CSS, etc. It does not evaluate the performance of your network or server.
Using YSlow/Firebug, check to see how long it takes to load the actual HTML of your page. If that is taking a long time, then the problem is almost certainly with your server, network or server-side code.
To rule out network issues, compare accessing your site from the server itself to accessing it over the internet. If it's a lot slower over the internet the problem could be network-related.
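If you want numbers rather than a gut feeling, a small sketch like the one below (Python standard library; the host name is a placeholder) splits a single request for the bare HTML into connect time, time to first byte, and download time. Run it once on the server itself and once from an outside machine, and compare the three figures.

    # Break one request into phases: connect (network/DNS), first byte
    # (request sent + server-side work), and body download (bandwidth).
    import time
    import http.client

    HOST = "www.example.com"   # placeholder - use your own site

    t0 = time.perf_counter()
    conn = http.client.HTTPConnection(HOST, timeout=30)
    conn.connect()                    # DNS lookup + TCP connect
    t1 = time.perf_counter()
    conn.request("GET", "/")
    resp = conn.getresponse()         # returns once the response headers arrive
    t2 = time.perf_counter()
    resp.read()                       # drain the body
    t3 = time.perf_counter()
    conn.close()

    print(f"connect       : {t1 - t0:.3f}s   (network / DNS)")
    print(f"first byte    : {t2 - t1:.3f}s   (mostly server-side work)")
    print(f"body download : {t3 - t2:.3f}s   (bandwidth)")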
If it's not client-side or network-related, then either your server is struggling for resources or your code is slow (perhaps because the amount of data it is managing has grown). In that case, check the server logs and run a profiler on your code (on a development server, but with a copy of production data).
Tools like YSlow will point out some opportunities for optimization, but they don't actually measure performance and they don't look at how long it takes for things to happen.
Try something like WebPagetest which will give you a browser-view of the page loading and you can work through the waterfall to see where the time is going.
If you are seeing timeouts then it's probably a back-end problem (will be pretty clear in the waterfall) and you're going to need to instrument your server to figure out where the time is going. If it's a dedicated server or VPS then you can install something like New Relic and it will point out the problem pretty quickly. If you are on shared hosting then you're going to have to add logging to your app directly (there are plugins that can do this if you are running something like Wordpress).
The first place to look would be the server logs; those should give you a clue as to what is happening and how much time a request takes in general.
If the server is responding fine and the page is taking long because of client-side code, you might want to use the Firebug profiler to profile your page and find out more.
Hope this helps.
What kind of pages are you trying to load? Plain HTML or scripts like PHP? If plain HTML, I guess it's your hosting company.

What is a good benchmark test for load testing a production system running IIS?

I've noticed a few posts regarding stress testing IIS, but I'm more interested in knowing a good way to establish a benchmark for what my physical web server can handle.
I'm working on a production system right now that seems to be slowing to a near halt on a daily basis, and no one can seem to figure out what is causing the issue. This is your standard N-tier setup (client, web server, DB server). I have created some simple ASP.NET pages that do a few simple things, which I'm using to establish a benchmark for stress testing that I can later use when comparing against the production system that is already in place.
I have already built a sample .aspx page that simply returns some web server statistics about box utilization - nothing heavy, maybe a few lines of code. I've also created a simple web service (.asmx) that will test a DB connection given the correct ID - again nothing heavy, just opening a DB connection and running a sample query that doesn't really do anything.
What I would like to know is: what is a good stress level for the following tests to operate under? In particular, I'm using Microsoft's Web Application Stress Tool, and I'm curious what most people would consider acceptable ranges for the Stress Level and Stress Multiplier settings. I already know at what point the web server starts buckling; it's around a Stress Level of 20 and a Stress Multiplier of 10, give or take, which results in about ~275 page requests/sec. I'm trying to find out whether that number is way too low, and whether that's an indication of an issue between the client and the web server, or a hardware issue of some sort, or more of an indication that there's nothing wrong and the system is just too heavy.
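To put those numbers in perspective, Little's Law (concurrency = throughput × average response time) gives a quick sanity check. Assuming the Stress Level and Stress Multiplier multiply out to roughly 200 concurrent connections (my reading of those settings, not something stated in the question), ~275 requests/sec implies each request takes on the order of 0.7 seconds:

    # Little's Law: concurrency = throughput * average response time,
    # so average response time = concurrency / throughput.
    concurrency = 20 * 10      # stress level * stress multiplier (assumed to mean concurrent connections)
    throughput_rps = 275       # observed page requests / sec

    avg_response_seconds = concurrency / throughput_rps
    print(f"implied average response time: {avg_response_seconds:.2f}s")   # ~0.73s

Whether ~0.7 s per request for near-empty pages is "too low" depends on the hardware and the network between the test client and the server, but it at least turns the tool settings into a figure that can be compared against expectations.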
UPDATE: Since I originally wrote this post, the code has been moved to a server running Windows 2003 utilizing IIS 7.
What I'd like to know now is: if I set up a relatively simple test, such as a basic index.html file that just serves up a small page, what is the "best" page requests/sec you would expect I could achieve from a stress test?
What I'd like to know now is: if I set up a relatively simple test, such as a basic index.html file that just serves up a small page, what is the "best" page requests/sec you would expect I could achieve from a stress test?
I cannot speak to Microsoft's Web Application Stress Tool, but it should be possible to thrash a static HTML page hosted on IIS until either the network between the test machine and the server saturates or the test machine runs out of CPU to create and run the tests. All IIS would do is send out the same cached response every time.
To get a good profile of how your site is working, you need to build up a good representation of the actual traffic hitting the site. Once you have that modelled, you can begin scaling up the number of people using it to explore which pages are slowest and which server resources hit the wall first.
Either you will have one or more pages whose performance is unacceptable at a given load, or some server statistic such as disk usage or database performance will look over-utilised.
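One simple way to build that kind of model is to turn the relative frequency of each URL in the IIS logs into a weighted page mix and have every simulated user draw from it. A toy sketch of the idea (the page names and weights below are invented for illustration; a real load-testing tool would replay the mix with think times and sessions):

    # Toy traffic model: each virtual user picks pages in roughly the same
    # proportions as real production traffic.
    import random

    # Invented example mix - derive the real one from your IIS logs.
    page_mix = {
        "/Default.aspx":       0.50,
        "/Search.aspx":        0.30,
        "/Reports/Daily.aspx": 0.15,
        "/Admin/Stats.aspx":   0.05,
    }

    def next_page():
        """Pick one page, weighted by its observed share of traffic."""
        pages, weights = zip(*page_mix.items())
        return random.choices(pages, weights=weights, k=1)[0]

    # Example: the click stream for one 10-page virtual user session.
    print([next_page() for _ in range(10)])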
Unless you are testing on a site that is hosted on the same architecture as your production site, it is hard to make architecture recommendations.
With underperforming pages it is easier to make improvements as the code is available to analyse.
I recommend using Visual Studio Ultimate edition to load test if you can get it. It also gives some ASP.NET profiling tools to get to grips with sections of code and database calls.
