Recently my site has been slow and I've been getting timeout messages in the browser - what's wrong?

How can I diagnose timeout problems and slow page loading on my site? I have the YSlow plugin in Firefox and it shows a grade of A/B for most pages, so I would expect pages to load quickly. Should I contact my hosting company? The company I bought my domain name from? There is not much load on the server at present and I am using a very fast connection to the internet.
Where's a good place to start? How can I monitor this when we start seeing more traffic? Should the hosting company be doing this?

The first step is to establish whether the problem is client-side or server-side.
A good YSlow grade indicates the problem probably isn't client-side. YSlow checks that you don't have too many objects on the page, that you have minified your JavaScript/CSS, and so on. It does not evaluate the performance of your network or server.
Using YSlow/Firebug, check to see how long it takes to load the actual HTML of your page. If that is taking a long time, then the problem is almost certainly with your server, network or server-side code.
To rule out network issues, compare accessing your site from the server itself to accessing it over the internet. If it's a lot slower over the internet the problem could be network-related.
If it's not client-side or network-related, then either your server is struggling for resources or your code is slow (perhaps because the amount of data it is managing has grown). In that case, check the server logs and run a profiler on your code (on a development server, but with a copy of production data).
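To put a number on how long the raw HTML takes, you can run a small timing probe once from the server itself and once from a remote machine: a large gap between the two points at the network, while a slow time-to-first-byte everywhere points at the server or server-side code. A minimal sketch in PHP (the URL is a placeholder):

```php
<?php
// timing-probe.php -- rough timing breakdown for the main HTML document.
// Run with `php timing-probe.php` on the server and again from a remote machine.
$url = 'http://www.example.com/'; // placeholder: your site's URL

$ch = curl_init($url);
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,   // fetch the body without printing it
    CURLOPT_FOLLOWLOCATION => true,   // follow redirects to the real page
]);
curl_exec($ch);

printf("DNS lookup:  %.3fs\n", curl_getinfo($ch, CURLINFO_NAMELOOKUP_TIME));
printf("TCP connect: %.3fs\n", curl_getinfo($ch, CURLINFO_CONNECT_TIME));
printf("First byte:  %.3fs\n", curl_getinfo($ch, CURLINFO_STARTTRANSFER_TIME));
printf("Total:       %.3fs\n", curl_getinfo($ch, CURLINFO_TOTAL_TIME));
curl_close($ch);
```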

Tools like YSlow will point out some opportunities for optimization, but they don't actually measure performance and they don't look at how long it takes for things to happen.
Try something like WebPagetest, which will give you a browser's view of the page loading, and work through the waterfall to see where the time is going.
If you are seeing timeouts then it's probably a back-end problem (it will be pretty clear in the waterfall) and you're going to need to instrument your server to figure out where the time is going. If it's a dedicated server or VPS then you can install something like New Relic and it will point out the problem pretty quickly. If you are on shared hosting then you're going to have to add logging to your app directly (there are plugins that can do this if you are running something like WordPress); a minimal hand-rolled version is sketched below.
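As an illustration of that do-it-yourself logging (the log path is a placeholder and should live outside the public web root), something like this at the very top of a shared header or front controller records how long each request takes:

```php
<?php
// request-timing.php -- include this first so every request gets timed.
$requestStart = microtime(true);

register_shutdown_function(function () use ($requestStart) {
    $elapsedMs = (microtime(true) - $requestStart) * 1000;
    $line = sprintf("%s %s %.1fms\n",
        date('c'),
        $_SERVER['REQUEST_URI'] ?? 'cli',
        $elapsedMs);
    // error_log() with type 3 appends the line to the given file.
    error_log($line, 3, __DIR__ . '/../logs/request-times.log');
});
```

Sorting that log by the last column quickly shows which URLs are the slow ones.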

The first place to look would be the server logs; they should give you a clue as to what is happening and how much time a request is taking in general.
If the server is responding fine and the page is taking long because of client-side code, you might want to use the Firebug profiler to profile your page and find out more.
Hope this helps.

What kind of pages are you trying to load? Plain HTML, or scripts like PHP? If plain HTML, I'd guess it's your hosting company.

Related

Does Yandex Metrica Slow down the site speed?

I am using Yandex Metrika for my site Thegoldlive.com and facing a Core Web Vitals issue due to it. I believe it's the main reason my site is getting slow. Is there any way to get around this, or should I remove it from the site?
When I remove it from the site, the speed of my site gets better. But I don't want to remove it, because it is the best way for me to analyze visitors to the site. So that's why I'm asking: is there any way to keep both?
Running your site through PageSpeed Insights, it appears your issues are with loading time (TTFB, FCP, and LCP) and shifting content (CLS).
I'm not familiar with Yandex Metrika, but it seems unlikely an analytics solution will slow down these metrics. Mostly they affect responsiveness metrics like FID and INP.
I can't quite see the reason for slow TTFBs (it seems fast to me!), which will directly affect the other loading metrics. You seem to be using a CDN (Cloudflare) and the server response time in lab tests seems fast.
It could be that you just get a lot of visitors on slow networks/devices. If so, one thing that can help is ensuring pages are eligible for the back/forward cache, so visitors at least get a fast (instant!) load when going back and forward within the site. Testing your site for this shows it is using an unload handler, meaning you can't benefit from this performance gain. It looks like you are using Cloudflare's Rocket Loader - ironically something that's supposed to improve performance but that might be holding you back here. I'd turn that off.
Your layout shift issue (CLS) is much more obvious. You have an advertisement that pops in and out and pushes all the content down. You'd do better to reserve a block of white space for it to slot into, rather than have it dynamically inserted and moving the text around, which is an irritating experience for site visitors.

Hybrid App Development, Database-Driven Content

I've been doing a lot of research, and perhaps just need a few dots connected.
I have an idea for a mobile app/website that contains lists of local eating/drinking establishments along with the deals/specials they offer each day. The idea is to create an app that people can refer to in order to save money on a night out.
I'm familiar enough with HTML/CSS/JS to create a functioning website, but when it comes to backend I'm a little confused. Editing the markup in order to reflect changes (e.g. a new deal starts or new establishment opens up) is a bit cumbersome. Now I know I want a database with my information ready to be displayed on my page. Does this mean that I need to develop my own API for everything, and then make sure it integrates with the hosting website that I end up choosing?
I feel like I'm missing something that should make it obvious what the next step is. Can anyone offer any advice?
The short answer is yes, you are exactly right.
The long answer is that this is definitely one way to do it. But for large projects, using just plain JS can get quite cumbersome on the client end. Usually the first level up is something like Ajax. It's a great way to start and you can go a long way with just Ajax; this is actually where most people "start" when using just JavaScript to make API calls. The next level would be to use a framework like Angular. That will of course do more for you than just help handle API calls, and it requires a larger investment in learning.
So that is all client-side...
Now for the server-side part... When you publish a website you are dealing with "server-side" content. Your static content is served up from the server, but it is always the same static content; it only becomes dynamic on the client once all the JavaScript gets parsed and run.
The API is another server-side component. But instead of being static like your pages - a bunch of files just sitting there - it is an actual application on the server. It takes a command via an API request, does its thinking, and then spits out a response object dynamically to the requester, which in this case will be the JS on your site.
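As a rough illustration of what such an endpoint can look like - for example in PHP, which most shared hosts support - here is a minimal sketch that returns your deals as JSON. The table, column, and credential names are all hypothetical:

```php
<?php
// deals.php -- hypothetical minimal JSON endpoint for the deals data.
// The client-side JS would request e.g. /deals.php?day=friday and render the result.
header('Content-Type: application/json');

$day = $_GET['day'] ?? strtolower(date('l'));   // default to today's weekday name

// Connection details are placeholders; use your own host, database and credentials.
$pdo = new PDO('mysql:host=localhost;dbname=deals_db', 'db_user', 'db_password', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

$stmt = $pdo->prepare(
    'SELECT venue_name, deal_description, starts_at, ends_at
       FROM deals
      WHERE day_of_week = :day'
);
$stmt->execute(['day' => $day]);

echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));
```

Updating a deal then means updating a row in the database instead of editing markup by hand.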
Now, if you don't like the idea of learning to make your own API, there are services out there that will host an API for you and give you a GUI to build it. I can't recommend one because I have never used one, but I do work with businesses that do, and they love the fact that they don't have to hire a dev to make their APIs. The downside is that they are tied to that service and limited to the functionality the service offers. It's not a big limitation, as the services are quite powerful, but if you are going to be managing complex data sets then it would probably be better to learn to make your own API.
Hope that clears things up a bit for you!

Using another server to store files: Good or bad idea?

I am thinking of using another "less important" server to store files that our clients want to upload, and handling the data validation, copying, insertion, etc. at that end.
I would display the whole upload feature through an iframe on our website, using HTML, PHP, and SQL to build it.
Now I would like to ask your opinions on whether this is a good or bad idea.
I figure the pros and cons are:
Pros:
- The other server is "less" valuable, meaning if something malicious were uploaded there it would not be the end of the world
- Since the other server has fewer events/users/functions/data, it would lessen the stress on our main web server
- If the less important server goes down, the functionality on the main server would still be working
- The firewall prevents outside traffic (at least to a certain point)
- Users need to be logged in through the main website
Cons:
- It does not have a CMS and plugins, so it might be more vulnerable
- It might draw more malicious traffic towards it
- It makes upkeep of the main website that much more complicated for future developers
Generally I'm not fond of the idea of letting users upload files, but it is not up to me.
Thanks for your input. I'm looking forward to hearing your opinions.
Servers have file quotas and bandwidth defined/allocated for them.
If you transfer your "less used" files to another server, it will help improve your main server's performance.
There also won't be as many maintenance headaches with the main server if all uploads go to the other one.
Conclusion: it is a good idea.
Well, I guess most importantly, you will need a single sign-on (SSO) solution in place between the two web applications. I assume you don't want user A to be able to read or delete files belonging to user B.
SSO between two servers is a lot more complicated than for a single web application, unless this site is only deployed on an intranet with an Active Directory domain controller, in which case you can use Kerberos.
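As one very rough illustration of what "the two applications need to agree on who the user is" can mean in practice, the main site could hand the upload iframe a short-lived signed token that the file server verifies. This is a much lighter scheme than real SSO, and every name and value below is hypothetical:

```php
<?php
// Shared-secret upload token, sketch only. Both servers must know $sharedSecret.
$sharedSecret = 'replace-with-a-long-random-secret';

// Main site: called when building the iframe URL for a logged-in user.
function makeUploadToken(string $userId, string $sharedSecret): string {
    $payload = $userId . '|' . (time() + 300);            // valid for 5 minutes
    $sig     = hash_hmac('sha256', $payload, $sharedSecret);
    return base64_encode($payload) . '.' . $sig;
}

// Upload server: called before accepting a file; returns the user id or null.
function checkUploadToken(string $token, string $sharedSecret): ?string {
    $parts = explode('.', $token);
    if (count($parts) !== 2) {
        return null;
    }
    $payload = base64_decode($parts[0], true);
    if ($payload === false || !hash_equals(hash_hmac('sha256', $payload, $sharedSecret), $parts[1])) {
        return null;
    }
    [$userId, $expires] = explode('|', $payload);
    return time() < (int)$expires ? $userId : null;
}
```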
I'm not sure it's worth it just for the advantages you name.

I need to speed up my site and reduce the number of file calls

My web host is asking me to speed up my site and reduce the number of file calls.
OK, let me explain a little: my website is used about 95% as a bridge between my database (on the same hosting) and my Android applications (I have around 30 that need information from my DB). The information only goes one way (for now); the apps call a JSON string like this one on the site:
http://www.guiasitio.com/mantenimiento/applinks/prlinks.php
and this webpage, shown in a web view as a welcome message:
http://www.guiasitio.com/movilapp/test.php
This page has some images and jQuery, so I think these are the ones using a lot of memory. They have told me to use some code to create a cache of those files in the visitor's browser to save memory (that is all Greek to me since I don't understand it). Can someone give me an idea and point me to a tutorial on how to get this done? Can the WebView in an Android app keep a cache of these files?
All your help is highly appreciated. Thanks.
Using a CDN, or content delivery network, would be an easy solution if it worked well for you. Essentially you are off-loading the work of storing and serving static files (mainly images and CSS files) to another server. In addition to reducing the load on your current server, it will speed up your site because files will be served from a location closest to each site visitor.
There are many good CDN choices. Amazon CloudFront is one popular option, though in my opinion the prize for the easiest service to set up goes to CloudFlare... they offer a free plan; simply fill in the details, change the DNS settings on your domain to point to CloudFlare, and you will be up and running.
With some fine-tuning, you can expect to reduce the requests on your server by up to 80%.
I use both Amazon and CloudFlare, with good results. I have found that the main thing to be cautious of is to carefully check all the scripts on your site and make sure they are working as expected. CloudFlare has a simple setting where you can specify the cache settings as well, so there's another detail on your list covered.
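If you also want to handle the browser-caching part your host mentioned yourself, a minimal sketch for one of your PHP pages looks like this; the one-day lifetime is just an illustrative value, and an Android WebView respects these headers too:

```php
<?php
// Illustrative sketch: tell the browser (or WebView) it may reuse this response
// for one day instead of downloading it again. 86400 seconds is an example value.
$maxAge = 86400;

header('Cache-Control: public, max-age=' . $maxAge);
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $maxAge) . ' GMT');

// ... then build and output the JSON or HTML exactly as before ...
```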
Good luck!

What is a good benchmark test for load testing a production system running IIS?

I've noticed a few posts regarding stress testing IIS, but I'm more interested in knowing a good way to establish a benchmark of what my physical web server can handle.
I'm working on a production system right now that seems to be slowing to a near halt on a daily basis, and no one can seem to figure out what is causing the issue. This is your standard N-tier setup (client, web server, DB server). I have created some simple ASP.NET pages that do a few simple things, which I'm using to establish a benchmark for stress testing that I can later use when comparing with the production system that is already in place.
I have already built a sample aspx page that simply returns some web server statistics about the box utilization - nothing heavy, maybe a few lines of code. I've also created a simple web service (asmx) that will test a DB connection given the correct id; again nothing heavy, just opening a DB connection and a sample query that doesn't really do anything.
What I would like to know is: what is a good stress level that the following tests should operate under? In particular, I'm using Microsoft's Web Application Stress Tool, and I'm curious what most people would consider acceptable ranges for the Stress Level and the Stress Multiplier. I already know at what point the web server starts buckling; it's around 20 Stress Level and 10 Stress Multiplier, give or take, which results in about ~275 page requests / sec. I'm trying to find out if that number is way too low, and whether that's some sort of indication that there's an issue between the client and the web server or a hardware issue of some sort, or if it's more an indication that there's nothing wrong and the system is just too heavy.
UPDATE: Since I originally wrote this post the code has been moved to a server running Windows 2003 utilizing IIS 7.
What I'd like to know now is if I set up a relatively simple test, such as a basic index.html file that just serves up a small page, what would you expect the "best" page requests / sec I could achieve from a stress test?
I cannot speak to Microsoft's Web Application Stress Tool, but it should be possible to thrash a static HTML page hosted on IIS until either the network between the test machine and the server floods or the test machine runs out of CPU to create and run the tests. All IIS would do is send out the same cached response every time.
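If you just want a ballpark requests-per-second figure for such a static page without committing to a particular tool, any scripting language that can fire concurrent requests will do. A very rough sketch (the URL, concurrency, and batch count are made-up values, and this is no substitute for proper load modelling):

```php
<?php
// Very rough requests-per-second probe against a single static URL.
$url         = 'http://testserver.example/index.html'; // placeholder target
$concurrency = 20;   // simultaneous requests per batch
$batches     = 50;   // number of batches to run
$completed   = 0;
$start       = microtime(true);

for ($b = 0; $b < $batches; $b++) {
    $mh = curl_multi_init();
    $handles = [];
    for ($i = 0; $i < $concurrency; $i++) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[] = $ch;
    }
    do {  // drive every transfer in this batch to completion
        curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh, 0.1);
        }
    } while ($running > 0);
    foreach ($handles as $ch) {
        if (curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200) {
            $completed++;
        }
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
}

$elapsed = microtime(true) - $start;
printf("%d successful requests in %.1fs (~%.0f req/sec)\n",
       $completed, $elapsed, $completed / max($elapsed, 0.001));
```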
To get a good profile of how your site is working, you do need to build up a good representation of the actual traffic hitting the site. Once you have that modelled, you can begin scaling up the number of people using it to explore which pages are slowest and which server resources are first against the wall.
Either you will have one or more pages whose performance is unacceptable for a given load, or some server statistic, like disk usage or database performance, will look over-utilised.
Unless you are testing on a site that is hosted on the same architecture as your production site, it is hard to make architecture recommendations.
With underperforming pages it is easier to make improvements as the code is available to analyse.
I recommend using Visual Studio Ultimate edition to load test if you can get it. It also gives some ASP.NET profiling tools to get to grips with sections of code and database calls.
