When I publish or restart my web app, it loads very slowly the first time; then when I refresh with F5 it is fine again. What could it be?
This is expected with Windows Azure Websites. Windows Azure Websites run in a shared pool of resources and use the concept of hot (active) and cold (inactive) sites: if a site has no active connections for a certain amount of time, it goes into a cold state, meaning the host IIS process exits. When a new connection is then made to that site, it takes a few seconds to get the site ready and serving again. Depending on what your first page does, the time to load the site for the first time varies.
IIS also takes a while to spin up after you upload new files to the app container, and the Application Initialization module and a deployment slot swap take some time as well.
So the first page hit after you've updated the app will be slower. Azure Web Apps also get dehydrated after a period of inactivity, which likewise makes the first page hit very slow if the site hasn't been accessed in a while.
To combat this, in the Application Settings for the web app there is a setting called Always On, which basically pings your site every couple of minutes to keep the app hydrated and responsive.
For more details, you could refer to this blog.
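If your plan's tier doesn't offer Always On, a common workaround is to keep the site warm with a periodic request. Below is a minimal sketch assuming an ASP.NET Core app; the KeepAliveService name and the hard-coded URL are hypothetical, and an external uptime monitor pinging the site achieves the same effect.

    using System;
    using System.Net.Http;
    using System.Threading;
    using System.Threading.Tasks;
    using Microsoft.Extensions.Hosting;

    // Hypothetical background service that pings the site's public URL every few
    // minutes so the worker process never hits the idle timeout and gets unloaded.
    public class KeepAliveService : BackgroundService
    {
        private static readonly HttpClient Http = new HttpClient();
        // Assumption: replace with your app's public hostname.
        private readonly Uri _siteUrl = new Uri("https://your-app.azurewebsites.net/");

        protected override async Task ExecuteAsync(CancellationToken stoppingToken)
        {
            while (!stoppingToken.IsCancellationRequested)
            {
                try
                {
                    // Any lightweight request counts as traffic and resets the idle timer.
                    await Http.GetAsync(_siteUrl, stoppingToken);
                }
                catch (HttpRequestException)
                {
                    // Ignore transient failures; the next ping will try again.
                }
                await Task.Delay(TimeSpan.FromMinutes(5), stoppingToken);
            }
        }
    }

    // Registration, e.g. in Startup.ConfigureServices or Program.cs:
    // services.AddHostedService<KeepAliveService>();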
As juunas said, you could also use Razor view pre-compilation to speed up the initial loads. Otherwise the app must compile the views at run-time when they are first rendered.
I have an ASP.NET Core 3.1 Blazor Server-Side Application.
Recently a user complained that it does not work with the Tor Browser.
So I tried it out and found that it only works in roughly 50% of cases.
Every time I tried, I requested a new circuit for the request.
When it worked, everything was fine and even the speed was not that bad.
But when it doesn't, the tab first loads the page source and shows the favicon, and then nothing happens for a while.
Sometimes, if you are patient enough, it looks like the first render happens, but the subsequent renders for all the async operations never do.
Is there a difference between Tor nodes that prevents Blazor pages from working properly?
Tor Browser blocks JavaScript at its stricter security levels (via NoScript), and Blazor Server relies on JavaScript and SignalR to communicate with the server, so when scripts are blocked the app can't establish its connection. Disable NoScript (or lower the security level) for the site.
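Separately, if a particular circuit or exit node interferes with WebSockets rather than JavaScript, you can let the SignalR connection behind Blazor Server fall back to long polling. A minimal sketch, assuming ASP.NET Core 3.1 endpoint routing in Startup.Configure (the /_Host fallback page is whatever your app already uses):

    using Microsoft.AspNetCore.Http.Connections;

    // In Startup.Configure:
    app.UseEndpoints(endpoints =>
    {
        // Let the Blazor Server circuit fall back to long polling when
        // WebSockets can't be negotiated end to end.
        endpoints.MapBlazorHub(options =>
        {
            options.Transports =
                HttpTransportType.WebSockets | HttpTransportType.LongPolling;
        });
        endpoints.MapFallbackToPage("/_Host");
    });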
Azure Windows Web App with instances in Amsterdam, US and Hong Kong.
Need to optimize for browser cached files - so that once a user has visited a page in the app and downloaded whatever js, css, fonts or images - the browser caches these locally.
Then, only when a cached file is updated, should the browser break its cache and make an HTTPS call to get the newer file version.
Thus minimizing latency, bandwidth and # of http requests from the user browser standpoint.
We typically have 90-96 PageSpeed scores, so the pages are optimized, but it seems way slower than it could be, perhaps because these unchanging js, css, font and image files are being re-downloaded unnecessarily?
Once a page is visited once during a session it's wicked fast for the rest of the session.
But - on login the next day there is latency again for first render of each page - then it's wicked fast again.
Kinda seems like a TTL expiry setting or something like an old school IIS ETag HTTP response header type of thing?
ASP.NET C# 4.7 Windows Web App
Any geniuses know how to optimize for this sort of thing?
It's not a caching issue; it's more likely that your application is being unloaded after sitting idle for too long.
To overcome this, make sure your App Service Plan is on a tier that supports Always On (Basic or higher) and enable the "Always On" option for your site. To do this:
Azure Portal -> Your Web App -> Application Settings -> Always On -> Enabled
You can scale up your App Service Plan without any downtime if it's currently at the free or shared tier:
Azure Portal -> App Service Plan -> Scale Up
We recently migrated from a SQL Server 2008 SSRS server to a new SQL Server 2016. The entire report catalog was restored and upgraded to this new server. Everything is working, except the horrible performance of the web portal.
The performance while connecting to the web portal from a domain joined computer seems decent, but connecting to the web portal over the internet is seriously frustrating. Even simply trying to browse a directory of reports is a wait of several seconds, and running any given report is similarly slow. It's slow in IE11, Edge, Chrome, Safari, you name it. Like 25+ seconds to go from login to viewing the Home directory.
We are using NETWORK SERVICE as the service account, with NTLM authentication. We aren't getting any permission errors in the UI or the logs. However, in the Edge developer tools I notice several 401 Unauthorized HTTPS GET requests for things like reports/assets/css/app-x-x-x-bundle.min.css, so perhaps there is a permissions issue somewhere? Another interesting item in the developer tools is that requests like reports/api/v1.0/CatalogItemByPath(path=#path)?#path= (JSON responses) take around 10 seconds.
Certain reports where one parameter depends on another parameter's selection will sometimes not work: the waiting icon spins, and when the report returns, the selection is not kept and the second parameter is not filled. Sometimes it works, however, which is part of what makes this so maddening.
There are no explicit error messages, but something is getting bogged down in a major way. There are no resource issues on this server that we can see; it has plenty of headroom in terms of RAM and CPU.
This is not a report optimization problem; the entire UI is slow for everything.
I am launching a new redesigned website on windows hosting. I am wondering what is the best way to launch this new website without having any downtime on the existing one?
My only fear is that a user will visit a page and it's not there, or the supporting files are not uploaded yet.
One of the simplest ways to handle this is to put a load balancer or proxy server in front of the application server. Then set up another application server with the new code. Once it is ready, you change the proxy server to point to the new application server, and once you are sure nobody is using the old application server, you can shut it down.

This, of course, relies on your ability to get that setup in place. If you are on a budget, you might be able to do it all on a single box. For instance, you could use nginx as a reverse proxy to your application on the same box. Getting that in place could cause a tiny window of downtime, and I'm not sure if that's acceptable. Then you might be able to set up the new application on the same box on a different port; again, I'm not sure if that would work for your setup.

Anyway, the reverse proxy approach is a pretty common one, and it's one of the great reasons for deploying to the cloud: you only pay for the short period of time when you need both boxes.
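To illustrate the same idea on a single Windows/ASP.NET box, here is a minimal sketch of such a proxy using the YARP reverse-proxy library instead of nginx (my substitution, not something the setup above requires); the ports and names are hypothetical, and it assumes the Yarp.ReverseProxy NuGet package on .NET 6 or later. Cutting over to the new site is then just a matter of changing the destination address and restarting the proxy.

    using System.Collections.Generic;
    using Microsoft.AspNetCore.Builder;
    using Microsoft.Extensions.DependencyInjection;
    using Yarp.ReverseProxy.Configuration;

    var builder = WebApplication.CreateBuilder(args);

    // Route all traffic to the "live" cluster; point the destination at the
    // new application's port when it is ready, then retire the old one.
    var routes = new[]
    {
        new RouteConfig
        {
            RouteId = "all-traffic",
            ClusterId = "live",
            Match = new RouteMatch { Path = "{**catch-all}" }
        }
    };
    var clusters = new[]
    {
        new ClusterConfig
        {
            ClusterId = "live",
            Destinations = new Dictionary<string, DestinationConfig>
            {
                // Old site today; change to e.g. http://localhost:8082/ for the new site.
                ["app"] = new DestinationConfig { Address = "http://localhost:8081/" }
            }
        }
    };

    builder.Services.AddReverseProxy().LoadFromMemory(routes, clusters);

    var app = builder.Build();
    app.MapReverseProxy();
    app.Run();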
You should make sure that your new website launches all at once and that you set up the proper redirection rules for all previous pages. When you launch the new website, pick a time at night when you have low traffic, and simply upload all the new code at once to the web server. This eliminates your fear of the "supporting files not uploaded yet". One of the key things to do is to make sure all your old pages redirect and map over to new pages on the site, in case anyone reaches your site through external links.
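For the redirect mapping, here is a minimal sketch assuming an ASP.NET Core app with the rewrite middleware (the old and new paths are made-up placeholders); on classic ASP.NET/IIS hosting, the IIS URL Rewrite module does the same job in web.config.

    using Microsoft.AspNetCore.Rewrite;

    // In Startup.Configure / Program.cs, before the static file and routing middleware:
    var redirects = new RewriteOptions()
        // Permanently (301) redirect old URLs to their new equivalents so
        // external links and bookmarks keep working.
        .AddRedirect("^old-products/(.*)$", "products/$1", 301)
        .AddRedirect("^about-us\\.html$", "about", 301);

    app.UseRewriter(redirects);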
Two good resources to read:
http://www.rise.net/blog/ideal-way-launch-website-rebrand
http://googlewebmastercentral.blogspot.com/2008/04/best-practices-when-moving-your-site.html
Another method is to upload the new site via FTP to a separate folder, and then, if you have RDP access, log in to the Windows server and copy the files over locally, which only takes a few seconds. This way you avoid the downtime you would have if you uploaded the site directly over the live files via FTP.
I took a copy of a website that works fine on its original server (I don't have access to that server), but when I set up a website in IIS7 and run the classic ASP code, loading a page takes nearly a minute. The pages always load eventually; it just takes forever.
Any help or suggestions would be great...
If you moved the code to another server elsewhere on the internet and it is still pointing to the same database, then the queries could take a lot longer than they would if the DB were on the same network as the web server. If this is the problem, you'll definitely want to take a backup of the database and restore it closer to the web server.