Azure website: each page loads slowly

I've hosted a website on Azure and right now I'm deploying to a test slot. I have a problem after each deployment, and whenever the pages haven't been accessed in a long time: each page takes about 15 seconds to load, and if accessed again it loads in normal time. I changed the setting in the portal to keep the site always alive, but that doesn't work as expected. I also tried pinging the website every few minutes, but that didn't work either.
I thought the application was built only once as a whole, but it looks like every page is built on the fly. I'm not sure how to fix that, or if it's even what I think it is.

The pages are built and cached the first time you access each given page (not once for the whole site) after a deployment. If you have public pages, you can also look at the Application Initialization options in web.config to do the warm-up for you.
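As a rough sketch (the initializationPage values below are placeholders for your own URLs, and on Azure the "Always On" setting should also be enabled so the warm-up has a lasting effect), an Application Initialization section in web.config looks something like this:

<configuration>
  <system.webServer>
    <!-- Uses the IIS Application Initialization feature (IIS 8+, supported on Azure App Service) -->
    <applicationInitialization doAppInitAfterRestart="true">
      <!-- Each page listed here is requested automatically after a restart or deployment,
           so the first real visitor does not pay the warm-up cost. Replace with your own pages. -->
      <add initializationPage="/" />
      <add initializationPage="/home" />
    </applicationInitialization>
  </system.webServer>
</configuration>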

Related

IIS 10 Static Website: Deleting default site and creating completely new site (how to access new site)

This post needs help from experienced IIS administrators, but the answers must be explained in detail for EXTREME newbies.
What I am doing:
I have two computers, both running Windows 10. One is a desktop and one is a laptop.
IIS is enabled on both computers. Each computer can access the IIS web server on the other and pull up a page from it, using the IP address.
No DNS or hosts files are being used (this is by IP address only), nor do I want to use any sort of naming.
Both computers are running an identical website, and the website files are in a different directory than the default. The structure is like this:
C:\inetpub\ROOT\myWebsite\
    myIndex.html
    web.config
Changes I've made - now a few problems.
On both computers I have deleted the DefaultAppPool and the default website that comes installed with IIS. This has not stopped the website from working, so adding those back seems unlikely to fix my problem.
I have deleted my application pool and website from IIS (never deleting the actual files from the file system) several times, and added them back several times. Each time I do this, my site comes back, but with the same problem I am having.
I have deleted all of the default documents, and the only default document listed in IIS is myIndex.html.
myIndex.html initially displays a graphic image (using the standard <img> tag), and this image comes up. Sort of. See the explanation below.
The problem I am having
Before I started this project, I had IIS working on the desktop with the default site and app pool, and I simply added some of my own files with really simple text content and some pics. I had replaced the default IIS splash image with my own image, and all of that worked with no problem.
The image that comes up is a link to another page that has a list of links to other stuff in my website. It all works with no problem there.
Now, with the setup I currently have, on the desktop I was originally using (in the paragraph above), if I pull up my website locally, myIndex.html loads in the browser, my image comes up, and everything works fine.
The same is true on the laptop when I access the site locally.
However, if I attempt to access the desktop site (using its IP address) from the laptop, it pulls up the old splash image from the default site I deleted. (I left those files there even though I deleted the site from within IIS.) All those files are in the default location C:\inetpub\wwwroot.
If I move those files to another directory, leaving C:\inetpub\wwwroot completely empty, then when I access the site on the desktop (via the IP address) from the laptop, my new site comes up without a problem.
While it seems I may have solved my problem by moving the files from the previous project, doing that does not teach me how IIS actually works, or why files from a website that no longer exists in IIS are still being served to remote computers.
So, please teach me something about the internal workings of IIS, and how it chooses between the different application pools and websites.
Again, please word your answers for complete newbies, because I know a little but not enough to get really technical.
I have been reading posts on stackexchange.com and other sites, plus links to Microsoft docs, etc. That's not helping, because those docs expect too much prerequisite knowledge and speak in terms that don't really explain things in a way I can understand.
You have described several different problems. I will try to address each of them (contrary to S/O recommendations).
First, when you make changes, and they don't seem to show up, it is usually because of caching. IIS always wants to cache files/configs. So does your web browser. So, to force an accurate test, you need to dump your browser cache and cycle IIS (to make sure it drops its cache and loads new files and configs). Start there.
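If it helps while you are diagnosing, a web.config sketch like the following tells IIS to send no-cache headers for static files, so the browser stops serving stale copies during your tests (this is only a suggestion for testing, not a permanent setting; remove it afterwards):

<configuration>
  <system.webServer>
    <staticContent>
      <!-- Ask clients not to cache static files while you are testing changes -->
      <clientCache cacheControlMode="DisableCache" />
    </staticContent>
  </system.webServer>
</configuration>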
Second, IIS is designed around settings inheritance, which means each app and each folder will inherit settings and permissions from its parent unless you override them. Overriding can be done with files and/or IIS configs (application vs. folder), and the IIS configs are the stronger of the two.
Also, the IIS config for "default files" might have come into play for your test. If you didn't set up myIndex.html as the top-most default file, then IIS would look for other files first. In fact, if you don't have myIndex.html in the list of default files at all, IIS would have to depend on your app to choose it as the default page (MVC routing, etc.).
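For example, a minimal web.config sketch that makes myIndex.html the only (and therefore top-most) default document might look like this, assuming the file sits in your site root next to web.config:

<configuration>
  <system.webServer>
    <defaultDocument enabled="true">
      <files>
        <!-- Drop the inherited defaults (Default.htm, iisstart.htm, etc.) -->
        <clear />
        <!-- Serve myIndex.html whenever a folder URL is requested -->
        <add value="myIndex.html" />
      </files>
    </defaultDocument>
  </system.webServer>
</configuration>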

I have copied files via FileZilla to Azure but the index page doesn't show

I shall try not to be subjective so as not to be closed.
This is my first foray into Azure, and it really is NOT like my dedicated server at another hosting company. Suffice it to say, what takes me minutes elsewhere, deploying a site via FTP and then setting it up in IIS, has taken me WEEKS!
I don't want to set up any of the "pre-packaged" Quickstart solutions. I simply want my INDEX.HTML file to DISPLAY.
I copied all the files via FileZilla to Azure quite easily, and yet when I go to the URL, I keep getting:
Your App Service app has been created
Go to your app's Quick Start guide in the Azure portal to get started or read our deployment documentation.
Everything is set up on Azure perfectly.
Here's what it looks like under the appSettings Tab:
Virtual applications and directories
Virtual path    Physical path               Type
/               site\wwwroot                Application …
/wwwroot        site\wwwroot\mynewsite      Application …
The directory site\wwwroot\mynewsite has an index.html, but it will not display when I type in the URL.
I already built the site and the company I'm working for wants it on AZURE.
A dedicated server takes under 15 min. This has taken weeks.
UPDATE:
Thiago, thank you... so here's the file structure, which reveals EXACTLY what my directory looks like: under /thingblugrow is where "the fake name" mynewsite exists. I thought it'd be easier to just show you what I really have.
So, /thingblugrow has an index.html file....
If you want to visit http://yoursitename.azurewebsites.net/mynewsite/index.html, the virtual applications and directories setting needs to be configured as follows in your case:
Virtual applications and directories
/               site\wwwroot                Application …
/mynewsite      site\wwwroot\thingblugrow   Application …
You can also refer to another SO thread for more info about creating virtual applications and directories.
You're adding an extra level, so in your case you'll be able to see the index through:
http://yoursitename.azurewebsites.net/mynewsite/index.html
Or just move all the content from the "mynewsite" folder up to the parent directory (wwwroot).

SharePoint website opens very slowly after changing server location?

I am new to SharePoint. I made a static site and hosted it on a server, and it was working fine. For some reason I had to change my server location, and now the site is responding very slowly. I have not found the root cause; there are 5 more SharePoint sites on the server and they are all working very slowly as well. Maybe there is some configuration problem. Please help me fix it.
Sounds like IIS is rebuilding its cache. This happens whenever the server is rebooted, or SharePoint is loaded on a new/different web server. Typically the speed increases after a few minutes as everything gets cached.
What #marten said is correct: it takes time for the distributed cache service to kick in on a new server. Also check whether any app pool/IIS recycling schedule is applied on the new server.

Deployment race condition causing CDN to cache old or broken files

Our current deploy process goes something like this:
1. Use grunt to create production assets.
2. Create a datestamp and point files at our CDN (e.g. /scripts/20140324142354/app.min.js). Side note: I've heard this process called "versioning" before, but I'm not sure if that's the proper term.
3. Commit the build to github.
4. Run git pull on the web servers to retrieve the new code from github.
This is a node.js site and we are using forever -w to watch for file changes and update the site accordingly.
We have a route setup in our app to serve the latest version of the app via /scripts/*/app.min.js.
The reason we version like this is because our CDN is set to cache JavaScript files indefinitely and this purposely creates a cache miss so that the code is updated on the CDN (and also in our users' browsers).
This works fine most of the time. But where it breaks down is if one of the servers lags a bit in checking out the new code.
Sometimes a client hits the page while a deploy is in progress and tries to retrieve the new JavaScript code from the CDN. The CDN tries to retrieve it but hits a server that isn't finished checking out the new code yet and caches an old or partially downloaded file causing all sorts of problems.
This problem is exacerbated by the fact that our CDN has many edge locations and so the problem isn't always immediately visible to us from our office. Some edge locations may have pulled down old/bad code while others may have pulled down new/good code.
Is there a better way to do these deployments that will avoid this issue?
As a general rule of thumb:
Don't do live upgrades (unless the language supports it, but even then think twice).
Pulling code using git pull and then waiting for the app to notice changes to files sounds a lot like the 90's: uploading php files to an apache web server using ftp (or sftp if you are cool) and waiting for apache to notice that they were updated. It can't happen atomically, so of course there is a race condition. Some users WILL get a half built and broken site.
I recommend only upgrading your live and running application while no one is using it. Hopefully you have a pool of servers behind a load balancer of some sort, which will allow you to remove them one at a time and upgrade them.
This will mean that users will be able to use both the old and the new site at the same time, depending on how and when they access it, but that is much better than not being able to access it at all.
Ideally you would be able to spin up copies of each of the web servers you have running, with the new version of the site, check that the new version works, and then atomically update the load balancer so that everyone gets bumped to the new site at the same time. Only once everything is verified to be working perfectly are the old machines shut down and decommissioned, or reused.
Step 4 in your procedure should be something like:
# export a clean copy of the new code into a fresh timestamped directory
git archive --remote $yourgithubrepo --prefix=$timestamp/ | tar -xf -
stop-server
# repoint the "current" symlink at the new build (-n so the existing link is replaced rather than followed)
ln -sfn $timestamp current
start-server
Your server would use the current directory (well, a symlink) at all times. No matter how long the deploy takes, your application is in a consistent state.
I'll go ahead and post our far-from-ideal monkey-patch that we're using right now.
We deploy once, which may or may not go as planned. Once we're sure the code is deployed on all the servers, we do another build where the only thing that changes is the version number.
Then we deploy again server by server.
The race condition still exists, but because the application code between the two versions is the same, this masks the issue: no matter which server the CDN hits, it gets the "latest" code.

Azure website node process lifecycle

I've found out that Azure Websites (trial version) doesn't autostart my node server process (it starts only when I load the URL in the web browser), and that when there are no requests for a while, the process is killed.
I mean, when I git push my server, I would like it to start running immediately and keep running continuously.
I read (here, for example) that this might have to do with the way iisnode manages Azure websites, and that I can't do anything to change it. Is this actually the way Azure websites work? Is there any way I can deal with this?
Thanks in advance,
Bruno.
You've found the answer. There is no other answer.
The process termination because of inactivity comes from IIS: there is an Idle Timeout setting, which to my knowledge is not configurable in Azure Web Sites (at least not on the Free tier). Check out also this SO question and its answer to get a better understanding of why you can't change this timeout on the FREE and STANDARD tiers.
And here is an interesting workaround to avoid the idle timeout. Actually, if you use that technique, you also get a kind of "auto start", in the sense that when your scheduler hits your site after a new deployment, it will boot up.
This can get a little complicated, but if you don't want to use their 5-min ping service, you can keep these always on by doing the following:
Create an app setting on your website configuration tab within the portal:
WEBSITE_PRIVATE_EXTENSIONS and give it a value of 1
Create a text file named applicationhost.xdt and populate it with:
<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <system.applicationHost>
    <applicationPools>
      <add name="DefaultAppPool" managedRuntimeVersion="v4.5" startMode="AlwaysRunning">
        <processModel identityType="ApplicationPoolIdentity" />
      </add>
    </applicationPools>
  </system.applicationHost>
</configuration>
FTP into your website and create a folder in the root directory called SiteExtensions. (There should now be 3 folders in your root: LogFiles, site, & SiteExtensions.)
Create another folder within SiteExtensions named ASPLimits.
Upload the applicationhost.xdt file into the ASPLimits folder.
Restart your website using the portal
