Custom logs in Azure website file system combined into single log file

My Azure web app (App Service) writes a log file, mywebapp.log, to the d:\LogFiles directory of the VM that hosts the website. When the log file reaches a certain size I rename it to mywebapp1.log, mywebapp2.log, and so on, and a new log file is created. (I do this manually: stop the website, rename the file, and restart the site.)
One day I inspected the directory through the Kudu (SCM) portal and saw just a lone mywebapp.log that was much larger than normal. The file included all of the individual logs that previously existed (the contents of mywebapp1.log + mywebapp2.log and so on).
My app has no logic which combines the files. Is there an Azure process that does this, or did I do it in my sleep and have no recollection?

There really is no logic in Azure that would do this. Azure knows nothing about your log files, and would not be doing anything with them, especially something as complex as combining several existing files into one.
So I'll go with the sleep theory on this one :)

The problem was that I had swapped deployment slots at some point and failed to realize that the d:\LogFiles directory (the entire d: drive, I believe) travels with the slot. The missing log files were sitting in my staging slot's LogFiles directory.
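For anyone who hits the same confusion: each deployment slot has its own SCM (Kudu) site, so you can inspect a slot's d:\home\LogFiles without swapping. A minimal PowerShell sketch using the Kudu VFS API, assuming an app named mywebapp with a slot named staging and placeholder deployment credentials:

# Placeholder deployment credentials for the staging slot's SCM site.
$user = 'deployUser'
$pass = 'deployPassword'
$token = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("${user}:${pass}"))
$headers = @{ Authorization = "Basic $token" }
# Each slot's SCM site exposes that slot's own d:\home\LogFiles.
Invoke-RestMethod -Uri 'https://mywebapp-staging.scm.azurewebsites.net/api/vfs/LogFiles/' -Headers $headers |
    Select-Object name, size, mtime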

Related

Azure Windows App Service files not available across nodes

Our application has the ability to request generation of a file that is then downloaded by the client. We are seeing issues where, when our app service has more than one node, the generated files are not available across the other nodes.
E.g.:
POST request to generate a file and save it to d:\home\site\wwwroot\app_data is sent by the user from machine-1 and is successful.
GET request from the user to download this file is received by machine-2; this fails because the file cannot be found.
My reading of the Microsoft docs is that anything in d:\home is backed by Azure Storage and is not local to the machine: https://learn.microsoft.com/en-us/azure/app-service/operating-system-functionality#file-access
File access across multiple instances: The home directory contains an app's content, and application code can write to it. If an app runs on multiple instances, the home directory is shared among all instances so that all instances see the same directory. So, for example, if an app saves uploaded files to the home directory, those files are immediately available to all instances.
But this doesn't seem to be happening. Is there something else that needs configuring?
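One configuration worth ruling out (an assumption on my part, not a confirmed diagnosis): App Service Local Cache, which gives each instance its own local copy of d:\home and would produce exactly this symptom. A hedged check with the Azure CLI, with placeholder app and resource group names:

# If this returns WEBSITE_LOCAL_CACHE_OPTION = Always, each instance works from
# its own copy of d:\home and new files are not shared across nodes.
az webapp config appsettings list --name mywebapp --resource-group mygroup --query "[?name=='WEBSITE_LOCAL_CACHE_OPTION']" --output table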

Web App for Containers (Linux) version of app_offline.htm

Occasionally, there are times when a system needs to undergo maintenance for a short time. Standard Web Apps handle this by redirecting all traffic to an app_offline.htm if the file exists in the root directory (wwwroot). What is the equivalent for a Linux Web App for Containers instance?
I tried using Kudu's Bash terminal to echo minimal HTML contents into an app_offline.htm, but it isn't working.
One thing I was looking into would be having a specific container image that is for maintenance, but that doesn't seem very elegant.
Eventually, I would like to be able to automate this via Azure DevOps.
Are you able to create an app setting with the name SCM_CREATE_APP_OFFLINE and a value of 1 to see if this allows the creation of an app_offline.htm file?
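If you want to try that suggestion from a script or an Azure DevOps pipeline, a minimal sketch using the Azure CLI (the app and resource group names are placeholders):

# Add the suggested setting; note that changing app settings restarts the app.
az webapp config appsettings set --name mylinuxapp --resource-group mygroup --settings SCM_CREATE_APP_OFFLINE=1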

What files are relevant to Azure Web App?

I'm deploying files to an Azure Web App via Octopus Deploy, and want to clean out the Azure Web App directories before deploying new versions. This way I can be sure I'm deploying each app version onto a clean slate. I don't want to entirely delete and re-create the app, because there are some app settings that need to carry over from previous deployments.
Kudu documentation lists the web app file structure here (all under D:\home), but I'm wondering if there's any possibility of other files outside of the D:\home directory that could affect app performance.
I tried running Get-ChildItem D:\ -Recurse in the Kudu PowerShell console before and after deployment to compare results, and found 268 new files (not counting those in wwwroot) after deployment, all within these directories:
D:\Windows\Temp
D:\Windows\Logs
D:\Windows\security\logs
D:\Users\\AppData\Roaming\Microsoft\SystemCertificates
D:\home\LogFiles
D:\home\Microsoft\Windows\PowerShell
D:\home\data\aspnet\CompilationSnapshots
D:\local\VirtualDirectory0\LogFiles
D:\local\VirtualDirectory0\data
D:\local\VirtualDirectory0\site\wwwroot
D:\local\VirtualDirectory0\Microsoft\Windows\PowerShell
D:\local\Config\
D:\local\Temporary ASP.NET Files\msdeploy
So which files do I need to clear or reset in order to ensure that new versions of the web app run as intended? Is it sufficient to clear out the wwwroot directory?
The only writable folders are d:\home and d:\local. But d:\local is temporary, and gets wiped clean on app restart. So effectively, you should only be concerned about d:\home when it comes to deployment.
Within that, wwwroot is typically the most important, though if you set up virtual directories and applications, you can end up with other folders as part of your app.
See also https://github.com/projectkudu/kudu/wiki/Understanding-the-Azure-App-Service-file-system which has related info.
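So a pre-deployment clean step only needs to touch wwwroot, plus any extra folders you have mapped as virtual directories. A minimal sketch you could run in the Kudu PowerShell console (or have Octopus run through Kudu), assuming the app has no content outside wwwroot:

# Wipe the deployed content; app settings live outside the file system and are unaffected.
Remove-Item -Path 'D:\home\site\wwwroot\*' -Recurse -Force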

How Can I Update My Web Site Automatically Every Day?

The tasks I do manually for updating my web site:
Stop IIS 7
Copy source files from a folder to the virtual directory of my web site
Start IIS 7
There are many ways to approach this, but here is one way.
I am assuming you don't want every single file in your source repository to exist on your destination server. The best way to reliably extract what you need from your source on a regular basis is through a build file. Two options for achieving this are NAnt and MSBuild.
Once you have the set of files you want to deploy, you need a way to distribute them to your destination server and to stop and start IIS. Again, there are options, but I would personally recommend PowerShell (with the IIS snap-in) for this.
If you want this to happen regularly, consider a batch file executed by some timer, such as a scheduled task, or even better, a CI solution such as TeamCity.
For a full rundown, there are examples within my PowerUp project that do this.
It depends where you are updating from, but you could have your virtual directory pointing to a local read-only working copy of your source code and create a task that runs every day, updating that working copy via a batch file, PowerShell script, etc. (an svn update, git pull, and so on).
That supposes you have a branch that always contains the latest releasable code.
You have to create a batch file with the following content:
Stop WWW publishing service
Delete your old files
Copy the new files
Start WWW publishing service
You can start/stop services like this:
net stop "World Wide Web Publishing Service"
net start "World Wide Web Publishing Service"
When you have your batch file you can create a task in the Task Scheduler that calls your batch in a regular time interval (e.g. each day).
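Putting those steps together, a minimal PowerShell sketch of such a script (the paths are placeholders; W3SVC is the service name behind "World Wide Web Publishing Service"):

# Stop IIS, swap in the new files, start IIS again.
Stop-Service -Name W3SVC
Remove-Item -Path 'C:\inetpub\wwwroot\mysite\*' -Recurse -Force
Copy-Item -Path 'D:\builds\mysite\*' -Destination 'C:\inetpub\wwwroot\mysite\' -Recurse -Force
Start-Service -Name W3SVC

You could then register it with the Task Scheduler along these lines:

schtasks /Create /SC DAILY /ST 03:00 /TN "UpdateMySite" /TR "powershell.exe -ExecutionPolicy Bypass -File C:\scripts\update-site.ps1"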

How to use IIS app_offline.htm file with Azure

I have a brilliantly designed app_offline.htm file that I'd like to display on my site periodically when I'm doing things like backing up the DB. On a server with a real file system, this wouldn't be a problem: I'd just copy app_offline.htm to my app's root, and IIS would work its magic and redirect all requests to this file.
However, I'm using Azure, so there's no real file system and there's no easy way to move files around from one location to another.
How can I make app_offline.htm play nicely with Azure?
I figured I'd add this, since I haven't seen it mentioned yet. You can actually do this via web publish from Visual Studio (or WebMatrix) as well: just put app_offline.htm in the root of your project, at the same level as your main web.config. When done, just rename it and redeploy to go back online. Two clicks, easy.
The manual option is to drop it into your /site/wwwroot via FTP.
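If the site is an Azure Web App (the /site/wwwroot case above), the same drop-and-remove can be scripted against the Kudu VFS API; a hedged sketch with a placeholder app name and deployment credentials:

# Placeholder deployment credentials; If-Match is required for overwrites and deletes.
$token = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes('deployUser:deployPassword'))
$headers = @{ Authorization = "Basic $token"; 'If-Match' = '*' }
# Take the site offline by uploading the file...
Invoke-RestMethod -Uri 'https://mywebapp.scm.azurewebsites.net/api/vfs/site/wwwroot/app_offline.htm' -Method Put -InFile '.\app_offline.htm' -Headers $headers
# ...and bring it back online by deleting it.
Invoke-RestMethod -Uri 'https://mywebapp.scm.azurewebsites.net/api/vfs/site/wwwroot/app_offline.htm' -Method Delete -Headers $headers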
A little personal secret: none of your site files will be accessible (style sheets, etc.), so put your includes into an Azure blob container, and voilà.
Actually, there is a real file system, as each VM instance runs on Windows Server 2008 (SP2 or R2 SP1). To see this for yourself, enable Remote Desktop for your deployment and connect to a running instance.
Knowing this, you should be able to set up a mechanism to perform a file-copy of your app_offline.htm to your app root based on some type of administrative command. You'll just need to make sure each of your web role instances perform this action.
David has provided you with a good answer. However, you might be missing out on what Azure can do for you. You should be able to virtually eliminate downtime with Azure by running multiple instances and using SQL Azure, which is triple-replicated for you. You can also back up SQL Azure using http://msdn.microsoft.com/en-us/library/ff951624.aspx
