How to make an IIS site/content backup?

I want to learn ASP.NET, so I've been reading some basic tutorials on managing the IIS web server. I'm wondering how I could make a full backup of my site (configuration and content). I'm running the IIS server on a Hyper-V Windows Server 2012 R2 Core machine and administering it over PowerShell Remoting.
On the Internet, I found an article about some basic stuff (see here).
The article said I can make a full backup of my IIS configuration and content with
Backup-WebConfiguration -Name "My Backup"
and restore it with
Restore-WebConfiguration -Name "My Backup"
The problem is: it seems it really only backs up the configuration, not the content. For example, it restores the websites under IIS:\Sites, but not the physical files, such as an application folder and a default.htm inside them. If I delete the default.htm and the folders and then use Restore-WebConfiguration, they are not restored; only the web configuration itself is.
From the article, I had guessed it would also restore the content.
Did I do something wrong? How can I do what I want "from scratch", without scripts from MS Web Deploy 3.0?
Thanks for the help, and best regards.

Backup-WebConfiguration only backs up the configuration items stored in the applicationHost.config file. It does not deal with the actual content, just with how that content is handled by IIS.
Doing both is easy enough; here's a quick function that creates a zip file (just pass in the path to your inetpub directory) and backs up the configuration. (This requires PowerShell v3 or higher.) The backup automatically gets a creation date (you can see a list of your backups with Get-WebConfigurationBackup), so the function adds the date to the zip file's name as well so the two can be matched up.
If you're making more than one backup on the same day, you'll need to tweak the name of the compressed file, as it only has the date in its file name.
function Backup-WebServer
{
    [CmdletBinding()]
    Param
    (
        [Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true)]
        [ValidateNotNullOrEmpty()]
        [ValidateScript({Test-Path $_})]
        [string]$Source,

        [Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true)]
        [ValidateNotNullOrEmpty()]
        [ValidateScript({Test-Path $_})]
        [string]$Destination,

        [Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true)]
        [ValidateNotNullOrEmpty()]
        [string]$ConfigurationName
    )

    Add-Type -Assembly System.IO.Compression.FileSystem
    Import-Module WebAdministration

    # Zip the content directory, stamping the file name with today's date
    $compressionLevel = [System.IO.Compression.CompressionLevel]::Optimal
    $date = $(Get-Date -Format d).Replace('/','-')
    $fileName = "Inetpub-Backup $date.zip"
    $inetpubBackup = Join-Path -Path $Destination -ChildPath $fileName
    [System.IO.Compression.ZipFile]::CreateFromDirectory($Source,$inetpubBackup,$compressionLevel,$false)

    # Back up the IIS configuration (applicationHost.config)
    Backup-WebConfiguration -Name $ConfigurationName
}
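For example, a call might look like this (the paths and backup name below are placeholders for your environment):

```powershell
# Hypothetical paths - substitute your own content and backup directories
Backup-WebServer -Source 'C:\inetpub' -Destination 'D:\Backups' -ConfigurationName 'MyBackup'

# Restoring the configuration half is then:
# Restore-WebConfiguration -Name 'MyBackup'
# The content half is just the zip file; extract it back over your inetpub directory.
```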

Related

How can I retrieve files within a repo in Azure from a PowerShell script?

I need to determine (within a PowerShell script) all the possible controller file names within a repository. This is for a test that will hit every controller and verify that it is functioning and not missing from the startup file (MVC app).
Since all the files are DLLs, I cannot simply ask for the files in the folder, nor do I want to hard-code the names. How can I get a listing of the files within a certain folder, in order to call each one to test, within a PowerShell script?
Perhaps a better way to ask this is:
How can I list the files within a folder that is inside a repo (using a PowerShell script)?
You could refer to this doc: Interacting with Azure Web Apps Virtual File System using PowerShell and the Kudu API. It uses the VFS API from the wiki doc, and there is an API to list the files in the directory specified by a path:
GET /api/vfs/{path}/
Lists files at directory specified by path.
In the doc mentioned above, under the heading Downloading a File from an App Service, there is a script to download files. I use the path without $kuduPath to list files, and you need to get the Kudu REST API authorisation header via PowerShell. The script would then look like this:
$kuduApiUrl = "https://<webname>.scm.azurewebsites.net/api/vfs/site/wwwroot/"
Invoke-RestMethod -Uri $kuduApiUrl `
                  -Headers @{"Authorization"=$kuduApiAuthorisationToken;"If-Match"="*"} `
                  -Method GET `
                  -OutFile $localPath `
                  -ContentType "multipart/form-data"
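For completeness, here is a minimal sketch of building $kuduApiAuthorisationToken from the app's deployment (publishing) credentials; the username and password below are placeholders taken from your publish profile:

```powershell
# Hypothetical publishing credentials from the app's publish profile
$username = '$mywebapp'
$password = '<deployment-password>'

# Kudu accepts HTTP Basic authentication: "Basic " + base64("user:password")
$bytes = [System.Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $username, $password))
$kuduApiAuthorisationToken = "Basic " + [System.Convert]::ToBase64String($bytes)
```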
The call will list all the files and folders in the directory.
I hope this helps; if you still have other questions, please let me know.

Enable Cache On Azure CDN

I am setting up Azure CDN, and having trouble setting the Cache-Control header.
I used Cloudberry Explorer to setup a sync between my server folders and the CDN. This is working well. All my files were copied to the CDN with no problem.
Under Tools > Http Headers > Edit Http Header I set the value for Cache-Control to be: public,max-age=604800
However, this does not appear to be having any effect (according to both Fiddler and Page Speed).
Any tips on setting the Cache-Control header for the Azure CDN would be GREATLY appreciated.
I had this issue myself and needed to update the Cache-Control header on thousands of files. To prevent caching issues in sites, I re-deploy these files with every release to a new path.
I was able to patch together some different suggestions online and ultimately landed on the following solution, which I currently use for deploying one of my production apps.
You need two files, and the script assumes they're in the same directory on your computer:
A text file with a listing of the files in the container (see example below)
The PowerShell script
The Text File (file-list.txt)
The file should follow the example format below, with the full file path as deployed to the CDN container. Note that it uses forward slashes and should not include the container name, since the script prepends it. The name of this text file is referenced in the PowerShell script below.
v12/app/app.js
v12/app/app.min.js
v12/app/app.min.js.map
v12/app/account/signup.js
v12/app/account/signup.min.js
... (and so on)
The Script (cdn-cache-control.ps1)
The full script is below. You'll need to replace the constants like STORAGE_ACCOUNT_NAME and STORAGE_KEY, and you may need to update the path to the Azure SDK DLL if you have a different version. There are also two possible implementations of $blobClient; I repurposed some of this code from a source online, and the uncommented one works for me.
The key difference between what I have here and what you'll find online is the inclusion of $blob.FetchAttributes(). Without explicitly calling this method, most of the blob properties, like Content-Type, Last-Modified date, and others, will be loaded into memory as empty/default values; when $blob.SetProperties() is then called, those empty values will blow away the existing ones in the CDN, causing files to be served without a Content-Type, among other things.
Add-Type -Path "C:\Program Files\Microsoft SDKs\Azure\.NET SDK\v2.9\bin\Microsoft.WindowsAzure.StorageClient.dll"

$accountName = "STORAGE_ACCOUNT_NAME"
$accountKey = "STORAGE_KEY"
$blobContainerName = "STORAGE_CONTAINER_NAME"

$storageCredentials = New-Object Microsoft.WindowsAzure.StorageCredentialsAccountAndKey -ArgumentList $accountName,$accountKey
$storageAccount = New-Object Microsoft.WindowsAzure.CloudStorageAccount -ArgumentList $storageCredentials,$true
#$blobClient = $storageAccount.CreateCloudBlobClient()
$blobClient = [Microsoft.WindowsAzure.StorageClient.CloudStorageAccountStorageClientExtensions]::CreateCloudBlobClient($storageAccount)

$cacheControlValue = "max-age=31556926"
echo "Setting cache control: $cacheControlValue"

Get-Content "file-list.txt" | foreach {
    $blobName = "$blobContainerName/$_".Trim()
    $blob = $blobClient.GetBlobReference($blobName)

    # Load the existing properties first so SetProperties() doesn't wipe them out
    $blob.FetchAttributes()
    $blob.Properties.CacheControl = $cacheControlValue
    $blob.SetProperties()

    echo $blobName
}
It was tricky to find information about mass-setting the Cache-Control header, but I've run this script for multiple production releases with great success. I've verified the configuration of the header as well, and routinely run Google's PageSpeed Insights against my site to confirm.
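As a quick sanity check after running the script, you can request one of the deployed files and inspect the response header; a sketch, with a placeholder CDN URL:

```powershell
# Hypothetical URL - substitute a real file on your CDN endpoint
$response = Invoke-WebRequest -Uri 'https://mycdn.azureedge.net/mycontainer/v12/app/app.min.js' -Method Head
$response.Headers['Cache-Control']   # expect the value the script set, e.g. max-age=31556926
```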

IIS6 bat file - Home Directory

How do I get the root/home directory of a website in IIS6 using a batch file?
My Scenario:
I am creating a tool to summarise and report on the sites in IIS. I am using batch files, running iisweb /query to get all the sites, then looping over the results and using iisvdir /query "Website Name" to get the virtual directories.
However, it has to be backwards compatible with IIS6, and I am having trouble getting the home directory of each site.
I don't think you can do this directly from a batch file, but you should be able to do it from a vbscript which you can call from a batch file.
The trick is to use the IIS WMI provider which gives you access to the IIS metabase. For example, the script below should echo the name and path of every virtual directory on the local server.
set provider = GetObject("winmgmts://localhost/root/MicrosoftIISv2")
set results = provider.ExecQuery("SELECT Name,Path FROM IISWebVirtualDirSetting")
for each item in results
    WScript.Echo item.Name
    WScript.Echo item.Path
next
If you saved this script as iispaths.vbs (just as an example), you could then call it from a batch file with:
cscript //nologo iispaths.vbs
Unfortunately I don't have access to a machine with IIS6, so I am unable to test this at the moment, but if you have any problems getting it to work, feel free to let me know in the comments and I'll do my best to fix the issue.
I don't have an IIS6 server; however, through some searching, I found the following:
IIS6 uses %SystemRoot%\system32\inetsrv\MetaBase.xml and %SystemRoot%\system32\inetsrv\MBSchema.xml for storing its configuration (The IIS Metabase (IIS 6.0));
if your server isn't changing home directories too often, those XML files should be up to date;
using a command-line XML parser (like xmlstarlet), you can extract the Path attribute from the IIsWebVirtualDir node (per the Metabase Structure), using XPath.
With xmlstarlet, a command like the one below would output the root path:
xml sel -t -v "//IIsWebVirtualDir[@Location='/LM/W3SVC/1/ROOT']/@Path" "%SystemRoot%\system32\inetsrv\MetaBase.xml"
The schema may need to be corrected.
This is one possible command-line approach. I can't test it, as I don't have an IIS6 server, and I couldn't get any MetaBase.xml sample either.

Error creating IIS WebApplication

I added a web application under the Default Web Site using PowerShell as follows:
function CreateWebApplication([string]$WebApplicationName,[string]$AppPoolName,[string]$PhysicalPath)
{
    try
    {
        New-WebApplication -Name $WebApplicationName -Site 'Default Web Site' -PhysicalPath $PhysicalPath -ApplicationPool $AppPoolName
        Write-Host "Created WebApplication :" $WebApplicationName
    }
    catch [Exception]
    {
        Write-Host $_.Exception.Message `n
    }
}
It's getting created fine, but when I view it in IIS, it has only two sections, named IIS and Management. If I do the same thing from the IIS console, three sections are added: ASP.NET, IIS, and Management. Why am I not getting the third section? Another thing: if I try to enable directory browsing on it, I get an error that directory browsing cannot be retrieved, citing a problem with the applicationHost.config file. But when I try to find the applicationHost.config file, it is not in the expected location, C:\Windows\System32\inetsrv. I can't figure out what is going wrong here.
Please help :)
The application pool is most likely being created without a CLR version.
If you open the application pool in IIS and look at the .NET version, it will say
No Managed Code.
You can create the app pool yourself and specify the managedRuntimeVersion explicitly.
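A minimal sketch of doing that before creating the application, assuming a hypothetical pool name (use whatever you pass as $AppPoolName):

```powershell
Import-Module WebAdministration

# Create the pool, then set its CLR version so it isn't left as "No Managed Code"
New-WebAppPool -Name 'MyAppPool'
Set-ItemProperty 'IIS:\AppPools\MyAppPool' -Name managedRuntimeVersion -Value 'v4.0'
```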

WMI/PowerShell Bug: why is CreateSite creating *two* sites all of the sudden?

I have a simple PowerShell script that uses WMI to create a web site on a Vista box. Yes, I know PowerShell has an IIS provider for working with IIS 7, but this script must also support IIS 6.0, so that rules it out.
Anyway, the script was working just fine, but all of a sudden (and I mean that literally; I made zero code changes to the script) it started creating a second, broken site on every call to the CreateNewSite method. Below is the script. Anyone have any ideas?
$path = "C:\My Path\WebSite"
$site = "TestSite"
$hostHeader = "demo.blah.com"
$service = Get-WmiObject -namespace "root\MicrosoftIISv2" -class "IIsWebService"
$bindingClass = [wmiclass]'root\MicrosoftIISv2:ServerBinding'
$bindings = $bindingClass.CreateInstance()
$bindings.IP = ""
$bindings.Port = "80"
$bindings.Hostname = $hostHeader
$result = $service.CreateNewSite($site, $bindings, $path)
The script above used to create just a site named 'TestSite', but now it also creates a site called 'SITE_1786339847' (the number changes, but it's always similar to that). I have stepped through the script, executing one line at a time, and neither site is created until the CreateNewSite method is invoked. Is WMI just buggy?
Whoops, answered my own question. I checked the raw IIS 7.0 configuration file and found an orphaned virtual directory associated with a site with the ID 1786339847. When I removed that virtual directory from the configuration file, the script started working correctly again.
In case anyone runs into something similar: grab the site ID for the bad site from IIS Manager before deleting it, then open C:\Windows\system32\inetsrv\config\applicationHost.config. Scan the file for that ID and look for any orphaned references to it. Be sure you have a backup first.
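A quick way to do that scan from PowerShell; a sketch, using the example site ID from above:

```powershell
# Back up the configuration file before touching it, then search for the orphaned site ID
$config = "$env:windir\System32\inetsrv\config\applicationHost.config"
Copy-Item $config "$config.bak"
Select-String -Path $config -Pattern '1786339847'
```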
