Enable Cache On Azure CDN

I am setting up Azure CDN, and having trouble setting the Cache-Control header.
I used Cloudberry Explorer to set up a sync between my server folders and the CDN. This is working well; all my files were copied to the CDN with no problem.
Under Tools > Http Headers > Edit Http Header I set the value for Cache-Control to be: public,max-age=604800
However, this does not appear to be having any effect (according to both Fiddler and Page Speed).
Any tips on setting the Cache-Control header for the Azure CDN would be GREATLY appreciated.

I had this issue myself and needed to update the Cache-Control header on thousands of files. To prevent caching issues across releases, I deploy these files to a new path with every release.
I was able to patch together several suggestions found online and ultimately landed on the following solution, which I currently use for deploying one of my production apps.
You need two files, and the script assumes they're in the same directory on your computer:
A text file with a listing of the files in the container (see example below)
The PowerShell script
The Text File (file-list.txt)
The file should be in the example format below, with the full file path as deployed to the CDN container. Note that it uses forward slashes and should not include the container name, since the container name is supplied in the script itself. The name of this text file is referenced in the PowerShell script below.
v12/app/app.js
v12/app/app.min.js
v12/app/app.min.js.map
v12/app/account/signup.js
v12/app/account/signup.min.js
... (and so on)
The Script (cdn-cache-control.ps1)
The full script is below. You'll need to replace the constants like STORAGE_ACCOUNT_NAME and STORAGE_KEY, and you may need to update the path to the Azure SDK DLL if you have a different version. There are also two possible implementations of $blobClient; I repurposed some of this code from a source online, and the uncommented one works for me.
The key difference between what I have here and what you'll find online is the call to $blob.FetchAttributes(). Without explicitly calling this method, most of the blob's properties, such as Content-Type and Last-Modified, are loaded into memory as empty/default values; when $blob.SetProperties() is then called, those empty values overwrite the existing ones in the CDN, causing files to be served without a Content-Type, among other problems.
Add-Type -Path "C:\Program Files\Microsoft SDKs\Azure\.NET SDK\v2.9\bin\Microsoft.WindowsAzure.StorageClient.dll"
$accountName = "STORAGE_ACCOUNT_NAME"
$accountKey = "STORAGE_KEY"
$blobContainerName = "STORAGE_CONTAINER_NAME"
$storageCredentials = New-Object Microsoft.WindowsAzure.StorageCredentialsAccountAndKey -ArgumentList $accountName,$accountKey
$storageAccount = New-Object Microsoft.WindowsAzure.CloudStorageAccount -ArgumentList $storageCredentials,$true
#$blobClient = $storageAccount.CreateCloudBlobClient()
$blobClient = [Microsoft.WindowsAzure.StorageClient.CloudStorageAccountStorageClientExtensions]::CreateCloudBlobClient($storageAccount)
$cacheControlValue = "max-age=31556926"
echo "Setting cache control: $cacheControlValue"
Get-Content "file-list.txt" | foreach {
$blobName = "$blobContainerName/$_".Trim()
$blob = $blobClient.GetBlobReference($blobName)
$blob.FetchAttributes()
$blob.Properties.CacheControl = $cacheControlValue
$blob.SetProperties()
echo $blobName
}
It was tricky to find information about mass-setting the Cache-Control header, but I've run this script for multiple production releases with great success. I've verified the header in the responses, and I routinely run Google's PageSpeed Insights against my site to confirm it.
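If you're on the newer Az PowerShell modules rather than the old StorageClient DLL, the same loop can be written against Az.Storage. This is only a minimal sketch under those assumptions (same file-list.txt format, placeholder constants as above, and the blob exposed through Get-AzStorageBlob's ICloudBlob property); adapt it to your setup.
# Sketch: the same Cache-Control update via the Az.Storage module (assumes Install-Module Az.Storage).
# STORAGE_ACCOUNT_NAME / STORAGE_KEY / STORAGE_CONTAINER_NAME are placeholders, as above.
Import-Module Az.Storage
$accountName = "STORAGE_ACCOUNT_NAME"
$accountKey = "STORAGE_KEY"
$blobContainerName = "STORAGE_CONTAINER_NAME"
$cacheControlValue = "max-age=31556926"
$context = New-AzStorageContext -StorageAccountName $accountName -StorageAccountKey $accountKey
Get-Content "file-list.txt" | ForEach-Object {
    $blobName = $_.Trim()   # no container prefix here; the container is passed separately
    $blob = Get-AzStorageBlob -Container $blobContainerName -Blob $blobName -Context $context
    $blob.ICloudBlob.Properties.CacheControl = $cacheControlValue
    $blob.ICloudBlob.SetProperties()
    Write-Output $blobName
}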

Related

How can I retrieve files within a repo in Azure within a Powershell script?

I need to (within a Powershell script) determine all the possible controller file names within a repository. This is for a test that will hit every controller and verify it is functioning and not missing from the startup file. (MVC app)
Since all the files are DLLs, I cannot simply ask for the files in the folder, nor do I want to hard-code the names. How can I get a listing of files within a certain folder in order to call each one to test within a PowerShell script?
Perhaps a better way to ask this is:
How can I list files within a folder that is inside a repo? (using a Powershell script)
You could refer to this doc: Interacting with Azure Web Apps Virtual File System using PowerShell and the Kudu API. It uses the VFS API described in the wiki doc, and there is an API to list the files in a directory specified by path:
GET /api/vfs/{path}/
Lists files at directory specified by path.
In the previous doc, under the title Downloading a File from an App Service, there is a script to download files; I use the path without $kuduPath to list files instead. You also need to get the Kudu REST API authorisation header via PowerShell, and then call the listing API, as shown below.
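A minimal sketch of building that header is below; $kuduUserName and $kuduPassword are placeholders for your deployment credentials (for example from the site's publishing profile), not values from the original post.
# Sketch: build the Basic authorisation header that Kudu expects.
# $kuduUserName / $kuduPassword are placeholders for the deployment credentials.
$kuduUserName = '$<webname>'   # deployment user, typically prefixed with $
$kuduPassword = '<password>'
$pair = "$($kuduUserName):$($kuduPassword)"
$kuduApiAuthorisationToken = "Basic " + [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes($pair))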
$kuduApiUrl = "https://<webname>.scm.azurewebsites.net/api/vfs/site/wwwroot/"
Invoke-RestMethod -Uri $kuduApiUrl `
    -Headers @{ "Authorization" = $kuduApiAuthorisationToken; "If-Match" = "*" } `
    -Method GET
The response lists all the files and folders in that directory.
Hope this helps; if you still have other questions, please let me know.

How to make an IIS Site/Content Backup?

I want to learn ASP.NET, and for this I have been reading some basic tutorials on managing the IIS web server. I'm wondering how I could make a full backup of my site (configuration and content). I'm running the IIS server on a Hyper-V Windows Server 2012 R2 Core machine and administering it over PowerShell remoting.
On the Internet I found an article about some basic stuff (see here).
The article said I can make a full backup of my IIS configuration and content with
Backup-WebConfiguration -Name "My Backup"
and restore it with
Restore-WebConfiguration -Name "My Backup"
The problem is: it seems to really only back up the configuration and not the content. For example, it restores the websites under IIS:\Sites but not the physical content, such as an application folder inside them and a default.htm. If I delete the default.htm and the folders and then use Restore-WebConfiguration, it still does not restore them; only the web configuration itself.
From the article I assumed it would also restore the content.
Did I do something wrong? How can I do what I want "from scratch", without scripts from MS Web Deploy 3.0?
Thanks for the help and best regards.
Backup-WebConfiguration only backs up the configuration items detailed in the applicationHost.config file. It does not deal with the actual content, just how that content is handled by IIS.
Doing both is easy enough. Here's a quick function that creates a zip file (just pass in the path to your inetpub directory) and backs up the configuration. This requires PowerShell v3 or higher. The configuration backup gets a creation date set automatically (you can see a list of your backups with Get-WebConfigurationBackup), so the function adds the date to the zip file name as well so the two can be matched up.
If you're making more than one backup on the same day, you'll need to tweak the name of the compressed file, as it only includes the date.
function Backup-WebServer
{
    [CmdletBinding()]
    Param
    (
        [Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true)]
        [ValidateNotNullOrEmpty()]
        [ValidateScript({Test-Path $_})]
        [string]$Source,

        [Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true)]
        [ValidateNotNullOrEmpty()]
        [ValidateScript({Test-Path $_})]
        [string]$Destination,

        [Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true)]
        [ValidateNotNullOrEmpty()]
        [string]$ConfigurationName
    )

    Add-Type -Assembly System.IO.Compression.FileSystem
    Import-Module WebAdministration

    $compressionLevel = [System.IO.Compression.CompressionLevel]::Optimal
    $date = $(Get-Date -Format d).Replace('/','-')
    $fileName = "Inetpub-Backup $date.zip"
    $inetpubBackup = Join-Path -Path $Destination -ChildPath $fileName

    [System.IO.Compression.ZipFile]::CreateFromDirectory($Source,$inetpubBackup,$compressionLevel,$false)
    Backup-WebConfiguration -Name $ConfigurationName
}
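A call might look like this (the paths and backup name below are just examples):
# Example invocation; adjust the paths and backup name to your environment
Backup-WebServer -Source 'C:\inetpub' -Destination 'D:\Backups' -ConfigurationName 'IIS Backup'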

How do I modify the Site Collection in SharePoint 2013?

When I try to open a form published from InfoPath I now get this error:
"The following location is not accessible, because it is in a different site collection:
https://portal/sites/forms/Daily%20Activity/Forms/template.xsn?SaveLocation=https://portal.alamedacountyfire.org/sites/forms/Daily%20Activity/&Source=https://portal.alamedacountyfire.org/sites/forms/Daily%2520Activity/Forms/AllItems.aspx&ClientInstalled=false&OpenIn=Browser&NoRedirect=true&XsnLocation=https://PORTAL/sites/forms/Daily%20Activity/Forms/template.xsn."
Correlation ID:12c0ab9c-caff-80a8-f1b4-64d81dcfa6ea
Following are some options that you can try:
1) Save the form template (.xsn) as source files in the publish options. Look at the manifest file in Notepad and see if you can find a reference to the incorrect location. If so, correct it and republish the form.
2) Clear the InfoPath cache on that machine: Start -> Run "infopath /cache clearall".
3) See if the site collection has a managed path; if so, give the proper URL while publishing. The XSN might be getting deployed to the root site and throwing an error since the intended list doesn't exist.
I found this worked for me. Got the answer from another post.
"I had a similar problem and found it was due to the request management service routing from my web application host header to the server name.
There was a routing rule in my request management settings. I just disabled routing and the problem went away. I used the following PowerShell to disable it."
$w = Get-SPWebApplication "http://webapphostname"
$r = $w | Get-SPRequestManagementSettings
$r.RoutingEnabled = $false
$r.Update()
You may want to configure it rather than disable it. Here’s a good resource to get you started:
http://www.harbar.net/articles/sp2013rm1.aspx

How to establish a continuous deployment of non-.NET project/solution to Azure?

I have connected Visual Studio Online to my Azure website. This is not a .NET ASP.NET MVC project, just several static HTML files.
Now I want to get my files uploaded to Azure and available 'online' after my commits/pushes to the TFS.
When a build definition (based on GitContinuousDeploymentTemplate.12.xaml) is executed it fails with an obvious message:
Exception Message: The process parameter ProjectsToBuild is required but no value was set.
My question: how do I setup a build definition so that it automatically copies my static files to Azure on commits?
Or do I need to use different tooling for this task (like WebMatrix)?
Update
I ended up creating an empty website and deploying it manually from Visual Studio using Web Deploy. Another option to consider is creating a local Git repository in Azure.
Alright, let me try to give you an answer:
I was having quite a similar issue. I had a static HTML, JS and CSS site which I needed to keep in TFS because of the project, and I wanted to make my life easier using continuous deployment. So what I did was the following:
When you have a Git repository in TFS, you get a URL for it, something like:
https://yoursite.visualstudio.com/COLLECTION/PROJECT/_git/REPOSITORY
However, in order to access the repository itself you need to authenticate, and that is not currently possible by putting a URL with embedded credentials into Azure:
https://username:password@TFS_URL
It will not accept it. So what you do, in order to bind the deployment, is put the plain repository URL there (the deployment will fail, but it prepares the environment for us to proceed).
Once you link it there, you can get the DEPLOYMENT TRIGGER URL on the Configure tab of the website. Its purpose is that when you push a change to your repository (say, on GitHub), GitHub makes an HTTP POST request to that link, which tells Azure to deploy the new code onto the site.
Now I went to Kudu, which is the underlying system of Azure Websites that handles deployments. I figured out that if you send the correct content in the HTTP POST (JSON format) to the DEPLOYMENT TRIGGER URL, you can have it deploy code from any repository, and it even authenticates!
So the thing left to do is to generate the alternative authentication credentials on the TFS site and put the whole request together. I wrapped this entire process into the following PowerShell script:
# Windows Azure Website Configuration
#
# WAWS_username: The user account which has access to the website, can be obtained from https://manage.windowsazure.com portal on the Configure tab under DEPLOYMENT TRIGGER URL
# WAWS_password: The password for the account specified above
# WAWS: The Azure site name
$WAWS_username = ''
$WAWS_password = ''
$WAWS = ''
# Visual Studio Online Repository Configuration
#
# VSO_username: The user account used for basic authentication in VSO (has to be manually enabled)
# VSO_password: The password for the account specified above
# VSO_URL: The URL to the Git repository (the branch is specified on the https://manage.windowsazure.com Configure tab under BRANCH TO DEPLOY)
$VSO_username = ''
$VSO_password = ''
$VSO_URL = ''
# DO NOT EDIT ANY OF THE CODE BELOW
$WAWS_URL = 'https://' + $WAWS + '.scm.azurewebsites.net/deploy'
$BODY = '
{
    "format": "basic",
    "url": "https://' + $VSO_username + ':' + $VSO_password + '@' + $VSO_URL + '"
}'
$authorization = "Basic "+[System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($WAWS_username+":"+$WAWS_password ))
$bytes = [System.Text.Encoding]::ASCII.GetBytes($BODY)
$webRequest = [System.Net.WebRequest]::Create($WAWS_URL)
$webRequest.Method = "POST"
$webRequest.Headers.Add("Authorization", $authorization)
$webRequest.ContentLength = $bytes.Length
$webRequestStream = $webRequest.GetRequestStream();
$webRequestStream.Write($bytes, 0, $bytes.Length);
$webRequest.GetResponse()
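As a side note, on PowerShell 3 or later the same POST could probably be made more concisely with Invoke-RestMethod, reusing the variables defined above; the explicit application/json content type is my assumption, since the body is JSON.
# Equivalent request via Invoke-RestMethod (PowerShell 3+), reusing $WAWS_URL, $BODY and $authorization from above
Invoke-RestMethod -Uri $WAWS_URL -Method Post -Body $BODY -ContentType "application/json" -Headers @{ Authorization = $authorization }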
I hope that what I wrote here makes sense. The last thing you would need is to bind this script to a hook in Git, so that when you perform a push the script gets triggered automatically afterwards and the site is deployed. I haven't figured out this piece yet, though.
This should also work for deploying PHP/Node.js and similar code.
The easiest way would be to add them to an empty ASP.NET project, set them to be copied to the output folder, and then "build" the project.
Failing that, you could modify the build process template, but that's a "last resort" option.

WMI/PowerShell Bug: why is CreateSite creating *two* sites all of a sudden?

I have a simple PowerShell script that uses WMI to create a web site on a Vista box. Yes, I know PowerShell has an IIS provider for working with IIS 7, but this script must also support IIS 6.0, so that rules that out.
Anyway, the script was working just fine, but all of a sudden (and I mean that literally; I made zero code changes to the script) it started creating a second, broken site for every call to the CreateNewSite method. Below is the script. Anyone have any ideas?
$path = "C:\My Path\WebSite"
$site = "TestSite"
$hostHeader = "demo.blah.com"
$service = Get-WmiObject -namespace "root\MicrosoftIISv2" -class "IIsWebService"
$bindingClass = [wmiclass]'root\MicrosoftIISv2:ServerBinding'
$bindings = $bindingClass.CreateInstance()
$bindings.IP = ""
$bindings.Port = "80"
$bindings.Hostname = $hostHeader
$result = $service.CreateNewSite($site, $bindings, $path)
The above script was just creating a site named 'TestSite', but now it's also creating a site called 'SITE_1786339847' (the number changes, but it's always similar to that). I have stepped through the script executing one line at a time, and neither site is created until the CreateNewSite method is invoked. Is WMI just buggy?
Whoops, answered my own question. I checked the raw IIS 7.0 configuration file and found an orphaned virtual directory that was associated to a site with the ID 1786339847. When I removed that virtual directory from the configuration file, the script started working correctly again.
In case anyone runs into something similar, grab the site ID for the bad site from IIS Manager before deleting it, then open up C:\Windows\system32\inetsrv\config\applicationHost.config. Scan the file for that ID and look for any orphaned references to it. Be sure you have a backup first.
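For example, a quick way to scan the file for a leftover site ID from PowerShell (the ID below is just the one from this question):
# Find any lines in applicationHost.config that still reference the orphaned site ID
Select-String -Path "C:\Windows\System32\inetsrv\config\applicationHost.config" -Pattern "1786339847"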
