How to publish a website from a web application hosted in Azure? - azure

I need a solution for publishing a website from a web application hosted in Azure.
I tried the following code. It creates the domain, but I was not able to upload the published website.
private HttpResponseMessage CreateWebsite(CreateSiteViewModel site)
{
    // Management certificate used to authenticate the HTTPS request.
    var cert = X509Certificate.CreateFromCertFile(Server.MapPath(site.CertPath));
    string uri = string.Format("https://management.core.windows.net/{0}/services/WebSpaces/{1}/sites/", site.Subscription, site.WebSpaceName);
    var req = (HttpWebRequest)WebRequest.Create(uri);
    string dataToPost = string.Format(
@"<Site xmlns=""http://schemas.microsoft.com/windowsazure"" xmlns:i=""http://www.w3.org/2001/XMLSchema-instance"">
<HostNames xmlns:a=""http://schemas.microsoft.com/2003/10/Serialization/Arrays"">
<a:string>{0}.azurewebsites.net</a:string>
</HostNames>
<Name>{0}</Name>
<WebSpaceToCreate>
<GeoRegion>{1}</GeoRegion>
<Name>{2}</Name>
<Plan>VirtualDedicatedPlan</Plan>
</WebSpaceToCreate>
</Site>", site.SiteName, site.WebSpaceGeo, site.WebSpaceName);
    req.Method = "POST";
    // You can also use ContentType = "text/xml" with the request.
    req.UserAgent = "Fiddler";
    req.Headers.Add("x-ms-version", "2013-08-01");
    // Attach the management certificate to the request so the HTTPS handshake
    // succeeds without prompting for confirmation.
    req.ClientCertificates.Add(cert);
    string postData = dataToPost;
    var encoding = new ASCIIEncoding();
    byte[] byte1 = encoding.GetBytes(postData);
    // Set the content length of the string being posted.
    req.ContentLength = byte1.Length;
    Stream newStream = req.GetRequestStream();
    newStream.Write(byte1, 0, byte1.Length);
    // Close the request stream.
    newStream.Close();
    var rsp = (HttpWebResponse)req.GetResponse();
    var reader = new StreamReader(rsp.GetResponseStream());
    string retData = reader.ReadToEnd();
    rsp.GetResponseStream().Close();
    return new HttpResponseMessage
    {
        StatusCode = rsp.StatusCode,
        Content = new StringContent(retData)
    };
}

I am not entirely sure what you are trying to achieve here, but if I understand correctly you want to publish a website programmatically.
You cannot do this (publish a website programmatically) with the Azure Management APIs. The Azure Management APIs are for managing Azure services and resources. The website content itself is not in any way an Azure service, nor an Azure resource.
If you want to programmatically publish a website to an Azure Web Site, I would suggest taking a deep read of How to deploy an Azure Web site.
Of the approaches mentioned there, the ones that are fairly easy to automate are (a rough Web Deploy sketch follows the list below):
Web Deploy
Repositories using GIT
MSBuild
any other that you are familiar with ...
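As a rough illustration of the first option, Web Deploy can be driven from code by shelling out to msdeploy.exe and pushing a packaged site to the site's publishing endpoint. This is only a sketch: the msdeploy install path, the scm endpoint URL and the credentials (normally taken from the site's publish profile) are assumptions you would need to verify for your own site and Web Deploy version.
using System;
using System.Diagnostics;

public static class WebDeployPublisher
{
    // Hedged sketch: push a Web Deploy package to an Azure Web Site by invoking
    // msdeploy.exe. The path, site name and credentials below are placeholders.
    public static void DeployPackage(string packagePath, string siteName, string userName, string password)
    {
        var msdeploy = @"C:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe";
        var args = string.Format(
            "-verb:sync -source:package=\"{0}\" " +
            "-dest:auto,ComputerName=\"https://{1}.scm.azurewebsites.net/msdeploy.axd?site={1}\"," +
            "UserName=\"{2}\",Password=\"{3}\",AuthType=\"Basic\"",
            packagePath, siteName, userName, password);
        var psi = new ProcessStartInfo(msdeploy, args)
        {
            UseShellExecute = false,
            RedirectStandardOutput = true
        };
        using (var process = Process.Start(psi))
        {
            // Echo msdeploy's output so deployment errors are visible.
            Console.WriteLine(process.StandardOutput.ReadToEnd());
            process.WaitForExit();
        }
    }
}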

Related

ChatBot retrieve data from SharePoint On Premise Issue

I've developed a chatbot that communicates with SharePoint on-premises.
When I run the chatbot in the Emulator it works.
But when I run it on the web, hosted outside of SharePoint, it does not work.
Here is my screenshot of the error on Azure; the error originates in XmlReader and SyndicationFeed.
It succeeds in the local Emulator.
Here is my source code.
private async Task ProcessRSSAsync(ITurnContext<IMessageActivity> turnContext, LuisResult luisResult, string intent, CancellationToken cancellationToken)
{
var questionluis = turnContext.Activity.Text;
await turnContext.SendActivityAsync("intent recognize" + intent);
var intentresut = intent;
await turnContext.SendActivityAsync("Get LUIS Entity");
await turnContext.SendActivityAsync(string.Join("\t", luisResult.Entities.Select((entityObj) => entityObj.Entity)));
var entityfound = string.Join("\t", luisResult.Entities.Select((entityObj) => entityObj.Entity));
string spxurl = #"https://intra.aspac.com/sites/sg/daw/_layouts/15/srchrss.aspx?k=*%20ListId:7BC0F2C3-6366-48B8-B88A-8738BE1F9C31";
await turnContext.SendActivityAsync("Intent: " + intent.ToString() + " Entity: " + entityfound.ToString());
////---------------------------------------------------------------------------------------------
//22112019
try
{
//#ES09122019
var credentials = new NetworkCredential("email@example.com", "Pa$$w0rd", "sg.kworld.com");
var handler = new HttpClientHandler { Credentials = credentials, UseDefaultCredentials = false };
var client = new HttpClient(handler);
client.BaseAddress = new Uri("https://intra.aspac.com/sites/sg/daw/");
HttpResponseMessage resp = client.GetAsync("_layouts/15/srchrss.aspx?k=" + entityfound + "*%20ListId:7BC0F2C3-6366-48B8-B88A-8738BE1F9C31").Result;
string respString = resp.Content.ReadAsStringAsync().Result;
if (resp.StatusCode == HttpStatusCode.OK)
{
await turnContext.SendActivityAsync("Connected");
//Success 06122019 .
try
{
string spurl = #"https://intra.aspac.com/sites/sg/daw/_layouts/15/srchrss.aspx?k=*%20ListId:7BC0F2C3-6366-48B8-B88A-8738BE1F9C31";
XmlSecureResolver resolver = new XmlSecureResolver(new XmlUrlResolver(), spurl);
resolver.Credentials = new NetworkCredential("email#example.com.sg", "Pa$$w0rd", "sg.kworld.com");
XmlReaderSettings settings = new XmlReaderSettings();
settings.DtdProcessing = DtdProcessing.Parse;
settings.ValidationType = ValidationType.DTD;
settings.XmlResolver = resolver;
XmlReader reader = XmlReader.Create(spurl, settings);
SyndicationFeed feed = SyndicationFeed.Load(reader);
reader.Close();
var attachments = new List<Attachment>();
foreach (SyndicationItem item in feed.Items)
{
//Get Title,Description,URL
String title = item.Title.Text;
String description = item.Summary.Text;
String link = item.Links.FirstOrDefault().Uri.ToString();
//Hero Card
var heroCard = new HeroCard(
title: item.Title.Text,
// subtitle: description,
buttons: new CardAction[]
{
new CardAction(ActionTypes.OpenUrl,"Learn More",value:link)
}
).ToAttachment();
attachments.Add(heroCard);
}
var reply = MessageFactory.Carousel(attachments);
await turnContext.SendActivityAsync(reply);
await ProcessCosmoDBStorageLUISAsync(turnContext, questionluis, intent, entityfound, respString, cancellationToken);
}
catch (Exception ex)
{
await turnContext.SendActivityAsync(ex.ToString());
}
}
}
catch (Exception ex)
{
await turnContext.SendActivityAsync("Sorry,Currently Server Under Maintenace");
await turnContext.SendActivityAsync(ex.ToString());
}
}
Any solution or suggestion for this?
OK, I think I finally understand this better, so hopefully I can put a useful reply together. It would be much easier if we had a shared whiteboard :-)
Basically, in terms of hosting a bot on the Microsoft Bot Framework Services, you need to have a registration in Azure. However, there are two different options, and both are VERY different in terms of hosting. When you "create" the resource in Azure, and search for "Bot", you'll see two options - "Web App Bot" and "Bot Channels Registration":
"Bot Channels Registration" means JUST registering your bot in Azure, but HOSTING it elsewhere.
"Web App Bot" - INCLUDES the "Bot Channels Registration" but ALSO adds hosting using an Azure Web Application (so it's a Bot registration PLUS hosting)
From the screenshot you posted, I can see you've selected (2) above, and so your bot is running inside Azure, and therefore can't connect to your on premises resource (SharePoint).
As a result, I'd suggest one of two options:
Create an Azure Application Proxy - this is basically a small gateway so that your bot HOSTED in Azure can securely talk to your on-premises SharePoint. There is in fact a specific use case for SharePoint in particular.
Delete and re-create your Azure Bot entry to instead be just a "Bot Channels Registration", and then in the "Settings" screen you can call a bot hosted at any "https" endpoint. You can then have your bot run on the local network, but it will need a live "https" address (not -that- hard to do, but you have to involve your IT team to get a live web address, like "whatever.aspac.com", and you'll need an SSL/TLS certificate so that it can run httpS instead of just http).
Which option you choose might depend on the skills and resources on your team, as well as in the organisation. For instance, the company might have Azure Application Proxy configured already, in which case that saves a lot of work. It might have a wildcard certificate, which would make option (2) easier, etc.
Either way, I hope that helped, but feel free to ask more if anything is still unclear.
I had a similar problem using an on-premises database. As you are deploying your bot externally, the bot needs resources that are available on the internet, not ones contained internally. It will work fine using the Bot Emulator because the emulator has access to whatever your machine has.
That said, Azure has some features which you can use to help with this problem. If you look at Application Proxy, that may be able to help you out.
I think that's what you mean... anyway!

Socket handle leak happening on Azure Web App

I have been stuck on one issue. I am getting the error "Unable to connect to the remote server. An attempt was made to access a socket in a way forbidden by its access permissions".
After searching the web and getting help from Azure support, I came to know that if a Web App reaches the outbound connection limit of its Azure Web App instance, it refuses or kills the extra connections. Here is the image of open socket handles.
My application calls a third-party Web API and a WCF service. I have written code to close connections after calling the APIs, but it doesn't work for me. I used the following code to call the Web API.
var request = (HttpWebRequest)WebRequest.Create("www.xyz.com");
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";
request.KeepAlive = false;
request.Headers.Add("Authorization", "Handshake");
byte[] bodyData;
bodyData = Encoding.UTF8.GetBytes(input_data);
request.ContentLength = bodyData.Length;
request.GetRequestStream().Write(bodyData, 0, bodyData.Length);
request.GetRequestStream().Flush();
request.GetRequestStream().Close();
using (var response = request.GetResponse())
{
using (var reader = new StreamReader(response.GetResponseStream(),
Encoding.UTF8))
{
string output_data = reader.ReadToEnd();
}
response.Close();
}
Could anyone guide me on how to get rid of this issue?
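One common mitigation for this kind of outbound socket exhaustion (offered here only as a hedged sketch, not as the poster's solution) is to reuse a single long-lived HttpClient for outbound calls instead of building a new HttpWebRequest per call, so connections are pooled rather than piling up in TIME_WAIT. The endpoint, body format and Authorization value below are placeholders taken from the snippet above.
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class ApiClient
{
    // One shared HttpClient reuses pooled connections instead of opening a new
    // socket for every request.
    private static readonly HttpClient Client = new HttpClient();

    public static async Task<string> PostAsync(string inputData)
    {
        // "www.xyz.com" and the form-encoded body are placeholders from the question.
        var request = new HttpRequestMessage(HttpMethod.Post, "https://www.xyz.com/")
        {
            Content = new StringContent(inputData, Encoding.UTF8, "application/x-www-form-urlencoded")
        };
        request.Headers.TryAddWithoutValidation("Authorization", "Handshake");
        using (var response = await Client.SendAsync(request))
        {
            response.EnsureSuccessStatusCode();
            return await response.Content.ReadAsStringAsync();
        }
    }
}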

Sending IM with Skype for Business Online from Console App

I am trying to set up a C# console app that can send notifications/reminders to users via Skype for Business online from a generic AD account. I was excited to see the other day that according to this page, UCWA is now supported in Skype for Business online: https://msdn.microsoft.com/en-us/library/office/mt650889.aspx.
I've been trying to follow this tutorial to get this set up: https://msdn.microsoft.com/en-us/library/office/mt590891(v=office.16).aspx. So far I haven't really had much luck... I have my application set up in Azure AD but I get stuck at the "Requesting an access token using implicit grant flow" step of that article (not 100% certain I'm taking the correct actions before that either)... so far I have this:
string clientId = "xxxxxxxx"
string resourceUri = "https://webdir.online.lync.com";
string authorityUri = "https://login.windows.net/common/oauth2/authorize";
AuthenticationContext authContext = new AuthenticationContext(authorityUri);
UserCredential cred = new UserCredential("username", "password");
string token = authContext.AcquireToken(resourceUri, clientId, cred).AccessToken;
var poolReq = CreateRequest("https://webdir.online.lync.com/autodiscover/autodiscoverservice.svc/root", "GET",token);
var poolResp = GetResponse(poolReq);
dynamic tmp = JsonConvert.DeserializeObject(poolResp);
string resourcePool = tmp._links.user.href;
Console.WriteLine(resourcePool);
var accessTokenReq = CreateRequest("https://login.windows.net/common/oauth2/authorize"
+ "?response_type=id_token"
+ "&client_id=" + clientId
+ "&redirect_uri=https://login.live.com/oauth20_desktop.srf"
+ "&state=" + Guid.NewGuid().ToString()
+ "&resource=" + new Uri(resourcePool).Host.ToString()
, "GET",token);
var accessTokenResp = GetResponse(accessTokenReq);
my GetResponse and CreateRequest methods:
public static string GetResponse(HttpWebRequest request)
{
string response = string.Empty;
using (HttpWebResponse httpResponse = request.GetResponse() as System.Net.HttpWebResponse)
{
//Get StreamReader that holds the response stream
using (StreamReader reader = new System.IO.StreamReader(httpResponse.GetResponseStream()))
{
response = reader.ReadToEnd();
}
}
return response;
}
public static HttpWebRequest CreateRequest(string uri, string method, string accessToken)
{
HttpWebRequest request = System.Net.WebRequest.Create(uri) as System.Net.HttpWebRequest;
request.KeepAlive = true;
request.Method = method;
request.ContentLength = 0;
request.ContentType = "application/json";
request.Headers.Add("Authorization", String.Format("Bearer {0}", accessToken));
return request;
}
accessTokenResp is an Office Online logon page, not the access token I need to move forward... so I'm stuck. I've tried quite a few variations of the above code.
I've been scouring the net for more examples but can't really find any, especially since UCWA support for Office 365 is so new. Does anyone have an example of how to do what I am trying to do, or can point me to one? Everything I've found so far hasn't really even been close to what I'm trying to do. Unfortunately I can't use the Skype for Business client SDK either, as it doesn't meet all of my requirements.
I came to a working solution using ADAL (v3), with the help of the steps outlined at
Authentication using Azure AD
Here are the steps, which involve requesting multiple authentication tokens from AAD using ADAL:
Register your application, as Native Application, in Azure AD.
Perform autodiscovery to find user's UCWA root resource URI.
This can be done by performing a GET request on
GET https://webdir.online.lync.com/Autodiscover/AutodiscoverService.svc/root?originalDomain=yourdomain.onmicrosoft.com
Request an access token for the UCWA root resource returned in the autodiscovery response, using ADAL
For instance, your root resource will be at
https://webdir0e.online.lync.com/Autodiscover/AutodiscoverService.svc/root/oauth/user?originalDomain=yourdomain.onmicrosoft.com
you'll have to obtain a token from AAD for resource https://webdir0e.online.lync.com/
Perform a GET on the root resource with the bearer token obtained from ADAL
GET https://webdir0e.online.lync.com/Autodiscover/AutodiscoverService.svc/root/oauth/user?originalDomain=yourdomain.onmicrosoft.com
This will return, within the user resource, the URI of the applications resource, where you create your UCWA application. In my case this is:
https://webpoolam30e08.infra.lync.com/ucwa/oauth/v1/applications
This resides in another domain, and is thus a different audience/resource not covered by the auth token previously obtained.
Acquire a new token from AAD for the host resource where the home pool and applications resource are (https://webpoolam30e08.infra.lync.com in my case)
Create a new UCWA application by doing a POST on the applications URI, using the token obtained from ADAL
Voilà, your UCWA application is created. What I notice at the moment is that only a few resources are available, excluding me/presence. So other users' presence can be retrieved, but your own presence status can't be changed.
I've been able however to retrieve my personal note, and the following resources are available to me:
people
communication
meetings
Show me some code:
Function to perform the flow obtaining and switching auth tokens
public static async Task<UcwaApp> Create365UcwaApp(UcwaAppSettings appSettings, Func<string, Task<OAuthToken>> acquireTokenFunc)
{
var result = new UcwaApp();
result.Settings = appSettings;
var rootResource = await result.Discover365RootResourceAsync(appSettings.DomainName);
var userUri = new Uri(rootResource.Resource.GetLinkUri("user"), UriKind.Absolute);
//Acquire a token for the domain where user resource is
var token = await acquireTokenFunc(userUri.GetComponents(UriComponents.SchemeAndServer, UriFormat.SafeUnescaped));
//Set Authorization Header with new token
result.AuthToken = token;
var usersResult = await result.GetUserResource(userUri.ToString());
//
result.ApplicationsUrl = usersResult.Resource.GetLinkUri("applications");
var appsHostUri = new Uri(result.ApplicationsUrl, UriKind.Absolute).GetComponents(UriComponents.SchemeAndServer, UriFormat.SafeUnescaped);
//Acquire a token for the domain where applications resource is
token = await acquireTokenFunc(appsHostUri);
//Set Authorization Header with new token
result.AuthToken = token;
//
var appResult = await result.CreateApplicationAsync(result.ApplicationsUrl, appSettings.ApplicationId, appSettings.UserAgent, appSettings.Culture);
return result;
}
Usage code to retrieve OAuth tokens using ADAL:
var ucSettings = new UcwaAppSettings
{
UserAgent = "Test Console",
Culture = "en-us",
DomainName = "yourdomain.onmicrosoft.com",
ApplicationId = "your app client id"
};
var acquireTokenFunc = new Func<string, Task<OAuthToken>>(async (resourceUri) =>
{
var authContext = new AuthenticationContext("https://login.windows.net/" + ucSettings.DomainName);
var ar = await authContext.AcquireTokenAsync(resourceUri,
ucSettings.ApplicationId,
new UserCredential("myusername", "mypassword"));
return new OAuthToken(ar.AccessTokenType, ar.AccessToken, ar.ExpiresOn.Ticks);
});
var app = await UcwaApp.Create365UcwaApp(ucSettings, acquireTokenFunc);
It should of course be possible to avoid hard-coding the username and password using ADAL, but this was easier for a PoC, especially in the case of a console application as you asked.
I've just blogged about this with a start-to-finish example; hopefully it will help you. I only go as far as signing in, but you can use it with another post I've done on sending IMs using the Skype Web SDK here (see days 13 and 14) and combine the two; it should work fine.
-tom
Similar to Massimo's solution, I've created a Skype for Business Online C#-based console app that demonstrates how to sign in and use UCWA to create/list/delete meetings and change user presence. I haven't gotten around to extending it to send IMs, but you're certainly welcome to clone my repository and extend it to your needs. Just drop your Azure AD tenant name and native app ID into the code.
I think they just turned this on today - I was doing something unrelated with the Skype Web SDK samples and had to create a new Azure AD app, and noticed that there are two new preview features for receiving conversation updates and changing user information.
Now everything in the Github samples works for Skype For Business Online.

Create and Update Named Caches in Azure Managed Cache using Management API

I am attempting to create an Azure Managed Cache using PowerShell and the Azure Management API. This two-pronged approach is required because the official Azure PowerShell cmdlets only have very limited support for creation and update of Azure Managed Cache; there is, however, an established pattern for calling the Azure Management API from PowerShell.
My attempts at finding the correct API to call have been somewhat hampered by the limited documentation on the Azure Managed Cache API. However, after working my way through the cmdlets using both the source code and the -Debug option in PowerShell, I have been able to find what appear to be the correct API endpoints, and I have developed some code to access them.
However, I have become stuck after the PUT request has been accepted by the Azure API, as subsequent calls to the Management API /operations endpoint show that the result of this operation was Internal Server Error.
I have been using Joseph Albahari's LINQPad to explore the API, as it allows me to rapidly iterate on a solution using the minimum possible code. To execute the following code snippets you will need both LINQPad and the following extension in your My Extensions script:
public static X509Certificate2 GetCertificate(this StoreLocation storeLocation, string thumbprint) {
var certificateStore = new X509Store(StoreName.My, storeLocation);
certificateStore.Open(OpenFlags.ReadOnly);
var certificates = certificateStore.Certificates.Find(X509FindType.FindByThumbprint, thumbprint, false);
return certificates[0];
}
The complete source code including the includes are available below:
My Extensions - you can replace "My Extensions" by right-clicking My Extensions in the bottom left-hand pane and choosing "Open Script Location in Windows Explorer", then replacing the highlighted file with this one. Alternatively you may wish to merge my extensions into your own.
Azure Managed Cache Script - you should simply be able to download and double-click this; once it is open and the above extensions and certificates are in place, you will be able to execute the script.
The following settings are used throughout the script; these variables will need to be updated by anyone following along using their own Azure subscription ID and management certificate:
var cacheName = "amc551aee";
var subscriptionId = "{{YOUR_SUBSCRIPTION_ID}}";
var certThumbprint = "{{YOUR_MANAGEMENT_CERTIFICATE_THUMBPRINT}}";
var endpoint = "management.core.windows.net";
var putPayloadXml = #"{{PATH_TO_PUT_PAYLOAD}}\cloudService.xml"
First I have done some setup on the HttpClient:
var handler = new WebRequestHandler();
handler.ClientCertificateOptions = ClientCertificateOption.Manual;
handler.ClientCertificates.Add(StoreLocation.CurrentUser.GetCertificate(certThumbprint));
var client = new HttpClient(handler);
client.DefaultRequestHeaders.Add("x-ms-version", "2012-08-01");
This configures HttpClient to use both a client certificate and the x-ms-version header. The first call to the API fetches the existing CloudService that contains the Azure Managed Cache. Please note this is using an otherwise empty Azure subscription.
var getResult = client.GetAsync("https://" + endpoint + "/" + subscriptionId + "/CloudServices");
getResult.Result.Dump("GET " + getResult.Result.RequestMessage.RequestUri);
This request is successful, as it returns StatusCode: 200, ReasonPhrase: 'OK'. I then parse some key information out of the response: the CloudService name, the cache name and the cache ETag:
var cacheDataReader = new XmlTextReader(getResult.Result.Content.ReadAsStreamAsync().Result);
var cacheData = XDocument.Load(cacheDataReader);
var ns = cacheData.Root.GetDefaultNamespace();
var nsManager = new XmlNamespaceManager(cacheDataReader.NameTable);
nsManager.AddNamespace("wa", "http://schemas.microsoft.com/windowsazure");
var cloudServices = cacheData.Root.Elements(ns + "CloudService");
var serviceName = String.Empty;
var ETag = String.Empty;
foreach (var cloudService in cloudServices) {
if (cloudService.XPathSelectElements("//wa:CloudService/wa:Resources/wa:Resource/wa:Name", nsManager).Select(x => x.Value).Contains(cacheName)) {
serviceName = cloudService.XPathSelectElement("//wa:CloudService/wa:Name", nsManager).Value;
ETag = cloudService.XPathSelectElement("//wa:CloudService/wa:Resources/wa:Resource/wa:ETag", nsManager).Value;
}
}
I have pre-created an XML file that contains the payload of the following PUT request:
<Resource xmlns="http://schemas.microsoft.com/windowsazure">
<IntrinsicSettings>
<CacheServiceInput xmlns="">
<SkuType>Standard</SkuType>
<Location>North Europe</Location>
<SkuCount>1</SkuCount>
<ServiceVersion>1.3.0</ServiceVersion>
<ObjectSizeInBytes>1024</ObjectSizeInBytes>
<NamedCaches>
<NamedCache>
<CacheName>default</CacheName>
<NotificationsEnabled>false</NotificationsEnabled>
<HighAvailabilityEnabled>false</HighAvailabilityEnabled>
<EvictionPolicy>LeastRecentlyUsed</EvictionPolicy>
</NamedCache>
<NamedCache>
<CacheName>richard</CacheName>
<NotificationsEnabled>true</NotificationsEnabled>
<HighAvailabilityEnabled>true</HighAvailabilityEnabled>
<EvictionPolicy>LeastRecentlyUsed</EvictionPolicy>
</NamedCache>
</NamedCaches>
</CacheServiceInput>
</IntrinsicSettings>
</Resource>
I construct an HttpRequestMessage with the above payload and a URL composed of the CloudService and cache names:
var resourceUrl = "https://" + endpoint + "/" + subscriptionId + "/cloudservices/" + serviceName + "/resources/cacheservice/Caching/" + cacheName;
var data = File.ReadAllText(putPayloadXml);
XDocument.Parse(data).Dump("Payload");
var message = new HttpRequestMessage(HttpMethod.Put, resourceUrl);
message.Headers.TryAddWithoutValidation("If-Match", ETag);
message.Content = new StringContent(data, Encoding.UTF8, "application/xml");
var putResult = client.SendAsync(message);
putResult.Result.Dump("PUT " + putResult.Result.RequestMessage.RequestUri);
putResult.Result.Content.ReadAsStringAsync().Result.Dump("Content " + putResult.Result.RequestMessage.RequestUri);
This request is nominally accepted by the Azure Service Management API, as it returns a StatusCode: 202, ReasonPhrase: 'Accepted' response; this essentially means that the payload has been accepted and will be processed offline. The operation ID can be parsed out of the HTTP header to retrieve further information:
var requestId = putResult.Result.Headers.GetValues("x-ms-request-id").FirstOrDefault();
This requestId can be used to request an update on the status of the operation:
var operation = client.GetAsync("https://" + endpoint + "/" + subscriptionId + "/operations/" + requestId);
operation.Result.Dump(requestId);
XDocument.Load(operation.Result.Content.ReadAsStreamAsync().Result).Dump("Operation " + requestId);
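Because the operation completes asynchronously, the /operations endpoint can be polled until the Status element is no longer InProgress. A minimal sketch of that idea, reusing the client, endpoint, subscriptionId and requestId variables from the script above (the five-second interval is arbitrary):
// Poll the operation until it leaves the InProgress state.
var status = "InProgress";
XDocument operationXml = null;
while (status == "InProgress")
{
    System.Threading.Thread.Sleep(TimeSpan.FromSeconds(5));
    var poll = client.GetAsync("https://" + endpoint + "/" + subscriptionId + "/operations/" + requestId);
    operationXml = XDocument.Load(poll.Result.Content.ReadAsStreamAsync().Result);
    // Status is Succeeded, Failed or InProgress, in the windowsazure namespace.
    status = operationXml.Root.Element(operationXml.Root.GetDefaultNamespace() + "Status").Value;
}
operationXml.Dump("Final state of operation " + requestId);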
The request to the /operations endpoint results in the following payload:
<Operation xmlns="http://schemas.microsoft.com/windowsazure" xmlns:i="http://www.w3.org/2001/XMLSchema-instance">
<ID>5364614d-4d82-0f14-be41-175b3b85b480</ID>
<Status>Failed</Status>
<HttpStatusCode>500</HttpStatusCode>
<Error>
<Code>InternalError</Code>
<Message>The server encountered an internal error. Please retry the request.</Message>
</Error>
</Operation>
And this is where I am stuck. The chances are I am subtly malforming the request in such a way that the underlying operation throws a 500 Internal Server Error, but without a more detailed error message or API documentation I don't think there is anywhere I can go with this.
We worked with Richard offline and the following XML payload got him unblocked.
Note - when adding/removing a named cache on an existing cache, the object size is fixed.
Note 2 - the Azure Managed Cache API is sensitive to whitespace between the IntrinsicSettings element and the CacheServiceInput element.
Also please note, we are working on adding Named cache capability to our PowerShell itself, so folks don't have to use APIs to do so.
<Resource xmlns="http://schemas.microsoft.com/windowsazure" xmlns:i="http://www.w3.org/2001/XMLSchema-instance">
<IntrinsicSettings><CacheServiceInput xmlns="" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<SkuType>Standard</SkuType>
<Location>North Europe</Location>
<SkuCount>1</SkuCount>
<ServiceVersion>1.3.0</ServiceVersion>
<ObjectSizeInBytes>1024</ObjectSizeInBytes>
<NamedCaches>
<NamedCache>
<CacheName>default</CacheName>
<NotificationsEnabled>false</NotificationsEnabled>
<HighAvailabilityEnabled>false</HighAvailabilityEnabled>
<EvictionPolicy>LeastRecentlyUsed</EvictionPolicy>
<ExpirationSettings>
<TimeToLiveInMinutes>10</TimeToLiveInMinutes>
<Type>Absolute</Type>
</ExpirationSettings>
</NamedCache>
<NamedCache>
<CacheName>richard</CacheName>
<NotificationsEnabled>false</NotificationsEnabled>
<HighAvailabilityEnabled>false</HighAvailabilityEnabled>
<EvictionPolicy>LeastRecentlyUsed</EvictionPolicy>
<ExpirationSettings>
<TimeToLiveInMinutes>10</TimeToLiveInMinutes>
<Type>Absolute</Type>
</ExpirationSettings>
</NamedCache>
</NamedCaches>
</CacheServiceInput>
</IntrinsicSettings>
</Resource>

Staging or Production Instance?

Is there anywhere in the service runtime that would tell me if I'm currently running on 'Staging' or 'Production'? Manually modifying the config to and from production seems a bit cumbersome.
You should really not change your configuration based upon whether you're in Prod or Staging. The Staging area is not designed to be a "QA" environment but only a holding area before production is deployed.
When you upload a new deployment, the deployment slot you upload your package to is destroyed and is down for 10-15 minutes while the upload and the start of the VMs happen. If you upload straight into production, that's 15 minutes of production downtime. Thus, the Staging area was invented: you upload to staging, test the stuff, click the "Swap" button, and your Staging environment magically becomes Production (a virtual IP swap).
Thus, your staging should really be 100% the same as your production.
What I think you're looking for is a QA/testing environment. You should open up a new service for the Testing environment with its own Prod/Staging. In this case, you will want to maintain multiple configuration file sets, one set per deployment environment (Production, Testing, etc.).
There are many ways to manage the configuration hell that occurs, especially with Azure, which has its own *.cscfg files on top of .config files. The way I prefer to do it with an Azure project is as follows:
Set up a small Config project and create folders there that match deployment types. Inside each folder set up the sets of *.config & *.cscfg files that match a particular deployment environment: Debug, Test, Release... these are set up in Visual Studio as well, as build target types. I have a small xcopy command that runs on every compile of the Config project and copies all the files from the build target folder of the Config project into the root folder of the Config project.
Then every other project in the solution LINKS to the .config or .cscfg files from the root folder of the Config project.
Voila, my configs magically adapt to every build configuration automatically. I also use .config transformations to manage debugging information for Release vs. non-Release build targets.
If you've read all this and still want to get at the Production vs. Staging status at runtime, then:
Get deploymentId from RoleEnvironment.DeploymentId
Then use the Management API with a proper X509 certificate to get at the Azure structure of your service and call the GetDeployments method (it's a REST API but there is an abstraction library).
Hope this helps
Edit: blog post as requested about the setup of configuration strings and switching between environments at http://blog.paraleap.com/blog/post/Managing-environments-in-a-distributed-Azure-or-other-cloud-based-NET-solution
Sometimes I wish people would just answer the question.. not explain ethics or best practices...
Microsoft has posted a code sample doing exactly this here: https://code.msdn.microsoft.com/windowsazure/CSAzureDeploymentSlot-1ce0e3b5
protected void Page_Load(object sender, EventArgs e)
{
// You basic information of the Deployment of Azure application.
string deploymentId = RoleEnvironment.DeploymentId;
string subscriptionID = "<Your subscription ID>";
string thumbprint = "<Your certificate thumbprint>";
string hostedServiceName = "<Your hosted service name>";
string productionString = string.Format(
"https://management.core.windows.net/{0}/services/hostedservices/{1}/deploymentslots/{2}",
subscriptionID, hostedServiceName, "Production");
Uri requestUri = new Uri(productionString);
// Add client certificate.
X509Store store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
store.Open(OpenFlags.OpenExistingOnly);
X509Certificate2Collection collection = store.Certificates.Find(
X509FindType.FindByThumbprint, thumbprint, false);
store.Close();
if (collection.Count != 0)
{
X509Certificate2 certificate = collection[0];
HttpWebRequest httpRequest = (HttpWebRequest)HttpWebRequest.Create(requestUri);
httpRequest.ClientCertificates.Add(certificate);
httpRequest.Headers.Add("x-ms-version", "2011-10-01");
httpRequest.KeepAlive = false;
HttpWebResponse httpResponse = httpRequest.GetResponse() as HttpWebResponse;
// Get response stream from Management API.
Stream stream = httpResponse.GetResponseStream();
string result = string.Empty;
using (StreamReader reader = new StreamReader(stream))
{
result = reader.ReadToEnd();
}
if (result == null || result.Trim() == string.Empty)
{
return;
}
XDocument document = XDocument.Parse(result);
string serverID = string.Empty;
var list = from item
in document.Descendants(XName.Get("PrivateID",
"http://schemas.microsoft.com/windowsazure"))
select item;
serverID = list.First().Value;
Response.Write("Check Production: ");
Response.Write("DeploymentID : " + deploymentId
+ " ServerID :" + serverID);
if (deploymentId.Equals(serverID))
lbStatus.Text = "Production";
else
{
// If the application not in Production slot, try to check Staging slot.
string stagingString = string.Format(
"https://management.core.windows.net/{0}/services/hostedservices/{1}/deploymentslots/{2}",
subscriptionID, hostedServiceName, "Staging");
Uri stagingUri = new Uri(stagingString);
httpRequest = (HttpWebRequest)HttpWebRequest.Create(stagingUri);
httpRequest.ClientCertificates.Add(certificate);
httpRequest.Headers.Add("x-ms-version", "2011-10-01");
httpRequest.KeepAlive = false;
httpResponse = httpRequest.GetResponse() as HttpWebResponse;
stream = httpResponse.GetResponseStream();
result = string.Empty;
using (StreamReader reader = new StreamReader(stream))
{
result = reader.ReadToEnd();
}
if (result == null || result.Trim() == string.Empty)
{
return;
}
document = XDocument.Parse(result);
serverID = string.Empty;
list = from item
in document.Descendants(XName.Get("PrivateID",
"http://schemas.microsoft.com/windowsazure"))
select item;
serverID = list.First().Value;
Response.Write(" Check Staging:");
Response.Write(" DeploymentID : " + deploymentId
+ " ServerID :" + serverID);
if (deploymentId.Equals(serverID))
{
lbStatus.Text = "Staging";
}
else
{
lbStatus.Text = "Do not find this id";
}
}
httpResponse.Close();
stream.Close();
}
}
Staging is a temporary deployment slot used mainly for no-downtime upgrades and ability to roll back an upgrade.
It is advised not to couple your system (either in code or in config) with such Azure specifics.
Using the Windows Azure Management Libraries, and thanks to @GuaravMantri's answer to another question, you can do it like this:
using System;
using System.Linq;
using System.Security.Cryptography.X509Certificates;
using Microsoft.Azure;
using Microsoft.WindowsAzure.Management.Compute;
using Microsoft.WindowsAzure.Management.Compute.Models;
namespace Configuration
{
public class DeploymentSlotTypeHelper
{
static string subscriptionId = "<subscription-id>";
static string managementCertContents = "<Base64 Encoded Management Certificate String from Publish Setting File>";// copy-paste it
static string cloudServiceName = "<your cloud service name>"; // lowercase
static string ns = "http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration";
public DeploymentSlot GetSlotType()
{
var managementCertificate = new X509Certificate2(Convert.FromBase64String(managementCertContents));
var credentials = new CertificateCloudCredentials(subscriptionId, managementCertificate);
var computeManagementClient = new ComputeManagementClient(credentials);
var response = computeManagementClient.HostedServices.GetDetailed(cloudServiceName);
return response.Deployments.FirstOrDefault(d => d.DeploymentSlot == DeploymentSlot.Production) == null ? DeploymentSlot.Staging : DeploymentSlot.Production;
}
}
}
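A quick usage sketch of the helper above (assuming the placeholder fields have been filled in):
var helper = new Configuration.DeploymentSlotTypeHelper();
Console.WriteLine("Currently running in the {0} slot.", helper.GetSlotType());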
An easy way to solve this problem is to set a key on your instances that identifies which environment they are running in.
1) On your production slot:
Go to Settings >> Application settings >> App settings
and create a key named SLOT_NAME with the value "production". IMPORTANT: check "Slot setting".
2) On your staging slot:
Go to Settings >> Application settings >> App settings
and create a key named SLOT_NAME with the value "staging". IMPORTANT: check "Slot setting".
From your application, access the variable and identify which environment the application is running in. In Java you can access it like this (a hedged C# equivalent follows below):
String slotName = System.getenv("APPSETTING_SLOT_NAME");
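For a C# application, an equivalent sketch reads the same slot-scoped app setting from the environment (App Service exposes app settings as environment variables prefixed with APPSETTING_) or from standard .NET configuration:
// Reads the slot-scoped app setting defined above; returns null if it is not set.
string slotName = Environment.GetEnvironmentVariable("APPSETTING_SLOT_NAME");
// Or, via the regular app settings collection:
// string slotName = System.Configuration.ConfigurationManager.AppSettings["SLOT_NAME"];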
Here are 4 points to consider
VIP swap only makes sense when your service faces the outside world. AKA, when it exposes an API and reacts to requests.
If all your service does is pull messages from a queue and process them, then your service is proactive and VIP swap is not a good solution for you.
If your service is both reactive and proactive, you may want to reconsider your design. Perhaps split the service into 2 different services.
Eric's suggestion of modifying the cscfg files pre- and post-VIP swap is good if the proactive part of your service can take a short downtime (because you first configure both Staging and Production to not pull messages, then perform the VIP swap, and then update Production's configuration to start pulling messages).
