In my Web API project, which uses OWIN, this call always returns null:
var appAssembly = Assembly.GetEntryAssembly();
I've also tried:
var entryAssembly = new StackTrace().GetFrames().Last().GetMethod().Module.Assembly;
All that returns is "System.Web".
How do you capture the app name and version?
I am trying to capture this information right at Startup for the Web API project:
/// <summary>
/// The OWIN startup class.
/// </summary>
public class Startup
{
/// <summary>
/// The necessary OWIN configuration method.
/// </summary>
/// <param name="app">The app being started under OWIN hosting.</param>
public void Configuration(IAppBuilder app)
{
var appAssembly = Assembly.GetEntryAssembly();
Aspect.Logging.LoggingHandler.Initialize(appAssembly, "Hard Coded App Name!");
Log.Information("Starting App...");
// Order is important here. The security wiring must happen first.
ConfigureAuthentication(app);
// Create web configuration and register with WebAPI.
HttpConfiguration config = new HttpConfiguration();
WebApiConfig.Register(config);
// Configure documentation.
ConfigureDocumentation(config);
// Configure support for static files (e.g. index.html).
app.UseFileServer(new FileServerOptions
{
EnableDefaultFiles = true,
FileSystem = new PhysicalFileSystem(".")
});
// Start the API.
app.UseWebApi(config);
Log.Information("App started.");
}
Use:
var appAssembly = typeof(Startup).Assembly;
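Assembly.GetEntryAssembly() typically returns null under IIS/OWIN hosting because the process entry point is the unmanaged IIS worker process rather than your Web API assembly. Once you have the startup assembly, the name and version can be read from its AssemblyName; a minimal sketch (variable names are illustrative, and the Initialize call mirrors the one from the question):
var appAssembly = typeof(Startup).Assembly;
var assemblyName = appAssembly.GetName();
// AssemblyName carries both pieces of information the question asks for.
string appName = assemblyName.Name;                   // e.g. "My.WebApi.Project"
string appVersion = assemblyName.Version.ToString();  // e.g. "1.0.0.0"
// Replaces the hard-coded name used in the question's Configuration method.
Aspect.Logging.LoggingHandler.Initialize(appAssembly, appName);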
Problem Statement: We have a requirement to upload log data to Azure Storage from a Xamarin.iOS application. The logs are not created by the user of the application, and there is no constraint on the user to keep the application open for any amount of time after the logs are generated. We want to reliably upload our logs with a couple of points in mind:
The user might send the app into the background
The file sizes can be up to 15MB
We don't care when we get them. We're open to scheduling a task for this.
In looking at potential solutions to this problem, the Xamarin documentation states that in iOS7+:
NSURLSession allows us to create tasks to:
Transfer content through network and device interruptions.
Upload and download large files (Background Transfer Service).
So it seems like NSURLSession is a good candidate for this sort of work, but I wonder if I am reinventing the wheel. Does the WindowsAzure.Storage client library respect app backgrounding with an upload implementation based on NSURLSession, or, if I want to upload the data in the background, is it necessary to upload to an intermediate server I control with a POST method and then relay the data to Azure Storage? There doesn't seem to be any indication in the public Azure documentation that uploads can be done via a scheduled task.
I got this working. I've simplified the classes and methods into a single method. Only the necessities are here.
public void UploadFile(File playbackFile)
{
/// Specify your credentials
var sasURL = "?<the sastoken>";
/// Azure blob storage URL
var storageAccount = "https://<yourstorageaccount>.blob.core.windows.net/<your container name>";
/// specify a UNIQUE session name
var configuration =
NSUrlSessionConfiguration.CreateBackgroundSessionConfiguration("A background session name");
/// create the session with a delegate to receive callbacks and debug
var session = NSUrlSession.FromConfiguration(
configuration,
new YourSessionDelegate(),
new NSOperationQueue());
/// Construct the blob endpoint
var url = $"{storageAccount}/{playbackFile.Name}{sasURL}";
var uploadUrl = NSUrl.FromString(url);
/// Add any headers for Blob PUT. x-ms-blob-type is REQUIRED
var dic = new NSMutableDictionary();
dic.Add(new NSString("x-ms-blob-type"), new NSString("BlockBlob"));
/// Create the request with NSMutableUrlRequest
/// A default NSUrlRequest.FromURL() is immutable with a GET method
var request = new NSMutableUrlRequest(uploadUrl);
request.Headers = dic;
request.HttpMethod = "PUT";
/// Create the task
var uploadTask = session.CreateUploadTask(
request,
NSUrl.FromFilename(playbackFile.FullName));
/// Start the task
uploadTask.Resume();
}
/// Delegate to receive callbacks. Implementations are omitted for brevity
public class YourSessionDelegate: NSUrlSessionDataDelegate
{
public override void DidBecomeInvalid(NSUrlSession session, NSError error)
{
Console.WriteLine(error.Description);
}
public override void DidSendBodyData(NSUrlSession session, NSUrlSessionTask task, long bytesSent, long totalBytesSent, long totalBytesExpectedToSend)
{
Console.WriteLine(bytesSent);
}
public override void DidReceiveData(NSUrlSession session, NSUrlSessionDataTask dataTask, NSData data)
{
Console.WriteLine(data);
}
public override void DidCompleteWithError(NSUrlSession session, NSUrlSessionTask task, NSError error)
{
var uploadTask = task as NSUrlSessionUploadTask;
Console.WriteLine(error?.Description);
}
public override void DidReceiveResponse(NSUrlSession session, NSUrlSessionDataTask dataTask, NSUrlResponse response, Action<NSUrlSessionResponseDisposition> completionHandler)
{
Console.WriteLine(response);
}
public override void DidFinishEventsForBackgroundSession(NSUrlSession session)
{
using (AppDelegate appDelegate = UIApplication.SharedApplication.Delegate as AppDelegate)
{
var handler = appDelegate.BackgroundSessionCompletionHandler;
if (handler != null)
{
appDelegate.BackgroundSessionCompletionHandler = null;
handler();
}
}
}
}
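The DidFinishEventsForBackgroundSession override above assumes the AppDelegate exposes a BackgroundSessionCompletionHandler property. One common way to wire that up in Xamarin.iOS is the HandleEventsForBackgroundUrl override on UIApplicationDelegate; a minimal sketch (the property name is whatever you choose to match your delegate code):
public partial class AppDelegate : UIApplicationDelegate
{
// Stored so the session delegate can tell iOS when background processing is done.
public Action BackgroundSessionCompletionHandler { get; set; }
// iOS calls this when events for a background NSUrlSession are pending for this app.
public override void HandleEventsForBackgroundUrl(UIApplication application, string sessionIdentifier, Action completionHandler)
{
BackgroundSessionCompletionHandler = completionHandler;
}
}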
Helpful documentation:
https://learn.microsoft.com/en-us/rest/api/storageservices/put-blob
https://developer.apple.com/documentation/foundation/nsmutableurlrequest/1408793-setvalue
https://learn.microsoft.com/en-us/dotnet/api/foundation.insurlsessiontaskdelegate?view=xamarin-ios-sdk-12
Hopefully someone finds this useful and spends less time on this than I did. Thanks @SushiHangover for pointing me in the right direction.
In an Azure Function I can get the FunctionAppDirectory from the context, but how do I get the FunctionAppDirectory in the Configure method?
I need the FunctionAppDirectory at <<FunctionAppDirectory>> in the code:
// --------------------------------------------------------------------------------------------------------------------
// <copyright file="WebJobsExtensionStartup.cs" company="Microsoft">
// Copyright (c) Microsoft Corporation. All rights reserved.
// </copyright>
// --------------------------------------------------------------------------------------------------------------------
using System;
using System.Collections.Generic;
using System.Globalization;
using System.IO;
using System.Linq;
using System.Text.RegularExpressions;
using Intercom.Helpers;
using Microsoft.ApplicationInsights.Extensibility;
using Microsoft.Azure.KeyVault;
using Microsoft.Azure.Services.AppAuthentication;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Hosting;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Configuration.AzureKeyVault;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
[assembly: WebJobsStartup(typeof(ConvAi.BfChannel.BotManagementService.WebJobsExtensionStartup), "Web Jobs Extension")]
namespace ConvAi.BfChannel.BotManagementService
{
/// <summary>
/// WebJobsExtensionStartup
/// So this Azure function should be deployed with App service plan.
/// </summary>
public class WebJobsExtensionStartup : IWebJobsStartup
{
/// <summary>
/// AzureServiceTokenProvider which is used for requesting identity token.
/// </summary>
public static AzureServiceTokenProvider AzureServiceTokenProvider { get; set; }
/// <summary>
/// Configure services.
/// </summary>
/// <param name="builder">WebJob Builder</param>
public void Configure(IWebJobsBuilder builder)
{
try
{
bool isLocal = string.IsNullOrEmpty(Environment.GetEnvironmentVariable("WEBSITE_INSTANCE_ID"));
// Gets the default configuration
var serviceConfig = builder.Services.FirstOrDefault(s => s.ServiceType.Equals(typeof(IConfiguration)));
var rootConfig = (IConfiguration)serviceConfig.ImplementationInstance;
var config = new ConfigurationBuilder()
.SetBasePath(<<FunctionAppDirectory>>)
.AddConfiguration(rootConfig)
.AddJsonFile($@"Config\botregistrationOptions.{rootConfig["environmentName"]}.json", optional: false)
.Build();
// Replace the existing config
builder.Services.AddSingleton<IConfiguration>(config);
}
catch (Exception ex)
{
//log
throw;
}
}
}
}
You can replace context.FunctionAppDirectory with Environment.CurrentDirectory.
At least, that works locally, and that's exactly where you need local.settings.json to work, so this change should be safe.
When you run it on Azure, you need to add .AddEnvironmentVariables() to the configuration builder.
Update:
Use the following code when you work both locally and on Azure.
var config = new ConfigurationBuilder()
.SetBasePath(<<FunctionAppDirectory>>)
.AddConfiguration(rootConfig)
.AddJsonFile($@"Config\botregistrationOptions.{rootConfig["environmentName"]}.json", optional: false)
.AddEnvironmentVariables()
.Build();
See here for a good explanation of how to handle app settings in Functions v2: https://blog.jongallant.com/2018/01/azure-function-config/
Bottom line: for local debugging you usually use a local.settings.json file. When deployed to Azure, you don't use config files; instead, the app settings are injected as environment variables into your Function. One huge advantage of that is that you can use things like the Azure Key Vault integration for secure storage of settings: https://learn.microsoft.com/en-us/azure/app-service/app-service-key-vault-references
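As a small illustration of how that plays out with the builder above (the setting names here are hypothetical; the double underscore is the convention the environment-variables provider uses for nested keys):
// An Azure app setting named "environmentName" is read exactly like a JSON value:
var environmentName = config["environmentName"];
// A nested key uses "__" in the app setting name, so a hypothetical app setting
// "botregistrationOptions__CallbackUrl" surfaces as:
var callbackUrl = config["botregistrationOptions:CallbackUrl"];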
Please forgive me if this has already been asked before. I looked around, but my situation didn't come into play on any answered question I came across.
I'm using SignalR 2.2.0.
Setup: I have a WebAPI 2 web application (call it API for short) that holds my hub, called ChatHub. I have an MVC site (MVC) that is calling my hub on the API site.
Both of these sites are on the same server, just on different ports. I am using VS 2013, and when I test locally my local system also uses the same ports. Also, the URL to the API site is loaded from the web config, which is different for Release, Debug, and local (so the URL is correct and the ports on the server are fine; really the only thing wrong is OnDisconnected not getting fired).
After a lot of trial and error and searching, I finally got a test application up. Everything was working perfectly. Then I had to modify my hub to fit into the business model, i.e. take the in-memory lists of users and messages and record them in the database (among other things). Everything works perfectly when running locally; however, once the sites are published to IIS on the server, OnDisconnected is never called. In the test app / when running locally, it is hit almost instantly by most browsers. But even after waiting 15+ minutes, the method is still not fired.
Here is my hub (shortened for clarity):
/// <summary>
/// Chat Hub Class
/// </summary>
public class ChatHubAsync : Hub
{
#region Data Members
IDemographicsProvider DemographicsProvider { get; set;}
IChatRepository Repo { get; set; }
#endregion
#region CTOR
/// <summary>
/// Unity Constructor
/// </summary>
[InjectionConstructor]
public ChatHubAsync()
: this(new ChatRepositoryEF(), DataProviders.Demographics)
{
}
/// <summary>
/// Constructor for Chat Hub
/// </summary>
/// <param name="Repository"></param>
/// <param name="Demographics"></param>
public ChatHubAsync(IChatRepository Repository, IDemographicsProvider Demographics)
{
Repo = Repository;
DemographicsProvider = Demographics;
}
#endregion
#region Methods
/// <summary>
/// On Connected method to call base class
/// </summary>
/// <returns></returns>
public override async Task OnConnected()
{
await base.OnConnected();
}
/// <summary>
/// Connect to Hub
/// </summary>
/// <returns>Void</returns>
public async Task Connect()
{
if (await Repo.GetUser(Context.ConnectionId) == null)
{
await RemoveDuplicates(getGuidFromQueryString("userID"), getGuidFromQueryString("groupID"));
var user = await CreateUser();
await Repo.Connect(user);
await Clients.Caller.onConnected(user);
}
}
/// <summary>
/// Add User To Group
/// </summary>
/// <returns></returns>
public async Task AddToGroup()
{
Guid id = getGroupFromQueryString();
if (id != Guid.Empty)
{
string groupID = id.ToString();
var user = await Repo.GetUser(Context.ConnectionId);
try
{
if(user == null)
{
await Connect();
user = await Repo.GetUser(Context.ConnectionId);
}
await Groups.Add(Context.ConnectionId, groupID);
var users = await Repo.OnlineUsers(id);
var messages = await Repo.RetrieveMessages(id, 20);
var status = await Repo.AddOnlineUserToGroup(Context.ConnectionId, id);
await Clients.Caller.onGroupJoined(user, users, messages, status);
Clients.Group(groupID, Context.ConnectionId).onNewUserConnected(user);
}
catch(Exception E)
{
Console.WriteLine(E.Message);
}
}
}
/// .....More Methods that are irrelevant....
/// <summary>
/// Disconnect from Hub
/// </summary>
/// <param name="stopCalled"></param>
/// <returns></returns>
public override async Task OnDisconnected(bool stopCalled)
{
try
{
var item = await Repo.GetUser(Context.ConnectionId);
if (item != null)
{
if (item.GroupID != null && item.GroupID != Guid.Empty)
{
var id = item.GroupID.ToString();
Repo.Disconnect(Context.ConnectionId);
Clients.OthersInGroup(id).onUserDisconnected(Context.ConnectionId, item.UserName);
Groups.Remove(Context.ConnectionId, id);
}
}
}
catch (Exception E)
{
Console.WriteLine(E.Message);
}
await base.OnDisconnected(stopCalled);
}
#endregion
#region private Messages
private async Task<IOnlineUser> CreateUser()
{
///Code removed
}
private Guid getGroupFromQueryString()
{
return getGuidFromQueryString("groupID");
}
private Guid getGuidFromQueryString(string name)
{
Guid id;
try
{
var item = getItemFromQueryString(name);
if (Guid.TryParse(item, out id))
{
return id;
}
throw new Exception("Not a Valid Guid");
}
catch(Exception E)
{
Console.WriteLine(E.Message);
return Guid.Empty;
}
}
private async Task RemoveDuplicates(Guid User, Guid Group)
{
///Code removed
}
#endregion
}
UPDATE:
I have no idea why, but once I removed the calls to the database (ALL CALLS TO THE DATABASE) and went back to in-memory lists, OnDisconnected started getting called again.
As soon as I added back any call to the database, using either straight SQL or Entity Framework, OnDisconnected stopped getting called.
Any ideas why adding database calls would cause OnDisconnected to stop getting called?
In case anyone else is having this issue: the problem was caused by the application not impersonating the correct user. It worked fine in Visual Studio because, when impersonation failed to use the provided user, it fell back to me; outside of VS it didn't have that option and tried using the machine account, which failed to log in. This was fixed by configuring impersonation on the application pool instead of in the web config. I was having issues with this earlier in the project when using async calls, but I thought I had those fixed with a few changes to the web config. That worked for all the async calls I was making, but it was still failing in a few select areas, which caused OnDisconnected to not fire on the server.
Our site uses ADFS for auth. To reduce the cookie payload on every request we're turning IsSessionMode on (see Your fedauth cookies on a diet).
The last thing we need to do to get this working in our load balanced environment is to implement a farm-ready SecurityTokenCache. The implementation seems pretty straightforward; I'm mainly interested in finding out whether there are any gotchas we should consider when dealing with SecurityTokenCacheKey and the TryGetAllEntries and TryRemoveAllEntries methods (SecurityTokenCacheKey has a custom implementation of the Equals and GetHashCode methods).
Does anyone have an example of this? We're planning on using AppFabric as the backing store, but an example using any persistent store would be helpful: a database table, Azure table storage, etc.
Here are some places I've searched:
In Hervey Wilson's PDC09 session he uses a DatabaseSecurityTokenCache. I haven't been able to find the sample code for his session.
On page 192 of Vittorio Bertocci's excellent book, "Programming Windows Identity Foundation", he mentions uploading a sample implementation of an Azure-ready SecurityTokenCache to the book's website. I haven't been able to find this sample either.
Thanks!
jd
3/16/2012 UPDATE
Vittorio's blog links to a sample using the new .net 4.5 stuff:
ClaimsAwareWebFarm
This sample is an answer to the feedback we got from many of you guys: you wanted a sample showing a farm ready session cache (as opposed to a tokenreplycache) so that you can use sessions by reference instead of exchanging big cookies; and you asked for an easier way of securing cookies in a farm.
To come up with a working implementation we ultimately had to use Reflector to analyze the different SessionSecurityToken-related classes in Microsoft.IdentityModel. Below is what we came up with. This implementation is deployed on our dev and QA environments and seems to be working fine; it's resilient to app pool recycles, etc.
In global.asax:
protected void Application_Start(object sender, EventArgs e)
{
FederatedAuthentication.ServiceConfigurationCreated += this.OnServiceConfigurationCreated;
}
private void OnServiceConfigurationCreated(object sender, ServiceConfigurationCreatedEventArgs e)
{
var sessionTransforms = new List<CookieTransform>(new CookieTransform[]
{
new DeflateCookieTransform(),
new RsaEncryptionCookieTransform(
e.ServiceConfiguration.ServiceCertificate),
new RsaSignatureCookieTransform(
e.ServiceConfiguration.ServiceCertificate)
});
// following line is pseudo code. use your own durable cache implementation.
var durableCache = new AppFabricCacheWrapper();
var tokenCache = new DurableSecurityTokenCache(durableCache, 5000);
var sessionHandler = new SessionSecurityTokenHandler(sessionTransforms.AsReadOnly(),
tokenCache,
TimeSpan.FromDays(1));
e.ServiceConfiguration.SecurityTokenHandlers.AddOrReplace(sessionHandler);
}
private void WSFederationAuthenticationModule_SecurityTokenValidated(object sender, SecurityTokenValidatedEventArgs e)
{
FederatedAuthentication.SessionAuthenticationModule.IsSessionMode = true;
}
DurableSecurityTokenCache.cs:
/// <summary>
/// Two level durable security token cache (level 1: in memory MRU, level 2: out of process cache).
/// </summary>
public class DurableSecurityTokenCache : SecurityTokenCache
{
private ICache<string, byte[]> durableCache;
private readonly MruCache<SecurityTokenCacheKey, SecurityToken> mruCache;
/// <summary>
/// The constructor.
/// </summary>
/// <param name="durableCache">The durable second level cache (should be out of process ie sql server, azure table, app fabric, etc).</param>
/// <param name="mruCapacity">Capacity of the internal first level cache (in-memory MRU cache).</param>
public DurableSecurityTokenCache(ICache<string, byte[]> durableCache, int mruCapacity)
{
this.durableCache = durableCache;
this.mruCache = new MruCache<SecurityTokenCacheKey, SecurityToken>(mruCapacity, mruCapacity / 4);
}
public override bool TryAddEntry(object key, SecurityToken value)
{
var cacheKey = (SecurityTokenCacheKey)key;
// add the entry to the mru cache.
this.mruCache.Add(cacheKey, value);
// add the entry to the durable cache.
var keyString = GetKeyString(cacheKey);
var buffer = this.GetSerializer().Serialize((SessionSecurityToken)value);
this.durableCache.Add(keyString, buffer);
return true;
}
public override bool TryGetEntry(object key, out SecurityToken value)
{
var cacheKey = (SecurityTokenCacheKey)key;
// attempt to retrieve the entry from the mru cache.
value = this.mruCache.Get(cacheKey);
if (value != null)
return true;
// entry wasn't in the mru cache, retrieve it from the app fabric cache.
var keyString = GetKeyString(cacheKey);
var buffer = this.durableCache.Get(keyString);
var result = buffer != null;
if (result)
{
// we had a cache miss in the mru cache but found the item in the durable cache...
// deserialize the value retrieved from the durable cache.
value = this.GetSerializer().Deserialize(buffer);
// push this item into the mru cache.
this.mruCache.Add(cacheKey, value);
}
return result;
}
public override bool TryRemoveEntry(object key)
{
var cacheKey = (SecurityTokenCacheKey)key;
// remove the entry from the mru cache.
this.mruCache.Remove(cacheKey);
// remove the entry from the durable cache.
var keyString = GetKeyString(cacheKey);
this.durableCache.Remove(keyString);
return true;
}
public override bool TryReplaceEntry(object key, SecurityToken newValue)
{
var cacheKey = (SecurityTokenCacheKey)key;
// remove the entry in the mru cache.
this.mruCache.Remove(cacheKey);
// remove the entry in the durable cache.
var keyString = GetKeyString(cacheKey);
// add the new value.
return this.TryAddEntry(key, newValue);
}
public override bool TryGetAllEntries(object key, out IList<SecurityToken> tokens)
{
// not implemented... haven't been able to find how/when this method is used.
tokens = new List<SecurityToken>();
return true;
//throw new NotImplementedException();
}
public override bool TryRemoveAllEntries(object key)
{
// not implemented... haven't been able to find how/when this method is used.
return true;
//throw new NotImplementedException();
}
public override void ClearEntries()
{
// not implemented... haven't been able to find how/when this method is used.
//throw new NotImplementedException();
}
/// <summary>
/// Gets the string representation of the specified SecurityTokenCacheKey.
/// </summary>
private string GetKeyString(SecurityTokenCacheKey key)
{
return string.Format("{0}; {1}; {2}", key.ContextId, key.KeyGeneration, key.EndpointId);
}
/// <summary>
/// Gets a new instance of the token serializer.
/// </summary>
private SessionSecurityTokenCookieSerializer GetSerializer()
{
return new SessionSecurityTokenCookieSerializer(); // may need to do something about handling bootstrap tokens.
}
}
MruCache.cs:
/// <summary>
/// Most recently used (MRU) cache.
/// </summary>
/// <typeparam name="TKey">The key type.</typeparam>
/// <typeparam name="TValue">The value type.</typeparam>
public class MruCache<TKey, TValue> : ICache<TKey, TValue>
{
private Dictionary<TKey, TValue> mruCache;
private LinkedList<TKey> mruList;
private object syncRoot;
private int capacity;
private int sizeAfterPurge;
/// <summary>
/// The constructor.
/// </summary>
/// <param name="capacity">The capacity.</param>
/// <param name="sizeAfterPurge">Size to make the cache after purging because it's reached capacity.</param>
public MruCache(int capacity, int sizeAfterPurge)
{
this.mruList = new LinkedList<TKey>();
this.mruCache = new Dictionary<TKey, TValue>(capacity);
this.capacity = capacity;
this.sizeAfterPurge = sizeAfterPurge;
this.syncRoot = new object();
}
/// <summary>
/// Adds an item if it doesn't already exist.
/// </summary>
public void Add(TKey key, TValue value)
{
lock (this.syncRoot)
{
if (mruCache.ContainsKey(key))
return;
if (mruCache.Count + 1 >= this.capacity)
{
while (mruCache.Count > this.sizeAfterPurge)
{
var lru = mruList.Last.Value;
mruCache.Remove(lru);
mruList.RemoveLast();
}
}
mruCache.Add(key, value);
mruList.AddFirst(key);
}
}
/// <summary>
/// Removes an item if it exists.
/// </summary>
public void Remove(TKey key)
{
lock (this.syncRoot)
{
if (!mruCache.ContainsKey(key))
return;
mruCache.Remove(key);
mruList.Remove(key);
}
}
/// <summary>
/// Gets an item. If a matching item doesn't exist null is returned.
/// </summary>
public TValue Get(TKey key)
{
lock (this.syncRoot)
{
if (!mruCache.ContainsKey(key))
return default(TValue);
mruList.Remove(key);
mruList.AddFirst(key);
return mruCache[key];
}
}
/// <summary>
/// Gets whether a key is contained in the cache.
/// </summary>
public bool ContainsKey(TKey key)
{
lock (this.syncRoot)
return mruCache.ContainsKey(key);
}
}
ICache.cs:
/// <summary>
/// A cache.
/// </summary>
/// <typeparam name="TKey">The key type.</typeparam>
/// <typeparam name="TValue">The value type.</typeparam>
public interface ICache<TKey, TValue>
{
void Add(TKey key, TValue value);
void Remove(TKey key);
TValue Get(TKey key);
}
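The AppFabricCacheWrapper passed to DurableSecurityTokenCache is pseudo code (see the comment in OnServiceConfigurationCreated); any durable ICache<string, byte[]> can serve as the second level. As a rough sketch only, here is what a SQL Server-backed variant could look like, assuming a TokenCache table with CacheKey (nvarchar, primary key) and CacheValue (varbinary(max)) columns and System.Data.SqlClient:
/// <summary>
/// Sketch of a durable cache backed by a SQL Server table (schema assumed above).
/// </summary>
public class SqlTokenCache : ICache<string, byte[]>
{
private readonly string connectionString;
public SqlTokenCache(string connectionString)
{
this.connectionString = connectionString;
}
public void Add(string key, byte[] value)
{
// Upsert so a replaced token simply overwrites the existing row.
const string sql =
@"MERGE TokenCache AS target
USING (SELECT @key AS CacheKey) AS source ON target.CacheKey = source.CacheKey
WHEN MATCHED THEN UPDATE SET CacheValue = @value
WHEN NOT MATCHED THEN INSERT (CacheKey, CacheValue) VALUES (@key, @value);";
Execute(sql, cmd =>
{
cmd.Parameters.AddWithValue("@key", key);
cmd.Parameters.AddWithValue("@value", value);
});
}
public void Remove(string key)
{
Execute("DELETE FROM TokenCache WHERE CacheKey = @key",
cmd => cmd.Parameters.AddWithValue("@key", key));
}
public byte[] Get(string key)
{
using (var connection = new SqlConnection(this.connectionString))
using (var command = new SqlCommand("SELECT CacheValue FROM TokenCache WHERE CacheKey = @key", connection))
{
command.Parameters.AddWithValue("@key", key);
connection.Open();
return command.ExecuteScalar() as byte[];
}
}
private void Execute(string sql, Action<SqlCommand> addParameters)
{
using (var connection = new SqlConnection(this.connectionString))
using (var command = new SqlCommand(sql, connection))
{
addParameters(command);
connection.Open();
command.ExecuteNonQuery();
}
}
}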
Here is a sample that I wrote. I use Windows Azure to store the tokens forever, defeating any possible replay.
http://tokenreplaycache.codeplex.com/releases/view/76652
You will need to place this in your web.config:
<service>
<securityTokenHandlers>
<securityTokenHandlerConfiguration saveBootstrapTokens="true">
<tokenReplayDetection enabled="true" expirationPeriod="50" purgeInterval="1">
<replayCache type="LC.Security.AzureTokenReplayCache.ACSTokenReplayCache,LC.Security.AzureTokenReplayCache, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null" />
</tokenReplayDetection>
</securityTokenHandlerConfiguration>
</securityTokenHandlers>
</service>
Here at work we're working with an OData WCF Service to create our new API. To fully implement our API, we've started extending the service with custom functions that allow us to trigger specific functionality that can't be exposed through the normal means of OData.
One example is switching a Workspace entity into advanced mode. This requires a lot of checks and data manipulation, so we opted to move it to a separate function. This is the complete code of our Api.svc service:
using System.Net;
using System.ServiceModel.Web;
namespace TenForce.Execution.Web
{
using System;
using System.Data.Services;
using System.Data.Services.Common;
using System.Security.Authentication;
using System.ServiceModel;
using System.Text;
using Microsoft.Data.Services.Toolkit;
using Api2;
using Api2.Implementation.Security;
using Api2.OData;
/// <summary>
/// <para>This class represents the entire OData WCF Service that handles incoming requests and processes the data needed
/// for those requests. The class inherits from the <see cref="ODataService<T>">ODataService</see> class in the toolkit to
/// implement the desired functionality.</para>
/// </summary>
[ServiceBehavior(IncludeExceptionDetailInFaults = true)]
public class Api : ODataService<Context>
{
#region Initialization & Authentication
// This method is called only once to initialize service-wide policies.
public static void InitializeService(DataServiceConfiguration config)
{
config.UseVerboseErrors = true;
config.SetEntitySetAccessRule("*", EntitySetRights.All);
config.SetServiceOperationAccessRule("*", ServiceOperationRights.All);
config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
Factory.SetImplementation(typeof(Api2.Implementation.Api));
}
/// <summary>
/// <para>This function is called when a request needs to be processed by the OData API.</para>
/// <para>This function will look at the headers that are supplied to the request and try to extract the relevant
/// user credentials from these headers. Using those credentials, a login is attempted. If the login is successful,
/// the request is processed. If the login fails, an AuthenticationException is raised instead.</para>
/// <para>The function will also add the required response headers to the service reply to indicate the success
/// or failure of the Authentication attempt.</para>
/// </summary>
/// <param name="args">The arguments needed to process the incoming request.</param>
/// <exception cref="AuthenticationException">Invalid username and/or password.</exception>
protected override void OnStartProcessingRequest(ProcessRequestArgs args)
{
#if DEBUG
Authenticator.Authenticate("secretlogin", string.Empty, Authenticator.ConstructDatabaseId(args.RequestUri.ToString()));
#else
bool authSuccess = Authenticate(args.OperationContext, args.RequestUri.ToString());
args.OperationContext.ResponseHeaders.Add(@"TenForce-RAuth", authSuccess ? @"OK" : @"DENIED");
if (!authSuccess) throw new AuthenticationException(@"Invalid username and/or password");
#endif
base.OnStartProcessingRequest(args);
}
/// <summary>
/// <para>Performs authentication based upon the data present in the custom headers supplied by the client.</para>
/// </summary>
/// <param name="context">The OperationContext for the request</param>
/// <param name="url">The URL for the request</param>
/// <returns>True if the Authentication succeeded; otherwise false.</returns>
private static bool Authenticate(DataServiceOperationContext context, string url)
{
// Check if the header is present
string header = context.RequestHeaders["TenForce-Auth"];
if (string.IsNullOrEmpty(header)) return false;
// Decode the header from the base64 encoding
header = Encoding.UTF8.GetString(Convert.FromBase64String(header));
// Split the header and try to authenticate.
string[] components = header.Split('|');
return (components.Length >= 2) && Authenticator.Authenticate(components[0], components[1], Authenticator.ConstructDatabaseId(url));
}
#endregion
#region Service Methods
/*
* All functions that are defined in this block, are special Service Methods on our API Service that become
* available on the web to be called by external parties. These functions do not belong in the REST specifications
* and are therefore placed here as public functions.
*
* Important to know is that these methods become case-sensitive in their signature as well as their parameters when
* being called from the web. Therefore we need to properly document these functions here so the generated document
* explains the correct usage of these functions.
*/
/// <summary>
/// <para>Switches the specified <see cref="Workspace">Workspace</see> into advanced mode, using the specified
/// Usergroup as the working <see cref="Usergroup">Usergroup</see> for the Workspace.</para>
/// <para>The method can be called using the following signature from the web:</para>
/// <para>http://applicationurl/api.svc/SwitchWorkspaceToAdvancedMode?workspaceId=x&usergroupId=y</para>
/// <para>Where x stands for the unique identifier of the <see cref="Workspace">Workspace</see> entity and y stands for the unique
/// identifier of the <see cref="Usergroup">Usergroup</see> entity.</para>
/// <para>This method can only be invoked by a HTTP GET operation and returns a server response 200 when properly executed.
/// If the request fails, the server will respond with a BadRequest error code.</para>
/// </summary>
/// <param name="workspaceId">The unique <see cref="Workspace">Workspace</see> entity identifier.</param>
/// <param name="usergroupId">The unique <see cref="Usergroup">Usergroup</see> entity identifier.</param>
[WebGet]
public void SwitchWorkspaceToAdvancedMode(int workspaceId, int usergroupId)
{
Api2.Objects.Workspace ws = Factory.CreateApi().Workspaces.Read(workspaceId);
Api2.Objects.Usergroup ug = Factory.CreateApi().UserGroups.Read(usergroupId);
if(!Factory.CreateApi().Workspaces.ConvertToAdvancedPrivilegeSetup(ws, ug))
throw new WebFaultException(HttpStatusCode.BadRequest);
}
#endregion
}
}
The code is a bit large, but basically what these extra functions do is check the supplied headers for each request and authenticate against the application with the provided username and password, to ensure only valid users can work with our OData service.
The problem currently lies in the new function we declared at the bottom. The API requires a user context to be set for executing the functionality. This is normally done through the Authenticator class.
With the debugger, I followed a request and checked whether the Authenticator is being called, and it is. However, when the SwitchWorkspaceToAdvancedMode function is triggered, this context is lost and it appears as if nobody ever logged in.
The function calls are like this:
Create a new Api.svc instance
Trigger the OnStartProcessingRequest
Trigger the Authenticate method
Trigger the SwitchWorkspaceToAdvancedMode method
But this last one receives an error from the API stating that no login occurred and no user context has been set. We set the user context as the current thread principal on the thread that logged in.
From the error messages, I'm concluding that the actual request for SwitchWorkspaceToAdvancedMode is running on a different thread, and therefore it seems that no login ever occurred, because the login was done on a different thread.
Am I right in this assumption, and if so, can I prevent this or work around it?
I've solved this issue by adding a new ServiceBehavior to the DataService:
[ServiceBehavior(IncludeExceptionDetailInFaults = true, InstanceContextMode = InstanceContextMode.PerSession)]
This solved the apparent threading issue I had.
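For context, this is the only change relative to the service shown in the question; a sketch of the declaration with the new behavior applied:
// Same Api class as above; only the ServiceBehavior attribute changes. PerSession
// keeps one service instance per client session instead of creating one per call,
// which, as reported above, is what resolved the lost-context problem.
[ServiceBehavior(IncludeExceptionDetailInFaults = true, InstanceContextMode = InstanceContextMode.PerSession)]
public class Api : ODataService<Context>
{
// ... members unchanged from the question's listing ...
}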