Cannot inject dependencies into Azure WorkerRole object using Spring.NET

I have moderate experience developing ASP.NET web applications with Spring.NET 4.0 and NHibernate 3.0. Recently I ran into a situation where I needed to use Spring.NET to inject service dependencies into a WorkerRole class. I created the app.config file the same way I normally create web.config files for Spring. Here it is for clarity (I have excluded the root nodes):
<configSections>
<sectionGroup name="spring">
<section name="context" type="Spring.Context.Support.WebContextHandler, Spring.Web" requirePermission="false" />
<section name="objects" type="Spring.Context.Support.DefaultSectionHandler, Spring.Core" requirePermission="false" />
<section name="parsers" type="Spring.Context.Support.NamespaceParsersSectionHandler, Spring.Core" />
</sectionGroup>
</configSections>
<spring>
<context>
<!-- Application services and data access that has been previously developed and tested-->
<resource uri="assembly://DataAccess/data-access-config.xml" />
<resource uri="assembly://Services/service-config.xml" />
<resource uri="AOP.xml" />
<resource uri="DI.xml"/>
</context>
<parsers>
<parser type="Spring.Data.Config.DatabaseNamespaceParser, Spring.Data" />
<parser type="Spring.Transaction.Config.TxNamespaceParser, Spring.Data" />
<parser type="Spring.Aop.Config.AopNamespaceParser, Spring.Aop" />
</parsers>
</spring>
Similarly, here's the AOP.xml:
<objects xmlns="http://www.springframework.net">
<object id="FilterServiceProxy" type="Spring.Aop.Framework.ProxyFactoryObject, Spring.Aop">
<property name="proxyInterfaces" value="Domain.IFilterService"/>
<property name="target" ref="FilterService"/>
<property name="interceptorNames">
<list>
<value>UnhandledExceptionThrowsAdvice</value>
<value>PerformanceLoggingAroundAdvice</value>
</list>
</property>
</object>
</objects>
and the DI.xml
<object type="FilterMt.WorkerRole, FilterMt" >
<property name="FilterMtService1" ref="FilterServiceProxy"/>
</object>
However, I was unable to inject any dependencies into the worker role. Can someone please let me know what I am doing wrong here? Is there a different way to configure Spring.NET DI for Windows Azure applications?
I don't get any configuration errors, but I can see that the dependencies have not been injected because the property I'm trying to inject into remains null.

Based on my experience, you cannot inject anything into your WorkerRole class (the class that implements RoleEntryPoint). What I do instead, so far with Unity (I also built my own helper for Unity that injects Azure settings), is build my own infrastructure with the container, but I create it in the worker role's code.
For example, I initialize the dependency container in my OnStart() method of RoleEntryPoint, where I resolve anything I need. Then in my Run() method I call a method on my resolved dependency.
Here is a quick, stripped-down version of my RoleEntryPoint implementation:
public class WorkerRole : RoleEntryPoint
{
private UnityServiceHost _serviceHost;
private UnityContainer _container;
public override void Run()
{
// This is a sample worker implementation. Replace with your logic.
Trace.WriteLine("FIB.Worker entry point called", "Information");
using (this._container = new UnityContainer())
{
this._container.LoadConfiguration();
IWorker someWorker = this._container.Resolve<IWorker>();
someWorker.Start();
IWorker otherWorker = this._container.Resolve<IWorker>("otherWorker");
otherWorker.Start();
while (true)
{
// sleep 30 minutes. we don't really need to do anything here.
Thread.Sleep(1800000);
Trace.WriteLine("Working", "Information");
}
}
}
public override bool OnStart()
{
// Set the maximum number of concurrent connections
ServicePointManager.DefaultConnectionLimit = 12;
// For information on handling configuration changes
// see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.
this.CreateServiceHost();
return base.OnStart();
}
public override void OnStop()
{
this._serviceHost.Close(TimeSpan.FromSeconds(30));
base.OnStop();
}
private void CreateServiceHost()
{
this._serviceHost = new UnityServiceHost(typeof(MyService));
var binding = new NetTcpBinding(SecurityMode.None);
RoleInstanceEndpoint externalEndPoint =
RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["ServiceEndpoint"];
string endpoint = String.Format(
"net.tcp://{0}/MyService", externalEndPoint.IPEndpoint);
this._serviceHost.AddServiceEndpoint(typeof(IMyService), binding, endpoint);
this._serviceHost.Open();
}
}
As you can see, my own logic lives behind the IWorker interface; I can have as many implementations as I want, and I instantiate them in my Run() method. On top of that I host a WCF service, again entirely configured via DI with Unity. Here is my IWorker interface:
public interface IWorker : IDisposable
{
void Start();
void Stop();
void DoWork();
}
And that's it. I don't have any "hard" dependencies in my WorkerRole, just the Unity container. And I have very complex dependency graphs in my two workers; everything works pretty well.
The reason you can't hook into your WorkerRole.cs class directly is that it is instantiated by the Windows Azure infrastructure, not by your own infrastructure. You have to accept that and build your infrastructure within the appropriate WorkerRole methods. Also, do not forget that you must never quit/break/return/exit the Run() method: doing so signals to the Windows Azure infrastructure that something is wrong with your code and triggers role recycling.
Hope this helps.
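For completeness, the same pattern can be sketched with Spring.NET instead of Unity, which is what the question uses. This is a minimal sketch under two assumptions: the app.config registers the non-web context handler (Spring.Context.Support.ContextHandler, Spring.Core) rather than the web-specific WebContextHandler, since a worker role has no web pipeline, and DoFiltering() stands in for whatever method Domain.IFilterService actually exposes. The idea is the same: bootstrap the container in OnStart() and pull the proxied service out yourself, instead of expecting Azure to inject it into the WorkerRole.
// requires: using System.Threading; using Microsoft.WindowsAzure.ServiceRuntime; using Spring.Context; using Spring.Context.Support;
public class WorkerRole : RoleEntryPoint
{
private IApplicationContext _context;
private Domain.IFilterService _filterService;
public override bool OnStart()
{
// Builds the container from the <spring> section of app.config.
this._context = ContextRegistry.GetContext();
// Resolve the proxy defined in AOP.xml by its object id.
this._filterService = (Domain.IFilterService)this._context.GetObject("FilterServiceProxy");
return base.OnStart();
}
public override void Run()
{
while (true)
{
// Use the resolved dependency; DoFiltering() is a hypothetical method.
this._filterService.DoFiltering();
Thread.Sleep(1800000);
}
}
}
With this approach the DI.xml entry for FilterMt.WorkerRole becomes unnecessary, because Azure, not Spring, constructs the WorkerRole instance.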

I know this is an old question, but I'm going through the same learning curve and would like to share my findings for someone who struggles to understand the mechanics.
The reason you can't access your DI setup in the role entry point class is that it runs in a separate OS process, outside of IIS. Think of your WebRole class as running inside a Windows Service.
I did a little experiment with my MVC web site and WebRole class:
public class WebRole : RoleEntryPoint
{
public override void Run()
{
while (true)
{
Thread.Sleep(10000);
WriteToLogFile("Web Role Run: run, Forest, RUN!");
}
}
private static void WriteToLogFile(string text)
{
// might want to change the filename
using (var file = new System.IO.StreamWriter("D:\\tmp\\webRole.txt", true))
{
var message = string.Format("{0} | {1}", DateTime.UtcNow, text);
file.WriteLine(message);
}
}
}
This writes a new line to the file every 10 seconds (or so). Now start your Azure site in debugging mode, make sure the site is deployed to the Azure emulator and the debugger in VS has started. Check that the site is running and that the WebRole is writing to the file in question.
Now stop IIS Express (or IIS, if you are running a full-blown installation) without stopping the VS debugger. All operations in your web site have now stopped, but if you check your temp file, the role process is still running and new lines are still being added every 10 seconds, until you stop the debugger.
So whatever you have loaded into the web application's memory lives inside IIS and is not available inside the role entry point process. You need to configure your DI and other services there from scratch, as the sketch below illustrates.
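A quick way to convince yourself of the process boundary (a hypothetical illustration, not something from the original experiment) is to log the hosting process name from both sides, once from the MVC application and once from the role entry point:
// requires: using System.Diagnostics;
public static class ProcessReporter
{
// Call this from Application_Start in the web app and from WebRole.Run();
// the two calls report different process names (e.g. w3wp/iisexpress vs. the role host process).
public static void Report(string caller)
{
Trace.WriteLine(string.Format("{0} is hosted in process '{1}'", caller, Process.GetCurrentProcess().ProcessName));
}
}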
Hope this helps someone to better understand the basics.

Related

Configuring NLog with ServiceStack to not be NullDebugLogger

I'm new to NLog and have chosen to add it to my ServiceStack (4.0.44) web services; however, it's not working as I expect, because I always end up with a NullDebugLogger.
I have
Global.asax
Sub Application_Start(ByVal sender As Object, ByVal e As EventArgs)
LogManager.LogFactory = New NLogFactory()
Dim appHost As New MyAppHost
appHost.Init()
End Sub
I've also manually added an NLog.config file to log to the debugger
<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.nlog-project.org/schemas/NLog.xsd NLog.xsd"
autoReload="true"
throwExceptions="true"
internalLogLevel="Trace" internalLogFile="c:\temp\nlog-internal.log" >
<targets>
<!-- log to the debugger -->
<target xsi:type="Debugger" name="debugger" layout="${logger}::${message}"/>
</targets>
<rules>
<logger name="*" minlevel="Trace" writeTo="debugger" />
</rules>
</nlog>
and finally in my class I have the following
public class MyClass
{
public static ILog Log;
public MyClass()
{
Log = LogManager.GetLogger(typeof(MyClass));
}
public void LogSomething()
{
Log.Debug("Starting to LogSomething");
}
}
When I debug, the Log object in my class shows as a ServiceStack.Logging.NullDebugLogger, which I believe is the default, but I can't figure out how to change it to something I can use. I'm sure I'm missing something simple but can't figure out what it is. My web services are in a different project (in the same solution), which is why my Global.asax is VB and the class is C#. I also have no reference to NLog.config in web.config, but I assume that NLog picks it up anyway.
The way logging works is very simple: LogManager.LogFactory just sets a static property, and all subsequent calls to LogManager.GetLogger(Type) use that concrete factory to return the preferred logger implementation. So it just needs to be set once on Application Start, before any call to LogManager.GetLogger() is made.
LogManager.LogFactory defaults to NullLogFactory and is never set by ServiceStack itself, so the only reasons it wouldn't retain the NLogFactory are: LogManager.GetLogger() isn't being called in the same AppDomain where the factory was set; the factory is only being set after LogManager.GetLogger() has already been called; or some of your code is reverting it back with LogManager.LogFactory = new NullLogFactory().
My hunch, since you've shown both C# and VB.NET code, is that it's not being set in the same web application, i.e. the static property set in VB.NET is not visible in the AppDomain where your C# code is running.
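To make the ordering concrete, here is a minimal C# sketch of the same wiring in a single project (MyAppHost is the AppHost from the question; the NLogFactory namespace below is an assumption and may differ between ServiceStack.Logging.NLog package versions). The key point is that the factory is assigned once, before anything asks for a logger:
using ServiceStack.Logging;
using ServiceStack.Logging.NLogger; // assumed namespace of NLogFactory; adjust to your package version
public class Global : System.Web.HttpApplication
{
protected void Application_Start(object sender, System.EventArgs e)
{
// Must run before any LogManager.GetLogger() call anywhere in the AppDomain.
LogManager.LogFactory = new NLogFactory();
new MyAppHost().Init();
}
}
public class MyClass
{
// Resolved when MyClass is first used; as long as that happens after
// Application_Start has set the factory, this is an NLog-backed logger.
private static readonly ILog Log = LogManager.GetLogger(typeof(MyClass));
public void LogSomething()
{
Log.Debug("Starting to LogSomething");
}
}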

NLog with DNX Core 5.0

I am attempting to implement NLog logging using ASP.NET 5 and MVC 6. By default, both DNX 4.5.1 and DNX Core 5.0 are included in the project template.
I am attempting to implement NLog Logging by following the example here.
However, in the sample app, there is the following line -
#if !DNXCORE50
factory.AddNLog(new global::NLog.LogFactory());
#endif
And if I run the app, this line never gets hit because the MVC application targets DNX Core 5.0 by default.
Are there any loggers available for DNX Core 5.0? If not, what purpose does DNX Core serve in the default MVC app, and is it actually needed?
Edit: If I remove the #if !DNXCORE50 ... line above, I get the following error:
DNX Core 5.0 error - The type or namespace name 'NLog' could not be found in the global namespace'
DNX Core 5.0 is only necessary if you want the cloud-optimized, cross-platform version of the .NET Framework; if you plan on using the MVC app only within a Windows environment, you can remove the dnxcore50 framework reference from your project.json.
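For reference, a minimal project.json fragment under that assumption (names as in the default template, other entries omitted), keeping only the full framework:
"frameworks": {
"dnx451": { }
}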
NLog for .NET Core (DNX environment) is currently available in version 4.4.0-alpha1.
Steps:
Create NLog.config
<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<targets>
<target xsi:type="ColoredConsole" name="ToConsole" />
</targets>
<rules>
<logger name="*" minlevel="Info" writeTo="ToConsole" />
</rules>
</nlog>
Load and parse configuration
// requires: using System.Xml; using NLog; using NLog.Config;
private static ILogger _logger;
public static void LoggerSetup()
{
var reader = XmlReader.Create("NLog.config");
var config = new XmlLoggingConfiguration(reader, null); //filename is not required.
LogManager.Configuration = config;
_logger = LogManager.GetCurrentClassLogger();
}
public static void Main(string[] args)
{
LoggerSetup();
// Log anything you want
}
When dealing with the MVC tooling in MVC6 (dnx stuff), the answer to this is very fluid.
In order to get NLog to work with my web app, I had to do a couple of steps:
-> Big thanks to two NLog discussions (here and here)
I just needed to add the configuration setup in my Startup.cs's constructor:
public Startup(IHostingEnvironment env)
{
// Set up configuration sources.
var builder = new ConfigurationBuilder()
.AddJsonFile("appsettings.json")
.AddEnvironmentVariables();
// Set up logging configuration
// from: https://github.com/NLog/NLog/issues/641
// and: https://github.com/NLog/NLog/issues/1172
var reader = XmlTextReader.Create(File.OpenRead(Path.Combine(builder.GetBasePath(), "NLog.config"))); //stream preferred above byte[] / string.
LogManager.Configuration = new XmlLoggingConfiguration(reader, null); //filename is not required.
log.Info("NLogger starting"); // 'log' here is an NLog Logger field, e.g. obtained via LogManager.GetCurrentClassLogger()
Configuration = builder.Build();
}
I consider this a bit of a stop-gap, as Microsoft is introducing a new logging abstraction (that I hope will end up being what SLF4J is in Java). Unfortunately, documentation on that is a bit thin at the time I'm writing this. NLog is working diligently on an implementation of the new dnx ILoggerProvider interface.
Additional information about my project setup
My NLog.config file is located in the project root folder, next to the project.json and appsettings.json. I had to do a little digging inside AddJsonFile() to see how they handled pathing.
I used yeoman.io and their aspnet generator to set up the web project.
Version of NLog, thanks to Lukasz Pyrzyk above:
"NLog": "4.4.0-alpha1"

Configuring Jersey Test Framework with Security

I am writing a REST web service using Jersey, and I'm trying to write a set of unit tests to test the service using the Jersey Test Framework.
However, I use HTTP Authentication and SecurityContext as part of my web service, and I'm having issues setting up JTF to allow me to test these aspects. I can send authentication information in the request, but how do I configure it to know about the different roles and users I wish to set up?
I'm currently using Jetty (via JettyTestContainerFactory), but can switch to different test containers if needed.
The specific configuration I am trying to achieve is two roles, and four users with the combinations of those possible roles (e.g. No roles, role a, role b, roles a and b). The web service will handle giving access to different URLs, so that doesn't need to be specified in the configuration.
I have done this by implementing my own Jetty test container, similar to the one provided by Jersey. We normally use an embedded Jetty for testing our application in development, and by creating our own test container based on that embedded Jetty, the web application is loaded just as it would be if it were started by a Java main process.
We use a custom Jetty Security Handler configured in a jetty-env.xml file which the embedded Jetty uses to configure the security.
<Set name="securityHandler">
<New class="com.example.DevelopmentSecurityHandler">
<Set name="loginService">
<New class="com.example.DevelopmentLoginService">
<Set name="name">LocalRealm</Set>
<Set name="config">src/main/webapp/WEB-INF/users.properties</Set>
<Call name="start" />
</New>
</Set>
<Set name="authenticator">
<New class="com.example.DevelopmentAuthenticator"></New>
</Set>
<Set name="checkWelcomeFiles">true</Set>
</New>
</Set>
That Jetty env file is loaded by embedded Jetty:
XmlConfiguration configuration = null;
if (jettyEnvFile.exists()) {
try {
configuration = new XmlConfiguration(jettyEnvFile.toURI().toURL());
} catch (Exception e) {
throw new ProcessingException(String.format("Exception loading jetty config from %s", jettyEnvFile));
}
} else {
LOG.warn("No jetty-env.xml found.");
}
The users.properties file referenced in that XML is a simple user-to-role mapping, e.g.
USERNAME=PASSWORD,ROLE_NAME1,ROLE_NAME2
Depending on how you configure your Jetty security, this may or may not work for you. You can also configure this programmatically; there are lots of examples of embedded Jetty here. The SecuredHelloHandler.java example there could be a good start for you.
For the test container, you can basically start by copying org.glassfish.jersey.test.jetty.JettyTestContainerFactory and org.glassfish.jersey.jetty.JettyHttpContainerFactory, essentially changing the
public static Server createServer(final URI uri, final SslContextFactory sslContextFactory, final JettyHttpContainer handler, final boolean start)
method to create your version of an embedded Jetty server with security configured however you require.

Windows Azure creating virtual directory to local storage

I need help with creating a virtual directory pointing to local storage in Windows Azure (production environment). I can set up a virtual directory manually, but it is erased every time Azure restarts the role. The idea is to create the virtual directory via a config file when I upload my project's package to Azure. The question is how to create such a directory so that it points to local storage.
Thanks in advance for any suggestions.
Best Regards,
Darek
I suggest you create the virtual directory by interacting with IIS in the WebRole's OnStart method:
public class WebRole : RoleEntryPoint
{
public override bool OnStart()
{
// Connect to the IIS site.
using (var manager = new Microsoft.Web.Administration.ServerManager())
{
var localResourcePath = RoleEnvironment.GetLocalResource("MyResource").RootPath;
// Add to the root application.
var rootSite = manager.Sites[RoleEnvironment.CurrentRoleInstance.Id + "_Web"];
var rootApplication = rootSite.Applications["/"];
rootApplication.VirtualDirectories.Add("/myVdir", localResourcePath);
// Save
manager.CommitChanges();
}
...
}
}
If I'm right, you'll need to set the execution context to elevated for this to work. You can do this in the ServiceDefinition.csdef:
<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="MyProject" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition" schemaVersion="2012-05.1.7">
<WebRole name="MyRole" vmsize="Small" enableNativeCodeExecution="true">
<Runtime executionContext="elevated" />
...
</WebRole>
</ServiceDefinition>
Note: You'll need to reference Microsoft.Web.Administration.dll (C:\Windows\System32\inetsrv)
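One more piece, which the answer doesn't show and which I'm adding as an assumption: the "MyResource" local storage read by RoleEnvironment.GetLocalResource has to be declared inside the same <WebRole> element of ServiceDefinition.csdef, for example (sizeInMB is a placeholder):
<LocalResources>
<!-- name must match the RoleEnvironment.GetLocalResource("MyResource") call above -->
<LocalStorage name="MyResource" sizeInMB="1024" cleanOnRoleRecycle="false" />
</LocalResources>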

Spring Security MethodSecurityInterceptor

I've got a Java application running on Tomcat 6.
I'm using Spring 3.0.4 and Spring Security 3.0.5.
To protect access to my DAO methods I want to use the Spring Security MethodSecurityInterceptor, but it doesn't actually "intercept" access at all.
It is configured like this:
<bean id="securityInterceptor" class="org.springframework.security.access.intercept.aopalliance.MethodSecurityInterceptor">
<property name="authenticationManager">
<ref bean="authenticationManager"/>
</property>
<property name="accessDecisionManager">
<ref bean="accessDecisionManager"/>
</property>
<property name="securityMetadataSource">
<value>
com.xkst.dao.InvoiceDao.*=ROLE_ADMIN
com.xkst.dao.UserDao.*=ROLE_ADMIN
</value>
</property>
</bean>
According to the configuration, every access to any of the UserDao methods should be intercepted and controlled.
The methods to be protected are accessed by a rich-client Java application. To make the service available to the client I use the Spring HttpInvokerServiceExporter.
The DAO classes are not exported directly; there is a single service class being exported, providing a single point of access for the client.
On the client side I've got this clientContext.xml file which references the exported service on the server.
In the client code I just load the context and pick the exported bean out of it:
public class SecurityTest {
public static void main(String[] args) {
ClassPathXmlApplicationContext ctx = new ClassPathXmlApplicationContext("clientContext.xml");
EntityServiceInterface serverService = (EntityServiceInterface) ctx.getBean("entityServiceInterface");
List<UserEntity> users = serverService.performGetAllUsers();
for(UserEntity user : users) {
System.out.println(user.getUserName());
}
}
}
Here I can invoke any method of the 'serverService' from my client without authenticating, even though these methods should be protected by the 'MethodSecurityInterceptor'. I can query all data from my 'UserDao'.
I really don't know what the missing link is.
The authenticationManager and accessDecisionManager are configured as well. There is no error message at server startup; it even logs the creation of the "secured methods", like:
2011-08-01 10:38:48,675 INFO MethodDefinitionMap:75 - Adding secure method [public java.util.List com.xkst.dao.UserDao.findAll()] with attributes [[ROLE_ADMIN]]
So what am I doing wrong?
