Using a custom sink with ServiceStack.Logging.Serilog?

Is there a way (non-obvious to me, at least) to add a custom sink, e.g. MongoDB or Microsoft Teams, as part of instantiating the Serilog factory in the ServiceStack framework, or will it be a case of rolling your own factory and implementation of ILog?
PM> Install-Package ServiceStack.Logging.Serilog
LogManager.LogFactory = new SerilogFactory();
ServiceStack Logging
Serilog
Example: MongoDB Sink
This works without using the ServiceStack implementation, but is it considered bad form?
public override void Configure(Container container)
{
Log.Logger = new LoggerConfiguration()
.WriteTo.MongoDBCapped("mongodb://mymongourl:27017/mylogs",
collectionName: "mycollectionoflogs", cappedMaxSizeMb: 50,
cappedMaxDocuments: 10000)
.CreateLogger();
SetConfig(new HostConfig
{
DefaultRedirectPath = "/metadata",
DebugMode = AppSettings.Get(nameof(HostConfig.DebugMode), false)
});
}
and in the ServiceInterface message implementation:
public object Any(MyRequest request)
{
Log.Information("I'm a lumberjack and I'm OK");
return new MyRequestResponse
{
Result = $"{ results.Chop() }"
};
}

I've just added a constructor overload to use a custom Serilog logger in this commit; this change is available from v5.1.1, which is now available on MyGet.
With this change you can pass a custom Serilog logger to ServiceStack's SerilogFactory, e.g:
LogManager.LogFactory = new SerilogFactory(new LoggerConfiguration()
.WriteTo.MongoDBCapped("mongodb://mymongourl:27017/mylogs",
collectionName: "mycollectionoflogs", cappedMaxSizeMb: 50,
cappedMaxDocuments: 10000)
.CreateLogger());
You can use the Serilog Logger directly like in your example, except it won't be able to capture ServiceStack's built-in logs, and you won't be able to substitute it later with any of the other ServiceStack loggers.
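For completeness, here is a minimal sketch (not from the answer) of what logging through ServiceStack's abstraction looks like once the factory above is registered, reusing the request/response DTOs from the question:
using ServiceStack;
using ServiceStack.Logging;
public class MyService : Service
{
    // Resolved through whatever LogFactory was assigned to LogManager,
    // so the MongoDB sink configured above receives these entries too.
    private static readonly ILog Log = LogManager.GetLogger(typeof(MyService));
    public object Any(MyRequest request)
    {
        Log.Info("I'm a lumberjack and I'm OK"); // goes through ServiceStack's ILog, not Serilog's static Log
        return new MyRequestResponse();
    }
}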

Related

How to distinguish traces from different instances of a .NET Core application in Application Insights

I work on a .NET Core 2.2 console application that uses Microsoft.Extensions.Logging and is configured to send logs to Azure Application Insights using Microsoft.ApplicationInsights.Extensibility by:
services.AddSingleton(x =>
new TelemetryClient(
new TelemetryConfiguration
{
InstrumentationKey = "xxxx"
}));
...
var loggerFactory = serviceProvider.GetService<ILoggerFactory>();
loggerFactory.AddApplicationInsights(serviceProvider, logLevel);
It works OK: I can read the logs in Application Insights. But the application can be started simultaneously in a few instances (in different Docker containers). How can I distinguish traces from different instances? I can use the source FileName, but I don't know how I should inject it.
I tried to use Scope:
var logger = loggerFactory.CreateLogger<Worker>();
logger.BeginScope(dto.FileName);
logger.LogInformation($"Start logging.");
It's interesting that my configuration is almost identical to the one in this example: https://github.com/MicrosoftDocs/azure-docs/issues/12673
But in my case I can't see the property "FileName" in Application Insights.
For a console project, if you want to use a custom ITelemetryInitializer, you should register it in this format: .TelemetryInitializers.Add(new CustomInitializer());
The official doc is here.
I tested it on my side and it works; the role name can be set.
Sample code is below:
static void Main(string[] args)
{
TelemetryConfiguration configuration = TelemetryConfiguration.CreateDefault();
configuration.InstrumentationKey = "xxxxx";
configuration.TelemetryInitializers.Add(new CustomInitializer());
var client = new TelemetryClient(configuration);
ServiceCollection services = new ServiceCollection();
services.AddSingleton(x => client);
var provider = services.BuildServiceProvider();
var loggerFactory = new LoggerFactory();
loggerFactory.AddApplicationInsights(provider, LogLevel.Information);
var logger = loggerFactory.CreateLogger<Program>();
logger.LogInformation("a test message 111...");
Console.WriteLine("Hello World!");
Console.ReadLine();
}
Check the role name in the Azure portal.
If you really have no way to distinguish them you can use a custom telemetry initializer like this:
public class CustomInitializer : ITelemetryInitializer
{
public void Initialize(ITelemetry telemetry)
{
telemetry.Context.Cloud.RoleName = Environment.MachineName;
}
}
and/or you can add a custom property:
public class CustomInitializer : ITelemetryInitializer
{
public void Initialize(ITelemetry telemetry)
{
if(telemetry is ISupportProperties)
{
((ISupportProperties)telemetry).Properties["MyIdentifier"] = Environment.MachineName;
}
}
}
In this example I used Environment.MachineName, but you can of course use something else if needed, like that work Id parameter of yours.
Then wire it up using:
services.AddSingleton<ITelemetryInitializer, CustomInitializer>();

ASP.NET Core: How to read the content of an Azure Table

I'm developing an ASP.NET Core application that works with Azure Tables.
So, I created a Table storage account in the Azure Portal, created a table, filled it with some test data, and now I would like to display the content of that table to test the reading.
My appsettings.json is:
{
"ConnectionStrings": {
"MyTables":"DefaultEndpointsProtocol=https;AccountName=yyy;AccountKey=xxx;EndpointSuffix=core.windows.net"
},
"Logging": {
"IncludeScopes": false,
[etc etc...]
}
}
And my Startup.cs:
public class Startup
{
public Startup(IHostingEnvironment env)
{
var builder = new ConfigurationBuilder()
.SetBasePath(env.ContentRootPath)
.AddJsonFile("appsettings.json", optional: false, reloadOnChange: true)
.AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true)
.AddEnvironmentVariables();
Configuration = builder.Build();
// here in debug we can see the connection string, that is OK
Console.WriteLine($"conn string:{Configuration["ConnectionStrings:MyTables"]}");
}
public IConfigurationRoot Configuration { get; }
// This method gets called by the runtime. Use this method to add services to the container.
public void ConfigureServices(IServiceCollection services)
{
// Add framework services.
services.AddMvc();
}
And here is my controller, where I try to display the values:
using Microsoft.AspNetCore.Mvc;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;
using NextMove.Models;
using System.Text;
[...]
public class HelloWorldController : Controller
{
public string ReadTables() {
// ????? Code does not work, as Startup not a reference
string myConnString = Startup.Configuration["ConnectionStrings:MyTables"];
//////////////////////////////////
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(myConnString);
CloudTableClient tableClient = storageAccount.CreateCloudTableClient();
CloudTable table = tableClient.GetTableReference("themes");
TableQuery<ProjectThemeEntity> query = new TableQuery<ProjectThemeEntity>().Where(TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, "fr"));
StringBuilder response = new StringBuilder("Here is your test table:");
foreach (ProjectThemeEntity item in table.ExecuteQuery(query)) {
response.AppendLine($"Key: {item.RowKey}; Value: {item.Description}");
}
return response.ToString();
}
//
// GET: /HelloWorld/
public IActionResult Index() {
return View();
}
Questions:
a) How do I fix this code in order to get the connection string?
b) There should be a "Table.ExecuteQuery(query)" method, as per this MSDN article, in the controller's foreach, but no such method can be found on the CloudTable class. I have added the necessary references, as shown in the controller's code above, yet only two "Async" methods are available.
PS: For question (b), several people have had the same issue here; I hope the situation has changed now...
You can't access Startup.Configuration from the controller because it's not a static property. Even though you've made it public (generally not a good idea) it still requires you to have an instance of Startup to get access to it.
Generally, to get access to settings in ASP.NET Core, it's best to create a class with the properties you want and use the IOptions pattern to get them with dependency injection. In your startup, where you configure your services (add services to the dependency injection container), you use the helper methods to add your configuration object to the container; then in your controller you specify that you want an IOptions<T> or IOptionsSnapshot<T> to get access to it.
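A minimal sketch of that pattern, assuming an options class named TableStorageOptions (the name is made up for illustration) bound to the existing "ConnectionStrings" section:
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Options;
// Settings class whose properties mirror the keys of a configuration section.
public class TableStorageOptions
{
    public string MyTables { get; set; }
}
// In Startup.ConfigureServices: bind the "ConnectionStrings" section to the options class.
public void ConfigureServices(IServiceCollection services)
{
    services.Configure<TableStorageOptions>(Configuration.GetSection("ConnectionStrings"));
    services.AddMvc();
}
// In the controller: ask for IOptions<TableStorageOptions> instead of reaching into Startup.
public class HelloWorldController : Controller
{
    private readonly TableStorageOptions _options;
    public HelloWorldController(IOptions<TableStorageOptions> options)
    {
        _options = options.Value; // _options.MyTables now holds the connection string
    }
}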
I'd suggest you don't put your data access in your controller, though. It makes your controller harder to read and harder to maintain if you need to change your strategy later. Move your ReadTables method to its own class and add it to the DI container in Startup, passing whatever settings you need to create the service. Use constructor injection in your controller to get the service and execute calls from your controller actions where you need them.
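And a rough sketch of pulling the table access out of the controller into its own injectable class, using the segmented async query API (the ThemeTableReader name is made up; it assumes the same Microsoft.WindowsAzure.Storage SDK and ProjectThemeEntity from the question):
using System.Text;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;
public class ThemeTableReader
{
    private readonly CloudTable _table;
    public ThemeTableReader(string connectionString)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        _table = account.CreateCloudTableClient().GetTableReference("themes");
    }
    public async Task<string> ReadAsync()
    {
        var query = new TableQuery<ProjectThemeEntity>()
            .Where(TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, "fr"));
        var response = new StringBuilder("Here is your test table:");
        TableContinuationToken token = null;
        do
        {
            // ExecuteQuerySegmentedAsync stands in for the synchronous ExecuteQuery
            // that the question notes is not available.
            var segment = await _table.ExecuteQuerySegmentedAsync(query, token);
            token = segment.ContinuationToken;
            foreach (var item in segment.Results)
                response.AppendLine($"Key: {item.RowKey}; Value: {item.Description}");
        } while (token != null);
        return response.ToString();
    }
}
// Hypothetical wiring in Startup.ConfigureServices:
// services.AddSingleton(new ThemeTableReader(Configuration["ConnectionStrings:MyTables"]));
// The controller then takes ThemeTableReader in its constructor and awaits ReadAsync() in an action.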

ServiceStack 4.5 configure log4net programmatically

I am starting a new project with ServiceStack 4.5. Is there any way to configure log4net programmatically? In the documentation I found
LogManager.LogFactory = new Log4NetFactory(configureLog4Net: true);
I added this to the constructor of the AppHost class. However, this seems to assume that you put the configuration in the App.config file (I am self-hosting in a Windows service).
In some other projects I wrote a singleton and then used the Log4Net API to do the configuration:
private static void CreateFileAppender(ref Logger bedInventoryLogger, string logFilePath, Level logLevel, int maxFileSizeInMb, bool filterNh)
{
var filePatternLayout = new PatternLayout
{
ConversionPattern = "%date; [%thread]; %-5level; %logger; [%type{1}.%method]; - %message%newline"
};
filePatternLayout.ActivateOptions();
var bediLogFileAppender = new RollingFileAppender
{
File = logFilePath,
AppendToFile = true,
MaximumFileSize = $"{maxFileSizeInMb}MB",
MaxSizeRollBackups = 5,
RollingStyle = RollingFileAppender.RollingMode.Size,
LockingModel = new FileAppender.MinimalLock(),
Layout = filePatternLayout,
StaticLogFileName = true,
Threshold = logLevel
};
if (filterNh)
{
bediLogFileAppender.AddFilter(new LoggerMatchFilter
{
LoggerToMatch = "NHibernate",
AcceptOnMatch = false
});
bediLogFileAppender.AddFilter(new LoggerMatchFilter
{
LoggerToMatch = "NHibernate.SQL",
AcceptOnMatch = false
});
bediLogFileAppender.AddFilter(new LoggerMatchFilter
{
LoggerToMatch = "FluentNHibernate",
AcceptOnMatch = false
});
}
bediLogFileAppender.ActivateOptions();
bedInventoryLogger.AddAppender(bediLogFileAppender);
}
Since I use several logs, appenders etc. and wanted to turn off NHibernate logging (I am using NHibernate 4 as the ORM), I found it more convenient to do the configuration in C# than in XML.
Is it possible to hook this in with ServiceStack, or am I better off using Log4Net directly?
The default ServiceStack Log4Net adapter doesn't allow you to inject a configured Log4Net instance. However, the adapter classes are easy to copy and modify; they consist of just these 2 files, which basically just forward the calls to Log4Net:
Log4NetFactory.cs
Log4NetLogger.cs
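Alternatively, assuming Log4NetFactory only runs the XML configurator when configureLog4Net is true (as the snippet in the question suggests), you could configure log4net in code first and then register the factory. A minimal sketch, with the appender settings abbreviated (adapt the CreateFileAppender logic from the question):
using log4net.Appender;
using log4net.Config;
using log4net.Layout;
using ServiceStack.Logging;
using ServiceStack.Logging.Log4Net;
public AppHost() : base("MyService", typeof(MyService).Assembly)
{
    var layout = new PatternLayout { ConversionPattern = "%date [%thread] %-5level %logger - %message%newline" };
    layout.ActivateOptions();
    var appender = new RollingFileAppender
    {
        File = "logs/service.log",
        AppendToFile = true,
        RollingStyle = RollingFileAppender.RollingMode.Size,
        MaxSizeRollBackups = 5,
        Layout = layout
    };
    appender.ActivateOptions();
    // Configure the log4net repository programmatically instead of via App.config.
    BasicConfigurator.Configure(appender);
    // configureLog4Net: false skips the XML configurator, so the programmatic config above is used.
    LogManager.LogFactory = new Log4NetFactory(configureLog4Net: false);
}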

How do we integrate Elmah logging in ServiceStack?

I am new to ServiceStack and Elmah logging.
Can anybody suggest how to integrate Elmah in ServiceStack applications?
Thank you...
If you have an existing logging solution then you can use the ServiceStack.Logging.Elmah project. It is available via NuGet.
Exceptions, errors and fatal calls will be logged to Elmah in addition to the originally intended logger. For all other log types, only the original logger is used.
So if you are already using Log4Net then you can just configure Elmah like this
ElmahLogFactory factory = new ElmahLogFactory(new Log4NetFactory());
If you don't want to wrap it over an existing logger, then you can just research adding Elmah to any ASP.NET website. There is no reason it wouldn't work just because you are using ServiceStack.
using ServiceStack.Logging;
using ServiceStack.Logging.Elmah;
using ServiceStack.Logging.NLogger;
public AppHost()
: base(
"description",
typeof(MyService).Assembly)
{
LogManager.LogFactory = new ElmahLogFactory(new NLogFactory());
}
public override void Configure(Container container)
{
this.ServiceExceptionHandler += (request, exception) =>
{
// log your exceptions here
HttpContext context = HttpContext.Current;
ErrorLog.GetDefault(context).Log(new Error(exception, context));
// call default exception handler or prepare your own custom response
return DtoUtils.HandleException(this, request, exception);
};
// rest of your config
}
}
Now your ServiceStack errors appear in Elmah (assuming you've set up web.config etc.).
Actually, kampsj's answer is better than Gavin's, as Gavin's causes double-logging to Elmah by calling the explicit Elmah logger and then the default ServiceStack error handling... which itself already does the logging.
So really all you need is this (below assumes you want to wrap NLog with Elmah):
public class YourAppHost : AppHostBase
{
public YourAppHost() //Tell ServiceStack the name and where to find your web services
: base("YourAppName", typeof(YourService).Assembly)
{
LogManager.LogFactory = new ElmahLogFactory(new NLogFactory());
}
//...just normal stuff...
}
You could just have this above:
ElmahLogFactory factory = new ElmahLogFactory();
...but you probably should wrap another type of logger for non-error logging, like Debug and Warn.
See this section on configuring Elmah and the Logging.Elmah UseCase for a working example of ServiceStack and Elmah configured together.
The ElmahLogFactory can be configured in your Global.asax before initializing the ServiceStack AppHost, e.g:
public class Global : System.Web.HttpApplication
{
protected void Application_Start(object sender, EventArgs e)
{
var debugMessagesLog = new ConsoleLogFactory();
LogManager.LogFactory = new ElmahLogFactory(debugMessagesLog, this);
new AppHost().Init();
}
}

Is it possible to mock NLog log methods?

Is it possible/easy to mock NLog log methods, using Rhino Mocks or similar?
Using NuGet: install-package NLog.Interface
Then: ILogger logger = new LoggerAdapter([logger-from-NLog]);
You can only mock virtual methods. But if you create an interface for logging and then implement it using NLog, you can use dependency injection and, in your tests, use the mocked interface to see if the system under test (SUT) is logging what you expect it to log.
public class SUT
{
    private readonly ILogger logger;
    public SUT(ILogger logger) { this.logger = logger; }
    public void MethodUnderTest()
    {
        // ...
        logger.LogSomething();
        // ...
    }
}
// and in tests
var mockLogger = new MockLogger();   // test double implementing ILogger
var sut = new SUT(mockLogger);
sut.MethodUnderTest();
Assert.That(mockLogger.LastLoggedMessage, Is.EqualTo("Expected log message"));
The simple answer is 'no'. Looking at the code, dependency injection is not supported, which seems rather an oversight, especially as it doesn't look difficult to implement (at first glance).
The only interfaces in the project are there to support COM interop objects and a few other things. The main Logger concrete class neither implements an interface nor provides virtual methods.
You could either provide an interface yourself, or use Moles/TypeMock/another isolation framework to mock the dependency.
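Providing the interface yourself is just a thin adapter over the NLog Logger; a minimal sketch, with made-up interface and class names:
using NLog;
// Hypothetical logging abstraction owned by your own code.
public interface IAppLogger
{
    void Info(string message);
    void Error(string message);
}
// Adapter that forwards to a real NLog Logger in production.
public class NLogAppLogger : IAppLogger
{
    private static readonly Logger Log = LogManager.GetCurrentClassLogger();
    public void Info(string message) => Log.Info(message);
    public void Error(string message) => Log.Error(message);
}
// Classes depend on IAppLogger, so tests can substitute a Rhino Mocks (or similar) fake.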
I've used code like this to stub out the NLog logging code. You can make use of NLog's MemoryTarget, which just keeps messages in memory until it's disposed of. You can query the content of the log using LINQ or whatever (this example uses FluentAssertions):
using FluentAssertions;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using NLog;
using NLog.Config;
using NLog.Targets;
...
private MemoryTarget _stubLogger;
[TestInitialize]
public void Setup()
{
ConfigureTestLogging();
}
protected virtual LoggingConfiguration GetLoggingConfiguration()
{
var config = new NLog.Config.LoggingConfiguration();
this._stubLogger = new MemoryTarget();
_stubLogger.Layout = "${level}|${message}";
config.AddRule(LogLevel.Debug, LogLevel.Fatal, this._stubLogger);
return config;
}
protected virtual void ConfigureTestLogging()
{
var config = GetLoggingConfiguration();
NLog.LogManager.Configuration = config;
}
[TestMethod]
public void ApiCallErrors_ShouldNotThrow()
{
// arrange
var target = new Thing();
// act
target.DoThing();
// assert
this._stubLogger.Logs.Should().Contain(l =>
l.Contains("Error|") &&
l.Contains("Expected Message"));
}
