I want to log NLog-generated application logs to the App Service diagnostics blob, i.e. Application Logging (Blob), but only the default logs are printed there, not the NLog-based custom logs.
Application Logging (Filesystem) works when a file target is added to nlog.config; the problem is only with blob.
nlog.config file:
<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
autoReload="true"
throwConfigExceptions="true"
internalLogLevel="info"
internalLogFile="d:\home\LogFiles\temp\internal-nlog-AspNetCore3.txt">
<!-- enable asp.net core layout renderers -->
<extensions>
<add assembly="NLog.Web.AspNetCore"/>
</extensions>
<!-- the targets to write to -->
<targets>
<target xsi:type="Trace" name="String" layout="${level}\: ${logger}[0]${newline} |trace| ${message}${exception:format=tostring}" />
<target xsi:type="Console" name="lifetimeConsole" layout="${level}\: ${logger}[0]${newline} |console| ${message}${exception:format=tostring}" />
</targets>
<rules>
<logger name="*" minlevel="Trace" writeTo="lifetimeConsole,String" final="true"/>
</rules>
</nlog>
Program.cs file:
using System;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.AzureAppServices;
using NLog.Web;
namespace testapp
{
public class Program
{
public static void Main(string[] args)
{
var logger = NLog.Web.NLogBuilder.ConfigureNLog("nlog.config").GetCurrentClassLogger();
try
{
logger.Debug("init main");
CreateHostBuilder(args).Build().Run();
}
catch (Exception exception)
{
logger.Error(exception, "Stopped program because of exception");
throw;
}
finally
{
NLog.LogManager.Shutdown();
}
}
public static IHostBuilder CreateHostBuilder(string[] args) =>
Host.CreateDefaultBuilder(args)
.ConfigureLogging(logging =>
{
logging.SetMinimumLevel(Microsoft.Extensions.Logging.LogLevel.Trace);
logging.AddConsole();
logging.AddDebug();
logging.AddAzureWebAppDiagnostics();
})
.UseNLog() // NLog: Setup NLog for Dependency injection
.ConfigureServices(serviceCollection => serviceCollection
.Configure<AzureBlobLoggerOptions>(options =>
{
options.BlobName = "testlog.txt";
}))
.ConfigureWebHostDefaults(webBuilder =>
{
webBuilder.UseStartup<Startup>();
});
}
}
The NLog-based log entries are not written to the App Service diagnostics blob; only the default logging is printed.
Kindly help to resolve this issue.
It seems that the System.Diagnostics.Trace target works best for ASP.NET applications and not for ASP.NET Core applications in Azure.
When using AddAzureWebAppDiagnostics, all output written to the Microsoft ILogger is redirected to the file system or blob. Any output written to pure NLog Logger objects will not be redirected.
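In other words, only calls that go through the Microsoft ILogger abstraction reach Application Logging (Blob). A small sketch of the difference (class and message names are only illustrative):
using Microsoft.Extensions.Logging;
public class WeatherService
{
    // Injected by the host; AddAzureWebAppDiagnostics hooks into this pipeline.
    private readonly ILogger<WeatherService> _msLogger;
    // Pure NLog logger; it only knows about the targets in nlog.config.
    private static readonly NLog.Logger _nlogLogger = NLog.LogManager.GetCurrentClassLogger();
    public WeatherService(ILogger<WeatherService> msLogger)
    {
        _msLogger = msLogger;
    }
    public void DoWork()
    {
        // Redirected by AddAzureWebAppDiagnostics to Application Logging (Filesystem/Blob).
        _msLogger.LogInformation("Written via ILogger");
        // Written only to the targets configured in nlog.config; not redirected to blob.
        _nlogLogger.Info("Written via the NLog Logger");
    }
}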
Maybe the solution is to set up an NLog FileTarget that writes to the HOME directory:
<nlog>
<targets async="true">
<!-- Environment Variable %HOME% matches D:/Home -->
<target type="file" name="appfile" filename="${environment:HOME:cached=true}/logfiles/application/app-${shortdate}-${processid}.txt" />
</targets>
<rules>
<logger name="*" minLevel="Debug" writeTo="appFile" />
</rules>
</nlog>
See also: https://learn.microsoft.com/en-us/azure/app-service/troubleshoot-diagnostic-logs#stream-logs
For including blob output, take a look at NLog.Extensions.AzureBlobStorage.
An alternative to blob output could be Application Insights, but it may have different pricing.
I have an Azure function that looks like this:
[FunctionName("CreateWidgetWorkspace")]
public async Task<IActionResult> CreateWidgetWorkspace(
[HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "widget/workspaces")] HttpRequest req,
[Queue("widgetworkspaces"), StorageAccount("WidgetStorageQueue")] ICollector<string> messageQueue,
ILogger log)
{
WorkspaceResponse response = new WorkspaceResponse();
var content = await new StreamReader(req.Body).ReadToEndAsync();
log.LogInformation($"Received following payload: {content}");
var workspaceRequest = JsonConvert.DeserializeObject<Workspace>(content);
if (workspaceRequest.name != null){
messageQueue.Add(JsonConvert.SerializeObject(workspaceRequest));
}
else {
response.status = "Error: Invalid Request";
response.requestId=null;
}
return new OkObjectResult(JsonConvert.SerializeObject(response));
}
Everything works well; the connection string is defined in my local.settings.json file like this:
{
"IsEncrypted": false,
"Values": {
"FUNCTIONS_WORKER_RUNTIME": "dotnet",
"WidgetStorageQueue":
"DefaultEndpointsProtocol=https;AccountName=accountname;AccountKey=asdf+asdf+AStRrziLg=="
}
}
But now I've created a managed identity, and it has been assigned the "Contributor" role on all resources inside the resource group.
So I need to refactor this code to use the managed identity instead of the connection string in local.settings / environment variables.
Can you point me to an article or video that would put me on the right path?
I would prefer not to use Azure Key Vault if possible.
Thanks.
EDIT 1
I've added the 2 packages referenced in the answer. This is what I have in my csproj file:
<ItemGroup>
<PackageReference Include="Azure.Data.Tables" Version="12.4.0" />
<PackageReference Include="Azure.Storage.Queues" Version="12.10.0" />
<PackageReference Include="Microsoft.Azure.Functions.Extensions" Version="1.1.0" />
<PackageReference Include="Microsoft.Azure.Webjobs.Extensions.ServiceBus" Version="5.2.0" />
<PackageReference Include="Microsoft.Azure.WebJobs.Extensions.Storage.Queues" Version="5.0.1" />
<PackageReference Include="Microsoft.NET.Sdk.Functions" Version="4.0.1" />
</ItemGroup>
<ItemGroup>
<None Update="host.json">
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
</None>
<None Update="local.settings.json">
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
<CopyToPublishDirectory>PreserveNewest</CopyToPublishDirectory>
</None>
</ItemGroup>
And this is what my local.settings.json file looks like:
{
"IsEncrypted": false,
"Values": {
"FUNCTIONS_WORKER_RUNTIME": "dotnet","WidgetStorageQueue__queueServiceUri":"https://mystorageaccountname.queue.core.windows.net"
},
"ConnectionStrings": {}
}
But I'm getting an error:
2022-05-05T19:30:00.774Z] Executed 'CreateWidgetWorkspace' (Failed, Id=asdf-a22b-asdf-asdf-asdf, Duration=6356ms)
[2022-05-05T19:30:00.777Z] System.Private.CoreLib: Exception while executing function: CreateWidgetWorkspace. Azure.Storage.Queues: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:asdf-8003-asdf-asdf-asdf
Time:2022-05-05T19:30:00.7494033Z
[2022-05-05T19:30:00.781Z] Status: 403 (Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.)
[2022-05-05T19:30:00.782Z] ErrorCode: AuthenticationFailed
[2022-05-05T19:30:00.784Z]
[2022-05-05T19:30:00.785Z] Additional Information:
[2022-05-05T19:30:00.788Z] AuthenticationErrorDetail: Issuer validation failed. Issuer did not match.
[2022-05-05T19:30:00.790Z]
[2022-05-05T19:30:00.791Z] Content:
[2022-05-05T19:30:00.793Z] <?xml version="1.0" encoding="utf-8"?><Error><Code>AuthenticationFailed</Code><Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:asdf-8003-asdf-asdf-asdf
Time:2022-05-05T19:30:00.7494033Z</Message><AuthenticationErrorDetail>Issuer validation failed. Issuer did not match.</AuthenticationErrorDetail></Error>
[2022-05-05T19:30:00.795Z]
[2022-05-05T19:30:00.796Z] Headers:
[2022-05-05T19:30:00.797Z] Server: Microsoft-HTTPAPI/2.0
[2022-05-05T19:30:00.801Z] x-ms-request-id: asdf-asdf-asdf-asdf-60671b000000
[2022-05-05T19:30:00.802Z] x-ms-error-code: AuthenticationFailed
[2022-05-05T19:30:00.809Z] Date: Thu, 05 May 2022 19:29:59 GMT
[2022-05-05T19:30:00.810Z] Content-Length: 422
[2022-05-05T19:30:00.811Z] Content-Type: application/xml
[2022-05-05T19:30:00.812Z] .
Here's my Azure Resource Group:
Here's the function app - you can see I have assigned a user-assigned managed identity to it:
And here are the RBAC roles assigned to my managed identity:
Questions
Based on my limited knowledge / reading, it feels like I should install Azure.Identity and create some sort of DefaultAzureCredential?
https://learn.microsoft.com/en-us/dotnet/api/overview/azure/identity-readme#specifying-a-user-assigned-managed-identity-with-the-defaultazurecredential
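I imagine the direct (non-binding) usage would look something like the sketch below, using Azure.Identity and Azure.Storage.Queues; the storage account, queue name, and client ID are placeholders (with the queue output binding itself, it seems no credential code should be needed, only the app settings):
using System;
using System.Threading.Tasks;
using Azure.Identity;
using Azure.Storage.Queues;
public static class QueueSenderSketch
{
    public static async Task SendAsync(string message)
    {
        // In Azure this picks up the user-assigned managed identity;
        // locally it falls back to developer credentials (Visual Studio, Azure CLI, ...).
        var credential = new DefaultAzureCredential(new DefaultAzureCredentialOptions
        {
            ManagedIdentityClientId = "<client-id-of-user-assigned-identity>" // placeholder
        });
        // Data-plane URI of the target queue (placeholder account and queue names).
        var queueUri = new Uri("https://mystorageaccountname.queue.core.windows.net/widgetworkspaces");
        var client = new QueueClient(queueUri, credential);
        await client.SendMessageAsync(message);
    }
}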
EDIT 2
The changes suggested in the answer basically work. To clarify, the setting in local.settings.json that actually works is this:
"[nameofConnectioninC#Method]__serviceUri":"https://[nameOfStorageAccount].queue.core.windows.net/"
This fails when you debug locally, but once everything is published, the upstream tests work.
To use Identity-based connections with Queue Storage, you will need to:
Update your application to use these NuGet packages: Azure.Storage.Queues and Microsoft.Azure.WebJobs.Extensions.Storage.Queues.
Create an app setting called <CONNECTION_NAME_PREFIX>__serviceUri:
The data plane URI of the queue service to which you are connecting, using the HTTPS scheme.
https://<storage_account_name>.queue.core.windows.net
So in your case you need to create a setting called WidgetStorageQueue__serviceUri
Grant your function app's identity permission to access the queue storage. If you just need to send messages to a queue, you could use the Storage Queue Data Message Sender role.
I've created a small function app that reproduces this use case.
csproj file:
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net6.0</TargetFramework>
<AzureFunctionsVersion>v4</AzureFunctionsVersion>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Azure.Storage.Queues" Version="12.9.0" />
<PackageReference Include="Microsoft.Azure.WebJobs.Extensions.Storage.Queues" Version="5.0.0" />
<PackageReference Include="Microsoft.NET.Sdk.Functions" Version="4.1.0" />
</ItemGroup>
<ItemGroup>
<None Update="host.json">
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
</None>
<None Update="local.settings.json">
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
<CopyToPublishDirectory>Never</CopyToPublishDirectory>
</None>
</ItemGroup>
</Project>
function file:
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
namespace FunctionApp1
{
public static class Function1
{
[FunctionName("Function1")]
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req,
[Queue("widgetworkspaces"), StorageAccount("WidgetStorageQueue")] ICollector<string> queueCollector,
ILogger log)
{
string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
queueCollector.Add(requestBody);
return new OkObjectResult(requestBody);
}
}
}
settings file:
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "UseDevelopmentStorage=true",
"FUNCTIONS_WORKER_RUNTIME": "dotnet",
"WidgetStorageQueue__queueServiceUri": "https://<storage_account_name>.queue.core.windows.net"
}
}
I am trying to enable SSL for my Azure Function using the Let's Encrypt site extension for Azure, as described here. I was following the instructions in that wiki and on this website.
However, I get an error when it tries to verify the website.
The error indicates that the acme-challenge page cannot be accessed (404).
This is my web.config under .well-known/:
<?xml version="1.0" encoding="utf-8"?>
<configuration>
<system.webServer>
<handlers>
<clear />
<add name="ACMEStaticFile" path="*" verb="*" modules="StaticFileModule" resourceType="Either" requireAccess="Read" />
</handlers>
<staticContent>
<remove fileExtension="." />
<mimeMap fileExtension="." mimeType="text/plain" />
</staticContent>
</system.webServer>
<system.web>
<authorization>
<allow users="*"/>
</authorization>
</system.web>
</configuration>
Does anyone have any ideas on what could be going wrong?
Here's how you can do that.
In your function app, create a new proxy for the challenge directory (this is required because the challenge will be an HTTP GET to /.well-known/acme-challenge/, and by default a function in a function app will only answer on /api/).
You can set up the proxy in the following way.
proxies.json
{
"$schema": "http://json.schemastore.org/proxies",
"proxies": {
"LetsEncryptProxy": {
"matchCondition": {
"route": "/.well-known/acme-challenge/{code}"
},
"backendUri": "http://%WEBSITE_HOSTNAME%/api/letsencrypt/.well-known/acme-challenge/{code}"
}
}
}
The important setting here is the route template /.well-known/acme-challenge/{*rest}, which will match all requests that go to the challenge directory.
Please note that proxies are a preview feature and you have to enable them for this to work.
Reference:
https://github.com/sjkp/letsencrypt-siteextension/wiki/Azure-Functions-Support
https://blog.bitscry.com/2018/07/06/using-lets-encrypt-ssl-certificates-with-azure-functions/
I figured it out:
Similar to KetanChawda-MSFT's answer, except that the proxy needs to hit the API endpoint function that you create.
So here's the function:
// https://YOURWEBSITE.azurewebsites.net/api/letsencrypt/{code}
#r "Newtonsoft.Json"
using System.Net;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Primitives;
using Newtonsoft.Json;
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, string code, TraceWriter log)
{
log.Info($"C# HTTP trigger function processed a request. {code}");
var content = File.ReadAllText(@"D:\home\site\wwwroot\.well-known\acme-challenge\" + code);
var resp = new HttpResponseMessage(HttpStatusCode.OK);
resp.Content = new StringContent(content, System.Text.Encoding.UTF8, "text/plain");
return resp;
}
And here's the proxy that hits that function when Let's Encrypt tries to verify the ACME challenge:
{
"$schema": "http://json.schemastore.org/proxies",
"proxies": {
"LetsEncryptProxy": {
"matchCondition": {
"route": "/.well-known/acme-challenge/{code}"
},
"backendUri": "http://%WEBSITE_HOSTNAME%/api/letsencrypt/{code}"
}
}
}
I have a .NET Core 2 MVC web application that uses Application Insights on Azure.
I also configured NLog to trace to Application Insights.
Everything works on my PC (I can see exceptions and traces on Azure), but when I deploy the application and use it on Azure, it doesn't generate any events in Application Insights; I only find the events in the log file.
So I tried to create an instance of TelemetryClient in a controller, and that works even in the deployed instance:
TelemetryClient tc = new TelemetryClient()
{
InstrumentationKey = "11111111-2222-3333-4444-555555555555"
};
tc.Context.User.Id = Environment.MachineName;
tc.Context.Session.Id = "session_id";
tc.Context.Device.OperatingSystem = Environment.OSVersion.ToString();
tc.TrackTrace("Privacy page says hello with TelemetryClient");
Here are the snippets of my project:
appsettings.json
{
"ApplicationInsights": {
"InstrumentationKey": "11111111-2222-3333-4444-555555555555"
}
}
appsettings.Staging.json
{
"ConnectionStrings": {
"DefaultConnection": "Server=tcp:dom.database.windows.net,1433;Initial Catalog=dom;Persist Security Info=False;User ID=user;Password=pass;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;"
},
"AllowedHosts": "*",
"Logging": {
"LogLevel": {
"Default": "Trace",
"System": "Information",
"Microsoft": "Information"
}
}
}
I defined the same ASPNETCORE_ENVIRONMENT value (Staging) in Visual Studio and on Azure to be sure the same appsettings files are loaded, and I deployed all the files.
I load the configuration in this way:
var configuration = new ConfigurationBuilder()
.AddJsonFile("appsettings.json")
.AddJsonFile($"appsettings.{environmentName}.json", optional: true)
.AddEnvironmentVariables()
.Build();
CreateWebHostBuilder looks like this:
public static IWebHostBuilder CreateWebHostBuilder(string[] args, IConfiguration config) =>
WebHost.CreateDefaultBuilder(args)
.ConfigureAppConfiguration((hostingContext) =>
{
//config.AddJsonFile("appsettings.json");
})
.UseStartup<Startup>()
.ConfigureLogging(
logging =>
{
logging.ClearProviders();
logging.SetMinimumLevel(Microsoft.Extensions.Logging.LogLevel.Trace);
})
.UseApplicationInsights() // Enable Application Insights
.UseNLog();
nlog.config contains
<extensions>
<add assembly="Microsoft.ApplicationInsights.NLogTarget" />
</extensions>
<targets>
<target type="ApplicationInsightsTarget" name="aiTarget" />
<target xsi:type="File" name="f" fileName="${basedir}/logs/${shortdate}.log"
layout="${longdate} ${uppercase:${level}} ${message}" />
</targets>
<rules>
<logger name="*" minlevel="Warn" writeTo="aiTarget" />
<logger name="*" minlevel="Warn" writeTo="f" />
</rules>
It seems to me that something is wrong in the configuration or the InstrumentationKey, but I don't know how to inspect it.
Any ideas? Is there any way to see how the Application Insights library is configured, in order to find some useful information to solve the problem? I tried remote debugging, but I have no idea what to inspect.
Based on your description, I think you have set another Application Insights key in the Azure portal -> your web application -> Configuration -> Application settings.
Please check whether you did this or not.
If the key is there, you need to remove it, or put the AddEnvironmentVariables() call before AddJsonFile(), like below:
var configuration = new ConfigurationBuilder()
.AddEnvironmentVariables() //put this before AddJsonFile()
.AddJsonFile("appsettings.json")
.AddJsonFile($"appsettings.{environmentName}.json", optional: true)
.Build();
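If you want to confirm which key the application actually ends up using at runtime, one quick check (just a sketch, assuming the Application Insights SDK is registered so TelemetryConfiguration is available through dependency injection) is to read it from a controller:
using Microsoft.ApplicationInsights.Extensibility;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;
public class DiagnosticsController : Controller
{
    private readonly TelemetryConfiguration _telemetryConfiguration;
    private readonly ILogger<DiagnosticsController> _logger;
    public DiagnosticsController(TelemetryConfiguration telemetryConfiguration,
        ILogger<DiagnosticsController> logger)
    {
        _telemetryConfiguration = telemetryConfiguration;
        _logger = logger;
    }
    public IActionResult Index()
    {
        // Logs the instrumentation key the SDK is actually configured with,
        // so it can be compared with appsettings.json and the portal settings.
        _logger.LogWarning("Active InstrumentationKey: {Key}", _telemetryConfiguration.InstrumentationKey);
        return Content(_telemetryConfiguration.InstrumentationKey);
    }
}
This way you can see at a glance whether the deployed app picked up the key from appsettings.json or from an APPINSIGHTS_INSTRUMENTATIONKEY application setting in the portal.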
Please let me know if you have more issues.
I'm working on a .NET Core 2.0 console application with NLog version 4.5.7.
Here is my config file:
<?xml version="1.0" encoding="utf-8" ?>
<nlog
xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
throwExceptions="false"
internalLogFile="Logs/Internal/log.txt"
internalLogLevel="Warn"
internalLogToConsole="false"
internalLogToConsoleError="false"
internalLogToTrace="false"
internalLogIncludeTimestamp="true">
<targets>
<default-wrapper
xsi:type="AsyncWrapper"
timeToSleepBetweenBatches="0"
overflowAction="Block"/>
<default-target-parameters
xsi:type="File"
layout="${longdate}|${level:uppercase=true}|${message}
${onexception:${newline}}${exception:format=tostring}"
archiveAboveSize="10485760"
maxArchiveFiles="10"
archiveNumbering="DateAndSequence"
archiveOldFileOnStartup="true"
keepFileOpen="true"
cleanupFileName="false"/>
<target
name="driver"
xsi:type="File"
archiveFileName="Logs/Driver/Archives/log.{#####}.txt"
fileName="Logs/Driver/log.txt"/>
</targets>
<rules>
<logger
name="driver"
minlevel="Trace"
writeTo="driver"/>
</rules>
</nlog>
I'm setting this config on startup of my application:
LogManager.Configuration = new XmlLoggingConfiguration(<path>, false);
And then I'm adding file targets to this configuration programmatically:
string
deviceTag = string.Concat("Device_", _id.ToString()),
loggerName = string.Concat("logger_", deviceTag);
LogManager.Configuration.AddRuleForAllLevels(
new FileTarget(string.Concat("target_", deviceTag))
{
FileName = string.Concat("Logs/", deviceTag, "/log.txt"),
ArchiveFileName = string.Concat("Logs/", deviceTag,
"/Archives/log.{#####}.txt")
}, loggerName);
_logger = LogManager.GetLogger(loggerName);
LogManager.ReconfigExistingLoggers();
The target is working and logs are written, but the default-target-parameters from the XML config are ignored.
Is this a bug or a feature? Or am I doing something wrong?
I'm currently using log4net with Azure Files to store my logs, and it works great.
I've been searching and can't find any configuration that makes the logger create files no bigger than a given size in KB.
This is the configuration I have:
<rollingStyle value="Size" />
<MaxSizeRollBackups value="10" />
<MaximumFileSize value="10KB" />
<AzureStorageConnectionString value="connectiondatahere" />
<ShareName value="filelog" />
<Path value="processor" />
<File value="processor_{yyyy-MM-dd}.txt" />
<layout type="log4net.Layout.PatternLayout">
<ConversionPattern value="%date %-5level %logger %message%newline"/>
</layout>
</appender>
<root>
<level value="ALL" />
<appender-ref ref="AzureFileAppender"/>
</root>
I've tried a few variations of this configuration but no luck.
After reviewing the source code of log4net-appender-azurefilestorage, I found that a log file size limit is not currently supported in the Azure file appender. I suggest you rewrite the Azure file appender yourself and add the size limit feature.
Below are the steps to do it.
Step 1: Add a property named MaximumFileSize to the AzureFileAppender class.
public int MaximumFileSize { get; set; }
Step 2: Add the size limit check when appending a log entry to the file.
protected override void Append(LoggingEvent loggingEvent)
{
Initialise(loggingEvent);
var buffer = Encoding.UTF8.GetBytes(RenderLoggingEvent(loggingEvent));
if ((_file.Properties.Length + buffer.Length) > MaximumFileSize)
{
// The file has reached the maximum size; handle the roll-over here (see the sketch after this method).
}
else
{
_file.Resize(_file.Properties.Length + buffer.Length);
using (var fileStream = _file.OpenWrite(null))
{
fileStream.Seek(buffer.Length * -1, SeekOrigin.End);
fileStream.Write(buffer, 0, buffer.Length);
}
}
}
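What goes into the overflow branch depends on how you want to roll. Purely as an illustration (the RollToNewFile helper below does not exist in the appender and would have to be implemented with the same CloudFile plumbing that Initialise uses), the same Append method could switch to a fresh file once the limit is hit and then write there:
protected override void Append(LoggingEvent loggingEvent)
{
    Initialise(loggingEvent);
    var buffer = Encoding.UTF8.GetBytes(RenderLoggingEvent(loggingEvent));
    if ((_file.Properties.Length + buffer.Length) > MaximumFileSize)
    {
        // Hypothetical helper: point _file at a new file, for example
        // "processor_2022-05-05.1.txt", the same way Initialise() creates
        // the original one, so the write below lands in the new file.
        RollToNewFile();
    }
    // Append the rendered event to the end of the (possibly new) file.
    _file.Resize(_file.Properties.Length + buffer.Length);
    using (var fileStream = _file.OpenWrite(null))
    {
        fileStream.Seek(buffer.Length * -1, SeekOrigin.End);
        fileStream.Write(buffer, 0, buffer.Length);
    }
}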
Step 3: After that, you can add the size limit (in bytes) to the configuration file:
<MaximumFileSize value="10240" />