JsonLayout with non-formatted message - NLog

I have a little problem with JsonLayout.
NLog version: 4.7.10
Platform: netcoreapp 3.1
Current NLog config:
<target name="jsonFileMw" xsi:type="File" fileName="logs\mw.log"
archiveAboveSize="10240"
maxArchiveDays="5"
archiveNumbering="DateAndSequence"
archiveEvery="Day"
enableArchiveFileCompression="true">
<layout xsi:type="JsonLayout" includeAllProperties="true">
<attribute name="time" layout="${longdate}" />
<attribute name="level" layout="${level:upperCase=true}"/>
<attribute name="message" layout="${message}" />
</layout>
</target>
My logging code:
_logger.LogInformation("request received. {RequestUrl} {RequestBody}", "some url", "some body");
This logging code produces the following log line:
{ "time": "2021-08-02 15:07:30.8198", "level": "INFO", "message": "request received. some url some body", "RequestUrl": "some url", "RequestBody": "some body" }
As you can see, the property values are rendered into the message as well, which means the same information is logged twice and the log file size increases. I just want to keep the message simple. The desired output is below:
{ "time": "2021-08-02 15:07:30.8198", "level": "INFO", "message": "request received. {RequestUrl} {RequestBody}", "RequestUrl": "some url", "RequestBody": "some body" }
How can I achieve this?

You can do this:
<attribute name="messagetemplate" layout="${message:raw=true}" />
See also: https://github.com/NLog/NLog/wiki/Message-Layout-Renderer
See also: https://github.com/NLog/NLog/wiki/How-to-use-structured-logging#output-captured-properties

Related

NLog.Targets.Splunk - Possible to get rid of the "Properties" wrap?

In: NLog.Targets.Splunk
https://github.com/AlanBarber/NLog.Targets.Splunk
When using the NLog configuration with:
includeEventProperties="true"
or if I have:
includeEventProperties="false" and use:
<contextproperty name="host" layout="${machinename}" />
<contextproperty name="threadid" layout="${threadid}" />
<contextproperty name="logger" layout="${logger}" />
I get the logs in the following format (properties wrapped in "Properties"):
{"Level":"Info","MessageTemplate":"ApiRequest","RenderedMessage":"ApiRequest","Properties":{"httpMethod":"GET","statusCode":200}, ...}
Is it possible to get rid of the Properties wrapper and have a flatter format?
{ "Level": "Info", "httpMethod": "GET", "statusCode":200, ... }
Many thanks! :-)

Azure Function - Failed to start a new language worker for runtime: dotnet-isolated

I have a .NET 5 function app that I've been building and deploying from a DevOps pipeline for a couple of weeks.
Following the most recent release, I see the following error in App Insights:
Exception type System.TimeoutException
Exception message The operation has timed out.
LogLevel Error
prop__{OriginalFormat} Failed to start a new language worker for runtime: dotnet-isolated.
Category Microsoft.Azure.WebJobs.Script.Workers.Rpc.RpcFunctionInvocationDispatcher
System.TimeoutException: The operation has timed out.
at Microsoft.Azure.WebJobs.Script.Grpc.GrpcWorkerChannel.StartWorkerProcessAsync()
csproj file:
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>net5.0</TargetFramework>
<Nullable>enable</Nullable>
<UserSecretsId>4f786da6-0d47-4ccc-b343-638a6e34e1cf</UserSecretsId>
</PropertyGroup>
<ItemGroup>
<None Remove="local.settings.json" />
</ItemGroup>
<ItemGroup>
<Content Include="local.settings.json">
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
<CopyToPublishDirectory>Never</CopyToPublishDirectory>
</Content>
</ItemGroup>
<ItemGroup>
<PackageReference Include="Microsoft.AspNetCore.Mvc.Abstractions" Version="2.2.0" />
<PackageReference Include="Microsoft.AspNetCore.Mvc.Core" Version="2.2.5" />
<PackageReference Include="Microsoft.Azure.Functions.Worker" Version="1.2.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Abstractions" Version="1.0.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Http" Version="3.0.13" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Storage" Version="4.0.4" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Sdk" Version="1.0.3" />
<PackageReference Include="Microsoft.Azure.Services.AppAuthentication" Version="1.6.1" />
<PackageReference Include="Microsoft.Data.SqlClient" Version="3.0.0" />
<PackageReference Include="Microsoft.Extensions.Configuration.UserSecrets" Version="5.0.0" />
<PackageReference Include="NSwag.AspNetCore" Version="13.11.1" />
<PackageReference Include="Serilog.AspNetCore" Version="4.1.0" />
<PackageReference Include="Serilog.Sinks.ApplicationInsights" Version="3.1.0" />
<PackageReference Include="Serilog.Sinks.MSSqlServer" Version="5.6.0" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\infrastructure\SmsRouter.GovNotify\SmsRouter.GovNotify.csproj" />
<ProjectReference Include="..\SmsRouter.Infrastructure\SmsRouter.EntityFramework.csproj" />
<ProjectReference Include="..\SmsRouter.Utrn\SmsRouter.Utrn.csproj" />
</ItemGroup>
<ItemGroup>
<None Update="host.json">
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
</None>
</ItemGroup>
</Project>
host.json:
{
"version": "2.0"
}
Function App Configuration:
[
{
"name": "APPINSIGHTS_INSTRUMENTATIONKEY",
"value": "<my key is here>",
"slotSetting": true
},
{
"name": "AzureWebJobsStorage",
"value": "DefaultEndpointsProtocol=https;AccountName=storesmsroutermsdn;AccountKey=<my key is here>;EndpointSuffix=core.windows.net",
"slotSetting": false
},
{
"name": "FUNCTIONS_EXTENSION_VERSION",
"value": "~3",
"slotSetting": false
},
{
"name": "FUNCTIONS_WORKER_RUNTIME",
"value": "dotnet-isolated",
"slotSetting": false
},
{
"name": "WEBSITE_CONTENTAZUREFILECONNECTIONSTRING",
"value": "DefaultEndpointsProtocol=https;AccountName=storesmsroutermsdn;AccountKey=<my key is here>;EndpointSuffix=core.windows.net",
"slotSetting": false
},
{
"name": "WEBSITE_CONTENTSHARE",
"value": "func-smsrouter-msdn-01b300",
"slotSetting": false
},
{
"name": "WEBSITE_ENABLE_SYNC_UPDATE_SITE",
"value": "true",
"slotSetting": false
},
{
"name": "WEBSITE_RUN_FROM_PACKAGE",
"value": "1",
"slotSetting": false
}
]
Function definition:
[Function("HttpExample")]
public static HttpResponseData Run([HttpTrigger(AuthorizationLevel.Function, "get", "post")] HttpRequestData req,
    FunctionContext executionContext)
{
    var response = req.CreateResponse(HttpStatusCode.OK);
    response.Headers.Add("Content-Type", "text/plain; charset=utf-8");
    response.WriteString("Welcome to Azure Functions!");
    return response;
}
Has anyone else run into this problem?
Note: I have now created a support ticket for this via the Azure Portal - the ID is 2106280050000196. GitHub issue here.
Edit: Following the suggestion from @Kaylan, I used the Azure CLI to create a new function app with the --runtime dotnet-isolated param. I then deployed my functions into this (using a DevOps pipeline with the Deploy Azure Function task) but I'm afraid I continue to see the same error.
I've also tried deploying to a fixed app service plan (rather than consumption) but continued to hit the same problem.
I was just dealing with the same problem. I finally fixed it by adding .ConfigureFunctionsWorkerDefaults() to my Program.cs file. I had removed it by accident.
I guess what I'm saying is, make sure you have .ConfigureFunctionsWorkerDefaults() in your Program.cs file. Here's an example:
using DataApi.AzureFunctions.Throttled;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Hosting;
var host = new HostBuilder()
    .ConfigureAppConfiguration(configBuilder => configBuilder.AddEnvironmentVariables())
    .ConfigureFunctionsWorkerDefaults() // <---- OMITTING THIS IS ONE POSSIBLE CAUSE OF THE ERROR "Failed to start a new language worker for runtime: dotnet-isolated."
    .ConfigureServices(Startup.Configure)
    .UseDefaultServiceProvider((_, options) =>
    {
        options.ValidateScopes = true;
        options.ValidateOnBuild = true;
    })
    .Build();

await host.RunAsync();
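For context: .ConfigureFunctionsWorkerDefaults() is what sets up the worker side of the gRPC channel the Functions host uses to talk to the isolated process, which is presumably why omitting it leaves the host timing out while waiting for the worker to start.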
I got this problem because I'd moved from a "normal" dotnet service and needed to change FUNCTIONS_WORKER_RUNTIME from dotnet to dotnet-isolated:
local.settings.json
{
"IsEncrypted": false,
"Values": {
"FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated",
....
}
}
Please make the below change to your host.json file to include extensionBundle:
{
"version": "2.0",
"extensionBundle": {
"id": "Microsoft.Azure.Functions.ExtensionBundle",
"version": "[2.*, 3.0.0)"
}
}
Upgrade to Microsoft.Azure.Functions.Worker version 1.3.0 or higher
Install-Package Microsoft.Azure.Functions.Worker -Version 1.3.0
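Equivalently, if you pin versions in the csproj (as the project file in the question does), bump the PackageReference there:
<PackageReference Include="Microsoft.Azure.Functions.Worker" Version="1.3.0" />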
Ensure that the appropriate runtime is specified when creating the Function App:
az functionapp create --consumption-plan-location westus --name <FunctionAppName> --resource-group <ResourceGroupName> --runtime dotnet-isolated --runtime-version 5.0 --functions-version 3 --storage-account <StorageAccountName>
I just ran a NuGet package update for:
Microsoft.Azure.Functions.Worker v1.3.0 → v1.4.0
Microsoft.Azure.Functions.Worker.Sdk v1.0.3 → v1.0.4
This seems to have solved the problem.
Whatever you do with Azure App Services and Function Apps, always, always triple-check that your configuration settings are complete and typo-free. A missing or misspelled setting can cause startup and dependency-injection problems with cryptic messages like this one. Also, if you are using staging slots, make sure they either have all of their configuration settings correct as well, or keep them stopped.
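One quick way to review an app's settings (a sketch using the Azure CLI; the names are placeholders, and you can add --slot <SlotName> for a staging slot):
az functionapp config appsettings list --name <FunctionAppName> --resource-group <ResourceGroupName> --output table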
This one is for when you make some change that utterly destroys the app before it even starts, and you have no idea what is causing the disaster.
If your function app is on Linux
I don't know, good luck? You can see logs of processes from Kudu when the app starts, which might help.
If your function app is on Windows
Open Kudu Console (Your Functions App > Advanced Tools > Go)
From the top menus, Debug Console > CMD
Navigate to and open \home\LogFiles\eventlog.xml (note that the console opens at \home)
Scroll to the bottom to see the latest error
If the file is huge and you have no idea which one is your error, you can delete the file, then restart the functions app. New logs will be populated.
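If you'd rather not click through Kudu, the same log files can usually be pulled down with the Azure CLI (a sketch only; function apps sit on App Service under the hood, and the names here are placeholders):
az webapp log download --name <FunctionAppName> --resource-group <ResourceGroupName> --log-file logs.zip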

Importing data into a GI via web service

I have a requirement to import custom data into Acumatica using a web service.
I have created a custom table having two string fields and one ntext field which will hold XML data.
I created a GI for it and exposed it in a web service endpoint.
The import JSON data format is like this:
[
{
"OrderNbr": "1",
"CommandValue": "8",
"Xmldata": "<?xml version=\"1.0\" encoding=\"utf-8\"?><MLW Cmd=\"8\" TStamp=\"2018-12-21T11:38:25\" Id=\"dsgx1\" OrgId=\"157035408\" DevId=\"b9d863ca-REG-4825e4aa-566b5fc7\" RouteId=\"Resource-879-1\" StopId=\"Location230\" LocationKey=\"Location230\" StopType=\"67\"> <GPS Altitude=\"278.46383285522461\" Latitude=\"34.0487467032243\" Longitude=\"-84.673757432107507\" NoOfSat=\"7\" Speed=\"1.3679999828338623\" SatTStamp=\"2018-12-21T11:37:26\" Direction=\"0\" FixQuality=\"A\" /> <FieldData LCode=\"1\" OwnerId=\"Location230\"> <Field FId=\"89815\" Value=\"No\" /> <Field FId=\"89817\" Value=\"No\" /> <Field FId=\"89816\" Value=\"Patrick N\" /> </FieldData> <Job Id=\"Order-878-4\" Status=\"4\"> <Item Status=\"4\" Id=\"TIFTUF\" Mode=\"Manual\" /> </Job></MLW>"
}
]
I have tried it in Postman using basic authentication, and I am getting the following errors:
PUT: 400 Bad Request
GET: 500 Internal Server Error
UPDATE: I have created a custom list page and configured it in the endpoint.
I have tested it in Postman.
Following are the endpoint and the JSON string used to create records:
http://localhost/XYZ/(W(3))/entity/XYZ/17.200.001.001/MyResposeImport
{
"OrderNbr": {value :"b"},
"CommandValue": {value :"8"},
"Xmldata": {value :"<?xml version='1.0' encoding='utf-8'?><MLW Cmd='8' TStamp='2018-12-21T11:38:25' Id='dsgx1' OrgId='157035408' DevId='b9d863ca-REG-4825e4aa-566b5fc7' RouteId='Resource-879-1' StopId='Location230' LocationKey='Location230' StopType='67'> <GPS Altitude='278.46383285522461' Latitude='34.0487467032243' Longitude='-84.673757432107507' NoOfSat='7' Speed='1.3679999828338623' SatTStamp='2018-12-21T11:37:26' Direction='0' FixQuality='A' /> <FieldData LCode='1' OwnerId='Location230'> <Field FId='89815' Value='No' /> <Field FId='89817' Value='No' /> <Field FId='89816' Value='Patrick N' /> </FieldData> <Job Id='Order-878-4' Status='4'> <Item Status='4' Id='TIFTUF' Mode='Manual' /> </Job></MLW>"}
}
PUT returns OK, and the response is given below:
{
"id": "94a00013-37bf-4077-bfb6-2e8662988547",
"rowNumber": 1,
"note": null,
"OrderNbr": {
"value": "b"
},
"ShippingStatus": {},
"XMLData": {},
"custom": {},
"files": []
}
I have checked in the back end and no record is added to the table.
I created the site map entry under the hidden section, since the screen is only for API calls.
What may be the reason the record is not added to the table?
I have solved the issue. The JSON data field names were different from the DAC labels, and the API looks at the DAC label, not the field name.
I changed the JSON data to the following and it works fine:
{
"OrderNbr": {value :"b"},
"ShippingStatus": {value :"8"},
"XMLData": {value :"<?xml version='1.0' encoding='utf-8'?><MLW Cmd='8' TStamp='2018-12-21T11:38:25' Id='dsgx1' OrgId='157035408' DevId='b9d863ca-REG-4825e4aa-566b5fc7' RouteId='Resource-879-1' StopId='Location230' LocationKey='Location230' StopType='67'> <GPS Altitude='278.46383285522461' Latitude='34.0487467032243' Longitude='-84.673757432107507' NoOfSat='7' Speed='1.3679999828338623' SatTStamp='2018-12-21T11:37:26' Direction='0' FixQuality='A' /> <FieldData LCode='1' OwnerId='Location230'> <Field FId='89815' Value='No' /> <Field FId='89817' Value='No' /> <Field FId='89816' Value='Patrick N' /> </FieldData> <Job Id='Order-878-4' Status='4'> <Item Status='4' Id='TIFTUF' Mode='Manual' /> </Job></MLW>"}
}
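For reference, here is a minimal C# sketch of the same PUT outside Postman, using basic authentication as in the question; the URL and credentials are placeholders and the JSON body is trimmed:
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class PutGiRecord
{
    static async Task Main()
    {
        using var client = new HttpClient();
        // Basic authentication, as used in the Postman test (placeholder credentials).
        var token = Convert.ToBase64String(Encoding.ASCII.GetBytes("user:password"));
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", token);

        // Trimmed body; the real request carries the XMLData field as well.
        var json = "{ \"OrderNbr\": { \"value\": \"b\" }, \"ShippingStatus\": { \"value\": \"8\" } }";
        using var content = new StringContent(json, Encoding.UTF8, "application/json");

        var response = await client.PutAsync(
            "http://localhost/XYZ/entity/XYZ/17.200.001.001/MyResposeImport", content);
        Console.WriteLine(response.StatusCode);
    }
}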

Log4Net not displaying an accurate Timestamp

I'm logging an Azure-based web app using a Log4Net CSV appender.
I'm seeing multiple entries with an identical timestamp, which is clearly not the actual instant of each event on the server:
2018-03-19 21:59:52.000 OrderId: 191096 Starts to validate, multi:
2018-03-19 21:59:52.000 OrderId: 191096 validation request:
2018-03-19 21:59:52.000 OrderId: 191096 passed validation. AuthKey:6128994
2018-03-19 21:59:52.000 OrderId: 191096 Single starts
2018-03-19 21:59:52.000 OrderId: 191096 submits:
2018-03-19 21:59:52.000 SaveOrderChanges: 191096
2018-03-19 21:59:52.000 SaveOrderChanges: 191096
I had thought perhaps it had to do with when the logs are written out to file vs. when the entry is actually generated, but unless I'm misreading the context, this answer indicates otherwise.
Clearly I have something misconfigured. My CSV is built using code found at: http://element533.blogspot.com/2010/05/writing-to-csv-using-log4net.html
Full appender:
<appender name="CsvFileAppender" type="log4net.Appender.FileAppender">
<file value="D:/home/logfiles/log4netCSV.log" />
<lockingModel type="log4net.Appender.FileAppender+MinimalLock" />
<appendToFile value="true"/>
<threshold value="INFO" />
<layout type=" myWeb.CsvPatternLayout, myWeb">
<header value="DateTime,Thread,Level,Logger,Message,Exception
" />
<conversionPattern value="%date%newfield[%thread]%newfield %-5level%newfield% %property{Ip} _+ %aspnet-request{ASP.NET_SessionId} _+ %logger %newfield%message%newfield%exception%endrow" />
</layout>
</appender>
It is very strange that your dates are showing
2018-03-19 21:59:52.000
The default format is ISO 8601, and the separator between seconds and milliseconds is a comma:
https://github.com/apache/logging-log4net/blob/master/src/DateFormatter/Iso8601DateFormatter.cs#L26
I recommend using an explicit date format:
%date{yyyy-MM-dd HH:mm:ss.fff}
Also, if you want a CSV file, you need to separate the values with commas:
<conversionPattern value="%date{yyyy-MM-dd HH:mm:ss.fff},[%thread],%level,..." />
Update:
I just tested this on Azure to see whether the environment had anything to do with it, and it shows the timestamps correctly:
http://swagger-net-test.azurewebsites.net/log4net.log
I have two log actions, one right after the other, and they do show different timestamps:
DateTime,Thread,Level,Logger,Message
2018-04-08 13:19:48.658,[20],INFO,Swagger_Test.Controllers.LogController,Test1
2018-04-08 13:19:48.689,[20],ERROR,Swagger_Test.Controllers.LogController,Test2

Set log file limit with log4net and azure file appender

I'm currently using log4net and Azure Files to store my logs, and it works great.
I've been searching and can't find any configuration to make the logger create files no bigger than a given KB size.
This is the configuration I have:
<rollingStyle value="Size" />
<MaxSizeRollBackups value="10" />
<MaximumFileSize value="10KB" />
<AzureStorageConnectionString value="connectiondatahere" />
<ShareName value="filelog" />
<Path value="processor" />
<File value="processor_{yyyy-MM-dd}.txt" />
<layout type="log4net.Layout.PatternLayout">
<ConversionPattern value="%date %-5level %logger %message%newline"/>
</layout>
</appender>
<root>
<level value="ALL" />
<appender-ref ref="AzureFileAppender"/>
</root>
I've tried a few variations of this configuration but no luck.
After reviewing the source code of log4net-appender-azurefilestorage, I found that a log file size limit is not currently supported in the Azure file appender. I suggest you rewrite the Azure file appender yourself and add the size-limit feature.
Below are the steps to do it.
Step 1: add a property named MaximumFileSize to the AzureFileAppender class.
public int MaximumFileSize { get; set; }
Step 2: add the size-limit check when appending a log entry to the file.
protected override void Append(LoggingEvent loggingEvent)
{
    Initialise(loggingEvent);

    // Render the event and measure how many bytes it would add.
    var buffer = Encoding.UTF8.GetBytes(RenderLoggingEvent(loggingEvent));

    if ((_file.Properties.Length + buffer.Length) > MaximumFileSize)
    {
        // Do something if the file would exceed the maximum size,
        // e.g. roll to a new file (see the sketch below).
    }
    else
    {
        // Grow the Azure file and append the rendered event at the end.
        _file.Resize(_file.Properties.Length + buffer.Length);
        using (var fileStream = _file.OpenWrite(null))
        {
            fileStream.Seek(buffer.Length * -1, SeekOrigin.End);
            fileStream.Write(buffer, 0, buffer.Length);
        }
    }
}
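The "do something" branch is left open above; one self-contained option is to roll to a new file name with an incrementing index. The helper below is a hypothetical sketch (not part of the library), and wiring it into the appender's Initialise/_file state is left to the reader:
using System.IO;

static class RollingNameHelper
{
    // Turns "processor.txt" plus index 1 into "processor.1.txt", and so on,
    // so each roll starts a fresh file once the size limit is reached.
    public static string NextRollName(string fileName, int index)
    {
        string dir = Path.GetDirectoryName(fileName) ?? string.Empty;
        string stem = Path.GetFileNameWithoutExtension(fileName);
        string ext = Path.GetExtension(fileName);
        return Path.Combine(dir, stem + "." + index + ext);
    }
}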
Step 3: after that, you can add the size limit (in bytes) to the configuration file; 10240 bytes matches the original 10KB goal.
<MaximumFileSize value="10240" />
