Akka.Net logging to file - nlog

I am trying to follow a tutorial to add logging to my Akka.NET application, but the log file is not being created.
Below is my App.config file:
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <configSections>
    <section name="akka" type="Akka.Configuration.Hocon.AkkaConfigurationSection, Akka" />
  </configSections>
  <akka>
    <hocon>
      <![CDATA[
        akka
        {
          loglevel = INFO
          loggers = ["Akka.Logger.NLog.NLogLogger, Akka.Logger.NLog"]
        }
      ]]>
    </hocon>
  </akka>
</configuration>
and this is my NLog.config file:
<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <targets>
    <target name="file" xsi:type="File" fileName="C:\Users\User\Documents\logs\test.log" />
  </targets>
  <rules>
    <logger name="*" minlevel="Info" writeTo="file" />
  </rules>
</nlog>
Program.cs
static void Main(string[] args)
{
    ActorSystem actorSystem = ActorSystem.Create("IMSIFilteringActorSystem");
    IActorRef fileSearcherActor = actorSystem.ActorOf(Props.Create(() => new FileSearcherActor()), "fileSearcherActor");
    ILoggingAdapter logger = Logging.GetLogger(actorSystem, actorSystem, null);
    actorSystem.Scheduler.ScheduleTellRepeatedly(TimeSpan.FromSeconds(0), TimeSpan.FromSeconds(5), fileSearcherActor, engineParams, ActorRefs.NoSender);
    logger.Info("Sending messages from logging");
}
Any help is appreciated. Thank you.

I think you're missing the NLog configuration. Try something like this:
var config = new LoggingConfiguration();
var fileTarget = new FileTarget("fileTargetName")
{
    FileName = "Absolute path to your log file.",
    // A layout I once composed. Use your own, or remove this property initialization altogether.
    Layout = @"[${level}][${longdate}][${stacktrace:format=Flat}]${literal:text=[Exception\: :when=length('${exception}')>0}${exception}${literal:text=]:when=length('${exception}')>0} <${message}>",
};
config.AddTarget(fileTarget);
config.AddRuleForAllLevels(fileTarget);
LogManager.Configuration = config;
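If you keep the XML NLog.config instead of configuring NLog in code, also make sure the file is copied to the build output directory; otherwise NLog silently finds no configuration and creates no log file. In an SDK-style project that could look something like this (the file name NLog.config is assumed to match yours):

```xml
<ItemGroup>
  <!-- Copy NLog.config next to the compiled binaries so NLog can find it at runtime -->
  <None Update="NLog.config">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </None>
</ItemGroup>
```

In older non-SDK projects the same effect comes from setting "Copy to Output Directory" on the file's properties in Visual Studio.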
To view all log entries (including those generated by Akka.NET itself), use the following configuration in your App.config:
akka
{
    loggers = ["Akka.Logger.NLog.NLogLogger, Akka.Logger.NLog"]
    loglevel = debug
    log-config-on-start = on
    actor
    {
        debug
        {
            receive = on       # log any received message
            autoreceive = on   # log automatically received messages, e.g. PoisonPill
            lifecycle = on     # log actor lifecycle changes
            event-stream = on  # log subscription changes for Akka.NET event stream
            unhandled = on     # log unhandled messages sent to actors
        }
    }
}

Related

NLog ${aspnet-request-posted-body} not returning data

Trying to log all API calls for an ASP.NET Web API 2 project. I created a DelegatingHandler but am not able to get the aspnet-request-posted-body layout renderer to work.
Type: Bug (or maybe I'm missing something?)
NLog version: 5.0.1
NLog.Web version: 5.1.0
NLog.Extensions.Logging version: (not installed)
Platform: .NET 4.7.2 (working with ASP.NET Web API 2)
Current NLog config (xml or C#, if relevant)
<nlog autoReload="True"
      throwConfigExceptions="False"
      internalLogLevel="Trace"
      internalLogFile="${basedir}App_Data\Logs\internal-nlog.txt"
      xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <variable name="logDirectory" value="App_Data/Logs"/>
  <!-- enable ASP.NET layout renderers -->
  <extensions>
    <add assembly="NLog.Web"/>
  </extensions>
  <targets async="true">
    <target name="mainfile"
            xsi:type="File"
            fileName="${logDirectory}/main-${shortdate}.txt"
            layout="${longdate}|${aspnet-request-posted-body}" />
  </targets>
  <rules>
    <logger name="*" minlevel="Trace" writeTo="mainfile" />
  </rules>
</nlog>
What is the current result?
2022-07-27 17:19:21.5446|
What is the expected result?
2022-07-27 17:19:21.5446|{"username":"xyz","password":"xyz"}
Did you check the internal log?
Yes, the Internal Log had 0 (zero) errors
Please post full exception details (message, stacktrace, inner exceptions)
None
Are there any workarounds?
Not sure
Is there a version in which it did work?
Have not tried
Can you help us by writing a unit test?
Not sure how to do this.
Code:
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

namespace Web.Api.Handlers {
    public class LogFilter : DelegatingHandler {
        private readonly NLog.Logger _logger = NLog.LogManager.GetCurrentClassLogger();

        protected override async Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken) {
            if (request?.Content != null) {
                // Do stuff
            }
            var response = await base.SendAsync(request, cancellationToken);
            if (response?.Content != null) {
                // Do stuff
            }
            _logger.Info("test");
            return response;
        }
    }
}
There is a breaking change in NLog.Web v5 where ${aspnet-request-posted-body} was removed (because the implementation was not threadsafe).
Then with NLog.Web.AspNetCore v5.1 it was restored, but it required replacing:
app.Use(async (context, next) => {
    context.Request.EnableBuffering();
    await next();
});
with (if using ASP.NET Core):
app.UseMiddleware<NLog.Web.NLogRequestPostedBodyMiddleware>();
Or, if using ASP.NET MVC v4, register the HTTP module NLog.Web.NLogRequestPostedBodyModule.
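For the classic ASP.NET case, that module registration would go in web.config. A minimal sketch (assuming the integrated pipeline and the NLog.Web assembly name) might look like:

```xml
<system.webServer>
  <modules>
    <!-- Captures the request body so ${aspnet-request-posted-body} can render it -->
    <add name="NLogRequestPostedBodyModule" type="NLog.Web.NLogRequestPostedBodyModule, NLog.Web" />
  </modules>
</system.webServer>
```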

Uploading big files to dotnetcore app in Azure fails (404)

I can't upload big files and I'm not sure if it's still about a size limitation or about a timeout.
On the controller endpoint, I tried all the attributes I found (all at once):
[HttpPost("[action]")]
[DisableRequestSizeLimit]
[RequestFormLimits(MultipartBodyLengthLimit = long.MaxValue, BufferBodyLengthLimit = long.MaxValue)]
[RequestSizeLimit(int.MaxValue)]
public async Task UploadForm()
During 'ConfigureServices' I also set up this:
services.Configure<FormOptions>(options =>
{
    options.MemoryBufferThreshold = int.MaxValue;
    options.ValueLengthLimit = int.MaxValue;
    options.ValueCountLimit = int.MaxValue;
    options.MultipartBodyLengthLimit = int.MaxValue; // In case of multipart
});
But I still get 404 errors after uploading a part of the file (30 MB are already too much).
Then I even tried configuring Kestrel with the following code, but with that the app doesn't even start (502):
.UseKestrel((KestrelServerOptions o) =>
{
    o.Limits.KeepAliveTimeout = TimeSpan.FromMinutes(120);
    o.Limits.RequestHeadersTimeout = TimeSpan.FromMinutes(120);
    o.Limits.MaxRequestBodySize = null;
})
Have a look at this official doc.
Solution:
Change the value of maxAllowedContentLength.
Add this to Web.config (under site/wwwroot on Kudu):
<configuration>
  <system.webServer>
    <security>
      <requestFiltering>
        <requestLimits maxAllowedContentLength="<valueInBytes>"/>
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
This should work without restart.

Deploy asp.net core webapi app from VS Code to IIS web server errors

I am new to ASP.NET Core; I'm used to Visual Studio 2017 and how publishing web apps works with that.
I rewrote a simple login endpoint in ASP.NET Core using VS Code (it was previously written in Visual Studio 2017 using MVC) and tried to publish it following articles I googled. I'm getting an error when I browse the app in IIS. Something else looks weird too: when I view the published files they look incomplete. I'm new to creating Web API Core apps in VS Code, so maybe I am wrong. I'll attach a screenshot of the published files.
The error I am getting is: "
HTTP Error 502.3 - Bad Gateway
There was a connection error while trying to route the request.
"
I have installed the .NET Core SDK and required components on the IIS machine and set up the App Pool for No Managed Code as well.
Web Config
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <location path="." inheritInChildApplications="false">
    <system.webServer>
      <handlers>
        <add name="aspNetCore" path="*" verb="*" modules="AspNetCoreModule" resourceType="Unspecified" />
      </handlers>
      <aspNetCore processPath="dotnet" arguments=".\LB_CONNECT_API_2.dll" stdoutLogEnabled="false" stdoutLogFile=".\logs\stdout" hostingModel="InProcess" />
    </system.webServer>
  </location>
</configuration>
[Screenshot of published files]
I tried editing web.config and removing the V2 from AspNetCore
[Route("api/[controller]")]
[ApiController]
public class LoginController : ControllerBase
{
    // POST: api/Login
    public User Post(User user)
    {
        SqlDataReader reader = null;
        SqlConnection myConnection = new SqlConnection
        {
            ConnectionString = @"Server=ServerName;Database=DB;User ID=User;Password=password;"
        };
        SqlCommand sqlCmd = new SqlCommand
        {
            CommandType = CommandType.StoredProcedure,
            CommandText = "lb_Login",
            Connection = myConnection
        };
        sqlCmd.Parameters.AddWithValue("@Email", user.Email);
        sqlCmd.Parameters.AddWithValue("@Password", user.Password);
        myConnection.Open();
        reader = sqlCmd.ExecuteReader();
        User _user = null;
        while (reader.Read())
        {
            _user = new User
            {
                UserID = Guid.Parse(reader["UserID"].ToString()),
                FirstName = reader["FirstName"].ToString(),
                LastName = reader["LastName"].ToString(),
                GroupName = reader["GroupName"].ToString(),
                Email = reader["Email"].ToString(),
                Cell = reader["Cell"].ToString()
            };
        }
        return _user;
    }
}
I need it to return the user object in JSON

Microsoft.AspNetCore.Server.Kestrel.Core.BadHttpRequestException: Request body too large

I'm trying to upload a 100MB film to my ASP.NET Core application.
I've set this attribute on my action:
[RequestSizeLimit(1_000_000_000)]
And I also changed my Web.config file to include:
<security>
  <requestFiltering>
    <!-- This will handle requests up to 700MB (CD700) -->
    <requestLimits maxAllowedContentLength="737280000" />
  </requestFiltering>
</security>
In other words, I've told IIS to allow files up to 700 MB and I've also told ASP.NET Core to allow files of nearly 1 GB.
But I still get that error, and I can't find the answer. Any ideas?
P.S.: Using these configurations I could get past the 30 MB default size; I can upload files of 50 or 70 megabytes.
I think you just need: [DisableRequestSizeLimit]
Below is a solution that worked for me to upload ZIP files with additional form data to an API running .NET Core 3:
// MultipartBodyLengthLimit was needed for ZIP files with form data.
// [DisableRequestSizeLimit] works for the Kestrel server, but not for IIS.
// For IIS: web.config... <requestLimits maxAllowedContentLength="102428800" />
[RequestFormLimits(ValueLengthLimit = int.MaxValue, MultipartBodyLengthLimit = int.MaxValue)]
[DisableRequestSizeLimit]
[Consumes("multipart/form-data")] // for ZIP files with form data
[HttpPost("MyCustomRoute")]
public IActionResult UploadZippedFiles([FromForm] MyCustomFormObject formData)
{ }
For me (ASP.NET Core 3.1) the solution was to add these lines in the ConfigureServices method of Startup.cs:
// 200 MB
const int maxRequestLimit = 209715200;

// If using IIS
services.Configure<IISServerOptions>(options =>
{
    options.MaxRequestBodySize = maxRequestLimit;
});

// If using Kestrel
services.Configure<KestrelServerOptions>(options =>
{
    options.Limits.MaxRequestBodySize = maxRequestLimit;
});

services.Configure<FormOptions>(x =>
{
    x.ValueLengthLimit = maxRequestLimit;
    x.MultipartBodyLengthLimit = maxRequestLimit;
    x.MultipartHeadersLengthLimit = maxRequestLimit;
});
and editing web.config:
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="209715200" />
    </requestFiltering>
  </security>
</system.webServer>
NOTE: This is an issue I faced when I migrated my application from ASP.NET Core 2.1 to 3.0.
To fix this in ASP.NET Core 3.0, I changed my Program.cs to modify the maximum request body size, like below:
public class Program
{
    public static void Main(string[] args)
    {
        CreateWebHostBuilder(args).Build().Run();
    }

    public static IWebHostBuilder CreateWebHostBuilder(string[] args)
    {
        return WebHost.CreateDefaultBuilder(args)
            .ConfigureKestrel((context, options) =>
            {
                options.Limits.MaxRequestBodySize = 737280000;
            })
            .UseStartup<Startup>();
    }
}
In other words, I just added the ConfigureKestrel part and put a [RequestSizeLimit(737280000)] attribute above my action method, like below:
[HttpPost]
[RequestSizeLimit(737280000)]
[Route("SomeRoute")]
public async Task<ViewResult> MyActionMethodAsync([FromForm]MyViewModel myViewModel)
{
    //Some code
    return View();
}
And my application started behaving correctly again, without throwing BadHttpRequestException: Request body too large.
reference: https://learn.microsoft.com/en-us/aspnet/core/mvc/models/file-uploads?view=aspnetcore-3.0#kestrel-maximum-request-body-size
I was using web.config to configure this (while our API was hosted in IIS):
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="157286400" />
    </requestFiltering>
  </security>
</system.webServer>
But now we are moving our API to Linux containers and using Kestrel, so I configured it like this:
.ConfigureWebHostDefaults(webBuilder =>
{
    webBuilder
        .ConfigureKestrel(serverOptions =>
        {
            serverOptions.Limits.MaxRequestBodySize = 157286400;
        })
        .UseStartup<Startup>();
})
157286400 bytes = 150 MB.
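As a quick sanity check on that constant (150 MB in binary units, i.e. 150 × 1024 × 1024 bytes):

```python
# 150 MB (strictly, mebibytes) expressed in bytes, as used for MaxRequestBodySize above.
limit_bytes = 150 * 1024 * 1024
print(limit_bytes)  # 157286400
```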
Azure functions have a hard-coded limit of 100MB: https://github.com/Azure/azure-functions-host/issues/5854

how to get azure webrole to log all 404s

I have been trying to get logging working with azure for my MVC project but so far haven't had much success.
I have a Diagnostics connection string in my ServiceConfiguration.Cloud.cscfg file which points to my blob storage:
...
<Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.DiagnosticsConnectionString" value="**ConectionString**" />
</ConfigurationSettings>
My web.config has tracing set up
...
<tracing>
  <traceFailedRequests>
    <remove path="*"/>
    <add path="*">
      <traceAreas>
        <add provider="ASP" verbosity="Verbose" />
        <add provider="ASPNET" areas="Infrastructure,Module,Page,AppServices" verbosity="Verbose" />
        <add provider="ISAPI Extension" verbosity="Verbose" />
        <add provider="WWW Server" areas="Authentication,Security,Filter,StaticFile,CGI,Compression,Cache,RequestNotifications,Module" verbosity="Verbose" />
      </traceAreas>
      <failureDefinitions timeTaken="00:00:15" statusCodes="400-599" />
    </add>
  </traceFailedRequests>
</tracing>
</system.webServer>
My WebRole.cs has the following in it:
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

namespace MvcWebRole1
{
    public class WebRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            // Get the factory configuration so that it can be edited
            DiagnosticMonitorConfiguration config = DiagnosticMonitor.GetDefaultInitialConfiguration();
            // Set scheduled transfer interval for infrastructure logs to 1 minute
            config.DiagnosticInfrastructureLogs.ScheduledTransferPeriod = System.TimeSpan.FromMinutes(1);
            // Specify a logging level to filter records to transfer
            config.DiagnosticInfrastructureLogs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
            // Set scheduled transfer interval for user's Windows Azure Logs to 1 minute
            config.Logs.ScheduledTransferPeriod = System.TimeSpan.FromMinutes(1);
            DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.DiagnosticsConnectionString", config);
            //RoleEnvironment.Changing += this.RoleEnvironmentChanging;
            return base.OnStart();
        }
    }
}
But I am not seeing any diagnostics logs.
The mam folder just contains an MACommanda.xml and a MASecret, the vsdeploy folder is empty, and the wad-control-container has a file for each deployment.
Am I missing something / doing something wrong?
I have been trying to follow the guides from http://msdn.microsoft.com/en-us/library/windowsazure/gg433048.aspx in particular http://channel9.msdn.com/learn/courses/Azure/Deployment/DeployingApplicationsinWindowsAzure/Exercise-3-Monitoring-Applications-in-Windows-Azure
Update:
I found the following, which could be part of the problem:
IIS7 Logs Are Not Collected Properly -
http://msdn.microsoft.com/en-us/library/hh134842
Although that should only account for the 404s not working; with a failure definition of 15 seconds, the 17-second sleep in my controller action should still have been logged.
Here is how I was finally able to get all the logging working on the Azure web role:
In the WebRole.cs include the following:
// Get the default initial configuration for DiagnosticMonitor.
var config = DiagnosticMonitor.GetDefaultInitialConfiguration();

// Specify a logging level to filter which records are transferred to persistent storage.
config.DiagnosticInfrastructureLogs.ScheduledTransferLogLevelFilter = config.Logs.ScheduledTransferLogLevelFilter =
    config.WindowsEventLog.ScheduledTransferLogLevelFilter = LogLevel.Verbose;

// Schedule a transfer period of 1 minute.
config.DiagnosticInfrastructureLogs.ScheduledTransferPeriod = config.Logs.ScheduledTransferPeriod = config.WindowsEventLog.ScheduledTransferPeriod =
    config.Directories.ScheduledTransferPeriod = config.PerformanceCounters.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);

// Specify a buffer quota.
config.DiagnosticInfrastructureLogs.BufferQuotaInMB = config.Logs.BufferQuotaInMB = config.WindowsEventLog.BufferQuotaInMB =
    config.Directories.BufferQuotaInMB = config.PerformanceCounters.BufferQuotaInMB = 512;

// Set an overall quota of 8GB maximum size.
config.OverallQuotaInMB = 8192;

// Add WindowsEventLog data sources, collecting event data from the System and Application channels.
config.WindowsEventLog.DataSources.Add("System!*");
config.WindowsEventLog.DataSources.Add("Application!*");

// Use 30 seconds for the perf counter sample rate.
TimeSpan perfSampleRate = TimeSpan.FromSeconds(30D);
config.PerformanceCounters.DataSources.Add(new PerformanceCounterConfiguration()
{
    CounterSpecifier = @"\Memory\Available Bytes",
    SampleRate = perfSampleRate
});
config.PerformanceCounters.DataSources.Add(new PerformanceCounterConfiguration()
{
    CounterSpecifier = @"\Processor(_Total)\% Processor Time",
    SampleRate = perfSampleRate
});
config.PerformanceCounters.DataSources.Add(new PerformanceCounterConfiguration()
{
    CounterSpecifier = @"\ASP.NET\Applications Running",
    SampleRate = perfSampleRate
});

// Start the DiagnosticMonitor using the diagnostic configuration and our connection string.
DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", config);
return base.OnStart();
In the web.config, under system.webServer, add the following:
<tracing>
  <traceFailedRequests>
    <remove path="*"/>
    <add path="*">
      <traceAreas>
        <add provider="ASPNET" areas="Infrastructure,Module,Page,AppServices" verbosity="Verbose" />
        <add provider="WWW Server" areas="Authentication,Security,Filter,StaticFile,CGI,Compression,Cache,RequestNotifications,Module" verbosity="Verbose" />
      </traceAreas>
      <failureDefinitions statusCodes="400-599" />
    </add>
  </traceFailedRequests>
</tracing>
In the service definition file, add the following under the web role:
<LocalResources>
  <LocalStorage name="DiagnosticStore" sizeInMB="8192" cleanOnRoleRecycle="false"/>
</LocalResources>
That should enable all your logging in the MVC application.
Can you try removing the "timeTaken" attribute from the "failureDefinitions" node? Ref: http://msdn.microsoft.com/en-us/library/aa965046(v=VS.90).aspx
Check the deployment's diagnostics settings by viewing the appropriate file in wad-control-container.
I note that you are not setting what in my experience are all the required values for DiagnosticInfrastructureLogs or for Logs, including BufferQuotaInMB and ScheduledTransferLogLevelFilter.
Try this:
// Get the factory configuration so that it can be edited
DiagnosticMonitorConfiguration config = DiagnosticMonitor.GetDefaultInitialConfiguration();

config.DiagnosticInfrastructureLogs.BufferQuotaInMB = 512;
config.DiagnosticInfrastructureLogs.ScheduledTransferPeriod = System.TimeSpan.FromMinutes(1D);
config.DiagnosticInfrastructureLogs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;

config.Logs.BufferQuotaInMB = 512;
config.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
config.Logs.ScheduledTransferPeriod = System.TimeSpan.FromMinutes(1D);
Try that to start. Ensure as well that you have added trace listeners.
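For reference, the Azure diagnostics trace listener those Trace calls flow through is typically registered in web.config like this (the version and public key token shown are from the v1.x Azure SDK and may differ for yours):

```xml
<system.diagnostics>
  <trace>
    <listeners>
      <!-- Routes System.Diagnostics.Trace output to Windows Azure Diagnostics -->
      <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
           name="AzureDiagnostics">
        <filter type="" />
      </add>
    </listeners>
  </trace>
</system.diagnostics>
```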
