I'm in the process of creating my first production Azure Function and I'm trying to write log information to a local file and then eventually to Blob Storage. The local file is more for development troubleshooting; ultimately I would like production information stored in Blob Storage. I'm not only new to Azure Functions but also new to Serilog. I've used NLog in all my other applications but couldn't get it to work with Azure Functions.
Currently I’m trying to get the local log working. I actually seem to have it working but I’m not understanding how I can tweak a couple things.
The first thing I'm trying to change is the amount of information that is getting logged. It seems to be logging a whole bunch of system-type information, like blob storage request info. There is so much being logged that the entries I add in code get lost. It looks like all the system entries are marked as Information, which is probably why they show up in my log. However, I would like it to only log data from when I specifically call logger.Information("some text") in my code. Is there a way to suppress all of the Microsoft system information?
The second thing is how I can make the Serilog configuration come from my local.settings.json file. Below is a sample of my file, and I'm not sure whether I should add the configuration information inside the Values property or put it outside of that into its own property. I'm assuming it would be its own property, but so far all my custom settings have been coming from the Values property.
Do I need to add the Serilog.Settings.Configuration NuGet package? If so, I'm not understanding how to configure my Startup.cs file to get the information from the local settings file instead of configuring the settings directly in code. Eventually, I would like to add the logger to dependency injection so I can use it in other classes as well.
Startup.cs
public override void Configure(IFunctionsHostBuilder builder)
{
    builder.Services.AddTransient<IDataManager, DataManager>();
    ConfigureServices(builder.Services).BuildServiceProvider(true);
}

private IServiceCollection ConfigureServices(IServiceCollection services)
{
    services
        .AddLogging(loggingBuilder =>
            loggingBuilder.AddSerilog(
                new LoggerConfiguration()
                    .MinimumLevel.Information()
                    .WriteTo.Console()
                    .WriteTo.File(@"D:\logs\AzureFunction\log_.txt", rollingInterval: RollingInterval.Day)
                    .CreateLogger())
        );

    return services;
}
Local.settings.json
{
  "IsEncrypted": false,
  "Values": {
    "ProcessLookBackDays": "90",
    "SqlConnection": "connection info",
    "StorageConnection": "connection info",
    "AzureWebJobsStorage": "connection info",
    "InputContainer": "test-files",
    "InputFolder": "input",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet"
  },
  "Serilog": {
    "Using": [ "Serilog.Sinks.File" ],
    "MinimumLevel": {
      "Default": "Information"
    },
    "WriteTo": [
      {
        "Name": "File",
        "Args": {
          "path": "D:\\logs\\AzureFunction\\log_.log",
          "rollingInterval": "Day",
          "outputTemplate": "[{Timestamp:yyyy-MM-dd HH:mm:ss.fff zzz} {CorrelationId} {Level:u3}] {Username} {Message:lj}{NewLine}{Exception}"
        }
      }
    ]
  }
}
Setting up configuration in local.settings.json will not be reflected in the Azure Function app. local.settings.json, as the name suggests, is for local use only; in Azure you need to use the app settings and read from them.
Just add a new setting in the app settings and you can then use the code below to read it.
var appsettings = Environment.GetEnvironmentVariable("Name of the setting");
When you want to use external configuration with Serilog, you can use the Serilog.Settings.Configuration package.
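For local development, a minimal sketch of what that can look like, assuming the Serilog.Settings.Configuration package is installed and the "Serilog" section (as in the local.settings.json above) is reachable through an IConfiguration built from that file:

// using Microsoft.Extensions.Configuration; using Serilog;
var configuration = new ConfigurationBuilder()
    .SetBasePath(Environment.CurrentDirectory)
    .AddJsonFile("local.settings.json", optional: true, reloadOnChange: true)
    .AddEnvironmentVariables()
    .Build();

Log.Logger = new LoggerConfiguration()
    .ReadFrom.Configuration(configuration) // picks up the "Serilog" section
    .CreateLogger();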
You can also configure the minimum level for log events, so that any event less important than the specified minimum level will not be logged.
Log.Logger = new LoggerConfiguration()
.MinimumLevel.Debug()
.WriteTo.Console(restrictedToMinimumLevel: LogEventLevel.Information)
.CreateLogger();
Here we specify two things: .MinimumLevel.Debug() and restrictedToMinimumLevel. The latter dictates the minimum level for that particular sink. A sink is just a place you can log to; sinks are configured using WriteTo. For example, in the code above the sink is the console. There are other sinks too.
The minimum levels, from least to most severe, are Verbose, Debug, Information, Warning, Error, and Fatal.
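On the original question about suppressing the Microsoft/system entries: per-namespace level overrides are the usual Serilog approach. A sketch (which source-context prefixes are noisy is an assumption; adjust them to what you actually see in your log):

Log.Logger = new LoggerConfiguration()
    .MinimumLevel.Information()
    // Only let warnings and above through from framework/system categories.
    .MinimumLevel.Override("Microsoft", LogEventLevel.Warning)
    .MinimumLevel.Override("System", LogEventLevel.Warning)
    .WriteTo.File(@"D:\logs\AzureFunction\log_.txt", rollingInterval: RollingInterval.Day)
    .CreateLogger();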
References:
Read configuration from app setting by Ashish Patel
Use serilog to filter the logs
Serilog setting configuration
I set up a POC/demo project to test application insights in order to have a full observability experience.
I think I followed the various recommendations, like using the workspace-based Application Insights resource, using the connection string approach (rather than the instrumentation key one), and using the C#/.NET 6 minimal web API project template, modified as described to enable/configure the Application Insights telemetry.
When I run my example, everything works fine regarding metrics, live metrics, and the application map, all the way to the Application Insights workspace (I can visualize charts, see the live metrics, start a map...).
BUT I cannot see any logs... Where should I find those? How can I troubleshoot this?
(I added a console output for the logs just to check that the filtering is OK, and that works...)
Do mind that if you write logs using the ILogger interface with severity level Information, like this: _logger.LogInformation("Some Info");, you need to adjust the log level settings, because the default settings only capture trace telemetry of level Warning and up.
You can modify that in the configuration (also, see the docs):
{
  "Logging": {
    "LogLevel": {
      "Default": "Information"
    },
    "ApplicationInsights": {
      "LogLevel": {
        "Default": "Information"
      }
    }
  }
}
You can use the transaction search in the portal, or you can query the workspace directly.
I am currently trying to list all instances of an activity function and the orchestrator function using the Azure Functions Core Tools. The application synchronizes data from different sources into a centralized location.
The setup is as follows:
TimerTrigger -> Durable Orchestrator -> Multiple Activity Functions
In my concrete example, it is like this:
Start Synchronization -> Orchestrate Synchronizations -> Synchronize Source
So we start the synchronization process, which starts the orchestrator. The orchestrator then starts multiple different synchronizations, one for each source. The problem, though, is that I cannot seem to get the Azure Functions Core Tools to list all instances of the functions I am interested in.
Unfortunately, I would really prefer not to have to use the REST API to query for this information. The setup really complicates things with IP restrictions and managed identity authentication. I think I could adjust the setup to get things working from my network and user if really needed, but I think that would take far longer than it should.
I have tried running the following command:
func durable get-instances
in a directory with a file called host.json with the following contents:
{
  "version": "2.0",
  "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=Name;AccountKey=Key;EndpointSuffix=core.windows.net"
}
I have also tried where the contents of the file are as follows:
{
  "version": "2.0",
  "extensions": {
    "durableTask": {
      "storageProvider": {
        "connectionStringName": "DefaultEndpointsProtocol=https;AccountName=Name;AccountKey=Key;EndpointSuffix=core.windows.net"
      }
    }
  }
}
I have tried calling func durable get-instances with and without the --connection-string-setting parameter, with the values 'AzureWebJobsStorage' and 'extensions:durableTask:storageProvider:connectionStringName', but nothing seems to work. I keep getting the error: No storage connection string found.
I know that the connection string is correct. I have pulled it directly from the storage account 'Access keys' blade.
Is there anything I am missing? What am I doing wrong?
Thanks to @juunas, I got it to work. I edited the host.json file to have the following content:
{
"version": "2.0"
}
and created another file called local.settings.json with the following contents:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=Name;AccountKey=Key;EndpointSuffix=core.windows.net"
  }
}
Running func durable get-instances now works and returns a continuation token, but an empty list. I was not expecting that, but I can now start exploring and figuring out what is going on.
I've got an Azure App Service with a connection string named db (type PostgreSQL) configured in the Connection strings section of the portal.
But when I call IConfiguration.GetConnectionString("db") I get null back.
I've read articles like this https://mderriey.com/2018/08/21/azure-app-service-connection-strings-and-asp-net-core/ which say "it just works", but they're all several years old. I assume something's changed, but what?
Enumerating over all settings in my IConfiguration object I've got no connection strings. I do in development, where my appsettings.development.json has a connectionStrings: { db: "" } defined.
I can see and read the ENV variable POSTGRESQLCONNSTR_db from within code, and its value is correct (what I've set via the Azure portal).
Should I expect to be able to do IConfiguration.GetConnectionString("db")? Or am I expected to switch between reading env variables in prod vs dev?
Do I need to include some nuget package to make IConfiguration work under Azure with these ENV variables and their mad prefixes?
My startup.cs basically looks like:
public Startup(IConfiguration configuration)
{
    this.Configuration = configuration;
}

public IConfiguration Configuration { get; }
Nothing else in there of interest to this question.
The POSTGRESQLCONNSTR_ prefix isn't supported by the environment variables configuration provider. The docs show this in an indirect fashion, stating that the following prefixes are supported:
CUSTOMCONNSTR_
MYSQLCONNSTR_
SQLAZURECONNSTR_
SQLCONNSTR_
It's also apparent in the source code for the provider.
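As an illustration of what the provider does with a supported prefix (the variable name and value here are just examples):

// With a supported prefix, the provider maps the variable into the
// ConnectionStrings section automatically, so GetConnectionString works.
Environment.SetEnvironmentVariable("CUSTOMCONNSTR_db", "Host=example;Database=mydb");

var config = new ConfigurationBuilder()
    .AddEnvironmentVariables()
    .Build();

var db = config.GetConnectionString("db"); // resolved from CUSTOMCONNSTR_db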
There are a couple of options for working around this:
Change the Type to Custom in the Connection strings section of the Azure portal.
Change to an Application setting named ConnectionStrings:db in the Azure portal.
This is being tracked on GitHub: https://github.com/dotnet/runtime/issues/36123.
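If you'd rather keep the PostgreSQL type in the portal, a third workaround (a sketch, not something from the docs) is to fall back to the raw prefixed environment variable yourself, where configuration is the injected IConfiguration:

// Try the normal lookup first (works for supported prefixes / the Custom type),
// then fall back to the raw variable the PostgreSQL type produces.
var connectionString = configuration.GetConnectionString("db")
    ?? Environment.GetEnvironmentVariable("POSTGRESQLCONNSTR_db");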
I got confused as well, so here it is:
You have two options to specify a connection string locally:
launchSettings.json (environmentVariables section)
"environmentVariables": {
"ASPNETCORE_ENVIRONMENT": "Development",
"SQLAZURECONNSTR_SomeConnectionString": "DefaultEndpointsProtocol=blah"
}
appSettings.json
"ConnectionStrings": {
"SomeConnectionString": "DefaultEndpointsProtocol=blah"
}
Either way will allow you to get the connection string setting by calling:
IConfiguration.GetConnectionString("SomeConnectionString")
The call above will also work when deployed to Azure, as it uses the environment variables configuration provider to read settings.
Instead of getting the config from the interface
IConfiguration.GetConnectionString("db")
try to get it from
Configuration.GetConnectionString("db")
And in production you can have an empty string in production.appsettings.json and add the value directly in the Azure App Service configuration under Connection strings (this overrides the JSON settings file). No NuGet packages are needed for reading from appsettings.
I have a timer-triggered Azure Function which I execute in VS (right-click the Azure Functions project and Debug). The function has an ILogger log parameter.
Inspecting the log object, I can see that it has two loggers:
Azure.Functions.Cli.Diagnostics.ColoredConsoleLogger
Microsoft.Azure.WebJobs.Script.Diagnostics.FileLogger
I also can see that the RootLogPath is %temp%\LogFiles\Application\Functions.
However at that location there is only a "Host" folder. I expected to find a "Function" folder as well with the log file.
Do I need to enable the file logger somehow? Am I missing anything?
To get file logs in local dev, you have to set fileLoggingMode to always in host.json. The default debugOnly setting doesn't make the Functions host write file logs locally.
For v2 Functions
{
  "version": "2.0",
  "logging": {
    "fileLoggingMode": "always"
  }
}
For v1 Functions
{
  "tracing": {
    "fileLoggingMode": "always"
  }
}
Now that we can use the hugely flexible configuration engine from .NET Core, we can do something like this:
private static IConfigurationRoot SetConfig(ExecutionContext executionContext)
{
    return new ConfigurationBuilder()
        .SetBasePath(executionContext.FunctionAppDirectory)
        .AddJsonFile("local.settings.json", optional: true, reloadOnChange: true)
        .AddEnvironmentVariables()
        .Build();
}
Which is great, as it allows you to put more complicated configuration data in the config file, for instance:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "<< removed >>",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "APPINSIGHTS_INSTRUMENTATIONKEY": "<< removed >>"
  },
  "MyCustomSettings": [
    {
      "ConnectionString": "<< removed >>",
      "Folders": [
        {
          "ShareName": "share1",
          "FolderName": "folder1"
        },
        {
          "ShareName": "share2",
          "FolderName": "folder2"
        }
      ]
    }
  ]
}
Again - great news! I can now get access to my custom settings through config via its "MyCustomSettings" section.
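For binding that section to real types rather than raw string lookups, something along these lines should work (the CustomSetting/FolderSetting classes are hypothetical POCOs mirroring the JSON above; this needs the Microsoft.Extensions.Configuration.Binder package):

// Hypothetical POCOs shaped like the MyCustomSettings section shown above.
public class FolderSetting
{
    public string ShareName { get; set; }
    public string FolderName { get; set; }
}

public class CustomSetting
{
    public string ConnectionString { get; set; }
    public List<FolderSetting> Folders { get; set; }
}

// Bind the array to a typed list (config is the IConfigurationRoot from SetConfig).
var settings = config.GetSection("MyCustomSettings").Get<List<CustomSetting>>();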
What I don't get, though, is how this can be deployed when publishing the function. Only the Values section is migrated to the Azure Functions application settings. I can obviously put this custom JSON in its own file and add it to the load statement like local.settings.json:
.AddJsonFile("my-custom-settings.json", optional: false, reloadOnChange: true)
but then this file has to be included in the deploy, and is not stored securely.
Any ideas?
This is not officially supported as of Nov 2019.
Warning
Avoid attempting to read values from files like local.settings.json or appsettings.{environment}.json on the Consumption plan. Values read from these files related to trigger connections aren't available as the app scales because the hosting infrastructure has no access to the configuration information.
There are so many blogs around advising to do this and it appears this may work if you only have a single instance, but as soon as the ScaleController triggers scaling, new instances will not be able to find the config files.
If you have triggers that use the %SettingName% syntax, they will not work as the function scales.
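For reference, the %SettingName% syntax in question is the binding-expression form used in trigger attributes; a hypothetical queue trigger just to show it (InputQueueName and StorageConnection would have to exist as app settings):

[FunctionName("ProcessQueueItem")]
public static void Run(
    // %InputQueueName% is resolved from an app setting at runtime, which is why
    // it must live in application settings rather than a deployed JSON file.
    [QueueTrigger("%InputQueueName%", Connection = "StorageConnection")] string message,
    ILogger log)
{
    log.LogInformation("Processing {Message}", message);
}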
The Functions team is considering possible options (lol).
There is also the option of using the new App Configuration service but it is currently in preview and isn't available in all Azure regions.
It may be simpler to put your config in blob storage and load it at startup? (Your blob store connection string will need to be in env variables.)
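A rough sketch of that blob-at-startup idea, assuming Azure.Storage.Blobs and illustrative container/blob/env-var names:

// Download a settings blob once at startup and feed it into the configuration builder.
var blobClient = new BlobClient(
    Environment.GetEnvironmentVariable("ConfigStorageConnection"), // env var name is illustrative
    "config",                    // container name (illustrative)
    "my-custom-settings.json");  // blob name (illustrative)

using var stream = new MemoryStream();
blobClient.DownloadTo(stream);
stream.Position = 0;

var config = new ConfigurationBuilder()
    .AddEnvironmentVariables()
    .AddJsonStream(stream)
    .Build();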
So far the best we can do is to include your "not so secret" settings (things like MyThingTimeout or ExternalEndpointAddress) in a JSON file loaded with .AddJsonFile(...), and put secrets in Key Vault. This does force you to split your config and decide which goes where. (Also make sure your triggers only read from the Values section/environment variables.)