Anyone know of a good guide for this?
First I created an Application Insights resource and put:
APPINSIGHTS_INSTRUMENTATIONKEY = "INSTRUMENTATION KEY"
in the Function App's Application Settings.
I have tried adding the NuGet package to the Function App like this.
I created a project.json file and pasted this:
{
"frameworks": {
"net46":{
"dependencies": {
"Microsoft.ApplicationInsights": "2.1.0"
}
}
}
}
It installed the NuGet package (I could see it in the log; everything went well).
After that I put these snippets in my code to use the telemetry.TrackException(exception) functionality:
First...
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.Extensibility;
Then:
var telemetry = new TelemetryClient(new TelemetryConfiguration("INSTRUMENTATION KEY"));
and in my catch:
telemetry.TrackException(e);
and when I try to save my Function App I get this error:
error CS1729: 'TelemetryConfiguration' does not contain a constructor that takes 1 arguments
You don't need to reference the Application Insights library to use it with Functions. If you've already set the APPINSIGHTS_INSTRUMENTATIONKEY application setting, you can simply add the ILogger interface as a parameter to your function and it will automatically send the log data to your Application Insights instance.
Full documentation can be found here: https://learn.microsoft.com/en-us/azure/azure-functions/functions-monitoring
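For example, the TrackException call from the question can be replaced with plain ILogger calls in the catch block. A minimal run.csx sketch (assuming a runtime version that binds ILogger; the trigger and parameter names are placeholders):
// run.csx (sketch) - ILogger is bound by the runtime; no extra NuGet package needed.
using System;
using Microsoft.Extensions.Logging;

public static void Run(string myQueueItem, ILogger log)
{
    try
    {
        // ... your processing ...
    }
    catch (Exception e)
    {
        // With APPINSIGHTS_INSTRUMENTATIONKEY set, this exception is forwarded
        // to Application Insights by the Functions runtime.
        log.LogError(e, "Processing failed");
        throw;
    }
}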
In addition to @ChrisGillum's answer, for a .NET Core 3.1 Azure Function:
If you create a new Azure Function with an HTTP trigger from Visual Studio, the following line will exist in the example:
log.LogInformation("C# HTTP trigger function processed a request.");
Add "APPINSIGHTS_INSTRUMENTATIONKEY" to local.settings.json Values.
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "UseDevelopmentStorage=true",
"FUNCTIONS_WORKER_RUNTIME": "dotnet",
"APPINSIGHTS_INSTRUMENTATIONKEY": "<YOUR_GUID>"
}
}
Note that the key must be in an app setting named APPINSIGHTS_INSTRUMENTATIONKEY and nothing else.
Logging is then added automatically.
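That is, anything written through the injected ILogger (such as the LogInformation call above, or LogError/LogWarning calls) should end up in the linked Application Insights resource without any further wiring.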
Complete guide for hosted values:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-monitoring?tabs=cmd#enable-application-insights-integration
I’m in the process of creating my first production Azure Function and I’m trying to write log information to a local file and then eventually to Blob Storage. The local file is more for development troubleshooting and then ultimately I would like to have production information stored in Blob Storage. I’m not only new with Azure Functions but I’m also new with Serilog. I’ve used NLog in all my other applications but couldn’t get it to work with Azure Functions.
Currently I’m trying to get the local log working. I actually seem to have it working but I’m not understanding how I can tweak a couple things.
The first thing I'm trying to change is the amount of information that is getting logged. It seems to be logging a whole bunch of system-type information, like request info to the blob storage. There is so much getting logged that the entries I add in code get lost. It looks like all the system entries are marked as Information, which is probably why they show up in my log. However, I would like to get it to only log data from when I specifically call logger.Information("some text") in my code. Is there a way to suppress all of the Microsoft system information?
The second thing is how I can make the Serilog configuration come from my local.settings.json file. Below is a sample of my file, and I'm not sure if I would add the configuration information to the Values property or put it outside of that into its own property. I'm assuming it would be in its own property, but so far all my custom settings have been coming from the Values property.
Do I need to add the Serilog.Settings.Configuration NuGet package? If so, then I’m not understanding how I configure my Startup.cs file to get the information from the local settings file instead of configuring the settings directly in code. Eventually, I would like to add it to Dependency Injection so I can use the logger in other classes as well.
Startup.cs
public override void Configure(IFunctionsHostBuilder builder)
{
builder.Services.AddTransient<IDataManager, DataManager>();
ConfigureServices(builder.Services).BuildServiceProvider(true);
}
private IServiceCollection ConfigureServices(IServiceCollection services)
{
services
.AddLogging(loggingBuilder =>
loggingBuilder.AddSerilog(
new LoggerConfiguration()
.MinimumLevel.Information()
.WriteTo.Console()
.WriteTo.File(@"D:\logs\AzureFunction\log_.txt", rollingInterval: RollingInterval.Day)
.CreateLogger())
);
return services;
}
local.settings.json
{
"IsEncrypted": false,
"Values": {
"ProcessLookBackDays": "90",
"SqlConnection": "connection info",
"StorageConnection": "connection info"
"AzureWebJobsStorage": "connection info"
"InputContainer": "test-files",
"InputFolder": "input",
"FUNCTIONS_WORKER_RUNTIME": "dotnet"
},
"Serilog": {
"Using": [ "Serilog.Sinks.File" ],
"MinimumLevel": {
"Default": "Information"
},
"WriteTo": [
{
"Name": "File",
"Args": {
"path": "D:\\logs\\AzureFunction\\log_.log",
"rollingInterval": "Day",
"outputTemplate": "[{Timestamp:yyyy-MM-dd HH:mm:ss.fff zzz} {CorrelationId} {Level:u3}] {Username} {Message:lj}{NewLine}{Exception}"
}
}
]
}
}
Settings configured in local.settings.json will not be reflected in the Azure Function App: local.settings.json, as the name suggests, is for local use only, while in Azure you need to use app settings and read them.
Just add a new setting in the app settings and you can then use the code below to read it.
var appsettings = Environment.GetEnvironmentVariable("Name of the setting");
When you want to use external configuration with Serilog, you can use the Serilog.Settings.Configuration package.
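A minimal sketch of what that could look like (assuming the Serilog section is exposed through an IConfiguration you build yourself; the JSON file name here is hypothetical):
// Requires the Serilog.Settings.Configuration package.
using Microsoft.Extensions.Configuration;
using Serilog;

var configuration = new ConfigurationBuilder()
    .AddJsonFile("serilog.json", optional: true) // hypothetical file holding the "Serilog" section
    .AddEnvironmentVariables()
    .Build();

Log.Logger = new LoggerConfiguration()
    .ReadFrom.Configuration(configuration) // reads MinimumLevel, WriteTo, etc.
    .CreateLogger();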
You can also configure the minimum level for log events, so that all events with less importance than the specified minimum level will not be logged.
Log.Logger = new LoggerConfiguration()
.MinimumLevel.Debug()
.WriteTo.Console(restrictedToMinimumLevel: LogEventLevel.Information)
.CreateLogger();
Here we specify two things: .MinimumLevel.Debug() and restrictedToMinimumLevel. The latter dictates the minimum level for that particular sink. A sink is just a place you log to; sinks are configured using the WriteTo tag. For example, in the code above the sink is the console. There are other sinks too.
The minimum levels, in increasing order of importance, are Verbose, Debug, Information, Warning, Error, Fatal.
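To address the first part of the question (suppressing the Microsoft/system entries), a common approach is a per-namespace override. A sketch, reusing the file sink from the question (how much of the host's own chatter this filters depends on how the Functions host forwards its log categories):
using Serilog;
using Serilog.Events;

Log.Logger = new LoggerConfiguration()
    .MinimumLevel.Information()
    // Only log Warning and above for framework categories,
    // so your own logger.Information(...) calls stand out.
    .MinimumLevel.Override("Microsoft", LogEventLevel.Warning)
    .MinimumLevel.Override("System", LogEventLevel.Warning)
    .WriteTo.File(@"D:\logs\AzureFunction\log_.txt", rollingInterval: RollingInterval.Day)
    .CreateLogger();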
References:
Read configuration from app setting by Ashish Patel
Use serilog to filter the logs
Serilog setting configuration
I am currently trying to list all instances of an activity function and the orchestrator function using Azure Functions Core Tools. The application synchronizes data from different sources into a centralized location.
The setup is as follows:
TimerTrigger -> Durable Orchestrator -> Multiple Activity Functions
In my concrete example, it is like this:
Start Synchronization -> Orchestrate Synchronizations -> Synchronize Source
So we start the synchronization process, which starts the orchestrator. The orchestrator then starts multiple different synchronizations, one for each source. The problem, though, is that I cannot seem to get the Azure Functions Core Tools to list all instances of the functions I am interested in.
Unfortunately, I would really prefer not to have to use the REST api to query for this information. The setup really complicates things with IP restrictions and managed identity authentication. I think I can correct the setup to get things to work from my network + user, if really needed, but I think that will take way longer than required.
I have tried running the following command:
func durable get-instances
in a directory with a file called host.json with the following contents:
{
"version": "2.0",
"AzureWebJobsStorage":"DefaultEndpointsProtocol=https;AccountName=Name;AccountKey=Key;EndpointSuffix=core.windows.net",
}
I have also tried where the contents of the file are as follows:
{
"version": "2.0",
"extensions": {
"durableTask": {
"storageProvider": {
"connectionStringName": "DefaultEndpointsProtocol=https;AccountName=Name;AccountKey=Key;EndpointSuffix=core.windows.net"
}
}
}
}
I have tried calling the func durable get-instances with and without the --connection-string-setting parameter, with the values 'AzureWebJobsStorage' and 'extensions:durableTask:storageProvider:connectionStringName', but nothing seems to work. I keep getting the error No storage connection string found.
I know that the connection string is correct. I have pulled it directly from the storage account 'Access keys' blade.
Is there anything I am missing? What am I doing wrong?
Thanks to @juunas, I got it to work. I edited the host.json file to have the following content:
{
"version": "2.0"
}
and created another file called local.settings.json with the following contents:
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=Name;AccountKey=Key;EndpointSuffix=core.windows.net",
}
}
Running func durable get-instances now works; it returns a continuation token but an empty list. I was not expecting that, but now I can start exploring and understanding what is going on.
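For reference, the setting name can also be passed explicitly (AzureWebJobsStorage is the default, so the flag is optional here):
func durable get-instances --connection-string-setting AzureWebJobsStorage
If the instances were created by an app using a non-default Durable Task hub name, passing --task-hub-name with that hub's name may also be needed before anything shows up in the list.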
I'm working on a Azure Function project and for integration with FrontEnd purposes, I need to add a route prefix on the endpoints that is served by the application.
Right now, I'm adding the following code to my host.json.
{
"extensions": {
"http": { "routePrefix": "template" }
}
}
But this breaks my application when I upload it to the cloud.
Is there any way to add this code only locally, or to have a host.json configuration that only applies locally?
We created a .NET function app with an HTTP trigger using Visual Studio Code. We added the below route prefix to the host.json of the HTTP trigger function and tested it in the local environment, where it works fine.
Here is our host.json file; we haven't made any changes to the default function .cs file:
{
"version": "2.0",
"logging": {
"applicationInsights": {
"samplingSettings": {
"isEnabled": true,
"excludedTypes": "Request"
}
},
"extensions": { "http": { "routePrefix": "template" } }
}
}
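With that prefix in place, a default HTTP trigger (sketch below; the function name is the template default) is served locally at http://localhost:7071/template/Function1 instead of under the usual /api prefix:
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class Function1
{
    [FunctionName("Function1")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
        ILogger log)
    {
        // Reachable at /template/Function1 because of the routePrefix above.
        log.LogInformation("C# HTTP trigger function processed a request.");
        return new OkObjectResult("OK");
    }
}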
Here is the reference output from when we ran the function in Visual Studio Code:
Since we didn't find any issues in our local environment, we pushed the same code to a function app running in Azure with .NET as the runtime.
After deploying the code to the function app, we tested the function by invoking the function URL from PowerShell, and it succeeded.
Here is the reference output screenshot:
Here is the reference blog about creating routes in an Azure Function App.
I have a timer-triggered Azure Function which I execute in VS by right-clicking the Azure Function project and choosing Debug. The function has an ILogger log parameter.
Inspecting the log object, I can see that it has two loggers:
Azure.Functions.Cli.Diagnostics.ColoredConsoleLogger
Microsoft.Azure.WebJobs.Script.Diagnostics.FileLogger
I also can see that the RootLogPath is %temp%\LogFiles\Application\Functions.
However, at that location there is only a "Host" folder. I expected to find a "Function" folder as well, with the log file.
Do I need to enable the file logger somehow? Am I missing anything?
To get file logs in local development, we have to set fileLoggingMode to always in host.json. The default debugOnly setting doesn't make the Functions host write file logs locally.
For v2 Functions
{
"version": "2.0",
"logging": {
"fileLoggingMode": "always"
}
}
For v1 Functions
{
"tracing": {
"fileLoggingMode": "always"
}
}
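Once fileLoggingMode is set to always, a Function folder (with a subfolder per function containing the log files) should appear next to the Host folder under the RootLogPath mentioned in the question.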
I have a simple trigger-based Azure Function which connects to an Azure Event Hub and writes out a message each time one is received on the Event Hub.
I created this as a C# web application based on the post below and am trying to debug the function locally:
https://blogs.msdn.microsoft.com/appserviceteam/2017/03/16/publishing-a-net-class-library-as-a-function-app/
Here is my function code:
using Microsoft.Azure.WebJobs.Host;
using System;
using System.Threading.Tasks;
namespace FunctionLibrary
{
public class EventHubProcessorFunction
{
public static void Run(string myEventHubMessage, TraceWriter log)
{
log.Info($"C# Event Hub trigger function processed a Vineet message: {myEventHubMessage}");
}
}
}
Here is my function.json
{
"disabled": false,
"scriptFile": ".\\bin\\FunctionAsWebApp.dll",
"entryPoint": "FunctionLibrary.EventHubProcessorFunction.Run",
"bindings": [
{
"type": "eventHubTrigger",
"name": "myEventHubMessage",
"direction": "in",
"path": "edpvineethub",
"connection": "AzureWebJobsServiceBus"
}
]
}
My folder structure is as below. I have included the following files in the web application project:
bin\FunctionAsWebApp.dll
NameOfYourFunction\function.json
AnotherFunctionIfYouHaveOne\function.json
appsettings.json
host.json
However, I am getting the below error message when trying to run locally:
No job functions found. Try making your job classes and methods public. If you're using binding extensions (e.g. ServiceBus, Timers, etc.) make sure you've called the registration method for the extension(s) in your startup code (e.g. config.UseServiceBus(), config.UseTimers(), etc.).
Any help would be appreciated.
Check that your folder structure looks like this:
bin\FunctionAsWebApp.dll
NameOfYourFunction\function.json
AnotherFunctionIfYouHaveOne\function.json
appsettings.json
host.json
And function.json should, of course, reference the binary accordingly:
"scriptFile": "..\\bin\\FunctionAsWebApp.dll"
Your initialization is likely failing to perform a lookup on the SB connection string, as it expects an App Setting/Environment variable name and you have the actual connection string there.
Please update your function.json to use an App Setting name, defined in your appsettings.json locally and the Function App settings when hosted, with the connection string set as its value.
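For example (hypothetical setting name): the binding could use "connection": "EventHubConnection", and appsettings.json locally (or the Function App settings when hosted) would then contain a setting named EventHubConnection whose value is the actual Event Hub connection string.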
Important: Since you have your connection string pasted above, I strongly recommend resetting your credentials.