Looking for some advice here on developing a common logging approach for a NuGet package.
I currently have a private NuGet package, created from a C# library, that works perfectly for local console apps and as such was written to log to the Windows event log.
Now I want to be able to use that package in an Azure Function, but of course Azure uses a different logging mechanism.
Any advice on how to create a logging layer that allows the library to be compiled on my local PC in Visual Studio but also supports logging in Azure if the package is used there?
Cheers,
Dave
I believe you are looking for the Microsoft.Extensions.Logging namespace and its ILogger interface. This abstraction is used extensively across Azure services, .NET Framework 4+ and .NET Core. If you look for related NuGet packages you will find multiple implementations for different logging scenarios and tools, including native sinks such as the event log, Functions and the console, as well as third-party sinks such as Splunk and Serilog.
If you separate your logging implementation from your application/package code, take a dependency on Microsoft.Extensions.Logging and adhere to the ILogger interface, then your logging can easily be plumbed into many supported sinks, including Functions.
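As a minimal sketch of what that separation can look like (the OrderProcessor class and its method are hypothetical, not from your package), the library only ever sees the ILogger abstraction and each host decides which provider backs it:

// In the library: depend only on the Microsoft.Extensions.Logging abstractions.
using Microsoft.Extensions.Logging;

public class OrderProcessor
{
    private readonly ILogger<OrderProcessor> _logger;

    public OrderProcessor(ILogger<OrderProcessor> logger) => _logger = logger;

    public void Process(string orderId)
    {
        // The library doesn't know or care whether this ends up in the
        // Windows event log, the Functions log stream, or the console.
        _logger.LogInformation("Processing order {OrderId}", orderId);
    }
}

// In the local console app (Program.cs): wire up the event log provider
// (this assumes the Microsoft.Extensions.Logging.EventLog package is referenced).
class Program
{
    static void Main()
    {
        using var loggerFactory = LoggerFactory.Create(builder => builder.AddEventLog());
        var processor = new OrderProcessor(loggerFactory.CreateLogger<OrderProcessor>());
        processor.Process("12345");
    }
}

In an Azure Function you would instead hand the library the ILogger that the runtime supplies to the function, so no change to the package is needed.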
A previous answer of mine provides an earlier, more manual approach to this problem. See here.
I'm starting to wonder whether this is the right tool for the job, but still, here goes.
I'm attempting to automate the creation of our Azure Test environment using Azure SDK for JS. The environment spans many services (as you can imagine), including Classic ASP.NET app services.
Node is my safe space, so that is why I started with the JS SDK.
I have started scripting the creation of an app service using WebSiteManagementClient.webApps.createOrUpdate. I'm confused, though: there is seemingly no way to configure any of the following:
Which app service plan the app service should be connected to. This feels fundamental.
The operating system, Windows or Linux.
The stack version, .NET 4.8, .NET Core, or whatever.
Is it possible to configure the above using the JS SDK, or am I going to have to find another approach?
Update 23/03/21
Untested, but these are my findings so far:
App Service Plan - The plan is set using the serverFarmId property of the Site interface.
Operating system - Assuming Windows is the default; if you want a Linux app service, you change the kind property of Site from app to app,linux.
Stack & version - In the SiteConfig interface, you have linuxFxVersion and windowsFxVersion. Again, I think the assumption is 'latest .NET' (e.g. .NET 4.8). For .NET Core 3.1, the setting looks to be DOTNETCORE|3.1.
It can be achieved using the JS SDK; I checked the source code and it does support this. But I don't recommend using the JS SDK for it.
Because you have to drive the SDK yourself, there is a lot of internal plumbing you need to code, and this will waste a lot of your time. So I recommend you use the REST API instead.
The REST API operation names are similar to the names in the SDK, and you can test the API operations online, so you can pick exactly the method that achieves what you want.
Official doc
Web Apps - Create Or Update
As for your concerns, you only need to express all of the configuration as JSON and put it in the request body; a rough sketch of that call follows the tips below.
Tips:
First use the online interface to compose the JSON payload and create a web app that matches your needs, then integrate it into your code.
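To tie this back to the properties from the question's update, a minimal C# sketch of that REST call might look like the following. The resource names, location, api-version and token acquisition are placeholders/assumptions; check the Web Apps - Create Or Update reference for the exact contract.

using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

public static async Task CreateWebAppAsync(HttpClient http, string accessToken)
{
    // Placeholder URL: substitute your subscription, resource group, site name
    // and the api-version listed in the current REST reference.
    var url = "https://management.azure.com/subscriptions/{subscriptionId}" +
              "/resourceGroups/{resourceGroupName}/providers/Microsoft.Web" +
              "/sites/{siteName}?api-version=2021-02-01";

    // The same settings the JS SDK exposes map onto the request body:
    // serverFarmId -> App Service plan, kind -> OS, linuxFxVersion -> stack.
    var body = @"{
      ""location"": ""westeurope"",
      ""kind"": ""app,linux"",
      ""properties"": {
        ""serverFarmId"": ""<resource id of your App Service plan>"",
        ""siteConfig"": { ""linuxFxVersion"": ""DOTNETCORE|3.1"" }
      }
    }";

    http.DefaultRequestHeaders.Authorization =
        new AuthenticationHeaderValue("Bearer", accessToken);

    using var content = new StringContent(body, Encoding.UTF8, "application/json");
    using var response = await http.PutAsync(url, content);
    response.EnsureSuccessStatusCode();
}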
(If this question is poorly worded, could someone please help me clear it up?)
I have an Azure Function (2.0) which relies on some System.Drawing code. I've added a NuGet reference to System.Drawing.Common (4.5.0).
After publishing the app, however, when the function is called, it produces the error:
System.Private.CoreLib: Exception while executing function:
[MyFunctionName]. System.Drawing.Common: System.Drawing is not
supported on this platform.
As far as I'm aware, System.Drawing.Common is supported on .NET Core now, which I believe is the environment in which my Azure Function is running. The actual project is a .NET Standard 2.0 project, though.
I am confused as to how to resolve this. I've tried converting the project to a .NET Core 2.1 project but that led to bizarre errors related to "Metadata generation failed" and an inability to find System.Runtime.
My project references Microsoft.Azure.WebJobs.Extensions.EventGrid (2.0.0-beta2) if that's relevant.
It's not about the CLR, it's about the sandbox.
System.Drawing relies heavily on GDI/GDI+ to do its thing. Because of the somewhat risky nature of those APIs (large attack surface) they are restricted in the App Service sandbox.
Win32k.sys (User32/GDI32) Restrictions
For the sake of radical attack surface area reduction, the sandbox prevents almost all of the Win32k.sys APIs from being called, which practically means that most of User32/GDI32 system calls are blocked. For most applications this is not an issue since most Azure Web Apps do not require access to Windows UI functionality (they are web applications after all).
Try to find a different library that doesn't rely on System.Drawing/GDI, like ImageSharp.
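For example, a typical resize-and-save operation ported to ImageSharp looks roughly like this (a sketch against the SixLabors.ImageSharp package; exact method names can vary between versions):

using System.IO;
using SixLabors.ImageSharp;
using SixLabors.ImageSharp.Processing;

public static void ResizeToJpeg(Stream input, Stream output, int width, int height)
{
    // ImageSharp is fully managed, so it doesn't make the GDI32 calls
    // that the App Service sandbox blocks.
    using var image = Image.Load(input);
    image.Mutate(ctx => ctx.Resize(width, height));
    image.SaveAsJpeg(output);
}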
A little update that can help a lot of people.
You can now switch your Azure Function to v3:
in the Function app settings panel -> Runtime version ~3
With this setup your code runs in a sandbox that supports .NET Core 3 (you don't need to rebuild your DLL for Core 3; I run my .NET Core 2.1 DLL without errors), and surprise... you don't get this exception anymore:
System.Drawing.Common: System.Drawing is not supported on this platform.
This is because the runtime can find the GDI/GDI+ support it needs in the underlying system.
Now I can convert HTML to PDF without going crazy.
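If you prefer to make the same change outside the portal, the Runtime version toggle maps to the FUNCTIONS_EXTENSION_VERSION application setting, so something like the following Azure CLI call should be equivalent (the app and resource group names are placeholders):

az functionapp config appsettings set --name <APP_NAME> --resource-group <RESOURCE_GROUP> --settings FUNCTIONS_EXTENSION_VERSION=~3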
It's possible to install Application Insights via the extensions section in Azure App Services, but it's also possible to just install the packages via NuGet and define the APPINSIGHTS_INSTRUMENTATIONKEY application setting. You can also do both.
What is the difference?
Edit:
I have found what the differences are between installing the extension or the NuGet packages:
You can configure monitoring by instrumenting the app in either of two ways:
Run-time - You can select a performance monitoring extension when your web app is already live. It isn't necessary to rebuild or re-install your app. You get a standard set of packages that monitor response times, success rates, exceptions, dependencies, and so on.
Build time - You can install a package in your app in development. This option is more versatile. In addition to the same standard packages, you can write code to customize the telemetry or to send your own telemetry. You can log specific activities or record events according to the semantics of your app domain.
Source: https://learn.microsoft.com/en-us/azure/application-insights/app-insights-azure-web-apps#run-time-or-build-time
But what if you do both? Will there be anything beneficial about it?
The extension detects that your app has already brought Application Insights with it and won't do anything, except dropping a profiler, which helps collect full SQL statements in dependency telemetry. Without the profiler, full SQL statements won't be collected, but everything else should just work fine.
(If you are using SDK 2.3.0 or earlier, or if your application targets an old .NET Framework version like 4.0, then the profiler also does better correlation of dependencies.)
In short, starting with SDK 2.4.0, the only advantage of installing the extension on top of the NuGet installation is getting full SQL statements in dependency telemetry.
But what if you do both? Will there be anything beneficial about it?
As you know, we can install the packages via NuGet to use Application Insights. With this approach we can add custom telemetry data in code and monitor that telemetry with the Application Insights tools in Visual Studio, which is very convenient. You could also refer to this article on adding custom telemetry data.
Code in MVC project:
// Requires: using System.Diagnostics; using Microsoft.ApplicationInsights.DataContracts;
public ActionResult Index()
{
    // Writes to the standard trace listeners (picked up by Application Insights when its trace listener is configured).
    Trace.TraceInformation("my trace info Home/Index");

    // Sends custom telemetry directly to Application Insights.
    var telemetry = new Microsoft.ApplicationInsights.TelemetryClient();

    // A RequestTelemetry item could also be populated and sent with telemetry.TrackRequest(...).
    RequestTelemetry requestTelemetry = new RequestTelemetry();

    telemetry.TrackTrace("Home/Index Main");
    telemetry.TrackPageView("Home/Index");
    return View();
}
The telemetry data in the Application Insights tool:
With the Application Insights extension in App Service you can only see limited data and trends for the past 24 hours. That is very convenient for viewing the main telemetry directly in the App Service, but if you want to dig into the details it is not a good choice.
The most comprehensive monitoring data and features are in the Application Insights service itself. You can click 'View more in Application Insights' in the App Service monitoring extension to get there, or go to the Application Insights resource directly.
Time range selection in the Application Insights service (including custom time ranges):
I'm trying to wrap my head around how we're supposed to build Azure Functions.
I love the idea of building serverless, compact, single-function apps that respond to events.
Here are the problems I'm running into:
I have nice class libraries built in .NET Standard 2 that handle all my "backend needs", namely CRUD ops against Cosmos DB, Azure Table Storage, Azure SQL, Redis and Azure Storage. No matter what I did, I couldn't integrate these class libraries into an Azure Functions project. More details below.
Also, getting dependency injection working in an Azure Functions project has proven to be quite a task -- especially with the class libraries mentioned above.
At this point, the only option I'm seeing is to "copy and paste" code into a new Azure Functions project and use it without any DI.
This seems to go against "best practices". So what's the solution, other than either creating monolithic code or waiting until Azure Functions supports .NET Core and DI?
I thought I could use my .NET Standard class libraries from a regular Azure Functions project targeting .NET Framework. After all, the idea of .NET Standard is to "standardize" things. I opened a couple of posts here on SO. I'm providing the links so that you can see the issues I've run into:
Using .NET Core 2.0 Libraries in WebJob Targeting .NET Framework 4.7
No parameterless constructor error in WebJobs with .NET Core and Ninject
P.S. My previous posts refer to WebJobs. That was a plan B approach, because WebJobs seem to be half a step ahead of Azure Functions when it comes to supporting things like .NET Core and DI. Ultimately, I'd like to build a few Azure Functions that can use my class libraries built in .NET Standard 2.
Also, my previous posts mention that my class libraries target .NET Core 2.0. Since then I converted them to .NET Standard 2 which didn't really take much at all. I did this so that I truly conform to .NET Standard 2.
One issue is that Visual Studio has an outdated version of the Functions Core tools. Until this is resolved, you can work around in the following way:
Install the latest via npm by running npm install -g azure-functions-core-tools
In your Function App in VS, go to the Properties
Go to Debug, and click New... under Profile
Name the new Profile something like FunctionsNpm
Set the executable to (replace [YourUserName]): C:\Users\[YourUserName]\AppData\Roaming\npm\node_modules\azure-functions-core-tools\bin\func.exe
Set the arguments to host start
Set the working directory to $(TargetDir)
In the toolbar, look for the green triangle icon to change your current profile to the one you just created:
Now when you run from VS, you'll be using the npm tools instead of the older ones that come with the VS package. (The resulting launchSettings.json profile is sketched below.)
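For reference, the steps above end up as a profile in the project's Properties/launchSettings.json, roughly like this (the path and profile name match the example above; treat the exact shape as an assumption and let Visual Studio generate it if in doubt):

{
  "profiles": {
    "FunctionsNpm": {
      "commandName": "Executable",
      "executablePath": "C:\\Users\\[YourUserName]\\AppData\\Roaming\\npm\\node_modules\\azure-functions-core-tools\\bin\\func.exe",
      "commandLineArgs": "host start",
      "workingDirectory": "$(TargetDir)"
    }
  }
}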
.NET Standard 2 support is on its way; see this GitHub issue.
I am working on an ASP.NET Core web app hosted on Azure, and I want to write my ELMAH logs to my Azure Table storage. Many of the examples I looked into use an "API_KEY" and a "logbucketId", but I am not sure what they are.
For instance, as per the elmah.io docs here: https://docs.elmah.io/logging-to-elmah-io-from-aspnet-core/
app.UseElmahIo(
    "API_KEY",
    new Guid("LOG_ID"));
After installing the NuGet package, I don't see any API_KEY or LogBucketId in my appsettings.json file.
Where can I find my API_KEY and LogBucketId?
https://github.com/ElmahCore/ElmahCore supports .NET Core now.
It is not based on Modules or Handlers, but on middleware.
It works fine, although it is still limited.
The async calls are not being used at the moment, but it is quite extensible.
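For reference, wiring ElmahCore into an ASP.NET Core app looks roughly like this (a sketch based on the project's README at the time; check the repo for the current API, namespaces and options):

// In Startup.cs (assumes the ElmahCore NuGet package is referenced).
using ElmahCore;
using ElmahCore.Mvc;
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddMvc();
        services.AddElmah();   // registers the error log (in-memory by default)
    }

    public void Configure(IApplicationBuilder app)
    {
        app.UseElmah();        // logs unhandled exceptions and exposes the error UI at /elmah
        app.UseMvc();
    }
}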
ELMAH doesn't work with ASP.NET Core, since ASP.NET Core doesn't work with HttpModules and HttpHandlers (ELMAH stands for Error Logging Modules And Handlers). ASP.NET Core does include a new (pre-release) diagnostic tool called ELM (Error Logging Middleware - creative, right?). You can find its source and samples here:
https://github.com/aspnet/Diagnostics/tree/release/1.1/src/Microsoft.AspNetCore.Diagnostics.Elm
Another option similar to ELMAH (and ELM) but with more capabilities is Glimpse:
http://getglimpse.com/
I realize neither of these directly answers your question but hopefully you'll find one or both of these alternative tools useful.
You can check the ELMAH port to .NET Core:
https://github.com/ElmahCore/ElmahCore
Make sure it matches your expectations before you use it.
Note: This new port is in development and doesn't support all ELMAH features.
ELMAH doesn't support ASP.NET Core. The documentation you are linking to is from elmah.io. While they have similar names and elmah.io uses ELMAH for some of its integrations, they don't share code or documentation.