Azure WebJob not logging to blob storage - azure

I have a .NET Core 1.1 web API which uses ILogger to log to the console. These logs are picked up and sent to blobs under (appname)/month/day/hour if I turn on the Application Logging (Blob) option under Monitoring > Diagnostics logs in the Azure portal. This works out pretty well.
However, my WebJob, which uses the same ILogger and the same config, does not have its output go to these mm/dd/hh directories.
If I turn on Application Logging (Filesystem) I see my logs, but I would really like them to go to the same blob storage location so I can pick them up in Splunk.
Where am I going wrong?

Based on your description, I created a console application targeting .NET Core 1.1 in VS2017 as follows:
NuGet packages:
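(Presumably, judging by the calls below: Microsoft.Extensions.Logging.Console, Microsoft.Extensions.Logging.Debug and Microsoft.Extensions.Logging.AzureAppServices, the package that provides AddAzureWebAppDiagnostics.)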
Program.cs
using System.Threading;
using Microsoft.Extensions.Logging;

class Program
{
    static void Main(string[] args)
    {
        // Log to the console, the debugger, and Azure App Service diagnostics
        // (the provider behind Application Logging (Filesystem/Blob)).
        ILoggerFactory loggerFactory = new LoggerFactory()
            .AddConsole()
            .AddDebug()
            .AddAzureWebAppDiagnostics();

        ILogger<Program> logger = loggerFactory.CreateLogger<Program>();
        logger.LogInformation("Hello World!");
        logger.LogInformation("Sleeping for 5s before exit...");

        // Give the logging providers time to flush before the process exits.
        Thread.Sleep(5 * 1000);
    }
}
After deploying to Azure, I could see the logs in the Kudu console and in the blob log file as follows:

Related

Application Insights not generated for .NET Core app deployed on Service Fabric Linux cluster

I have a .NET Core application that is deployed on a Service Fabric Linux cluster. Application Insights is configured in the app.
Startup.cs
public void ConfigureServices(IServiceCollection services)
{
    ApplicationInsights.AspNetCore.Extensions.ApplicationInsightsServiceOptions aiOptions =
        new ApplicationInsights.AspNetCore.Extensions.ApplicationInsightsServiceOptions
        {
            EnableAdaptiveSampling = false,
            EnableQuickPulseMetricStream = false,
            InstrumentationKey = "xxx"
        };

    services.AddApplicationInsightsTelemetry(aiOptions);
    // ...
}
I have a controller class that has some action methods and logs the information.
[HttpPost]
public ActionResult actionMethod(...)
{
    TraceLine("------------------------------------");
    // some code
}

private static void TraceLine(string msg)
{
    msg = $">> {DateTime.UtcNow.ToString("o")}: {msg}";
    Log.Information(msg);
}
I am using Serilog, configured in appsettings.json & Program.cs
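The Serilog-to-Application-Insights wiring isn't shown above, but a typical Program.cs setup looks roughly like the following sketch (assuming the Serilog.Sinks.ApplicationInsights and Serilog.Settings.Configuration packages; configuration is the app's IConfiguration):

// Sketch only: exact sink overloads depend on the package version in use.
Log.Logger = new LoggerConfiguration()
    .ReadFrom.Configuration(configuration)        // Serilog settings from appsettings.json
    .WriteTo.ApplicationInsights(
        TelemetryConfiguration.Active,            // uses the configured instrumentation key
        TelemetryConverter.Traces)                // send Serilog events as AI trace telemetry
    .CreateLogger();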
When I hit the action method directly from my local machine via Postman (without hosting it on even a local SF cluster), I see App Insights telemetry getting generated and pushed to Azure.
(Screenshot: Azure App Insights snapshot.)
But when I hit the action method deployed on the Azure Service Fabric cluster, I don't see any telemetry getting generated.
What am I missing here?
Any help is much appreciated!
Well, we need to check a few things here:
1) Check the App Insights URL and the instrumentation key in the deployment parameter file for the cluster hosted in the cloud (Cloud.xml).
2) After checking Cloud.xml, the best way is to access the log files and see what the actual problem is.
There's a description here which explains how to discover where the log files are stored.
You can use RDP to access the machine, which is explained here.
I was able to solve the issue by using the Microsoft.ApplicationInsights.ServiceFabric.Native SDK in my application to log to App Insights.
Refer to the .NET Core section in ApplicationInsights-ServiceFabric for how to configure insights for a Service Fabric application.

Monitoring Azure WebJobs Health based on the errors in the WebJob logs with Application Insights

I configured a multi-step web test for monitoring Azure WebJobs health using Azure Application Insights by following this documentation. But this multi-step web test only checks the status of the Azure WebJob, i.e. whether it is "Running", "Failed" or "Aborted".
Sometimes the Azure WebJob is Aborted, but the job inside it still runs. So I need to monitor the status of the Azure WebJob based on errors in the logs, as shown in the figure below, using a multi-step web test.
You could use Application Insights Integration to implement it. The LogCategoryFilter has a Default property with an initial value of Information, meaning that any messages with levels of Information, Warning, Error or Critical will be logged.
You need three packages in total:
Microsoft.Azure.WebJobs.Logging.ApplicationInsights
Microsoft.Extensions.Logging
Microsoft.Extensions.Logging.Console
Configure the JobHostConfiguration
string instrumentationKey = Environment.GetEnvironmentVariable("APPINSIGHTS_INSTRUMENTATIONKEY");
if (!string.IsNullOrEmpty(instrumentationKey))
{
    // Build up a LoggerFactory with Application Insights and a console logger.
    config.LoggerFactory = new LoggerFactory()
        .AddApplicationInsights(instrumentationKey, null)
        .AddConsole();
    config.Tracing.ConsoleLevel = TraceLevel.Off;
}
Note: don't forget to add APPINSIGHTS_INSTRUMENTATIONKEY to your application settings.
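For reference, the snippet above normally sits in Program.Main. A minimal sketch of the full wiring (WebJobs SDK 2.x assumed; the LogCategoryFilter shown here replaces the null filter used above, and its name should be checked against your SDK version):

static void Main()
{
    var config = new JobHostConfiguration();

    string instrumentationKey = Environment.GetEnvironmentVariable("APPINSIGHTS_INSTRUMENTATIONKEY");
    if (!string.IsNullOrEmpty(instrumentationKey))
    {
        // Forward Information and above for every log category.
        var filter = new LogCategoryFilter { DefaultLevel = LogLevel.Information };

        config.LoggerFactory = new LoggerFactory()
            .AddApplicationInsights(instrumentationKey, filter.Filter)
            .AddConsole(filter.Filter);
        config.Tracing.ConsoleLevel = TraceLevel.Off;
    }

    new JobHost(config).RunAndBlock();
}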
I tested with a ProcessQueueMessage WebJob.
public static void ProcessQueueMessage([QueueTrigger("myqueue")] string message, ILogger logger)
{
    logger.LogInformation("it is a error......");
    logger.LogError(new Exception(), "it is a test error...");
}
This is my WebJob log.
And this is the Application Insights page. You can see that the Information, Warning and Exception entries are all shown there.

Azure App Service Application Log not working

I have an Azure App Service hosting a .NET Core API, which in turn has a sub-application (virtual directory). I have enabled the application log in the diagnostics settings in the Azure portal. I had done this for another service and it worked fine. But with a service that has a multiple-virtual-directory setup, it fails. Do we need any extra code in the Configure section for this scenario?
I can't reproduce the issue you mentioned; getting application logs works well for me. I also created a virtual directory in App Service (refer to this article), and no extra code was needed. You could follow my steps.
In the Azure portal > App Service > Application settings (create the virtual directory 'janley'):
Code in the application (add the trace info):
// Requires: using System.Diagnostics;
public ActionResult Index()
{
    Trace.TraceInformation("my trace info Home/Index");
    return View();
}

public ActionResult About()
{
    Trace.TraceInformation("my trace info Home/About");
    return View();
}

public ActionResult Contact()
{
    Trace.TraceInformation("my trace info Home/Contact");
    return View();
}
You can see the logs in Kudu like this:
Besides, for more details about how to trace and view the application logs, you could read this article.
Changing the app from .NET Core to .NET Framework worked for me!
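If you want to stay on .NET Core instead, the App Service application-log providers can be wired up explicitly. A sketch, assuming ASP.NET Core 2.x and the Microsoft.Extensions.Logging.AzureAppServices package:

public static IWebHost BuildWebHost(string[] args) =>
    WebHost.CreateDefaultBuilder(args)
        .ConfigureLogging(logging =>
        {
            // Forwards ILogger output to App Service diagnostics,
            // i.e. Application Logging (Filesystem) and (Blob).
            logging.AddAzureWebAppDiagnostics();
        })
        .UseStartup<Startup>()
        .Build();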

Does the Azure WebJobs SDK support pushing TextWriter logs into App Insights?

With the Azure WebJobs SDK, the process of adding logging to your functions is relatively straightforward: add a TextWriter parameter to your triggered function and write to it. That's it.
The SDK will then associate and display these logs with their execution instances in the WebJobs Dashboard, which provides a relatively data-rich, yet frictionless view into the operation of your WebJobs.
While this data is replicated into a user-accessible Azure Storage blob container, custom code would be required to periodically push these logs to App Insights, which is undesirable.
I'm looking for ideas or solutions for how to get all logs written via the injected TextWriter pushed to App Insights (or OMS, for that matter), complete with the WebJobs execution/trigger instance metadata, thereby allowing a unified consumption experience across log analytics tools.
Based on this feature being tracked in the WebJobs SDK, I'm assuming that for now this is not possible? A long while back I looked into injecting my own TextWriter instance, but I would have had to fork the WebJobs SDK and use a customized assembly that changed a lot of the architecture.
You can write a custom TraceWriter that sends logs to App Insights:
using System.Diagnostics;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.Azure.WebJobs.Host;

public class AppInsightsTraceWriter : TraceWriter
{
    private readonly TelemetryClient _telemetryClient;

    public AppInsightsTraceWriter(TraceLevel level, TelemetryClient telemetryClient)
        : base(level)
    {
        _telemetryClient = telemetryClient;
    }

    public override void Trace(TraceEvent traceEvent)
    {
        // Forward each WebJobs trace event to Application Insights as a custom event.
        var eventTelemetry = new EventTelemetry { Name = "WebjobTraceEvent" };
        eventTelemetry.Properties.Add(traceEvent.Level.ToString(), traceEvent.ToString());
        _telemetryClient.TrackEvent(eventTelemetry);
    }
}
In this example, I inject the TelemetryClient because you should only have one instance of it in your application.
Now you just need to configure the JobHost to use your custom writer:
// Initialize the WebJob configuration.
var config = new JobHostConfiguration();

// Only one instance of the telemetry client is needed.
var telemetryClient = new TelemetryClient { InstrumentationKey = "MyInstrumentationKey" };

// Add the App Insights tracer for WebJob logs/traces.
config.Tracing.Tracers.Add(new AppInsightsTraceWriter(TraceLevel.Info, telemetryClient));

// Detect when the WebJob shuts down.
var cancellationToken = new WebJobsShutdownWatcher().Token;
cancellationToken.Register(() =>
{
    // Before shutting down, flush the App Insights client.
    telemetryClient.Flush();
});

new JobHost(config).RunAndBlock();
So if you have a function like this:
public static void ProcessQueueMessage([QueueTrigger("myqueue")] string logMessage, TextWriter log)
{
    log.WriteLine(logMessage);
}
Every time you use log.WriteLine, an event will be sent to App Insights.
Note: with this sample, logs from the JobHost itself are also sent to App Insights.
This is super old (not sure why SO decided to put it in the sidebar for me after this long), but for anyone else who stumbles upon this: App Insights is now the recommended way to monitor WebJob executions.
Check out the documentation here, which steps through the process of connecting App Insights to WebJobs.
This link walks you through configuring the logging portion of a new WebJobs project. Check through the earlier sections to make sure that you've got all the prerequisites.
https://learn.microsoft.com/en-us/azure/app-service/webjobs-sdk-get-started#add-application-insights-logging
static async Task Main()
{
    var builder = new HostBuilder();
    builder.UseEnvironment(EnvironmentName.Development);
    builder.ConfigureWebJobs(b =>
    {
        b.AddAzureStorageCoreServices();
        b.AddAzureStorage();
    });
    builder.ConfigureLogging((context, b) =>
    {
        b.AddConsole();

        // If the key exists in settings, use it to enable Application Insights.
        string instrumentationKey = context.Configuration["APPINSIGHTS_INSTRUMENTATIONKEY"];
        if (!string.IsNullOrEmpty(instrumentationKey))
        {
            b.AddApplicationInsightsWebJobs(o => o.InstrumentationKey = instrumentationKey);
        }
    });

    var host = builder.Build();
    using (host)
    {
        await host.RunAsync();
    }
}
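A minimal triggered function to go with that host could look like this (a sketch; it assumes a Storage queue named "queue" and the storage bindings registered by AddAzureStorage above):

public class Functions
{
    // Messages arriving on the "queue" queue are logged via ILogger,
    // which the host forwards to the console and Application Insights.
    public static void ProcessQueueMessage([QueueTrigger("queue")] string message, ILogger logger)
    {
        logger.LogInformation(message);
    }
}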
I will share the detailed steps to use Application Insights in an Azure WebJob; please refer to them.
1. Create a new Azure Application Insights resource in the Azure portal.
2. Create an Azure WebJob project in Visual Studio and install Microsoft.ApplicationInsights.
3. Set the instrumentation key and send telemetry:
public static void ProcessQueueMessage([QueueTrigger("queuename")] string message, TextWriter log)
{
    TelemetryClient tc = new TelemetryClient();
    tc.InstrumentationKey = "key copied from Azure portal";
    tc.TrackTrace(message);
    tc.Flush();
    //log.WriteLine(message);
}
This documentation explains how to monitor usage and performance in Windows desktop apps; you can refer to it to learn how to use Azure Application Insights in a non-web application. Besides, ApplicationInsights.Helpers.WebJobs could also be helpful.

Azure diagnostics logs not copied to storage account

I am new to the Azure platform and simply trying to pull my IIS logs into my storage account.
When running locally and using the storage emulator, I can see the logs without issue. However, when deploying the application, the log files are never created and the only container I see in blob storage is "vsdeploy".
I have followed the steps outlined here: http://www.windowsazure.com/en-us/develop/net/common-tasks/diagnostics/
I have imported the diagnostics module in my ServiceDefinition.csdef
...
<Imports>
    <Import moduleName="Diagnostics" />
...
I have created a WebRole.cs class and configured directories for a scheduled transfer every two minutes.
public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // For information on handling configuration changes
        // see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.
        DiagnosticMonitorConfiguration diagConfig = DiagnosticMonitor.GetDefaultInitialConfiguration();

        // Transfer the configured directories (including IIS logs) every two minutes.
        diagConfig.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(2.0);

        DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", diagConfig);
        return base.OnStart();
    }
}
I have also verified my storage account connection string for cloud deployment is correct.
Still, when I deploy, nothing is ever created in the storage account. What piece am I missing or not configuring correctly that is keeping me from my logs?
Thanks.
