I am new to the Azure platform and simply trying to pull my IIS logs into my storage account.
When running locally and using the storage emulator, I can see the logs without issue. However, when deploying the application, the log files are never created and the only container I see in blob storage is "vsdeploy".
I have followed the steps outlined here: http://www.windowsazure.com/en-us/develop/net/common-tasks/diagnostics/
I have imported the Diagnostics module in my ServiceDefinition.csdef:
...
<Imports>
  <Import moduleName="Diagnostics" />
</Imports>
...
I have created a WebRole.cs class and configured directories for a scheduled transfer every two minutes.
public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // For information on handling configuration changes
        // see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.
        DiagnosticMonitorConfiguration diagConfig = DiagnosticMonitor.GetDefaultInitialConfiguration();

        // IIS logs
        diagConfig.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(2.0);

        DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", diagConfig);

        return base.OnStart();
    }
}
I have also verified my storage account connection string for cloud deployment is correct.
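For reference, the Diagnostics connection string setting in my ServiceConfiguration.Cloud.cscfg looks roughly like this (the account name and key below are placeholders):

<ConfigurationSettings>
  <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
           value="DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey" />
</ConfigurationSettings>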
Still, when I deploy, nothing is ever created in the storage account. What piece am I missing or not configuring correctly that is keeping me from my logs?
Thanks.
I have a .NET Core application that is deployed on a Service Fabric Linux cluster. Application Insights is configured in the app.
Startup.cs
public void ConfigureServices(IServiceCollection services)
{
    Microsoft.ApplicationInsights.AspNetCore.Extensions.ApplicationInsightsServiceOptions aiOptions
        = new Microsoft.ApplicationInsights.AspNetCore.Extensions.ApplicationInsightsServiceOptions
        {
            EnableAdaptiveSampling = false,
            EnableQuickPulseMetricStream = false,
            InstrumentationKey = "xxx"
        };

    services.AddApplicationInsightsTelemetry(aiOptions);
}
I have a controller class that has some action methods and logs the information.
[HttpPost]
public ActionResult actionMethod(...)
{
    TraceLine("------------------------------------");
    //some code
}

private static void TraceLine(string msg)
{
    msg = $">> {DateTime.UtcNow.ToString("o")}: {msg}";
    Log.Information(msg);
}
I am using Serilog, configured in appsettings.json & Program.cs
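For context, the Program.cs wiring is roughly the following sketch (it assumes the Serilog.AspNetCore and Serilog.Settings.Configuration packages; the actual sink list and minimum levels live in appsettings.json):

// Sketch only: Serilog bootstrapped from configuration.
public static IWebHost BuildWebHost(string[] args) =>
    WebHost.CreateDefaultBuilder(args)
        .UseStartup<Startup>()
        // Read the Serilog configuration (sinks, minimum level) from appsettings.json
        .UseSerilog((context, loggerConfiguration) =>
            loggerConfiguration.ReadFrom.Configuration(context.Configuration))
        .Build();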
When I hit the action method directly from my local machine (without hosting it even on a local SF cluster) via Postman, I see App Insights telemetry being generated and pushed to Azure.
azure app insights snapshot
But when I hit the action method deployed on the Azure Service Fabric cluster, I don't see any telemetry being generated.
What am I missing here?
Any help is much appreciated!
Well, we need to check a few things here:
1) Check the App Insights URL and the instrumentation key in the deployment parameter file for the cluster hosted in the cloud (Cloud.xml); an example follows below.
2) If Cloud.xml looks right, the next best step is to access the log files and see what the actual problem is.
There's a description here which explains how to discover where the log files are stored.
You can use RDP to access the machine, which is explained here.
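For item 1), the override in ApplicationParameters/Cloud.xml usually looks something like this (the parameter name here is hypothetical and must match the one declared in your ApplicationManifest.xml):

<Parameters>
  <Parameter Name="AppInsights_InstrumentationKey" Value="xxx" />
</Parameters>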
I was able to solve the issue by using Microsoft.ApplicationInsights.ServiceFabric.Native SDK in my application to log app insights.
Refer to the .NET Core section in ApplicationInsights-ServiceFabric for how to configure App Insights for a Service Fabric application.
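As a rough illustration, adapted from the .NET Core instructions in that repo (the endpoint name and listener wiring here are assumptions and must match your own service), the Service Fabric telemetry initializer is registered when building the web host:

// Inside your stateless service class (requires Microsoft.ApplicationInsights.ServiceFabric.Native
// and Microsoft.ServiceFabric.AspNetCore.Kestrel; "ServiceEndpoint" is whatever endpoint your
// ServiceManifest.xml declares).
protected override IEnumerable<ServiceInstanceListener> CreateServiceInstanceListeners()
{
    return new[]
    {
        new ServiceInstanceListener(serviceContext =>
            new KestrelCommunicationListener(serviceContext, "ServiceEndpoint", (url, listener) =>
                new WebHostBuilder()
                    .UseKestrel()
                    .ConfigureServices(services => services
                        .AddSingleton<StatelessServiceContext>(serviceContext)
                        // Attach Service Fabric context (service/partition/instance) to every telemetry item
                        .AddSingleton<ITelemetryInitializer>(sp =>
                            FabricTelemetryInitializerExtension.CreateFabricTelemetryInitializer(serviceContext)))
                    .UseStartup<Startup>()
                    .UseUrls(url)
                    .Build()))
    };
}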
I have a .NET Core 1.1 web API which uses ILogger to log to the console. These logs are picked up and sent to blobs under (appname)/month/day/hour if I turn on the Application Logging (Blob) option under Monitoring/Diagnostic Logs in the Azure portal. This works out pretty well.
However, my WebJob, which uses the same ILogger and the same config, does not have its output go to these mm/dd/hh directories.
If I turn on Application Logging (Filesystem) I see my logs, but I would really like them to go to the same blob storage location, so I can pick them up in Splunk.
Where am I going wrong?
According to your description, I created my console application with the target framework .NET Core 1.1 via VS2017 as follows:
NuGet packages: Microsoft.Extensions.Logging, Microsoft.Extensions.Logging.Console, Microsoft.Extensions.Logging.Debug and Microsoft.Extensions.Logging.AzureAppServices (the last one provides AddAzureWebAppDiagnostics).
Program.cs
using System.Threading;
using Microsoft.Extensions.Logging;

class Program
{
    static void Main(string[] args)
    {
        ILoggerFactory loggerFactory = new LoggerFactory()
            .AddConsole()
            .AddDebug()
            .AddAzureWebAppDiagnostics();

        ILogger<Program> logger = loggerFactory.CreateLogger<Program>();
        logger.LogInformation("Hello World!");
        logger.LogInformation("Sleeping for 5s before exit...");
        Thread.Sleep(5 * 1000);
    }
}
After deploying to Azure, I could see the logs under the Kudu console and in the blob log file.
So I'm fairly new to working with Azure and there are some things I can't quite wrap my head around. One of them is the Azure Storage Account.
My WebJob keeps stopping with the following error: "Unhandled Exception: System.InvalidOperationException: The account credentials for '[account_name]' are incorrect." Understanding the error, however, is not the problem, at least that's what I think. The problem lies in understanding why I need an Azure Storage Account to overcome it.
Please read on as I try to take you through the steps taken thus far. Hopefully the real question will become clearer to you.
In my efforts to deploy a WebJob on Azure we have created the following resources so far:
App Service Plan
App Service
SQL server
SQL database
I'm using the following code snippet to prevent my web job from exiting:
JobHostConfiguration config = new JobHostConfiguration();
config.DashboardConnectionString = null;
new JobHost(config).RunAndBlock();
To my understanding from other sources the Dashboard connection string is optional but the AzureWebJobsStorage connection string is required.
I tried setting the required connection string in portal using the configuration found here.
DefaultEndpointsProtocol=[http|https];AccountName=myAccountName;AccountKey=myAccountKey
Looking further I found this answer that clearly states where I would get the values needed, namely an/my missing Azure Storage Account.
So now for the actual question: Why do I need an Azure Storage Account when I seemingly have all the resources I need in place for the WebJob to run? What does it do? Is it a billing thing, because I thought we had that defined in the App Service Plan. I've tried reading up on Azure Storage Accounts over here but I need a bit more help understanding how it relates to everything.
From the docs:
An Azure storage account provides resources for storing queue and blob data in the cloud.
It's also used by the WebJobs SDK to store logging data for the dashboard.
Refer to the getting started guide and documentation for further information
The answer to your question is "No", it is not mandatory to use Azure Storage when you are trying to set up and run an Azure WebJob.
If you are using JobHost or JobHostConfiguration, then there is indeed a dependency on a storage account.
A sample code snippet is given below.
class Program
{
    static void Main()
    {
        Functions.ExecuteTask();
    }
}

public class Functions
{
    [NoAutomaticTrigger]
    public static void ExecuteTask()
    {
        // Execute your task here
    }
}
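Note that running without a JobHost also means giving up the SDK's automatic triggers and the dashboard; that trade-off is exactly what removes the storage dependency.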
The answer is no, you don't. You can have a WebJob run without being tied to an Azure Storage Account. Like Murray mentioned, your WebJob dashboard does use a storage account to log data but that's completely independent.
Azure WebJob SDK uses the storage connection string defined in the AzureWebJobsStorage and AzureWebJobsDashboard app settings for its logging and dashboard.
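These settings are ordinary storage connection strings; in an App.config they would typically look like this (account name and key are placeholders):

<connectionStrings>
  <add name="AzureWebJobsStorage"
       connectionString="DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey" />
  <add name="AzureWebJobsDashboard"
       connectionString="DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey" />
</connectionStrings>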
WebJob SDK creates the following blob container in AzureWebJobsStorage:
azure-webjobs-hosts
WebJob SDK creates the following blob containers in AzureWebJobsDashboard:
azure-jobs-host-output
azure-webjobs-hosts
Many blobs are created in the above containers as the WebJob runs, and they can grow without bound if there is no clean-up mechanism.
What is the cleanup mechanism for the above blob containers?
Update
The answer below is a workaround. At this point there is no built-in mechanism to clean up the WebJobs logs, and they can pile up considerably when a job runs for a long time. Developers must implement a clean-up mechanism on their own; an Azure Function is a good way to implement such a process. An example is provided in the answer below.
What is the clean up mechanism for the blobs that WebJobs SDK creates in the AzureWebJobsDashboard connection?
I haven't found a way to do it. There is an open issue on GitHub related to this topic, but it hasn't been closed.
No way to set webjob logging retention policy
In a similar issue on GitHub we found that the Azure WebJobs SDK has changed to writing logs to multiple Azure Table Storage tables, one per month, so the table for a given month can easily be deleted. Logs written to Azure Blob Storage, however, are not grouped by month yet.
WebJobs.Logging needs to support log purge / retention policies
To delete older WebJob logs, I suggest you create a time-triggered WebJob (or Azure Function) that deletes the logs you no longer want.
Is there any Azure Function code sample that shows how to do the blob clean-up?
Code below is for your reference.
// Parse the connection string and return a reference to the storage account.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(storageConnectionString);

// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

// Retrieve a reference to the container that the WebJobs SDK writes to.
var container = blobClient.GetContainerReference("azure-webjobs-hosts");

// Find the blobs that were last modified more than 30 days ago.
var blobs = container.GetDirectoryReference("output-logs").ListBlobs().OfType<CloudBlob>()
    .Where(b => b.Properties.LastModified < new DateTimeOffset(DateTime.Now.AddDays(-30)));

// Delete these blobs.
foreach (var item in blobs)
{
    item.DeleteIfExists();
}
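If you run this clean-up as a time-triggered WebJob, the schedule can go in a settings.job file next to the executable, for example { "schedule": "0 0 3 * * *" } to run it daily at 03:00 (the CRON expression here is just an example).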
When I run my worker role locally, I can open the Windows Azure Compute Emulator application and look at the standard output and error of my worker process.
When I remote desktop into my Azure instance, I don't know where to get that same information. Where do I find standard output and error?
If you want to see the standard output and error of your worker process in an actual deployment, you will need some additional configuration, because this data must be stored in persistent storage.
The first step is to enable Diagnostics in the configuration window of your WorkerRole; a storage account must be specified there.
The next step is to add some code to the OnStart() method of your WorkerRole. Here you can configure not only the standard output and error, but also Windows event logs and performance counters, as in the following example.
public override bool OnStart()
{
    DiagnosticMonitorConfiguration diagConfig =
        DiagnosticMonitor.GetDefaultInitialConfiguration();

    // Windows event logs
    diagConfig.WindowsEventLog.DataSources.Add("System!*");
    diagConfig.WindowsEventLog.DataSources.Add("Application!*");
    diagConfig.WindowsEventLog.ScheduledTransferLogLevelFilter = LogLevel.Error;
    diagConfig.WindowsEventLog.ScheduledTransferPeriod = TimeSpan.FromMinutes(5);

    // Azure application logs (trace output)
    diagConfig.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
    diagConfig.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(5);

    // Performance counters
    diagConfig.PerformanceCounters.DataSources.Add(
        new PerformanceCounterConfiguration()
        {
            SampleRate = TimeSpan.FromSeconds(5),
            CounterSpecifier = @"\Processor(*)\% Processor Time"
        });
    diagConfig.PerformanceCounters.ScheduledTransferPeriod =
        TimeSpan.FromMinutes(5);

    DiagnosticMonitor.Start(
        "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", diagConfig);

    return base.OnStart();
}
After these settings are in place, your diagnostic data will be visible in the configured Azure Table storage. You can write your own tools to visualize the data, but there are also commercial tools with built-in support for this, for example Cerebrata Diagnostics Manager.
If for some reason you don't want to use Azure Storage for storing log files, you can implement a custom trace listener that writes logs anywhere else; here is a description of how to do that. You could, for example, open an HTTP port and transfer the logs to your own server.
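A minimal sketch of such a listener, assuming a hypothetical HttpLogListener class and endpoint URL (production code would need buffering, retries and error handling):

using System;
using System.Diagnostics;
using System.Net.Http;
using System.Text;

public class HttpLogListener : TraceListener
{
    private static readonly HttpClient client = new HttpClient();
    private readonly string endpoint; // e.g. "https://logs.example.com/ingest" (placeholder)

    public HttpLogListener(string endpoint)
    {
        this.endpoint = endpoint;
    }

    public override void Write(string message) => Post(message);

    public override void WriteLine(string message) => Post(message + Environment.NewLine);

    private void Post(string message)
    {
        // Fire-and-forget HTTP POST of the trace message to your own server.
        client.PostAsync(endpoint, new StringContent(message, Encoding.UTF8, "text/plain"));
    }
}

The listener can then be added to Trace.Listeners in code or registered under <system.diagnostics> in the configuration file.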
Trace messages are not stored anywhere in Windows Azure by default. If you configure Azure Diagnostics properly, those messages are sent to Windows Azure Table Storage (the WADLogsTable table), and you can retrieve them from there.
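As a rough sketch of reading those entries back with the storage table client (the connection string variable and the 100-entry limit are placeholders; WADLogsTable stores the trace text in its Message column):

// Connect to the diagnostics storage account and open the WADLogsTable.
CloudStorageAccount account = CloudStorageAccount.Parse(diagnosticsConnectionString);
CloudTableClient tableClient = account.CreateCloudTableClient();
CloudTable logsTable = tableClient.GetTableReference("WADLogsTable");

// Read the first 100 entities and print the trace message of each one.
TableQuery<DynamicTableEntity> query = new TableQuery<DynamicTableEntity>().Take(100);
foreach (DynamicTableEntity entity in logsTable.ExecuteQuery(query))
{
    Console.WriteLine(entity.Properties["Message"].StringValue);
}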
If you want to know how to enable Azure Diagnostics for traces, visit the link below and look for the Windows Azure Diagnostics demonstration code sample:
http://msdn.microsoft.com/en-us/library/windowsazure/hh411529.aspx
You can learn details about Azure Diagnostics here.