Storing console application logs in Azure Storage

I have a console application that runs on on-premises servers, triggered by a task scheduler. The application performs various actions that need to be logged. It generates around 200 KB of logs per run and runs every hour.
Since the server is not accessible to us, I am planning to store the logs in Azure. I have read about Blob and Table storage.
I would like to know the best strategy for storing these logs in Azure.
Thank you.

Though you can write logging data to Azure Storage (both Blobs and Tables), it would actually make more sense to use Azure Application Insights for this.
I recently did the same for a console application I built and found it incredibly simple.
I created an Application Insights resource in my Azure subscription and got its instrumentation key. I then installed the Application Insights SDK and referenced the appropriate namespaces in my project:
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;
This is how I initialized the telemetry client:
var appSettingsReader = new AppSettingsReader();
var appInsightsInstrumentationKey = (string)appSettingsReader.GetValue("AppInsights.InstrumentationKey", typeof(string));

// Point the default telemetry pipeline at the instrumentation key from app settings;
// the client picks the key up from its configuration.
TelemetryConfiguration configuration = TelemetryConfiguration.CreateDefault();
configuration.InstrumentationKey = appInsightsInstrumentationKey;
telemetryClient = new TelemetryClient(configuration);
For logging trace data, I simply did the following:
TraceTelemetry telemetry = new TraceTelemetry(message, SeverityLevel.Verbose);
telemetryClient.TrackTrace(telemetry);
For logging error data, I simply did the following:
try
{
    // ... the work being performed by the console app ...
}
catch (Exception excep)
{
    var message = string.Format("Error. {0}", excep.Message);
    ExceptionTelemetry exceptionTelemetry = new ExceptionTelemetry(excep);
    // Attach the formatted message as a custom property of the exception telemetry.
    exceptionTelemetry.Properties["FormattedMessage"] = message;
    telemetryClient.TrackException(exceptionTelemetry);
    telemetryClient.Flush();
    Task.Delay(5000).Wait(); // Wait for 5 seconds before terminating the application
}
Just keep one thing in mind: make sure you wait some time (5 seconds is good enough) for the data to flush before terminating the application.
If you're still keen on writing logs to Azure Storage, then depending on the logging library you're using you will find suitable adapters that write directly to Azure Storage.
For example, there's an NLog target for Azure Tables: https://github.com/harouny/NLog.Extensions.AzureTableStorage (though this project is not actively maintained).
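If you want to skip a logging library entirely, writing log entries straight to Table Storage takes only a few lines with the classic storage client library. Here is a minimal sketch; the entity shape, table name, and connection-string placeholder are illustrative assumptions, not part of any particular library:

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

// Hypothetical log entity: one partition per day, time-ordered row keys.
public class LogEntry : TableEntity
{
    public LogEntry() { }
    public LogEntry(string message)
    {
        PartitionKey = DateTime.UtcNow.ToString("yyyyMMdd");
        RowKey = DateTime.UtcNow.ToString("HHmmss.fffffff") + "-" + Guid.NewGuid();
        Message = message;
    }
    public string Message { get; set; }
}

public static class TableLogger
{
    public static void WriteLog(string message)
    {
        var account = CloudStorageAccount.Parse("<your-connection-string>");
        var table = account.CreateCloudTableClient().GetTableReference("ConsoleAppLogs");
        table.CreateIfNotExists(); // No-op after the first run.
        table.Execute(TableOperation.Insert(new LogEntry(message)));
    }
}

At roughly 200 KB per hourly run this stays well within Table Storage limits, though querying the entries back is less convenient than with Application Insights.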

Related

Best way to log app data from a console app running inside a container on ACI

A seemingly simple ask, but I've had no luck finding the best way.
I need to log application events from a console app that will spin up inside a container, do some work, and then die.
How can I log custom data from inside the container?
I've tried Azure Monitor: I created a workspace and used the HTTP Data Collector API inside the app, but had no joy working out where the logs were being stored.
Is there a simple way to log to an Azure Storage account and then use Azure Monitor to manage the events?
I've been googling for hours, but a lot of the posts are 8 years old and no longer relevant, and I can't find a simple use case in modern Azure.
Perhaps it's so simple I just cannot see it.
Any pointers or links gratefully received!
Thanks,
Paul
Why not trace events using Application Insights custom events?
https://learn.microsoft.com/en-us/azure/azure-monitor/app/api-custom-events-metrics
With that you can trace events with any metadata and check them in the Application Insights blade in the Azure portal, or retrieve them through the Application Insights SDK or the API.
You just need to create an Application Insights instance and use its instrumentation key.
SDK: https://github.com/Microsoft/ApplicationInsights-dotnet
API: https://dev.applicationinsights.io/reference
Code sample to write events:
TelemetryClient client = new TelemetryClient();
client.InstrumentationKey = "INSERT YOUR KEY";
client.TrackEvent("SomethingInterestingHappened");
You can also send more than just a string value:
client.TrackEvent("PurchaseOrderSubmitted",
    new Dictionary<string, string>()
    {
        { "CouponCode", "JULY2015" }
    },
    new Dictionary<string, double>()
    {
        { "OrderTotal", 68.99 },
        { "ItemsOrdered", 5 }
    });

Do I need an Azure Storage Account to run a WebJob?

So I'm fairly new to working with Azure and there are some things I can't quite wrap my head around. One of them is the Azure Storage account.
My WebJob keeps stopping with the following error: "Unhandled Exception: System.InvalidOperationException: The account credentials for '[account_name]' are incorrect." Understanding the error, however, is not the problem, at least that's what I think. The problem lies in understanding why I need an Azure Storage account to overcome it.
Please read on as I take you through the steps taken so far. Hopefully the real question will become clearer.
In my efforts to deploy a WebJob on Azure we have created the following resources so far:
App Service Plan
App Service
SQL server
SQL database
I'm using the following code snippet to prevent my web job from exiting:
JobHostConfiguration config = new JobHostConfiguration();
config.DashboardConnectionString = null;
new JobHost(config).RunAndBlock();
To my understanding from other sources the Dashboard connection string is optional but the AzureWebJobsStorage connection string is required.
I tried setting the required connection string in portal using the configuration found here.
DefaultEndpointsProtocol=[http|https];AccountName=myAccountName;AccountKey=myAccountKey
Looking further I found this answer that clearly states where I would get the values needed, namely an/my missing Azure Storage Account.
So now for the actual question: why do I need an Azure Storage account when I seemingly have all the resources I need in place for the WebJob to run? What does it do? Is it a billing thing? I thought we had that defined in the App Service Plan. I've tried reading up on Azure Storage accounts over here, but I need a bit more help understanding how it relates to everything.
From the docs:
An Azure storage account provides resources for storing queue and blob data in the cloud.
It's also used by the WebJobs SDK to store logging data for the dashboard.
Refer to the getting started guide and documentation for further information
The answer to your question is no: it is not mandatory to use Azure Storage to set up and run an Azure WebJob.
If you are using JobHost or JobHostConfiguration, however, then there is indeed a dependency on a storage account.
A sample code snippet that avoids JobHost entirely is given below.
class Program
{
    static void Main()
    {
        Functions.ExecuteTask();
    }
}

public class Functions
{
    [NoAutomaticTrigger]
    public static void ExecuteTask()
    {
        // Execute your task here
    }
}
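Conversely, if you do want to keep the JobHost, the storage connection string can also be supplied in code instead of config. A minimal sketch, assuming the JobHostConfiguration property names from WebJobs SDK 2.x (verify against your SDK version):

JobHostConfiguration config = new JobHostConfiguration();
// Equivalent of the AzureWebJobsStorage app setting, set programmatically.
config.StorageConnectionString = "<your-storage-connection-string>";
// The dashboard remains optional; null disables dashboard logging.
config.DashboardConnectionString = null;
new JobHost(config).RunAndBlock();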
The answer is no, you don't. You can have a WebJob run without being tied to an Azure Storage Account. Like Murray mentioned, your WebJob dashboard does use a storage account to log data but that's completely independent.

Azure Storage services logs

I am a beginner with Azure and need some help. We are facing a bit of a problem with the Azure Storage services and are unable to proceed.
The issue is described here:
http://blogs.msdn.com/b/windowsazurestorage/archive/2014/08/05/microsoft-azure-storage-service-version-removal.aspx
To summarize:
We have to inspect the log version of all our blob, table, and queue requests in case any of them use the version set for planned removal. I have enabled logging for the web application in the Azure portal, and I can see the three service endpoints:
https://<account>.blob.core.windows.net
https://<account>.table.core.windows.net
https://<account>.queue.core.windows.net
The articles I have read show the log format below, which includes a version, but do NOT specify where to locate the logs or how to gather them. I have tried different things, including https://<account>.blob.core.windows.net/$logs, but it makes no difference.
The logs should be in this format (sample):
Here is a sample log entry; in this case the request was an anonymous GetBlob request which implicitly used the 2009-09-19 version (note the version field in the entry):
1.0;2011-08-09T18:52:40.9241789Z;GetBlob;AnonymousSuccess;200;18;10;anonymous;;myaccount;blob;"https://myaccount.blob.core.windows.net/thumbnails/lake.jpg?timeout=30000";"/myaccount/thumbnails/lake.jpg";a84aa705-8a85-48c5-b064-b43bd22979c3;0;123.100.2.10;2009-09-19;252;0;265;100;0;;;"0x8CE1B6EA95033D5";Friday, 09-Aug-11 18:52:40 GMT;;;;"8/9/2011 6:52:40 PM ba98eb12-700b-4d53-9230-33a3330571fc"
Can you please show me a way to view these logs? Is there any tool I can use?
Since these logs are stored in a blob container called $logs, any storage explorer which supports viewing data from this blob container can be used to view the contents. To the best of my knowledge following tools support viewing data from this container: Azure Storage Explorer, Cerebrata Azure Management Studio, Cloud Portam (Disclosure: I am the developer working on this tool).
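If you would rather check programmatically instead of installing a storage explorer, listing the $logs container takes only a few lines with the storage client library. A minimal sketch, with placeholder credentials as assumptions:

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;

public static class LogLister
{
    public static void ListLogBlobs()
    {
        var account = new CloudStorageAccount(
            new StorageCredentials("<account-name>", "<account-key>"), true);
        var container = account.CreateCloudBlobClient().GetContainerReference("$logs");
        // Flat listing walks the blob/table/queue/yyyy/MM/dd/hhmm folder hierarchy.
        foreach (IListBlobItem item in container.ListBlobs(null, true))
        {
            Console.WriteLine(item.Uri);
        }
    }
}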
However, before you can view the data you will need to enable logging on your storage account; only when logging is enabled will this container show up in your storage account. To enable logging, you can again use Azure Management Studio or Cloud Portam, or you can use the code below (which assumes you have the latest version of the Storage Client Library):
static void SetLoggingProperties()
{
    CloudStorageAccount account = new CloudStorageAccount(
        new StorageCredentials(StorageAccount, StorageAccountKey), true);

    // Log every operation and retain the logs for a year.
    LoggingProperties properties = new LoggingProperties()
    {
        LoggingOperations = LoggingOperations.All,
        RetentionDays = 365,
        Version = "1.0",
    };
    ServiceProperties serviceProperties = new ServiceProperties()
    {
        Cors = null,
        HourMetrics = null,
        MinuteMetrics = null,
        Logging = properties,
    };

    // Apply the same logging settings to the Blob, Table, and Queue services.
    var blobClient = account.CreateCloudBlobClient();
    blobClient.SetServiceProperties(serviceProperties);
    var tableClient = account.CreateCloudTableClient();
    tableClient.SetServiceProperties(serviceProperties);
    var queueClient = account.CreateCloudQueueClient();
    queueClient.SetServiceProperties(serviceProperties);
}
Once the logging properties are set, give it some time for the logs to show up.

Azure: Where do I find my role's logs when using Remote Desktop?

When I run my worker role locally, I can open the Windows Azure Compute Emulator application and look at the standard output and error of my worker process.
When I remote desktop into my Azure instance, I don't know where to get that same information. Where do I find standard output and error?
If you want to see the standard output and error of your worker process in an actual deployment, you will need some additional configuration, because this data must be stored in persistent storage.
The first step is to enable Diagnostics in the configuration window of your WorkerRole; a storage account must be specified here.
The next step is to add code to the OnStart() method of your WorkerRole. Here you can not only configure standard output and error, but also listen for Windows events and diagnostic information, as in the following code example.
public override bool OnStart()
{
    DiagnosticMonitorConfiguration diagConfig =
        DiagnosticMonitor.GetDefaultInitialConfiguration();

    // Windows event logs
    diagConfig.WindowsEventLog.DataSources.Add("System!*");
    diagConfig.WindowsEventLog.DataSources.Add("Application!*");
    diagConfig.WindowsEventLog.ScheduledTransferLogLevelFilter = LogLevel.Error;
    diagConfig.WindowsEventLog.ScheduledTransferPeriod = TimeSpan.FromMinutes(5);

    // Azure application logs
    diagConfig.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
    diagConfig.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(5);

    // Performance counters
    diagConfig.PerformanceCounters.DataSources.Add(
        new PerformanceCounterConfiguration()
        {
            SampleRate = TimeSpan.FromSeconds(5),
            CounterSpecifier = @"\Processor(*)\% Processor Time"
        });
    diagConfig.PerformanceCounters.ScheduledTransferPeriod = TimeSpan.FromMinutes(5);

    DiagnosticMonitor.Start(
        "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", diagConfig);

    return base.OnStart();
}
After these settings, your diagnostic data will be visible in the configured Azure Table storage. You can easily write tools to visualize the data, but there are also commercial tools with this functionality built in, for example Cerebrata Diagnostics Manager.
If for some reason you don't want to use Azure Storage for storing log files, you can implement a custom trace listener that writes logs anywhere else. Here is a description of how to do that. You could simply open an HTTP port and transfer them to your own server.
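To illustrate that idea, here is a rough sketch of such a custom trace listener that forwards each message to an HTTP endpoint; the endpoint URL and the synchronous, unbatched posting are simplifying assumptions:

using System.Diagnostics;
using System.Net;
using System.Text;

public class HttpTraceListener : TraceListener
{
    private readonly string endpoint;

    public HttpTraceListener(string endpoint)
    {
        this.endpoint = endpoint;
    }

    public override void Write(string message)
    {
        WriteLine(message);
    }

    public override void WriteLine(string message)
    {
        // Naive synchronous POST; a production listener should buffer and batch.
        using (var client = new WebClient())
        {
            client.UploadData(endpoint, Encoding.UTF8.GetBytes(message));
        }
    }
}

Register it once at startup, e.g. Trace.Listeners.Add(new HttpTraceListener("http://myserver:8080/logs")); and subsequent Trace.WriteLine calls will be forwarded.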
Trace messages are not stored anywhere in Windows Azure; instead, if you configure Azure Diagnostics properly, those messages are sent to Windows Azure Table storage (the WADLogsTable table), and you can get them from there.
If you want to know how to enable Azure Diagnostics for traces, visit the link below and look for the Windows Azure Diagnostics Demonstration code sample:
http://msdn.microsoft.com/en-us/library/windowsazure/hh411529.aspx
You can learn details about Azure Diagnostics here.
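To read the transferred traces back out of WADLogsTable programmatically, something along these lines works with the storage client library. This is a sketch only: the one-hour window and the "Message" column name follow common Azure Diagnostics conventions but should be verified against your table:

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

public static class DiagnosticsReader
{
    public static void DumpRecentTraces()
    {
        var account = CloudStorageAccount.Parse("<diagnostics-connection-string>");
        var table = account.CreateCloudTableClient().GetTableReference("WADLogsTable");

        // WADLogsTable partition keys are "0" + tick count, so a range filter selects by time.
        string filter = TableQuery.GenerateFilterCondition(
            "PartitionKey", QueryComparisons.GreaterThanOrEqual,
            "0" + DateTime.UtcNow.AddHours(-1).Ticks);

        var query = new TableQuery<DynamicTableEntity>().Where(filter);
        foreach (DynamicTableEntity entity in table.ExecuteQuery(query))
        {
            Console.WriteLine(entity.Properties["Message"].StringValue);
        }
    }
}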

Azure Worker Role Control Start Stop and Status

I'm working on my first, rather large project developing an Azure application with an integration component.
Currently most of the integration is done using SSIS packages, and I would like to transform them into worker roles in Azure.
Could someone please help me with the following questions regarding worker roles?
Is there a way to start or stop a worker role (just like SSIS or Windows schedulers) via a GUI? If not, how do I achieve this?
How do I know whether my worker role has been running or not (including why it's not running, i.e. logs)?
How do I spin up multiple worker roles based on time (e.g. from 9:00 AM to 11:00 AM spin up 4 roles, and scale down in quiet periods)?
Does the following code create any poison messages or deadlocks if there are, say, 10,000 messages to process and a new thread (PhotoProcessing.Run) is started every 5 seconds?
while (true)
{
    var thread = new Thread(PhotoProcessing.Run);
    thread.Start();
    Thread.Sleep(5000);
    Trace.WriteLine("Working", "Information");
}

public class PhotoProcessing
{
    public static void Run()
    {
        // Read from queue
        CloudQueueMessage msg = Storage.Queue.GetNextMessage();
        while (msg != null)
        {
            string[] message = msg.AsString.Split('$');
            if (message.Length == 2)
            {
                AddWatermark(message[0], message[1]);
            }
            // Message has been read so remove it
            Storage.Queue.DeleteMessage(msg);
            // Get next message if any
            msg = Storage.Queue.GetNextMessage();
        }
    }
}
Is there way to start or stop the Worker role (just like SSIS or Windows Schedulers) via GUI? If not how to achieve this?
There are actually many ways to achieve this. You can use the Windows Azure portal, use 3rd-party tools (like our Cloud Storage Studio), or write your own application using the Windows Azure Service Management API (http://msdn.microsoft.com/en-us/library/ee460799.aspx).
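For the roll-your-own route, stopping a deployment comes down to a single certificate-authenticated call to the Service Management API's UpdateDeploymentStatus operation. A rough sketch; the x-ms-version header value and the request shape should be double-checked against the API reference linked above:

using System.Net;
using System.Security.Cryptography.X509Certificates;
using System.Text;

public static class RoleController
{
    // Suspends (stops) a deployment; pass Status "Running" to start it again.
    public static void StopDeployment(string subscriptionId, string serviceName,
                                      string deploymentName, X509Certificate2 managementCert)
    {
        string uri = string.Format(
            "https://management.core.windows.net/{0}/services/hostedservices/{1}/deployments/{2}/?comp=status",
            subscriptionId, serviceName, deploymentName);

        var request = (HttpWebRequest)WebRequest.Create(uri);
        request.Method = "POST";
        request.Headers.Add("x-ms-version", "2012-03-01");
        request.ContentType = "application/xml";
        request.ClientCertificates.Add(managementCert);

        byte[] body = Encoding.UTF8.GetBytes(
            "<UpdateDeploymentStatus xmlns=\"http://schemas.microsoft.com/windowsazure\">" +
            "<Status>Suspended</Status></UpdateDeploymentStatus>");
        using (var stream = request.GetRequestStream())
        {
            stream.Write(body, 0, body.Length);
        }
        request.GetResponse().Close();
    }
}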
How do I know my worker role has been running or not running (including why it's not running ie. logs)
Again, you could use one of the GUI-based tools to see the status of your roles. As for why the roles are not running, you would need to enable Windows Azure Diagnostics in your worker role (http://msdn.microsoft.com/en-us/library/gg433048.aspx).
How do I spin multiple worker role based on time (i.e. (9:00AM to 11:00AM spin 4 roles and scale down on quiet period)
You can write your own application using the Windows Azure Service Management API to do so, or you can make use of 3rd-party tools like AzureWatch from Paraleap or the Azure Management Cmdlets (from Microsoft and our company, respectively). While the cmdlets will get the job done, I believe AzureWatch is the more sophisticated solution. We wrote a blog post on autoscaling some days back, which you can find here: http://www.cerebrata.com/Blog/post/Scale-your-Windows-Azure-instances-with-Azure-Management-Cmdlets.aspx
