Azure Storage services logs

I am a beginner in Azure and need some help. We are facing a bit of a problem with Azure Storage services and are unable to proceed.
The issue is described here:
http://blogs.msdn.com/b/windowsazurestorage/archive/2014/08/05/microsoft-azure-storage-service-version-removal.aspx
To summarize:
We have to inspect the log version of all of our blob, table, and queue requests in case any of them are using a version that is set for planned removal. I have enabled logging for the web application on the Azure portal. I am able to see the three service endpoints:
https://&lt;account&gt;.blob.core.windows.net
https://&lt;account&gt;.table.core.windows.net
https://&lt;account&gt;.queue.core.windows.net
Now, from the articles below, I gather that the logs have a format that includes the version, but the articles have NOT specified where to locate the logs or how to gather them. I have tried different things, such as browsing to https://&lt;account&gt;.blob.core.windows.net/$logs, but it makes no difference.
The required logs should be in this format (sample):
Here is a sample log entry, with the version used highlighted – in this case the request was an anonymous GetBlob request which implicitly used the 2009-09-19 version:
1.0;2011-08-09T18:52:40.9241789Z;GetBlob;AnonymousSuccess;200;18;10;anonymous;;myaccount;blob;"https://myaccount.blob.core.windows.net/thumbnails/lake.jpg?timeout=30000";"/myaccount/thumbnails/lake.jpg";a84aa705-8a85-48c5-b064-b43bd22979c3;0;123.100.2.10;2009-09-19;252;0;265;100;0;;;"0x8CE1B6EA95033D5";Friday, 09-Aug-11 18:52:40 GMT;;;;"8/9/2011 6:52:40 PM ba98eb12-700b-4d53-9230-33a3330571fc"
Can you please show me a way to view these logs? Is there a tool I can use?

Since these logs are stored in a blob container called $logs, any storage explorer that supports viewing data from this container can be used to view its contents. To the best of my knowledge, the following tools support viewing data from this container: Azure Storage Explorer, Cerebrata Azure Management Studio, and Cloud Portam (disclosure: I am the developer working on this tool).
However, before you can view the data you need to enable logging on your storage account; only when logging is enabled will this container show up in your storage account. To enable logging, you can again use Azure Management Studio or Cloud Portam, or you can use the code below (which assumes you have the latest version of the Storage Client Library):
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Shared.Protocol;

static void SetLoggingProperties()
{
    // StorageAccount and StorageAccountKey are assumed to hold your account name and key.
    CloudStorageAccount account = new CloudStorageAccount(new StorageCredentials(StorageAccount, StorageAccountKey), true);
    LoggingProperties properties = new LoggingProperties()
    {
        LoggingOperations = LoggingOperations.All, // log reads, writes, and deletes
        RetentionDays = 365,
        Version = "1.0",
    };
    ServiceProperties serviceProperties = new ServiceProperties()
    {
        // Only the logging settings are being updated here; CORS and metrics are left null.
        Cors = null,
        HourMetrics = null,
        MinuteMetrics = null,
        Logging = properties,
    };
    // Apply the logging settings to all three services.
    var blobClient = account.CreateCloudBlobClient();
    blobClient.SetServiceProperties(serviceProperties);
    var tableClient = account.CreateCloudTableClient();
    tableClient.SetServiceProperties(serviceProperties);
    var queueClient = account.CreateCloudQueueClient();
    queueClient.SetServiceProperties(serviceProperties);
}
Once logging properties are set, give it some time for logs to show up.
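If you'd rather pull the logs down programmatically instead of using one of these tools, you can list and download the blobs in the $logs container with the same Storage Client Library. Here's a minimal sketch, assuming the same StorageAccount and StorageAccountKey fields as above:
static void ListLogBlobs()
{
    var account = new CloudStorageAccount(new StorageCredentials(StorageAccount, StorageAccountKey), true);
    var blobClient = account.CreateCloudBlobClient();
    // $logs is a system container, so it won't appear in a normal container listing,
    // but it can be referenced directly by name.
    var logsContainer = blobClient.GetContainerReference("$logs");
    // A flat listing walks the virtual directory structure (e.g. blob/2014/08/05/1800/000000.log).
    foreach (var item in logsContainer.ListBlobs(null, true))
    {
        Console.WriteLine(item.Uri);
        // Each log blob contains semicolon-delimited entries like the sample above;
        // download one with ((CloudBlockBlob)item).DownloadText() and inspect the version field.
    }
}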

Related

Console application Logs to store in Azure storage

I am using a console application that runs on on-premises servers, triggered by a task scheduler. This console application performs various actions that need to be logged. It generates around 200 KB of logs per run, and it runs every hour.
Since the server is not accessible to us, I am planning to store the logs in Azure. I read about Blob/Table storage.
I would like to know the best strategy for storing the logs in Azure.
Thank you.
Though you can write logging data to Azure Storage (both Blobs and Tables), it would actually make more sense to use Azure Application Insights for this data.
I recently did the same for a console application I built. I found it incredibly simple.
I created an App Insights resource in my Azure subscription and got the instrumentation key. I then installed the App Insights SDK and referenced the appropriate namespaces in my project:
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;
This is how I initialized the telemetry client:
// Read the instrumentation key from app.config (System.Configuration).
var appSettingsReader = new AppSettingsReader();
var appInsightsInstrumentationKey = (string)appSettingsReader.GetValue("AppInsights.InstrumentationKey", typeof(string));
TelemetryConfiguration configuration = TelemetryConfiguration.CreateDefault();
configuration.InstrumentationKey = appInsightsInstrumentationKey;
// telemetryClient is a field declared elsewhere in the application;
// it picks up the instrumentation key from the configuration.
telemetryClient = new TelemetryClient(configuration);
For logging trace data, I simply did the following:
TraceTelemetry telemetry = new TraceTelemetry(message, SeverityLevel.Verbose);
telemetryClient.TrackTrace(telemetry);
For logging error data, I simply did the following:
catch (Exception excep)
{
    ExceptionTelemetry exceptionTelemetry = new ExceptionTelemetry(excep);
    telemetryClient.TrackException(exceptionTelemetry);
    telemetryClient.Flush();
    Task.Delay(5000).Wait(); // Wait for 5 seconds so the telemetry is flushed before the application terminates
}
Just keep one thing in mind: make sure you wait for some time (5 seconds is good enough) for the data to flush before terminating the application.
If you're still keen on writing logs to Azure Storage, then depending on the logging library you're using you will find suitable adapters that write directly to Azure Storage.
For example, there's an NLog target for Azure Tables: https://github.com/harouny/NLog.Extensions.AzureTableStorage (though this project is not actively maintained).
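If you do go the Azure Storage route without a logging library, one simple approach (a minimal sketch, not the adapter approach above) is to write each run's log lines to an append blob, one blob per day. This assumes the WindowsAzure.Storage package (5.x or later, since append blobs are a newer blob type), a hypothetical "StorageConnectionString" app setting, and a hypothetical container named "logs":
static void AppendLogLine(string line)
{
    // Requires a reference to System.Configuration for ConfigurationManager.
    var account = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["StorageConnectionString"]);
    var container = account.CreateCloudBlobClient().GetContainerReference("logs");
    container.CreateIfNotExists();
    // One append blob per day keeps blobs small and easy to browse.
    var blob = container.GetAppendBlobReference(DateTime.UtcNow.ToString("yyyy-MM-dd") + ".log");
    if (!blob.Exists())
    {
        blob.CreateOrReplace();
    }
    blob.AppendText(line + Environment.NewLine);
}
At roughly 200 KB per hourly run, a day's worth of appends stays well below an append blob's 50,000-block limit.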

View SharedAccessBlobPolicy created programatically - in Azure portal

I'm creating a container and then a Shared Access Signature for that container in code, like so:
SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy()
{
    Permissions = SharedAccessBlobPermissions.Write,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(36)
};
var sas = container.GetSharedAccessSignature(policy, $@"{id}-{DateTime.Now}");
That works fine.
However, when I go into the Azure portal, I can't see a list of the policies that have been created.
Does anyone know if this is possible and, if so, where/how?
The Azure Portal offers very limited functionality for managing storage accounts. As of today, this capability doesn't exist there.
What you can do is use any storage explorer available in the market (including Microsoft's own Storage Explorer - http://storageexplorer.com) and view the access policies there.
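You can also enumerate a container's stored access policies in code via GetPermissions(). Here's a minimal sketch, assuming the same container variable as in your question (note that GetSharedAccessSignature by itself does not persist a policy on the container; only policies added through SetPermissions are stored server-side):
var permissions = container.GetPermissions();
foreach (var policy in permissions.SharedAccessPolicies)
{
    // Key is the policy identifier; Value holds the permissions and expiry.
    Console.WriteLine("{0}: Permissions={1}, Expiry={2}",
        policy.Key, policy.Value.Permissions, policy.Value.SharedAccessExpiryTime);
}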

Azure Storage Account Metrics only visible for Classic Storage Account

I've tested creating both a classic storage account (manage.windowsazure.com) and a "new" storage account in the new Azure Portal. I set them up similarly and ran the same code to create and configure a queue, but metrics only show up for the classic storage account in the portal (I am able to see both accounts in the new portal).
I have set up the ServiceProperties like this, and I can successfully see these changes saved when fetching the service properties or looking in the Azure Portal.
CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);
CloudQueueClient queueClient = storageAccount.createCloudQueueClient();

MetricsProperties metricsProperties = new MetricsProperties();
metricsProperties.setMetricsLevel(MetricsLevel.SERVICE_AND_API);
metricsProperties.setRetentionIntervalInDays(2);

LoggingProperties loggingProperties = new LoggingProperties();
loggingProperties.setRetentionIntervalInDays(10);
loggingProperties.setLogOperationTypes(EnumSet.of(LoggingOperations.READ, LoggingOperations.WRITE, LoggingOperations.DELETE));

ServiceProperties serviceProperties = new ServiceProperties();
serviceProperties.setHourMetrics(metricsProperties);
serviceProperties.setMinuteMetrics(metricsProperties);
serviceProperties.setLogging(loggingProperties);

queueClient.uploadServiceProperties(serviceProperties);
When I use Microsoft Azure Storage Explorer, both accounts have the tables for metrics and logging set up, so both look like this, and the tables contain data.
So from here they look similar. But the metrics graphs and options are only available for the classic storage account in the Azure Portal. For the "new" storage account it only says "No available data".
Is it a bug? Or is a classic storage account configured by default with some properties that I need to apply manually to the new storage account to make it behave similarly?
Screenshot from Microsoft Azure Storage Explorer
Following your code, I leveraged WindowsAzure.Storage (version 7.2.1) to configure storage account metrics on both a classic storage account and a new storage account, as follows:
var blobClient = storageAccount.CreateCloudBlobClient();

MetricsProperties metricsProperties = new MetricsProperties();
metricsProperties.MetricsLevel = MetricsLevel.ServiceAndApi;
metricsProperties.RetentionDays = 2;

LoggingProperties loggingProperties = new LoggingProperties();
loggingProperties.RetentionDays = 10;
loggingProperties.LoggingOperations = LoggingOperations.Read | LoggingOperations.Write | LoggingOperations.Delete;

ServiceProperties serviceProperties = new ServiceProperties();
serviceProperties.HourMetrics = metricsProperties;
serviceProperties.MinuteMetrics = metricsProperties;
serviceProperties.Logging = loggingProperties;

blobClient.SetServiceProperties(serviceProperties);
With this code snippet, you can configure the minute/hour metrics for your Blob storage.
Since you have confirmed that the related tables contain metric records, you could try to log into the Azure Portal, choose your storage account, click QUEUE SERVICE > Metrics, click Edit chart, and change the Time Range as follows:
Note: The time range is set to "today" by default if there are any metric records. There could be data latency, so try specifying the time range explicitly and check whether you can retrieve the metrics data you expect.
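If you want to double-check the raw metrics outside the portal, the metrics tables are regular Azure tables and can be queried in code. Here's a minimal sketch, assuming the same WindowsAzure.Storage package; $MetricsHourPrimaryTransactionsQueue is the table that backs the hourly queue metrics:
var tableClient = storageAccount.CreateCloudTableClient();
// Metrics tables don't show up in a table listing, but they can be referenced by name.
var metricsTable = tableClient.GetTableReference("$MetricsHourPrimaryTransactionsQueue");
var query = new TableQuery<DynamicTableEntity>().Take(20);
foreach (var entity in metricsTable.ExecuteQuery(query))
{
    // PartitionKey is the hour bucket (yyyyMMddTHH00); RowKey identifies the API being aggregated.
    Console.WriteLine("{0} / {1}", entity.PartitionKey, entity.RowKey);
}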

Tool or usage example to generate and view SAS (Shared Access Signatures) of both Azure Block Blob and Azure File Share

I am looking for a tool or usage example to generate and view SAS (Shared Access Signatures) for both Azure Block Blobs and Azure File Shares. There are lots of examples for block blobs and containers, but what about Azure File Share SAS examples or tools?
The ability to create a Shared Access Signature on a File Service share was announced in the latest version of the REST API. You must use Storage Client Library 5.0.0 for that purpose.
First, install this library from NuGet:
Install-Package WindowsAzure.Storage -Version 5.0.0
Then the process of creating a SAS on a File Service share is very similar to creating a SAS on a blob container. Please see the sample code below:
static void FileShareSas()
{
    // accountName and accountKey are assumed to hold your storage account credentials.
    var account = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
    var fileClient = account.CreateCloudFileClient();
    var share = fileClient.GetShareReference("share");
    var sasToken = share.GetSharedAccessSignature(new Microsoft.WindowsAzure.Storage.File.SharedAccessFilePolicy()
    {
        Permissions = Microsoft.WindowsAzure.Storage.File.SharedAccessFilePermissions.List,
        SharedAccessExpiryTime = new DateTimeOffset(DateTime.UtcNow.AddDays(1))
    });
}
In the above code, we're creating a SAS with List permission that will expire one day from the current date/time (in UTC).
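The returned sasToken is just a query string starting with "?", so to use it you append it to the share's (or a file's) URI. For example:
var shareUriWithSas = share.Uri.AbsoluteUri + sasToken;
Console.WriteLine(shareUriWithSas);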
Also, if you're looking for a tool to do so, may I suggest you take a look at Cloud Portam (disclosure: I am building this tool). We recently released the functionality to manage SAS on a share.

Azure: Where do I find my role's logs when using Remote Desktop?

When I run my worker role locally, I can open the Windows Azure Compute Emulator application and look at the standard output and error of my worker process.
When I remote desktop into my Azure instance, I don't know where to get that same information. Where do I find standard output and error?
If you want to see the standard output and error of your worker process in an actual deployment, you will need to do some additional configuration. This data must be stored in persistent storage.
The first step is to enable Diagnostics in the configuration window of your WorkerRole; a storage account must be specified here.
The next step is to add additional code to the OnStart() method of your WorkerRole. Here you can configure not only the standard output and error, but also listen to Windows events and diagnostic information, as shown in the following code example.
public override bool OnStart()
{
    DiagnosticMonitorConfiguration diagConfig =
        DiagnosticMonitor.GetDefaultInitialConfiguration();

    // Windows event logs
    diagConfig.WindowsEventLog.DataSources.Add("System!*");
    diagConfig.WindowsEventLog.DataSources.Add("Application!*");
    diagConfig.WindowsEventLog.ScheduledTransferLogLevelFilter = LogLevel.Error;
    diagConfig.WindowsEventLog.ScheduledTransferPeriod = TimeSpan.FromMinutes(5);

    // Azure application logs
    diagConfig.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
    diagConfig.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(5);

    // Performance counters
    diagConfig.PerformanceCounters.DataSources.Add(
        new PerformanceCounterConfiguration()
        {
            SampleRate = TimeSpan.FromSeconds(5),
            CounterSpecifier = @"\Processor(*)\% Processor Time"
        });
    diagConfig.PerformanceCounters.ScheduledTransferPeriod =
        TimeSpan.FromMinutes(5);

    DiagnosticMonitor.Start(
        "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", diagConfig);

    return base.OnStart();
}
After these settings, your diagnostic data will be visible in the configured Azure Table storage. You can easily write tools to visualize the data there, but there are also commercial tools with built-in functionality for this, for example Cerebrata Diagnostics Manager.
If for some reason you don't want to use Azure Storage for storing log files, you can implement a custom trace listener that writes logs anywhere else; there are descriptions available of how to do that. You could, for example, simply open an HTTP port and transfer the logs to your own server.
Trace messages are not stored anywhere in Windows Azure; instead, if you configure Azure Diagnostics properly, those messages are sent to Windows Azure Table storage (the WADLogsTable table), and you can retrieve them from there.
If you want to know how to enable Azure Diagnostics for traces, visit the link below and look for the Windows Azure Diagnostics Demonstration code sample:
http://msdn.microsoft.com/en-us/library/windowsazure/hh411529.aspx
You can learn more details about Azure Diagnostics there.
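Since WADLogsTable is an ordinary Azure table, you can also read the transferred trace messages in code. Here's a minimal sketch, assuming the WindowsAzure.Storage table client and a hypothetical diagnosticsConnectionString variable pointing at the diagnostics storage account; WADLogsTable partition keys are "0"-prefixed tick counts, which makes time-range filters cheap:
var account = CloudStorageAccount.Parse(diagnosticsConnectionString);
var table = account.CreateCloudTableClient().GetTableReference("WADLogsTable");
// Fetch everything transferred in roughly the last hour.
string fromPartition = "0" + DateTime.UtcNow.AddHours(-1).Ticks;
var query = new TableQuery<DynamicTableEntity>().Where(
    TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.GreaterThanOrEqual, fromPartition));
foreach (var entity in table.ExecuteQuery(query))
{
    Console.WriteLine(entity.Properties["Message"].StringValue);
}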
