Where does Azure keep non-VM logs? Can they be downloaded programmatically?

Azure keeps a bunch of VM (and cloud service) related logs in WAD* tables. This question is about actions which do not necessarily affect VMs. Say someone deleted a Table Storage table. Does Azure keep a log record of that? If yes, where? And how can it be fetched using a program/script?

The Service Management REST API can be used to retrieve the operation logs programmatically.
List Subscription Operations
https://msdn.microsoft.com/library/azure/gg715318.aspx
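For example, here is a minimal C# sketch of calling List Subscription Operations with HttpClient. It assumes a classic management certificate is already uploaded to the subscription; the subscription ID, certificate thumbprint, and the exact x-ms-version value below are placeholders/assumptions to verify against the MSDN page above:

```csharp
using System;
using System.Net.Http;
using System.Security.Cryptography.X509Certificates;
using System.Threading.Tasks;

class OperationLogFetcher
{
    static async Task Main()
    {
        // Placeholder values -- substitute your own subscription ID and the
        // thumbprint of a management certificate uploaded to the subscription.
        const string subscriptionId = "<subscription-id>";
        const string certThumbprint = "<management-cert-thumbprint>";

        // Load the management certificate from the current user's store.
        using var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
        store.Open(OpenFlags.ReadOnly);
        var cert = store.Certificates
            .Find(X509FindType.FindByThumbprint, certThumbprint, validOnly: false)[0];

        var handler = new HttpClientHandler();
        handler.ClientCertificates.Add(cert);
        using var client = new HttpClient(handler);

        // List Subscription Operations takes a StartTime/EndTime window.
        string start = DateTime.UtcNow.AddDays(-7).ToString("o");
        string end = DateTime.UtcNow.ToString("o");
        string url = $"https://management.core.windows.net/{subscriptionId}" +
                     $"/operations?StartTime={start}&EndTime={end}";

        var request = new HttpRequestMessage(HttpMethod.Get, url);
        // x-ms-version value assumed; check the docs for the version you need.
        request.Headers.Add("x-ms-version", "2014-06-01");

        var response = await client.SendAsync(request);
        response.EnsureSuccessStatusCode();
        // The response is XML listing each management operation
        // (including deletes) with its name, caller, and status.
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}
```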

Related

Where can I find Azure logs for tracking API responses?

I want to view logs in Azure - I mean the logs that I see in the console when running on localhost. Where can I find them for a web site deployed to Azure? I am consuming an external API and I want to see what I send to and what I receive from the prod environment.
Thank you.
There are a couple of ways to track logs in Azure:
Azure API Management
Azure Monitor
Azure API Management helps you track all kinds of requests, including:
View activity logs
View resource logs
View metrics of your API
Set up an alert rule when your API gets unauthorized calls
Azure Monitor, on the other hand, makes it possible to programmatically retrieve the available default metric definitions, granularity, and metric values.
The data can be saved in a separate data store such as Azure SQL Database, Azure Cosmos DB, or Azure Data Lake, and additional analysis can be performed from there as needed.
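As a rough illustration, here is a hedged C# sketch that calls the Azure Monitor REST API directly. The resource ID, metric name, and bearer token are placeholders you would supply yourself (e.g. a token from MSAL or `az account get-access-token`):

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class MetricsFetcher
{
    static async Task Main()
    {
        // Placeholder values -- substitute a real resource ID and an
        // Azure AD access token for https://management.azure.com.
        const string resourceId = "/subscriptions/<sub-id>/resourceGroups/<rg>" +
                                  "/providers/Microsoft.Web/sites/<app-name>";
        const string accessToken = "<aad-access-token>";

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", accessToken);

        // List the metric definitions available for the resource.
        string url = $"https://management.azure.com{resourceId}" +
                     "/providers/microsoft.insights/metricDefinitions?api-version=2018-01-01";
        Console.WriteLine(await client.GetStringAsync(url));

        // Then query actual metric values, e.g. request counts at 1-minute grain
        // ("Requests" is a metric name assumed here for an App Service resource).
        url = $"https://management.azure.com{resourceId}" +
              "/providers/microsoft.insights/metrics" +
              "?metricnames=Requests&interval=PT1M&api-version=2018-01-01";
        Console.WriteLine(await client.GetStringAsync(url));
    }
}
```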

Azure Service Fabric backups (non-persistent data)

I have 3 applications deployed to Azure Service Fabric via ARM template. The only items that have been identified as needing to be backed up are some resources. They include a central blob storage, about 5 SQL databases, and the key vault. The cluster and apps can be redeployed right away via the template.
Searching for backup solutions, I'm seeing a lot of info on backups for services, but not for specific resources like the ones I have here. Can anyone point me in the right direction or to sample code on how to do this, or is it even an option?
OK, so storage cannot be backed up using built-in Azure services; you have to create a program/script that will do that for you.
For the Key Vault you can use the backup PowerShell cmdlet.
For SQL there are a bunch of ways to do that, but perhaps you can settle for the built-in backup, which happens automatically and goes back 7-35 days (depending on your tier).
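For the blob storage part, here is a minimal sketch of such a script using the Azure.Storage.Blobs SDK. It starts a server-side copy of every blob into a second storage account; the connection strings are placeholders, and private containers would additionally need a SAS appended to the source URI:

```csharp
using System;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

class BlobBackup
{
    static async Task Main()
    {
        // Placeholder connection strings for the source and backup accounts.
        var source = new BlobServiceClient("<source-connection-string>");
        var target = new BlobServiceClient("<backup-connection-string>");

        await foreach (var container in source.GetBlobContainersAsync())
        {
            var srcContainer = source.GetBlobContainerClient(container.Name);
            var dstContainer = target.GetBlobContainerClient(container.Name);
            await dstContainer.CreateIfNotExistsAsync();

            await foreach (var blob in srcContainer.GetBlobsAsync())
            {
                var srcBlob = srcContainer.GetBlobClient(blob.Name);
                var dstBlob = dstContainer.GetBlobClient(blob.Name);
                // StartCopyFromUri performs a server-side copy; for private
                // containers the source URI needs a SAS token appended.
                await dstBlob.StartCopyFromUriAsync(srcBlob.Uri);
            }
        }
        Console.WriteLine("Copy operations started.");
    }
}
```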

Notify email when Azure Storage table gets new entry

Is there a built-in way to send a notification email when a new entry is added to the table?
I am not asking for anything programmatic - just something within their own UI.
Not currently, but you could put the entry on an Azure Storage queue, process it into Table Storage, and send an email with Azure Functions.
Check out this page for what is possible: https://learn.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings
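For illustration, a hedged C# sketch of that pipeline - a queue-triggered function with a Table Storage output binding and the SendGrid output binding. The queue/table names, email addresses, and the NewEntry type are all made up for the example:

```csharp
using Microsoft.Azure.WebJobs;
using SendGrid.Helpers.Mail;

// Hypothetical message shape; PartitionKey/RowKey let the POCO bind to Table Storage.
public class NewEntry
{
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    public string Payload { get; set; }
}

public static class NotifyOnNewEntry
{
    [FunctionName("NotifyOnNewEntry")]
    public static void Run(
        [QueueTrigger("new-entries")] NewEntry entry,  // fires per queue message
        [Table("Entries")] out NewEntry tableRow,      // writes the entity to Table Storage
        [SendGrid] out SendGridMessage message)        // sends the notification email
    {
        tableRow = entry;

        // The SendGrid binding reads its API key from the
        // AzureWebJobsSendGridApiKey app setting.
        message = new SendGridMessage();
        message.AddTo("ops@example.com");
        message.SetFrom(new EmailAddress("noreply@example.com"));
        message.SetSubject($"New table entry: {entry.RowKey}");
        message.AddContent("text/plain",
            $"Entity {entry.PartitionKey}/{entry.RowKey} was added.");
    }
}
```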
Okay, so the easiest way to see the data is to use their desktop app, Azure Storage Explorer:
https://azure.microsoft.com/en-us/features/storage-explorer/
The latest Azure product updates covering event-driven applications suggest adopting these application patterns to react to events published by Azure Storage. These docs/resources might help you explore the application pattern and current platform/framework support.
Azure Storage
Reacting to Blob storage events (preview)
I did not find anything similar for Azure Tables - what about suggesting this on UserVoice?
CosmosDB
Working with the change feed support in Azure Cosmos DB

Determine SQL Azure Region without using Admin Console

I am working on a solution that uses SQL Azure. Part of the project deals with backups and using the DAC Web Services for backups.
The issue is that there is a different endpoint depending on which region the Azure SQL database is in. As I am working with multiple groups, and cannot ensure which region the database will be in, I am looking for a way to programmatically determine the region.
The region is also important, as I want to copy the backups to a different region just to be on the safe side.
I know that I can look in the Admin console, but I would like to use code to solve this problem.
Additional information:
The application is running on Azure using Worker Roles for functionality.
I do not have access to all of the account-id's to use the full REST API.
I do have access to the master database on the Azure Sql Server.
Working on this in C# (I forgot to mention the language)
You can use the Get Servers request (GET https://management.database.windows.net:8443/<subscription-id>/servers) of the Azure REST API to enumerate SQL servers; the response includes the Location (region) of each server. More info on MSDN: http://msdn.microsoft.com/en-us/library/windowsazure/gg715269.aspx
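A C# sketch of that call - authentication is the same management-certificate pattern as the Service Management API, and the x-ms-version value and XML namespace below are assumptions to verify against the linked MSDN page:

```csharp
using System;
using System.Net.Http;
using System.Security.Cryptography.X509Certificates;
using System.Threading.Tasks;
using System.Xml.Linq;

class SqlServerRegionLookup
{
    static async Task Main()
    {
        // Placeholder values; auth uses the subscription's management certificate.
        const string subscriptionId = "<subscription-id>";
        const string certThumbprint = "<management-cert-thumbprint>";

        using var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
        store.Open(OpenFlags.ReadOnly);
        var cert = store.Certificates
            .Find(X509FindType.FindByThumbprint, certThumbprint, validOnly: false)[0];

        var handler = new HttpClientHandler();
        handler.ClientCertificates.Add(cert);
        using var client = new HttpClient(handler);

        var request = new HttpRequestMessage(HttpMethod.Get,
            $"https://management.database.windows.net:8443/{subscriptionId}/servers");
        // x-ms-version value assumed; see the MSDN page for the exact version.
        request.Headers.Add("x-ms-version", "1.0");

        var response = await client.SendAsync(request);
        response.EnsureSuccessStatusCode();

        // The response is XML; each <Server> element carries Name and Location.
        // The namespace below is assumed from older SQL Azure management samples.
        var doc = XDocument.Parse(await response.Content.ReadAsStringAsync());
        XNamespace ns = "http://schemas.microsoft.com/sqlazure/2010/12/";
        foreach (var server in doc.Descendants(ns + "Server"))
            Console.WriteLine($"{server.Element(ns + "Name")?.Value}: " +
                              $"{server.Element(ns + "Location")?.Value}");
    }
}
```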

Why do we link an Azure storage account to a cloud service?

Why do we link an Azure storage account to a cloud service? How does it help? What happens if I do not link them?
Two reasons:
Easier management - you have a better idea of your overall configuration for a particular deployment
Easier management - upon deleting a resource, you are asked whether you want to delete the linked resources as well
By the way, you can also link a Windows Azure SQL Database to a Cloud Service.
The whole idea is to help you better manage the services. There is no other reason, and nothing will happen if you do not link them. But think a bit: if you manage 3 subscriptions with 2 cloud service deployments each and 2 storage accounts per deployment, that is 6 cloud services and 12 storage accounts. Can you easily tell which service is using which account?
The cloud service depends on the storage account. When deploying the cloud service, it will create a container called vsdeploy with a block blob that is used for the VMs it creates.
It also stores crash dump files under the container wad-crashdumps. The folder structure is WAD/{GUID}/{worker role}/{instance}, and all the .dmp files are stored there as block blobs.
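For example, a small C# sketch (Azure.Storage.Blobs, placeholder connection string) that lists the crash-dump blobs in that container:

```csharp
using System;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

class CrashDumpLister
{
    static async Task Main()
    {
        // Enumerate the crash-dump blobs a cloud service wrote to the
        // linked storage account (container name from the answer above).
        var container = new BlobContainerClient("<connection-string>", "wad-crashdumps");
        await foreach (var blob in container.GetBlobsAsync())
            Console.WriteLine(blob.Name); // e.g. WAD/{GUID}/{worker role}/{instance}/x.dmp
    }
}
```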
