So I'm building an application in Azure and I'm using Azure Log Analytics, and I'm trying to find a good way for people on my team who don't have access to Azure but still need to be able to read the logs. Does anyone have simple, fast ways to create something like this? Good technologies, good ways to give people access to it?
Is using Power BI to ingest your Log Analytics queries an option?
The caveat here is that you'd need to redo any existing charts and graphs; however, Power BI offers a lot of functionality as well as opportunities to join with other data sets.
In your scenario, the trick would be using service account credentials when publishing the dataset.
You could try the Azure Log Analytics REST API.
Then you can hand the authentication to the end users (it only authenticates to Log Analytics, not all of Azure) and let them write queries to fetch the logs; or you can write a middleware service that processes the query requests from end users. A sketch of the direct API route follows.
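For illustration, here is a minimal sketch of calling the Log Analytics Query API from Python. The tenant/client IDs, workspace ID, and KQL query are placeholders you would replace with your own; it assumes an AAD app registration that has been granted read access to the workspace.

```python
import requests
from azure.identity import ClientSecretCredential

# Placeholders -- substitute your own AAD app registration and workspace.
TENANT_ID = "<tenant-id>"
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"
WORKSPACE_ID = "<log-analytics-workspace-id>"

# Acquire a token scoped to the Log Analytics API only, so the
# credential grants no access to the rest of Azure.
credential = ClientSecretCredential(TENANT_ID, CLIENT_ID, CLIENT_SECRET)
token = credential.get_token("https://api.loganalytics.io/.default").token

# Run a KQL query against the workspace via the Query API.
response = requests.post(
    f"https://api.loganalytics.io/v1/workspaces/{WORKSPACE_ID}/query",
    headers={"Authorization": f"Bearer {token}"},
    json={"query": "AppTraces | take 10"},  # placeholder query
    timeout=30,
)
response.raise_for_status()
for table in response.json()["tables"]:
    print(table["name"], len(table["rows"]), "rows")
```

A middleware service would wrap the same call and accept queries from users who have no Azure identity at all.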
So there are a few ways to do this:
You can use the Log Analytics API to build a home-grown log portal
There are multiple SaaS options out there:
DataDog
Splunk
AppDynamics
Power BI
Not specifically logs, but Prometheus and Grafana for metrics and alerts, and it's dirt cheap compared to App Insights
Related
I need to build a dashboard which will visualize the usage and cost of many Azure subscriptions, accounts, and departments.
My plan was:
Send the data that is 'behind' the Azure Cost Analysis view to the Log Analytics workspace.
In the Log Analytics workspace, perform custom aggregations / filters.
Display those aggregations as charts in Azure Metrics or directly in Azure Dashboard.
The problem is with step 1: I don't know how to send the data that is 'behind' the Azure Cost Analysis view to the Log Analytics workspace.
I thought of two solutions:
Fetching the data from the Azure Cost & Billing API.
Scheduling an export of cost analysis data to a storage account, and then somehow moving the data from the storage account to the Log Analytics workspace.
Both solutions seem a bit like overkill to me. Is there a more direct approach for sending the cost analysis data to the Log Analytics workspace?
If there is no such option, I would be happy to know how you would suggest moving the exported data from the storage account to Log Analytics, or whether you have some other idea.
Thank you!
The only native solution is to schedule an export of the costs as CSV into a storage account from the Costs blade. If you want to load the data into a Log Analytics workspace, Azure Automation and a scheduled script would work; a sketch of such a script follows.
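Purely as an illustration of what that scheduled script could look like, here is a sketch that pulls the exported CSV out of the storage account; the account URL, container, and blob name are placeholders.

```python
import csv
import io

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# Placeholders -- point these at the storage account the cost export writes to.
ACCOUNT_URL = "https://<storage-account>.blob.core.windows.net"
CONTAINER = "cost-exports"
BLOB_NAME = "<path-to-latest-export>.csv"

service = BlobServiceClient(ACCOUNT_URL, credential=DefaultAzureCredential())
blob = service.get_blob_client(CONTAINER, BLOB_NAME)

# Download and parse the scheduled cost export.
data = blob.download_blob().readall().decode("utf-8")
rows = list(csv.DictReader(io.StringIO(data)))
print(f"Read {len(rows)} cost rows")
# From here, forward the rows to the workspace with the HTTP Data
# Collector API -- see the sketch in the next answer below.
```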
I believe a direct approach is currently not available, but I see a feature request raised in the UserVoice / feedback forum for this same requirement. If interested, you may upvote it: in general, the responsible Azure product/feature team triages received feedback and prioritizes it based on various factors, like the number of votes a feedback item receives, feasibility, open prioritized backlog items, etc.
I would suggest you fetch the data from the Azure Cost & Billing API and send it to Log Analytics from a REST client by using the HTTP Data Collector API; for more information and worked examples, refer to this Azure document. Or, if you want to fetch the data from the Azure Cost & Billing API and store it on a machine, you may go with custom logs; for more information on that, refer to this Azure document. A sketch of the Data Collector route follows.
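To make the Data Collector route concrete, here is a minimal sketch of posting JSON records to a workspace. The workspace ID and shared key are placeholders, and the record shape at the bottom is invented purely for illustration.

```python
import base64
import hashlib
import hmac
import json
from datetime import datetime, timezone

import requests

# Placeholders -- from the workspace's agent management page.
WORKSPACE_ID = "<workspace-id>"
SHARED_KEY = "<primary-or-secondary-key>"
LOG_TYPE = "CostAnalysis"  # records land in the CostAnalysis_CL custom table

def post_records(records):
    body = json.dumps(records).encode("utf-8")
    rfc1123 = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
    # Build the SharedKey signature the Data Collector API requires.
    string_to_sign = (
        f"POST\n{len(body)}\napplication/json\nx-ms-date:{rfc1123}\n/api/logs"
    )
    signature = base64.b64encode(
        hmac.new(
            base64.b64decode(SHARED_KEY),
            string_to_sign.encode("utf-8"),
            hashlib.sha256,
        ).digest()
    ).decode()
    response = requests.post(
        f"https://{WORKSPACE_ID}.ods.opinsights.azure.com/api/logs"
        "?api-version=2016-04-01",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"SharedKey {WORKSPACE_ID}:{signature}",
            "Log-Type": LOG_TYPE,
            "x-ms-date": rfc1123,
        },
        timeout=30,
    )
    response.raise_for_status()

# Hypothetical record shape, for illustration only.
post_records([{"SubscriptionName": "prod", "Cost": 42.5, "Currency": "USD"}])
```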
Other related references:
Use cost alerts to monitor usage and spending
Supported metrics with Azure Monitor
We run a software application on Azure for one of our customers. The customer wants to see the performance of the systems. This consists of two parts: the metric information of the servers, and some information I want to provide through custom logging.
My plan is to give the customer access to the portal and only allow him access to the metric information and the custom tables.
It seems to me that by assigning a role to the customer I should be able to block all the other possibilities.
Can someone tell me which actions I have to allow/forbid to achieve this? Or where can I find the information for this?
Solution #1
Instead of giving read access to the virtual machine, which may break security policy, I'd recommend going with an Azure Log Analytics workspace (ref: https://learn.microsoft.com/en-us/azure/log-analytics/log-analytics-overview). That said, you will need to create a workspace which collects and stores server metrics (ref: https://learn.microsoft.com/en-us/azure/log-analytics/log-analytics-quick-collect-windows-computer) and other custom metrics.
Your customer will be given access to the workspace only, in which he can see all metrics in a dashboard. If there is a need for log filtering, you can use the Log Analytics query language (ref: https://learn.microsoft.com/en-us/azure/log-analytics/log-analytics-log-search-transition)
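One way to grant that workspace-only access is a role assignment scoped to the workspace resource. Here is a sketch using the Python management SDK; it assumes the built-in Log Analytics Reader role, and the subscription, resource group, workspace, and customer object IDs are placeholders.

```python
import uuid

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient

# Placeholders -- your subscription, workspace, and the customer's
# Azure AD object ID.
SUBSCRIPTION_ID = "<subscription-id>"
WORKSPACE_SCOPE = (
    f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/<rg>"
    "/providers/Microsoft.OperationalInsights/workspaces/<workspace>"
)
CUSTOMER_OBJECT_ID = "<customer-aad-object-id>"
# ID of the built-in "Log Analytics Reader" role.
ROLE_DEFINITION_ID = (
    f"/subscriptions/{SUBSCRIPTION_ID}/providers/Microsoft.Authorization"
    "/roleDefinitions/73c42c96-874c-492b-b04d-ab87d138a893"
)

client = AuthorizationManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
# Assign the role at the workspace scope only, so the customer
# sees nothing else in the subscription.
client.role_assignments.create(
    scope=WORKSPACE_SCOPE,
    role_assignment_name=str(uuid.uuid4()),
    parameters={
        "properties": {
            "role_definition_id": ROLE_DEFINITION_ID,
            "principal_id": CUSTOMER_OBJECT_ID,
        }
    },
)
```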
Log Analytics is a paid service; you get up to 10 free-tier workspaces per subscription. A workspace is an Azure resource, so the usual resource limits apply: you can create up to 800 workspaces per resource group, so a subscription can allow 800 * 800 in total (for reference, if you would like to do capacity planning for your workspace-based solution). For Log Analytics pricing, read here (https://azure.microsoft.com/en-us/pricing/details/log-analytics/).
Log Analytics is a good choice, as its value proposition is to offer your customer an intuitive dashboard to monitor their virtual machine performance, with near-real-time monitoring. This solution is also cloud native.
There is also a management solution which offers a bundle of VM capacity and performance monitoring which you can try now: https://learn.microsoft.com/en-us/azure/log-analytics/log-analytics-capacity
Solution #2
Log Analytics might not be your choice, because it adds more Azure service and operational cost. If you need a cheaper option, you can collect your virtual machine metrics with Performance Counters, a built-in feature of Windows. With Performance Counters you can export to an Excel file, or visualize in Power BI or some custom chart; a scripted example follows.
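For example, a small script can sample counters with the built-in typeperf tool and write them to a CSV that Excel or Power BI can open; the counter paths, interval, and sample count below are just examples.

```python
import subprocess

# Sample CPU and available memory once per second, 60 samples,
# writing a CSV for Excel or Power BI (typeperf ships with Windows).
subprocess.run(
    [
        "typeperf",
        r"\Processor(_Total)\% Processor Time",
        r"\Memory\Available MBytes",
        "-si", "1",    # sample interval in seconds
        "-sc", "60",   # number of samples to take
        "-o", "perf.csv",
        "-y",          # overwrite the output file without prompting
    ],
    check=True,
)
```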
Other Solutions
You can utilize Azure Monitor and its API to get the data, for example this API: https://learn.microsoft.com/en-us/rest/api/monitor/metricdefinitions/list. You would certainly need to visualize or format it in some intuitive way to satisfy your customer; that could be a custom front-end web app, Power BI, or even Excel with a chart.
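A minimal sketch of calling that Metric Definitions API, with the target resource ID as a placeholder (the api-version shown is one I believe the endpoint accepts):

```python
import requests
from azure.identity import DefaultAzureCredential

# Placeholder -- the full ARM ID of the resource whose metrics you want.
RESOURCE_URI = (
    "/subscriptions/<sub-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Compute/virtualMachines/<vm-name>"
)

token = DefaultAzureCredential().get_token(
    "https://management.azure.com/.default"
).token

# List the metric definitions available on the resource.
response = requests.get(
    f"https://management.azure.com{RESOURCE_URI}"
    "/providers/Microsoft.Insights/metricDefinitions",
    params={"api-version": "2018-01-01"},
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
response.raise_for_status()
for definition in response.json()["value"]:
    print(definition["name"]["value"], definition.get("unit"))
```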
You can also query Azure Blob Storage and use Stream Analytics combined with Power BI to visualize your data (https://thuansoldier.net/?p=7187).
There is no single solution. It really depends on your existing resource capacity, financial constraints, and so on.
Currently I am logging my custom log messages to an Azure Table.
Now I need to automatically trigger the sending of emails based on log types and also need to generate an analysis report from the log messages.
Which service is more suitable to get this done? Azure Application Insights or Azure Log Analytics?
I think Application Insights will fit both: creating reports as well as sending out emails. You can do the same with Log Analytics, but the difference is that Log Analytics is basically a logical store of all your log data, and you can create custom reports, alerts, etc. across many different services; everything can also be nicely visualized in OMS.
As said in the comments, though, you need to describe your scenario a bit more.
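As one hedged example of the reporting side, the Application Insights REST API can be queried to build an analysis report; the app ID, API key, and query below are placeholders.

```python
import requests

# Placeholders -- from the Application Insights API Access blade.
APP_ID = "<application-insights-app-id>"
API_KEY = "<api-key>"

# Count traces by severity over the last day -- a starting point for
# an analysis report, or for deciding which log types warrant an email.
response = requests.get(
    f"https://api.applicationinsights.io/v1/apps/{APP_ID}/query",
    params={
        "query": "traces | where timestamp > ago(1d) "
                 "| summarize count() by severityLevel"
    },
    headers={"x-api-key": API_KEY},
    timeout=30,
)
response.raise_for_status()
print(response.json()["tables"][0]["rows"])
```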
For a couple of weeks now I've been working with Microsoft Azure, and I wonder if there is a way to create real-time charts in my Web App for external customers.
I know Microsoft provides two different services: 'Power BI', which supports real-time charts, and 'Power BI Embedded'. But my problem is that, as far as I know, Power BI is only intended for internal users, while Power BI Embedded, which is intended for charts in e.g. Web Apps for external customers, only provides reports which are not real-time.
Am I missing something, or is it currently not possible to provide real-time charts inside web apps with the given Azure services? If so, what would be alternatives to achieve my goal?
Thank you very much in advance.
Kind regards,
Felix
I would look at Power BI Embedded, with the data source using a DirectQuery connection to Azure SQL Database or Azure SQL Data Warehouse. Every user action in the report (filtering, drilling, etc.) will generate a query against the database, so the charts stay close to live data; a sketch of the server side follows the links below.
That Power BI Embedded architecture is explained on this page:
https://learn.microsoft.com/en-us/azure/power-bi-embedded/power-bi-embedded-what-is-power-bi-embedded
Direct Query is explained on this page:
https://powerbi.microsoft.com/en-us/documentation/powerbi-azure-sql-database-with-direct-connect/
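If it helps, here is a rough sketch of the server side of Power BI Embedded: generating a view-only embed token for a report through the Power BI REST API. The AAD app credentials, workspace (group) ID, and report ID are placeholders.

```python
import requests
from azure.identity import ClientSecretCredential

# Placeholders -- an AAD app with Power BI API permissions, plus the
# workspace and report you want to embed.
TENANT_ID = "<tenant-id>"
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"
GROUP_ID = "<workspace-id>"
REPORT_ID = "<report-id>"

credential = ClientSecretCredential(TENANT_ID, CLIENT_ID, CLIENT_SECRET)
token = credential.get_token(
    "https://analysis.windows.net/powerbi/api/.default"
).token

# Ask the Power BI service for a view-only embed token; your web app
# hands this token to the JavaScript client that renders the report.
response = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/reports/{REPORT_ID}/GenerateToken",
    headers={"Authorization": f"Bearer {token}"},
    json={"accessLevel": "View"},
    timeout=30,
)
response.raise_for_status()
print("Embed token:", response.json()["token"][:20], "...")
```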
1) Consider that 'real time' here is like an IoT scenario where you see the graphics on your dashboard moving in real time, not after a refresh. In this context you should consider using an Azure Stream Analytics job. It takes an input from a blob storage, an event hub, etc., and as an output you can use your Power BI account to write, in real time, the events ingested by Stream Analytics. Very powerful! You use SQL for querying the input; the only thing to be aware of is the tumbling time window (e.g. GROUP BY TumblingWindow(second, 30)), which is somewhat new to the SQL language.
2) To let your customers access the dashboard, I would suggest publishing your dashboard for free access and then securing it inside a web app to which you can apply a security pattern. You can also invite people outside of your organisation via email, which is faster than the previous solution, but people accessing your report must have a Power BI Pro license. You can use the free trial for 60 days.
Hope that helps!
Cheers!
I'm quite new to development for Azure; I have an ASP.NET MVC 4 application in an Azure Cloud Service.
The application handles a considerable volume of transactions through its API, and I need to implement some application logging to improve daily diagnostics. I've been looking for a tutorial that stores those logs in Blob Storage instead of a SQL database, without much success.
Blob Storage sounds good because I wouldn't need to substantially grow my database, which also holds all the business data and could become slower because of log transactions, compromising a business resource.
If I decide to store the logs in a SQL database, I'm thinking of using log4net.
What do you guys suggest? Please send me a tutorial that I can follow.
Thank you.
Sorry, our logging guidance is a little hard to find (something we are currently working on resolving), but for now please take a look at the following resources:
Client logging overview: essentially all client library operations are output using System.Diagnostics, so you can intercept them and write to a text/XML file just by using a standard TraceListener.
Analytics and server logs: we have extensive service-side logging capabilities as well, which make troubleshooting distributed apps much simpler.
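Not the TraceListener route itself, but to illustrate the blob-logging pattern end to end, here is a minimal Python sketch that appends log lines to an append blob; the connection string and names are placeholders, and the same idea maps to the .NET storage client.

```python
from datetime import datetime, timezone

from azure.storage.blob import BlobServiceClient

# Placeholders -- your storage connection string and a log container.
CONNECTION_STRING = "<storage-connection-string>"
service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
container = service.get_container_client("app-logs")

# One append blob per day keeps blobs small and easy to scan later.
blob = container.get_blob_client(
    f"api/{datetime.now(timezone.utc):%Y-%m-%d}.log"
)
if not blob.exists():
    blob.create_append_blob()

# Append blobs are designed for exactly this append-only pattern,
# so log writes never touch the business database.
blob.append_block(
    f"{datetime.now(timezone.utc).isoformat()} INFO transaction processed\n"
)
```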
Let me know if you have any questions.
Jason