Azure Application Insights vs Log Analytics

Currently I am logging my custom log messages to an Azure Table.
Now I need to automatically trigger the sending of emails based on log types and also need to generate an analysis report from the log messages.
Which service is more suitable to get this done? Azure Application Insights or Azure Log Analytics?

I think Application Insights will fit both - creating reports as well as sending out emails. You can do the same with Log Analytics, but the difference is that Log Analytics is essentially a logical store of all your log data: you can create custom reports, alerts, etc. across many different services, and everything can be nicely visualized in OMS.
As mentioned in the comments, you need to describe the scenario in a bit more detail.
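To make the reporting/alerting side a bit more concrete, here is a minimal sketch, assuming the custom log messages end up in a Log Analytics workspace (or a workspace-based Application Insights resource) and that the Python azure-identity / azure-monitor-query packages are acceptable; the workspace ID, the AppTraces table and the severity filter are placeholders for whatever your custom logs actually look like:

```python
# Minimal sketch: pull recent error-level entries from a Log Analytics
# workspace so they can feed a report or an email/alert step. Assumes the
# azure-identity and azure-monitor-query packages; the workspace ID and the
# AppTraces table / severity filter are placeholders.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient, LogsQueryStatus

client = LogsQueryClient(DefaultAzureCredential())

query = """
AppTraces
| where SeverityLevel >= 3
| summarize Errors = count() by bin(TimeGenerated, 1h)
"""

response = client.query_workspace(
    workspace_id="<workspace-guid>",   # placeholder workspace ID
    query=query,
    timespan=timedelta(days=1),
)

if response.status == LogsQueryStatus.SUCCESS:
    for table in response.tables:
        for row in table.rows:
            print(row)                 # e.g. hand these rows to a report or email step
```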

Related

How to add logs to Log Analytics in Azure Logic Apps?

I am following this tutorial and I am able to add the Logic App logs to Azure Log Analytics, but the problem is that Log Analytics for Logic Apps is still in preview.
https://learn.microsoft.com/en-us/azure/logic-apps/monitor-logic-apps-log-analytics
I have a few questions regarding this:
Should I use it for logging while it is still in preview?
If not, what other options do I have to log data to Azure Monitor?
Preview means the feature is not yet full-fledged; this mode exists so that users can give feedback and help improve it.
If you ask me whether to use it or not: I usually use it, I get the desired results, and it works fine for me (Example-Reference).
The other way I monitor logs is the following process:
First, I send the logs to a Log Analytics workspace, and then I created another Logic App that reads those logs back with a query (a rough sketch of that step follows).
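A minimal sketch of that query step, assuming the Logic App's diagnostic settings stream into the workspace and that the Python azure-monitor-query package is acceptable; Logic Apps resource logs usually land in the AzureDiagnostics table under the WorkflowRuntime category, but the exact column names can differ per setup:

```python
# Hypothetical sketch: read failed Logic App runs back out of the workspace.
# Assumes azure-identity / azure-monitor-query; Logic Apps resource logs
# usually land in AzureDiagnostics under the WorkflowRuntime category, but
# the exact column names (e.g. status_s) can differ per setup.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

query = """
AzureDiagnostics
| where Category == "WorkflowRuntime" and status_s == "Failed"
| project TimeGenerated, Resource, status_s
| order by TimeGenerated desc
"""

response = client.query_workspace(
    workspace_id="<workspace-guid>",   # placeholder
    query=query,
    timespan=timedelta(hours=24),
)

for table in response.tables:          # assumes a fully successful query
    for row in table.rows:
        print(row)
```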
There is also another way to log the runs of Azure Logic Apps.

Azure Log Analytics workspace as source in alerts

I have a hard time understanding what a Log Analytics workspace is. I have a requirement to monitor 7 Application Insights resources (out of many) and send emails if some of them throw exceptions. I can see that in alerts you can only select a single Application Insights resource, and I don't want to create 7 alerts. So my plan was to create a Log Analytics workspace, but I haven't found any way to bind Application Insights resources to a workspace. Is that possible? I can see that in alerts you can choose a Log Analytics workspace as the source, but what does that mean? It sounds like you can somehow group data in that specific workspace?
I can see that when I create a new Application Insights resource I have the option to choose a workspace, but what about existing ones?
As per your requirement, you can create workspace-based Application Insights resources, or connect the existing classic Application Insights resources to a Log Analytics workspace. And of course, all 7 Application Insights resources should connect to the same Log Analytics workspace.
Once you have the workspace-based Application Insights resources, the logs are written to both Application Insights and the Log Analytics workspace. So when you create an alert rule, select the Log Analytics workspace as the resource and create a custom log search alert rule. That fully meets your requirement.
You should also understand how the table schema differs between Application Insights and Azure Log Analytics. For example, in Application Insights exceptions are logged in the exceptions table, but in the connected Log Analytics workspace they land in the AppExceptions table.
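For illustration, a single custom log search alert rule on the shared workspace could use a query like the one below. This is a sketch only (names are placeholders), run here with the Python azure-monitor-query package just to smoke-test the query before putting it into the alert rule:

```python
# Sketch: the kind of query one custom log search alert rule could use to
# cover all workspace-based Application Insights resources at once. Run here
# with azure-monitor-query only to smoke-test it; names are placeholders.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

# Exceptions from all connected apps in the last 15 minutes, grouped per app.
query = """
AppExceptions
| where TimeGenerated > ago(15m)
| summarize ExceptionCount = count() by AppRoleName, _ResourceId
"""

client = LogsQueryClient(DefaultAzureCredential())
response = client.query_workspace(
    workspace_id="<workspace-guid>",   # the shared Log Analytics workspace
    query=query,
    timespan=timedelta(minutes=15),
)

for table in response.tables:          # assumes a fully successful query
    for row in table.rows:
        print(row)   # an alert rule with threshold > 0 would email on these
```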

Export Logs From Azure Log Analytics

So I am building an application in Azure and I am using Azure Log Analytics, and I am trying to find a good way for people on my team who don't have access to Azure but need to be able to access the logs. Does anyone have simple, fast ways to create something like this? Good technologies, good ways to give people access to it?
Is using Power BI to ingest your Log Analytics queries an option?
The caveat here would be the need to redo any potential charts and graphs; however, Power BI offers a lot of functionality as well as opportunities to join with other data sets.
In your scenario the trick would be using service account credentials when publishing the dataset.
You may try the Azure Log Analytics REST API.
Then you can provide the authentication (it only authenticates to Log Analytics, not the entire Azure environment) to the end users and let them write queries to fetch the logs; or you can write a middleware that processes the query requests from end users.
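A minimal sketch of that middleware idea, assuming a service principal that has the Log Analytics Reader role on the workspace and the Python requests / azure-identity packages; all IDs, the secret and the example query are placeholders:

```python
# Sketch of the middleware idea: call the Log Analytics query REST API on
# behalf of an end user. Assumes a service principal with the Log Analytics
# Reader role on the workspace; all IDs, the secret and the query are
# placeholders.
import requests
from azure.identity import ClientSecretCredential

WORKSPACE_ID = "<workspace-guid>"

credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<app-id>",
    client_secret="<app-secret>",
)
# Token scoped to the Log Analytics data plane only, not the whole of Azure.
token = credential.get_token("https://api.loganalytics.io/.default").token

resp = requests.post(
    f"https://api.loganalytics.io/v1/workspaces/{WORKSPACE_ID}/query",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "query": "AppTraces | take 50",   # whatever the end user asked for
        "timespan": "P1D",                # last 24 hours, ISO 8601 duration
    },
)
resp.raise_for_status()
for table in resp.json()["tables"]:
    print(table["name"], len(table["rows"]), "rows")
```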
So there are a few ways to do this:
You can use the Azure Log Analytics API to build a home-grown log portal
There are multiple SaaS options out there
DataDog
Splunk
AppDynamics
Power BI
Not specifically logs, but Prometheus and Grafana for metrics and alerts, and it's dirt cheap compared to Application Insights

How to send Azure cost and usage data to a Log Analytics workspace or directly to Azure Metrics?

I need to build a dashboard which will visualize the usage and cost of many Azure subscriptions, accounts, and departments.
My plan was:
Send the data that is 'behind' the Azure Cost Analysis view to the Log Analytics workspace.
In the Log Analytics workspace, perform custom aggregations / filters.
Display those aggregations as charts in Azure Metrics or directly on an Azure dashboard.
The problem is with step 1: I don't know how to send the data that is 'behind' the Azure Cost Analysis view to the Log Analytics workspace.
I thought of two solutions:
Fetching the data from the Azure cost & billing API.
Scheduling an export of the cost analysis data to a storage account, and then somehow moving the data from the storage account to the Log Analytics workspace.
Both solutions seem a bit like overkill to me - is there a more direct approach to send the cost analysis data to the Log Analytics workspace?
If there is no such option, I would be happy to know how you would suggest moving the exported data from the storage account to Log Analytics, or whether you have some other idea.
Thank you!
The only native solution is to schedule an export of the costs as CSV into a storage account from the Cost Management blade. If you want to load the data into a Log Analytics workspace, Azure Automation and a scheduled script would work.
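As a rough outline of what such a scheduled script could do on the storage side (a sketch only; the container and blob names depend entirely on how the export was configured, and the Python azure-storage-blob package is assumed):

```python
# Sketch: read the scheduled cost export (CSV) from the storage account so a
# scheduled job (e.g. Azure Automation or a timer-triggered script) can
# forward it to the workspace. Assumes azure-storage-blob; the container and
# blob prefix are placeholders that depend on how the export was configured.
import csv
import io

from azure.storage.blob import ContainerClient

container = ContainerClient.from_connection_string(
    conn_str="<storage-connection-string>",   # placeholder
    container_name="cost-exports",            # placeholder export container
)

for blob in container.list_blobs(name_starts_with="daily-cost-export"):
    data = container.download_blob(blob.name).readall().decode("utf-8-sig")
    rows = list(csv.DictReader(io.StringIO(data)))
    print(blob.name, len(rows), "cost rows")
    # ...then forward `rows` to Log Analytics, e.g. via the HTTP Data
    # Collector API mentioned in the next answer.
```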
I believe a direct approach is currently not available, but I see a feature request for this same requirement raised in the UserVoice / feedback forum. If interested, you may upvote it, because in general the responsible Azure product / feature team triages and prioritizes received feedback based on various factors such as the number of votes a feedback receives, feasibility, open prioritized backlog items, etc.
I would suggest you fetch the data from the Azure cost & billing API and send that data to Log Analytics from a REST API client by using the HTTP Data Collector API. For more information and examples, refer to this Azure document. Alternatively, if you want to fetch the data from the Azure cost & billing API and store it on a machine, you may go with custom logs; for more information, refer to this Azure document.
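For reference, a call to the HTTP Data Collector API looks roughly like this (a sketch following the documented request pattern; the workspace ID, shared key and record shape are placeholders, and the chosen Log-Type shows up in the workspace as a _CL custom log table):

```python
# Sketch of the documented HTTP Data Collector API pattern: POST JSON records
# into a custom log table. Workspace ID, shared key and the record are
# placeholders; the Log-Type below would appear as AzureCostExport_CL.
import base64
import datetime
import hashlib
import hmac
import json

import requests

WORKSPACE_ID = "<workspace-id>"            # placeholder
SHARED_KEY = "<workspace-primary-key>"     # placeholder
LOG_TYPE = "AzureCostExport"

def build_signature(date, content_length):
    string_to_hash = f"POST\n{content_length}\napplication/json\nx-ms-date:{date}\n/api/logs"
    decoded_key = base64.b64decode(SHARED_KEY)
    digest = hmac.new(decoded_key, string_to_hash.encode("utf-8"), hashlib.sha256).digest()
    return f"SharedKey {WORKSPACE_ID}:{base64.b64encode(digest).decode()}"

def post_records(records):
    body = json.dumps(records)
    rfc1123date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
    headers = {
        "Content-Type": "application/json",
        "Authorization": build_signature(rfc1123date, len(body)),
        "Log-Type": LOG_TYPE,
        "x-ms-date": rfc1123date,
    }
    uri = f"https://{WORKSPACE_ID}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01"
    requests.post(uri, data=body, headers=headers).raise_for_status()

# Example record shape pulled from the cost data (fields are placeholders).
post_records([{"SubscriptionId": "<sub-id>", "UsageDate": "2021-06-01", "Cost": 12.34}])
```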
Other related references:
Use cost alerts to monitor usage and spending
Supported metrics with Azure Monitor

Best practice to store Azure WebJob logs (incl. data) in Azure

I have several Azure WebJobs (.NET Framework, not .NET Core) running which interact with an Azure Service Bus. Now I want a convenient way to store and analyze their log messages (including the related message from the Service Bus). We are talking about a lot of log messages per day.
My idea is to send the logs to an Azure Event Hub and store them in an Azure SQL database. Later I could have, for example, a web app that enables users to conveniently browse and analyze the logs and view the messages.
Is this a bad idea? Should I instead use Application Insights?
Application Insights would cost more than your own implementation, so I would say this is a good idea. Just one change: I would send the logs to a Logic App first and do some processing there, such as handling error logs and info logs differently. Also, why are you thinking about SQL when this can be stored in non-SQL Azure Tables and fetched from there?
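To illustrate the Azure Tables suggestion, here is a sketch in Python for brevity (the .NET Tables SDK has the same shape); the table name, keys and fields are placeholders:

```python
# Sketch: append a WebJob log entry (plus Service Bus context) to Table storage.
import uuid
from datetime import datetime, timezone

from azure.data.tables import TableServiceClient

service = TableServiceClient.from_connection_string("<storage-connection-string>")
table = service.create_table_if_not_exists("WebJobLogs")   # placeholder table name

now = datetime.now(timezone.utc)
table.create_entity({
    "PartitionKey": now.strftime("%Y%m%d"),   # one partition per day
    "RowKey": str(uuid.uuid4()),              # unique per entry
    "Level": "Error",
    "Source": "OrderProcessingWebJob",        # placeholder WebJob name
    "Message": "Processing failed for Service Bus message",
    "ServiceBusMessageId": "<message-id>",
    "ServiceBusBody": "<message body, or a blob reference for large payloads>",
})
```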
