Azure Application Insights - How to analyze exported data

I have about 350 GB of exported data from my Application Insights resource dating back to last June. How can I analyze it?
How can I tell the portal to look back into that data and not only the last 90 days?

It is not possible to use exported data to power the portal UX. The export feature is intended for integration with third-party services, backup, audit purposes, etc.

Well, you could, but you'd have to write all of it yourself, and there would be limitations:
1) You'd have to do the export, which you're already doing.
2) You'd have to write something to parse that data and import it back into your Application Insights resource under a custom schema (see https://learn.microsoft.com/en-us/azure/application-insights/app-insights-analytics-import).
3) You'd then have to write tools/queries that join the old data in the custom schema with the current data in the AI schema, being careful to de-duplicate any items that exist in both places.
4) You wouldn't be able to use most of the built-in portal tools, as they don't know about the custom schema, but Analytics queries in tools like the Analytics portal or workbooks could see both sets of data.
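To make step 3 concrete, here is a hedged sketch of the kind of Analytics query it implies; the custom table name ("oldrequests") and the de-duplication key (itemId) are placeholder assumptions for whatever your import actually defines.

```python
# Hedged sketch of a query that unions the current "requests" table with a
# re-imported historical table and de-duplicates overlapping items.
# "oldrequests" and the de-dup key are illustrative assumptions.
UNION_OLD_AND_NEW = """
union
    (requests    | project timestamp, name, duration, itemId),
    (oldrequests | project timestamp, name, duration, itemId)
| summarize arg_max(timestamp, *) by itemId   // keep one row per item
| order by timestamp desc
"""
```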

Related

Is it possible to disable custom logging in application insights for already deployed code?

We are storing some user info in production that we don't need to keep, so we need to disable custom logging in Application Insights immediately. I can't change the code between production rollouts.
Is there any setting or configuration available in Application Insights?
One solution that does not require redeploying code:
1) If you are using a classic Application Insights resource, migrate to a workspace-based one (no code changes required).
2) Use a workspace transformation DCR (data collection rule).
With a transformation you can either drop data completely or erase individual fields.
Note that although dropped data doesn't appear in Application Insights, it was still ingested, so it will still be partially charged (please refer to the documentation).
Note also that the workspace schema for Application Insights tables is slightly different, and the transformation DCR needs to be written against the workspace schema. For instance, the "requests" table is called "AppRequests" in the workspace. You can explore its individual fields by opening the table directly in the workspace.
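As a rough illustration, here is a minimal sketch of what such a workspace transformation DCR payload could look like; the location, workspace IDs, and the field being scrubbed (ClientIP on AppRequests) are assumptions, so adapt them to the fields you actually need to remove.

```python
# Hedged sketch of a workspace transformation DCR body (ARM/REST shape),
# expressed as a Python dict for readability. All names and IDs below are
# placeholders, not values taken from the question.
transformation_dcr = {
    "location": "westeurope",
    "kind": "WorkspaceTransforms",
    "properties": {
        "destinations": {
            "logAnalytics": [
                {
                    "workspaceResourceId": "/subscriptions/<sub-id>/resourceGroups/<rg>"
                    "/providers/Microsoft.OperationalInsights/workspaces/<workspace>",
                    "name": "targetWorkspace",
                }
            ]
        },
        "dataFlows": [
            {
                # Workspace-based table name, i.e. AppRequests rather than requests.
                "streams": ["Microsoft-Table-AppRequests"],
                "destinations": ["targetWorkspace"],
                # Keep the rows but blank the field that holds user info;
                # a `where` clause here could drop entire rows instead.
                "transformKql": 'source | extend ClientIP = ""',
            }
        ],
    },
}
```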

How to export Application Insights Analytics data to Tableau?

How can I link Application Insights to Tableau so I can visualize my data?
Application Insights supports continuous export (https://learn.microsoft.com/en-us/azure/application-insights/app-insights-export-telemetry).
I expect Tableau to have some import capabilities; you will probably need to write an adapter from the Application Insights data schema to Tableau.
Application Insights also supports a REST API to query metrics, items (as OData), run queries, etc.:
https://dev.applicationinsights.io/documentation/Using-the-API/Events
That might be better than paying extra to duplicate (export) all of your telemetry?
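For example, here is a minimal sketch of pulling data through that REST API and saving it as a CSV file that Tableau can import; the application ID, API key, and query are placeholders to adapt.

```python
# A minimal sketch of exporting Application Insights data to CSV for Tableau
# using the REST query API. APP_ID, API_KEY, and the query are placeholders.
import csv
import requests

APP_ID = "<your-application-id>"
API_KEY = "<your-api-key>"

resp = requests.get(
    f"https://api.applicationinsights.io/v1/apps/{APP_ID}/query",
    headers={"x-api-key": API_KEY},
    params={"query": "requests | summarize count() by bin(timestamp, 1h)"},
    timeout=30,
)
resp.raise_for_status()
table = resp.json()["tables"][0]

with open("requests_hourly.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow([col["name"] for col in table["columns"]])
    writer.writerows(table["rows"])
```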

Microsoft Azure realtime charts for externals

I've been working with Microsoft Azure for a couple of weeks now, and I wonder whether it is possible to create realtime charts in my Web App for external customers.
I know Microsoft provides two different services called 'Power BI', which supports realtime charts, and 'Power BI Embedded'. But my problem is that, as far as I know, Power BI is only intended for internal users, while Power BI Embedded, which is intended for charts in e.g. Web Apps for external customers, only provides reports that are not realtime.
Am I missing something, or is it currently not possible to provide realtime charts inside web apps with the given Azure services? If so, what would be the alternatives to achieve my goal?
Thank you very much in advance.
Kind regards,
Felix
I would look at Power BI Embedded, with the data source using a Direct Query connection to Azure SQL Database or Azure SQL Data Warehouse. Every user action in the report (filtering, drilling, etc.) will generate a query against the database.
That Power BI Embedded architecture is explained on this page:
https://learn.microsoft.com/en-us/azure/power-bi-embedded/power-bi-embedded-what-is-power-bi-embedded
Direct Query is explained on this page:
https://powerbi.microsoft.com/en-us/documentation/powerbi-azure-sql-database-with-direct-connect/
1) Consider that "real time" here means an IoT-style scenario where the charts on your dashboard move in real time rather than after a refresh. In this context you should consider Azure Stream Analytics jobs. A job takes its input from Blob Storage, an Event Hub, etc., and as output it can write the ingested events to your Power BI account in real time. Very powerful! You query the input with SQL; the only thing to be aware of is the tumbling time window, which is somewhat new to the SQL language (see the sketch after this list).
2) To let your customers access the dashboard, I would suggest publishing it for free access and then securing it inside a web app to which you can apply a security pattern. You can also invite people from outside your organisation via email, which is faster than the previous solution, but people accessing your report must have a Power BI Pro license. You can use the free trial for 60 days.
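For illustration, here is a minimal sketch of the kind of Stream Analytics job query this describes, held in a Python string for convenience; the input/output aliases and field names are assumptions rather than anything from the question.

```python
# Hedged sketch of a Stream Analytics job query that aggregates incoming
# telemetry into 10-second tumbling windows and writes each window to a
# Power BI output, which Power BI then renders as a real-time tile.
# "eventhub-input", "powerbi-output", deviceId, temperature, and eventTime
# are illustrative assumptions.
STREAM_ANALYTICS_QUERY = """
SELECT
    deviceId,
    AVG(temperature) AS avgTemperature,
    System.Timestamp() AS windowEnd
INTO [powerbi-output]
FROM [eventhub-input] TIMESTAMP BY eventTime
GROUP BY deviceId, TumblingWindow(second, 10)
"""
```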
Hope that helps!
Cheers!

Explore long-term data with Azure Application Insights and PowerBi

I use continuous export to pull my Application Insights data into Power BI. However, all of the data seems to be either 7-day or 30-day; how would I be able to view a chart with a longer-term timeframe (e.g. users over the last year)?
Update
Here is what I see in Power BI:
From the screenshot, it looks like you're not using Continuous Export for this, but instead are using the Application Insights Content Pack for Power BI. The content pack has predefined views, which is what you circled in your screenshot.
You can easily create your own views using the almost-ready-to-release Application Insights REST API (tracked with this UserVoice suggestion).
If you want to try this with the API, please send me a note at dalek#microsoft.com and I'll set you up. In the API documentation I show step-by-step how you can create charts from metric data in Power BI.
Thanks
Dale Koetke
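Since that API has since shipped, a minimal sketch of pulling a year of a metric through it (for example to feed Power BI) could look like the following; the app ID and key are placeholders, and the API can only return data that is still within your resource's retention period.

```python
# A minimal sketch of requesting a long-range metric (users/count over the
# last year, in weekly buckets) from the Application Insights REST API.
# APP_ID and API_KEY are placeholders.
import requests

APP_ID = "<your-application-id>"
API_KEY = "<your-api-key>"

resp = requests.get(
    f"https://api.applicationinsights.io/v1/apps/{APP_ID}/metrics/users/count",
    headers={"x-api-key": API_KEY},
    params={"timespan": "P365D", "interval": "P7D"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # segmented metric values, one segment per week
```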

Azure Storage custom audit and logs

I'm writing a small app that reads from and writes to Azure Blob Storage (images, documents, etc.).
I need to implement some logging that will record activities such as:
File uploaded
File deleted
File updated
etc.
So, basically I need my log to look something like this:
User John Doe created a container "containerName" on 2016-05-05
User Mike Smith removed a blob test.jpg
etc...
User IDs and other additional info will be passed in through the method.
Example: CreateImage(String CreatedBy)
Question:
What is the best way to create and store this type of log? The easiest option is a SQL database with an Audit table and all the necessary columns, but I know that Azure has Azure Diagnostics. Can that be used to store and query logs? For example, I will need to see all file manipulations by user, by date, etc.
I would go with one of these approaches:
1) Azure Storage Tables for logs. Here you can store everything you need regarding logs. Then, if you need functionality to get/filter/etc., you can look into LINQ to Azure Tables, or even LINQPad if you want desktop-ready software. However, some design considerations should be taken into account; see the table design guidance (and the sketch after this answer).
2) Application Insights. Using the custom events functionality, you get powerful logging and can then see on the portal how things are going. You can attach metadata to a custom event and then aggregate/filter/view it using a convenient web interface, or connect log4net to AI if you want to stream your logs there. AI can also continuously export its data into Azure Storage, so you can take that and dive into it later.
IMHO, I would not say that SQL Database is the appropriate store for logs; a full-fledged database looks like too much (in terms of resources, and maybe price) just for storing logs. Not very relevant, but there is interesting reading about working with a lot of records.
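Here is a minimal sketch of option 1: writing audit entries to Azure Table Storage with the azure-data-tables package. The connection string, table name, and key scheme are illustrative assumptions rather than prescriptions: partitioning by user keeps "all actions by this user" queries cheap, and a sortable RowKey makes date-range filters easy.

```python
# Hedged sketch of an audit log on Azure Table Storage. All names below
# (connection string, table name, entity fields) are placeholders.
from datetime import datetime, timezone
from azure.core.exceptions import ResourceExistsError
from azure.data.tables import TableClient

client = TableClient.from_connection_string(
    "<storage-connection-string>", table_name="Audit"
)
try:
    client.create_table()
except ResourceExistsError:
    pass  # table already exists

def log_action(user: str, action: str, target: str) -> None:
    now = datetime.now(timezone.utc)
    client.create_entity({
        "PartitionKey": user,                        # query all actions by user
        "RowKey": now.strftime("%Y%m%dT%H%M%S.%f"),  # sortable by date/time
        "Action": action,                            # e.g. "ContainerCreated"
        "Target": target,                            # container or blob name
    })

# Usage, mirroring the examples in the question:
log_action("John Doe", "ContainerCreated", "containerName")
log_action("Mike Smith", "BlobDeleted", "test.jpg")
```

Fetching everything a user did is then a partition-key filter, e.g. client.query_entities("PartitionKey eq 'John Doe'"), which you can further restrict by a RowKey range for dates.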
