How to chart and show metrics based on SQL Database data in Azure Monitor - Azure

I have to create charts and metrics in the Azure Monitor overview dashboard based on Azure SQL Database data. How can we do that? I don't want to show database performance; I want to display the data stored in the database on the dashboard.

Azure Monitor helps you maximize the availability and performance of your applications and services by tracking metrics such as CPU performance, memory usage, storage usage, etc. It can't be used to visualize the data stored in an Azure SQL database or any other service.
To visualize the data itself, you should use a tool like Azure Data Explorer or Power BI.
Please check Visualize data with Azure Data Explorer dashboards.
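If you go the Azure Data Explorer route, one possible pattern (a sketch only, with hypothetical server, database, and table names) is to let ADX query the SQL database directly through the sql_request plugin, so an ADX dashboard tile can chart the rows without copying them first. This assumes the cluster's callout policy permits SQL requests and that the azure-kusto-data Python package is installed:

```python
# Sketch: run a KQL query on an ADX cluster that pulls rows straight from
# Azure SQL Database via the sql_request plugin (hypothetical names throughout).
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

cluster = "https://<your-cluster>.<region>.kusto.windows.net"  # assumption: your ADX cluster URI
client = KustoClient(KustoConnectionStringBuilder.with_aad_device_authentication(cluster))

kql = """
evaluate sql_request(
  'Server=tcp:<your-server>.database.windows.net,1433;Initial Catalog=<your-db>;Authentication="Active Directory Integrated"',
  'SELECT OrderDate, Region, SUM(Amount) AS Total FROM dbo.Orders GROUP BY OrderDate, Region')
"""
for row in client.execute("<your-adx-database>", kql).primary_results[0]:
    print(row["OrderDate"], row["Region"], row["Total"])
```

The same KQL can back a tile in an Azure Data Explorer dashboard; with Power BI you would instead connect the report directly to the Azure SQL database as a data source.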

Related

Is Azure Synapse a good choice for time series data?

We are in the process of analyzing which database will be the best choice for time series data (like stock market data, trading data, market sentiment, etc.).
Is Azure Synapse a good choice for time series data?
Azure Synapse Data Explorer (preview) provides you with a dedicated query engine optimized and built for log and time series data workloads.
With this new capability now part of Azure Synapse's unified analytics platform, you can easily access your machine and user data to surface insights that can directly improve business decisions.
To complement the existing SQL and Apache Spark analytical runtimes, Azure Synapse data explorer is optimized for efficient log analytics, using powerful indexing technology to automatically index structured, semi-structured, and free-text data commonly found in telemetry data.
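To make the time series angle concrete, below is a minimal sketch (hypothetical pool URI, database, and StockTrades table) of the kind of KQL a Data Explorer pool handles well for trading data, executed through the azure-kusto-data Python client; the same client works against a standalone ADX cluster:

```python
# Sketch: hourly average price per symbol plus anomaly flags over the last 30 days.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

uri = "https://<your-pool>.<your-workspace>.kusto.azuresynapse.net"  # assumption: Data Explorer pool URI
client = KustoClient(KustoConnectionStringBuilder.with_aad_device_authentication(uri))

kql = """
StockTrades
| make-series AvgPrice=avg(Price) default=0
    on Timestamp from ago(30d) to now() step 1h by Symbol
| extend (Anomalies, Score, Baseline) = series_decompose_anomalies(AvgPrice)
"""
print(client.execute("<your-database>", kql).primary_results[0])
```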
For more info, please refer to the related articles below:
https://learn.microsoft.com/en-us/azure/synapse-analytics/data-explorer/data-explorer-overview
Time series solution - Azure Architecture
Please note that the feature is in public preview.

How do I use telemetry data saved in Azure blobs to build reports?

Azure Application Insights does not allow telemetry data retention for more than a few days; however, it has an option called "Continuous Export" which exports data into Azure Storage blobs. So the question is: how do I build reports using the data stored in those blobs? Is there a way to point Azure Application Insights' own reporting system at blob storage as the "data source" and see reports?
How are others building reports on Azure Application Insights data that is exported using the "Continuous Export" option?
Regards
You can import the data from Continuous Export into Azure Data Explorer (ADX). Here is a full article that explains this. In ADX, you can keep the data as long as you want.
Then you can use the cross-service query feature of Azure Monitor to also query the data in ADX, which gives you a unified view of current and historical data.
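As a sketch of that unified view (the cluster URI, database, and AppRequestsHistory table below are hypothetical, and assume the historical data has already been imported into ADX), a Log Analytics query can union workspace data with the ADX copy via the adx() cross-service pattern, here run through the azure-monitor-query Python package:

```python
# Sketch: combine recent Application Insights requests in the workspace with
# the long-term copy kept in ADX (hypothetical names throughout).
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

query = """
union
    AppRequests,
    adx('https://<your-cluster>.<region>.kusto.windows.net/<your-db>').AppRequestsHistory
| summarize Requests = count() by bin(TimeGenerated, 1d)
"""
response = client.query_workspace("<your-workspace-id>", query, timespan=timedelta(days=90))
for table in response.tables:
    for row in table.rows:
        print(row)
```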

Azure Data Explorer (ADX) vs Polybase vs Databricks

Question
Today I discovered another Azure service called Azure Data Explorer (ADX). Sorry for such a comparison of services; I have a good understanding of all of them except ADX. I feel like there is a big functionality overlap, so I want to know the exact role of ADX in the Azure infrastructure.
What is the use case when ADX is significantly better than Synapse/Databricks?
My understanding of ADX
AFAIK, ADX is a cluster (with per-hour billing, like Databricks or Synapse, not like ADLA) that handles the database for you and is optimized for streaming ingestion and ad-hoc queries at scale. It also supports external tables, which have worse performance but are cheaper (you pay for Blob/ADLS storage).
Details
I don't understand why we need ADX if:
Azure Synapse has a similar pricing model (cluster, per-hour), and it also supports streaming ingestion and ad-hoc querying at scale. Azure Synapse supports querying Blob Storage/ADLS through PolyBase external tables.
Databricks is another service that is capable of doing this. Using Databricks Ingest and Delta Lake, you can ingest streaming data and consume it in both streaming and batch fashion. You can also have an interactive cluster that handles ad-hoc queries for you.
Also, if you want real-time analytics, use Azure Stream Analytics. If you want an Athena-like experience, use ADLA (though it still doesn't support ADLS Gen2).
Azure Data Explorer is focused on high-velocity, high-volume, high-variety data (the three Vs of big data). It provides super-fast interactive queries over such data as it streams in. It supports JSON and text natively, including full-text search and indexing.
It is used in a broad set of scenarios associated with sensing activity and time series across a large set of verticals: IoT, API logs, transaction monitoring, and ad-hoc data exploration.
Microsoft offers ADX as a service because it is the major service Microsoft uses for its own telemetry, and all the analytical solutions we offer as a service in security, operational monitoring, game analytics, product insights/usage analytics, IoT, and connected vehicles are built on ADX. You can find a full list in our docs. For clarity: SQL, Synapse, and Cosmos DB store their telemetry in Azure Data Explorer...
SQL DW (a.k.a. Synapse SQL pool) is an excellent data warehouse and implements the modern data warehouse pattern: ETL -> curated data model -> load and serve via Analysis Services or Power BI.
ADX is for real-time analytics, enabling schema on read (SOR) over data as fresh as seconds old.
Consider ADX a fully managed platform when replacing Solr/Lucene-based variants used for logs, time series databases, and more.
Try it out on large workloads and you will see it is dramatically cheaper than the alternatives and much more powerful and performant.
Reach out to me if you need help.
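To make the schema-on-read and full-text-search points concrete, here is a small sketch (hypothetical Traces table, column names, and cluster URI) of the kind of query ADX answers interactively over data that is only seconds old, run through the azure-kusto-data Python client:

```python
# Sketch: full-text term search plus JSON parsed at read time, over fresh telemetry.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

client = KustoClient(
    KustoConnectionStringBuilder.with_aad_device_authentication(
        "https://<your-cluster>.<region>.kusto.windows.net"))

kql = """
Traces
| where Timestamp > ago(5m)                      // data only seconds to minutes old
| where Message has "timeout"                    // indexed full-text term search
| extend Props = parse_json(Properties)          // schema applied at read time
| summarize Errors = count()
    by DeviceId = tostring(Props.deviceId), bin(Timestamp, 1m)
"""
print(client.execute("<your-database>", kql).primary_results[0])
```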
Azure Data Explorer, alias Kusto, is focused on high-volume data ingestion and near-real-time query and analytics. It was invented at Microsoft for log and telemetry analytics, but can be used for other purposes, e.g. IoT, sensor data, or web analytics. The same technology is used in Azure internal services like Azure Monitor and Log Analytics.
Similar capabilities could be built on Synapse, Databricks, or HDInsight, but I see those as tools that fit much broader use cases; ADX has quite a narrow focus. ADX does support queries ("KQL") but has very limited SQL support. It is good for append-only data, not for updates. It is not a data warehouse, database, or data lake.
Microsoft material refers to the technology behind ADX by the name Kusto. More info on this at https://learn.microsoft.com/en-us/azure/data-explorer/kusto/concepts/. A good comparison of services can be found in this blog post: https://vincentlauzon.com/2020/02/19/azure-data-explorer-kusto

Different ways to access data from Azure Data Share to Azure SQL Database

We are working on implementing a new project in Azure. The idea is to move out of on-premises systems into the cloud, as our vendors, partners, and clients are moving into the cloud. The option we are trying out is to use Azure Data Share and have Azure SQL Database subscribe to the data.
The thing we are now trying to explore is: once a new data snapshot is created, how do we import this data into Azure SQL Database?
For instance, we have Partner information; this information is made available via Azure Data Share, and a new data snapshot is created daily.
The part that I am not sure of is how to synchronize this data between Azure Data Share and Azure SQL Database.
Also, is there an API available to expose this data to external vendors, partners, or clients from Azure SQL Database after the data has been synced to Azure SQL Database from Azure Data Share?
Azure Data Share -> Azure SQL Database
Yes, Azure SQL Database is supported.
Azure Data Share -> SQL Server Database (on-prem)? Is this option supported?
No, SQL Server Database (on-prem) is not supported.
Is there an api that could be consumed to read data?
Unfortunately, there is no such API that can be consumed to read the data.
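Since Data Share itself exposes no consumer-facing read API, one common option is to put a small API of your own in front of the Azure SQL database once the snapshots land there. The sketch below is only an illustration (hypothetical Partner table, connection string, and endpoint) using FastAPI and pyodbc; it could be hosted on App Service, Container Apps, or similar, or fronted with Azure API Management:

```python
# Sketch: a minimal read-only endpoint over the synced Partner data (hypothetical names).
import pyodbc
from fastapi import FastAPI

app = FastAPI()
CONN_STR = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:<your-server>.database.windows.net,1433;"
    "Database=<your-db>;Uid=<user>;Pwd=<password>;Encrypt=yes;"
)

@app.get("/partners")
def list_partners(limit: int = 100):
    # Partner is assumed to be the table populated by the Data Share snapshot.
    with pyodbc.connect(CONN_STR) as conn:
        rows = conn.cursor().execute(
            "SELECT TOP (?) PartnerId, PartnerName FROM dbo.Partner", limit
        ).fetchall()
    return [{"partnerId": r.PartnerId, "partnerName": r.PartnerName} for r in rows]
```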
Azure Data Share enables organizations to simply and securely share data with multiple customers and partners. In just a few clicks, you can provision a new data share account, add datasets, and invite your customers and partners to your data share. Data providers are always in control of the data that they have shared. Azure Data Share makes it simple to manage and monitor what data was shared, when and by whom.
Azure Data Share helps enhance insights by making it easy to combine data from third parties to enrich analytics and AI scenarios. Easily use the power of Azure analytics tools to prepare, process, and analyze data shared using Azure Data Share.
Which Azure data stores does Data Share support?
Data Share supports data sharing to and from Azure Synapse Analytics, Azure SQL Database, Azure Data Lake Storage, Azure Blob Storage, and Azure Data Explorer. Data Share will support more Azure data stores in the future.
How to synchronize this data between Azure Data Share and Azure SQL Database?
You need to choose “Snapshot setting” to refresh data automatically.
A data provider can configure a data share with a snapshot setting. This allows incremental updates to be received on a regular schedule, either daily or hourly. Once configured, the data consumer has the option to enable the schedule.

Need solution to integrate Grafana with Azure data lake

I want to integrate Azure Data Lake Storage with Grafana for visualization of time series data. I need to know which tools I can use to make this possible.
I used ADF to extract data from CSV files stored in the data lake and move it to a table in Azure Data Explorer. After that, I used the Azure Data Explorer plugin in Grafana to visualize it. It worked fine, but I need to know whether there is any other approach that may be better or more cost-effective.
Integrating Grafana with Azure Data Lake via Azure Data Explorer is the best option compared to the others, because the other options involve data movement using ADF and the additional cost of Azure SQL Data Warehouse along with the cost of Power BI.
Reason:
Grafana is leading open-source software designed for visualizing time series analytics. It is an analytics and metrics platform that enables you to query and visualize data and create and share dashboards based on those visualizations. Combining Grafana's beautiful visualizations with Azure Data Explorer's snappy ad-hoc queries over massive amounts of data creates impressive usage potential.
The Grafana and Azure Data Explorer teams have created a dedicated plugin which enables you to connect to and visualize data from Azure Data Explorer using its intuitive and powerful Kusto Query Language. In just a few minutes, you can unlock the potential of your data and create your first Grafana dashboard with Azure Data Explorer.
For more details on visualizing data from Azure Data Explorer in Grafana please visit our documentation, “Visualize data from Azure Data Explorer in Grafana”.
Other options:
For Azure Data Lake Gen1:
You can use a mix of services to create visual representations of data stored in Data Lake Storage Gen1.
You can start by using Azure Data Factory to move data from Data Lake Storage Gen1 to Azure SQL Data Warehouse.
After that, you can integrate Power BI with Azure SQL Data Warehouse to create visual representations of the data.
For Azure Data Lake Gen2:
You can use a mix of services to create visual representations of data stored in Data Lake Storage Gen2.
You can start by using Azure Data Factory to move data from Data Lake Storage Gen2 to Azure SQL Data Warehouse.
After that, you can integrate Power BI with Azure SQL Data Warehouse to create visual representations of the data.
Hope this helps.
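If you stay with the ADX + Grafana route, one way to trim cost is to drop the ADF copy step and queue the CSV blobs from the lake into ADX yourself with the Kusto Python ingest SDK. This is only a sketch under assumed names (cluster, database, table, blob URI) and assumes the blob is reachable, e.g. via a SAS token:

```python
# Sketch: queued ingestion of a CSV blob from the data lake into the ADX table
# that Grafana queries (hypothetical names throughout).
from azure.kusto.data import KustoConnectionStringBuilder
from azure.kusto.data.data_format import DataFormat
from azure.kusto.ingest import BlobDescriptor, IngestionProperties, QueuedIngestClient

ingest_uri = "https://ingest-<your-cluster>.<region>.kusto.windows.net"  # assumption
client = QueuedIngestClient(
    KustoConnectionStringBuilder.with_aad_device_authentication(ingest_uri))

props = IngestionProperties(
    database="<your-database>",
    table="SensorReadings",               # hypothetical target table
    data_format=DataFormat.CSV,
)

blob = BlobDescriptor(
    "https://<your-storage>.blob.core.windows.net/raw/readings.csv?<sas-token>",
    size=10 * 1024 * 1024,                # rough size hint; helps the service batch
)
client.ingest_from_blob(blob, ingestion_properties=props)
```

Whether this works out cheaper than ADF depends on data volume and how often files land; for a handful of daily files the difference is small.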
They just released a new guide. This is for Grafana 5.3:
https://learn.microsoft.com/en-us/azure/data-explorer/grafana
You are able to test this by running Grafana in a Docker container (or for real, if you want). I followed the guide, and it is working almost exactly as expected. The only issue I am having is that Grafana concatenates the column name and the data in the column, which makes reading and formatting tricky.
