Select best Azure storage for visualization and analysis

I am writing a tool to analyze data coming from a race simulator. I have two use cases:
Display live telemetry on a chart - mostly visualization of incoming data, to detect anomalies manually.
Calculate my own metrics, analyze the data and suggest actions based on it - this can be done after a session and doesn't have to be calculated live. For now I am focusing solely on storing the data, but I have to keep in mind that it needs to be analyzed later.
I was thinking about using Event Hubs to handle the streaming of events. The question is how to visualize the data in the easiest way, and what the optimal storage is for the second use case - I believe it has to be a big data solution, as there will be many data points to analyze.

For visualization you can use Power BI or another visualization tool running on containers.
For storage, you can go with Azure Time Series Insights, or sink from Event Hubs to Azure Cosmos DB and then connect Power BI to it to create your charts.
https://learn.microsoft.com/en-us/azure/time-series-insights/overview-what-is-tsi
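As a sketch of the Event Hubs to Cosmos DB sink mentioned above, here is a minimal consumer written with the azure-eventhub and azure-cosmos SDKs. The connection strings, the hub/database/container names, and the telemetry field names (sessionId, timestamp, speed, rpm) are all assumptions for illustration; adapt them to the simulator's real payload:

```python
import json
import uuid

def telemetry_to_document(event_body: bytes) -> dict:
    """Convert one raw Event Hubs event into a Cosmos DB document.

    The field names (sessionId, timestamp, speed, rpm) are assumed;
    adapt them to the simulator's actual payload.
    """
    payload = json.loads(event_body)
    return {
        "id": str(uuid.uuid4()),            # Cosmos DB requires a unique 'id'
        "sessionId": payload["sessionId"],  # good partition key candidate
        "timestamp": payload["timestamp"],
        "speed": payload["speed"],
        "rpm": payload["rpm"],
    }

# Receiving loop (requires azure-eventhub and azure-cosmos plus real
# connection strings, so it is shown here only as a commented sketch):
#
# from azure.eventhub import EventHubConsumerClient
# from azure.cosmos import CosmosClient
#
# container = CosmosClient.from_connection_string(COSMOS_CONN) \
#     .get_database_client("telemetry").get_container_client("laps")
#
# def on_event(partition_context, event):
#     container.upsert_item(telemetry_to_document(event.body_as_str().encode()))
#     partition_context.update_checkpoint(event)
#
# client = EventHubConsumerClient.from_connection_string(
#     EVENTHUB_CONN, consumer_group="$Default", eventhub_name="telemetry")
# with client:
#     client.receive(on_event=on_event, starting_position="-1")
```

The transform is kept as a pure function so it can be reused unchanged if you later swap the sink (e.g. to Blob storage or ADX).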

Related

What is Azure Data Explorer? A Datalake? A datawarehouse?

As the title says, I'm confused about the role Azure Data Explorer plays in the Azure data ecosystem. The documentation states that it's an analytics tool, but technically it ingests data from different sources such as Kafka, Spark and so on.
Is it some kind of enhanced data warehouse?
TIA
"For our own troubleshooting needs we wanted to run ad-hoc queries on the massive telemetry data stream produced by our service. Finding no suitable solution, we decided to create one"
- Ziv Caspi, Architect, Azure Data Explorer
Now that we have established the need, we can discuss the implementation.
Here are some key features:
The service is distributed and can easily be scaled out (or in), which makes it a good fit for big data (as big as you need).
The data is ingested into the service in batch or streaming mode and stored in a proprietary format.
The data is stored in tables (columns & rows).
Columns' data types include bool, int, long, real, decimal, datetime & timespan as well as native support for JSON (the dynamic data type).
Everything is indexed, including free text, which is tokenized and indexed with a full-text search index. This means we can find rows containing specific tokens in sub-seconds to seconds.
The data is stored in a columnar format which makes it great for aggregations on large volumes.
ADX has its own highly intuitive query language, KQL (Kusto Query Language), which supports numerous analytical features including distributed joins.
ADX has native support for time-series with a lot of built-in functionality around it (forecast, anomaly detection etc.).
Since the service was created to handle telemetry, and telemetry does not change over time, the service was designed as append-only (inserts), with built-in support for data retention.
Later on, soft & hard deletes were added.
As of today, updates are not supported.
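The native time-series support can be illustrated with a short KQL query. The sketch below only builds the query text (the table and column names Telemetry, Timestamp and Speed are made up for this example) and shows, commented out, how it would run through the azure-kusto-data client:

```python
def anomaly_query(table: str, ts_col: str, value_col: str,
                  bin_size: str = "1m") -> str:
    """Build a KQL query that flags anomalies in a numeric time series
    using ADX's built-in series_decompose_anomalies() function."""
    return (
        f"{table}\n"
        f"| make-series avg_val=avg({value_col}) on {ts_col} step {bin_size}\n"
        f"| extend anomalies = series_decompose_anomalies(avg_val)\n"
        f"| render anomalychart"
    )

# Execution sketch (requires azure-kusto-data and a real cluster URL):
#
# from azure.kusto.data import KustoClient, KustoConnectionStringBuilder
# kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
#     "https://<cluster>.kusto.windows.net")
# client = KustoClient(kcsb)
# response = client.execute(
#     "MyDatabase", anomaly_query("Telemetry", "Timestamp", "Speed"))
```

The same query can be pasted directly into the ADX web UI, which renders the anomaly chart inline.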
Here is some additional reading:
Ziv Caspi: Azure Data Explorer Technology 101
Brian Harry: Introducing Application Insights Analytics
Uri Barash: Azure Announcements: Azure Data Explorer

Ideal Azure/Power BI Architecture for Stream Analytics with RLS

We are using Azure Stream Analytics to build out a new IoT product. The data is successfully streaming to Power BI, but there is no way to implement Row-Level Security so that we can display this data back to a customer, limited to only that customer's data. I am considering adding an Azure SQL DB between ASA and PBI and switching the PBI dataset from a streaming dataset to DirectQuery with a high page refresh rate, but this seems like a very intense workload for an Azure SQL DB to handle. There is the potential, as the product grows, for multiple inserts per second and querying every couple of seconds. Streaming seems like the better answer, apart from the missing RLS. Any tips?
There is the potential, as the product grows, for multiple inserts per second and querying every couple of seconds.
A small Azure SQL Database should handle that load: 1,000/sec is simple; 100,000/sec is probably too much.
And ASA can ensure that the output streams are not too frequent.
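If a custom worker ever sits between the stream and the database, batching rows client-side keeps the statement count low regardless of the event rate. A minimal sketch, assuming a hypothetical Telemetry table with CustomerId/Ts/Value columns and any DB-API style execute callable:

```python
import time

class BatchedWriter:
    """Buffer rows and flush them as one multi-row INSERT, so the
    database sees a few statements per second instead of one per row.
    The 'execute' callable and the Telemetry table are placeholders."""

    def __init__(self, execute, max_rows=500, max_age_s=1.0):
        self.execute = execute
        self.max_rows = max_rows
        self.max_age_s = max_age_s
        self.rows = []
        self.first_row_at = None

    def add(self, customer_id, ts, value):
        if not self.rows:
            self.first_row_at = time.monotonic()
        self.rows.append((customer_id, ts, value))
        # Flush when the buffer is full or the oldest row is too old.
        if len(self.rows) >= self.max_rows or \
           time.monotonic() - self.first_row_at >= self.max_age_s:
            self.flush()

    def flush(self):
        if not self.rows:
            return
        placeholders = ", ".join(["(?, ?, ?)"] * len(self.rows))
        sql = f"INSERT INTO Telemetry (CustomerId, Ts, Value) VALUES {placeholders}"
        params = [p for row in self.rows for p in row]
        self.execute(sql, params)
        self.rows = []
```

Tuning max_rows and max_age_s trades insert latency against statement count; ASA's own SQL output performs a similar batching internally.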

How can I decide, if I should use the Power BI API to push data into my streaming dataset or Azure Stream Analytics?

I am very new to Azure. I need to create a Power BI dashboard to visualize some data produced by a sensor. The dashboard needs to get updated "almost" real-time. I have identified that I need a push data set, as I want to visualize some historic data on a line chart. However, from the architecture point of view, I could use the Power BI REST APIs (which would be completely fine in my case, as we process the data with a Python app and I could use that to call Power BI) or Azure Stream Analytics (which could also work, I could dump the data to the Azure Blob storage from the Python app and then stream it).
Can you tell me generally speaking, what are the advantages/disadvantages of the two approaches?
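For reference, pushing rows from a Python app into a Power BI streaming/push dataset is just an HTTP POST of a {"rows": [...]} payload to the push URL that Power BI shows when you create the dataset. The sketch below only builds the request (the URL and the ts/temperature fields are placeholders, not real values), with the actual send commented out:

```python
import json

def build_push_request(push_url: str, rows: list) -> tuple:
    """Return (url, headers, body) for a Power BI push-rows call.
    The payload shape {"rows": [...]} is what the push API expects."""
    headers = {"Content-Type": "application/json"}
    body = json.dumps({"rows": rows})
    return push_url, headers, body

# Actual send (needs the 'requests' package and your real push URL):
#
# import requests
# url, headers, body = build_push_request(
#     "https://api.powerbi.com/beta/<workspace>/datasets/<id>/rows?key=...",
#     [{"ts": "2024-01-01T00:00:00Z", "temperature": 21.4}])
# requests.post(url, headers=headers, data=body)
```

This is the whole integration surface of the REST approach, which is why it is attractive for a single sensor; the trade-offs against ASA are discussed in the answers below.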
Azure Stream Analytics lets you have multiple sources and define multiple targets, and those targets can include Power BI and Blob storage ... and at the same time you can apply windowing functions to the data as it comes in. It also gives you a visual way of managing your pipeline, including the windowing functions.
In your case you are essentially replicating the incoming data, first to Blob storage and then to Power BI. But if you have a use case for applying a windowing function (1 minute or so) as your data comes in from multiple sources, e.g. more than one sensor, or a sensor and another source, you have to fiddle around a lot to get it working manually, whereas in Stream Analytics you can do it easily.
The following article highlights some of the pros and cons of Azure Stream Analytics...
https://www.axonize.com/blog/iot-technology/the-advantages-and-disadvantages-of-using-azure-stream-analytics-for-iot-applications/
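To make the "fiddle around manually" point concrete: even a basic 1-minute tumbling-window average, which is a single GROUP BY TumblingWindow(minute, 1) clause in an ASA query, takes explicit bookkeeping when hand-rolled. A sketch, assuming events arrive as (epoch-seconds, value) pairs:

```python
from collections import defaultdict

def tumbling_avg(events, window_s=60):
    """Group (ts, value) events into fixed-size windows and average
    each window -- a hand-rolled version of what an ASA query does
    with GROUP BY TumblingWindow(minute, 1)."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for ts, value in events:
        # Align the timestamp down to the start of its window.
        window_start = int(ts // window_s) * window_s
        sums[window_start] += value
        counts[window_start] += 1
    return {w: sums[w] / counts[w] for w in sums}
```

And this version still ignores late arrivals, watermarks and multi-source joins, all of which ASA handles for you.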
If possible, I would recommend streaming data to IoT Hub first, so that ASA can pick it up and render it in Power BI. This will give you better latency than streaming data from Blob storage to ASA and then to Power BI. It is the recommended IoT pattern for remote monitoring, predictive maintenance, etc., and gives you longer-term options to add a lot of logic to the real-time pipelines (ML scoring, windowing, custom code, etc.).

Real-time streaming data into Azure Data Warehouse from SQL Server

I'm trying to build a real-time reporting service on top of Microsoft Azure Data Warehouse. Currently I have a SQL Server with about 5 TB of data. I want to stream the data to the data warehouse and use Azure DW's computational power to generate real-time reporting based on the data. Are there any ready-to-use solutions or best practices for doing this?
One approach I was considering is to load the data into Kafka and then stream it into Azure DW via Spark Streaming. However, this approach is more near-real-time than real-time. Is there any way to utilize SQL Server Change Data Capture to stream data into the data warehouse?
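On the CDC question: SQL Server exposes captured changes through generated table-valued functions, which a small poller can read incrementally. The sketch below only builds the T-SQL for one polling cycle (the capture instance dbo_Sales is made up), with the pyodbc execution left as a commented outline:

```python
def cdc_poll_query(capture_instance: str) -> str:
    """Build the T-SQL for one CDC polling cycle: read every available
    change up to the current maximum LSN. A real poller would persist
    the last LSN it processed and use that as @from_lsn instead of the
    minimum available one."""
    return (
        f"DECLARE @from_lsn binary(10) = sys.fn_cdc_get_min_lsn('{capture_instance}');\n"
        f"DECLARE @to_lsn binary(10) = sys.fn_cdc_get_max_lsn();\n"
        f"SELECT * FROM cdc.fn_cdc_get_all_changes_{capture_instance}"
        f"(@from_lsn, @to_lsn, 'all');"
    )

# Polling loop sketch (requires pyodbc and a CDC-enabled database):
#
# import pyodbc, time
# conn = pyodbc.connect(CONN_STR)
# while True:
#     rows = conn.cursor().execute(cdc_poll_query("dbo_Sales")).fetchall()
#     # ... forward rows to the warehouse / Event Hubs here ...
#     time.sleep(5)
```

The polling interval is the floor on latency here, which is why the answers below still call this near-real-time rather than real-time.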
I don't personally see Azure SQL Data Warehouse in a real-time architecture. It's a batch MPP system optimised for shredding billions of rows over multiple nodes. Such a pattern is not synonymous with sub-second or real-time performance, in my humble opinion. Real-time architectures in Azure tend to look more like Event Hubs > Stream Analytics. The low concurrency available (i.e. currently a max of 32 concurrent users) is also not a good fit for reporting.
As an alternative you might consider Azure SQL Database in-memory tables for fast load and then hand off to the warehouse at a convenient point.
You could use Azure SQL Data Warehouse in a so-called Lambda architecture with a batch and a real-time element, where it supports the batch stream. See here for further reading:
https://social.technet.microsoft.com/wiki/contents/articles/33626.lambda-architecture-implementation-using-microsoft-azure.aspx
If you're looking for a SQL-based SaaS solution to power real-time reporting applications, we recently released an HTTP API product called Stride. It is based on PipelineDB, the open-source streaming-SQL database we build, and can handle this type of workload.
The Stride API enables developers to run continuous SQL queries on streaming data and store the results of those queries in tables that are incrementally updated as new data arrives. This might be a simpler way to add the kind of real-time analytics layer you mentioned above.
Feel free to check out the Stride technical docs for more detail.

Application Insights -> export -> Power BI data warehouse architecture

Our team have just recently started using Application Insights to add telemetry data to our windows desktop application. This data is sent almost exclusively in the form of events (rather than page views etc). Application Insights is useful only up to a point; to answer anything other than basic questions we are exporting to Azure storage and then using Power BI.
My question is one of data structure. We are new to analytics in general and have just been reading about star/snowflake structures for data warehousing. This looks like it might help in providing the answers we need.
My question is quite simple: Is this the right approach? Have we over complicated things? My current feeling is that a better approach will be to pull the latest data and transform it into a SQL database of facts and dimensions for Power BI to query. Does this make sense? Is this what other people are doing? We have realised that this is more work than we initially thought.
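As an illustration of the fact/dimension split being considered, here is a tiny sketch that turns flat exported events into one fact table plus a user dimension. The event field names (user, eventName, timestamp) are assumptions for the example, not the actual Application Insights export schema:

```python
def to_star_schema(events):
    """Split flat event dicts into a dimension table (unique users)
    and a fact table referencing it by surrogate key."""
    dim_user = {}      # user name -> surrogate key
    fact_events = []
    for e in events:
        # setdefault assigns the next surrogate key on first sight.
        key = dim_user.setdefault(e["user"], len(dim_user) + 1)
        fact_events.append({
            "user_key": key,
            "event_name": e["eventName"],
            "timestamp": e["timestamp"],
        })
    return dim_user, fact_events
```

The same split is what an ETL step into SQL facts and dimensions would do, just persisted to tables instead of returned as Python structures.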
Definitely pursue Michael Milirud's answer, if your source product has suitable analytics you might not need a data warehouse.
Traditionally, a data warehouse has three advantages: it integrates information from different data sources, both internal and external; data is cleansed and standardised across sources; and the history of change over time ensures that data is available in its historic context.
What you are describing is becoming a very common case in data warehousing, where star schemas are created for access by tools like PowerBI, Qlik or Tableau. In smaller scenarios the entire warehouse might be held in the PowerBI data engine, but larger data might need pass through queries.
In your scenario, you might be interested in some tools that appear to handle at least some of the migration of Application Insights data:
https://sesitai.codeplex.com/
https://github.com/Azure/azure-content/blob/master/articles/application-insights/app-insights-code-sample-export-telemetry-sql-database.md
Our product Ajilius automates the development of star schema data warehouses, speeding the development time to days or weeks. There are a number of other products doing a similar job, we maintain a complete list of industry competitors to help you choose.
I would continue with Power BI - it actually has a very sophisticated and powerful data integration and modeling engine built in. Historically I've worked with SQL Server Integration Services and Analysis Services for these tasks - Power BI Desktop is superior in many aspects. The design approaches remain consistent - star schemas etc, but you build them in-memory within PBI. It's way more flexible and agile.
Also, are you aware that AI can be connected directly to PBI Web? This connects to your AI data in minutes and gives you ready-to-use PBI content (dashboards, reports, datasets). You can customize these and build new reports from the datasets.
https://powerbi.microsoft.com/en-us/documentation/powerbi-content-pack-application-insights/
What we ended up doing was not sending events from our WinForms app directly to AI, but to an Azure Event Hub.
We then created a job that reads from the Event Hub and sends the data to:
AI, using the SDK
Blob storage, for later processing
Azure Table storage, to create Power BI reports
You can of course add more destinations.
So basically all events are sent to one destination and from there stored in many destinations, each for its own purpose. We definitely did not want to be restricted to 7 days of raw data; storage is cheap, and Blob storage can be used with many Azure and Microsoft analytics solutions.
The Event Hub can be linked to Stream Analytics as well.
More information about Event Hubs can be found at https://azure.microsoft.com/en-us/documentation/articles/event-hubs-csharp-ephcs-getstarted/
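The fan-out described in this answer is a small dispatcher in code. In the sketch below the sink functions are stand-ins for the AI SDK, Blob, and Table storage writers; the one design point worth copying is that a failing sink must not stop the others:

```python
class FanOut:
    """Send every incoming event to all registered destinations, so
    one ingest point (the Event Hub reader) feeds many stores."""

    def __init__(self):
        self.sinks = []

    def register(self, sink):
        self.sinks.append(sink)

    def dispatch(self, event):
        errors = []
        for sink in self.sinks:
            try:
                sink(event)
            except Exception as exc:
                # One slow or broken sink must not block the rest;
                # collect the failure and keep dispatching.
                errors.append(exc)
        return errors
```

In production the failed events would typically be retried or dead-lettered rather than just collected.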
You can start using the recently released Application Insights Analytics feature. In Application Insights we now let you write any query you would like, so that you can get more insight out of your data. Analytics runs your queries in seconds, lets you filter / join / group by any possible property, and you can also run these queries from Power BI.
More information can be found at https://azure.microsoft.com/en-us/documentation/articles/app-insights-analytics/
