Microsoft Azure real-time charts for external users

For a couple of weeks now I have been working with Microsoft Azure, and I wonder whether it is possible to create real-time charts in my Web App for external customers.
I know Microsoft provides two different services: 'Power BI', which supports real-time charts, and 'Power BI Embedded'. My problem is that, as far as I know, Power BI is only intended for internal users, while Power BI Embedded, which is intended for charts in e.g. Web Apps for external customers, only provides reports that are not real-time.
Am I missing something, or is it currently not possible to provide real-time charts inside web apps with the given Azure services? If so, what would be alternatives to achieve my goal?
Thank you very much in advance.
Kind regards,
Felix

I would look at Power BI Embedded, with the data source using a DirectQuery connection to Azure SQL Database or Azure SQL Data Warehouse. Every user action in the report (filtering, drilling, etc.) will generate a query against the database.
That Power BI Embedded architecture is explained on this page:
https://learn.microsoft.com/en-us/azure/power-bi-embedded/power-bi-embedded-what-is-power-bi-embedded
Direct Query is explained on this page:
https://powerbi.microsoft.com/en-us/documentation/powerbi-azure-sql-database-with-direct-connect/
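To round that out: with app-owns-data embedding, your web app mints an embed token for each external user via the Power BI REST API, and the DirectQuery connection then ensures each report interaction hits the database live. A minimal Python sketch, assuming the GenerateToken endpoint; the workspace and report IDs are placeholders, and acquiring the AAD token is out of scope here:

```python
import json
import urllib.request

# Placeholder IDs -- substitute your own workspace and report GUIDs.
WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"
REPORT_ID = "11111111-1111-1111-1111-111111111111"

def build_embed_token_request(workspace_id, report_id, aad_token):
    """Build the Power BI REST call that mints a view-only embed token
    for the app-owns-data scenario (POST .../GenerateToken)."""
    url = ("https://api.powerbi.com/v1.0/myorg/groups/"
           f"{workspace_id}/reports/{report_id}/GenerateToken")
    body = json.dumps({"accessLevel": "View"}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {aad_token}",
        },
        method="POST",
    )

req = build_embed_token_request(WORKSPACE_ID, REPORT_ID, "<aad-token>")
# urllib.request.urlopen(req)  # response carries the embed token + expiry
```

Your front end then passes the returned token to the Power BI JavaScript client, so external users never need their own Power BI accounts.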

1) Consider that real time here is like an IoT scenario, where you see the graphics on your dashboard moving in real time, not after a refresh. In this context you should consider using an Azure Stream Analytics job. It takes input from Blob storage, an Event Hub, etc., and as output it can write the ingested events to your Power BI account in real time. Very powerful! You use SQL to query the input; the only thing to be aware of is the tumbling time window, which is somewhat new to the SQL language.
2) For letting your customers access the dashboard, I would suggest publishing the dashboard for free access and then securing it inside a web app to which you can apply a security pattern. You can also invite people outside of your organisation via email, which is faster than the previous solution, but people accessing your report must have a Power BI Pro license. You can use the free trial for 60 days.
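The tumbling window mentioned in option 1 is just a fixed-size, non-overlapping time bucket. A minimal Python sketch of the concept (the timestamps and window size are invented for illustration):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, value) events into fixed, non-overlapping
    buckets -- the same idea as a Stream Analytics tumbling window."""
    buckets = defaultdict(int)
    for ts, _value in events:
        # Each event belongs to exactly one window, keyed by its start.
        window_start = (ts // window_seconds) * window_seconds
        buckets[window_start] += 1
    return dict(buckets)

# Events at t=1, 4, 6, 11 with a 5-second window land in windows 0, 0, 5, 10.
counts = tumbling_window_counts([(1, "a"), (4, "b"), (6, "c"), (11, "d")], 5)
print(counts)  # {0: 2, 5: 1, 10: 1}
```

In Stream Analytics you express the same grouping declaratively with `TUMBLINGWINDOW(second, 5)` in the `GROUP BY` clause, and the job emits one aggregate row per window to Power BI.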
Hope that helps!
Cheers!


Implement a modern end-to-end reporting system based on Power BI and Azure Synapse

I am working on modernizing a reporting solution where the data sources are on premises on the customers' SQL Servers (2014) and the reports are displayed as Power BI reports on the customer's Power BI Service portal. Today I use SSIS to build a data warehouse, as well as an on-premises data gateway to transport data up to Azure Analysis Services, which in turn is used by the Power BI reports.
I have been wondering if I could use Azure Synapse to connect to customer data and transport it to Azure in the most cost-effective way, linking it to the Power BI workspace as a shared dataset. There are many possibilities, but it is important that the customer experiences the reports as fast and stable, and if possible near real time.
I find SSIS cumbersome and expensive in Azure. Are there mechanisms that make it cheap and fast to get data into Azure? Do I need a data warehouse (Azure SQL Database), or is it better to use a data lake as storage? I need to do incremental loads too. And what if I need to do some transformations? Should I use Power BI dataflows, or do I need to create Azure Data Factory data flows to achieve this?
Does anyone have good experience using Synapse (also with DevOps in mind) and setting up good DEV, TEST, and PROD environments for this? Or is Synapse a cost driver and a simpler implementation will do? Give me your opinions, and if you have links to good articles, please share them. Looking forward to hearing from you.
regards Geir
The honest answer is it depends on a lot of different things and I don't know that I can give you a solid answer. What I can do is try to focus down which services might be the best option.
It is worth noting that a Power BI dataset is essentially an Analysis Services database behind the scenes, so unless you are using a feature that is specifically only available in AAS and using a live connection, you may be able to eliminate that step. Refresh options are one of the things that are more limited in Power BI though, so the separate AAS DB might be necessary for your scenario.
There is a good chance that Power BI dataflows will work just fine for you if you can eliminate the AAS instance, and they have the added advantage of having incremental refresh as a core feature. In that case, Power BI will store the data in a data lake for you.
Synapse is an option, but probably not the best one for your scenario unless your dataset is large. SQL pools can get quite expensive, especially if you aren't making use of any of the compute options to do transformations.
Data Factory (also available as Synapse pipelines) without the SSIS integration is generally the least expensive option for moving large amounts of data. It allows you to use data flows to do some transformations and has features like incremental load. Outputting to a data lake is probably fine and the most cost effective, though in some scenarios something like an Azure SQL instance could be required if you specifically need some of its features.
If they want true real time, it can be done, but none of those tools are really built for it. In most cases the 48 refreshes per day (i.e. every 30 minutes) available on a Premium capacity are close enough to real time once you dig into the underlying purpose of a given report.
For true real-time reporting, you would look at push and/or streaming datasets in Power BI and feed them with something like a Logic App or possibly Stream Analytics. There are a lot of limitations with push datasets, though; more than likely you would want to set up a regular Power BI report and dataset and then add the real-time dataset as a separate entity in addition to that.
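As a rough sketch of that setup: a push dataset can be created through the Power BI REST API with its default mode set to push/streaming. The dataset name and schema below are made up for illustration, and authentication is reduced to a placeholder token:

```python
import json
import urllib.request

def build_push_dataset_request(aad_token):
    """Create a push/streaming dataset via the Power BI REST API
    (POST /v1.0/myorg/datasets); a Logic App or Stream Analytics job
    can then push rows into its table."""
    body = json.dumps({
        "name": "RealtimeMetrics",       # hypothetical dataset name
        "defaultMode": "pushStreaming",  # keeps history AND streams to tiles
        "tables": [{
            "name": "Readings",          # hypothetical table/schema
            "columns": [
                {"name": "ts", "dataType": "DateTime"},
                {"name": "value", "dataType": "Double"},
            ],
        }],
    }).encode("utf-8")
    return urllib.request.Request(
        "https://api.powerbi.com/v1.0/myorg/datasets",
        data=body,
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {aad_token}"},
        method="POST",
    )

req = build_push_dataset_request("<aad-token>")
```

The push-dataset limitations mentioned above (row limits, restricted modeling) are why this usually lives alongside, not instead of, the regular report dataset.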
As far as devops goes, pretty much any Azure service can be integrated with a pipeline. In addition to any code, any service or service settings can be deployed via an ARM template or CLI script.
Power BI has improved in the past couple years to have much better support for devops and dev/test/prod environments. Current best practices can be found in the Power BI documentation: https://learn.microsoft.com/en-us/power-bi/create-reports/deployment-pipelines-best-practices

Export Logs From Azure Log Analytics

So I am building an application in Azure using Azure Log Analytics, and I am trying to find a good way for people on my team who don't have access to Azure but need to be able to access the logs. Does anyone have simple, fast ways to create something like this? Good technologies, good ways to give people access to it?
Is using Power BI to ingest your Log Analytics queries an option?
The caveat here would be the need to redo any potential charts and graphs; however, Power BI offers a lot of functionality, as well as opportunities to join with other data sets.
In your scenario the trick would be using a service account's credentials when publishing the dataset.
You may try to use the Azure Log Analytics REST API.
Then you can provide the authentication (it only authenticates to Log Analytics, not your entire Azure subscription) to the end users and let them write queries to fetch the logs; or you can write a middleware which processes the query requests from end users.
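A minimal sketch of such a middleware call, assuming the Log Analytics query endpoint at api.loganalytics.io; the workspace ID, token, and query are placeholders:

```python
import json
import urllib.request

WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

def build_log_query_request(workspace_id, kusto_query, aad_token):
    """Build a POST to the Log Analytics query API so end users (or a
    middleware acting on their behalf) can run KQL without portal access."""
    url = f"https://api.loganalytics.io/v1/workspaces/{workspace_id}/query"
    body = json.dumps({"query": kusto_query}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {aad_token}"},
        method="POST",
    )

req = build_log_query_request(WORKSPACE_ID, "AzureActivity | take 10", "<token>")
# urllib.request.urlopen(req)  # response is JSON tables of matching rows
```

A middleware like this also lets you whitelist which queries non-Azure users may run, rather than handing out credentials directly.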
So there are a few ways to do this:
- You can use the ALA API to build a home-grown log portal
- There are multiple SaaS options out there: DataDog, Splunk, AppDynamics, Power BI
- Not specifically logs, but Prometheus and Grafana for metrics and alerts, and it's dirt cheap compared to App Insights

Power BI Report using Facebook real time data

I want to show the insights data (page views, comments, posts, etc.) of my Facebook page in a Power BI dashboard that is embedded into my web page. I am able to show a dummy report by downloading the Facebook page's CSV report. If I have to update the dashboard in real time, what is the procedure for that?
There are at least two options:
Connector
API
Connector
At this very moment there might be an issue connecting Power BI to Facebook, but in general you should be able to use the built-in connector.
Please note that with the connector, once your report is published to the Power BI Service it will not refresh data in real time, but rather in semi-real time. For example, you can schedule the data source to refresh up to 8 times a day (with a Pro license), or even more often if you have Premium capacity.
API
Instead, if you really need to deal with real-time data, the alternative is to use the API offered by Power BI's real-time streaming service.
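With the API route, the embedded dashboard updates live because your code POSTs rows straight into a streaming dataset. A hedged sketch: Power BI generates a push URL for each streaming dataset, and the URL and column names below are placeholders shaped like the Facebook metrics in the question:

```python
import json
import urllib.request

# Placeholder: copy the real push URL from the dataset's "API info" page.
PUSH_URL = ("https://api.powerbi.com/beta/<tenant-id>/datasets/"
            "<dataset-id>/rows?key=<key>")

def build_push_rows_request(push_url, rows):
    """POST a batch of rows to a Power BI streaming dataset; dashboard
    tiles built on the dataset update as rows arrive."""
    body = json.dumps({"rows": rows}).encode("utf-8")
    return urllib.request.Request(
        push_url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Hypothetical row shaped like Facebook page insights.
req = build_push_rows_request(PUSH_URL, [
    {"captured_at": "2020-01-01T00:00:00Z", "page_views": 120, "comments": 8},
])
# urllib.request.urlopen(req)  # would push the row into the dataset
```

You would pair this with a small job that polls the Facebook Graph API for fresh insight numbers and pushes each reading.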

How to create a workspace in an A1-core Power BI Embedded service in the Azure portal?

Over months of exploring Power BI, I started by successfully creating a workspace using a Power BI Pro license and ended up hosting a Power BI report embedded in my custom MVC site using the app-owns-data model.
The first problem I ran into was the maximum allowed number of embed tokens running out.
My company decided to create a dedicated A1-core Power BI Embedded service in an Azure account. I have now overcome the token count issue, but it seems strange that even though my Power BI Embedded service is paused, my embedding site still runs and accesses Power BI reports without any interruption.
Previously I created an Azure AD app using the embedding setup tool provided by Microsoft; I can see it in the Azure portal too.
How is it possible to view a Power BI report while my Azure Power BI Embedded service is paused? Am I supposed to be able to use those reports without getting billed?
Microsoft's documentation has limited information to clarify my doubts, and while the Power BI community site is somewhat helpful, I am still having trouble getting clarification.
Help required.
For your question:
"How is it possible to view a Power BI report while my Azure Power BI Embedded service is paused? Am I supposed to be able to use those reports without getting billed?"
If the A1 node is paused, then no, you will not be able to see your report or use the service in your front end; it has to be running to deliver reports in your custom front end. You can still go into the Power BI Service with an assigned Power BI Pro license and see your report. The workspace the report has been deployed to is flagged as 'embedded capacity', shown as a diamond shape next to it.
You allocate the workspace to a capacity by editing the workspace and selecting the 'Advanced' option, then 'Dedicated Capacity'.
The MS documentation outlines that pausing will stop content delivery:
"Pausing a capacity may prevent content from being available within Power BI. Make sure to unassign workspaces from your capacity before pausing to prevent interruption."
Pausing is designed to allow you to stop delivering content, for example outside business hours. I have a few clients that only run their internal and external reports from 7am to 7pm; for the other 12 hours the service is paused, which cuts the A SKU billing costs in half.
Hope that helps

Explore long-term data with Azure Application Insights and Power BI

I use Continuous Export to pull my Application Insights data into Power BI. However, all of the data seems to be either 7-day or 30-day; how would I be able to view a chart with a longer-term timeframe (e.g. users over the last year)?
Update
Here is what I see in Power BI:
From the screenshot, it looks like you're not using Continuous Export for this, but are instead using the Application Insights content pack for Power BI. The content pack has predefined views, which is what you circled in your screenshot.
You can easily create your own views using the almost-ready-to-release Application Insights REST API (tracked with this UserVoice suggestion).
If you want to try this with the API, please send me a note at dalek@microsoft.com and I'll set you up. In the API documentation I show step by step how you can create charts from metric data in Power BI.
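For reference, once you have API access, pulling a metric over a year-long window from the Application Insights REST API might look like the sketch below. The app ID, API key, and the users/count metric name are placeholders for illustration:

```python
import urllib.request

APP_ID = "<application-id>"  # placeholder: taken from the app's API Access blade
API_KEY = "<api-key>"        # placeholder

def build_metrics_request(app_id, api_key, metric="users/count",
                          timespan="P365D"):
    """GET one metric over a custom ISO-8601 timespan (P365D = last
    365 days), instead of the content pack's fixed 7/30-day views."""
    url = (f"https://api.applicationinsights.io/v1/apps/{app_id}"
           f"/metrics/{metric}?timespan={timespan}")
    return urllib.request.Request(url, headers={"x-api-key": api_key})

req = build_metrics_request(APP_ID, API_KEY)
# urllib.request.urlopen(req)  # would return the aggregated metric as JSON
```

The JSON response can then be loaded into Power BI as a custom data source, giving you full control over the charted timeframe.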
Thanks
Dale Koetke
