PowerBI.com Dataset Refresh for Azure Data Lake with OAuth2

Situation: I have a Power BI desktop workbook with a data source connection to Azure Data Lake storage. I can query the storage and get to the datasets without any problem. I've created my visuals and published the workbook to PowerBI.com.
Next: I want to schedule the refresh of the dataset so the reports hit my Azure Data Lake files and get updated. However, the OAuth2 popup and dialogue don't seem to work: after completing the login prompts I just get an endless spinning circle that never returns.
Additional things: I've followed several different sets of instructions to register a new app in my Azure AD, granting it the Power BI API permissions, assigning the app permissions to the storage, etc., but without success. I've tried both native and web/API apps. Instructions here:
https://powerbi.microsoft.com/en-us/documentation/powerbi-developer-authenticate-a-client-app/
I've tested various URLs for the Data Lake storage as my Power BI workbook source, including ADL://..., webhdfs://..., and https://.... All result in the same situation.
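For illustration, the source step of the M query behind the workbook looks roughly like this (a sketch only; the account name, folder, and file names are placeholders):

    let
        // Azure Data Lake Store (Gen1) source; "myaccount" is a placeholder
        Source = DataLake.Contents("adl://myaccount.azuredatalakestore.net/data"),
        // Pick one file from the folder listing and parse it as CSV
        File = Source{[Name = "sample.csv"]}[Content],
        Csv = Csv.Document(File, [Delimiter = ",", Encoding = 65001])
    in
        Csv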
I can confirm my Power BI logins are my Microsoft "work" accounts, as is my Azure Data Lake storage, all connected up correctly with Office 365 and Azure AD.
Questions: What am I missing? I wish there were even an error message to work from, but it gives me nothing. If this were a .NET app, supplying the client ID etc. would make sense, but PowerBI.com just asks for my login details.
Does the Power BI preview connector for Azure Data Lake even work once published to PowerBI.com? I can't find any example out there of anybody else doing this.
Many thanks for your time and assistance friends.

Related

Azure Synapse Link to connect Power Apps Dataverse to load Azure Storage Gen2

I am new to these technologies. In Power Apps there is an option to configure Azure Synapse Link to load data into Azure Storage Gen2. Once connected, every Dynamics CRM 365 table becomes visible under the new link I created, and the data is loaded into Azure Storage. However, the output is totally different from what I get with a normal CRM export to Excel. I want the data in storage to match what I see in CRM 365, including headers.
I need to export with the schema names (column names as headers in the CSV file). Also, the default partition is by month; I need to customize or remove that partition. Please advise how to handle this.

Azure Monitor Export to a SQL Server

I need the near real-time front end data from a web app for use in PowerBI. I need to keep this data forever.
I would like to automatically export the App customEvents and pageViews tables for this purpose.
It seems like I need to go from Azure Logs -> Azure Storage Account -> Azure SQL Server -> PowerBI
The steps I'm having trouble with are getting from Logs to storage, and then getting the data that lands there into a SQL server.
To send logs to Storage Accounts, Event Hubs, or Log Analytics, go to the App Service, select Diagnostic settings on the left panel, and click + Add diagnostic setting.
Select the log categories you want to export, choose the Storage account as the destination, and click Save.
You can now use Azure Data Factory service to copy the logs from Azure Storage account to Azure SQL Database.
Please refer to the Microsoft tutorial "Copy data from Azure Blob storage to a SQL Database by using the Copy Data tool" to implement this.
Once the data is available in the database, we can use Power BI to read it.
Open the Power BI dashboard and click on Get data from another source.
Select Azure -> Azure SQL Database and click on Connect.
Enter the server name.
In the next step, enter the username and password for your account and you will get access.
Now you can select data from any table and present it in the Power BI dashboard as your requirements dictate.
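Equivalently, the connection can be written as an M query in the Advanced Editor (a sketch; the server, database, and table names are placeholders):

    let
        // Placeholder server and database names
        Source = Sql.Database("yourserver.database.windows.net", "LogsDb"),
        // Navigate to the table that holds the copied log rows
        Events = Source{[Schema = "dbo", Item = "customEvents"]}[Data]
    in
        Events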

Importing Azure DevOps Work Item attachments to PBI

I am currently working on a project that involves using Azure DevOps as the main management tool. However, our team has decided to analyze the data generated in the platform through Power BI reports. As such, there are some attachments in the work items that we would like to see displayed in PBI. Is there any way to do this?
There are multiple ways to connect Power BI to Azure DevOps and get the work item data, such as by using Analytics views, OData queries, the OData feed, etc.
To create Power BI reports using an Analytics view, you first have to import the data into Power BI. This can be done by selecting Azure DevOps in the Online Services section of the Get Data tab in the Power BI Desktop app:
Get Data -> Online Services -> Azure DevOps
After that it will ask you for your organization and team, and will also ask you to log in to your DevOps account.
A popup will then emerge in which you can select the work item data based on the time period that has passed, such as today, last 30 days, entire history, etc.
Refer to the Analytics views documentation for details.
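Alternatively, the OData feed mentioned above can be queried directly from a blank query in Power BI Desktop (a sketch; the organization and project names, and the selected columns, are placeholders):

    let
        // Azure DevOps Analytics OData feed; "MyOrg" and "MyProject" are placeholders
        Source = OData.Feed(
            "https://analytics.dev.azure.com/MyOrg/MyProject/_odata/v3.0-preview/WorkItems?"
                & "$select=WorkItemId,WorkItemType,Title,State",
            null,
            [Implementation = "2.0"]
        )
    in
        Source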

Power BI - question about local and cloud data source

I am creating a report in Power BI, where some data is imported from a cloud storage system. There is also a local data source (an excel sheet) being used.
My question is, if I publish this report on Power BI service and share it with someone, will they be able to see visuals using local data source as well?
There is also the possibility of using SharePoint. I can create a team in SharePoint with the local Excel file and use that as a source in Power BI. Am I correct in assuming that this way people in my SharePoint team will be able to see all the data in the report?
For your scenario with a spreadsheet from a desktop and a cloud data source:
If you prepare the report using import mode in Power BI Desktop and publish it to the Power BI service, that report data will be visible to all users with access to the report in the provisioned workspace. The caveat is that the data cannot be refreshed from the Excel file once the report is deployed online. When you create the report on your desktop you have access to the cloud data and the spreadsheet, and a copy of that data is published to the Power BI service. When the Power BI service is set to refresh, it can't connect to your desktop, which causes the refresh to fail.
To solve this you need either a personal or a standard gateway. This provides the technology for Power BI to connect to your on-premises (standard gateway) or on-desktop (personal mode) data. Once the gateway is in place, Power BI can pull data into the cloud from an on-premises network or a personal desktop to refresh reports.
The other alternative is, as you mention, putting the excel in SharePoint online. This effectively makes the spreadsheet a "cloud data source" and can be refreshed from PowerBI service without the need for a gateway.
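Once the workbook lives in SharePoint, the source can be pointed at the site instead of a local path (a sketch; the site URL, file name, and sheet name are placeholders):

    let
        // Placeholder SharePoint site URL
        Source = SharePoint.Files("https://contoso.sharepoint.com/sites/MyTeam", [ApiVersion = 15]),
        // Pick the workbook by name and parse it
        Book = Excel.Workbook(Source{[Name = "Data.xlsx"]}[Content]),
        Sheet = Book{[Item = "Sheet1", Kind = "Sheet"]}[Data]
    in
        Sheet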

PowerBI - Using an Azure blob as a DataFlow data source

Hopefully someone can help. I have a requirement to create something like a "bring your own data model" feature so clients can upload their own files, use them in a Power BI dataflow, link them with other sources (databases, services, etc.), and create reports from them. The problem is I'm unable to create a dataflow from an Azure blob file.
The files are CSV format
I have premium feature
I first created and tested an M query in Power BI Desktop, and it works; I can access the data
What I do is copy my created M query into a blank query in the Power BI dataflow data source. The result is that I can preview the data, but when I try to save it, I get an error.
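For context, the working Desktop query looks along these lines (a sketch; the storage account, container, and file names are placeholders):

    let
        // Placeholder storage account URL
        Source = AzureStorage.Blobs("https://myaccount.blob.core.windows.net"),
        // Navigate to the container, then pick one uploaded CSV blob
        Container = Source{[Name = "mycontainer"]}[Data],
        File = Container{[Name = "clientdata.csv"]}[Content],
        // Parse the CSV and promote the first row to headers
        Csv = Csv.Document(File, [Delimiter = ",", QuoteStyle = QuoteStyle.Csv]),
        Promoted = Table.PromoteHeaders(Csv, [PromoteAllScalars = true])
    in
        Promoted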
I tried the steps on this link but no luck.
Any ideas?
