There are a lot of unanswered questions in this area, and it would be appreciated if someone took a dig at it.
I am trying to replicate the Velocity chart and Burnup chart from Azure DevOps in Power BI. Is it possible to get the data from Azure DevOps into Power BI Desktop through an Analytics view or the REST API?
I have referred to a few links but was unable to find a solution.
Is there a straightforward way to get the exact data or query used by Azure DevOps and connect that data to Power BI?
I am unable to calculate the planned and completed-late data correctly.
I am currently using the Work Items - Current data table.
Here are the links I referred to:
https://www.linkedin.com/pulse/burnup-charts-powerbi-ferry-van-der-vorst
https://www.linkedin.com/pulse/create-velocity-chart-powerbi-ferry-van-der-vorst
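For context, the kind of calculation I'm after is something like the following sketch. The dates and field layout here are made up for illustration; the real rows would come from the Work Items - Current table:

```python
from datetime import date

# Toy snapshot of work items: (created date, closed date or None if open).
work_items = [
    (date(2024, 1, 2), date(2024, 1, 5)),
    (date(2024, 1, 3), None),
    (date(2024, 1, 4), date(2024, 1, 8)),
]

def burnup(items, as_of):
    """Scope = items created on or before as_of; done = items closed by then."""
    scope = sum(1 for created, _ in items if created <= as_of)
    done = sum(1 for _, closed in items if closed is not None and closed <= as_of)
    return scope, done

print(burnup(work_items, date(2024, 1, 6)))  # (3, 1): 3 in scope, 1 closed
```

Plotting scope and done per day gives the two burnup lines; the gap between them is the remaining work.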
Thank you, and I appreciate your help in advance.
I am currently working on a project that uses Azure DevOps as the main management tool. However, our team has decided to analyze the data generated in the platform through Power BI reports. Some of the work items have attachments that we would like to see displayed in Power BI; is there any way to do this?
There are multiple ways to connect Power BI to Azure DevOps and get the work item data, such as an Analytics view, OData queries, and the OData feed.
To create Power BI reports using an Analytics view, you first have to import the data into Power BI. In Power BI Desktop, this is done by selecting Azure DevOps under the Online Services tab in the Get Data dialog:
Get Data -> Online Services -> Azure DevOps
It will then ask you for your organization and team, and prompt you to sign in to your Azure DevOps account.
After this, a dialog will appear in which you can select the work item data based on the time period, such as today, the last 30 days, or the entire history.
Refer to the following documentation.
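If you go the OData route instead, the same query can be tested outside Power BI before wiring it into a report. A minimal sketch, assuming placeholder organization/project names and a personal access token (the endpoint shape follows the Analytics OData service):

```python
import base64
import urllib.parse
import urllib.request

def odata_url(organization, project, select, filter_expr):
    # Analytics OData endpoint for work items; $select/$filter narrow the result.
    base = (f"https://analytics.dev.azure.com/{organization}/{project}"
            "/_odata/v3.0-preview/WorkItems")
    query = urllib.parse.urlencode({"$select": select, "$filter": filter_expr})
    return f"{base}?{query}"

url = odata_url("myorg", "myproject",
                "WorkItemId,Title,State",
                "WorkItemType eq 'User Story'")
# To actually fetch, authenticate with a personal access token (PAT):
# token = base64.b64encode(b":" + b"<PAT>").decode()
# req = urllib.request.Request(url, headers={"Authorization": "Basic " + token})
# rows = urllib.request.urlopen(req).read()
print(url)
```

The same URL can be pasted into Power BI's OData feed connector, so you can iterate on the filter in a browser or script first.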
I would like to understand whether it is possible to automate (or partially automate) the creation of a Logic App connector to Power BI.
Basically, I would like to automate, as far as possible, the connector as described in this tutorial:
https://learn.microsoft.com/en-us/azure/iot-central/retail/tutorial-in-store-analytics-export-data-visualize-insights
Power BI uses OAuth authentication, so I am not sure what the correct procedure is to automate the full process.
I have been looking at these resources but have not found anything related to Power BI:
https://www.bruttin.com/2017/06/13/deploy-logic-app-with-arm.html
https://github.com/logicappsio/LogicAppConnectionAuth
Thank you in advance
Marco
I started using the Azure Cost Management connector for Power BI. I could use it without an issue in Power BI Desktop and import data into my report. However, when I published the report, it gave me the following error:
My report is pretty basic. It only has Azure Cost Management Connector as a data source. As far as the visuals go, I only have a simple table; no transformations, no customizations whatsoever.
But the problem is, I don't have any column named 'budgetAmount' in any of my tables. I tried the "Edit Credentials" option in the Dataset section of the report in the Power BI service. It successfully logs me in but then throws this error. I'm only using Usage Details from the Azure Cost Management connector.
Has anyone else faced a similar situation?
02/24: This issue is fixed and rolled out; make sure to install the latest Power BI Desktop from here:
https://www.microsoft.com/en-us/download/details.aspx?id=58494
-------------------------------------------------------------------------------------------------------------------------------
The issue happens because there is no budget available under Cost Management -> Budgets. Until a fix is released, the workaround for now is to add a test budget in the Azure portal; the report should then start working.
Steps to set up a budget:
https://learn.microsoft.com/en-us/azure/cost-management/tutorial-acm-create-budgets
Yep, I have the same issue. It's because the Cost Management connector pulls data from a table called Budgets, which has an error. You can bypass it in Power BI Desktop, but not if you want to schedule a data refresh in a published Power BI report. You can see this by going to the Query Editor and selecting Source on the right-hand side, then selecting the Budgets row. I found no way to filter this row out before or while it reads data from the source.
Reference image:
I have a SQL database on Azure and I want to share some information with two friends, so I've created three views with that info.
The idea is to develop an easy solution to share a dashboard.
I was thinking to:
Make a Power Pivot model with those three views
Host the Power Pivot model on Azure (and try to schedule the refresh)
Develop an .xlsx file with a connection string pointing to my Power Pivot model on Azure
But the problem is, I think it is not possible to host my Power Pivot model on Azure and still be able to schedule the refresh.
Any suggestions?
Use Power BI! It's a complete suite Microsoft created for exactly this problem.
The site is: https://powerbi.microsoft.com/
You can subscribe, publish your Excel workbook, and schedule a refresh. You can find a tutorial at this site:
https://powerbi.microsoft.com/it-IT/documentation/powerbi-refresh-excel-file-onedrive/
I have created an Azure ML experiment which fetches data from an API and updates it in an Azure SQL database. My Power BI report picks data from this database and displays the report. The data from the source changes frequently, so I need something like a checkbox in Power BI which, when checked, triggers the Azure ML experiment and updates the database with the latest data.
I know that we could schedule it to run in an RStudio pipeline, but we are not considering that approach as it is not financially viable.
Thanks in Advance.
You could use a DirectQuery connection from Power BI to your Azure SQL instance. Then the reports in Power BI will always be up to date with the latest data you have. The only remaining question is when to trigger the ML experiment. If this really needs to be on demand (rather than on a schedule), you could do that with a button in your own app. You could embed the report in your app so that you get an end-to-end update.
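As a sketch of what that button handler might do, assuming the experiment is published as a classic Azure ML web service: the endpoint URL and API key below are placeholders, and the payload shape follows the classic request/response convention of named inputs with column names and value rows.

```python
import json
import urllib.request

def build_request(endpoint, api_key, rows):
    # Classic Azure ML web services take a JSON body of named inputs,
    # each with ColumnNames and Values, plus a GlobalParameters object.
    payload = {
        "Inputs": {"input1": {"ColumnNames": ["id"], "Values": rows}},
        "GlobalParameters": {},
    }
    body = json.dumps(payload).encode("utf-8")
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + api_key,  # API key from the service page
    }
    return urllib.request.Request(endpoint, data=body, headers=headers)

req = build_request("https://example.com/score", "<API_KEY>", [["1"]])
# urllib.request.urlopen(req)  # fires the experiment, which writes to Azure SQL
print(req.get_full_url())
```

With DirectQuery on the report, the new rows show up as soon as the experiment finishes writing to the database.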
You could have a look at Azure Data Factory (ADF), which will help you build data pipelines in the cloud.
You can use ADF to read the data from the API (refreshing your data), batch-score it in Azure Machine Learning, and push it directly to your Azure SQL database, so that Power BI always sees the latest scored data.
Take a look at the following blog, where they take data through this kind of pipeline. You just have to change it so that the data doesn't come from Stream Analytics but from your API.
http://blogs.msdn.com/b/data_insights_global_practice/archive/2015/09/16/event-hubs-stream-analytics-azureml-powerbi-end-to-end-demo-part-i-data-ingestion-and-preparation.aspx