How to get data from Node-RED/Octoblu to Power BI (Azure)

I'm trying to get a live dashboard of the state of my gates (ON, OFF). The JSON format of my payload is:

    "msg": {
        "time_on": 1437773972742,
        "time_off": 1437773974231
    }
Does anyone have experience with sending the states to Power BI without using Azure Stream Analytics or Event Hubs?
Edit: I'm trying to send two JSON packages from Node-RED to Power BI to get live updates on my dashboard.

If you want to use Stream Analytics, you will need to flatten the properties with SELECT msg.time_on, msg.time_off FROM Input.
If you don't want to use Stream Analytics, you will either need to push the data to one of the sources that Power BI can periodically pull from, such as SQL Azure (note: this will not be real time), or integrate with the Power BI push API, starting from the resources here: http://dev.powerbi.com.
Ziv.

I'm not familiar with Node-RED either, but there are some pretty good samples here: https://github.com/PowerBI. You can also use our API Console (http://docs.powerbi.apiary.io/) to play with the API. The console can generate code for you in common languages like JavaScript, Ruby, Python, C#, etc.
Look at Create Dataset:
http://docs.powerbi.apiary.io/#reference/datasets/datasets-collection/create-a-dataset
and add rows to a table:
http://docs.powerbi.apiary.io/#reference/datasets/table-rows/add-rows-to-a-table-in-a-dataset
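Putting those two calls together, here is a minimal Python sketch of the push API flow. It is a sketch under assumptions: acquiring the OAuth access token is out of scope, and the dataset, table and column names ("GateEvents", "Events") are illustrative placeholders, not anything prescribed by the API.

    # Sketch: create a Power BI push dataset, then push rows into it.
    # Assumes you already hold an OAuth access token for Power BI; the
    # dataset and table names below are placeholders.
    import requests

    ACCESS_TOKEN = "<your OAuth access token>"
    BASE = "https://api.powerbi.com/v1.0/myorg"
    HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

    # 1. Create a push dataset with a single table.
    dataset = {
        "name": "GateEvents",
        "tables": [{
            "name": "Events",
            "columns": [
                {"name": "time_on", "dataType": "Int64"},
                {"name": "time_off", "dataType": "Int64"},
            ],
        }],
    }
    resp = requests.post(f"{BASE}/datasets", json=dataset, headers=HEADERS)
    dataset_id = resp.json()["id"]

    # 2. Push rows as they arrive (e.g. forwarded out of Node-RED).
    rows = {"rows": [{"time_on": 1437773972742, "time_off": 1437773974231}]}
    requests.post(f"{BASE}/datasets/{dataset_id}/tables/Events/rows",
                  json=rows, headers=HEADERS)

Visuals built on a push dataset like this update on the dashboard as new rows arrive, which is what the live gate-state scenario needs.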
HTH, Lukasz

Related

How can I decide, if I should use the Power BI API to push data into my streaming dataset or Azure Stream Analytics?

I am very new to Azure. I need to create a Power BI dashboard to visualize some data produced by a sensor, and the dashboard needs to update in "almost" real time. I have identified that I need a push dataset, as I want to visualize some historic data on a line chart. From an architecture point of view, however, I could use the Power BI REST APIs (which would be completely fine in my case, as we process the data with a Python app and I could have it call Power BI) or Azure Stream Analytics (which could also work: I could dump the data to Azure Blob storage from the Python app and then stream it).
Generally speaking, what are the advantages and disadvantages of the two approaches?
Azure Stream Analytics lets you define multiple sources and multiple targets, and Power BI and Blob storage can both be targets at the same time. You can also apply windowing functions to the data as it comes in, and it gives you a visual way of managing your pipeline, including those windowing functions.
In your case you are effectively replicating the incoming data, first to Blob storage and then to Power BI. But if you have a use case for applying a windowing function (one minute or so) as data comes in from multiple sources, e.g. more than one sensor, or a sensor plus another source, you would have to fiddle around a lot to get that working manually, whereas Stream Analytics makes it easy (a sketch of the manual version follows).
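To make "windowing" concrete, here is a minimal Python sketch of the kind of one-minute tumbling-window grouping you would otherwise hand-roll; Stream Analytics expresses the same idea declaratively with GROUP BY TumblingWindow(minute, 1). The event shape here is an assumption for illustration.

    # Sketch: hand-rolled 1-minute tumbling window over (timestamp_ms, value)
    # events - the sort of logic Stream Analytics provides declaratively.
    from collections import defaultdict

    def tumbling_window(events, window_ms=60_000):
        """Group events into fixed, non-overlapping 1-minute buckets."""
        buckets = defaultdict(list)
        for ts_ms, value in events:
            buckets[ts_ms // window_ms].append(value)
        # Key each bucket by its window start time in ms.
        return {k * window_ms: vals for k, vals in buckets.items()}

    events = [(1437773972742, "ON"), (1437773974231, "OFF")]
    print(tumbling_window(events))

And this is only the easy part: handling late or out-of-order events across several live sources is where the manual approach really starts to hurt.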
The following article highlights some of the pros and cons of using Azure Stream Analytics for IoT applications:
https://www.axonize.com/blog/iot-technology/the-advantages-and-disadvantages-of-using-azure-stream-analytics-for-iot-applications/
If possible, I would recommend streaming the data to IoT Hub first; ASA can then pick it up and render it in Power BI. That will give you better latency than streaming data from Blob to ASA and then to Power BI. It is the recommended IoT pattern for remote monitoring, predictive maintenance etc., and gives you longer-term options to add a lot of logic to the real-time pipelines (ML scoring, windowing, custom code etc.). A sketch of the device side is below.
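Since the question mentions a Python app, here is a minimal sketch of that device side, sending readings to IoT Hub with the azure-iot-device package. The connection string and payload fields are placeholders, not part of any prescribed schema.

    # Sketch: send a sensor reading to Azure IoT Hub; Stream Analytics can
    # then read from the hub's built-in endpoint and feed Power BI.
    # Requires the azure-iot-device package; the connection string is a
    # placeholder for your device's own.
    import json
    import time
    from azure.iot.device import IoTHubDeviceClient, Message

    CONN_STR = "HostName=<hub>.azure-devices.net;DeviceId=<id>;SharedAccessKey=<key>"

    client = IoTHubDeviceClient.create_from_connection_string(CONN_STR)
    client.connect()

    reading = {"sensor": "sensor-1", "value": 42.0, "ts": int(time.time() * 1000)}
    msg = Message(json.dumps(reading))
    msg.content_type = "application/json"
    msg.content_encoding = "utf-8"
    client.send_message(msg)

    client.shutdown()  # or client.disconnect() on older SDK versions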

Snowflake Connection & Setup

I'm working with a client to manage an integration between their closed-system CRM, their email platform (called MYGuestList) and their BI reporting platform (Tableau).
The CRM is going to push a data replication to a SQL server (Microsoft Azure). We're interested in bringing Snowflake into the tech stack to house all our data points (Google Analytics, email marketing efforts, media etc.), to produce a "single customer view" and manage our integrations.
I'm a bit lost as to what we do next (I can't seem to reach out to Snowflake for any sort of support) and would love any guidance on the following questions:
How do I go about connecting our Azure server (once set up) to Snowflake?
How might I suggest our third party developers look to integrate with our data points in Snowflake?
Do I need to purchase a third party connector (FiveTran/Stitch) to ETL data from Google Analytics?
Thank you in advance for your help; I'm a very big newbie trying to solve this problem for my client!
Kate
You have these three options to load data (a minimal sketch of the first follows the list):
Use an ETL tool, or your own app (e.g. in Python), to get data from the source system and insert it into Snowflake (directly, or by PUTting a file to a Snowflake stage)
Extract the data as CSV to a cloud directory or a Snowflake stage, and use a bulk load (COPY INTO) to load it
Use a PIPE to automatically load data from files landed as in the second option
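As a rough illustration of the first option, here is a Python sketch using the snowflake-connector-python package. The account, credential and table names are all placeholders; the PUT/COPY INTO pair is the same pattern the stage-based options automate.

    # Sketch: stage a local CSV in Snowflake and bulk load it.
    # All connection values and the CRM_CONTACTS table are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="<account_identifier>",
        user="<user>",
        password="<password>",
        warehouse="<warehouse>",
        database="<database>",
        schema="<schema>",
    )
    cur = conn.cursor()

    # Upload the file to the table's stage, then bulk load it.
    cur.execute("PUT file:///tmp/crm_extract.csv @%CRM_CONTACTS")
    cur.execute("COPY INTO CRM_CONTACTS "
                "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)")

    cur.close()
    conn.close()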

Visualisation of Web Data through API Calls

I am using a web server service where I am able to get real-time data using REST API calls. Now I want to collect the data, store it somehow, and then visualise it in a nice way (basically, produce graphs). My approach would be to store it in a database and then use Power BI's built-in "Get Data" feature with a SQL Server database as the source. I have no idea if this is the correct approach. Can anyone advise?
Hello and welcome to Stack Overflow!
I agree with Andrey's comment above. But if you want to know about all the Data sources that PowerBI supports connecting to, please check the following resources:
Data sources in Power BI Desktop
Power BI Data Source Prerequisites
Connect to a web page from Power BI Desktop
Real-time streaming in Power BI
Additionally, you may also go through Microsoft Power BI Guided Learning to understand the next steps for visualization.
Hope this helps!
There's another approach, which is to build a custom Power BI data connector for that API.
That lets you fetch the data inside Power BI and build the visuals. Alternatively, you can store the data in Excel files or SQL (you can use Python scripts for this, as sketched below) and schedule refreshes on the service.
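For the store-then-"Get Data" route, here is a minimal Python sketch that polls the API and appends rows to a SQL table. The endpoint, schema and SQLite database are placeholders; in production you would point this at the SQL Server database Power BI reads from.

    # Sketch: poll a REST API and append the readings to a SQL table that
    # Power BI can later load via "Get Data". SQLite stands in here for the
    # real SQL Server database; the URL and columns are placeholders.
    import sqlite3
    import requests

    API_URL = "https://example.com/api/readings"  # placeholder endpoint

    # Assumes the API returns a list of {"ts": ..., "value": ...} objects.
    rows = requests.get(API_URL, timeout=30).json()

    conn = sqlite3.connect("readings.db")
    conn.execute("CREATE TABLE IF NOT EXISTS readings (ts INTEGER, value REAL)")
    conn.executemany(
        "INSERT INTO readings (ts, value) VALUES (:ts, :value)", rows)
    conn.commit()
    conn.close()

Run it on a schedule (cron, an Azure Functions timer etc.) and Power BI's scheduled refresh will pick up the new rows.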

Application Insights -> export -> Power BI data warehouse architecture

Our team has recently started using Application Insights to add telemetry to our Windows desktop application. This data is sent almost exclusively in the form of events (rather than page views etc.). Application Insights is useful only up to a point; to answer anything other than basic questions, we are exporting to Azure Storage and then using Power BI.
My question is one of data structure. We are new to analytics in general and have just been reading about star/snowflake structures for data warehousing. This looks like it might help in providing the answers we need.
My question is quite simple: Is this the right approach? Have we overcomplicated things? My current feeling is that a better approach would be to pull the latest data and transform it into a SQL database of facts and dimensions for Power BI to query. Does this make sense? Is this what other people are doing? We have realised that this is more work than we initially thought.
Definitely pursue Michael Milirud's answer, if your source product has suitable analytics you might not need a data warehouse.
Traditionally, a data warehouse has three advantages: it integrates information from different data sources, both internal and external; data is cleansed and standardised across sources; and the history of change over time ensures that data is available in its historic context.
What you are describing is becoming a very common case in data warehousing, where star schemas are created for access by tools like PowerBI, Qlik or Tableau. In smaller scenarios the entire warehouse might be held in the PowerBI data engine, but larger data might need pass through queries.
In your scenario, you might be interested in some tools that appear to handle at least some of the migration of Application Insights data:
https://sesitai.codeplex.com/
https://github.com/Azure/azure-content/blob/master/articles/application-insights/app-insights-code-sample-export-telemetry-sql-database.md
Our product Ajilius automates the development of star schema data warehouses, speeding the development time to days or weeks. There are a number of other products doing a similar job, we maintain a complete list of industry competitors to help you choose.
I would continue with Power BI; it actually has a very sophisticated and powerful data integration and modeling engine built in. Historically I've worked with SQL Server Integration Services and Analysis Services for these tasks, and Power BI Desktop is superior in many respects. The design approaches remain consistent (star schemas etc.), but you build them in-memory within Power BI. It's far more flexible and agile.
Also, are you aware that Application Insights can be connected directly to the Power BI web service? That connects to your Application Insights data in minutes and gives you Power BI content ready to use (dashboards, reports, datasets). You can customize these and build new reports from the datasets.
https://powerbi.microsoft.com/en-us/documentation/powerbi-content-pack-application-insights/
What we ended up doing was not sending events from our WinForms app directly to Application Insights, but to an Azure Event Hub (a sketch of the sending side follows below).
We then created a job that reads from the Event Hub and sends the data to:
Application Insights, using the SDK
Blob storage, for later processing
Azure Table storage, to create Power BI reports
You can of course add more destinations.
So basically all events are sent to one destination and from there stored in many destinations, each for its own purpose. We definitely did not want to be restricted to 7 days of raw data; storage is cheap, and Blob storage can be consumed by many of the analytics solutions from Azure and Microsoft.
The Event Hub can be linked to Stream Analytics as well.
More information about Event Hubs can be found at https://azure.microsoft.com/en-us/documentation/articles/event-hubs-csharp-ephcs-getstarted/
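Here is a minimal Python sketch of that sending side, using the azure-eventhub package (the original app was C#/WinForms, so take this as an illustration of the pattern; the connection string and hub name are placeholders).

    # Sketch: publish an app event to an Azure Event Hub for a downstream
    # job to fan out to Application Insights, Blob and Table storage.
    # Connection string and hub name are placeholders.
    import json
    from azure.eventhub import EventData, EventHubProducerClient

    CONN_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<name>;SharedAccessKey=<key>"

    producer = EventHubProducerClient.from_connection_string(
        CONN_STR, eventhub_name="app-events")

    event = {"name": "GateOpened", "ts": 1437773972742}
    batch = producer.create_batch()
    batch.add(EventData(json.dumps(event)))
    producer.send_batch(batch)
    producer.close()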
You can start using the recently released Application Insights Analytics feature. In Application Insights we now let you write any query you would like, so you can get more insight out of your data. Analytics runs your queries in seconds, lets you filter, join and group by any property, and you can also run these queries from Power BI.
More information can be found at https://azure.microsoft.com/en-us/documentation/articles/app-insights-analytics/

How do you make a connection to Power BI through an application's API

I have an Azure SQL Database and have made a direct connection from Power BI to it. The problem is that, to successfully import the data, I had to open direct access to the data through the database firewall, which I cannot allow.
Is there a way to use my application's API as the data source for Power BI rather than SQL?
You cannot do that.
Most of the tools that represent, cache or plot data work with industry-standard adapters (SQL, Mongo, Hadoop, etc.). There are a variety of reasons for that.
Some simpler tools might exist where you can push data for representation, but that kills the power of things like Power BI, Periscope or ChartIO.
Now, why not grant Power BI access to your database?
One option I would suggest is to write a small piece of code that gets the necessary data (either through your API or directly from the DB) and pushes it to Power BI through their REST API, as sketched below.
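A minimal Python sketch of that option, building on the dataset-creation sketch earlier on this page; the URLs, dataset ID, table name and token are placeholders:

    # Sketch: pull rows from your application's API, then push them into an
    # existing Power BI push dataset. All identifiers are placeholders.
    import requests

    rows = requests.get("https://myapp.example.com/api/measurements",
                        timeout=30).json()

    requests.post(
        "https://api.powerbi.com/v1.0/myorg/datasets/<dataset_id>"
        "/tables/Measurements/rows",
        json={"rows": rows},
        headers={"Authorization": "Bearer <access token>"},
    )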
You can also query an API via Power BI; please see my answer to a similar question.
If you can, I would recommend using OData, as Power BI plays well with it:
https://powerbi.microsoft.com/en-us/documentation/powerbi-desktop-tutorial-analyzing-sales-data-from-excel-and-an-odata-feed/
