I am trying to find a way to connect to Power BI, but couldn't find any resources on how to do so, e.g. the way we can connect to MSSQL, Postgres, etc.
Thanks.
You can run Python scripts directly within Power BI, which gives them access to the data sets as inputs and lets them output results back into the data model.
Here is a link to the official documentation for the feature.
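For a rough picture of what this looks like, here's a minimal sketch of a "Run Python script" transformation step (the column names are made up for illustration):

```python
# In a "Run Python script" step, Power BI passes the current query's
# table in as a pandas DataFrame named `dataset`.
# The columns below ('units', 'unit_price') are hypothetical.
dataset['revenue'] = dataset['units'] * dataset['unit_price']

# Any DataFrame defined in the script (e.g. `result`) is offered back
# to the data model as a table.
result = dataset
```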
You can also use the Python visual. This doesn't feed into the data model; instead the script generates a visualisation which is then displayed directly.
Documentation on the Python visualisation is here
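Again just as a sketch (the field names are invented), a Python visual script typically looks like this:

```python
# In a Python visual, the fields dragged into the Values well arrive as
# a pandas DataFrame named `dataset`, and whatever matplotlib draws is
# rendered as the visual. 'date' and 'downloads' are hypothetical fields.
import matplotlib.pyplot as plt

plt.plot(dataset['date'], dataset['downloads'])
plt.xlabel('date')
plt.ylabel('downloads')
plt.show()
```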
I'm developing a time series model to analyze the download traffic inside my organization. Now I'm trying to find a way to run this code automatically every day and create alerts whenever I find anomalies (high download volumes), so that it's not necessary to do it manually. I'd also like to create a dashboard or an easy way to visualize the plots I'm getting in this case.
It'd be something similar to workbooks but with a deeper analysis.
Thanks!
We currently use SPSS Modeler for our analytics and output to excel files for reporting. We automate the running of modeler streams with PS Clementine.
We do have access to SQL server tables within Modeler via ODBC connections.
What I need to do is automate sending some of the outputs created to an SFTP server daily (FileZilla). The outputs currently sit in a OneDrive location.
Ideally I'd like to be able to do some checks on the file, e.g. how many rows of data it holds. Depending on whether the checks pass or fail, I'd then like to email a distribution list either to advise them to investigate or to confirm the file has been transferred to the SFTP server successfully.
I've done this before using a combination of SAS Cloud / Hadoop / SAS on-prem / Globalscape.
Is there a solution that suits SPSS modeler/ PS Clementine?
I've searched the forum but haven't found a relevant solution for my setup, so any help would be very much appreciated.
I don't think this is possible out-of-the-box.
However, I think the following workaround should be possible (assuming SPSS Modeler has enough privileges in your environment): you could use either a Python for Spark export node or the SPSS Modeler Scripting API to check the data, and use Python for the SFTP transfer.
We use Python to extend the functionality of SPSS Modeler all the time, with great success.
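To make the idea concrete, here's a minimal sketch of what the check/transfer/notify step could look like in Python. The hosts, paths, credentials and thresholds are all placeholders, and it assumes the paramiko package is available to the Python environment Modeler uses:

```python
import csv
import smtplib
from email.message import EmailMessage

import paramiko

LOCAL_FILE = r'C:\OneDrive\exports\daily_output.csv'  # placeholder path
MIN_ROWS = 100                                        # example threshold

def row_count(path):
    # Count data rows in the export, excluding the header row
    with open(path, newline='') as f:
        return sum(1 for _ in csv.reader(f)) - 1

def upload_sftp(path):
    # Push the file to the SFTP server (host/credentials are placeholders)
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect('sftp.example.com', username='svc_user', password='...')
    sftp = ssh.open_sftp()
    sftp.put(path, '/inbound/daily_output.csv')
    sftp.close()
    ssh.close()

def notify(subject, body):
    # Email the distribution list (addresses/server are placeholders)
    msg = EmailMessage()
    msg['From'] = 'analytics@example.com'
    msg['To'] = 'dist-list@example.com'
    msg['Subject'] = subject
    msg.set_content(body)
    with smtplib.SMTP('smtp.example.com') as s:
        s.send_message(msg)

rows = row_count(LOCAL_FILE)
if rows >= MIN_ROWS:
    upload_sftp(LOCAL_FILE)
    notify('Transfer OK', f'File transferred to SFTP ({rows} rows).')
else:
    notify('Transfer check failed', f'Only {rows} rows - please investigate.')
```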
Recently Microsoft published the Microsoft Search API (beta), which makes it possible to index external systems by creating an MS Graph search custom connector.
I created such a connector, and it has been successful so far. I also pushed a few items to the index, and in the MS admin center I created a result type and a vertical. Now I'm able to find the external items in question in the SharePoint Online modern search center, in a dedicated tab belonging to the search vertical created earlier. So far so good.
But now I wonder:
How can I achieve that the external data is continuously pushed to the MS Search Index? (How can this be implemented? Is there any tutorial or a sample project? What is the underlying architecture?)
Is there a concept of Full / Incremental / Continuous Crawls for a Search Custom Connector at all? If so, how can I "hook" into a crawl in order to update changed data to the index?
Or do I have to implement it all on my own? And if so, what would be a suitable approach?
Thank you for trying out the connector APIs. I am glad to hear that you are able to get items into the index and see the results.
Regarding your questions, the logic for determining when to push items, and your crawl strategy is something that you need to implement on your own. There is no one best strategy per se, and it will depend on your data source and the type of access you have to that data. For example, do you get notifications every time the data changes? If not, how do you determine what data has changed? If none of that is possible, you might need to do a periodic full recrawl, but you will need to consider the size of your data set for ingestion.
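For illustration, pushing a single changed item looks roughly like this (a sketch against the beta Graph external items endpoint; the token, connection id, ACL group and property names are placeholders and must match the schema you registered):

```python
import requests

GRAPH = 'https://graph.microsoft.com/beta'
CONNECTION_ID = 'mycustomconnector'   # placeholder connection id
TOKEN = '<app-only access token>'     # e.g. acquired via client credentials

item_id = 'item-001'
payload = {
    'acl': [
        # Grant access to an AAD group (the group id is a placeholder)
        {'type': 'group', 'value': '<aad-group-id>', 'accessType': 'grant'}
    ],
    'properties': {
        # These fields must match the schema registered for the connection
        'title': 'Updated record title',
        'url': 'https://source.example.com/records/item-001'
    },
    'content': {'value': 'Full text to index...', 'type': 'text'}
}

# PUT creates or updates the external item in the search index
resp = requests.put(
    f'{GRAPH}/external/connections/{CONNECTION_ID}/items/{item_id}',
    headers={'Authorization': f'Bearer {TOKEN}'},
    json=payload,
)
resp.raise_for_status()
```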
We will look into ways to reduce the amount of code you have to write in the future, but right now, this is something you have to implement on your own.
-James
I recently implemented incremental crawling for Graph connectors using Azure Functions. I created a timer-triggered function that fetches the items updated in the data source since the time of the last function run and then updates the search index with those items.
I also wrote a blog post about this approach, using a SharePoint list as the data source. The entire source code can be found at https://github.com/aakashbhardwaj619/function-search-connector-crawler. Hope it is useful.
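For context, the rough shape of such a timer-triggered function in Python is below; the two helpers are placeholders for your own data-source query and the Graph push call:

```python
import datetime
import logging

import azure.functions as func

def fetch_items_updated_since(since):
    """Placeholder: query your data source for items changed after `since`."""
    return []

def push_to_search_index(item):
    """Placeholder: PUT the item to the Graph external items endpoint."""

def main(mytimer: func.TimerRequest) -> None:
    # Window since the previous run; in practice, persist the last crawl
    # timestamp (e.g. in Blob storage) instead of assuming the schedule.
    since = datetime.datetime.utcnow() - datetime.timedelta(minutes=15)
    items = fetch_items_updated_since(since)
    for item in items:
        push_to_search_index(item)
    logging.info('Pushed %d updated items to the search index', len(items))
```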
I am using a web server service where I am able to get real-time data using REST API calls. Now I want to be able to collect the data, store it somehow, and then visualise it in a nice way (basically, produce graphs). My approach would be to store it in a database and then use Power BI's internal "Get Data" feature with an "SQL Server Database" source. No idea if this is the correct approach. Can anyone advise?
Hello and welcome to Stack Overflow!
I agree with Andrey's comment above. But if you want to know about all the data sources that Power BI supports connecting to, please check the following resources:
Data sources in Power BI Desktop
Power BI Data Source Prerequisites
Connect to a web page from Power BI Desktop
Real-time streaming in Power BI
Additionally, you may also go through Microsoft Power BI Guided Learning to understand the next steps for visualization.
Hope this helps!
There's another approach, which is to build a custom data connector for that API in Power BI.
That allows you to fetch the data inside Power BI and build the visuals. You can also store it in Excel files or SQL (you can use Python scripts for this), and you can schedule refreshes on the service.
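As a sketch of the "store it in SQL" part (the URL, credentials and table name are placeholders, and it assumes the pyodbc driver is installed):

```python
import pandas as pd
import requests
from sqlalchemy import create_engine

# Pull the latest data from the REST API (hypothetical endpoint)
data = requests.get('https://api.example.com/metrics').json()
df = pd.json_normalize(data)

# Land the rows in SQL Server, which Power BI can then read via Get Data
engine = create_engine(
    'mssql+pyodbc://user:password@myserver/mydb'
    '?driver=ODBC+Driver+17+for+SQL+Server'
)
df.to_sql('api_metrics', engine, if_exists='append', index=False)
```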
I'm trying to get a live dashboard of the state of my gates (ON, OFF).
The JSON format of my payload is:
"msg": {
"time_on": 1437773972742,
"time_off": 1437773974231,
}
Does anyone have experience on how to send the states to power bi without using Azure Stream Analytics or Event Hub?
Edit:
I'm trying to send two JSON packages from Node-RED to Power BI to get live updates on my dashboard.
If you want to use Stream Analytics, you will need to flatten the properties by doing SELECT msg.time_on, msg.time_off FROM Input.
If you don't want to use Stream Analytics, you will either need to push the data to one of the sources that Power BI can periodically pull from, such as SQL Azure (note: this will not be real time), or integrate with the Power BI push API by going through the resources here: http://dev.powerbi.com.
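To illustrate the push-API route, here's a minimal sketch of flattening the msg payload and posting it as a row (the dataset id, table name and token are placeholders; the endpoint follows the add-rows operation in the resources above):

```python
import requests

TOKEN = '<azure-ad access token>'   # placeholder
DATASET_ID = '<dataset id>'         # placeholder

# Flatten the nested msg object into a single row
msg = {'time_on': 1437773972742, 'time_off': 1437773974231}
row = {'time_on': msg['time_on'], 'time_off': msg['time_off']}

# POST the row to the push dataset's table ('GateStates' is a placeholder)
resp = requests.post(
    f'https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}'
    '/tables/GateStates/rows',
    headers={'Authorization': f'Bearer {TOKEN}'},
    json={'rows': [row]},
)
resp.raise_for_status()
```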
Ziv.
I'm not aware of Node-RED either, but there are pretty good samples here: https://github.com/PowerBI. You can also use our API Console (http://docs.powerbi.apiary.io/) to play with the API. The console can generate code for you in common languages like JavaScript, Ruby, Python, C#, etc.
Look at Create Dataset:
http://docs.powerbi.apiary.io/#reference/datasets/datasets-collection/create-a-dataset
and add rows to a table:
http://docs.powerbi.apiary.io/#reference/datasets/table-rows/add-rows-to-a-table-in-a-dataset
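To tie this to the payload above, a sketch of the Create Dataset call (the names and token are placeholders; this uses the current REST endpoint corresponding to the apiary operation linked above):

```python
import requests

TOKEN = '<azure-ad access token>'   # placeholder

# Define the table schema once; afterwards you only add rows to it
dataset_def = {
    'name': 'GateMonitoring',        # placeholder dataset name
    'tables': [{
        'name': 'GateStates',        # placeholder table name
        'columns': [
            {'name': 'time_on', 'dataType': 'Int64'},
            {'name': 'time_off', 'dataType': 'Int64'},
        ],
    }],
}

resp = requests.post(
    'https://api.powerbi.com/v1.0/myorg/datasets',
    headers={'Authorization': f'Bearer {TOKEN}'},
    json=dataset_def,
)
resp.raise_for_status()
print(resp.json()['id'])  # dataset id to use when adding rows
```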
HTH, Lukasz