Run SQL user query within notebook - Databricks

We would like to schedule SQL queries to run and then have the results emailed to a list of addresses that are not Databricks users. Most BI tools have this feature.
You can schedule dashboard refreshes and have them emailed to Databricks users, but this does not work for us for two reasons:
We would like to email external users.
Only a JPG image of the dashboard is provided; we want the results as CSV.
Is it possible to access the saved SQL queries from a notebook, run them with the required parameters, and then export the results to a CSV to be emailed out?
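This isn't a built-in feature, but it can be scripted. Below is a minimal sketch in Python, assuming the databricks-sql-connector package, an SMTP relay, and hypothetical host, warehouse, query, and address values (the saved query text is inlined here; it could also be fetched via the SQL Queries REST API):

import csv, io, os, smtplib
from email.message import EmailMessage

from databricks import sql  # pip install databricks-sql-connector

# Hypothetical saved query, inlined; the parameter marker style depends on connector version
QUERY = "SELECT * FROM sales.daily_totals WHERE region = %(region)s"

with sql.connect(
    server_hostname=os.environ["DATABRICKS_HOST"],   # e.g. adb-123.azuredatabricks.net
    http_path=os.environ["DATABRICKS_HTTP_PATH"],    # SQL warehouse HTTP path
    access_token=os.environ["DATABRICKS_TOKEN"],
) as conn, conn.cursor() as cur:
    cur.execute(QUERY, {"region": "EMEA"})           # run with the required parameters
    headers = [c[0] for c in cur.description]
    rows = cur.fetchall()

# Write the result set to a CSV in memory
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(headers)
writer.writerows(rows)

# Email the CSV to recipients who are not Databricks users
msg = EmailMessage()
msg["Subject"] = "Scheduled query results"
msg["From"] = "reports@example.com"
msg["To"] = "client@example.com"
msg.set_content("Attached: the latest query results as CSV.")
msg.add_attachment(buf.getvalue().encode(), maintype="text",
                   subtype="csv", filename="results.csv")

with smtplib.SMTP("smtp.example.com", 587) as smtp:  # your mail relay
    smtp.starttls()
    smtp.login("reports@example.com", os.environ["SMTP_PASSWORD"])
    smtp.send_message(msg)

Scheduled as a Databricks job (or any cron), this covers both requirements: external recipients and CSV output rather than a dashboard image.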

Related

Azure Database -> Excel Query

I have a query being executed on an Azure server periodically, and I need to add some code to it so it can save some data from tables/views to an Excel file during execution.
I have implemented code like this on other (non-Azure) databases, but executing the same code in Azure gives me messages saying Azure doesn't support some of the tools I used.
What should I use to do this? I just need to save some table data to specific sheets in Excel.
Thanks in advance!
If the requirement is specific to Excel file creation, you can use a Logic App to query the Azure SQL database and generate an Excel file, based on the link below:
https://community.dynamics.com/ax/b/d365fortechies/posts/logic-app-for-azure-sql-db-to-azure-file-storage-workflow
Note: you can select Excel file generation for the Logic App rather than the CSV shown in the example above, or generate a CSV file and then convert it to Excel.
Since OPENDATASOURCE is not supported in Azure SQL, you can also use other ETL tools to save data from tables/views to Excel.
One such tool is Azure Data Factory:
Using the Copy activity in Azure Data Factory, you can query a table, execute your SQL query, or execute a stored procedure, and then convert the output to an Excel file. There are multiple destinations to choose from for storing this Excel file, in the cloud or on a local server.
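If a small script is an acceptable alternative to the Logic App or ADF route, here is a minimal sketch of the same idea in Python, assuming pyodbc plus pandas/openpyxl and hypothetical server and table names, writing each table/view to its own Excel sheet:

import pandas as pd
import pyodbc  # pip install pyodbc pandas openpyxl

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;"        # hypothetical server
    "DATABASE=mydb;UID=myuser;PWD=mypassword"
)

# One sheet per table/view
with pd.ExcelWriter("export.xlsx", engine="openpyxl") as writer:
    for name in ("dbo.Orders", "dbo.InventoryView"):   # hypothetical objects
        df = pd.read_sql(f"SELECT * FROM {name}", conn)
        df.to_excel(writer, sheet_name=name.split(".")[-1], index=False)

conn.close()

Unlike OPENDATASOURCE, this runs client-side, so it works against Azure SQL unchanged; the script just needs somewhere to run on a schedule.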

Data entry for power bi report without additional direct query database

Goal:
Integrate Power Apps with Power BI (and vice versa) without setting up a SQL database for DirectQuery, to allow data entry into internal tables of a Power BI report.
The database we are reading from cannot be used for data entry purposes.
E.g. there is a business process whereby a client must track the location of each asset. This is done by creating a table with a distinct ID for each asset, then applying a data-entry drop-down so users can update the location of each asset, which is reflected in the report visuals.
Resources:
https://www.google.com/search?q=integrate+power+bi+and+power+apps+direct+quert&oq=integrate+power+bi...
https://powerusers.microsoft.com/t5/Building-Power-Apps/Filtering-data-in-PowerApps-based-on-Power-B...
https://learn.microsoft.com/en-us/powerapps/maker/canvas-apps/powerapps-custom-visual
Solution attempt 1: creating a SharePoint list to store user entry results. An app was created and embedded within Power BI; however, the Power BI report did not filter results to the app, and results were not immediately shown in the Power BI report. SharePoint does not have a DirectQuery option, is limited to 8 refreshes a day, and could not be refreshed through the web browser, only by scheduled refreshes.
Solution attempt 2: using internal Power BI datasets. Accessing data from Power BI datasets inside Power Apps seemed to be read-only. Is this the case?
Question for the forum: how do you create data entry for a Power BI report without setting up another database to store the results?

PowerBI - Using an Azure blob as a DataFlow data source

Hopefully someone can help. I have a requirement to create something like a "bring your own data model" feature so clients can upload their own files, use them in a Power BI dataflow, link them with other sources (databases, services, etc.), and create reports from them. The only problem is I'm having trouble creating a dataflow from an Azure blob file.
The files are in CSV format.
I have the Premium feature.
I first tested and created an M query in Power BI Desktop, and it works; I can access the data.
What I do is copy my created M query into the Power BI dataflow data source as a blank query. The result is that I can preview the data, but if I try to save it, I get an error.
I tried the steps on this link but no luck.
Any ideas?
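One thing worth ruling out before digging into the dataflow itself is storage access. A quick sanity check, reading the CSV directly from the blob with the Azure SDK for Python; the connection string, container, and blob names below are placeholders:

import csv
import io

from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

service = BlobServiceClient.from_connection_string(
    "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net"
)
blob = service.get_blob_client(container="uploads", blob="client_data.csv")

# If this fails, the dataflow error is likely credentials or firewall, not the M query
text = blob.download_blob().readall().decode("utf-8-sig")
rows = list(csv.reader(io.StringIO(text)))
print(f"{len(rows)} rows; header: {rows[0]}")

If this succeeds while the dataflow save still fails, the problem is more likely the credential/gateway binding on the dataflow side than the query itself.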

Best practice to query an Azure database that is being updated repeatedly

I have an Azure database (using SQL Database) and a separate device that measures floats (not relevant to the question).
As the data is updated, say once every 5 minutes, I wish to update the database so that a new row is inserted with this data. I then intend to connect Power BI to the Azure database to build graphs etc.
As mentioned in the title, what would be the best practice? I have done my due diligence, and it seems the best way would be to update the Azure database directly. Or should I consider updating a CSV file, then connecting the CSV file to the Azure database and updating it from there?
The reason I'm considering the CSV route is that Excel has a built-in refresh function, but I couldn't find anything equivalent on the Azure side.
https://support.office.com/en-ie/article/refresh-an-external-data-connection-in-excel-1524175f-777a-48fc-8fc7-c8514b984440
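For the direct-write approach, which is the usual best practice here (skip the CSV hop and insert straight into Azure SQL, which Power BI can then query), a minimal sketch of the device-side loop, assuming pyodbc and a hypothetical Readings table and read_sensor() helper:

import time

import pyodbc  # pip install pyodbc

def read_sensor() -> float:
    ...  # hypothetical: return the latest float from the device

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;"    # hypothetical server
    "DATABASE=telemetry;UID=writer;PWD=secret"
)
conn.autocommit = True

while True:
    value = read_sensor()
    # One new row per reading; Power BI reads this table directly
    conn.execute(
        "INSERT INTO dbo.Readings (measured_at, value) VALUES (SYSUTCDATETIME(), ?)",
        value,
    )
    time.sleep(300)  # once every 5 minutes

Power BI then connects to the database as usual, so none of the Excel/CSV refresh machinery is needed.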
If you want to use Excel, see this official Azure document: Connect Excel to a single database in Azure SQL Database and create a report.
Connect Excel to a single database in Azure SQL Database, import data, and create tables and charts based on values in the database. In this tutorial you will set up the connection between Excel and a database table, save the file that stores the data and the connection information for Excel, and then create a pivot chart from the database values.
Then you can use "Refresh Data" and try the tutorial you found.
Hope this helps.

PowerBI.com Dataset Refresh for Azure Data Lake with OAuth2

Situation: I have a Power BI Desktop workbook with a data source connection to Azure Data Lake storage. I can query the storage and get to the datasets without any problem. I've created my visuals and published the workbook to PowerBI.com.
Next: I want to schedule the refresh of the dataset so the reports hit my Azure Data Lake files and get updated. However, the OAuth2 popup and dialogue don't seem to work: after completing the login prompts I just get an endless spinning circle that never returns.
Additional things: I've followed several different sets of instructions to register a new app in my Azure AD, granting it Power BI API permissions and assigning the app permissions to the storage, etc., but without success. I've tried native and web/API apps. Instructions here...
https://powerbi.microsoft.com/en-us/documentation/powerbi-developer-authenticate-a-client-app/
I've tested using various URLs for the Data Lake storage as my Power BI workbook source, including adl://..., webhdfs://..., and https://... All result in the same situation.
I can confirm my Power BI logins are my Microsoft "Work" accounts, as is my Azure Data Lake storage, all connected up correctly with Office 365 and Azure AD.
Questions: What am I missing? I wish there were even an error message to work from, but it gives me nothing. If this were a .NET app, providing the client ID etc. would make sense, but PowerBI.com just asks for my login details.
Does the Power BI preview connector for Azure Data Lake even work once published to PowerBI.com? I can't find any example of anybody else doing this.
Many thanks for your time and assistance friends.
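One way to narrow this down is to confirm, outside Power BI, that the same work account can reach the lake at all. A minimal sketch with the Data Lake Store (Gen1) Python SDK; the tenant, username, and store name are placeholders:

from azure.datalake.store import core, lib  # pip install azure-datalake-store

# Hypothetical user credentials; lib.auth also accepts service-principal parameters
token = lib.auth(tenant_id="common",
                 username="user@contoso.com",
                 password="...")
adls = core.AzureDLFileSystem(token, store_name="mylake")

# If this lists files, the account and permissions are fine and the
# problem sits on the PowerBI.com OAuth side rather than in the lake
print(adls.ls("/"))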
