Hi, I would like to connect my Google Sheets to Azure and update my table automatically whenever changes are made in Google Sheets.
I have tried using Microsoft Flow, but it doesn't work.
Please try the below three Microsoft services to integrate with Google Sheets:
Logic Apps - all Logic Apps regions except the Azure China regions
Flow - all Flow regions
PowerApps - all PowerApps regions
These are mentioned in this document.
Also, you could vote up and push this feedback to add a Google Sheets connector in ADF.
Check out SeekWell; you can schedule refreshes from Sheets to your database as often as every 5 minutes.
Disclaimer: I'm a cofounder at SeekWell.
This is a late answer, but I think it can be helpful for someone.
I like to use Skyvia for connecting Google Sheets and SQL Azure without coding:
https://skyvia.com/data-integration/integrate-google-sheets-sql-azure
All I need to do is specify the connections to Google Sheets and SQL Azure and select the data to replicate. Skyvia copies the specified Google Sheets data to SQL Azure and keeps this copy up to date automatically with incremental updates.
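For reference, the incremental-update pattern these sync tools rely on can be sketched in plain Python: diff the rows fetched from the sheet against the current table copy, keyed on an ID column, and apply only the resulting inserts, updates, and deletes. This is only a sketch of the idea; the actual fetch (Google Sheets API) and write (Azure SQL driver) layers are assumed and stubbed out here as plain lists of dicts.

```python
# Sketch of the incremental "replicate and keep up to date" pattern.
# In a real job you would fetch sheet_rows via the Google Sheets API and
# apply the computed changes as INSERT/UPDATE/DELETE against Azure SQL.

def diff_rows(sheet_rows, table_rows, key="id"):
    """Return (inserts, updates, delete_keys) needed to make the table match the sheet."""
    sheet = {r[key]: r for r in sheet_rows}
    table = {r[key]: r for r in table_rows}
    inserts = [r for k, r in sheet.items() if k not in table]
    updates = [r for k, r in sheet.items() if k in table and table[k] != r]
    delete_keys = [k for k in table if k not in sheet]
    return inserts, updates, delete_keys

sheet_rows = [{"id": 1, "name": "Ann"}, {"id": 2, "name": "Bob"}]
table_rows = [{"id": 2, "name": "Bo"}, {"id": 3, "name": "Cy"}]
ins, upd, dels = diff_rows(sheet_rows, table_rows)
print(ins, upd, dels)  # one insert (id 1), one update (id 2), one delete (id 3)
```

Running this on a schedule (or from a change trigger) gives you the same "copy and keep current" behavior without re-copying the whole sheet each time.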
I am currently working on a project that involves using Azure DevOps as the main management tool. However, our team has decided to analyze the data generated on the platform through Power BI reports. As such, there are some attachments on the work items that we would like to see displayed in PBI. Is there any way to do this?
There are multiple ways to connect Power BI to DevOps and get the work item data, such as by using Analytics views, OData queries, the OData feed, etc.
To create Power BI reports using an Analytics view, you first have to import the data into Power BI. This can be done by selecting Azure DevOps in the Online Services section of the Get Data tab in the Power BI Desktop app:
Get Data -> Online Services -> Azure DevOps
After that, it will ask you for the organization and team, and it will ask you to log in to your DevOps account too.
After this, a popup will appear in which you can select the work items data based on the time period that has passed, such as today, the last 30 days, the entire history, etc.
Refer to the following documentation.
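Alongside the Analytics view route above, the OData feed can also be queried directly over HTTPS. The sketch below builds a work-items query URL in the usual Azure DevOps Analytics shape; the organization, project, and field names here are placeholders, and you should check the API version your organization exposes.

```python
from urllib.parse import quote

def workitems_odata_url(organization, project, select_fields, filter_expr=None):
    """Build an Azure DevOps Analytics OData query URL for work items."""
    base = (f"https://analytics.dev.azure.com/{quote(organization)}/"
            f"{quote(project)}/_odata/v3.0-preview/WorkItems")
    query = "?$select=" + ",".join(select_fields)
    if filter_expr:
        # Percent-encode spaces and quotes in the $filter expression
        query += "&$filter=" + quote(filter_expr)
    return base + query

url = workitems_odata_url("myorg", "myproject",
                          ["WorkItemId", "Title", "State"],
                          "State eq 'Active'")
print(url)
```

In Power BI Desktop you can paste such a URL into Get Data -> OData feed instead of going through the Azure DevOps connector.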
I am creating a report in Power BI, where some data is imported from a cloud storage system. There is also a local data source (an excel sheet) being used.
My question is, if I publish this report on Power BI service and share it with someone, will they be able to see visuals using local data source as well?
There is also the possibility of using SharePoint. I can create a team in SharePoint with the local Excel file and use that as a source in Power BI. Am I correct in assuming that this way, people in my SharePoint team will be able to see all the data in the report?
For your scenario with a spreadsheet from a desktop and a cloud data source:
If you prepare the report using import mode in Power BI Desktop and publish it to Power BI online, then that report data will be visible to all users with access to the report in the provisioned workspace. The caveat is that the data cannot be refreshed from the Excel file once the report is deployed online. When you create the report on your desktop, you have access to the cloud data and the spreadsheet, and a copy of that data is published to the Power BI service. When the Power BI service is set to refresh, it can't connect to your desktop, which causes the issue.
To solve this you need either a personal or a standard gateway. This provides the technology for Power BI to connect to your on-premises (standard gateway) or on-desktop (personal mode) data. Once the gateway is in place, Power BI can pull data into the cloud from an on-premises network or a personal desktop to refresh reports.
The other alternative is, as you mention, putting the excel in SharePoint online. This effectively makes the spreadsheet a "cloud data source" and can be refreshed from PowerBI service without the need for a gateway.
I created a model in Azure ML Studio.
I deployed the web service.
Now, I know how to check one record at a time, but how can I load a CSV file and make the algorithm go through all the records?
If I click on Batch Execution, it asks me to create an Azure Storage account.
Is there any way to execute multiple records from a CSV file without creating any other accounts?
Yes, there is a way, and it is simple. What you need is an Excel add-in. You need not create any other account.
You can either read the Excel Add-in for Azure Machine Learning web services doc or watch the Azure ML Excel Add-in video.
If you search for videos on the Excel add-in for Azure ML, you'll find other useful videos too.
I hope this is the solution you are looking for.
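If you'd rather script it than use Excel, the classic Azure ML Studio request-response endpoint accepts a JSON body of column names plus row values, so you can loop a CSV file through it yourself. The sketch below only builds that request body; the input port name "input1" is the usual default but may differ for your service, and actually posting it (with your API key) is left out.

```python
import csv, io, json

def csv_to_amls_payload(csv_text, input_name="input1"):
    """Convert CSV text into the classic Azure ML Studio request-response body."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, values = rows[0], rows[1:]
    # All rows are sent as one request; the service scores every row in "Values"
    return json.dumps({"Inputs": {input_name: {"ColumnNames": header,
                                               "Values": values}},
                       "GlobalParameters": {}})

payload = csv_to_amls_payload("age,income\n34,52000\n41,61000\n")
print(payload)
```

You would POST this payload to the service's scoring URL with an `Authorization: Bearer <api key>` header, which still avoids creating a storage account.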
Situation: I have a Power BI desktop workbook with a data source connection to Azure Data Lake storage. I can query the storage and get to the datasets without any problem. I've created my visuals and published the workbook to PowerBI.com.
Next: I want to schedule the refresh of the dataset so the reports hit my Azure Data Lake files and get updated. However, the OAuth2 popup and dialog don't seem to work; after completing the login prompts I just get an endless spinning circle that never returns.
Additional things: I've followed several different sets of instructions to register a new app in my Azure AD, granting it Power BI API permissions, assigning the app permissions to the storage, etc., but without success. I've tried both native and web/API apps. Instructions here...
https://powerbi.microsoft.com/en-us/documentation/powerbi-developer-authenticate-a-client-app/
I've tested using various URLs for the Data Lake storage as my Power BI workbook source, including adl://... webhdfs://... and https://... All result in the same situation.
I can confirm my Power BI logins are my Microsoft "Work" accounts, as is my Azure Data Lake storage, all connected up correctly with Office 365 and Azure AD.
Questions: What am I missing? I wish there were even an error message to work from, but it gives me nothing. If this were a .NET app, giving it the client ID etc. would make sense, but PowerBI.com just asks for my login details.
Does the Power BI preview connector for Azure Data Lake even work once published to PowerBI.com? I can't find any example out there of anybody else doing this.
Many thanks for your time and assistance friends.
I have a SQL database on Azure and I want to share some information with two friends, so I've created 3 VIEWs with that info.
The idea is to develop an easy solution to share a dashboard.
I was thinking to:
Make a PowerPivot with those 3 VIEWs
Host the PowerPivot on Azure (trying to schedule the refresh)
Develop a .xlsx with a connection string pointing to my PowerPivot on Azure
But the problem is, I think it is not possible to host my PowerPivot on Azure and still be able to schedule the refresh.
Any suggestions?
Use Power BI! It's a complete suite Microsoft created for this problem.
The site is: https://powerbi.microsoft.com/
You can subscribe, publish your Excel file, and schedule refreshes. You can find a tutorial at this site:
https://powerbi.microsoft.com/it-IT/documentation/powerbi-refresh-excel-file-onedrive/