Azure Cross Directory Data Access

I'm currently developing an Azure solution for one of my managed-service clients.
We are building a Power BI service for their Azure Backup / Azure Recovery.
We are looking to host the whole process in our own Azure environment; however, we cannot get the data from A) their recovery vault logs into B) our Azure environment.
Does anyone have ideas on how to move data from their environment into storage in ours?
Thank you.

Power BI-based reporting gets its data from the storage accounts that store Azure Backup data. Once the customer configures diagnostic settings to send data to a storage account (ask them to create a dedicated storage account for this), they can share the access keys with you so that you can connect to the customer's storage account, pull the required data, and run the Power BI report in your environment.
This doc has all the details; the only change in this case is that the customer stores the data in their storage account and grants you access to that storage account via an access key.
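For completeness: connecting to the customer's storage account with a shared access key usually just means assembling the standard Azure Storage connection string. A minimal sketch (the account name and key below are placeholders, not values from this thread):

```python
def build_connection_string(account_name: str, account_key: str) -> str:
    """Assemble a standard Azure Storage connection string from an
    account name and one of its shared access keys."""
    return (
        "DefaultEndpointsProtocol=https;"
        f"AccountName={account_name};"
        f"AccountKey={account_key};"
        "EndpointSuffix=core.windows.net"
    )

# Hypothetical values for illustration only.
conn = build_connection_string("customerbackuplogs", "base64keygoeshere==")
print(conn)
```

In Power BI itself you don't need the full connection string; Get data -> Azure -> Azure Blob Storage prompts for just the account name and then the key.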


Azure Monitor Export to a SQL Server

I need the near real-time front-end data from a web app for use in Power BI, and I need to keep this data forever.
I would like to automatically export the App customEvents and pageViews tables for this purpose.
It seems like I need to go from Azure Logs -> Azure Storage account -> Azure SQL Server -> Power BI.
The steps I'm having trouble with are getting the logs into storage, and then getting the data that lands there into a SQL server.
To send logs to a storage account, an event hub, or Log Analytics, go to the App Service, select Diagnostic settings in the left panel, and click + Diagnostic setting.
Select the options shown in the image below to store the logs in a storage account, then click Save.
You can then use Azure Data Factory to copy the logs from the Azure Storage account into Azure SQL Database.
Please refer to the Microsoft tutorial "Copy data from Azure Blob storage to a SQL Database by using the Copy Data tool" to implement this.
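The diagnostic logs land in the storage container as blobs of JSON records. If you want to sanity-check what Data Factory will be copying, a stdlib sketch that flattens one such record into a flat row might look like this (the field names and sample record here are assumptions for illustration; inspect your own export for the real schema):

```python
import json

def flatten_record(raw: str) -> dict:
    """Flatten one JSON log record into a flat dict shaped like a SQL row.
    Field names are illustrative, not the guaranteed diagnostic-log schema."""
    rec = json.loads(raw)
    return {
        "time": rec.get("time"),
        "category": rec.get("category"),
        "operation": rec.get("operationName"),
        "level": rec.get("level"),
    }

# A made-up sample record in the general shape of a diagnostic log entry.
sample = ('{"time": "2024-01-01T00:00:00Z", "category": "AppServiceHTTPLogs", '
          '"operationName": "GET /", "level": "Informational"}')
row = flatten_record(sample)
print(row)
```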
Once the data is available in the database, we can use Power BI to read it.
Open the Power BI dashboard and click Get data from another source.
Select Azure -> Azure SQL Database and click Connect.
Enter the server name.
In the next step, provide the username and password for your account to get access.
You can now select data from any table and present it on the Power BI dashboard as your requirements dictate.
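If you later need to reach the same database from code rather than the Power BI UI, the connection string follows the usual Azure SQL / ODBC shape. A sketch (server, database, and credentials are placeholders, and the driver name assumes ODBC Driver 18 for SQL Server is installed):

```python
def azure_sql_odbc_string(server: str, database: str,
                          user: str, password: str) -> str:
    """Build an ODBC connection string for Azure SQL Database."""
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server=tcp:{server}.database.windows.net,1433;"
        f"Database={database};"
        f"Uid={user};Pwd={password};"
        "Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;"
    )

# Placeholder values for illustration.
print(azure_sql_odbc_string("myserver", "logsdb", "reader", "S3cret!"))
```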

How to load a specific directory in an Azure container into Power BI

I want to load data into Power BI from a specific directory in an Azure container in an Azure storage account. I have tried Get data -> Azure -> Azure Blob Storage, but it retrieves all the data in the container, and I only need the data from one specific directory in that container.
Any help appreciated.
Unfortunately, this capability is not supported in Power BI as of today. Here are some similar ideas on the Power BI Ideas forum that you can vote for:
Allow PowerBI to connect to subfolder in azure blob storage
Allow power bi to connect to a particular blob file from a subfolder in the Azure Blob Store
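A common workaround until the connector supports subfolders is to pull the container listing and filter rows by the folder path (in Power Query you would filter the Name / Folder Path column before expanding the content). The same idea in a stdlib Python sketch (the blob names are made up):

```python
def filter_by_directory(blob_names: list[str], directory: str) -> list[str]:
    """Keep only blobs under the given virtual directory.
    Blob 'directories' are just name prefixes ending in '/'."""
    prefix = directory.rstrip("/") + "/"
    return [name for name in blob_names if name.startswith(prefix)]

# Hypothetical container listing.
names = ["sales/2024/jan.csv", "sales/2024/feb.csv", "hr/staff.csv"]
print(filter_by_directory(names, "sales"))  # only the blobs under sales/
```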

Power BI Dataflow access network restricted Azure Storage Account

I'm using Power BI Dataflows to access spreadsheets I have in blob storage. I have configured IAM permissions on the storage account for myself and the Power BI Service user. The network configuration is set to 'Allow trusted Microsoft services to access this storage account' and 'Microsoft network routing endpoint' preferences.
First Test: Storage Account allowing access from all networks
I am able to access the spreadsheet from the Power BI Service and perform transformations.
Second Test: Storage Account allowing only selected networks
In this case, I have added a group of CIDR blocks for other services that need to access the storage account. I have also added the whitelisted ranges for the Power BI Service and PowerQueryOnline service, using both the deprecated list and the new JSON list.
When running the same connection from Power BI Service dataflows, I now get an 'Invalid Credentials' error message. After turning on logging for the storage account and running another successful test, it looks like the requests are coming from private IP addresses (10.0.1.6), not any of the public ranges.
2.0;2020-09-18T12:57:17.0000567Z;ListFilesystems;OAuthSuccess;200;4;4;bearer;restrictiedmobacc;restrictiedmobacc;blob;"https://restrictiedmobacc.dfs.core.windows.net/?resource=account";"/restrictiedmobacc";7a6efbbd-e01f-004c-31bb-8d39a9000000;0;10.0.1.6;2018-06-17;2185;0;184;108;0;;;"gzip, deflate";Monday, 01-Jan-01 00:00:00 GMT;;"Microsoft.Data.Mashup (https://go.microsoft.com/fwlink/?LinkID=304225)";;"f5d7d551-0291-e765-f20d-09a337164e19";"31cae3e8-e77a-4db2-9050-a69c0555d912";"2f6a613f-ba8c-4432-bdb8-9a0ea0a9f51d";"b52893c8-bc2e-47fc-918b-77022b299bbc";"https://storage.azure.com";"https://sts.windows.net/2f6a613f-ba8c-4432-bdb8-9a0ea0a9f51d/";"<MY EMAIL ADDRESS>";;"{"action":"Microsoft.Storage/storageAccounts/blobServices/containers/read", "roleAssignmentId":"9fe216db-d682-462c-b408-4133a454ef1a", "roleDefinitionId":"8e3af657-a8ff-443c-a75c-2fe8c4bcb635", "principals": [{"id": "31cae3e8-e77a-4db2-9050-a69c0555d912", "type":"User"}], "denyAssignmentId":""}"
I'm at a loss as to what to try next; it is a requirement that this storage account not be open to the world. I have read that you can use an on-premises data gateway so that you can lock the address range down to that device, but I don't really want to go down that route.
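The storage analytics log entry above is semicolon-delimited, with the requester IP as the 16th field in the version 2.0 format. A naive split (ignoring quoting, which happens to be safe for this line) confirms the private source address:

```python
# Excerpt of the log entry above, shortened just past the IP field.
entry = ('2.0;2020-09-18T12:57:17.0000567Z;ListFilesystems;OAuthSuccess;200;4;4;'
         'bearer;restrictiedmobacc;restrictiedmobacc;blob;'
         '"https://restrictiedmobacc.dfs.core.windows.net/?resource=account";'
         '"/restrictiedmobacc";7a6efbbd-e01f-004c-31bb-8d39a9000000;0;10.0.1.6')

fields = entry.split(";")   # naive: assumes no ';' inside quoted fields
requester_ip = fields[15]   # field 16: requester-ip-address
print(requester_ip)         # 10.0.1.6 - a private address
```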
Have you tried enabling a service endpoint for Azure Storage within the VNet?
The service endpoint routes traffic from the VNet through an optimal path to the Azure Storage service.
Could you also check whether you have whitelisted the URLs listed at this link:
https://learn.microsoft.com/en-us/power-bi/admin/power-bi-whitelist-urls
Kr,
Abdel
After speaking with Microsoft Support, I have been told that it is not possible to connect the Power BI Service to a storage account that has restricted network access enabled.
However, after doing some reading on Azure Data Factory I noticed a statement...
"Services deployed in the same region as the storage account use private IP addresses for communication. Thus, you cannot restrict access to specific Azure services based on their public outbound IP address range."
Therefore I created a storage account in UK West, with our Power BI Service in UK South. Looking at the logs on the storage account, I can now see requests from Power BI coming from a 51.0.0.0/8 range instead of private addresses. By adding 51.0.0.0/8 to the allowed CIDRs, Power BI Service dataflows can now access the spreadsheets stored in the data lake.
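The private-vs-public distinction is easy to check mechanically with the stdlib, which also shows why an allow-list of public CIDRs could never match the same-region requests (the 51.140.10.20 address is a hypothetical Power BI egress IP, not one from the logs):

```python
from ipaddress import ip_address, ip_network

allowed = ip_network("51.0.0.0/8")  # the CIDR added after moving regions

print(ip_address("10.0.1.6").is_private)      # True - can never match a public allow-list
print(ip_address("10.0.1.6") in allowed)      # False
print(ip_address("51.140.10.20") in allowed)  # True
```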

Different ways to access data from Azure Data Share to Azure SQL Database

We are working on implementing a new project in Azure. The idea is to move off our on-premises systems into the cloud, as our vendors, partners, and clients are moving into the cloud. The option we are trying out is to use Azure Data Share and have Azure SQL Database subscribe to the data.
What we are now trying to work out is: once a new data snapshot is created, how do we import that data into Azure SQL Database?
For instance, we have partner information that is made available via Azure Data Share, and a new data snapshot is created daily.
The part I am not sure of is how to synchronize this data between Azure Data Share and Azure SQL Database.
Also, is there an API available to expose this data to external vendors, partners, or clients from Azure SQL Database, after the data has been synced to Azure SQL Database from Azure Data Share?
Azure Data Share -> Azure SQL Database
Yes, Azure SQL Database is supported.
Azure Data Share -> SQL Server Database (on-prem)? Is this option supported?
No, SQL Server Database (on-prem) is not supported.
Is there an api that could be consumed to read data?
Unfortunately, there is no such API that could be consumed to read data.
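Since Data Share itself exposes no read API, a common pattern is to put a thin API of your own in front of the synced database (in Azure, typically an App Service or Function App, possibly behind API Management). A minimal stdlib sketch of the idea, with an in-memory list standing in for the synced table:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Stand-in for rows synced into Azure SQL Database from a Data Share snapshot.
# A real service would query the database here instead.
PARTNERS = [{"id": 1, "name": "Contoso"}, {"id": 2, "name": "Fabrikam"}]

class PartnerHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/partners":
            body = json.dumps(PARTNERS).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

# Bind to an ephemeral port and serve in the background.
server = HTTPServer(("127.0.0.1", 0), PartnerHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

with urlopen(f"http://127.0.0.1:{server.server_port}/partners") as resp:
    data = json.loads(resp.read())
print(data)

server.shutdown()
```

In production this handler would sit behind authentication and query Azure SQL Database; the sketch only shows the shape of the endpoint you would expose to vendors and partners.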
Azure Data Share enables organizations to simply and securely share data with multiple customers and partners. In just a few clicks, you can provision a new data share account, add datasets, and invite your customers and partners to your data share. Data providers are always in control of the data that they have shared. Azure Data Share makes it simple to manage and monitor what data was shared, when and by whom.
Azure Data Share helps enhance insights by making it easy to combine data from third parties to enrich analytics and AI scenarios. Easily use the power of Azure analytics tools to prepare, process, and analyze data shared using Azure Data Share.
Which Azure data stores does Data Share support?
Data Share supports data sharing to and from Azure Synapse Analytics, Azure SQL Database, Azure Data Lake Storage, Azure Blob Storage, and Azure Data Explorer. Data Share will support more Azure data stores in the future.
The below table details the supported data sources for Azure Data Share.
How to synchronize this data between Azure Data Share and Azure SQL Database.
You need to choose “Snapshot setting” to refresh data automatically.
A data provider can configure a data share with a snapshot setting. This allows incremental updates to be received on a regular schedule, either daily or hourly. Once configured, the data consumer has the option to enable the schedule.

Clarification regarding storage account in web applications

I have an on-premises MVC application that makes database calls to one or more servers.
When I deploy this application to Windows Azure, I am curious to know what will be stored in the storage account for this cloud service.
Is it database records or something else?
Given you mentioned creating a Cloud Service (so I'm assuming a Web Role for your MVC app): the deployment needs a storage account, at a minimum, for storing diagnostic log information, as well as your cloud service package and configuration.
The storage account is mostly used for blob storage. In an Azure environment we should prefer not to store blob data (like images and doc/PDF files) in the database; the best practice is to store a link to the blob instead.
Azure Storage provides the flexibility to store and retrieve large amounts of unstructured data, such as documents and media files, with Azure Blobs; structured NoSQL data with Azure Tables; reliable messages with Azure Queues; and SMB-based Azure Files for migrating on-premises applications to the cloud.
For an overview and reference, see: http://azure.microsoft.com/en-in/documentation/services/storage/
