https://learn.microsoft.com/en-us/power-bi/service-real-time-streaming
I am using the REST API to send live data to Power BI (from a native Windows application).
How can I handle authentication, encryption, and other security features while streaming data to Power BI?
Can I use the Power BI gateway somehow?
I currently see the following issues:
If someone gets the REST API URL of the dataset, they can inject incorrect data.
The JSON data that I stream is not encrypted.
As @silent said in his answer, the communication is fully encrypted. Pushing data to a streaming dataset looks like this (sample code in PowerShell):
$endpoint = "https://api.powerbi.com/beta/08bbc04c-a46d-4c45-b587-9dec9454fc2d/datasets/15e4b6c3-4697-442f-91f9-2ad056eef2a8/rows?key=QINNGFRYZnWHHFA51G6VCDeL%2FYyfh0oDZ0qsV1qwzIh18tNfs2POjWgFIJdnWxxA3bjqJqfMhWPOhzQ6bK3vgw%3D%3D"
$payload = @{
    "datetime"  = "2019-05-03T17:17:05.830Z"
    "somevalue" = 98.6
}
Invoke-RestMethod -Method Post -Uri "$endpoint" -Body (ConvertTo-Json @($payload))
Note that the endpoint uses the HTTPS protocol, i.e. the traffic is encrypted. However, the difference between pushing data to a streaming dataset and pushing data to a "normal" push dataset is that with streaming you do not use an access token obtained by authenticating against Azure AD, but a key embedded in the endpoint URL. You must protect this key the same way you protect your account's password: anyone who obtains it can push data on your behalf. So with regard to authentication there is not much difference. And because the communication is encrypted, you should not worry about the streamed data in transit.
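Since the push URL embeds the dataset key, the URL itself must be handled like a credential. A rough stdlib-only Python sketch of this idea (the environment-variable name and the endpoint value are placeholders invented for this example): load the key-bearing URL at runtime instead of hardcoding it, then build the POST request.

```python
import json
import os
import urllib.request

def build_push_request(endpoint: str, rows: list) -> urllib.request.Request:
    """Build the POST that pushes rows to a streaming dataset.

    The dataset key is part of `endpoint`, so the whole URL is a secret.
    """
    body = json.dumps(rows).encode("utf-8")
    return urllib.request.Request(
        endpoint,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Keep the key-bearing URL out of source control, e.g. in an environment
# variable (POWERBI_PUSH_URL is a name chosen for this example):
endpoint = os.environ.get("POWERBI_PUSH_URL", "https://api.powerbi.com/beta/placeholder")
req = build_push_request(endpoint, [{"datetime": "2019-05-03T17:17:05.830Z", "somevalue": 98.6}])
# urllib.request.urlopen(req) would actually send it; omitted here.
```

The same applies in PowerShell: read the endpoint from an environment variable or a secret store rather than embedding it in the script.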
You can't use the gateway, because you don't need it. First, the gateway enables communication from the Power BI service to your on-premises environment, while in your case the traffic flows in the other direction and can be established directly. Second, even if you communicated through the gateway, the data would be encrypted the same way, so there is no difference in that regard.
Not sure if I understand your question correctly, but the REST APIs are only accessible over TLS-encrypted connections, and authentication is based on Azure AD.
All requests to REST APIs are secured using Azure AD OAuth.
https://learn.microsoft.com/en-us/power-bi/service-real-time-streaming#using-power-bi-rest-apis-to-push-data
Related
I have an Azure app named myApp. This app is a website on which you have to sign in with your Microsoft account.
I created a scope in this app named myscope.
What I want to do is make a web request (in PowerShell) to this website.
I managed to do it with the method described here, by "manually" getting the authorization code first, then the token, and finally calling:
Invoke-RestMethod -Uri "myurl" -Headers @{ "Authorization" = "Bearer $token" }
What I want to do now is exactly the same thing, but in a fully non-interactive way, i.e. without having to log in manually. It could be with credentials, a client secret, or any other approach...
Could someone explain to me the steps to do this?
Thanks!
This feels like you may need to look at the architecture, since you have two clients that need to get the data.
GETTING AN ACCESS TOKEN (AUTHENTICATING)
If this sounds right then it would involve first registering two different entries in Azure AD:
Website client uses Authorization Code Flow to authenticate users via Azure and get tokens
PowerShell client is not interactive, so it should probably use the Client Credentials flow
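The client credentials flow boils down to a single POST to the tenant's token endpoint. A stdlib-only Python sketch of that request (tenant ID, client ID, secret, and scope are placeholders; with client credentials, the v2.0 endpoint expects a scope in the `/.default` form):

```python
import urllib.parse
import urllib.request

# Placeholders -- substitute your own tenant, app registration, and scope.
TENANT_ID = "your-tenant-id"
CLIENT_ID = "your-client-id"
CLIENT_SECRET = "your-client-secret"   # load from a secret store in real code
SCOPE = "api://myApp/.default"         # client credentials must use /.default

def build_token_request() -> urllib.request.Request:
    """Build the client-credentials token request against the v2.0 endpoint."""
    token_url = f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token"
    form = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": SCOPE,
    }).encode("ascii")
    return urllib.request.Request(token_url, data=form, method="POST")

req = build_token_request()
# urllib.request.urlopen(req) returns JSON containing an "access_token"
# field, which is then sent as "Authorization: Bearer <token>".
```

The same POST works from PowerShell with Invoke-RestMethod against the token URL; the key point is that no user ever signs in, so the token represents the app itself, not a user.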
USING AN ACCESS TOKEN (DATA ACCESS)
The most standard setup would work like this:
Website client sends access token to a Web API
Powershell client sends access token to the same Web API
If the data access code lives in the website itself, this may not work: the web app is not designed to serve the PowerShell client, and it likely requires a secure cookie that PowerShell cannot provide.
FIRST STEPS
I would focus on the data the PowerShell client needs and design how access to it should work if done properly. Can this be factored out into a small API as above?
If this is expensive, then maybe a tactical solution could be used, such as a new website endpoint that accepts tokens and is only called from PowerShell.
Is there a way to save the output of an Azure Data Factory Web Activity into a dataset?
Here is my current use case:
I have to dynamically build a JSON post request
The API I'm trying to reach requires an SSL client certificate, so I have to use the Web Activity's Client Certificate authentication option.
The API also requires basic authentication, so I put the Content-Type and authorization GUID in the header section of the Web Activity.
Once I get the JSON response from my POST request, I need to save it into blob storage somewhere.
I tried using the HTTP or REST API connectors as the Copy activity's source dataset, but each allows only one authentication type: certificate or basic authentication.
Is there a way I can configure the REST or HTTP dataset source to handle both types of authentication (SSL and basic authorization), or to capture the full Web Activity output into blob storage?
Thank you all for your help! I'm desperate at the moment lol..
Here is what my Web Activity looks like (sorry, I had to hide part of the credentials for security purposes):
Please use an HTTP dataset in your Copy activity.
When creating the linked service for the HTTP dataset, select the Client Certificate option with embedded data; then upload the SSL certificate.
The official documentation is here.
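For reference, a linked service of that shape might look like the following JSON (values are illustrative; `authenticationType`, `embeddedCertData`, and `password` are the relevant HTTP connector properties). Since basic authentication is ultimately just an `Authorization` header, it can typically still be supplied as an additional request header on the copy source while the linked service handles the client certificate:

```json
{
  "name": "HttpWithClientCert",
  "properties": {
    "type": "HttpServer",
    "typeProperties": {
      "url": "https://api.example.com/",
      "authenticationType": "ClientCertificate",
      "embeddedCertData": "<base64-encoded PFX>",
      "password": {
        "type": "SecureString",
        "value": "<certificate password>"
      }
    }
  }
}
```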
I'm unfamiliar with handling secure data, but I've now begun working in Azure, specifically with a Dynamics instance and Logic Apps. The webhook needs to give the external API sensitive data such as SSNs. What's the best way to send secure data like this over the network? OAuth2 is implemented, but is there something else I can implement so we are not directly sending the SSN?
Always using SSL (HTTPS) is a secure way to send data over the wire. A little added security would be to use certificate and public key pinning, if possible.
Also, another way to secure sensitive data would be to first store it in Azure Key Vault and share the reference to that secret in your webhook call to the Logic App.
The Logic App would then acquire the secret from Azure Key Vault using Managed Identity.
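To picture the Key Vault pattern: the webhook body carries only a pointer to the secret, never the value. A hedged Python sketch (the vault URL, secret naming scheme, and payload fields are all invented for illustration):

```python
import json

# Hypothetical vault -- the SSN itself is assumed to have been stored in
# Key Vault beforehand (e.g. via the Key Vault SDK); it never travels
# in the webhook body.
VAULT_URL = "https://my-vault.vault.azure.net"

def build_webhook_payload(record_id: str) -> str:
    """Build a webhook body that references the secret instead of embedding it.

    The receiving Logic App resolves `ssnSecretRef` against Key Vault
    using its managed identity.
    """
    secret_ref = f"{VAULT_URL}/secrets/ssn-{record_id}"
    return json.dumps({"recordId": record_id, "ssnSecretRef": secret_ref})

body = build_webhook_payload("12345")
```

The sensitive value thus crosses the network only once (when stored in the vault over HTTPS), and every downstream hop sees just an opaque reference.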
We have a Windows service which monitors a folder (using C#'s FileSystemWatcher) for files and uploads them to a blob. The service retrieves a write-only SAS token, which is used to build the blob client for the upload, from a Web API endpoint (TLS 1.2) secured with ADFS 2.0, by passing a JWT retrieved from the ADFS WS-Trust 1.3 endpoint with a user name and password.
My experience is limited in the area of security. I have two questions.
1- Should there be encryption before I upload the data to the blob? If yes, how can I implement it?
2- Would retrieving the SAS token from an endpoint, even though it is secured with ADFS and is over HTTPS, pose any kind of security risk?
1- Should there be encryption before I upload the data to the blob? If yes, how can I implement it?
Per my understanding, if you want extra security in transit and your stored data to be encrypted, you could leverage client-side encryption and refer to this tutorial. In that case, you need to make programmatic changes to your application.
Also, you could leverage Storage Service Encryption (SSE), which does not secure the data in transit, but provides the following benefit:
SSE allows the storage service to automatically encrypt the data when writing it to Azure Storage. When you read the data back, it is decrypted by the storage service before being returned. This lets you secure your data without having to modify or add code to any of your applications.
I would recommend just leveraging HTTPS for your data in transit and SSE to encrypt your blobs at rest. For how to enable SSE, refer here. Additionally, you could follow the Azure Storage security guide here.
2- Would retrieving the SAS token from an endpoint, even though it is secured with ADFS and is over HTTPS, pose any kind of security risk?
SAS gives you a way to grant limited permissions on resources in your storage account to other clients. For security, set the interval over which the SAS is valid, and consider limiting the IP addresses from which Azure Storage will accept it. Since your endpoint for generating the SAS token is secured with ADFS 2.0 over HTTPS, I would consider it safe enough.
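Those constraints (permissions, start/expiry interval, allowed IP range) cannot be altered by whoever holds the token, because they are part of the string the SAS signature is computed over. A stdlib-only Python sketch of just the signing mechanism (the field list here is illustrative; the real string-to-sign has a fixed, versioned layout documented by Azure Storage):

```python
import base64
import hashlib
import hmac

def sign_sas(string_to_sign: str, account_key_b64: str) -> str:
    """HMAC-SHA256 over the string-to-sign, keyed with the decoded account key.

    Azure Storage recomputes this signature server-side; if any constraint
    (permissions, expiry, allowed IPs) in the token was tampered with, the
    signatures no longer match and the request is rejected.
    """
    key = base64.b64decode(account_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("ascii")

# Illustrative constraint fields only: permissions, start, expiry, IP range.
fields = ["w", "2019-05-01T00:00:00Z", "2019-05-01T01:00:00Z", "203.0.113.0-203.0.113.255"]
sig = sign_sas("\n".join(fields), base64.b64encode(b"demo-account-key").decode("ascii"))
```

In practice you would generate the token with the Azure Storage SDK rather than by hand; the sketch only shows why a narrowly scoped, short-lived SAS stays narrow even if leaked.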
I am creating an application and native apps which will access a set of RESTful web services. We are already using HTTPS to secure the API, but I wanted to understand whether we still need something like an API key or OAuth to authenticate requests.
Encryption and authentication are two separate things. Encryption merely prevents outside parties from snooping on the transmitted data. You need authentication if you want to control who has access to which resources through the API.
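To make the distinction concrete, here is a minimal sketch of the authentication half: a server-side check of an API key header (the header name and key store are illustrative, not a standard). HTTPS would still be doing the encryption underneath; this check only decides who gets in:

```python
import hmac

# Illustrative key store -- in production, keys would live in a secret
# store, not in source code.
VALID_API_KEYS = {"client-a": "k3y-for-client-a"}

def authenticate(headers: dict):
    """Return the client id if the presented API key is valid, else None."""
    presented = headers.get("X-Api-Key", "")
    for client_id, key in VALID_API_KEYS.items():
        # compare_digest avoids leaking key content via timing differences.
        if hmac.compare_digest(presented, key):
            return client_id
    return None

client = authenticate({"X-Api-Key": "k3y-for-client-a"})
```

Without such a check, TLS alone means any anonymous caller can read or write your resources, just privately.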