Error when creating the linked service for HubSpot in ADF (Azure)

I am trying to pull data from HubSpot into ADF using the HubSpot connector in ADF.
I have also registered HubSpot with Azure AD.
I followed the steps in the doc, but I am still unable to create the linked service:
https://learn.microsoft.com/en-us/azure/data-factory/connector-hubspot#linked-service-properties
Error message:
I used the Postman app to generate the access and refresh tokens.
Any help is appreciated. Thanks.

In the HubSpot developer account, select the scopes for your app. Then use the Postman app to generate the access and refresh tokens by entering the required parameters; the tokens are returned by a POST request to the token endpoint (a rough sketch of that request is shown after these steps).
Use the freshly generated tokens in the ADF linked service to establish the connection.
Once the connection is established, HubSpot will list the tables that are available and can be copied using ADF.
Create a new pipeline in ADF, add a Lookup activity, then a Copy activity to copy the desired table into the destination.
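For reference, this is roughly what that Postman call does under the hood, written as a small Python sketch. It assumes the standard HubSpot OAuth token endpoint; the client ID/secret, redirect URI and code below are placeholders from your own HubSpot app, and you would use grant_type "refresh_token" (with a refresh_token field) when you only need to refresh:

# Sketch of the token request Postman sends; values in <...> are placeholders.
import requests

TOKEN_URL = "https://api.hubapi.com/oauth/v1/token"  # HubSpot OAuth token endpoint

payload = {
    "grant_type": "authorization_code",        # or "refresh_token" when refreshing
    "client_id": "<hubspot-app-client-id>",
    "client_secret": "<hubspot-app-client-secret>",
    "redirect_uri": "<redirect-uri-registered-on-the-app>",
    "code": "<authorization-code-from-the-consent-redirect>",
}

resp = requests.post(TOKEN_URL, data=payload, timeout=30)
resp.raise_for_status()
tokens = resp.json()
print(tokens["access_token"], tokens["refresh_token"])

The access token and refresh token returned here are the values to paste into the HubSpot linked service in ADF.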

Related

Import data from Salesforce Report using ADF not working

I am working on importing data from a Salesforce report using ADF. I am using a Copy activity and have created the Salesforce linked service.
I followed the Microsoft documentation https://learn.microsoft.com/en-us/azure/data-factory/connector-salesforce to set up ADF, but I am getting empty rows.
I used this tip: get Salesforce reports by specifying a query as {call "<report name>"}. An example is "query": "{call "TestReport"}".
I can see the header information, but all rows are empty. What am I missing here?

How to use Power BI REST APIs in Azure Data Factory

I am trying to create a list of all the workspaces and the reports contained in each of them for a documentation project.
I found online that we can use this to get the workspaces, and I want to call it with a Web activity:
https://api.powerbi.com/v1.0/myorg/groups
Then I want to use the IDs from the output in a ForEach with another Web activity inside it, and use this to get the reports in each workspace, then copy them somewhere (data lake or DB):
https://api.powerbi.com/v1.0/myorg/groups/{groupId}/reports
But I don't know how to configure the activity and the authentication.
If there is a better way, like connecting directly to Power BI, I'm all ears. I tried to do a Get Data from a web source, but I don't have any "key" for the API, and organizational authentication doesn't work.
When I run the code here, it works perfectly: https://learn.microsoft.com/en-us/rest/api/power-bi/groups/getgroups
Thanks in advance.
'But I don't know how to configure the activity and the authentication.' For this part, I'm not sure whether the problem is that you don't know how to use an access token to call an API, so here are the steps:
Register an Azure AD application with the API permission 'https://analysis.windows.net/powerbi/api/Workspace.ReadWrite.All' so that you can generate an access token with this scope (add the API permission in the app registration).
Use the ROPC flow to generate an access token. This flow contains user information, so the API knows who you are and can return the correct groups.
Add an 'Authorization' header to the request with the value 'Bearer accessToken' to call the API.
ROPC flow (Azure AD v1.0 endpoint):
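As a rough illustration only (the tenant, client ID and account below are placeholders, and ROPC only works for accounts without MFA), the token request and the two Power BI calls look something like this in Python:

# Sketch: acquire a token via the ROPC flow (Azure AD v1.0 endpoint), then call the Power BI REST API.
import requests

TENANT = "<tenant-id-or-domain>"                 # placeholder
TOKEN_URL = f"https://login.microsoftonline.com/{TENANT}/oauth2/token"

token_resp = requests.post(
    TOKEN_URL,
    data={
        "grant_type": "password",                # ROPC
        "client_id": "<app-registration-client-id>",
        "username": "<user@yourorg.com>",
        "password": "<user-password>",
        "resource": "https://analysis.windows.net/powerbi/api",
    },
    timeout=30,
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

headers = {"Authorization": f"Bearer {access_token}"}  # the 'Bearer accessToken' header from the last step

# List the workspaces, then the reports in each one
groups = requests.get("https://api.powerbi.com/v1.0/myorg/groups", headers=headers, timeout=30).json()
for group in groups["value"]:
    reports = requests.get(
        f"https://api.powerbi.com/v1.0/myorg/groups/{group['id']}/reports",
        headers=headers,
        timeout=30,
    ).json()
    print(group["name"], [r["name"] for r in reports["value"]])

In ADF itself the equivalent would be a Web activity for the token call, a ForEach over the groups output, and a nested Web or Copy activity per group passing the same Authorization header.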

How to dynamically add an HTTP endpoint to load data into Azure Data Lake using Azure Data Factory, when the REST API is cookie-authenticated

I am trying to dynamically add/update a REST linked service based on certain triggers/events to consume a REST API that is authenticated using a cookie and provides telemetry data. This telemetry data will be stored in Data Lake Gen2, and Databricks will then move it to secondary data storage/SQL Server.
Has someone tried this? I am not able to find a cookie-based auth option when adding the REST linked service.
Also, how do I create pipelines dynamically, or make the parameters of the REST API dynamic?
Currently, unfortunately, this is not possible using Azure Data Factory native components/activities. For now at least, you cannot get access to the response cookies from a web request in Data Factory. Someone has put in a feature request for this, or something that might help, see here.
It might be possible to do this via an Azure Function to get/save the cookie and then send it as part of a following request. I had a similar problem but resorted to using Azure Functions for all of it, although I guess you could do just the authentication part with a function! ;-)
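A rough sketch of that function idea in Python, against a purely hypothetical cookie-authenticated API (the host, the /login path and the field names are made up for illustration):

# Log in once, capture the Set-Cookie the server returns, and reuse it on the data call.
import requests

BASE = "https://telemetry.example.com"   # hypothetical cookie-authenticated API

with requests.Session() as session:
    # The session stores the cookie from the login response automatically
    session.post(f"{BASE}/login", json={"user": "<user>", "password": "<password>"}, timeout=30)

    # The cookie is sent on the follow-up request; return/stage the payload for Data Lake Gen2
    telemetry = session.get(f"{BASE}/telemetry", timeout=60).json()

Wrapped in an HTTP-triggered Azure Function, ADF could call this and land the returned payload in Data Lake Gen2.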
EDIT: update
Actually, after I wrote this I went back to check whether this was still the case, and it looks like things have changed. There now appears in the Web activity response output (I had never seen this before) a property called "ADFWebActivityResponseHeaders", and as you can see it includes a property for the "Set-Cookie" header.

Import data from SQL Azure to Excel using AAD authentication and sharing a link to Excel Online

The Scenario
Step 1: The user clicks a link which redirects to Excel Online.
Step 2: The Excel workbook, on load, fetches the data from SQL Azure (or can use an OData feed) using the logged-in user ID (some custom logic to filter data based on the user).
I did some research and can see Excel Power Query as an option, but I cannot use SQL Server authentication for fetching data from SQL Azure, because I need the logged-in user details. I tried using Azure AD authentication for SQL Azure, but it seems Power Query does not have an option to connect to SQL Azure with AAD integrated authentication either. For OData I can use a custom authorization implementation, but how do I get the user information (logged into Excel Online) and pass it to OData?
I am looking for help with two major parts:
1) Any pointers on step 1: do I have to put the Excel sheet somewhere in SharePoint (any other options?) and share the link? In that case, how does it work if two different sets of users (who see data filtered based on permissions) use it?
2) Help on step 2: how do I bring the data into Excel from SQL Azure or via OData, but based on the logged-in user?
P.S.: Correct me if I am going in completely the wrong direction, or if there are better ideas to implement this scenario.

Power BI Data Source with Refresh ability

I am trying to set up a dataset in Power BI which can be refreshed on an as-needed basis or on a schedule.
I am uploading an Excel workbook which has a Power Query.
The Power Query connects to the Replicon service to get data via the service; the query looks as below:
Source = Web.Contents(
    "https://na2.replicon.com/services/ClientService1.svc/GetActiveClients",
    [
        Headers = [
            #"Authorization" = "Bearer *ValidToken*",
            #"Accept" = "application/json",
            #"Content-Type" = "application/json"
        ],
        Content = Text.ToBinary("{}")
    ]
)
The request is a POST operation, hence the Content field is used in the Web.Contents options argument. Authentication is via a Bearer token.
The data source setting is Anonymous credentials with the privacy level set to None.
This works fine and I am able to retrieve the results and even refresh from within the workbook.
Once I upload this to Power BI and attempt to refresh the newly created dataset, it says:
You cannot refresh yet because you need to provide valid credentials for your data sources in the dataset.
So I go to Manage Data Sources, click Edit Credentials, select the "Anonymous" authentication method and click Sign In, and it says "Login Failed".
Why is that so? It appears that the headers are lost when I upload the Excel workbook. How can I accomplish this? Are there any alternate ways of setting up a dataset which can be refreshed, with a web service as the source?
This is what I ended up doing finally.
My scenario first:
My requests are POST.
Authentication is via a bearer token which needed to be passed via a request header; this was a requirement of the Replicon service I am trying to invoke, which I couldn't change.
The dataset created in Power BI needed to be refreshable.
Since I couldn't get it to work directly from inside Power BI, I introduced an intermediate layer. It interprets GET requests from Power BI, processes the token from the query string, and also accepts the service name and operation as query-string parameters. It then creates the POST request to the real service (the Replicon service in my case), picking the service name and operation name up from the request URL and pushing the token into the request header.
So the request from Power BI looks the way Power BI needs it, i.e.
Web.Contents("https://intermediatelayer.com?access_token=<ValidToken>&ServiceName=ClientService&Operation=GetActiveClients")
Not an ideal solution, but it works.
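For what it's worth, the intermediate layer can be as small as the following Python/Flask sketch. The host and parameter names mirror the Web.Contents call above and are illustrative only; exactly how ServiceName/Operation map onto the real Replicon URL, plus error handling and token validation, is up to the implementation:

# Minimal GET->POST proxy: Power BI refreshes a plain GET, the proxy rebuilds the real POST
# to the Replicon service and pushes the token into the Authorization header.
import requests
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/", methods=["GET"])
def proxy():
    token = request.args["access_token"]
    service = request.args["ServiceName"]      # e.g. "ClientService1.svc" in the URL above
    operation = request.args["Operation"]      # e.g. "GetActiveClients"

    upstream = requests.post(
        f"https://na2.replicon.com/services/{service}/{operation}",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/json",
            "Content-Type": "application/json",
        },
        data="{}",
        timeout=60,
    )
    return jsonify(upstream.json())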
Manage Data Sources is validating the Anonymous credentials with a GET request to the URL, without the hardcoded headers, as you suggest. It's basically running
Web.Contents("https://na2.replicon.com/services/ClientService1.svc/GetActiveClients")
which fails with "(405): Method Not Allowed", and so Manage Data Sources thinks the credentials are wrong.
Short of making the service reply with a success response for the above M, I don't see any way to set up refresh on this mashup.
