In an existing pipeline, the REST API key values (client ID and client secret) have expired, so I generated new key values. I am able to generate a token and extract API data in Postman using the new client ID and client secret.
In the ADF pipeline, I am supplying the new key values to generate the token, but when calling the API the newly generated token does not work. I get the error below.
Do I need to change anything else?
Details: Failure happened on 'Source' side. ErrorCode=RestSourceCallFailed, Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException, Message=The HttpStatusCode 401 indicates failure.
From the error message, this appears to be a permission issue. Please make sure that the service principal has sufficient permissions.
For more information about calling REST APIs from Azure Data Factory, see the reference below, which covers:
• REST linked service using UI
• Connector configuration details
• AAD service principal authentication
• API key authentication
• Copy data from REST connector into Azure Data Lake Storage in JSON format using OAuth.
Reference:
https://learn.microsoft.com/en-us/azure/data-factory/connector-rest?tabs=data-factory#how-to-use-this-solution-template
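If it helps to isolate whether the token or the permissions are at fault, here is a minimal sketch (Python with the requests library, my own choice) of the same token request and authenticated call that work in Postman. The token endpoint, scope, and API URL below are placeholders, not values from the question. If this succeeds but the ADF copy activity still returns 401, the problem is more likely in how the Authorization header is configured on the REST linked service or dataset than in the key values themselves.

# Minimal sketch, assuming the API uses the OAuth2 client-credentials flow.
# The token endpoint, scope, and API endpoint are placeholders.
import requests

token_response = requests.post(
    "https://login.example.com/oauth2/token",          # placeholder token endpoint
    data={
        "grant_type": "client_credentials",
        "client_id": "<new-client-id>",
        "client_secret": "<new-client-secret>",
        "scope": "<api-scope>",
    },
)
token_response.raise_for_status()
access_token = token_response.json()["access_token"]

# The same Bearer header the ADF copy activity should send when calling the API.
api_response = requests.get(
    "https://api.example.com/data",                     # placeholder API endpoint
    headers={"Authorization": f"Bearer {access_token}"},
)
print(api_response.status_code, api_response.text[:200])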
I am using Azure Container Apps with Azure Blob Storage as a state store. It is a simple Hello World (weather service) app using .NET 6. The app starts up fine; on POST I am trying to save the generated weather information to Azure Blob Storage as JSON. I have configured the Dapr state store component in Azure Container Apps to use Azure Blob Storage, with a storage key (the secondary key) as explained in this Microsoft documentation.
When I call the endpoint via Swagger and look at the logs, I get the following error.
Dapr.DaprException: State operation failed: the Dapr endpoint indicated a failure. See InnerException for details.
2022-07-17T01:10:35.716245402Z ---> Grpc.Core.RpcException: Status(StatusCode="Internal", Detail="failed saving state in state store statestore: -> github.com/Azure/azure-storage-blob-go/azblob.newStorageError, /home/vsts/work/1/go/pkg/mod/github.com/!azure/azure-storage-blob-go#v0.10.0/azblob/zc_storage_error.go:42
2022-07-17T01:10:35.716524109Z ===== RESPONSE ERROR (ServiceCode=AuthenticationFailed) =====
2022-07-17T01:10:35.716795515Z Description=Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
2022-07-17T01:10:35.716812515Z RequestId:863bcef4-401e-0069-5f7a-99724b000000
2022-07-17T01:10:35.716820115Z Time:2022-07-17T01:10:35.7137648Z, Details:
2022-07-17T01:10:35.716825516Z AuthenticationErrorDetail: Issuer validation failed. Issuer did not match.
2022-07-17T01:10:35.716831516Z Code: AuthenticationFailed
The error is AuthenticationFailed. I am unsure what I am missing, since I have not added any extra configuration to the storage account (such as a VNet service endpoint), and the account is enabled for key access. Any help is appreciated.
Below is the code that I am using
// Build a Dapr client and save the generated forecast to the configured state store.
using var client = new DaprClientBuilder().Build();
var forecast = new WeatherForecast()
{
    Date = DateTime.Now.AddDays(1),
    TemperatureC = Random.Shared.Next(-20, 55),
    Summary = Summaries[Random.Shared.Next(Summaries.Length)]
};
// stateStoreName is the Dapr component name; key identifies this state entry.
await client.SaveStateAsync<WeatherForecast>(stateStoreName, key, forecast);
I think I found the answer. The issue was with the component metadata: setting the storage key directly in the metadata of the component.yaml, as described in the Microsoft documentation, did not work. I changed it to use a secretRef and referenced the secret in the metadata directly in the portal. I am not sure why the error was reported as an authentication error, but it is finally working.
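If you want to confirm the component is healthy after switching to a secret reference, one quick check is to call the Dapr state API on the sidecar directly. The sketch below (Python with the requests library, my own choice) assumes the default sidecar HTTP port 3500 and a component named statestore; both are assumptions rather than values from the question.

# Minimal sketch: exercise the Dapr state store through the sidecar's HTTP API.
# Assumes the default Dapr HTTP port (3500) and a state component named "statestore".
import requests

DAPR_STATE_URL = "http://localhost:3500/v1.0/state/statestore"

# Save a value; the Dapr state API takes a JSON array of key/value pairs.
save = requests.post(DAPR_STATE_URL, json=[{"key": "weather-test", "value": {"TemperatureC": 21}}])
save.raise_for_status()

# Read it back; a 200 with the stored JSON means the blob-store component authenticated correctly.
read = requests.get(f"{DAPR_STATE_URL}/weather-test")
print(read.status_code, read.json())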
I need to start an Azure Data Factory pipeline from the REST API, as per https://learn.microsoft.com/en-us/rest/api/datafactory/pipelines/createrun#code-try-0
I have created an AAD app and given it the Azure Service Management API permission. However, when the client receives an id_token through the OAuth2 implicit flow and invokes that API to start a pipeline, I get:
{
"error": {
"code": "InvalidAuthenticationToken",
"message": "The access token is invalid."
}
}
Am I using the proper API permission? Thanks.
If you just want to use an OAuth2 flow to get a token for calling the REST API, the client credentials flow is more suitable than the implicit flow in this case; the management API also expects an access token issued for https://management.azure.com, not an id_token.
Please follow the steps below.
1. Get the values for signing in and create a new application secret.
2. Navigate to the data factory -> Access control (IAM) -> Add -> add your AD app to an RBAC role, e.g. Contributor, Owner, or Data Factory Contributor; details follow this.
3. In Postman, send the token request below with the values from step 1, then use the returned token to call the REST API; it will work fine.
POST https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token
client_id=<client_id>
&scope=https://management.azure.com/.default
&client_secret=<client_secret>
&grant_type=client_credentials
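For a compact end-to-end illustration, here is a minimal sketch in Python using the requests library (my own choice, not part of the original answer) that performs the same client-credentials token request and then calls the CreateRun endpoint; the subscription, resource group, factory, and pipeline names are placeholders.

# Sketch: client-credentials token + Data Factory CreateRun call.
# All identifiers are placeholders; 2018-06-01 is the documented ADF api-version.
import requests

tenant_id = "<tenant-id>"
token = requests.post(
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
    data={
        "client_id": "<client_id>",
        "client_secret": "<client_secret>",
        "scope": "https://management.azure.com/.default",
        "grant_type": "client_credentials",
    },
).json()["access_token"]

run_url = (
    "https://management.azure.com/subscriptions/<subscription-id>"
    "/resourceGroups/<resource-group>/providers/Microsoft.DataFactory"
    "/factories/<factory-name>/pipelines/<pipeline-name>/createRun"
    "?api-version=2018-06-01"
)
run = requests.post(run_url, headers={"Authorization": f"Bearer {token}"}, json={})
print(run.status_code, run.json())   # expect a runId on success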
I can't test it right now, but I would assume that having Data Factory Contributor should be enough for this.
https://learn.microsoft.com/en-us/azure/role-based-access-control/built-in-roles#data-factory-contributor
I am trying to list the Queues/Topics in an Azure Service Bus using the REST API.
When I try to connect, I just get back a blank feed saying "This is the list of publicly-listed services currently available".
I am using the RootManageSharedAccessKey from the portal (for dev only; I can create a more restricted key later), so it should have all the access rights I need, but I can't seem to get it to return anything. This documentation seems to suggest that this will work, but there are no actual working examples, just theoretical responses.
I have tried doing a GET request with the signature in the URL like this:
https://myservicebusnamespace.servicebus.windows.net/$Resources/Queues;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=MYSHAREDACCESSKEY
I have also tried doing it like this:
https://myservicebusnamespace.servicebus.windows.net/$Resources
and then setting the Authorization header to
WRAP access_token="MYSHAREDACCESSKEY="
Both times I just get this back
<feed xmlns="http://www.w3.org/2005/Atom">
<title type="text">Publicly Listed Services</title>
<subtitle type="text">This is the list of publicly-listed services currently available.</subtitle>
<id>uuid:6a5d438d-1793-451b-be41-XXXXXXXXXXXX;id=XXXXXX</id>
<updated>2020-06-28T13:03:04Z</updated>
<generator>Service Bus 1.1</generator>
</feed>
If I change the url slightly to be:
https://myservicebusnamespace.servicebus.windows.net/$Resources/Queues/
I get a slightly different response back of:
<Error>
<Code>401</Code>
<Detail>claim is empty. TrackingId:c40a2bd2-490d-4b5b-adde-33bc89aa84ff_G36, SystemTracker:myservicebusnamespace.servicebus.windows.net:$Resources/Queues, Timestamp:2020-06-28T13:27:40</Detail>
</Error>
This seems to suggest that I am not authorised, or that I am missing something. If I add an actual queue name to the end of that URL, it goes back to the original response.
I believe there is another way to get this information by using subscription IDs and PEM keys, via the management URLs (https://management.core.windows.net/{subscription ID}/services/ServiceBus/Namespaces/{Namespace}/Topics/),
but this should all be possible using the format above; I just can't figure out the exact format required.
EDIT/UPDATE: If I don't include my auth claim, the result is exactly the same, suggesting that it's not seeing my auth claim or that it's invalid. However, if I include the header with just the token value, without the WRAP part at the start, I get an error saying:
<Error>
<Code>401</Code>
<Detail>MalformedToken: Invalid authorization header: The request is missing WRAP authorization credentials. TrackingId:7be2d7f0-c165-4658-8bf1-ea104c43defc_G28, SystemTracker:NoSystemTracker, Timestamp:2020-06-28T13:33:09</Detail>
</Error>
So it's like it's reading it then ignoring it?
If you want to list queues or topics, you can use either the Azure Service Bus REST API or the Azure Resource Manager REST API. For more details, please refer to the following steps.
Azure Service Bus REST API
Generate a SAS token. For more details, please refer to the document.
For example, I use Python to create the SAS token:
import hmac
import time
import hashlib
import base64
import urllib.parse

sb_name = 'bowmantest'   # your Service Bus namespace name
# your entity path, e.g. $Resources/topics (list topics) or $Resources/queues (list queues)
topic = '$Resources/topics'
url = urllib.parse.quote_plus("https://{}.servicebus.windows.net/{}".format(sb_name, topic))
sas_value = ''                              # your shared access key
sas_name = 'RootManageSharedAccessKey'      # your shared access policy name
expiry = str(int(time.time() + 10000))
to_sign = (url + '\n' + expiry).encode('utf-8')
sas = sas_value.encode('utf-8')
signed_hmac_sha256 = hmac.HMAC(sas, to_sign, hashlib.sha256)
signature = urllib.parse.quote(base64.b64encode(signed_hmac_sha256.digest()))
auth_format = 'SharedAccessSignature sig={0}&se={1}&skn={2}&sr={3}'
auth = auth_format.format(signature, expiry, sas_name, url)
print(auth)
Call the REST API
1) List queues
GET https://<namespace name>.servicebus.windows.net/$Resources/queues
Authorization: <sas token>
2) List topics
GET https://<namespace name>.servicebus.windows.net/$Resources/topics
Authorization: <sas token>
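To tie the two steps together, here is a minimal sketch (Python with the requests library, my own choice) that sends the token produced by the snippet above as the Authorization header. Because that token was signed for the $Resources/topics path, the request below lists topics; to list queues, generate a token against the $Resources/queues path instead.

# Sketch: call the Service Bus namespace with the SAS token built above.
# "auth" is the SharedAccessSignature string printed by the previous snippet;
# the namespace name is a placeholder.
import requests

namespace = "bowmantest"
response = requests.get(
    f"https://{namespace}.servicebus.windows.net/$Resources/topics",
    headers={"Authorization": auth},
)
print(response.status_code)
print(response.text)   # an Atom XML feed with one <entry> per topic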
Azure Resource Manager REST API
Create a service principal and assign an Azure RBAC role to it (I use the Azure CLI).
az login
#it will create a service principal and assign contributor role to the sp
az ad sp create-for-rbac -n "jonsp2"
Get Azure AD token
POST /{tenant}/oauth2/v2.0/token HTTP/1.1 //Line breaks for clarity
Host: login.microsoftonline.com
Content-Type: application/x-www-form-urlencoded
client_id=<app id>
&scope=https://management.azure.com/.default
&client_secret=<app password>
&grant_type=client_credentials
Call the REST API
List queues
GET https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.ServiceBus/namespaces/{namespaceName}/queues?api-version=2017-04-01
Authorization: Bearer <AD token>
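The same two calls can be scripted; below is a minimal sketch in Python with the requests library (my own choice), where the tenant, app credentials, subscription, resource group, and namespace are all placeholders.

# Sketch: Azure AD token via client credentials, then the ARM "list queues" call.
import requests

tenant_id = "<tenant-id>"
token = requests.post(
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
    data={
        "client_id": "<app id>",
        "client_secret": "<app password>",
        "scope": "https://management.azure.com/.default",
        "grant_type": "client_credentials",
    },
).json()["access_token"]

queues_url = (
    "https://management.azure.com/subscriptions/<subscription-id>"
    "/resourceGroups/<resource-group>/providers/Microsoft.ServiceBus"
    "/namespaces/<namespace-name>/queues?api-version=2017-04-01"
)
queues = requests.get(queues_url, headers={"Authorization": f"Bearer {token}"})
print(queues.json())   # JSON with a "value" array, one item per queue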
I have created an AAD app as per https://learn.microsoft.com/en-us/azure/storage/common/storage-auth-aad-app.
Access to the Azure storage account has been granted to the AAD app that was created.
I got the client ID and client secret.
To create a user delegation key and a user delegation SAS, I am using the approach and code defined in
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-user-delegation-sas-create-dotnet
(with the environment variables set as mentioned in the article).
I am able to generate the user delegation key using the GetUserDelegationSasBlob method.
The container and blob file already exist.
Now I am using the ReadBlobWithSasAsync method to read the contents of the blob using the SAS URI generated above.
But I get the error below.
This request is not authorized to perform this operation using this
permission. RequestId:5d127eaf-101e-00be-6666-6a3875000000
Time:2019-09-13T19:04:15.4109144Z
Status: 403 (This request is not authorized to perform this operation
using this permission.)
ErrorCode: AuthorizationPermissionMismatch
In another approach, I am generating the user delegation key using the REST API:
https://learn.microsoft.com/en-us/rest/api/storageservices/get-user-delegation-key
I am able to get the user delegation key in XML format.
I am creating the SAS from it as per the steps in
https://learn.microsoft.com/en-us/rest/api/storageservices/create-user-delegation-sas
For the signature, I am using this code, with the StringToSign and the secret value received from the delegation key:
// HMAC-SHA256 of the string-to-sign, using the key value returned by Get User Delegation Key
var encoding = new System.Text.ASCIIEncoding();
byte[] keyByte = encoding.GetBytes(secret);
byte[] messageBytes = encoding.GetBytes(ToSign);
using (var hmacsha256 = new HMACSHA256(keyByte))
{
    byte[] hashmessage = hmacsha256.ComputeHash(messageBytes);
    String sig = Convert.ToBase64String(hashmessage);
}
I am doing the GET request.
I have tried various sets of parameter values, such as:
sr: b and c
sks: b and c
sp: racwd, r, rw, and a few more
skv and sv: 2018-11-09, because this version is required for creating a user delegation key.
But the GET API returns the error below.
AuthenticationFailed
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the
signature. RequestId:e4bc8f0f-d01e-0046-7367-6af368000000
Time:2019-09-13T19:12:27.7780695Z
Signature fields not well formed.
Try assigning the Storage Blob Data Contributor role to your AAD app (service principal) on the storage account.
The Reader role is an Azure Resource Manager role that permits users to view storage account resources, but not modify them. It does not provide read permissions to data in Azure Storage, but only to account management resources.
Refer to this article.
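For comparison, here is a minimal sketch of the same flow using the Python SDKs (azure-identity and azure-storage-blob); this is an assumption on my part rather than the code from the linked article. Once the service principal has a data-plane role such as Storage Blob Data Contributor (or Storage Blob Data Reader for read-only access), both the delegation key request and the SAS-based read should succeed.

# Sketch: user delegation SAS with the Python SDKs (azure-identity, azure-storage-blob).
# Account, container, and blob names are placeholders.
from datetime import datetime, timedelta
from azure.identity import ClientSecretCredential
from azure.storage.blob import BlobClient, BlobSasPermissions, BlobServiceClient, generate_blob_sas

account = "<storage-account>"
credential = ClientSecretCredential("<tenant-id>", "<client-id>", "<client-secret>")
service = BlobServiceClient(f"https://{account}.blob.core.windows.net", credential=credential)

# Requires a data-plane RBAC role (e.g. Storage Blob Data Contributor) on the account.
delegation_key = service.get_user_delegation_key(
    key_start_time=datetime.utcnow(),
    key_expiry_time=datetime.utcnow() + timedelta(hours=1),
)

sas = generate_blob_sas(
    account_name=account,
    container_name="<container>",
    blob_name="<blob.json>",
    user_delegation_key=delegation_key,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.utcnow() + timedelta(hours=1),
)

blob_url = f"https://{account}.blob.core.windows.net/<container>/<blob.json>?{sas}"
print(BlobClient.from_blob_url(blob_url).download_blob().readall())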
I am setting up a custom policy in Azure AD B2C to connect to an ADFS identity provider. This requires a SAML metadata endpoint, as specified in the documentation at the link below.
https://learn.microsoft.com/en-us/azure/active-directory-b2c/active-directory-b2c-custom-setup-adfs2016-idp#configure-an-adfs-relying-party-trust
The error being encountered is:
AADB2C90022: Unable to return metadata for the policy [my-policy] in tenant [my-tenant].onmicrosoft.com.
It is encountered when I go to the endpoint:
https://login.microsoftonline.com/te/[my-tenant].onmicrosoft.com/[my-policy]/samlp/metadata?idptp=[my-technical-profile]
I have tried making the request from the b2clogin.com endpoint with the same result as above.
E.g. https://[my-tenant].b2clogin.com/te/[my-tenant].onmicrosoft.com/[my-policy]/samlp/metadata?idptp=[my-technical-profile]
I have also tried using my tenant ID GUID in place of [my-tenant].onmicrosoft.com, with the exact same result.
E.g. https://login.microsoftonline.com/te/[my-tenant-id]/[my-policy]/samlp/metadata?idptp=[my-technical-profile]
Revisit the process by which you created the certificate, uploaded it to your 'Policy Keys', and referenced it in your custom policy files.
My scenario was similar: I had the same error and no output via Application Insights / the Journey Recorder.
I had tried to avoid using 'makecert.exe' and instead used another self-signed certificate generation tool. This simply did not work, I think because the private key was not being incorporated into the certificate file.
This guide has been invaluable; see also this test facility.
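If it helps, here is a small sketch (Python with the cryptography package, my own choice rather than anything from the guide) that creates a self-signed certificate together with its private key and exports both as a single password-protected .pfx, the form expected when uploading a SAML signing certificate to B2C policy keys.

# Sketch: self-signed certificate with its private key, exported as a .pfx.
# The common name and password are placeholders.
from datetime import datetime, timedelta
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.hazmat.primitives.serialization import pkcs12

# Generate a key pair and a one-year self-signed certificate for it.
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "yourtenant.onmicrosoft.com")])
cert = (
    x509.CertificateBuilder()
    .subject_name(name)
    .issuer_name(name)
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(datetime.utcnow())
    .not_valid_after(datetime.utcnow() + timedelta(days=365))
    .sign(key, hashes.SHA256())
)

# Export certificate plus private key as a password-protected .pfx for upload to Policy Keys.
pfx = pkcs12.serialize_key_and_certificates(
    name=b"b2c-saml-signing",
    key=key,
    cert=cert,
    cas=None,
    encryption_algorithm=serialization.BestAvailableEncryption(b"ChangeThisPassword"),
)
with open("b2c-saml-signing.pfx", "wb") as f:
    f.write(pfx)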