I've been trying to get an OAuth 2.0 token for Google Analytics via a function app in Azure.
I am following this tutorial:
https://richardswinbank.net/adf/access_google_analytics_with_azure_data_factory#get_an_oauth_token_in_adf
There is a part where a URL is read from an environment variable:
var kvClient = new SecretClient(new Uri(Environment.GetEnvironmentVariable("KEY_VAULT_URL")), new ManagedIdentityCredential());
string keyJson = kvClient.GetSecret("KEY_VAULT_URL").Value.Value;
In the Azure portal, KEY_VAULT_URL is defined under my function app -> Settings -> Configuration -> Application settings.
But the variable does not seem to resolve, because I get this error:
2021-10-14T14:02:24.371 [Error] Executed 'GetOAuthToken' (Failed, Id=bad02220-c792-4c53-af41-621c6a9d12345, Duration=32ms)
The request URI contains an invalid name: KEY_VAULT_URL
Status: 400 (Bad Request)
ErrorCode: BadParameter
Content:
{"error":{"code":"BadParameter","message":"The request URI contains an invalid name: KEY_VAULT_URL"}}
Headers:
Cache-Control: no-cache
Pragma: no-cache
x-ms-keyvault-region: germanywestcentral
x-ms-client-request-id: 0344c4b6-98d9-4ade-9c7f-cb058abd123
x-ms-request-id: d35bd092-faf4-4567-99e4-4aba0123d7b
x-ms-keyvault-service-version: 1.9.132.3
x-ms-keyvault-network-info: conn_type=Ipv4;addr=51.216.128.119;act_addr_fam=InterNetwork;
X-Powered-By: REDACTED
Strict-Transport-Security: REDACTED
X-Content-Type-Options: REDACTED
Date: Thu, 14 Oct 2021 14:02:23 GMT
Content-Length: 101
Content-Type: application/json; charset=utf-8
Expires: -1
The value of the variable looks like this:
https://mykeyvault.vault.azure.net/
Maybe there is a mistake in the value? I removed the trailing slash, but the output is always the same.
This line in the tutorial is wrong. The argument to GetSecret should not be the name of the app setting; it has to be the name of the secret itself.
kvClient.GetSecret("KEY_VAULT_URL")
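For reference, here is a minimal sketch of the corrected call, assuming the secret is named GoogleApiKeyJson (a placeholder; substitute whatever name you gave the secret in your Key Vault):

using System;
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;

// KEY_VAULT_URL stays an app setting holding the vault URI; only the GetSecret()
// argument changes, and it must be the secret's name ("GoogleApiKeyJson" is a placeholder).
var kvClient = new SecretClient(
    new Uri(Environment.GetEnvironmentVariable("KEY_VAULT_URL")),
    new ManagedIdentityCredential());

string keyJson = kvClient.GetSecret("GoogleApiKeyJson").Value.Value;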
I am trying to upload a base64 string as an image file to Azure Blob Storage. Following the Put Blob documentation at https://learn.microsoft.com/en-us/rest/api/storageservices/put-blob, I tried to create the blob.
Request Syntax:
PUT https://myaccount.blob.core.windows.net/mycontainer/myblockblob HTTP/1.1
Request Headers:
x-ms-version: 2015-02-21
x-ms-date: <date>
Content-Type: text/plain; charset=UTF-8
x-ms-blob-content-disposition: attachment; filename="fname.ext"
x-ms-blob-type: BlockBlob
x-ms-meta-m1: v1
x-ms-meta-m2: v2
Authorization: SharedKey myaccount:YhuFJjN4fAR8/AmBrqBz7MG2uFinQ4rkh4dscbj598g=
Content-Length: 11
Request Body:
hello world
I am getting the response below:
<?xml version="1.0" encoding="utf-8"?>
<Error>
<Code>AuthenticationFailed</Code>
<Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:a5d32623-f01e-0040-4275-c1880d000000
Time:2020-11-23T08:45:49.6994297Z</Message>
<AuthenticationErrorDetail>The MAC signature found in the HTTP request 'YhuFJjN4fAR8/AmBrqBz7MG2uFinQ4rkh4dscbj598g=' is not the same as any computed signature. Server used following string to sign: 'PUT
11
text/plain; charset=UTF-8
x-ms-blob-content-disposition:attachment; filename="demo.txt"
x-ms-blob-type:BlockBlob
x-ms-date:Mon, 23 Nov 2020 13:08:11 GMT
x-ms-encryption-key:YhuFJjN4fAR8/AmBrqBz7MG2uFinQ4rkh4dscbj598g=
x-ms-meta-m1:v1
x-ms-meta-m2:v2
x-ms-version:2015-02-21
/<myaccount>/<mycontainer>/<myblob>'.</AuthenticationErrorDetail>
</Error>
How to resolve this issue?
A simple way to upload a blob is to use a SAS token.
Navigate to the Azure portal -> your storage account -> Shared access signature, select the appropriate services, resource types and permissions, then click the Generate SAS and connection string button.
Copy the SAS token and append it to the URL, so the new URL looks like this: https://myaccount.blob.core.windows.net/mycontainer/myblockblob?sv=2019-12-12&ss=b&srt=coxxxxx
Next, paste the new URL into Postman. In the Headers, you can remove the Authorization field.
The request then succeeds.
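If you later want to make the same PUT outside Postman, here is a minimal C# sketch under the same assumptions (the account, container, blob name and SAS query string are placeholders):

using System;
using System.Net.Http;
using System.Text;

var sasUrl = "https://myaccount.blob.core.windows.net/mycontainer/myblockblob"
           + "?sv=2019-12-12&ss=b&srt=co..."; // paste your generated SAS token here

var request = new HttpRequestMessage(HttpMethod.Put, sasUrl)
{
    Content = new StringContent("hello world", Encoding.UTF8, "text/plain")
};
// Put Blob still needs the blob type header, but no Authorization header is needed with a SAS.
request.Headers.Add("x-ms-blob-type", "BlockBlob");

using var client = new HttpClient();
var response = await client.SendAsync(request);
Console.WriteLine(response.StatusCode); // 201 Created if the SAS allows writing blobs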
@sathishKumar
If you look closely at the article Authorize with Shared Key, you will see the syntax is:
Authorization="[SharedKey|SharedKeyLite] <AccountName>:<Signature>"
It is the signature that is passed along and not the Account key.
Signature is a Hash-based Message Authentication Code (HMAC) constructed from the request and computed by using the SHA256 algorithm, and then encoded by using Base64 encoding.
The document above describes in detail how to construct it.
I also came across a post with a PowerShell script that builds the signature string, which could be useful for you:
Sample PowerShell Script
C# implementation:
internal static AuthenticationHeaderValue GetAuthorizationHeader(
    string storageAccountName, string storageAccountKey, DateTime now,
    HttpRequestMessage httpRequestMessage, string ifMatch = "", string md5 = "")
{
    // This is the raw representation of the message signature.
    HttpMethod method = httpRequestMessage.Method;
    String MessageSignature = String.Format("{0}\n\n\n{1}\n{5}\n\n\n\n{2}\n\n\n\n{3}{4}",
        method.ToString(),
        (method == HttpMethod.Get || method == HttpMethod.Head)
            ? String.Empty
            : httpRequestMessage.Content.Headers.ContentLength.ToString(),
        ifMatch,
        GetCanonicalizedHeaders(httpRequestMessage),
        GetCanonicalizedResource(httpRequestMessage.RequestUri, storageAccountName),
        md5);

    // Now turn it into a byte array.
    byte[] SignatureBytes = Encoding.UTF8.GetBytes(MessageSignature);

    // Create the HMACSHA256 version of the storage key.
    HMACSHA256 SHA256 = new HMACSHA256(Convert.FromBase64String(storageAccountKey));

    // Compute the hash of the SignatureBytes and convert it to a base64 string.
    string signature = Convert.ToBase64String(SHA256.ComputeHash(SignatureBytes));

    // This is the actual header that will be added to the list of request headers.
    AuthenticationHeaderValue authHV = new AuthenticationHeaderValue("SharedKey",
        storageAccountName + ":" + signature);
    return authHV;
}
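To show where this helper fits, here is a rough usage sketch for the Put Blob call from the question. It assumes the method above sits in a static class named StorageAuth (a placeholder) together with the GetCanonicalizedHeaders and GetCanonicalizedResource helpers from the same Microsoft sample; the account name, key and URL are also placeholders:

using System;
using System.Net.Http;
using System.Text;

var now = DateTime.UtcNow;
var request = new HttpRequestMessage(
    HttpMethod.Put,
    "https://myaccount.blob.core.windows.net/mycontainer/myblockblob")
{
    // ByteArrayContent sends no Content-Type header; the signature format above
    // also leaves the Content-Type slot empty, so request and signature stay consistent.
    Content = new ByteArrayContent(Encoding.UTF8.GetBytes("hello world"))
};
request.Headers.Add("x-ms-date", now.ToString("R"));   // RFC 1123; signed via the canonicalized headers
request.Headers.Add("x-ms-version", "2015-02-21");
request.Headers.Add("x-ms-blob-type", "BlockBlob");

request.Headers.Authorization = StorageAuth.GetAuthorizationHeader(
    "myaccount", "<storage account key (Base64)>", now, request);

using var client = new HttpClient();
var response = await client.SendAsync(request);
Console.WriteLine(response.StatusCode);   // 201 Created when the signature matches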
I'm trying to make a GET request to Azure Table REST API with Postman.
I can make a working request with a C# program I found, but when I try to copy the same information into the Postman request, it returns the following error:
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
With the C# program I generate my UTC time and my Authorization code.
The program will give me the following output:
x-ms-date: Fri, 01 Nov 2019 10:13:26 GMT
Authorization: SharedKeyLite username:e4IREMOVEDSOMELETTERST4Ag=
Request URI: https://username.table.core.windows.net/MainTable(PartitionKey='akey',RowKey='130')
The generated output works in the C# program, because when I use:
result = await Client.GetAsync(requestUri);
The result will give me the information of (akey, 130).
When I paste these values into Postman, it still gives me an error.
I do update the date in Postman whenever I generate a new authorization string.
My Postman request uses the same x-ms-date and Authorization header values shown above.
I eventually want to make this request from an ESP32, so this might be a bit unrelated, but the ESP gives me the same error. Any tips on setting the headers correctly, either for Postman or the ESP, are appreciated.
To make this work, first create two variables in your environment:
{{utcDate}}
{{authToken}}
Then create a new GET request and set up your headers like this:
x-ms-version 2015-12-11
x-ms-date {{utcDate}}
Authorization SharedKey resourceName:{{authToken}}
DataServiceVersion 3.0;NetFx
MaxDataServiceVersion 3.0;NetFx
Accept application/json;odata=nometadata
Finally, define a Pre-request Script:
var now = new Date().toUTCString();
pm.environment.set("utcDate", now);
var hcar = "/resourceName/TableName";
var verb = request.method;
var cntMd5 = "";
var cntType = "";
var mKey="<Your service key goes here>";
var text = verb + "\n" + (cntMd5 || "") + "\n" + (cntType || "") + "\n" + now + "\n" + hcar;
var key = CryptoJS.enc.Base64.parse(mKey);
var signature = CryptoJS.HmacSHA256(text, key);
var base64Bits = CryptoJS.enc.Base64.stringify(signature);
pm.environment.set("authToken", base64Bits);
The reason for the variables: authToken because you need a placeholder to store the calculated token, and utcDate because the same date in your header must be used to calculate your token.
I found that the problem was within Postman itself.
There has been an ongoing issue with the automatic URL encoding.
When I went directly to MainTable, Mauricio's code worked.
I am attempting to create an Azure Managed Cache using PowerShell and the Azure Management API. This two-pronged approach is required because the official Azure PowerShell cmdlets have only very limited support for creating and updating Azure Managed Cache; there is, however, an established pattern for calling the Azure Management API from PowerShell.
My attempts at finding the correct API to call have been somewhat hampered by the limited documentation on the Azure Managed Cache API. However, after working my way through the cmdlets using both the source code and the -Debug option in PowerShell, I have been able to find what appear to be the correct API endpoints, and I have developed some code to access them.
However, I have become stuck after the PUT request was accepted by the Azure API, as subsequent calls to the Management API /operations endpoint show that the result of this operation was Internal Server Error.
I have been using Joseph Albahari's LINQPad to explore the API, as it allows me to rapidly iterate on a solution using the minimum possible code. To execute the following code snippets you will need both LINQPad and the following extension in your My Extensions script:
public static X509Certificate2 GetCertificate(this StoreLocation storeLocation, string thumbprint) {
    var certificateStore = new X509Store(StoreName.My, storeLocation);
    certificateStore.Open(OpenFlags.ReadOnly);
    var certificates = certificateStore.Certificates.Find(X509FindType.FindByThumbprint, thumbprint, false);
    return certificates[0];
}
The complete source code, including the extensions, is available below:
My Extensions - you can replace your "My Extensions" script by right-clicking My Extensions in the bottom left-hand pane, choosing "Open Script Location in Windows Explorer", and then replacing the highlighted file with this one. Alternatively, you may wish to merge my extensions into your own.
Azure Managed Cache Script - you should simply be able to download and double-click this; once it is open and the above extensions and certificates are in place, you will be able to execute the script.
The following settings are used throughout the script; anyone following along with their own Azure subscription ID and management certificate will need to edit these variables:
var cacheName = "amc551aee";
var subscriptionId = "{{YOUR_SUBSCRIPTION_ID}}";
var certThumbprint = "{{YOUR_MANAGEMENT_CERTIFICATE_THUMBPRINT}}";
var endpoint = "management.core.windows.net";
var putPayloadXml = @"{{PATH_TO_PUT_PAYLOAD}}\cloudService.xml";
First I have done some setup on the HttpClient:
var handler = new WebRequestHandler();
handler.ClientCertificateOptions = ClientCertificateOption.Manual;
handler.ClientCertificates.Add(StoreLocation.CurrentUser.GetCertificate(certThumbprint));
var client = new HttpClient(handler);
client.DefaultRequestHeaders.Add("x-ms-version", "2012-08-01");
This configures HttpClient to use both a client certificate and the x-ms-version header. The first call to the API fetches the existing CloudService that contains the Azure Managed Cache; please note this is using an otherwise empty Azure subscription.
var getResult = client.GetAsync("https://" + endpoint + "/" + subscriptionId + "/CloudServices");
getResult.Result.Dump("GET " + getResult.Result.RequestMessage.RequestUri);
This request is successful as it returns StatusCode: 200, ReasonPhrase: 'OK'. I then parse some key information out of the response: the CloudService name, the cache name and the cache ETag:
var cacheDataReader = new XmlTextReader(getResult.Result.Content.ReadAsStreamAsync().Result);
var cacheData = XDocument.Load(cacheDataReader);
var ns = cacheData.Root.GetDefaultNamespace();
var nsManager = new XmlNamespaceManager(cacheDataReader.NameTable);
nsManager.AddNamespace("wa", "http://schemas.microsoft.com/windowsazure");
var cloudServices = cacheData.Root.Elements(ns + "CloudService");
var serviceName = String.Empty;
var ETag = String.Empty;
foreach (var cloudService in cloudServices) {
    if (cloudService.XPathSelectElements("//wa:CloudService/wa:Resources/wa:Resource/wa:Name", nsManager).Select(x => x.Value).Contains(cacheName)) {
        serviceName = cloudService.XPathSelectElement("//wa:CloudService/wa:Name", nsManager).Value;
        ETag = cloudService.XPathSelectElement("//wa:CloudService/wa:Resources/wa:Resource/wa:ETag", nsManager).Value;
    }
}
I have pre-created an XML file that contains the payload of the following PUT request:
<Resource xmlns="http://schemas.microsoft.com/windowsazure">
  <IntrinsicSettings>
    <CacheServiceInput xmlns="">
      <SkuType>Standard</SkuType>
      <Location>North Europe</Location>
      <SkuCount>1</SkuCount>
      <ServiceVersion>1.3.0</ServiceVersion>
      <ObjectSizeInBytes>1024</ObjectSizeInBytes>
      <NamedCaches>
        <NamedCache>
          <CacheName>default</CacheName>
          <NotificationsEnabled>false</NotificationsEnabled>
          <HighAvailabilityEnabled>false</HighAvailabilityEnabled>
          <EvictionPolicy>LeastRecentlyUsed</EvictionPolicy>
        </NamedCache>
        <NamedCache>
          <CacheName>richard</CacheName>
          <NotificationsEnabled>true</NotificationsEnabled>
          <HighAvailabilityEnabled>true</HighAvailabilityEnabled>
          <EvictionPolicy>LeastRecentlyUsed</EvictionPolicy>
        </NamedCache>
      </NamedCaches>
    </CacheServiceInput>
  </IntrinsicSettings>
</Resource>
I construct an HttpRequestMessage with the above payload and a URL comprising the CloudService and cache names:
var resourceUrl = "https://" + endpoint + "/" + subscriptionId + "/cloudservices/" + serviceName + "/resources/cacheservice/Caching/" + cacheName;
var data = File.ReadAllText(putPayloadXml);
XDocument.Parse(data).Dump("Payload");
var message = new HttpRequestMessage(HttpMethod.Put, resourceUrl);
message.Headers.TryAddWithoutValidation("If-Match", ETag);
message.Content = new StringContent(data, Encoding.UTF8, "application/xml");
var putResult = client.SendAsync(message);
putResult.Result.Dump("PUT " + putResult.Result.RequestMessage.RequestUri);
putResult.Result.Content.ReadAsStringAsync().Result.Dump("Content " + putResult.Result.RequestMessage.RequestUri);
This request is nominally accepted by the Azure Service Management API, as it returns a StatusCode: 202, ReasonPhrase: 'Accepted' response; this essentially means the payload has been accepted and will be processed offline. The operation ID can be parsed out of the HTTP headers to retrieve further information:
var requestId = putResult.Result.Headers.GetValues("x-ms-request-id").FirstOrDefault();
This requestId can be used to request an update on the status of the operation:
var operation = client.GetAsync("https://" + endpoint + "/" + subscriptionId + "/operations/" + requestId);
operation.Result.Dump(requestId);
XDocument.Load(operation.Result.Content.ReadAsStreamAsync().Result).Dump("Operation " + requestId);
The request to the /operations endpoint results in the following payload:
<Operation xmlns="http://schemas.microsoft.com/windowsazure" xmlns:i="http://www.w3.org/2001/XMLSchema-instance">
  <ID>5364614d-4d82-0f14-be41-175b3b85b480</ID>
  <Status>Failed</Status>
  <HttpStatusCode>500</HttpStatusCode>
  <Error>
    <Code>InternalError</Code>
    <Message>The server encountered an internal error. Please retry the request.</Message>
  </Error>
</Operation>
And this is where I am stuck. The chances are I am subtly malforming the request in such a way that the underlying operation throws a 500 Internal Server Error, but without a more detailed error message or API documentation I don't think there is anywhere I can go from here.
We worked with Richard offline and the following XML payload got him unblocked.
Note - when adding or removing a named cache on an existing cache, the object size is fixed.
Note 2 - the Azure Managed Cache API is sensitive to whitespace between the <IntrinsicSettings> element and the <CacheServiceInput> element.
Also please note, we are working on adding named cache capability to the PowerShell cmdlets themselves, so folks don't have to use the APIs to do so.
<Resource xmlns="http://schemas.microsoft.com/windowsazure" xmlns:i="http://www.w3.org/2001/XMLSchema-instance">
<IntrinsicSettings><CacheServiceInput xmlns="" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<SkuType>Standard</SkuType>
<Location>North Europe</Location>
<SkuCount>1</SkuCount>
<ServiceVersion>1.3.0</ServiceVersion>
<ObjectSizeInBytes>1024</ObjectSizeInBytes>
<NamedCaches>
<NamedCache>
<CacheName>default</CacheName>
<NotificationsEnabled>false</NotificationsEnabled>
<HighAvailabilityEnabled>false</HighAvailabilityEnabled>
<EvictionPolicy>LeastRecentlyUsed</EvictionPolicy>
<ExpirationSettings>
<TimeToLiveInMinutes>10</TimeToLiveInMinutes>
<Type>Absolute</Type>
</ExpirationSettings>
</NamedCache>
<NamedCache>
<CacheName>richard</CacheName>
<NotificationsEnabled>false</NotificationsEnabled>
<HighAvailabilityEnabled>false</HighAvailabilityEnabled>
<EvictionPolicy>LeastRecentlyUsed</EvictionPolicy>
<ExpirationSettings>
<TimeToLiveInMinutes>10</TimeToLiveInMinutes>
<Type>Absolute</Type>
</ExpirationSettings>
</NamedCache>
</NamedCaches>
</CacheServiceInput>
</IntrinsicSettings>
</Resource>
I have an Azure Reporting Services instance I want to connect to via the Report Execution Web Service. I have referenced this article to connect. However, I am receiving an error...
The URL of the service is:
i593ehr-i.reporting.windows.net
I connected to:
i593ehr-i.reporting.windows.net/ReportServer/ReportExecution2005.asmx
and downloaded the WSDL file. It should be noted that the documentation used ReportExecution2010.asmx, but that didn't direct to a WSDL file... I used the command supplied in the file to generate a proxy class. I then used this code to connect:
var service = new ReportExecutionService();
service.CookieContainer = new CookieContainer();
service.Credentials = new NetworkCredential("report", "******", "i593ehr-i.reporting.windows.net");
service.LoadReport2(reportPath, null);
string extension;
string mimeType;
string encoding;
Warning[] warnings;
string[] streamIds;
var reportData = service.Render("PDF", null, out extension, out mimeType, out encoding, out warnings, out streamIds);
File.WriteAllBytes(outputFile, reportData);
and it's returning the message:
The Authentication Extension threw an unexpected exception or returned a value that is not valid: identity==null. (rsAuthenticationExtensionError)
What am I doing wrong?
It turns out that I needed to use the LogonUser method instead of NetworkCredentials, which the documentation specified but I must have overlooked... The code should be:
service.LogonUser("report", "******", "i593ehr-i.reporting.windows.net");
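For completeness, here is a sketch of the corrected sequence. It is just the code from the question with the NetworkCredential assignment replaced by LogonUser, using the same generated ReportExecutionService proxy; the report path and output file are placeholders:

var service = new ReportExecutionService();
service.CookieContainer = new CookieContainer();

// Authenticate through the Reporting Services authentication extension
// rather than assigning a NetworkCredential.
service.LogonUser("report", "******", "i593ehr-i.reporting.windows.net");

var reportPath = "/MyReport";      // placeholder: path to your report
var outputFile = "report.pdf";     // placeholder: where to write the rendered PDF

service.LoadReport2(reportPath, null);

string extension, mimeType, encoding;
Warning[] warnings;
string[] streamIds;
var reportData = service.Render("PDF", null, out extension, out mimeType,
                                out encoding, out warnings, out streamIds);
File.WriteAllBytes(outputFile, reportData);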
An Azure Table query via the REST API is failing with an AuthenticationFailed error.
<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<error xmlns="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
<code>AuthenticationFailed</code>
<message xml:lang="en-US">Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.</message>
</error>
The WinJS app code snippet that forms and makes the REST call is:
var date = new Date().toGMTString().replace('UTC', 'GMT');
var xhrOption = {
type: 'GET',
url: url,
headers: {
'content-type': 'application/atom+xml;charset="utf-8"',
'content-length': 0,
dataserviceversion: '1.0;NetFx',
maxdataserviceversion: '2.0;NetFx',
'x-ms-version': '2011-08-18',
'x-ms-date': date,
accept: 'application/atom+xml,application/xml',
'Accept-Charset': 'UTF-8',
},
};
xhrOption.headers.Authorization = AuthorizationHeader().computeForTableService(options, xhrOption);
The code to compute the authorization header is a little long; it is listed below:
_getSignatureStringForTableService: function getSignatureStringForTableService() {
    var headers = this.xhrOptions.headers;
    var httpVerb = this.xhrOptions.type.toUpperCase();
    var sigItems = [];
    sigItems.push(httpVerb);

    var contentMD5 = this._getHeaderOrDefault(headers, 'Content-MD5');
    sigItems.push(contentMD5);

    var contentType = this._getHeaderOrDefault(headers, 'content-type');
    sigItems.push(contentType);

    var date = this._getHeaderOrDefault(headers, 'x-ms-date');
    if (!date)
        date = this._getHeaderOrDefault(headers, 'Date');
    sigItems.push(date);

    var canonicalizedResource = this._getCanonicalizedResource();
    sigItems.push(canonicalizedResource);

    var result = sigItems.join('\n');
    return result;
},

_getCanonicalizedResource: function getCanonicalizedResource() {
    var items = [];
    var path;
    if (config.storageAccount.isDevStorage)
        path = "/" + config.storageAccount.name + '/' + config.storageAccount.name;
    else
        path = "/" + config.storageAccount.name;
    path += "/" + this.options.resourcePath;
    items.push(path);
    var result = items.join('\n');
    return result;
},

computeForTableService: function computeForTableService(options, xhrOptions) {
    this.options = options;
    this.xhrOptions = xhrOptions;
    var sig = this._computeSignatureForTableService();
    var result = 'SharedKey ' + config.storageAccount.name + ':' + sig;
    return result;
},

_computeSignatureForTableService: function computeSignatureForTableService() {
    var sigString = this._getSignatureStringForTableService();
    // TODO: use crypto from windows api. currently uses, google cryptoJS lib
    var key = CryptoJS.enc.Base64.parse(config.storageAccount.primaryKey);
    var hmac = CryptoJS.algo.HMAC.create(CryptoJS.algo.SHA256, key);
    hmac.update(sigString);
    var hash = hmac.finalize();
    var result = hash.toString(CryptoJS.enc.Base64);
    return result;
},
Interestingly, I had this whole code working fine two days earlier. I have since updated the service code to use the updated Azure Node.js SDK; I wonder if the update caused some incompatibility in the publisher/consumer code?
Other observations
The service code, which uses the Azure Node.js module, is able to query the table storage without error.
I debugged through the Azure Node.js module, looked at the stringToSign it builds, and compared it with what the WinJS code produces; both are the same as far as I can tell.
The service was upgraded to use Node 0.10.x and the corresponding latest Azure Node.js SDK.
Example: stringToSign
GET\n\napplication/atom+xml;charset="utf-8"\nWed, 5 Jun 2013 14:43:30 GMT\n/devstoreaccount1/devstoreaccount1/mytable()
Thanks for going through the details.
Finally - the root cause of the bug is out. The issue is the value of the x-ms-date header.
Expected value - Thu, 06 Jun 2013 08:09:50 GMT
Value computed in the code above - Thu, 6 Jun 2013 08:20:34 GMT
The missing 0 before the day of the month is the root cause of this bug. Because of that, the stringToSign used in computing the authorization header is incorrect, hence the Authorization header is incorrect, leading to the AuthenticationFailed error. This also explains why this code worked a couple of days back (end of May, when the day of the month had two digits).
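For comparison (a side note, not part of the original WinJS code), the RFC 1123 date format the service expects always has a two-digit day; in .NET it can be produced with the "R" format specifier:

using System;
using System.Globalization;

// "R" (RFC 1123) always zero-pads the day, e.g. "Thu, 06 Jun 2013 08:09:50 GMT".
Console.WriteLine(DateTime.UtcNow.ToString("R", CultureInfo.InvariantCulture));

The WinJS code therefore needs to emit a zero-padded day before building the x-ms-date header.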
If someone from MS is reading this, it would be so much more useful to have the right amount of detail along with the error code. The AuthenticationFailed error code alone does not give the developer any clue.
I had used the Azure Storage Blob REST API earlier. It returns a better error for the same AuthenticationFailed error code: it sends across both the expected stringToSign and the stringToSign it found, along with the error code. That is so much more helpful, and the bug gets resolved in a couple of minutes.
I used Network Monitor from Microsoft, wrote a C# snippet to make the same Azure Table query using the Azure .NET SDK, and compared every header character by character to pin down the issue.