I'm getting intermittent 403 errors when accessing blob storage via Azure CDN using the storage account access key (shared key auth). It seems that sometimes a "Range" header is added, in the format "bytes=xxx". The full error message is below:
{'Date': 'Mon, 12 Dec 2022 13:07:40 GMT', 'Content-Type': 'application/xml', 'Content-Length': '697', 'Connection': 'keep-alive', 'x-ms-request-id': '3f89c2c1-e01e-0050-132a-0eeb42000000', 'x-ms-error-code': 'AuthenticationFailed', 'x-azure-ref': '20221212T130740Z-6rfkrgx8qt0shbtz3x46rwnhrn0000000630000000002ayd', 'X-Cache': 'TCP_MISS'}
<?xml version="1.0" encoding="utf-8"?><Error><Code>AuthenticationFailed</Code><Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:3f89c2c1-e01e-0050-132a-0eeb42000000
Time:2022-12-12T13:07:40.7638741Z</Message><AuthenticationErrorDetail>The MAC signature found in the HTTP request 'xxxxxx=' is not the same as any computed signature. Server used following string to sign: 'GET
bytes=0-8388607
x-ms-date:Mon, 12 Dec 2022 13:07:36 GMT
x-ms-version:2020-04-08
/deviceimage2zgjscikl7kny/images/data-prod-1.1.packer'.</AuthenticationErrorDetail></Error>
I was able to reproduce the error by generating the MAC signature in Python, but I saw it originally using the Go SDK and az CLI.
We added a rule on the CDN to bypass caching, which seems to have improved the situation (the problem happens less frequently), but we still see it on occasion.
Has anyone else experienced this? And is there a workaround?
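For context, here is a minimal sketch (illustration only, not our actual signing code) of the Blob Shared Key string-to-sign for the failing GET, reconstructed from the error detail above. The Range header has its own line in the string-to-sign, so a signature computed with an empty Range line cannot match a request to which the CDN later adds Range: bytes=0-8388607.

// Illustration only: reconstructing the Shared Key string-to-sign that the
// service reports in the error above, to show where the Range header sits.
using System;

class RangeSignatureSketch
{
    static void Main()
    {
        string stringToSign =
            "GET\n" +
            // Content-Encoding, Content-Language, Content-Length, Content-MD5, Content-Type,
            // Date, If-Modified-Since, If-Match, If-None-Match, If-Unmodified-Since (all empty)
            "\n\n\n\n\n\n\n\n\n\n" +
            "bytes=0-8388607\n" +                                     // Range, as injected by the CDN
            "x-ms-date:Mon, 12 Dec 2022 13:07:36 GMT\n" +             // canonicalized headers
            "x-ms-version:2020-04-08\n" +
            "/deviceimage2zgjscikl7kny/images/data-prod-1.1.packer";  // canonicalized resource

        // A client that signed this string with an *empty* Range line produces a different
        // HMAC than the one the service computes for the request the CDN actually forwarded.
        Console.WriteLine(stringToSign);
    }
}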
Trying to access a blob storage with an access key, via Azure CDN
I tried this in my environment and got the results below:
Initially, I got the same error when I tried to access blob storage through the CDN using Postman.
Postman:
The above error states that the signature and date are incorrect, so you can't pass the storage access key directly. You need to create a signature string that represents the given request, sign that string with the HMAC-SHA256 algorithm (using your storage account key to sign), and encode the result in Base64.
To create the signature, I used the .NET code below:
using System;
using System.Globalization;
using System.IO;
using System.Net;
using System.Security.Cryptography;

class Program
{
    static void Main(string[] args)
    {
        ListBlobs();
        Console.WriteLine("done");
        Console.ReadLine();
    }

    static void ListBlobs()
    {
        string Account = "venkat123";
        string Key = "<Storage account key>";
        string Container = "test";
        string apiversion = "2021-06-08";
        DateTime dt = DateTime.UtcNow;

        // Shared Key string-to-sign for the List Blobs operation. The empty lines are the
        // standard headers (Content-Encoding ... Range) that are not set on this GET.
        string StringToSign = String.Format("GET\n"
            + "\n" // content encoding
            + "\n" // content language
            + "\n" // content length
            + "\n" // content md5
            + "\n" // content type
            + "\n" // date
            + "\n" // if modified since
            + "\n" // if match
            + "\n" // if none match
            + "\n" // if unmodified since
            + "\n" // range
            + "x-ms-date:" + dt.ToString("R") + "\nx-ms-version:" + apiversion + "\n" // canonicalized headers
            + "/{0}/{1}\ncomp:list\nrestype:container", Account, Container);           // canonicalized resource

        string auth = SignThis(StringToSign, Key, Account);
        Console.WriteLine($"the date is: {dt.ToString("R")}");
        Console.WriteLine($"the auth token is: {auth}");
        Console.WriteLine("*********");

        string method = "GET";
        string urlPath = string.Format("https://{0}.blob.core.windows.net/{1}?restype=container&comp=list", Account, Container);
        Uri uri = new Uri(urlPath);
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
        request.Method = method;
        request.Headers.Add("x-ms-date", dt.ToString("R"));
        request.Headers.Add("x-ms-version", apiversion);
        request.Headers.Add("Authorization", auth);

        Console.WriteLine("***list all the blobs in the specified container, in xml format***");
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        {
            using (StreamReader reader = new StreamReader(response.GetResponseStream()))
            {
                Console.WriteLine(reader.ReadToEnd());
            }
        }
    }

    // Signs the string-to-sign with HMAC-SHA256 using the Base64-decoded account key
    // and builds the "SharedKey <account>:<signature>" Authorization header value.
    private static String SignThis(String StringToSign, string Key, string Account)
    {
        String signature = string.Empty;
        byte[] unicodeKey = Convert.FromBase64String(Key);
        using (HMACSHA256 hmacSha256 = new HMACSHA256(unicodeKey))
        {
            Byte[] dataToHmac = System.Text.Encoding.UTF8.GetBytes(StringToSign);
            signature = Convert.ToBase64String(hmacSha256.ComputeHash(dataToHmac));
        }

        String authorizationHeader = String.Format(
            CultureInfo.InvariantCulture,
            "{0} {1}:{2}",
            "SharedKey",
            Account,
            signature);

        return authorizationHeader;
    }
}
Console:
I executed the code above, copied the printed date and signature, used them in Postman, and the request succeeded.
Postman:
Related
My application generates SAS tokens to access existing blobs within my container. However, my SAS token does not look like it is expiring. I am able to view and get blobs from the container way past the expiration time I am setting.
Here is the code:
public string GenerateSasToken([NotNull] string containerName, [NotNull] string blobName)
{
    var startTime = DateTimeOffset.UtcNow;
    var expiredTime = startTime.AddSeconds(20);
    var blobClient = new BlobClient(_options.Value.ConnectionString, containerName, blobName);
    var sasBuilder = new BlobSasBuilder(BlobContainerSasPermissions.Read, expiredTime)
    {
        BlobName = blobName,
        BlobContainerName = containerName,
        StartsOn = startTime,
        ExpiresOn = expiredTime
    };
    var uri = blobClient.GenerateSasUri(sasBuilder);
    return uri.ToString();
}
The generated token is valid and I am able to use it, but it does not expire after 20 seconds; in fact, it does not expire even after 15 minutes.
Am I missing something within this API?
Thank you!
Edit:
I am attaching the SAS token that was generated.
?sv=2020-08-04&st=2022-01-24T21%3A20%3A41Z&se=2022-01-24T21%3A21%3A01Z&sr=b&sp=r&sig=signature-here
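For reference, decoding the token fields: st=2022-01-24T21:20:41Z and se=2022-01-24T21:21:01Z are exactly 20 seconds apart, sr=b scopes it to a single blob, and sp=r grants read-only access, so the token itself appears to be built correctly.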
Even though the SAS token has expired, because of browser caching you would still be able to access the blob using the same SAS token.
To avoid this, you can override the Cache-Control header in the SAS token, as suggested by Gaurav Mantri.
You need to set the CacheControl value in your BlobSasBuilder to override the Cache-Control response header.
Your BlobSasBuilder can be as below:
var sasBuilder = new BlobSasBuilder(BlobContainerSasPermissions.Read, expiredTime)
{
    BlobName = blobName,
    BlobContainerName = containerName,
    StartsOn = startTime,
    ExpiresOn = expiredTime,
    // Override the Cache-Control response header so the blob is not cached longer
    // than the SAS is valid (expiredTime is a DateTimeOffset, so compute the
    // remaining lifetime in seconds rather than concatenating the timestamp).
    CacheControl = "max-age=" + (int)(expiredTime - startTime).TotalSeconds
};
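As a quick sanity check (a sketch reusing the GenerateSasToken method above; the container and blob names are placeholders), the CacheControl value should surface on the generated SAS URI as the rscc query parameter, which is what overrides the Cache-Control response header:

// Rough check: the CacheControl set on BlobSasBuilder shows up as the "rscc"
// query parameter of the SAS URI, e.g. ...&sp=r&rscc=max-age%3D20&sig=...
var sasUrl = GenerateSasToken("my-container", "my-blob.txt"); // placeholder names
Console.WriteLine(sasUrl.Contains("rscc=")
    ? "Cache-Control override present on SAS URI"
    : "Cache-Control override missing");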
I've been working with X.509 certificates in order to make secure requests to some data services. They require two-way SSL auth, so I've converted my "Sandbox" certificate (.crt) together with my private key into a password-protected .p12 file.
Here's the first question: where should I place this .p12 file so that it's readable by my application after deploying to Azure (using DevOps) but still stored securely? Can I use my Azure Key Vault?
The second issue is that in my dev environment I haven't been able to establish the SSL connection when making the request (with an absolute path to the .p12):
Here's the code I'm using:
// Requires: System, System.IO, System.Net, System.Security.Cryptography.X509Certificates, System.Text
void GetATMs()
{
    string requestURL = "https://sandbox.api.visa.com/globalatmlocator/v1/localatms/atmsinquiry";
    string userId = "MyUserId";
    string password = "MyPassword";
    string p12certificatePath = "C:\\Code\\projects\\project\\Clients\\PaymentGateways\\Visa\\Certs\\TC_keyAndCertBundle.p12";
    string p12certificatePassword = "CertPassword";
    string postData = @"{""wsRequestHeaderV2"": { ""requestTs"": ""2018-11-06T03:16:18.000Z"", ""applicationId"": ""VATMLOC"", ""requestMessageId"": ""ICE01-001"", ""userId"": ""CDISIUserID"", ""userBid"": ""10000108"", ""correlationId"": ""909420141104053819418"" }, ""requestData"": { ""culture"": ""en-US"", ""distance"": ""20"", ""distanceUnit"": ""mi"", ""metaDataOptions"": 0, ""location"": { ""address"": null, ""placeName"": ""700 Arch St, Pittsburgh, PA 15212"", ""geocodes"": null }, ""options"": { ""range"": { ""start"": 10, ""count"": 20 }, ""sort"": { ""primary"": ""city"", ""direction"": ""asc"" }, ""operationName"": ""or"", ""findFilters"": [ { ""filterName"": ""OPER_HRS"", ""filterValue"": ""C"" } ], ""useFirstAmbiguous"": true } } }";

    HttpWebRequest request = WebRequest.Create(requestURL) as HttpWebRequest;
    request.Method = "POST";

    // Add headers (Basic auth built from userId:password)
    string authString = userId + ":" + password;
    var authStringBytes = System.Text.Encoding.UTF8.GetBytes(authString);
    string authHeaderString = Convert.ToBase64String(authStringBytes);
    request.Headers["Authorization"] = "Basic " + authHeaderString;

    // Add client certificate for two-way SSL
    var certificate = new X509Certificate2(p12certificatePath, p12certificatePassword);
    request.ClientCertificates.Add(certificate);

    request.Accept = "application/json";
    var data = Encoding.ASCII.GetBytes(postData);
    request.ContentLength = data.Length;

    // Get the request stream.
    Stream dataStream = request.GetRequestStream();
    // Write the data to the request stream.
    dataStream.Write(data, 0, data.Length);
    // Close the Stream object.
    dataStream.Close();

    // Get the response.
    WebResponse response = request.GetResponse();
    // Display the status.
    Console.WriteLine(((HttpWebResponse)response).StatusDescription);
    // Get the stream containing content returned by the server.
    dataStream = response.GetResponseStream();
    // Open the stream using a StreamReader for easy access.
    StreamReader reader = new StreamReader(dataStream);
    // Read the content.
    string responseFromServer = reader.ReadToEnd();
    // Display the content.
    Console.WriteLine(responseFromServer);
    // Clean up the streams.
    reader.Close();
    dataStream.Close();
    response.Close();
}
What am I missing here?
It fails the following way:
An unhandled exception occurred while processing the request.
Win32Exception: The credentials supplied to the package were not recognized
System.Net.SSPIWrapper.AcquireCredentialsHandle(SSPIInterface secModule, string package, CredentialUse intent, SCHANNEL_CRED scc)
HttpRequestException: The SSL connection could not be established, see inner exception.
System.Net.Http.ConnectHelper.EstablishSslConnectionAsyncCore(Stream stream, SslClientAuthenticationOptions sslOptions, CancellationToken cancellationToken)
WebException: The SSL connection could not be established, see inner exception. The credentials supplied to the package were not recognized
System.Net.HttpWebRequest.GetResponse()
We have a wildcard SSL certificate for our domain. Is that different? Can it be registered in the Visa dashboard and used to make secure requests, given that it is signed by a trusted CA?
Well, yes. As per @dagope's recommendation, I've uploaded my certificate to Azure Key Vault and access it through the SDK. This is also a best practice for key/certificate management on Azure.
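Roughly, retrieving it looks like the sketch below (not my exact code; the vault URI and certificate name are placeholders, and it assumes the Azure.Security.KeyVault.Certificates and Azure.Identity packages, with a certificate policy that allows the private key to be exported):

using System;
using System.Security.Cryptography.X509Certificates;
using Azure.Identity;
using Azure.Security.KeyVault.Certificates;

class KeyVaultCertificateExample
{
    static X509Certificate2 LoadClientCertificate()
    {
        // Placeholder vault URI and certificate name.
        var client = new CertificateClient(
            new Uri("https://my-key-vault.vault.azure.net/"),
            new DefaultAzureCredential()); // managed identity in Azure, or az login locally

        // DownloadCertificate returns the certificate together with its private key,
        // which is what the two-way SSL (client certificate) handshake needs.
        X509Certificate2 certificate = client.DownloadCertificate("visa-sandbox-cert");
        return certificate;
    }
}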
I have an Azure Cosmos DB stored procedure, and I need to hit it with a Python script that I'm going to upload as a WebJob so it runs once per day.
I've been reading the docs on executing a stored procedure, the common request headers for Azure Cosmos DB REST calls, and the page on access control, but the access control page mentions that those keys are for read queries only (so I assume not for hitting stored procedures, which have rights to run any sort of query, or else that seems like a huge vulnerability hole).
Specifically, how do I get a key from Azure in Python to hit my stored procedure endpoint?
Update 1
I was finally able to construct the Authorization string and send it, along with some other headers, to the server, but I am still getting an unauthorized response.
The response:
{
"code": "Unauthorized",
"message": "The input authorization token can't serve the request. Please check that the expected payload is built as per the protocol, and check the key being used. Server used the following payload to sign: 'post\nsprocs\ndbs/metrics/colls/LoungeVisits/sprocs/calculateAverage\nfri, 05 oct 2018 19:06:17 gmt\n\n'\r\nActivityId: 41cd36af-ad0e-40c3-84c8-761ebd14bf6d, Microsoft.Azure.Documents.Common/2.1.0.0"
}
The request headers:
{
Authorization: [my-auth-string],
x-ms-version: "2017-02-22", //My DB was created after this, the latest version, so I assume it uses this version; can I verify this somehow?
x-ms-date: "Fri, 05 Oct 2018 19:06:17 GMT", // My js for returning the auth string also returns the date, so I copy both in
Content-Type: application/json
}
The code that generates the auth string, which is then copied and pasted into Postman:
var crypto = require("crypto");

var inputKey = "my-key-from-azure";
var today = new Date().toUTCString();
console.log(today);
console.log(getAuthorizationTokenUsingMasterKey("POST", "dbs", "dbs/ToDoList", today, inputKey));

function getAuthorizationTokenUsingMasterKey(verb, resourceType, resourceId, date, masterKey)
{
    var key = Buffer.from(masterKey, "base64");

    var text = (verb || "").toLowerCase() + "\n" +
        (resourceType || "").toLowerCase() + "\n" +
        (resourceId || "") + "\n" +
        date.toLowerCase() + "\n" +
        "" + "\n";

    var body = Buffer.from(text, "utf8");
    var signature = crypto.createHmac("sha256", key).update(body).digest("base64");

    var MasterToken = "master";
    var TokenVersion = "1.0";

    return encodeURIComponent("type=" + MasterToken + "&ver=" + TokenVersion + "&sig=" + signature);
}
The page about authorization headers applies to any Cosmos DB REST request: queries, stored procedures, and so on.
Azure Cosmos DB has a Python SDK, which is the recommended and supported way to handle such scenarios.
The Python SDK is also open source, so its auth header creation code can be used as a reference.
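For reference, the payload the server says it used to sign ('post\nsprocs\ndbs/metrics/colls/LoungeVisits/sprocs/calculateAverage\n<date>\n\n') suggests that for executing a stored procedure the resource type should be 'sprocs', the resource id should be the full resource link of the stored procedure, and the lowercase date in the signed string must match the x-ms-date header exactly; the snippet above signs with 'dbs' and 'dbs/ToDoList', which would explain the mismatch.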
I want to send a message to Azure Service Bus from Azure Scheduler using a POST request,
like the demo on this page:
http://www.prasadthinks.com/
but I don't know how to set the 'authorization' property in the HTTP header.
As far as I know, the 'authorization' property must contain the Service Bus access token.
You could use your shared access policy's key name and key to generate the access token in code.
For more details, you could refer to the code below.
string keyName = "keyname";
string key = "key";
var sasToken = createToken("http://yourservicebusname.servicebus.windows.net/queuename", keyName, key);
createToken function:
private static string createToken(string resourceUri, string keyName, string key)
{
    TimeSpan sinceEpoch = DateTime.UtcNow - new DateTime(1970, 1, 1);
    var expiry = Convert.ToString((int)sinceEpoch.TotalSeconds + 7200); // expires in 2 hours

    string stringToSign = HttpUtility.UrlEncode(resourceUri) + "\n" + expiry;
    HMACSHA256 hmac = new HMACSHA256(Encoding.UTF8.GetBytes(key));
    var signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));

    // This is the auth token
    var sasToken = String.Format(CultureInfo.InvariantCulture,
        "SharedAccessSignature sr={0}&sig={1}&se={2}&skn={3}",
        HttpUtility.UrlEncode(resourceUri), HttpUtility.UrlEncode(signature), expiry, keyName);

    return sasToken;
}
The result is a token of the form SharedAccessSignature sr=...&sig=...&se=...&skn=<keyname>. This is the value for the 'authorization' property; you can copy it, but note that the token is only valid for two hours.
Besides, Azure Scheduler jobs already support sending messages to Service Bus, so you don't need to create the SAS token yourself; you can just add the key name and key in the job's authentication settings.
I am trying to call the Azure Notification Hub REST API, based on this documentation. As it describes, I tried to create the authorization header for the API, but it is giving me the error "The credentials contained in the authorization header are not in the WRAP format".
My demo DefaultFullSharedAccessSignature connection string is:
Endpoint=sb://shinetrialhub-ns.servicebus.windows.net/;SharedAccessKeyName=DefaultFullSharedAccessSignature;SharedAccessKey=BaGJbFDQZ+hkbi2MdUj7gU0tOM+aC/k+mez9J/y54Qc=
Here is my API: https://shinetrialhub-ns.servicebus.windows.net/shinetrialhub/messages/?api-version=2015-01
I am calling it by adding a valid header (please see the MSDN documentation).
You need to generate a Shared Access Signature (SAS) token for Service Bus authentication. I've been using the code below to achieve this, with:
resourceUri: https://shinetrialhub-ns.servicebus.windows.net/shinetrialhub/
keyName: RootManageSharedAccessKey
key: the value for RootManageSharedAccessKey
private string GetSasToken(string resourceUri, string keyName, string key)
{
    var expiry = GetExpiry();
    var stringToSign = HttpUtility.UrlEncode(resourceUri) + "\n" + expiry;
    var hmac = new HMACSHA256(Encoding.UTF8.GetBytes(key));
    var signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));

    var sasToken = string.Format(CultureInfo.InvariantCulture,
        "SharedAccessSignature sr={0}&sig={1}&se={2}&skn={3}",
        HttpUtility.UrlEncode(resourceUri), HttpUtility.UrlEncode(signature), expiry, keyName);

    return sasToken;
}

private string GetExpiry()
{
    var sinceEpoch = DateTime.UtcNow - new DateTime(1970, 1, 1);
    return Convert.ToString((int)sinceEpoch.TotalSeconds + 102000); // token valid for that many seconds
}
Also make sure you have all the right headers, as shown in the documentation.
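For example, here is a rough sketch of how the token can be attached when posting a notification with HttpClient (the payload and the ServiceBusNotification-Format value are placeholders, the endpoint is the one from the question; check the REST documentation for the exact headers your platform requires):

// A minimal sketch (assumptions: template-format JSON payload, endpoint from the question).
// The SAS token returned by GetSasToken goes into the Authorization header as-is.
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class SendNotificationExample
{
    static async Task SendAsync(string sasToken)
    {
        var endpoint = "https://shinetrialhub-ns.servicebus.windows.net/shinetrialhub/messages/?api-version=2015-01";

        using (var client = new HttpClient())
        {
            var request = new HttpRequestMessage(HttpMethod.Post, endpoint);
            // The SAS token contains '=' and '&', so add it without validation.
            request.Headers.TryAddWithoutValidation("Authorization", sasToken);
            // Tells Notification Hubs which platform/format the body is in (placeholder value).
            request.Headers.Add("ServiceBusNotification-Format", "template");
            request.Content = new StringContent(
                "{\"message\":\"Hello from REST\"}",
                Encoding.UTF8,
                "application/json");

            var response = await client.SendAsync(request);
            Console.WriteLine((int)response.StatusCode + " " + response.ReasonPhrase);
        }
    }
}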