I'm trying to upload a file directly from the web browser using the Azure Storage REST API and Ajax (actually I'm using Angular's $http).
I've followed every possible official and custom guide without success.
For starters, my CORS is set like this:
var storageAccount = CloudStorageAccount.Parse(StorageConnectionString);
var blobClient = storageAccount.CreateCloudBlobClient();
var blobServiceProperties = blobClient.GetServiceProperties();
blobServiceProperties.Cors = new CorsProperties();
blobServiceProperties.Cors.CorsRules.Add(new CorsRule()
{
AllowedHeaders = new List<string>() { "*" },
AllowedMethods = CorsHttpMethods.Put | CorsHttpMethods.Get | CorsHttpMethods.Head | CorsHttpMethods.Post,
AllowedOrigins = new List<string>() { "*" },
ExposedHeaders = new List<string>() { "*" },
MaxAgeInSeconds = 1800 // 30 minutes
});
blobClient.SetServiceProperties(blobServiceProperties);
At UI level I call my own API that returns the SAS URL like this:
var storageAccount = CloudStorageAccount.Parse(StorageConnectionString);
var blobClient = storageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("fotosrestaurantes");
var blobPermissions = container.GetPermissions();
blobPermissions.SharedAccessPolicies.Clear();
blobPermissions.SharedAccessPolicies.Add("enviarFotoRestaurante", new SharedAccessBlobPolicy()
{
Permissions = SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.Create | SharedAccessBlobPermissions.Add,
SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(30),
});
container.SetPermissions(blobPermissions);
var sasToken = container.GetSharedAccessSignature(null, "enviarFotoRestaurante");
return Ok(sasToken);
Then I'm using this Angular module for uploading the blob: https://github.com/kinstephen/angular-azure-blob-upload
So far so good, but when I try to upload I get this from the OPTIONS request (from Chrome's Network tab):
403 Server failed to authenticate the request. Make sure the value of
Authorization header is formed correctly including the signature.
And this from the console:
XMLHttpRequest cannot load 'myAzureHttpLink/fotosrestaurantes/google.jpg?sv=2015-04-05&sr=b&si=enviarFoto&sig=***&se=2016-01-11T14%3A16%3A22Z&sp=w&comp=block&blockid=YmxvY2stMDAwMDAw
Response to preflight request doesn't pass access control check: No
'Access-Control-Allow-Origin' header is present on the requested
resource. Origin 'http://localhost:34090' is therefore not allowed
access. The response had HTTP status code 403.
Now, as far as I can understand from the official documentation (most of it is very brief and has lots of links that almost never cover what you really need), if I choose to use a SAS URL I don't need the Authorization header, as Gaurav says here: https://stackoverflow.com/a/33846704/3198372
I've tried everything, from container SAS to blob SAS, but nothing works. It's as if the CORS configuration and SAS URL just don't work (even though they are there).
Does anyone know where I'm wrong?
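For reference, the container-level SAS token returned by the API above is just a query string; the client has to append it to the full blob URL before issuing the PUT. A minimal sketch, assuming the token arrives with a leading '?' (the account name, file name, and `buildBlobUploadUrl` helper below are placeholders, not from the question):

```javascript
// Build the URL a browser uploader would PUT to: the blob's full path inside
// the container, plus the container SAS token as the query string.
// Assumes sasToken is returned by the API with a leading '?'.
function buildBlobUploadUrl(containerUrl, blobName, sasToken) {
  // Encode the blob name so spaces and special characters survive in the path.
  return containerUrl + '/' + encodeURIComponent(blobName) + sasToken;
}

const url = buildBlobUploadUrl(
  'https://myaccount.blob.core.windows.net/fotosrestaurantes',
  'google.jpg',
  '?sv=2015-04-05&sr=c&si=enviarFotoRestaurante&sig=...'
);
// The PUT request against this URL needs no Authorization header:
// the signature travels in the query string.
```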
Related
I have an Azure storage account where I store blobs in containers.
I am generating a SAS URL in order to show the images in my React web app.
When pasting the URL into the browser everything works fine and the image is downloaded,
but when I try to display it in an img tag in the browser I receive the following issue:
Failed to load resource: the server responded with a status of 403 (Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.)
It started to happen a day ago and before it worked fine.
A sample form of URL that I generate is:
https://{{storageName}}.blob.core.windows.net/{{ContainerName}}/5261a483-e131-40f9-90b2-91657b1daec7.png?sv=2020-10-02&st=2022-01-01T10%3A58%3A20Z&se=2022-01-01T11%3A01%3A27Z&sr=b&sp=r&sig=vN2k3%2BD04BDwnSIDx%2F%2FDyGfUt1UIIoivfOzdfh0kWG0%3D
And the code I am using to generate it is:
try {
//get extension of promo image
var containerName = container;
const client = blobServiceClient.getContainerClient(containerName)
if (!containerName)
return '';
const blobName = imageName;
const blobClient = client.getBlobClient(blobName);
const blobSAS = generateBlobSASQueryParameters({
containerName,
blobName,
permissions: BlobSASPermissions.parse("r"),
startsOn: new Date(),
expiresOn: new Date(new Date().valueOf() + 186400)
},
cerds
).toString();
// await sleep(0);
const sasUrl = blobClient.url + "?" + blobSAS;
// console.log(sasUrl);
return sasUrl;
}
catch (err) {
console.log(err)
return '';
}
Why is this happening? How can it be that from the browser URL I always get a good response and can download the image, but from the img tag I get a 403?
As a workaround, try these solutions:
Solution 1) Check whether your storage account has the firewall enabled.
Azure Portal -> Storage Account -> Networking -> check "Allow access from" (All Networks / Selected Networks)
If it is "Selected Networks", the storage account is firewall enabled.
If the storage account is firewall enabled, check that your app is whitelisted for access.
For more details refer to this document: https://learn.microsoft.com/en-us/answers/questions/334786/azure-blob-storage-fails-to-authenticate-34make-su.html
Solution 2) Check whether the SAS (shared access signature) has expired. Try updating to a new one.
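On Solution 2: note that the question's expiresOn adds 186400 milliseconds, which is barely three minutes of validity, so the token can easily be expired by the time the img tag loads. A quick sketch of the arithmetic (plain Date math, no Azure SDK involved):

```javascript
// JavaScript Date arithmetic works in milliseconds.
const now = new Date();

// 186400 ms, as in the question's expiresOn: barely over 3 minutes of validity.
const shortExpiry = new Date(now.valueOf() + 186400);

// 24 hours written out explicitly, which is likely what was intended.
const dayExpiry = new Date(now.valueOf() + 24 * 60 * 60 * 1000);

console.log(Math.round((shortExpiry - now) / 60000)); // minutes of validity: 3
console.log((dayExpiry - now) / 3600000);             // hours of validity: 24
```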
I need to create an Azure CDN endpoint for an Azure container. I am using the code below to do so.
Endpoint endpoint = new Endpoint() {
IsHttpAllowed = true,
IsHttpsAllowed = true,
Location = this.config.ResourceLocation,
Origins = new List<DeepCreatedOrigin> { new DeepCreatedOrigin(containerName, string.Format(STORAGE_URL, storageAccountName)) },
OriginPath = "/" + containerName,
};
await this.cdnManagementClient.Endpoints.CreateAsync(this.config.ResourceGroupName, storageAccountName, containerName, endpoint);
All the information I provide is correct and the endpoint is created successfully. But when I try to access any blob inside it, I get an InvalidUrl error.
The weird thing is that if I create the same endpoint with the same values through the portal, I am able to access and download blobs.
Can anyone let me know what I am doing wrong in my code? Do I need to pass any extra parameters?
As far as I know, if you want to create a storage CDN endpoint in code, you need to set the OriginHostHeader value to your storage account URL.
For more details, you could refer to the code below:
// Create CDN client
CdnManagementClient cdn = new CdnManagementClient(new TokenCredentials(token))
{ SubscriptionId = subscriptionId };
//ListProfilesAndEndpoints(cdn);
Endpoint e1 = new Endpoint()
{
// OptimizationType = "storage",
Origins = new List<DeepCreatedOrigin>() { new DeepCreatedOrigin("{yourstoragename}-blob-core-windows-net", "{yourstoragename}.blob.core.windows.net") },
OriginHostHeader = "{yourstoragename}.blob.core.windows.net",
IsHttpAllowed = true,
IsHttpsAllowed = true,
OriginPath = @"/foo2",
Location = "EastAsia"
};
cdn.Endpoints.Create(resourcegroup, profilename, endpointname, e1);
Besides, I suggest you generate a SAS token to directly access the blob file by URL.
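To illustrate the role of OriginPath: the CDN prepends it to the incoming request path when forwarding to the origin, so an endpoint with OriginPath "/foo2" serves origin content from under /foo2. A rough sketch of that mapping (the `originUrlFor` helper is illustrative, not an SDK call):

```javascript
// Sketch of how a CDN endpoint rewrites an incoming request path:
// the configured OriginPath is prefixed before forwarding to the origin host.
function originUrlFor(originHost, originPath, requestPath) {
  return 'https://' + originHost + originPath + requestPath;
}

// With OriginPath "/foo2", a request for /image.png on the endpoint
// maps to /foo2/image.png on the storage origin.
originUrlFor('myaccount.blob.core.windows.net', '/foo2', '/image.png');
// → 'https://myaccount.blob.core.windows.net/foo2/image.png'
```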
I've seen a ton of SO posts about this error, but I've checked everything and still can't find why Azure Blob Storage keeps failing to authenticate my upload request.
I use Fine Uploader to generate the request:
var uploaderInstance = $('#baz-fine-uploader').fineUploaderAzure({
template: 'qq-template-manual-trigger',
debug: true,
request: {
containerUrl: 'https://{MYACCOUNT}.blob.core.windows.net/client1',
endpoint: 'https://{MYACCOUNT}.blob.core.windows.net/client1'
},
// for Azure
signature: {
endpoint: "/api/upload/sas"
},
uploadSuccess: {
endpoint: "/api/upload/success"
},
cors: {
//all requests are expected to be cross-domain requests
expected:true
}
});
I generate the SAS URI with the Azure SDK, following what Fine Uploader recommends:
public string GetBlobSAS(string bloburi, string method)
{
try
{
var credentials = new StorageCredentials(STORAGE_ACCOUNT_NAME, STORAGE_ACCOUNT_KEY);
CloudBlockBlob blob = new CloudBlockBlob(new Uri(bloburi), credentials);
var sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
{
Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Create,
SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15)
});
return string.Format(CultureInfo.InvariantCulture, "{0}{1}", bloburi, sas);
}
catch (Exception ex)
{
Debug.WriteLine(ex);
throw; // rethrow, preserving the original stack trace
}
}
I created a CORS access policy through the Azure portal and, for testing purposes, I set "everything" allowed, in particular the "Allowed-Origin" field to "*", since I'm testing from my localhost:
Finally, Fine Uploader asks me for the SAS, fetches it, and BOOM, the server answers:
OPTIONS https://{MYACCOUNT}.blob.core.windows.net/client1/5f89f3ae-2d10-4e4e-8f3f-25d…QAY40QjKGoJcDsHolt8KXjB86chaTWg0f4t4%3D&se=2016-12-20T14%3A34%3A58Z&sp=rcw 403 (Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.)
XMLHttpRequest cannot load https://{MYACCOUNT}.blob.core.windows.net/client1/5f89f3ae-2d10-4e4e-8f3f-25d…QAY40QjKGoJcDsHolt8KXjB86chaTWg0f4t4%3D&se=2016-12-20T14%3A34%3A58Z&sp=rcw. Response for preflight has invalid HTTP status code 403
The server response is
<Error>
<Code>AuthenticationFailed</Code>
<Message>
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. RequestId:71041c4f-0001-00cf-69cc-5aa7de000000 Time:2016-12-20T14:21:35.7541369Z
</Message>
<AuthenticationErrorDetail>
Signature did not match. String to sign used was rcw 2016-12-20T14:34:58Z /blob/{MYACCOUNT}/client1/5f89f3ae-2d10-4e4e-8f3f-25d7b4760965.PNG 2015-12-11
</AuthenticationErrorDetail>
</Error>
I really don't know what else I can do now.
Following various SO posts, I also tried adding "&comp=list&restype=container" at the end of my SAS URI, and tried several combinations with that; none of them worked.
Any ideas?
I would like to only allow secure HTTP (HTTPS) connection / access to blob container through Shared Access Signature (SAS) URL on Azure Blob Storage (ABS).
Can that be achieved? How?
Please use this override of CloudBlobContainer.GetSharedAccessSignature to include the protocol restriction.
Here's sample code to do the same:
static void GetHttpsOnlySas()
{
var storageAccount = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
var blobClient = storageAccount.CreateCloudBlobClient();
var blobContainer = blobClient.GetContainerReference("container-name");
var sas = blobContainer.GetSharedAccessSignature(new SharedAccessBlobPolicy()
{
Permissions = SharedAccessBlobPermissions.List,
SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1)
},
null,
SharedAccessProtocol.HttpsOnly,//This option will force SAS to work only on HTTPS
null);
}
When you create the SAS token you can set a "protocol" parameter. If you set it to HTTPS, only HTTPS will be allowed. More info can be found here: https://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-shared-access-signature-part-1/
When using the SDK to generate the SAS token, you have to use this method: https://msdn.microsoft.com/en-us/library/azure/mt616571.aspx and set the SharedAccessProtocol parameter.
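When the restriction is applied, the generated token carries an spr=https parameter, which the service checks before honoring the signature. A small client-side sanity check could look like this (`isHttpsOnlySasUsable` is a hypothetical helper, not part of any SDK):

```javascript
// Returns true when a SAS URL can be used over a secure connection:
// the scheme must be https and, if the token carries an 'spr' (signed
// protocol) restriction, that restriction must include https.
function isHttpsOnlySasUsable(sasUrl) {
  const url = new URL(sasUrl);
  if (url.protocol !== 'https:') return false;
  const spr = url.searchParams.get('spr'); // e.g. 'https' or 'https,http'
  return spr === null || spr.split(',').includes('https');
}

isHttpsOnlySasUsable('https://acct.blob.core.windows.net/c?sv=2015-04-05&spr=https&sig=x'); // true
isHttpsOnlySasUsable('http://acct.blob.core.windows.net/c?sv=2015-04-05&spr=https&sig=x');  // false
```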
I have written the code below to get the blob URL with a cache expiry token; I've set the blob URL to expire in 2 hours:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference(containerName);
CloudBlockBlob blockBlob = container.GetBlockBlobReference("blobname");
//Create an ad-hoc Shared Access Policy with read permissions which will expire in 2 hours
SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy()
{
Permissions = SharedAccessBlobPermissions.Read,
SharedAccessExpiryTime = DateTime.UtcNow.AddHours(2),
};
SharedAccessBlobHeaders headers = new SharedAccessBlobHeaders()
{
ContentDisposition = string.Format("attachment;filename=\"{0}\"", "blobname"),
};
var sasToken = blockBlob.GetSharedAccessSignature(policy, headers);
blobUrl = blockBlob.Uri.AbsoluteUri + sasToken;
Using the above code I get the blob URL with a valid expiry token; now I want to check whether the blob URL is valid or not in a client application.
I tried the WebRequest and HttpClient approach, passing the URL and getting the response status code. If the response code is 404 I assume the URL has expired; otherwise the URL is still valid. But this approach takes too much time.
Please suggest another way.
I tried running code very similar to yours, and I am getting a 403 error, which is actually what is expected in this case. Based on your question, I am not sure whether the 403 is more helpful to you than the 404. Here is code running in a console application that returns a 403:
class Program
{
static void Main(string[] args)
{
string blobUrl = CreateSAS();
CheckSAS(blobUrl);
Console.ReadLine();
}
//This method returns a reference to the blob with the SAS, and attempts to read it.
static void CheckSAS(string blobUrl)
{
CloudBlockBlob blob = new CloudBlockBlob(new Uri(blobUrl));
//If the DownloadText() method is run within the two minute period that the SAS is valid, it succeeds.
//If it is run after the SAS has expired, it returns a 403 error.
//Sleep for 3 minutes to trigger the error.
System.Threading.Thread.Sleep(180000);
Console.WriteLine(blob.DownloadText());
}
//This method creates the SAS on the blob.
static string CreateSAS()
{
string containerName = "forum-test";
string blobName = "blobname";
string blobUrl = "";
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference(containerName);
container.CreateIfNotExists();
CloudBlockBlob blockBlob = container.GetBlockBlobReference(blobName + DateTime.Now.Ticks);
blockBlob.UploadText("Blob for forum test");
//Create an ad-hoc Shared Access Policy with read permissions which will expire in 2 hours
SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy()
{
Permissions = SharedAccessBlobPermissions.Read,
SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(2),
};
SharedAccessBlobHeaders headers = new SharedAccessBlobHeaders()
{
ContentDisposition = string.Format("attachment;filename=\"{0}\"", blobName),
};
var sasToken = blockBlob.GetSharedAccessSignature(policy, headers);
blobUrl = blockBlob.Uri.AbsoluteUri + sasToken;
return blobUrl;
}
}
There are cases in which SAS failures do return a 404, which can create problems for troubleshooting operations using SAS. The Azure Storage team is aware of this issue and in future releases SAS failures may return a 403 instead. For help troubleshooting a 404 error, see http://azure.microsoft.com/en-us/documentation/articles/storage-monitoring-diagnosing-troubleshooting/#SAS-authorization-issue.
I also ran into the same issue a few days back. I was actually expecting storage service to return a 403 error code when the SAS token has expired but storage service returns 404 error.
Given that we don't have any other option, the way you're doing it is the only viable way, but it is still not foolproof because you could also get a 404 error if the blob is not present in the storage account.
Maybe you can parse the "se" argument from the generated SAS, which indicates the expiry time, e.g. "se=2013-04-30T02%3A23%3A26Z". However, since the server time might not be the same as the client time, this solution may be unstable.
http://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-shared-access-signature-part-1/
You're using UTC time for SharedAccessExpiryTime (see "Expiry Time" in https://learn.microsoft.com/en-us/azure/storage/common/storage-dotnet-shared-access-signature-part-1#parameters-common-to-account-sas-and-service-sas-tokens).
The expiry time is then registered under the se parameter in the token, whose value can be checked against the current UTC time on the client side before actually using the token. This way you save yourself an extra call to Blob storage just to find out whether the token has expired.
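That check can be sketched as follows, assuming the token is an ordinary query string whose se value is the signed expiry in ISO 8601 UTC (`sasStillValid` and the margin value are illustrative):

```javascript
// Returns true if the token's 'se' (signed expiry) lies in the future,
// with a safety margin to absorb clock skew between client and server.
function sasStillValid(sasToken, marginMs = 30000) {
  const params = new URLSearchParams(sasToken.replace(/^\?/, ''));
  const se = params.get('se');       // e.g. '2013-04-30T02:23:26Z'
  if (!se) return false;             // no expiry claim: treat as unusable
  const expiresAt = Date.parse(se);  // ISO 8601 with 'Z' parses as UTC
  return Date.now() + marginMs < expiresAt;
}

sasStillValid('?sv=2015-04-05&sp=r&se=2099-01-01T00:00:00Z&sig=x');     // true
sasStillValid('?sv=2015-04-05&sp=r&se=2013-04-30T02%3A23%3A26Z&sig=x'); // false
```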