Simple page blob upload using PageBlobClient - Azure

I am using Premium/Hot, LRS, StorageV2 Azure storage and trying to write a simple string but I keep getting an authentication error.
To generate the SAS URI of the container in the portal, I went to:
storage resource -> containers -> my container -> shared access token -> generate SAS token and URL
// SAS URI of blob container
var sasUriStr = "https://storageaccountname.blob.core.windows.net/containername?sp=r&st=2021-08-10T00:34:00Z&se=2021-08-15T08:34:00Z&spr=https&sv=2020-08-04&sr=c&sig=ABCDEFGH/YJKLMNOP=";
var uri = new Uri(sasUriStr);
var pageBlobClient = new PageBlobClient(uri);
pageBlobClient.UploadPages(new MemoryStream(Encoding.UTF8.GetBytes("hello world")), 0);
Unhandled exception. Azure.RequestFailedException: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
Time:2021-08-12T20:22:44.0905117Z
Status: 403 (Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.)
ErrorCode: AuthenticationFailed
Additional Information: AuthenticationErrorDetail: Signature did not match. String to sign used was r
I appreciate any help or hint. Thank you
UPDATE:
After appending the blob name to the SAS URI, I get this error:
Unhandled exception. Azure.RequestFailedException: The value for one of the HTTP headers is not in the correct format.
RequestId:7a741951-401c-00c9-3ce3-8f5076000000
Time:2021-08-13T01:38:13.4585107Z
Status: 400 (The value for one of the HTTP headers is not in the correct format.)
ErrorCode: InvalidHeaderValue
Additional Information:
HeaderName: x-ms-range
HeaderValue: bytes=0-10
Content:
<?xml version="1.0" encoding="utf-8"?>
<Error><Code>InvalidHeaderValue</Code><Message>The value for one of the HTTP headers is not in the correct format.
RequestId:7a741951-401c-00c9-3ce3-8f5076000000
Time:2021-08-13T01:38:13.4585107Z</Message><HeaderName>x-ms-range</HeaderName><HeaderValue>bytes=0-10</HeaderValue></Error>
Headers:
Server: Windows-Azure-Blob/1.0,Microsoft-HTTPAPI/2.0
x-ms-error-code: InvalidHeaderValue
x-ms-request-id: 7a741951-401c-00c9-3ce3-8f5076000000
x-ms-version: 2020-08-04
x-ms-client-request-id: 4366d771-7f70-4bbf-9677-6e9fcf3cb7a1
Date: Fri, 13 Aug 2021 01:38:12 GMT
Content-Length: 327
Content-Type: application/xml

Solution
Please change your code to something like:
var sasUriStr = "https://storageaccountname.blob.core.windows.net/containername?sp=r&st=2021-08-10T00:34:00Z&se=2021-08-15T08:34:00Z&spr=https&sv=2020-08-04&sr=c&sig=ABCDEFGH/YJKLMNOP=";
var uri = new Uri(sasUriStr);
BlobContainerClient containerClient = new BlobContainerClient(uri);
// GetPageBlobClient is an extension method in the Azure.Storage.Blobs.Specialized namespace
var pageBlobClient = containerClient.GetPageBlobClient("page-blob-name");
pageBlobClient.UploadPages(new MemoryStream(Encoding.UTF8.GetBytes("hello world")), 0);
Please ensure that the blob already exists before you use this code.
Problem
The reason you were running into the problem is that you were creating a PageBlobClient from a URI representing a blob container SAS. Because of this, the Azure Storage service assumed that your blob name was containername and that the container was $root. Since the SAS token was issued for the containername blob container but the service validated it against the $root container, you got the authentication failure.
Creating a BlobContainerClient from the SAS URL and then getting a PageBlobClient from it via GetPageBlobClient(String) solves the problem.
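As for the second error in the update (InvalidHeaderValue with x-ms-range: bytes=0-10): page blobs only accept writes that start on a 512-byte boundary and whose length is a multiple of 512 bytes, and the blob itself must be created with a size that is a multiple of 512. Below is a minimal sketch (not the original answer's code; the blob name and SAS are placeholders, and the SAS must include create/write permissions) that creates the page blob and pads the payload to a full page before calling UploadPages:
using System;
using System.IO;
using System.Text;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Specialized; // for GetPageBlobClient

const int PageSize = 512; // page blob writes must be aligned to 512-byte pages

var sasUriStr = "https://storageaccountname.blob.core.windows.net/containername?<container-sas-with-create-write>";
var containerClient = new BlobContainerClient(new Uri(sasUriStr));
var pageBlobClient = containerClient.GetPageBlobClient("page-blob-name");

byte[] payload = Encoding.UTF8.GetBytes("hello world");
int paddedLength = ((payload.Length + PageSize - 1) / PageSize) * PageSize; // round up to a 512 multiple
byte[] page = new byte[paddedLength];                                       // zero-padded buffer
Array.Copy(payload, page, payload.Length);

pageBlobClient.CreateIfNotExists(paddedLength);                 // blob size must also be a 512 multiple
pageBlobClient.UploadPages(new MemoryStream(page), offset: 0);  // write exactly one full page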

Related

Azure Blob Storage - sp is mandatory. Cannot be empty

I am getting an error while trying to upload a file to Azure Blob Storage using a SAS link. It's an authentication error complaining about an empty sp attribute. The weird thing is that the sp parameter is present in the SAS URL.
It cannot be a permission issue, as I am able to upload the file using the same SAS URL from ADF.
URL
BlobEndpoint=https://####.blob.core.windows.net/####?sp=racwdl&st=2021-12-08T01:14:01Z&se=2022-02-28T09:14:01Z&spr=https&sv=2020-08-04&sr=c&sig=####
Details of error
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:ed57ec28-f01e-00a9-79d2-ebcfc2000000
Time:2021-12-08T01:22:40.1147833Z
Status: 403 (Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.)
ErrorCode: AuthenticationFailed
Additional Information: AuthenticationErrorDetail: sp is mandatory. Cannot be empty
Content:
<?xml version="1.0" encoding="utf-8"?>
<Error><Code>AuthenticationFailed</Code><Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:ed57ec28-f01e-00a9-79d2-ebcfc2000000
Time:2021-12-08T01:22:40.1147833Z</Message><AuthenticationErrorDetail>sp is mandatory. Cannot be empty</AuthenticationErrorDetail></Error>
Headers:
x-ms-request-id: ed57ec28-f01e-00a9-79d2-ebcfc2000000
x-ms-error-code: AuthenticationFailed
Content-Length: 407
Content-Type: application/xml
Date: Wed, 08 Dec 2021 01:22:39 GMT
Server: Microsoft-HTTPAPI/2.0
Code
Stream file = new FileStream(fileToUpload, FileMode.Open);
var blobServiceClient1 = new BlobServiceClient(endpointString);
var containerRef = blobServiceClient1.GetBlobContainerClient("dropoff-commissionstatements");
var blob1 = containerRef.GetBlobClient("TDM_FINAL_102449_13092021_COMMSTMT_AR_TAL_D95337.csv");
string file_extension = Path.GetExtension(fileToUpload);
string filename_withExtension = Path.GetFileName(fileToUpload);
blob1.Upload(file);
Please try changing your connection string to something like:
BlobEndpoint=https://####.blob.core.windows.net/####;SharedAccessSignature=sp=racwdl&st=2021-12-08T01:14:01Z&se=2022-02-28T09:14:01Z&spr=https&sv=2020-08-04&sr=c&sig=####
For more details, please see this link: https://learn.microsoft.com/en-us/azure/storage/common/storage-configure-connection-string#create-a-connection-string-using-a-shared-access-signature.
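For reference, here is a minimal sketch of the upload using a connection string in that form. It reuses the container and blob names from the code above; the account name, SAS values, and local file path are placeholders:
using System;
using System.IO;
using Azure.Storage.Blobs;

// Endpoint and SAS are separated with a semicolon so the SDK can append sp, sig, etc. to every request.
string connectionString =
    "BlobEndpoint=https://accountname.blob.core.windows.net/;" +
    "SharedAccessSignature=sp=racwdl&st=2021-12-08T01:14:01Z&se=2022-02-28T09:14:01Z&spr=https&sv=2020-08-04&sr=c&sig=<redacted>";

var blobServiceClient = new BlobServiceClient(connectionString);
var containerClient = blobServiceClient.GetBlobContainerClient("dropoff-commissionstatements");
var blobClient = containerClient.GetBlobClient("TDM_FINAL_102449_13092021_COMMSTMT_AR_TAL_D95337.csv");

using Stream file = File.OpenRead(@"C:\path\to\local-file.csv"); // placeholder local path
blobClient.Upload(file, overwrite: true);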

Azure 'azcopy sync' issue syncing across containers using SAS tokens for a multi-tenant app

I am trying to sync data across Azure containers in different accounts using a multi-tenant app and the azcopy tool.
The syncing happens via "azcopy sync", using separate SAS tokens for the source and destination storage accounts.
I am generating short-lived SAS tokens with the Java SDK, following the user delegation key method.
Here is the scenario:
Account1 (destination) has App1 registered, i.e. Account1 is the home tenant for App1.
Account1 has StorageAccount1 and Container1 configured
App1 is given "Storage Blob Data Contributor" role on StorageAccount1
Account2 (source) has StorageAccount2 and Container2 configured. It is the data source for us. Here, App1 is added as a ServicePrincipal via:
az ad sp create --id client-id-of-App1-in-Account1
In Account2, we also gave this SP the Storage Blob Data Reader role:
az role assignment create \
--assignee-object-id <object-id-for-this-sp> \
--role 2a2b9908-6ea1-4ae2-8e65-a410df84e7d1 \
--scope /subscriptions/<subsid-account2>/resourceGroups/<resgrpname>/providers/Microsoft.Storage/storageAccounts/<storagename>
This completes the setup.
Now using Java SDK, I generated a user delegation key for both source and destination.
The snippet looks something like below.
String genSasToken(String storageAccountName, String containerName,
                   String tenantId,
                   String azureAppClientId,
                   String azureAppClientSecret,
                   boolean isDestinationAccount) {
    BlobContainerSasPermission blobContainerSasPermission =
        new BlobContainerSasPermission().setReadPermission(true).setListPermission(true);
    if (isDestinationAccount) {
        blobContainerSasPermission.setCreatePermission(true)
            .setAddPermission(true)
            .setWritePermission(true)
            .setExecutePermission(true);
    }
    BlobServiceSasSignatureValues builder =
        new BlobServiceSasSignatureValues(OffsetDateTime.now().plusHours(1), blobContainerSasPermission)
            .setProtocol(SasProtocol.HTTPS_ONLY);
    // Create a BlobServiceClient object which will be used to create a container client
    String endpoint = String.format(Locale.ROOT, "https://%s.blob.core.windows.net",
        storageAccountName);
    ClientSecretCredential clientSecretCredential = new ClientSecretCredentialBuilder()
        .clientId(azureAppClientId)
        .clientSecret(azureAppClientSecret)
        .tenantId(tenantId)
        .build();
    BlobServiceClient blobServiceClient =
        new BlobServiceClientBuilder().endpoint(endpoint).credential(clientSecretCredential).buildClient();
    BlobContainerClient blobContainerClient =
        blobServiceClient.getBlobContainerClient(containerName);
    // Get a user delegation key for the Blob service that's valid for one hour.
    // You can use the key to generate any number of shared access signatures over the lifetime of the key.
    OffsetDateTime keyStart = OffsetDateTime.now();
    OffsetDateTime keyExpiry = OffsetDateTime.now().plusHours(1);
    UserDelegationKey userDelegationKey = blobServiceClient.getUserDelegationKey(keyStart, keyExpiry);
    String sas = blobContainerClient.generateUserDelegationSas(builder, userDelegationKey);
    return sas;
}
The above method is called for both source and destination and gives us the programmatically generated SAS tokens.
The interesting thing is this:
azcopy sync https://storageaccount2/container2/?sas-token-for2 https://storageaccount1/container1/?sas-token-for1
The above sync errors out with:
INFO: Authentication failed, it is either not correct, or expired, or does not have the correct permission -> github.com/Azure/azure-storage-blob-go/azblob.newStorageError, /Users/runner/go/pkg/mod/github.com/!azure/azure-storage-blob-go#v0.10.1-0.20201022074806-8d8fc11be726/azblob/zc_storage_error.go:42
===== RESPONSE ERROR (ServiceCode=AuthorizationFailure) =====
Description=This request is not authorized to perform this operation.
RequestId:xxx
Time:2021-01-27T10:26:34.9282634Z, Details:
Code: AuthorizationFailure
GET https://storageaccount1.blob.core.windows.net/container1/?comp=properties&restype=account&se=2021-01-27t11%3A10%3A12z&sig=-REDACTED-&ske=2021-01-27t11%3A10%3A12z&skoid=xxx&sks=b&skt=2021-01-27t10%3A10%3A12z&sktid=xxx&skv=2020-02-10&sp=racwle&spr=https&sr=c&sv=2020-02-10&timeout=901
User-Agent: [AzCopy/10.8.0 Azure-Storage/0.10 (go1.13; darwin)]
X-Ms-Client-Request-Id: [xxx]
X-Ms-Version: [2019-12-12]
--------------------------------------------------------------------------------
RESPONSE Status: 403 This request is not authorized to perform this operation.
Content-Length: [246]
Content-Type: [application/xml]
Date: [Wed, 27 Jan 2021 10:26:34 GMT]
Server: [Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0]
X-Ms-Client-Request-Id: [xxx]
X-Ms-Error-Code: [AuthorizationFailure]
X-Ms-Request-Id: [xxx]
X-Ms-Version: [2019-12-12]
But when I try to copy from the source to localhost using the same SAS token (token 2), it works.
azcopy sync https://storageaccount2/container2/?sas-token-for2 /tmp
and
when I try to copy a localhost folder to the destination using the same SAS token, it also works.
azcopy sync /tmp https://storageaccount1/container1/?sas-token-for1
So the tokens work individually, as shown above.
But azcopy sync https://storageaccount2/container2/?sas-token-for2 https://storageaccount1/container1/?sas-token-for1 fails.
Any pointers on what might be the issue here?
For syncing you don't need the execute permission (which is still in preview in any case). Just remove .setExecutePermission(true) and you should be good. In fact, syncing should work with only read, write, and list permissions on the destination.

Azure Blob: 403 (Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.)

I have a SAS token with write permission, but when I try to write the blob I get the error below.
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:6c52482f-c01e-001c-5891-b2f06f000000
Time:2020-11-04T10:03:01.6761446Z
Status: 403 (Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.)
ErrorCode: AuthenticationFailed
Headers:
x-ms-request-id: 6c52482f-c01e-001c-5891-b2f06f000000
x-ms-error-code: AuthenticationFailed
Date: Wed, 04 Nov 2020 10:03:00 GMT
Server: Microsoft-HTTPAPI/2.0
Content-Length: 529
Content-Type: application/xml
Code:
var blobClient = new BlobClient(new Uri(command.AzureBlobContainerTargetUri.AbsoluteUri));
using (var ms = new MemoryStream())
{
    LoadStreamWithJson(ms, JsonConvert.SerializeObject(userData));
    await blobClient.UploadAsync(ms);
}
command.AzureBlobContainerTargetUri.AbsoluteUri --> the SAS token URI
userData --> some object
The error is always related to your access key, as SumanthMarigowda said in the comment. Please regenerate the key in the portal and try with the new one. Also check your PC's clock, and in addition to the time, check your VPN as well.
I also faced this error when using DefaultAzureCredential(). This is the issue (see the UPDATE) with Python.

Generate user delegation SAS token running locally

I'm creating a solution based on this [documentation][1]. I have it almost working as I want, but it only works when deployed to Azure. The App Service has Managed Identity configured and it is assigned the Storage Blob Data Contributor role. Is there any way to make it run on my local machine? Currently I need to publish the code from VS to Azure and then use remote debugging to verify how it works.
This is the problematic line:
UserDelegationKey key = await blobClient.GetUserDelegationKeyAsync(DateTimeOffset.UtcNow,
    DateTimeOffset.UtcNow.AddDays(7));
I get exception:
Status: 400 (The value for one of the XML nodes is not in the correct format.)
ErrorCode: InvalidXmlNodeValue
I use DefaultAzureCredential, and in debug I see it has 3 different sources. The first of them is EnvironmentCredential (then ManagedIdentityCredential and SharedTokenCacheCredential). So I tried registering an application in Azure AD and configured those 3 environment variables, but it didn't help. Maybe I need to add some specific permissions to this app?
"AZURE_CLIENT_ID": "",
"AZURE_CLIENT_SECRET": "",
"AZURE_TENANT_ID": ""
Or maybe this could somehow work with my own Azure account, if it is also assigned the Storage Blob Data Contributor role?
EDIT: I captured request and response with Fiddler
Request:
POST https://myaccount.blob.core.windows.net/?restype=service&comp=userdelegationkey HTTP/1.1
Host: myaccount.blob.core.windows.net
x-ms-version: 2019-07-07
x-ms-client-request-id: 23071825-dcf0-4803-a8a9-c44ec38695d5
x-ms-return-client-request-id: true
User-Agent: azsdk-net-Storage.Blobs/12.4.4 (.NET Core 3.1.2; Microsoft Windows 10.0.16299)
Authorization: Bearer Hidden
traceparent: 00-7e23f80250325742853c846211a83d02-6739d4cbc8ef0a46-00
Content-Type: application/xml
Content-Length: 91
<KeyInfo><Start>2020-07-08T06:02:35Z</Start><Expiry>2020-07-15T06:17:35Z</Expiry></KeyInfo>
Response:
HTTP/1.1 400 The value for one of the XML nodes is not in the correct format.
Content-Length: 348
Content-Type: application/xml
Server: Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0
x-ms-request-id: ba2fe641-f01e-0065-3eef-54acae000000
x-ms-client-request-id: 23071825-dcf0-4803-a8a9-c44ec38695d5
x-ms-version: 2019-07-07
x-ms-error-code: InvalidXmlNodeValue
Date: Wed, 08 Jul 2020 06:16:14 GMT
<?xml version="1.0" encoding="utf-8"?><Error><Code>InvalidXmlNodeValue</Code><Message>The value for one of the XML nodes is not in the correct format.
RequestId:ba2fe641-f01e-0065-3eef-54acae000000
Time:2020-07-08T06:16:15.7553428Z</Message><XmlNodeName>2020-07-15T06:17:35Z</XmlNodeName><XmlNodeValue>2020-07-15T06:17:35Z</XmlNodeValue></Error>
EDIT 2:
It works with help from @JimXu. In addition, I was able to make it work with the Azure account configured in Visual Studio, so I could remove the application registration that I had created just for this purpose in Azure AD.
[1]: https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-user-delegation-sas-create-dotnet#example-get-a-user-delegation-sas
If you want to get an Azure Storage user delegation key, you need to assign the Storage Blob Data Contributor, Storage Blob Data Owner, or Storage Blob Delegator role to the AD application or account. For more details, please refer to the documentation.
Also note that when we request the user delegation key, we need to provide an expiry time. Its value must be a valid date and time within 7 days of the current time. For more details, please refer to the documentation.
For example
string accountName = "jimtestdiag417";
string blobEndpoint = $"https://{accountName}.blob.core.windows.net/";

// Create a new Blob service client with Azure AD credentials.
BlobServiceClient blobClient = new BlobServiceClient(new Uri(blobEndpoint),
    new DefaultAzureCredential());

// Get a user delegation key for the Blob service.
// To avoid clock skew between the requesting PC and the Azure servers, set the expiry time six days in the future.
UserDelegationKey key = await blobClient.GetUserDelegationKeyAsync(DateTimeOffset.UtcNow,
    DateTimeOffset.UtcNow.AddDays(6));

// Read the key's properties.
Console.WriteLine("User delegation key properties:");
Console.WriteLine("Key signed start: {0}", key.SignedStartsOn);
Console.WriteLine("Key signed expiry: {0}", key.SignedExpiresOn);
Console.WriteLine("Key signed object ID: {0}", key.SignedObjectId);
Console.WriteLine("Key signed tenant ID: {0}", key.SignedTenantId);
Console.WriteLine("Key signed service: {0}", key.SignedService);
Console.WriteLine("Key signed version: {0}", key.SignedVersion);
Console.WriteLine();
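Once you have the key, the user delegation SAS itself is produced by signing a BlobSasBuilder with that key instead of an account key. Here is a minimal sketch continuing from the code above; the container and blob names are placeholders:
// Build a read-only user delegation SAS for a single blob, signed with the key obtained above.
// Requires: using Azure.Storage.Sas;  "mycontainer" and "myblob.txt" are placeholder names.
var sasBuilder = new BlobSasBuilder
{
    BlobContainerName = "mycontainer",
    BlobName = "myblob.txt",
    Resource = "b", // b = blob
    StartsOn = DateTimeOffset.UtcNow,
    ExpiresOn = DateTimeOffset.UtcNow.AddHours(1)
};
sasBuilder.SetPermissions(BlobSasPermissions.Read);

// Sign the SAS with the user delegation key rather than the account key.
string sasToken = sasBuilder.ToSasQueryParameters(key, accountName).ToString();
string blobUriWithSas = $"{blobEndpoint}mycontainer/myblob.txt?{sasToken}";
Console.WriteLine(blobUriWithSas);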

Azure Blob Storage with SAS Token

I'm trying to upload a file to an Azure storage account as a blob, and I have a client provided by the "github.com/Azure/azure-storage-blob-go/azblob" package. As I saw in the documentation, it should be possible to communicate with Storage using a SAS token by creating an anonymous credential:
credential := azblob.NewAnonymousCredential()
po := azblob.PipelineOptions{
    Log: pipeline.LogOptions{
        Log: func(s pipeline.LogLevel, m string) {
            log.Tracef("pipeline message: %s", m)
        },
        ShouldLog: func(level pipeline.LogLevel) bool {
            return level <= pipeline.LogError
        },
    },
}
pipeline := azblob.NewPipeline(credential, po)
However, I don't see an option to pass the SAS token, which I receive from another service after I ask for access.
I also tried to do it 'manually' using the Azure Storage REST API, so my URL was like https://servicename.blob.core.windows.net/containerID/BlobID?sasToken... but all I got were 400, 411 and 501 HTTP codes, depending on the request headers.
For example with
req.Header.Add("Accept", "*/*")
req.Header.Add("Accept-Language", "en-US,en;q=0.5 --compressed")
req.Header.Add("Accept-Encoding", "gzip, deflate, br")
req.Header.Add("content-type", "application/octet-stream")
req.Header.Add("x-ms-version", "2019-02-02")
req.Header.Add("x-ms-blob-type", "BlockBlob")
req.Header.Add("x-ms-client-request-id", "someID")
req.Header.Add("Connection", "keep-alive")
req.Header.Add("Content-Length", "512000")
req.Header.Add("Transfer-Encoding", "gzip, chunked, deflate")
I receive 400 code with
<?xml version="1.0" encoding="utf-8"?>
<Error>
<Code>MissingRequiredHeader</Code>
<Message>
An HTTP header that's mandatory for this request is not specified.
RequestId:someId
Time:2020-02-14T13:47:58.8383371Z
</Message>
<HeaderName>x-ms-original-content-length</HeaderName>
</Error>
Adding x-ms-original-content-length header changes nothing.
The funny thing is that it only happens when I try it in Go code. When I tried any REST client, it worked with these headers.
Summarizing: I need to put a file into an Azure storage account as a blob. The second approach, which should simply work, does not, and the first one is incomplete because I don't see a way to pass the SAS token. What am I missing?
So in the first case, the problem was that the SAS token is not passed anywhere in this package. It should be appended to the URL later, during URL creation, like:
URL, err := url.Parse(blobURL + "/" + containerName + "/" + blobName + "?" + sasToken) // sasToken holds the SAS query string
And in the second case, everything was about Content-Length, which cannot be set from the header side. It is set automatically during http.NewRequest(...), but only when the body is one of the following types: *bytes.Buffer, *bytes.Reader or *strings.Reader; otherwise it is 0. http.NewRequest(...) accepts an io.Reader as the body, so it will compile with anything implementing the io.Reader interface, such as *os.File, but it will not set Content-Length, which Azure Storage requires. When I switched to one of the three types listed above, it started working.

Resources