My application runs in a Kubernetes cluster and uses the Java v12 SDK to interact with Blob Storage. To authorize against Blob Storage I'm using Managed Identities.
My application needs to copy blobs within one container. I haven't found any particular recommendations or examples of how the SDK should be used to do the copy.
I figured out that the following approach works when I'm working with the emulator:
copyBlobClient.copyFromUrl(sourceBlobClient.getBlobUrl());
However, when this gets executed in the cluster I get the following error:
<Error>
<Code>CannotVerifyCopySource</Code>
<Message>The specified resource does not exist. RequestId: __ Time: __ </Message>
</Error>
The message says "resource does not exist", but the blob is clearly there. My container has private access, though.
When I change the public access level to "Blob (anonymous read access for blobs only)", everything works as expected. However, public access is not acceptable to me.
The main question: what is the right way to implement blob copy using the Java v12 SDK?
What could I have missed or misconfigured in my situation?
And lastly, the error message itself. The "CannotVerifyCopySource" code hints that this is an access problem, but the message part is clearly misleading. Shouldn't it be more explicit about the error?
If you want to use the Azure Java SDK to copy blobs with Azure MSI, please refer to the following details.
Copy blobs between storage accounts
If you copy blobs between storage accounts with Azure MSI, you should do the following:
Assign the Storage Blob Data Reader role to the MSI on the source container.
Assign the Storage Blob Data Contributor role to the MSI on the destination container; when we copy a blob, we need write permission to write content to the destination blob.
Generate a SAS token for the source blob. If the source blob is public, we can use the source blob URL directly, without a SAS token.
For example
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobContainerClient;
import com.azure.storage.blob.BlobServiceClient;
import com.azure.storage.blob.BlobServiceClientBuilder;
import com.azure.storage.blob.models.UserDelegationKey;
import com.azure.storage.blob.sas.BlobSasPermission;
import com.azure.storage.blob.sas.BlobServiceSasSignatureValues;
import java.time.OffsetDateTime;

try {
    // service client for the source account, authenticated via the MSI
    BlobServiceClient blobServiceClient = new BlobServiceClientBuilder()
            .endpoint("https://<source-account>.blob.core.windows.net/")
            .credential(new DefaultAzureCredentialBuilder().build())
            .buildClient();
    // get a user delegation key (requires an AAD credential; 7 days is the maximum validity)
    OffsetDateTime delegationKeyStartTime = OffsetDateTime.now();
    OffsetDateTime delegationKeyExpiryTime = OffsetDateTime.now().plusDays(7);
    UserDelegationKey key = blobServiceClient.getUserDelegationKey(delegationKeyStartTime, delegationKeyExpiryTime);
    BlobContainerClient sourceContainerClient = blobServiceClient.getBlobContainerClient("test");
    BlobClient sourceBlob = sourceContainerClient.getBlobClient("test.mp3");
    // generate a read-only user delegation SAS for the source blob
    OffsetDateTime expiryTime = OffsetDateTime.now().plusDays(1);
    BlobSasPermission permission = new BlobSasPermission().setReadPermission(true);
    BlobServiceSasSignatureValues myValues = new BlobServiceSasSignatureValues(expiryTime, permission)
            .setStartTime(OffsetDateTime.now());
    String sas = sourceBlob.generateUserDelegationSas(myValues, key);
    // service client for the destination account
    BlobServiceClient desServiceClient = new BlobServiceClientBuilder()
            .endpoint("https://<destination-account>.blob.core.windows.net/")
            .credential(new DefaultAzureCredentialBuilder().build())
            .buildClient();
    // note: the destination container must come from the destination client
    BlobContainerClient desContainerClient = desServiceClient.getBlobContainerClient("test");
    String res = desContainerClient.getBlobClient("test.mp3")
            .copyFromUrl(sourceBlob.getBlobUrl() + "?" + sas);
    System.out.println(res);
} catch (Exception e) {
    e.printStackTrace();
}
Copy in the same account
If you copy blobs within the same storage account with Azure MSI, I suggest you assign the Storage Blob Data Contributor role to the MSI at the storage account level. Then we can do the copy with the method copyFromUrl.
For example
a. Assign Storage Blob Data Contributor to the MSI at the account level
b. code
try {
    // a single service client works when source and destination share an account
    BlobServiceClient blobServiceClient = new BlobServiceClientBuilder()
            .endpoint("https://<account>.blob.core.windows.net/")
            .credential(new DefaultAzureCredentialBuilder().build())
            .buildClient();
    BlobContainerClient sourceContainerClient = blobServiceClient.getBlobContainerClient("test");
    BlobClient sourceBlob = sourceContainerClient.getBlobClient("test.mp3");
    BlobContainerClient desContainerClient = blobServiceClient.getBlobContainerClient("output");
    // no SAS token needed: the MSI's role covers both containers
    String res = desContainerClient.getBlobClient("test.mp3")
            .copyFromUrl(sourceBlob.getBlobUrl());
    System.out.println(res);
} catch (Exception e) {
    e.printStackTrace();
}
I had the same issue using the Java SDK for Azure. I solved it by copying the blob using the URL plus a SAS token. A resource requested through its URL won't appear as available if you don't have the right access to it. Here is the code I used to solve the problem:
BlobClient sourceBlobClient = blobServiceClient
        .getBlobContainerClient(currentBucketName)
        .getBlobClient(sourceKey);
// initializing the copy blob client
BlobClient copyBlobClient = blobServiceClient
        .getBlobContainerClient(newBucketName)
        .getBlobClient(newKey);
// creating the SAS token that grants read permission on the source blob
OffsetDateTime expiryTime = OffsetDateTime.now().plusDays(1);
BlobSasPermission permission = new BlobSasPermission().setReadPermission(true);
BlobServiceSasSignatureValues values = new BlobServiceSasSignatureValues(expiryTime, permission)
        .setStartTime(OffsetDateTime.now());
String sasToken = sourceBlobClient.generateSas(values);
// making the copy using the source blob URL plus the generated SAS token
var res = copyBlobClient.copyFromUrl(sourceBlobClient.getBlobUrl() + "?" + sasToken);
Perhaps another way is to use the streaming API to download and upload the data. In our company, we are not allowed to generate SAS tokens on our storage accounts for security reasons, so we use the following to copy from an append blob to a block blob (overwriting):
BlobAsyncClient src;
BlobAsyncClient dest;
//...
// read the source append blob as a reactive stream of buffers
AppendBlobAsyncClient srcAppend = src.getAppendBlobAsyncClient();
Flux<ByteBuffer> streamData = srcAppend.downloadStream();
// upload the stream to the destination block blob, overwriting if it exists
Mono<BlockBlobItem> uploaded = dest.upload(streamData, new ParallelTransferOptions(), true);
This returns a Mono<BlockBlobItem> and you need to subscribe to it to start the process. If used in a non-reactive context, perhaps the easiest way is to call block().
Note that this only copies the data; additional work is needed if you also need to copy the metadata and tags. For tags, there is BlobAsyncClientBase.getTags(). For metadata, there is BlobAsyncClientBase.getProperties(). You can read the tags and metadata from the source and apply the same to the destination.
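To make that concrete, here is a minimal blocking sketch of the same copy, including the metadata/tag carry-over just described. It assumes src and dest are already-built BlobAsyncClient instances (the builder calls are omitted), so treat it as an outline rather than a drop-in implementation:
import com.azure.storage.blob.BlobAsyncClient;
import com.azure.storage.blob.models.BlobProperties;
import com.azure.storage.blob.models.ParallelTransferOptions;

BlobAsyncClient src;  // assumed built via new BlobClientBuilder()...buildAsyncClient()
BlobAsyncClient dest; // same, pointing at the destination blob
//...
// block() subscribes and waits; acceptable outside reactive pipelines
dest.upload(src.getAppendBlobAsyncClient().downloadStream(),
        new ParallelTransferOptions(), true)
    .block();
// the stream carries data only; metadata and tags are copied separately
BlobProperties props = src.getProperties().block();
dest.setMetadata(props.getMetadata()).block();
dest.setTags(src.getTags().block()).block();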
Related
I want to connect to Azure Table Storage using RBAC. I have done the role assignment in the portal, but I could not find a way to connect to Azure Table Storage from .NET code. I could find a lot of documentation on how to connect to Blob Storage:
static void CreateBlobContainer(string accountName, string containerName)
{
// Construct the blob container endpoint from the arguments.
string containerEndpoint = string.Format("https://{0}.blob.core.windows.net/{1}",
accountName,
containerName);
// Get a token credential and create a service client object for the blob container.
BlobContainerClient containerClient = new BlobContainerClient(new Uri(containerEndpoint),
new DefaultAzureCredential());
// Create the container if it does not exist.
containerClient.CreateIfNotExists();
}
But I could not find similar documentation for Azure Table access.
Has anyone done this before?
The patterns for authorizing with Azure Active Directory and other token sources for the current generation of the Azure SDK are based on credentials from the Azure.Identity package.
The rough equivalent to the snippet you shared would look like the following for Tables:
// Construct a new TableClient using a TokenCredential.
var client = new TableClient(
new Uri(storageUri),
tableName,
new DefaultAzureCredential());
// Create the table if it doesn't already exist to verify we've successfully authenticated.
await client.CreateIfNotExistsAsync();
More information can be found in the Azure Tables authorization sample and the Azure.Identity library overview.
I am using the libraries Microsoft.Azure.Storage.Blob 11.2.3.0 and Microsoft.Azure.Storage.Common 11.2.3.0 to connect to Azure Blob Storage from a .NET Core 3.1 application.
When I started working on this, I had been given connection strings that gave me full access to the Blob Storage (or rather, the entire cloud storage account). Based upon those, I chose to write my connection code "defensively", making use of Exists() and CreateIfNotExists() from the CloudBlobContainer class to ensure the application would not fail when a container did not yet exist.
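(For comparison with the Java v12 thread earlier on this page: the same defensive pattern in that SDK looks roughly like the sketch below. The endpoint and container name are placeholders, and this is an illustration, not the asker's code.)
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.storage.blob.BlobContainerClient;
import com.azure.storage.blob.BlobContainerClientBuilder;

BlobContainerClient container = new BlobContainerClientBuilder()
        .endpoint("https://<account>.blob.core.windows.net")
        .containerName("mycontainer")
        .credential(new DefaultAzureCredentialBuilder().build())
        .buildClient();
// both calls need container-level rights, which a blob-scoped SAS does not grant
if (!container.exists()) {
    container.createIfNotExists();
}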
Now, I'm connecting to a Blob Storage container using a SAS. While I can freely retrieve and upload blobs within the container like this, it unfortunately seems that I am not allowed to do anything at the container level. Not only CreateIfNotExists, but even the mere existence check via Exists() throws a StorageException saying:
This request is not authorized to perform this operation.
The documentation does not mention the exception.
Is there any way to check preemptively whether I am allowed to check the container's existence?
I have tried looking into the container permissions retrieved from GetPermissions, but that will throw an exception, as well.
The only other alternative I can see is to check for container existence within a try-catch-block and assume existence if an exception is thrown ...
There's no definitive way to identify whether an operation can be performed using a SAS token other than performing that operation and catching any exception it may throw. The status code of interest is 403 (Forbidden).
However, you can try to predict whether an operation can be performed by looking at the SAS token. If it is a Service SAS token and not an Account SAS token, none of the account-level operations are allowed. The way to distinguish an Account SAS token from a Service SAS token is that the former contains attributes like SignedServices (ss) and SignedResourceTypes (srt).
The next thing you would want to do is look for the SignedPermissions (sp) attribute in your SAS token. This attribute tells you which operations are possible with the SAS token. For example, if your SAS token is a Service SAS token and includes the Delete (d) permission, you can use it to delete a blob.
Please see these tables for the permissions/allowed operations combinations:
Service SAS Token: https://learn.microsoft.com/en-us/rest/api/storageservices/create-service-sas#permissions-for-a-directory-container-or-blob
Account SAS Token: https://learn.microsoft.com/en-us/rest/api/storageservices/create-account-sas
Please note that the operation might still fail for any number of reasons like SAS token has expired, account key has changed since the generation of SAS token, IP restrictions etc.
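As a rough illustration of that inspection, the sketch below parses a SAS token's query parameters and checks for ss/srt (the Account SAS markers) and the sp permission list. The parameter names come from the SAS specification; the parsing helper and the example token are made up for the demo:
import java.util.HashMap;
import java.util.Map;

public class SasInspector {
    // naive query-string splitter, sufficient for illustration
    static Map<String, String> parse(String sasToken) {
        Map<String, String> params = new HashMap<>();
        for (String pair : sasToken.replaceFirst("^\\?", "").split("&")) {
            String[] kv = pair.split("=", 2);
            params.put(kv[0], kv.length > 1 ? kv[1] : "");
        }
        return params;
    }

    public static void main(String[] args) {
        Map<String, String> p = parse("sv=2020-08-04&ss=b&srt=sco&sp=rwdl&sig=xxx");
        // ss and srt only appear on Account SAS tokens
        boolean isAccountSas = p.containsKey("ss") && p.containsKey("srt");
        // sp lists the granted permissions, e.g. 'd' means delete is allowed
        boolean canDelete = p.getOrDefault("sp", "").contains("d");
        System.out.println("Account SAS: " + isAccountSas + ", delete allowed: " + canDelete);
    }
}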
I tried this in my system: I was able to check whether the container exists, create it when it does not, and then upload a file.
You need to grant the proper permissions to your SAS token.
const string sasToken = "SAS Token";
const string accountName = "teststorage65";
const string blobContainerName = "example";
const string blobName = "test.txt";
const string myFileLocation = @"Local Path";
// build credentials from the SAS token (this line was missing originally)
StorageCredentials storageCredentials = new StorageCredentials(sasToken);
var storageAccount = new CloudStorageAccount(storageCredentials, accountName, null, true);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer blobContainer = blobClient.GetContainerReference(blobContainerName);
var result = blobContainer.Exists();
if (result == true)
{
    Console.WriteLine("Container exists");
}
else
{
    Console.WriteLine("Container does not exist");
    Console.WriteLine("Creating container " + blobContainerName);
    blobContainer.CreateIfNotExists();
}
CloudBlockBlob cloudBlob = blobContainer.GetBlockBlobReference(blobName);
cloudBlob.UploadFromFile(myFileLocation);
I am building an Angular 6 application that will be able to make CRUD operations on Azure Blob Storage. However, I'm using Postman to test requests before implementing them inside the app, copy-pasting the token that I get from Angular for that resource.
When trying to read a file that I have inside the storage for test purposes, I'm getting: <Code>AuthorizationPermissionMismatch</Code>
<Message>This request is not authorized to perform this operation using this permission.</Message>
All in production environment (although developing)
Token acquired specifically for storage resource via Oauth
Postman has the token strategy as "bearer "
Application has "Azure Storage" delegated permissions granted.
Both the app and the account I'm acquiring the token are added as "owners" in azure access control IAM
My IP is added to CORS settings on the blob storage.
StorageV2 (general purpose v2) - Standard - Hot
x-ms-version header used is: 2018-03-28 because that's the latest I could find and I just created the storage account.
I found it's not enough for the app and account to be added as owners. I would go into your storage account > IAM > Add role assignment, and add the special permissions for this type of request:
Storage Blob Data Contributor
Storage Queue Data Contributor
Make sure to use Storage Blob Data Contributor and NOT Storage Account Contributor where the latter is only for managing the actual Storage Account and not the data in it.
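Once the data role is assigned, a quick data-plane call is an easy way to confirm it took effect. Here is a hedged sketch using the Java v12 client from the thread above (account and container names are placeholders, and role propagation can take a few minutes):
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.storage.blob.BlobContainerClient;
import com.azure.storage.blob.BlobContainerClientBuilder;

public class RbacSmokeTest {
    public static void main(String[] args) {
        // placeholder endpoint and container; run under the identity the role was assigned to
        BlobContainerClient container = new BlobContainerClientBuilder()
                .endpoint("https://<account>.blob.core.windows.net")
                .containerName("<container>")
                .credential(new DefaultAzureCredentialBuilder().build())
                .buildClient();
        // a plain data-plane call: it keeps failing with AuthorizationPermissionMismatch
        // until a data role (not just Owner) is effective
        container.listBlobs().forEach(b -> System.out.println(b.getName()));
    }
}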
I've just solved this by changing the resource requested in the GetAccessTokenAsync method from "https://storage.azure.com" to the URL of my storage blob, as in this snippet:
public async Task<StorageCredentials> CreateStorageCredentialsAsync()
{
var provider = new AzureServiceTokenProvider();
var token = await provider.GetAccessTokenAsync(AzureStorageContainerUrl);
var tokenCredential = new TokenCredential(token);
var storageCredentials = new StorageCredentials(tokenCredential);
return storageCredentials;
}
where AzureStorageContainerUrl is set to https://xxxxxxxxx.blob.core.windows.net/
Be aware that if you want to apply a "Storage Blob Data XXXX" role at the subscription level, it will not work if your subscription has Azure Databricks namespaces:
If your subscription includes an Azure Databricks namespace, roles assigned at the subscription scope will be blocked from granting access to blob and queue data.
Source: https://learn.microsoft.com/en-us/azure/storage/common/storage-auth-aad-rbac-portal#determine-resource-scope
I used the following to connect to Blob Storage using Azure AD.
This code uses SDK V11, since V12 still has issues with multiple AD accounts.
See this issue:
https://github.com/Azure/azure-sdk-for-net/issues/8658
For further reading on V12 and V11 SDK
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet-legacy
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet
using System;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Services.AppAuthentication;
using Microsoft.Azure.Storage.Auth;
using Microsoft.Azure.Storage.Blob;
using Xunit;
[Fact]
public async Task TestStreamToContainer()
{
try
{
var accountName = "YourStorageAccountName";
var containerName = "YourContainerName";
var blobName = "File1";
var provider = new AzureServiceTokenProvider();
var token = await provider.GetAccessTokenAsync($"https://{accountName}.blob.core.windows.net");
var tokenCredential = new TokenCredential(token);
var storageCredentials = new StorageCredentials(tokenCredential);
string containerEndpoint = $"https://{accountName}.blob.core.windows.net";
var blobClient = new CloudBlobClient(new Uri(containerEndpoint), storageCredentials);
var containerClient = blobClient.GetContainerReference(containerName);
var cloudBlob = containerClient.GetBlockBlobReference(blobName);
string blobContents = "This is a block blob contents.";
byte[] byteArray = Encoding.ASCII.GetBytes(blobContents);
using (MemoryStream stream = new MemoryStream(byteArray))
{
await cloudBlob.UploadFromStreamAsync(stream);
}
}
catch (Exception e)
{
Console.WriteLine(e.Message);
Console.ReadLine();
throw;
}
}
I'm using the following code to generate a SAS for a blob container using the Storage Client SDK version 8.0.0 to match what AzCopy is using.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("myconnectionstring");
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy();
sasConstraints.SharedAccessStartTime = DateTime.Now.AddHours(-24);
sasConstraints.SharedAccessExpiryTime = DateTime.Now.AddHours(24);
sasConstraints.Permissions = SharedAccessBlobPermissions.List | SharedAccessBlobPermissions.Read;
string sas = container.GetSharedAccessSignature(sasConstraints, null);
Console.WriteLine(sas);
Console.WriteLine(container.Uri);
StorageCredentials creds = new StorageCredentials(sas);
var sasContainer = new CloudBlobContainer(new Uri(container.Uri + sas));
var items = sasContainer.GetBlobReference("items.txt");
This all works fine in code, but when I copy the SAS token into AzCopy as the "SourceSAS" parameter to download a blob, I get a 403 saying the signature does not match (seen via Fiddler). The parameters used to create the signature look fine to me. I'm confused as to why this works in code but not via AzCopy.
If I create a SAS token using the Azure Storage Explorer tool, it works fine. This SAS token targets a different version of the service, but I can't force the SDK to change that parameter. Would anyone know why this is happening? I will post the Fiddler content in an update.
As you have done, I ran a test generating a SAS token associated with an access policy, in an application with WindowsAzure.Storage v8.0.0 installed, and I could download a blob from my container via the SAS token generated in my app. Both AzCopy v5.0.0 and the latest version (5.2.0) could recognize this valid SAS token.
Please check whether you can access https://{storageaccount}.blob.core.windows.net/{mycontainer}/items.txt?{sas token} in your browser after you generate a SAS token.
Besides, please try to upgrade the SDK and increase the ExpiryTime, or create a new container and run the same test to check whether the same issue appears.
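If you'd rather script that browser check, a plain HTTP GET works just as well. A small sketch (the URL is a placeholder you paste your real blob URL and SAS into):
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SasUrlCheck {
    public static void main(String[] args) throws Exception {
        // placeholder: paste the full blob URL including the SAS query string
        String url = "https://<storageaccount>.blob.core.windows.net/<mycontainer>/items.txt?<sas>";
        HttpResponse<String> resp = HttpClient.newHttpClient().send(
                HttpRequest.newBuilder(URI.create(url)).GET().build(),
                HttpResponse.BodyHandlers.ofString());
        // 200 means the SAS grants read; 403 points to a signature or permission problem
        System.out.println(resp.statusCode());
    }
}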
I am trying to access a blob stored in a private container in Windows Azure. The container has a Shared Access Signature, but when I try to access the blob I get a StorageClientException: "Server failed to authenticate the request. Make sure the Authorization header is formed correctly including the signature".
The code that created the container and uploaded the blob looks like this:
// create the container, set a Shared Access Signature, and share it
// the first thing to do is to create the connection to the storage account
// this should be in app.config but as this is a test it will just be implemented
// here:
// add a reference to Microsoft.WindowsAzure.StorageClient
// and Microsoft.WindowsAzure.StorageClient set up the objects
//storageAccount = CloudStorageAccount.DevelopmentStorageAccount;
storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["ConnectionString"]);
blobClient = storageAccount.CreateCloudBlobClient();
// get a reference to the container for the shared access signature
container = blobClient.GetContainerReference("blobcontainer");
container.CreateIfNotExist();
// now create the permissions policy to use and a public access setting
var permissions = container.GetPermissions();
permissions.SharedAccessPolicies.Remove("accesspolicy");
permissions.SharedAccessPolicies.Add("accesspolicy", new SharedAccessPolicy
{
// this policy is live immediately
//if the policy should be delayed then use:
//SharedAccessStartTime = DateTime.Now.Add(T); where T is some timespan
SharedAccessExpiryTime =
DateTime.UtcNow.AddYears(2),
Permissions =
SharedAccessPermissions.Read | SharedAccessPermissions.Write
});
// turn off public access
permissions.PublicAccess = BlobContainerPublicAccessType.Off;
// set the permission on the container
container.SetPermissions(permissions);
var sas = container.GetSharedAccessSignature(new SharedAccessPolicy(), "accesspolicy");
StorageCredentialsSharedAccessSignature credentials = new StorageCredentialsSharedAccessSignature(sas);
CloudBlobClient client = new CloudBlobClient(storageAccount.BlobEndpoint,
new StorageCredentialsSharedAccessSignature(sas));
CloudBlob sasblob = client.GetBlobReference("blobcontainer/someblob.txt");
sasblob.UploadText("I want to read this text via a rest call");
// write the SAS to file so I can use it later in other apps
using (var writer = new StreamWriter(@"C:\policy.txt"))
{
writer.WriteLine(container.GetSharedAccessSignature(new SharedAccessPolicy(), "securedblobpolicy"));
}
The code I have been trying to use to read the blob looks like this:
// the storage credentials shared access signature is copied directly from the text file "c:\policy.txt"
CloudBlobClient client = new CloudBlobClient("https://my.azurestorage.windows.net/", new StorageCredentialsSharedAccessSignature("?sr=c&si=accesspolicy&sig=0PMoXpht2TF1Jr0uYPfUQnLaPMiXrqegmjYzeg69%2FCI%3D"));
CloudBlob blob = client.GetBlobReference("blobcontainer/someblob.txt");
Console.WriteLine(blob.DownloadText());
Console.ReadLine();
I can make the above work by adding the account credentials, but that is exactly what I'm trying to avoid. I do not want something as sensitive as my account credentials just sitting out there, and I have no idea how to get the signature into the client app without having the account credentials.
Any help is greatly appreciated.
Why this?
writer.WriteLine(container.GetSharedAccessSignature(new SharedAccessPolicy(), "securedblobpolicy"));
and not writing the sas string you already created?
It's late and I could easily be missing something but it seems that you might not be saving the same access signature that you're using to write the file in the first place.
Also, perhaps not relevant here, but I believe there is a limit on the number of container-wide access policies you can have (five per container). Are you uploading multiple files to the same container with this code and creating a new container SAS each time?
In general, I think it would be better to request a SAS for an individual blob at the time you need it, with a short expiry time.
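For what that looks like in the current-generation SDK, here is a sketch using the Java v12 client from earlier on this page (account, key, container, and blob names are placeholders; generateSas requires a shared-key credential):
import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobServiceClient;
import com.azure.storage.blob.BlobServiceClientBuilder;
import com.azure.storage.blob.sas.BlobSasPermission;
import com.azure.storage.blob.sas.BlobServiceSasSignatureValues;
import com.azure.storage.common.StorageSharedKeyCredential;
import java.time.OffsetDateTime;

public class ShortLivedSas {
    public static void main(String[] args) {
        // placeholder account, key, container, and blob names
        BlobServiceClient service = new BlobServiceClientBuilder()
                .endpoint("https://<account>.blob.core.windows.net")
                .credential(new StorageSharedKeyCredential("<account>", "<key>"))
                .buildClient();
        BlobClient blob = service.getBlobContainerClient("blobcontainer")
                .getBlobClient("someblob.txt");
        // read-only SAS scoped to this one blob, valid for 15 minutes
        BlobSasPermission perm = new BlobSasPermission().setReadPermission(true);
        BlobServiceSasSignatureValues values =
                new BlobServiceSasSignatureValues(OffsetDateTime.now().plusMinutes(15), perm);
        String sas = blob.generateSas(values);
        System.out.println(blob.getBlobUrl() + "?" + sas);
    }
}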
Is "my.azurestorage.windows.net" just a typo? I would expect something there like "https://account.blob.core.windows.net".
Otherwise the code looks pretty similar to the code in http://blog.smarx.com/posts/shared-access-signatures-are-easy-these-days, which works.