I want to connect to Azure Table Storage using RBAC. I have done the role assignment in the portal, but I could not find a way to connect to Azure Table Storage from .NET code. I could find plenty of documentation on how to connect with a BlobClient:
using System;
using Azure.Identity;
using Azure.Storage.Blobs;

static void CreateBlobContainer(string accountName, string containerName)
{
    // Construct the blob container endpoint from the arguments.
    string containerEndpoint = string.Format("https://{0}.blob.core.windows.net/{1}",
                                             accountName,
                                             containerName);

    // Get a token credential and create a service client object for the blob container.
    BlobContainerClient containerClient = new BlobContainerClient(new Uri(containerEndpoint),
                                                                  new DefaultAzureCredential());

    // Create the container if it does not exist.
    containerClient.CreateIfNotExists();
}
But I could not find similar documentation for Azure Table access.
Has anyone done this before?
The patterns for authorizing with Azure Active Directory and other token sources for the current generation of the Azure SDK are based on credentials from the Azure.Identity package.
The rough equivalent to the snippet you shared would look like the following for Tables:
using System;
using Azure.Data.Tables;
using Azure.Identity;

// Construct a new TableClient using a TokenCredential.
var client = new TableClient(
    new Uri(storageUri),
    tableName,
    new DefaultAzureCredential());

// Create the table if it doesn't already exist, to verify we've successfully authenticated.
await client.CreateIfNotExistsAsync();
More information can be found in the Azure Tables authorization sample and the Azure.Identity library overview.
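If you need account-level operations, the same credential pattern works with TableServiceClient. A minimal sketch, assuming the Azure.Data.Tables package; the account endpoint and table name here are hypothetical:

using System;
using Azure.Data.Tables;
using Azure.Identity;

// Hypothetical account endpoint; substitute your own storage account name.
var serviceClient = new TableServiceClient(
    new Uri("https://myaccount.table.core.windows.net"),
    new DefaultAzureCredential());

// Create a table, then insert an entity through a TableClient.
await serviceClient.CreateTableIfNotExistsAsync("orders");
TableClient tableClient = serviceClient.GetTableClient("orders");
await tableClient.AddEntityAsync(new TableEntity("partition1", "row1")
{
    ["Product"] = "Marker",
    ["Quantity"] = 5
});

Note that table data has its own RBAC roles (Storage Table Data Contributor / Storage Table Data Reader), separate from the blob data roles, so the role assignment from the question must cover tables specifically.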
I've created and published (to Azure) a working and fairly simple Azure Function App (HTTP Trigger) which inserts data retrieved from an API call to an Azure SQL database after authenticating via User identity (if testing locally) or Managed Identity (when running on VM).
Everything is functional, however, I now need to encrypt one of the SQL table columns which I have done using SSMS. The next step as I understand it is authenticating a provider to access the CMK via Azure Key Vault (I'm following this Microsoft guide).
I'm wondering, in the following code, how to InitializeAzureKeyVaultProvider without using the applicationId and clientKey from an Azure App registration resource, but with a user or managed identity instead. Or, if there's any other way to get/use the applicationId and clientKey without creating/using an Azure App registration resource.
Is there a newer / easier way to access Azure Key Vault for Always Encrypted Columns sql queries?
static void InitializeAzureKeyVaultProvider()
{
    _clientCredential = new ClientCredential(applicationId, clientKey);

    SqlColumnEncryptionAzureKeyVaultProvider azureKeyVaultProvider = new SqlColumnEncryptionAzureKeyVaultProvider(GetToken);

    Dictionary<string, SqlColumnEncryptionKeyStoreProvider> providers = new Dictionary<string, SqlColumnEncryptionKeyStoreProvider>();
    providers.Add(SqlColumnEncryptionAzureKeyVaultProvider.ProviderName, azureKeyVaultProvider);
    SqlConnection.RegisterColumnEncryptionKeyStoreProviders(providers);
}
Here is the other way I've been attempting; however, upon installing and using Microsoft.Data.SqlClient.AlwaysEncrypted.AzureKeyVaultProvider, I get the error shown below this code sample:
private static bool EncryptionProviderInitialized = false;

private static void InitializeAzureKeyVaultProvider()
{
    if (!EncryptionProviderInitialized)
    {
        SqlColumnEncryptionAzureKeyVaultProvider akvProvider = null;
#if DEBUG
        if (Debugger.IsAttached)
        {
            Console.WriteLine("Debugger attached - configuring KeyVaultProvider via VisualStudioCredential");
            akvProvider = new SqlColumnEncryptionAzureKeyVaultProvider(new VisualStudioCredential());
        }
        if (akvProvider == null)
#endif
        akvProvider = new SqlColumnEncryptionAzureKeyVaultProvider(new ManagedIdentityCredential());

        SqlConnection.RegisterColumnEncryptionKeyStoreProviders(customProviders: new Dictionary<string, SqlColumnEncryptionKeyStoreProvider>(capacity: 1, comparer: StringComparer.OrdinalIgnoreCase)
        {
            { SqlColumnEncryptionAzureKeyVaultProvider.ProviderName, akvProvider }
        });
        EncryptionProviderInitialized = true;
    }
}
ERROR:
[2021-10-08T01:36:24.209Z] Executed 'EncryptedSQLMigration' (Failed, Id=1323dcbb-e671-4ed4-8a7c-6259447326c5, Duration=537ms)
[2021-10-08T01:36:24.209Z] System.Private.CoreLib: Exception while executing function: EncryptedSQLMigration. FreshFirstApproach: Method not found: 'Microsoft.Extensions.Primitives.StringValues Microsoft.AspNetCore.Http.IQueryCollection.get_Item(System.String)'.
Initially, I got that same error but for Microsoft.Extensions.Logging.Abstractions. Upon removing the ILogger from my main function just for the sake of moving on (I wasn't able to solve that issue either), I now get this Microsoft.Extensions exception.
Any help with my goal of using Always Encrypted Columns with Azure Function App and Azure Key Vault is very much appreciated!
Thank you very much.
It is not possible to use applicationId and clientKey without creating or using an Azure App registration resource. There is an alternative way where you can pass a clientId and clientSecret, as shown below, but here too you will need an app registration.
static void InitializeAzureKeyVaultProvider()
{
    string clientId = ConfigurationManager.AppSettings["AuthClientId"];
    string clientSecret = ConfigurationManager.AppSettings["AuthClientSecret"];
    _clientCredential = new ClientCredential(clientId, clientSecret);
    ....
    ....
}
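The GetToken callback referenced in the question's first snippet is typically implemented with ADAL's AuthenticationContext. A sketch of that classic pattern, assuming the _clientCredential field set above (this mirrors the old Microsoft guide rather than any one definitive implementation):

using System;
using System.Threading.Tasks;
using Microsoft.IdentityModel.Clients.ActiveDirectory;

// Token callback consumed by the legacy SqlColumnEncryptionAzureKeyVaultProvider.
private static async Task<string> GetToken(string authority, string resource, string scope)
{
    var authContext = new AuthenticationContext(authority);
    AuthenticationResult result = await authContext.AcquireTokenAsync(resource, _clientCredential);
    if (result == null)
        throw new InvalidOperationException("Failed to obtain the access token");
    return result.AccessToken;
}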
As for a user or managed identity: if you check this document, you won't find Azure Key Vault in the list of services that can have a managed identity.
So the managed service identity should be created for the Azure Function rather than for Azure Key Vault. Check the Retrieve Azure Key Vault Secrets using Azure Functions and Managed Service Identity document for more information.
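As a rough sketch of what that looks like with the current SDKs, assuming the Azure.Security.KeyVault.Secrets and Azure.Identity packages (the vault URI and secret name are hypothetical), once the Function's managed identity has been granted access to the vault:

using System;
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;

// DefaultAzureCredential picks up the Function's managed identity in Azure
// (and your user identity when running locally).
var secretClient = new SecretClient(
    new Uri("https://my-vault.vault.azure.net"),  // hypothetical vault URI
    new DefaultAzureCredential());

KeyVaultSecret secret = await secretClient.GetSecretAsync("MySecretName");  // hypothetical secret name
Console.WriteLine($"Retrieved secret '{secret.Name}'");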
My Application is in a Kubernetes cluster and I'm using Java v12 SDK to interact with the Blob Storage. To authorize against Blob Storage I'm using Managed Identities.
My application needs to copy blobs within one container. I haven't found any particular recommendations or examples of how SDK should be used to do the copy.
I figured that the following approach works when I'm working with the emulator
copyBlobClient.copyFromUrl(sourceBlobClient.getBlobUrl());
However, when this gets executed in the cluster I get the following error
<Error>
<Code>CannotVerifyCopySource</Code>
<Message>The specified resource does not exist. RequestId: __ Time: __ </Message>
</Error>
Message says "resource does not exist" but the blob is clearly there. My container has private access, though.
Now, when I change the public access level to "Blob (anonymous read access for blobs only)", everything works as expected. However, public access is not acceptable to me.
Main question: what is the right way to implement blob copy with the Java v12 SDK?
What could I have missed or misconfigured in my situation?
And the last point is the error message itself. The "CannotVerifyCopySource" part helps you understand that this is something to do with access, but the message part is clearly misleading. Shouldn't it be more explicit about the error?
If you want to use the Azure Java SDK to copy blobs with Azure MSI, please refer to the following details.
Copy blobs between storage accounts
If you copy blobs between storage accounts with Azure MSI, we should do the following:
Assign the Storage Blob Data Reader role to the MSI on the source container.
Assign the Storage Blob Data Contributor role to the MSI on the destination container; besides, when we copy a blob we need write permissions to write content to the blob.
Generate a SAS token for the source blob. If the source blob is public, we can use the source blob URL directly, without a SAS token.
For example
try {
    BlobServiceClient blobServiceClient = new BlobServiceClientBuilder()
        .endpoint("https://<>.blob.core.windows.net/")
        .credential(new DefaultAzureCredentialBuilder().build())
        .buildClient();

    // Get a user delegation key.
    OffsetDateTime delegationKeyStartTime = OffsetDateTime.now();
    OffsetDateTime delegationKeyExpiryTime = OffsetDateTime.now().plusDays(7);
    UserDelegationKey key = blobServiceClient.getUserDelegationKey(delegationKeyStartTime, delegationKeyExpiryTime);

    BlobContainerClient sourceContainerClient = blobServiceClient.getBlobContainerClient("test");
    BlobClient sourceBlob = sourceContainerClient.getBlobClient("test.mp3");

    // Generate a SAS token.
    OffsetDateTime expiryTime = OffsetDateTime.now().plusDays(1);
    BlobSasPermission permission = new BlobSasPermission().setReadPermission(true);
    BlobServiceSasSignatureValues myValues = new BlobServiceSasSignatureValues(expiryTime, permission)
        .setStartTime(OffsetDateTime.now());
    String sas = sourceBlob.generateUserDelegationSas(myValues, key);

    // Copy to the destination account (note: use the destination service client here).
    BlobServiceClient desServiceClient = new BlobServiceClientBuilder()
        .endpoint("https://<>.blob.core.windows.net/")
        .credential(new DefaultAzureCredentialBuilder().build())
        .buildClient();
    BlobContainerClient desContainerClient = desServiceClient.getBlobContainerClient("test");
    String res = desContainerClient.getBlobClient("test.mp3")
        .copyFromUrl(sourceBlob.getBlobUrl() + "?" + sas);
    System.out.println(res);
} catch (Exception e) {
    e.printStackTrace();
}
Copy in the same account
If you copy blobs within the same storage account with Azure MSI, I suggest you assign the Storage Blob Data Contributor role to the MSI on the storage account. Then we can do the copy action with the method copyFromUrl.
For example
a. Assign the Storage Blob Data Contributor role to the MSI at the account level
b. Code
try {
    BlobServiceClient blobServiceClient = new BlobServiceClientBuilder()
        .endpoint("https://<>.blob.core.windows.net/")
        .credential(new DefaultAzureCredentialBuilder().build())
        .buildClient();
    BlobContainerClient sourceContainerClient = blobServiceClient.getBlobContainerClient("test");
    BlobClient sourceBlob = sourceContainerClient.getBlobClient("test.mp3");
    BlobContainerClient desContainerClient = blobServiceClient.getBlobContainerClient("output");
    String res = desContainerClient.getBlobClient("test.mp3")
        .copyFromUrl(sourceBlob.getBlobUrl());
    System.out.println(res);
} catch (Exception e) {
    e.printStackTrace();
}
For more details, please refer to here and here
I had the same issue using the Java SDK for Azure. I solved it by copying the blob using the URL plus a SAS token. The resource you're fetching through the URL won't appear as available if you don't have the right access to it. Here is the code I used to solve the problem:
BlobClient sourceBlobClient = blobServiceClient
    .getBlobContainerClient(currentBucketName)
    .getBlobClient(sourceKey);

// Initializing the copy blob client.
BlobClient copyBlobClient = blobServiceClient
    .getBlobContainerClient(newBucketName)
    .getBlobClient(newKey);

// Creating the SAS token to get permission to read the source blob.
OffsetDateTime expiryTime = OffsetDateTime.now().plusDays(1);
BlobSasPermission permission = new BlobSasPermission().setReadPermission(true);
BlobServiceSasSignatureValues values = new BlobServiceSasSignatureValues(expiryTime, permission)
    .setStartTime(OffsetDateTime.now());
String sasToken = sourceBlobClient.generateSas(values);

// Making the copy using the source blob URL + the SAS token.
var res = copyBlobClient.copyFromUrl(sourceBlobClient.getBlobUrl() + "?" + sasToken);
Perhaps another way is to use the streaming API to download and upload the data. In our company we are not allowed to generate SAS tokens on our storage account due to security, so we use the following to copy from an append blob to a block blob (overwriting):
BlobAsyncClient src;
BlobAsyncClient dest;
//...
AppendBlobAsyncClient srcAppend = src.getAppendBlobAsyncClient();
Flux<ByteBuffer> streamData = srcAppend.downloadStream();
Mono<BlockBlobItem> uploaded = dest.upload(streamData, new ParallelTransferOptions(), true);
This returns a Mono<BlockBlobItem>, and you need to subscribe to it to start the process. If used in a non-reactive context, perhaps the easiest way is to block().
Note that this only copies the data; additional work is needed if you also need to copy the metadata and tags. For tags there is BlobAsyncClientBase.getTags(); for metadata there is BlobAsyncClientBase.getProperties(). You can read these from the source and apply the same to the destination, as sketched below.
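A minimal sketch of that last step, assuming the same reactive clients as in the snippet above and blocking for simplicity in a non-reactive context:

import com.azure.storage.blob.BlobAsyncClient;
import com.azure.storage.blob.models.BlobProperties;
import java.util.Map;

static void copyTagsAndMetadata(BlobAsyncClient src, BlobAsyncClient dest) {
    // Read the tags and properties from the source blob.
    Map<String, String> tags = src.getTags().block();
    BlobProperties properties = src.getProperties().block();

    // Apply them to the destination blob.
    if (tags != null && !tags.isEmpty()) {
        dest.setTags(tags).block();
    }
    if (properties != null) {
        dest.setMetadata(properties.getMetadata()).block();
    }
}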
I am building an Angular 6 application that will be able to make CRUD operations on Azure Blob Storage. I'm however using Postman to test requests before implementing them inside the app, copy-pasting the token that I get from Angular for that resource.
When trying to read a file that I have inside the storage for test purposes, I'm getting:
<Code>AuthorizationPermissionMismatch</Code>
<Message>This request is not authorized to perform this operation using this permission.</Message>
All in a production environment (although developing)
Token acquired specifically for the storage resource via OAuth
Postman has the token strategy set as "bearer"
Application has the "Azure Storage" delegated permission granted
Both the app and the account I'm acquiring the token with are added as "owners" in Azure Access Control (IAM)
My IP is added to the CORS settings on the blob storage
StorageV2 (general purpose v2) - Standard - Hot
x-ms-version header used is 2018-03-28, because that's the latest I could find and I just created the storage account
I found it's not enough for the app and account to be added as owners. I would go into your storage account > IAM > Add role assignment, and add the special permissions for this type of request:
Storage Blob Data Contributor
Storage Queue Data Contributor
Make sure to use Storage Blob Data Contributor and NOT Storage Account Contributor; the latter is only for managing the storage account itself, not the data in it.
I've just solved this by changing the resource requested in the GetAccessTokenAsync method from "https://storage.azure.com" to the URL of my blob storage endpoint, as in this snippet:
public async Task<StorageCredentials> CreateStorageCredentialsAsync()
{
    var provider = new AzureServiceTokenProvider();
    var token = await provider.GetAccessTokenAsync(AzureStorageContainerUrl);
    var tokenCredential = new TokenCredential(token);
    var storageCredentials = new StorageCredentials(tokenCredential);
    return storageCredentials;
}
where AzureStorageContainerUrl is set to https://xxxxxxxxx.blob.core.windows.net/
Be aware that if you want to apply a "Storage Blob Data XXXX" role at the subscription level, it will not work if your subscription has Azure Databricks namespaces:
If your subscription includes an Azure DataBricks namespace, roles assigned at the subscription scope will be blocked from granting access to blob and queue data.
Source: https://learn.microsoft.com/en-us/azure/storage/common/storage-auth-aad-rbac-portal#determine-resource-scope
I used the following to connect to blob storage using Azure AD.
This code uses SDK v11, since v12 still has issues with multiple AD accounts.
See this issue:
https://github.com/Azure/azure-sdk-for-net/issues/8658
For further reading on V12 and V11 SDK
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet-legacy
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet
using System;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Services.AppAuthentication;
using Microsoft.Azure.Storage.Auth;
using Microsoft.Azure.Storage.Blob;
using Xunit;

[Fact]
public async Task TestStreamToContainer()
{
    try
    {
        var accountName = "YourStorageAccountName";
        var containerName = "YourContainerName";
        var blobName = "File1";

        // Acquire an Azure AD token for the storage account and wrap it in storage credentials.
        var provider = new AzureServiceTokenProvider();
        var token = await provider.GetAccessTokenAsync($"https://{accountName}.blob.core.windows.net");
        var tokenCredential = new TokenCredential(token);
        var storageCredentials = new StorageCredentials(tokenCredential);

        string containerEndpoint = $"https://{accountName}.blob.core.windows.net";
        var blobClient = new CloudBlobClient(new Uri(containerEndpoint), storageCredentials);
        var containerClient = blobClient.GetContainerReference(containerName);
        var cloudBlob = containerClient.GetBlockBlobReference(blobName);

        // Upload a short string as the blob's contents.
        string blobContents = "This is a block blob contents.";
        byte[] byteArray = Encoding.ASCII.GetBytes(blobContents);
        using (MemoryStream stream = new MemoryStream(byteArray))
        {
            await cloudBlob.UploadFromStreamAsync(stream);
        }
    }
    catch (Exception e)
    {
        Console.WriteLine(e.Message);
        Console.ReadLine();
        throw;
    }
}
I created a storage account with Azure Fluent SDK. But after I created the storage account, I wanted to get the name and key to build the connection string that I can use to access the storage account. The problem is that the 'Key' property is a Guid, not the key shown in the Azure Portal.
This is how I create the storage account.
IStorageAccount storage = azure.StorageAccounts.Define(storageAccountName)
    .WithRegion(Region.USEast)
    .WithNewResourceGroup(rgName)
    .Create();
How can I get the proper Key to build the connection string?
You should be able to do it via the code below; this documentation also shows the use of Fluent, but only for the auth methods:
// Get a storage account.
var storage = azure.StorageAccounts.GetByResourceGroup("myResourceGroup", "myStorageAccount");

// Extract the keys.
var storageKeys = storage.GetKeys();

// Build the connection string.
string storageConnectionString = "DefaultEndpointsProtocol=https;"
    + "AccountName=" + storage.Name
    + ";AccountKey=" + storageKeys[0].Value
    + ";EndpointSuffix=core.windows.net";

// Connect.
var account = CloudStorageAccount.Parse(storageConnectionString);

// Do things with the account here...
Following the Azure reference site for the Analysis Services processing custom .NET activity:
https://github.com/Azure/Azure-DataFactory/tree/master/Samples/AzureAnalysisServicesProcessSample
In this example we have a pipeline to process a cube:
"TabularDatabaseName": "<DATABASE_NAME>",
"AzureADAuthority": "https://login.windows.net/<TENANT_ID>",
"AzureADResource": "https://<LOCATION>.asazure.windows.net",
"AzureADClientId": "<CLIENT_ID>",
"AzureADClientSecret": "<CLIENT_SECRET>"
First, we need to know how to get the AzureADResource and AzureADAuthority information.
Also, this pipeline works fine when we pass a hard-coded password instead of {0}.
So we don't understand where the problem is: is the ServicePrincipalAuth we provided here correct, or is it mandatory to provide a password?
Please follow this article from MSDN, which explains how to create an application in AD and create an AD service principal:
https://learn.microsoft.com/en-us/azure/data-factory/data-factory-create-data-factories-programmatically
Step 7 would return the AD service principal.
Also, to be able to run a pipeline you would require a client when creating a connection with ADF, which would require the client ID, client key, subscription ID, and tenant ID.
This is how I created my client to ADF.
private void CreateADFClient()
{
    AuthenticationContext authenticationContext = new AuthenticationContext($"https://login.windows.net/{tenant_id}");
    ClientCredential credential = new ClientCredential(clientId: this.client_id, clientSecret: this.client_key);

    // Acquire a token for the Azure management endpoint (blocking on the async call).
    AuthenticationResult result = authenticationContext
        .AcquireTokenAsync(resource: "https://management.core.windows.net/", clientCredential: credential)
        .Result;
    if (result == null)
    {
        throw new InvalidOperationException("Failed to obtain the JWT token");
    }

    string token = result.AccessToken;
    TokenCloudCredentials credentials = new TokenCloudCredentials(subscription_id, token);
    this.inner_client = new DataFactoryManagementClient(credentials);
}
The application name that you provided while performing the steps from MSDN can be found on the Azure portal under Azure Active Directory --> App registrations. Corresponding to it is your application ID, which is the client ID here.
To create the client key, click on the application name, go to the 'Keys' section, add a key description and validity period, and save. Upon saving, the value will be shown; this is your client key. Take note of it, as it will not be visible the next time you come to this page, but of course you can create another.
The tenant ID can be downloaded by clicking on Help --> Show diagnostics; a file will be downloaded where you can search for your tenant ID.
Hope this solves your questions around it!