Create Azure CDN endpoint for Azure container

I need to create an Azure CDN endpoint for an Azure container. I am using the code below to do so.
Endpoint endpoint = new Endpoint()
{
    IsHttpAllowed = true,
    IsHttpsAllowed = true,
    Location = this.config.ResourceLocation,
    Origins = new List<DeepCreatedOrigin> { new DeepCreatedOrigin(containerName, string.Format(STORAGE_URL, storageAccountName)) },
    OriginPath = "/" + containerName,
};
await this.cdnManagementClient.Endpoints.CreateAsync(this.config.ResourceGroupName, storageAccountName, containerName, endpoint);
All the information I provide is correct, and the endpoint is created successfully. But when I try to access any blob inside it, I get an InvalidUrl error.
The weird thing is that if I create the same endpoint with the same values through the portal, I am able to access and download the blobs.
Can anyone tell me what I am doing wrong in my code? Do I need to pass any extra parameters?

As far as I know, if you want to create a storage CDN endpoint in code, you need to set the OriginHostHeader value to your storage account URL.
For more details, you can refer to the code below:
// Create CDN client
CdnManagementClient cdn = new CdnManagementClient(new TokenCredentials(token))
{
    SubscriptionId = subscriptionId
};
//ListProfilesAndEndpoints(cdn);
Endpoint e1 = new Endpoint()
{
    // OptimizationType = "storage",
    Origins = new List<DeepCreatedOrigin>() { new DeepCreatedOrigin("{yourstoragename}-blob-core-windows-net", "{yourstoragename}.blob.core.windows.net") },
    OriginHostHeader = "{yourstoragename}.blob.core.windows.net",
    IsHttpAllowed = true,
    IsHttpsAllowed = true,
    OriginPath = "/foo2",
    Location = "EastAsia"
};
cdn.Endpoints.Create(resourcegroup, profilename, endpointname, e1);
Besides, I suggest you generate a SAS token to access the blob file directly by URL.
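For reference, here is a minimal sketch of generating a read-only blob SAS in C# with the Microsoft.WindowsAzure.Storage SDK. The connection string, container name and blob name below are placeholders, not values from your question:

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// Placeholder names; replace with your own account, container and blob.
CloudStorageAccount account = CloudStorageAccount.Parse("{your-storage-connection-string}");
CloudBlobContainer container = account.CreateCloudBlobClient().GetContainerReference("{containername}");
CloudBlockBlob blob = container.GetBlockBlobReference("{blobname}");

// Ad-hoc SAS with read permission only, valid for one hour.
string sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1)
});

// Append the SAS to the blob URL to get a directly downloadable link.
Console.WriteLine(blob.Uri.AbsoluteUri + sas);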

Related

Azure Function - Managed IDs to write to storage table - failing with 403 AuthorizationPermissionMismatch

I have an Azure Function application (HTTP trigger) that writes to a storage queue and table. Both fail when I try to change to managed identity. This post / question is about just the storage table part.
Here's the code that does the actual writing to the table:
GetStorageAccountConnectionData();
try
{
    WorkspaceProvisioningRecord provisioningRecord = new PBIWorkspaceProvisioningRecord();
    provisioningRecord.status = requestType;
    provisioningRecord.requestId = requestId;
    provisioningRecord.workspace = request;
#if DEBUG
    Console.WriteLine(Environment.GetEnvironmentVariable("AZURE_TENANT_ID"));
    Console.WriteLine(Environment.GetEnvironmentVariable("AZURE_CLIENT_ID"));
    DefaultAzureCredentialOptions options = new DefaultAzureCredentialOptions()
    {
        Diagnostics =
        {
            LoggedHeaderNames = { "x-ms-request-id" },
            LoggedQueryParameters = { "api-version" },
            IsLoggingContentEnabled = true
        },
        ExcludeVisualStudioCodeCredential = true,
        ExcludeAzureCliCredential = true,
        ExcludeManagedIdentityCredential = true,
        ExcludeAzurePowerShellCredential = true,
        ExcludeSharedTokenCacheCredential = true,
        ExcludeInteractiveBrowserCredential = true,
        ExcludeVisualStudioCredential = true
    };
#endif
    DefaultAzureCredential credential = new DefaultAzureCredential();
    Console.WriteLine(connection.storageTableUri);
    Console.WriteLine(credential);
    var serviceClient = new TableServiceClient(new Uri(connection.storageTableUri), credential);
    var tableClient = serviceClient.GetTableClient(connection.tableName);
    await tableClient.CreateIfNotExistsAsync();
    var entity = new TableEntity();
    entity.PartitionKey = provisioningRecord.status;
    entity.RowKey = provisioningRecord.requestId;
    entity["requestId"] = provisioningRecord.requestId.ToString();
    entity["status"] = provisioningRecord.status.ToString();
    entity["workspace"] = JsonConvert.SerializeObject(provisioningRecord.workspace);
    //this is where I get the 403
    await tableClient.UpsertEntityAsync(entity);
    //other stuff...
}
catch (AuthenticationFailedException e)
{
    Console.WriteLine($"Authentication Failed. {e.Message}");
    WorkspaceResponse response = new PBIWorkspaceResponse();
    response.requestId = null;
    response.status = "failure";
    return response;
}
catch (Exception ex)
{
    Console.WriteLine($"whoops! Failed to create storage record:{ex.Message}");
    WorkspaceResponse response = new WorkspaceResponse();
    response.requestId = null;
    response.status = "failure";
    return response;
}
I have the client ID / client secret for this security principal defined in my local.settings.json as AZURE_TENANT_ID / AZURE_CLIENT_ID / AZURE_CLIENT_SECRET.
The code dies trying to do the upsert, and it never hits the AuthenticationFailedException - just the general exception.
The security principal defined in the AZURE_* variables was used to create this entire application, including the storage account.
To manage data inside a storage account (like creating a table, etc.), you will need to assign a different set of permissions. The Owner role is a control-plane role that enables you to manage storage accounts themselves, not the data inside them.
From this link:
"Only roles explicitly defined for data access permit a security principal to access blob data. Built-in roles such as Owner, Contributor, and Storage Account Contributor permit a security principal to manage a storage account, but do not provide access to the blob data within that account via Azure AD."
Even though the text above is for Blobs, same thing applies for Tables as well.
Please assign the Storage Table Data Contributor role to your managed identity, and then you should not get this error.

Node.js reading a blob with azure and creating a SAS token

So I am currently writing some code that gets a container, then selects a blob and makes a SAS token, which all currently works, but I get an error when I try to open the link.
The error being given is this.
AuthenticationFailed
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. RequestId:somethingsomething
The specified signed resource is not allowed for the this resource level
const test = () => {
    const keyCredit = new StorageSharedKeyCredential('storageaccount', 'key');
    const sasOptions = {
        containerName: 'compliance',
        blobName: 'swo_compliance.csv',
    };
    sasOptions.expiresOn = new Date(new Date().valueOf() + 3600 * 1000);
    sasOptions.permissions = BlobSASPermissions.parse("r");
    const sasToken = generateBlobSASQueryParameters(sasOptions, keyCredit).toString();
    console.log(`SAS token for blob container is: url/?${sasToken}`);
    return `url/?${sasToken}`;
}
I tried to reproduce the scenario in my system and was able to download the blob using the SAS token.
When you return url/?${sasToken} in your code, remove the "/" and just return url?${sasToken} instead.
Example URL: https://StorageName.blob.core.windows.net/test/test.txt?SASToken
I tried this in my system and was able to download the blob.

How to read an Azure Blob URL?

I am creating a list of blog posts in React and Express with an Azure SQL DB. I am able to use Azure Blob Storage to store the image associated with each post. I am also able to get the blob URL, which I store in my SQL DB. However, when I try to read the URL directly, it throws a "resource not found" error. After searching the docs and other Stack Overflow answers, I could infer that it has something to do with a SAS token. Can anyone explain a better way to approach this?
https://yourdomain.blob.core.windows.net/imagecontainer/yourimage.png
Below is the Node.js code.
router.post('/image', async function (req, res) {
    try {
        console.log(req.files.files.data);
        const blobName = 'test' + uuidv1() + '.png';
        const containerClient = blobServiceClient.getContainerClient(containerName);
        const blockBlobClient = containerClient.getBlockBlobClient(blobName);
        const uploadBlobResponse = await blockBlobClient.upload(req.files.files.data, req.files.files.data.length);
        res.send({ tempUrl: blockBlobClient.url });
    } catch (e) {
        console.log(e);
    }
})
However when I want to read the url directly it threw an error
resource not found.
Most likely you're getting this error because the blob container containing the blob has a Private ACL and because of that anonymous access is disabled. To enable anonymous access, please change the blob container's ACL to Blob or Public and that will solve this problem.
If you can't (or don't want to) change the blob container's ACL, other option would be to create a Shared Access Signature (SAS) on a blob. A SAS essentially gives time and permission bound access to a blob. For your needs, you would need to create a short-lived SAS token with just Read permission.
To generate a SAS token, you will need to use generateBlobSASQueryParameters method. Once you create a SAS token, you will need to append it to your blob's URL to get a SAS URL.
Here's the sample code to do so. It makes use of the @azure/storage-blob node package.
const { BlobSASPermissions, generateBlobSASQueryParameters, StorageSharedKeyCredential } = require("@azure/storage-blob");

const permissions = new BlobSASPermissions();
permissions.read = true; // Set read permission only.
const currentDateTime = new Date();
const expiryDateTime = new Date(currentDateTime.setMinutes(currentDateTime.getMinutes() + 5)); // Expire the SAS token in 5 minutes.
const blobSasModel = {
    containerName: 'your-blob-container-name',
    blobName: 'your-blob-name',
    permissions: permissions,
    expiresOn: expiryDateTime
};
const sharedKeyCredential = new StorageSharedKeyCredential('your-storage-account-name', 'your-storage-account-key');
const sasToken = generateBlobSASQueryParameters(blobSasModel, sharedKeyCredential);
const sasUrl = blockBlobClient.url + "?" + sasToken; // blockBlobClient from your upload code; return this SAS URL to the client.

Azure Storage: Enable blob versioning on storage account programmatically

I'm creating several storage accounts programmatically via StorageManagementClient and would like to enable blob versioning on account level at the time of account creation. How is this accomplished?
var storageManagementClient = new StorageManagementClient(azureCredentials)
{
    SubscriptionId = subscriptionId
};
var storageAccountCreateParameters = new StorageAccountCreateParameters
{
    // set properties
};
await storageManagementClient.StorageAccounts.CreateAsync(resourceGroupName, accountName, storageAccountCreateParameters);
I thought that this would be available as a create parameter in StorageAccountCreateParameters, but I don't see anything there.
Also see https://learn.microsoft.com/en-us/azure/storage/blobs/versioning-enable?tabs=portal
Blob versioning is not included in StorageAccountCreateParameters. It belongs to the BlobServiceProperties class.
So after you create the storage account with your code above, you can use the following code to enable blob versioning:
var p1 = new BlobServiceProperties()
{
    IsVersioningEnabled = true
};
storageManagementClient.BlobServices.SetServiceProperties("resource_group", "account_name", p1);
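If you want the whole flow in one place, here is a minimal async sketch, under the assumption that your SDK version exposes the SetServicePropertiesAsync extension method (resourceGroupName, accountName and storageAccountCreateParameters are the same variables as in your snippet above):

// Create the account first, then enable versioning on its blob service.
await storageManagementClient.StorageAccounts.CreateAsync(resourceGroupName, accountName, storageAccountCreateParameters);
await storageManagementClient.BlobServices.SetServicePropertiesAsync(
    resourceGroupName,
    accountName,
    new BlobServiceProperties { IsVersioningEnabled = true });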

Create Shared Access Token with Microsoft.WindowsAzure.Storage returns 403

I have a fairly simple method that uses the NEW Storage API to create a SAS and copy a blob from one container to another.
I am trying to use this to copy a blob BETWEEN STORAGE ACCOUNTS. So I have two storage accounts, with the exact same containers, and I am trying to copy a blob from one storage account's container to the other storage account's container.
I don't know if the SDK is built for that, but it seems like it would be a common scenario.
Some additional information:
I create the token on the Destination Container.
Does that token need to be created on both the source and destination? Does it take time to register the token? Do I need to create it for each request, or only once per token "lifetime"?
I should mention that a 403 is a Forbidden HTTP error code.
private static string CreateSharedAccessToken(CloudBlobClient blobClient, string containerName)
{
    var container = blobClient.GetContainerReference(containerName);
    var blobPermissions = new BlobContainerPermissions();
    // The shared access policy provides read/write access to the container for 10 hours:
    blobPermissions.SharedAccessPolicies.Add("SolutionPolicy", new SharedAccessBlobPolicy()
    {
        // To ensure SAS is valid immediately we don't set start time
        // so we can avoid failures caused by small clock differences:
        SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1),
        Permissions = SharedAccessBlobPermissions.Write |
                      SharedAccessBlobPermissions.Read
    });
    blobPermissions.PublicAccess = BlobContainerPublicAccessType.Blob;
    container.SetPermissions(blobPermissions);
    return container.GetSharedAccessSignature(new SharedAccessBlobPolicy(), "SolutionPolicy");
}
Down the line I use this token to call a copy operation, which returns a 403:
var uri = new Uri(srcBlob.Uri.AbsoluteUri + blobToken);
destBlob.StartCopyFromBlob(uri);
My version of Azure.Storage is 2.1.0.2.
Here is the full copy method in case that helps:
private static void CopyBlobs(
    CloudBlobContainer srcContainer, string blobToken,
    CloudBlobContainer destContainer)
{
    var srcBlobList
        = srcContainer.ListBlobs(string.Empty, true, BlobListingDetails.All); // set to none in prod (4perf)
    //// get the SAS token to use for all blobs
    //string token = srcContainer.GetSharedAccessSignature(
    //    new SharedAccessBlobPolicy(), "SolutionPolicy");
    bool pendingCopy = true;
    foreach (var src in srcBlobList)
    {
        var srcBlob = src as ICloudBlob;
        // Determine BlobType:
        ICloudBlob destBlob;
        if (srcBlob.Properties.BlobType == BlobType.BlockBlob)
        {
            destBlob = destContainer.GetBlockBlobReference(srcBlob.Name);
        }
        else
        {
            destBlob = destContainer.GetPageBlobReference(srcBlob.Name);
        }
        // Determine Copy State:
        if (destBlob.CopyState != null)
        {
            switch (destBlob.CopyState.Status)
            {
                case CopyStatus.Failed:
                    log.Info(destBlob.CopyState);
                    break;
                case CopyStatus.Aborted:
                    log.Info(destBlob.CopyState);
                    pendingCopy = true;
                    destBlob.StartCopyFromBlob(destBlob.CopyState.Source);
                    return;
                case CopyStatus.Pending:
                    log.Info(destBlob.CopyState);
                    pendingCopy = true;
                    break;
            }
        }
        // copy using only Policy ID:
        var uri = new Uri(srcBlob.Uri.AbsoluteUri + blobToken);
        destBlob.StartCopyFromBlob(uri);
        //// copy using src blob as SAS
        //var source = new Uri(srcBlob.Uri.AbsoluteUri + token);
        //destBlob.StartCopyFromBlob(source);
    }
}
And finally the account and client (vetted) code:
var credentials = new StorageCredentials("BAR", "FOO");
var account = new CloudStorageAccount(credentials, true);
var blobClient = account.CreateCloudBlobClient();
var sasToken = CreateSharedAccessToken(blobClient, "content");
When I use a REST client this seems to work... any ideas?
Consider also this problem:
var uri = new Uri(srcBlob.Uri.AbsoluteUri + blobToken);
You are probably calling the "ToString" method of Uri, which produces a "human readable" version of the URL. If the blobToken contains special characters, for example "+", this will cause a malformed-token error on the storage server, which will refuse to give you access.
Use this instead:
String uri = srcBlob.Uri.AbsoluteUri + blobToken;
Shared Access Tokens are not required for this task. I ended up with two accounts and it works fine:
var accountSrc = new CloudStorageAccount(credsSrc, true);
var accountDest = new CloudStorageAccount(credsSrc, true);
var blobClientSrc = accountSrc.CreateCloudBlobClient();
var blobClientDest = accountDest.CreateCloudBlobClient();
// Set permissions on the container.
var permissions = new BlobContainerPermissions {PublicAccess = BlobContainerPublicAccessType.Blob};
srcContainer.SetPermissions(permissions);
destContainer.SetPermissions(permissions);
//grab the blob
var sourceBlob = srcContainer.GetBlockBlobReference("FOO");
var destinationBlob = destContainer.GetBlockBlobReference("BAR");
//create a new blob
destinationBlob.StartCopyFromBlob(sourceBlob);
Since both CloudStorageAccount objects point to the same account, copying without a SAS token would work just fine as you also mentioned.
On the other hand, you need either a publicly accessible blob or a SAS token to copy from another account. So what you tried was correct, but you established a container-level access policy, which can take up to 30 seconds to take effect as also documented in MSDN. During this interval, a SAS token that is associated with the stored access policy will fail with status code 403 (Forbidden), until the access policy becomes active.
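If you would rather not wait for the stored access policy to propagate, a minimal sketch of the alternative is an ad-hoc SAS generated directly on the source blob (reusing the srcBlob and destBlob variables from your CopyBlobs method; treat this as a sketch rather than tested code):

// Ad-hoc SAS (not tied to a stored access policy), so it is usable immediately.
string sourceSas = srcBlob.GetSharedAccessSignature(new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1)
});
destBlob.StartCopyFromBlob(new Uri(srcBlob.Uri.AbsoluteUri + sourceSas));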
One more thing that I would like to point out: when you call Get*BlobReference to create a new blob object, the CopyState property will not be populated until you do a GET/HEAD operation such as FetchAttributes.
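For example, something along these lines before you inspect CopyState (again reusing the destBlob variable and log object from your code; a sketch only, not verified against your exact SDK version):

try
{
    // A GET/HEAD request is required before CopyState is populated on a fresh reference.
    destBlob.FetchAttributes();
}
catch (StorageException)
{
    // The destination blob may not exist yet, in which case there is no copy state to inspect.
}
if (destBlob.CopyState != null)
{
    log.Info(destBlob.CopyState.Status);
}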
