How to create multiple stored access policy for the same Azure blob container? - azure

I’ve read through and played with the sample code in the https://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-shared-access-signature-part-2/#part-1-create-a-console-application-to-generate-shared-access-signatures
I then applied it to my scenario.
I wrote a tool to upload data from partners to Azure blob storage, which is then consumed by some internal teams:
YYYY-MM (container)
    DD-GUID (prefix)
        File1.zip
        File2.zip
        ...
I created 2 policies per container:
1. Write only for the partner so that they can only write blobs and nothing else.
2. List and read for our internal teams so that they can list and read(download) all the blobs in the container.
My thought is that I can simply hand the correct policy to the correct recipients; however, my implementation doesn’t work as I expected.
I created 2 policies for each container using the below method, of course with correct permission per policy:
static void CreateSharedAccessPolicy(CloudBlobClient blobClient, CloudBlobContainer container, string policyName)
{
    // Create a new stored access policy and define its constraints.
    SharedAccessBlobPolicy sharedPolicy = new SharedAccessBlobPolicy()
    {
        SharedAccessExpiryTime = DateTime.UtcNow.AddHours(10),
        Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.List
    };

    // Get the container's existing permissions.
    BlobContainerPermissions permissions = new BlobContainerPermissions();

    // Add the new policy to the container's permissions.
    permissions.SharedAccessPolicies.Clear();
    permissions.SharedAccessPolicies.Add(policyName, sharedPolicy);
    container.SetPermissions(permissions);
}
I created the write-only policy first and then the read-and-list policy. What I observed is that the first policy doesn't seem to work at all: every request comes back with 403 Forbidden. For the second policy, the only thing that works is listing blobs, but not reading (I tried to download a blob and got 404 Not Found).
It seems like I've missed something very basic here. Can you please help me out and point out what's wrong with my approach?
Below is the code I used to test the container permissions. I also noticed that the Read permission on a container doesn't really work, as mentioned somewhere in the Azure documentation. I'm trying to find an easy way to hand people a stored access policy so that they can list and download all the blobs in the container, instead of providing them a signature per blob file:
static void UseContainerSAS(string sas)
{
    // Try performing container operations with the SAS provided.
    // Return a reference to the container using the SAS URI.
    CloudBlobContainer container = new CloudBlobContainer(new Uri(sas));

    // Create a list to store blob URIs returned by a listing operation on the container.
    List<Uri> blobUris = new List<Uri>();

    try
    {
        // Write operation: write a new blob to the container.
        CloudBlockBlob blob = container.GetBlockBlobReference("blobCreatedViaSAS.txt");
        string blobContent = "This blob was created with a shared access signature granting write permissions to the container. ";
        MemoryStream msWrite = new MemoryStream(Encoding.UTF8.GetBytes(blobContent));
        msWrite.Position = 0;
        using (msWrite)
        {
            blob.UploadFromStream(msWrite);
        }
        Console.WriteLine("Write operation succeeded for SAS " + sas);
        Console.WriteLine();
    }
    catch (StorageException e)
    {
        Console.WriteLine("Write operation failed for SAS " + sas);
        Console.WriteLine("Additional error information: " + e.Message);
        Console.WriteLine();
    }

    try
    {
        // List operation: list the blobs in the container, including the one just added.
        foreach (ICloudBlob blobListing in container.ListBlobs())
        {
            blobUris.Add(blobListing.Uri);
        }
        Console.WriteLine("List operation succeeded for SAS " + sas);
        Console.WriteLine();
    }
    catch (StorageException e)
    {
        Console.WriteLine("List operation failed for SAS " + sas);
        Console.WriteLine("Additional error information: " + e.Message);
        Console.WriteLine();
    }

    try
    {
        // Read operation: download the first blob returned by the listing.
        CloudBlockBlob blob = container.GetBlockBlobReference(blobUris[0].ToString());
        MemoryStream msRead = new MemoryStream();
        msRead.Position = 0;
        using (msRead)
        {
            blob.DownloadToStream(msRead);
            Console.WriteLine(msRead.Length);
        }
        Console.WriteLine("Read operation succeeded for SAS " + sas);
        Console.WriteLine();
    }
    catch (StorageException e)
    {
        Console.WriteLine("Read operation failed for SAS " + sas);
        Console.WriteLine("Additional error information: " + e.Message);
        Console.WriteLine();
    }

    Console.WriteLine();

    try
    {
        // Delete operation: delete a blob in the container.
        CloudBlockBlob blob = container.GetBlockBlobReference(blobUris[0].ToString());
        blob.Delete();
        Console.WriteLine("Delete operation succeeded for SAS " + sas);
        Console.WriteLine();
    }
    catch (StorageException e)
    {
        Console.WriteLine("Delete operation failed for SAS " + sas);
        Console.WriteLine("Additional error information: " + e.Message);
        Console.WriteLine();
    }
}

Actually, your second call erased what the first one did. To avoid that, you should read the container's existing permissions, add the new policy to them, and then set the permissions back on the container.
Below is the corrected code sample:
static void CreateSharedAccessPolicy(CloudBlobClient blobClient, CloudBlobContainer container, string policyName)
{
    // Create a new stored access policy and define its constraints.
    SharedAccessBlobPolicy sharedPolicy = new SharedAccessBlobPolicy()
    {
        SharedAccessExpiryTime = DateTime.UtcNow.AddHours(10),
        Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.List
    };

    // Get the container's existing permissions.
    BlobContainerPermissions permissions = container.GetPermissions();

    // Add the new policy to the container's permissions.
    permissions.SharedAccessPolicies.Add(policyName, sharedPolicy);
    container.SetPermissions(permissions);
}
As for the 404 error when reading blobs: please share the code you use to create the SAS from the policy, and how you use the created SAS to read blobs, so that I can help troubleshoot.
Below is a code sample for creating a SAS and using it to read blobs (you can copy and paste the URLs from stdout directly into a browser to try them):
var permissions = container.GetPermissions();
var policy = new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddYears(1),
};
string policyName = "read";
permissions.SharedAccessPolicies.Add(policyName, policy);
container.SetPermissions(permissions);

string sas = container.GetSharedAccessSignature(null, policyName);
var blobs = container.ListBlobs(null, true);

Console.WriteLine("SAS = {0}", sas);
Console.WriteLine("Blobs URLs with SAS:");
foreach (var blob in blobs)
{
    Console.WriteLine(blob.Uri.ToString() + sas);
}
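The URLs printed above are simply each blob's URI with the container-level SAS token appended; the token returned by GetSharedAccessSignature already starts with "?". A minimal, self-contained sketch of that composition (type and method names are hypothetical, not from the thread):

```csharp
using System;

static class SasUrl
{
    // Compose a downloadable URL from a blob URI and a container-level SAS token.
    // GetSharedAccessSignature returns the token with a leading '?', so plain
    // string concatenation is all that is needed.
    public static string Compose(string blobUri, string sasToken)
    {
        if (!sasToken.StartsWith("?"))
            throw new ArgumentException("Expected a SAS token starting with '?'.", nameof(sasToken));
        return blobUri + sasToken;
    }
}
```

Note that concatenating the raw strings keeps any percent-escapes in the token (such as %2B) intact, which matters when the URL is later pasted into a browser.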

Related

Xamarin text blob Azure storage

I am using Azure blob storage to store images, and I am trying to display them in my Xamarin.Forms application. I found a simple tutorial and the code on GitHub.
I succeeded in implementing it by following the steps and creating an account on Azure blob storage.
The problem is: I can see the name of the file, but not the image.
Here is the error:
read started: <Thread Pool> #9
[0:] HTTP Request: Could not retrieve https://xxxxxx.blob.core.windows.net/yyyy/kakashi.jpg, status code NotFound
[0:] ImageLoaderSourceHandler: Could not retrieve image or image data was invalid: Uri: https://lxxxxxx.blob.core.windows.net/yyyy/kakashi.jpg
Thread finished: <Thread Pool> #4
(The original post links to the tutorial and the GitHub repository, and includes a screenshot of the output.)
And I get this error when I put the URL of the image (https://lxxxxxx.blob.core.windows.net/yyyy/kakashi.jpg) in my browser:
This XML file does not appear to have any style information associated with it. The document tree is shown below.
<Error>
<Code>ResourceNotFound</Code>
<Message>
The specified resource does not exist. RequestId:97933c69-a01e-014f-6669-f0502e000000 Time:2018-05-20T18:33:28.4774584Z
</Message>
</Error>
The error means you didn't set the Public access level to Blob.
See this requirement in your tutorial.
Code you use requires this setting, because it accesses the blob directly using blob Uri.
See PhotosBlobStorageService.cs
return blobList.Select(x => new PhotoModel { Title = x.Name, Uri = x.Uri }).ToList();
If you do want to keep Private level, you have to make some changes to the statement above. Here's the reference.
return blobList.Select(x => new PhotoModel
{
    Title = x.Name,
    Uri = new Uri(x.Uri + x.GetSharedAccessSignature(
        new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.Write,
            // You can modify the expiration to meet your requirements.
            SharedAccessExpiryTime = DateTime.UtcNow.AddYears(1)
        }))
}).ToList();
This change allows you to visit private blob with a SAS.
1. Please check your subscription first.
2. Check the access policy of your container.
3. Here are the steps to save and get blobs through code:
1) Using NuGet, install the required assembly packages: go to "Manage Packages for Solution", search for WindowsAzure.Storage and WindowsAzure.ConfigurationManager, and click Install.
2) Put the access keys in the configuration.
3) Sample code to create a blob:
public async Task<string> SaveImagesToAzureBlob(HttpPostedFileBase imageToUpload)
{
    try
    {
        CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
        CloudBlobClient cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();
        CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference("sampleimage");

        if (await cloudBlobContainer.CreateIfNotExistsAsync())
        {
            await cloudBlobContainer.SetPermissionsAsync(
                new BlobContainerPermissions
                {
                    PublicAccess = BlobContainerPublicAccessType.Blob
                });
        }

        string imageFullPath = null;
        string imageName = Guid.NewGuid().ToString() + "-" + Path.GetExtension(imageToUpload.FileName);
        CloudBlockBlob cloudBlockBlob = cloudBlobContainer.GetBlockBlobReference(imageName);
        cloudBlockBlob.Properties.ContentType = imageToUpload.ContentType;
        await cloudBlockBlob.UploadFromStreamAsync(imageToUpload.InputStream);
        imageFullPath = cloudBlockBlob.Uri.ToString();
        return imageFullPath;
    }
    catch (Exception ex)
    {
        throw ex;
    }
}
Now, check your storage account; you can see the "sampleimage" container that was generated.
By default, the container is private, so no one can access it from outside. To set the permissions, use the SetPermissions method as below:
cloudBlobContainer.SetPermissions(new BlobContainerPermissions { PublicAccess = BlobContainerPublicAccessType.Blob });
Please try the different public access levels in the list, and note the permission level setting: in your case it may be what causes the issue.
For more details, see:
https://learn.microsoft.com/en-us/azure/azure-resource-manager/resource-manager-deployment-model
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-dotnet-how-to-use-blobs

Fine-uploader Azure upload Url is being changed

I had my project working in Core 1, but when I upgraded to Core 2 it no longer uploads the images to Azure. I get a 404 response code and a "Problem sending file to Azure" message. The correct URL is returned from the server, but then Fine Uploader calls a URL with the current page URL in front.
The URL returning the 404 in the Chrome console is shown as:
https://localhost:44348/House/%22https://Customstorage.blob.core.windows.net/images/b0975cc7-fae5-4130-bced-c26272d0a21c.jpeg?sv=2017-04-17&sr=b&sig=UFUEsHT1GWI%2FfMd9cuHmJsl2j05W1Acux52UZ5IsXso%3D&se=2017-09-16T04%3A06%3A36Z&sp=rcwd%22
This is being added to the front of the URL somewhere:
https://localhost:44348/House/%22
I create the SAS with
public async Task<string> SAS(string bloburi, string name)
{
    CloudStorageAccount storageAccount = new CloudStorageAccount(
        new Microsoft.WindowsAzure.Storage.Auth.StorageCredentials(
            "<storage-account-name>",
            "<access-key>"), true);

    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = blobClient.GetContainerReference("images");
    return await Task.FromResult<string>(GetBlobSasUri(container, bloburi));
}
private static string GetBlobSasUri(CloudBlobContainer container, string blobName, string policyName = null)
{
    string sasBlobToken;

    // Get a reference to a blob within the container.
    // Note that the blob may not exist yet, but a SAS can still be created for it.
    var uri = new Uri(blobName);
    // Get the name of the file from the path.
    string filename = System.IO.Path.GetFileName(uri.LocalPath);
    CloudBlockBlob blob = container.GetBlockBlobReference(filename);

    if (policyName == null)
    {
        // Create a new access policy and define its constraints.
        // Note that the SharedAccessBlobPolicy class is used both to define the parameters of an ad-hoc SAS,
        // and to construct a shared access policy that is saved to the container's shared access policies.
        SharedAccessBlobPolicy adHocSAS = new SharedAccessBlobPolicy()
        {
            // When the start time for the SAS is omitted, the start time is assumed to be the time when the storage service receives the request.
            // Omitting the start time for a SAS that is effective immediately helps to avoid clock skew.
            SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1),
            Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Create | SharedAccessBlobPermissions.Delete
        };

        // Generate the shared access signature on the blob, setting the constraints directly on the signature.
        sasBlobToken = blob.GetSharedAccessSignature(adHocSAS);
    }
    else
    {
        // Generate the shared access signature on the blob. In this case, all of the constraints for the
        // shared access signature are specified on the container's stored access policy.
        sasBlobToken = blob.GetSharedAccessSignature(null, policyName);
    }

    // Return the URI string for the blob, including the SAS token.
    return blob.Uri + sasBlobToken;
}
Fine Uploader configuration:
var uploader = new qq.azure.FineUploader({
    element: document.getElementById('uploader'),
    template: 'qq-template',
    autoUpload: false,
    request: {
        endpoint: 'https://customstorage.blob.core.windows.net/images'
    },
    signature: {
        endpoint: '/House/SAS'
    },
    uploadSuccess: {
        endpoint: '/House/UploadImage'
    }
});
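A hedged aside on the stray %22 in the failing URL (an observation, not from the original thread): %22 is the percent-encoding of a double-quote character, which would be consistent with the signature endpoint returning the SAS wrapped in JSON string quotes, so the client treats the quoted string as a relative path. A minimal check of that encoding:

```csharp
using System;

static class PercentEncoding
{
    // Uri.EscapeDataString percent-encodes reserved characters;
    // a double quote becomes %22, matching the stray %22 in the failing URL.
    public static string Encode(string s) => Uri.EscapeDataString(s);
}
```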

Delete "subpath" from Azure Storage

I know Azure doesn't have actual subpaths, but if I have, for example, container/projectID/iterationNumber/filename.jpg and I delete a project, how can I delete everything under projectID? Is it possible through code?
I don't want to use the Azure application, as I am creating a web app.
Thanks in advance.
EDIT:
This is the code provided by Microsoft to target one specific item:
// Retrieve storage account from connection string.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    CloudConfigurationManager.GetSetting("StorageConnectionString"));

// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

// Retrieve reference to a previously created container.
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");

// Retrieve reference to a blob named "myblob.txt".
CloudBlockBlob blockBlob = container.GetBlockBlobReference("myblob.txt");

// Delete the blob.
blockBlob.Delete();

SystemDesignModel:
public static SystemDesign returnImageURL(IListBlobItem item)
{
    if (item is CloudBlockBlob)
    {
        var blob = (CloudBlockBlob)item;
        return new SystemDesign
        {
            URL = blob.Uri.ToString(),
        };
    }
    return null;
}
As you know, blob storage does not have the concept of subfolders; it has just a two-level hierarchy: containers and blobs. So, in essence, a subfolder is just a prefix that you attach to a blob name. In your example, the actual file you uploaded is filename.jpg, but its name from blob storage's perspective is projectID/iterationNumber/filename.jpg.
Since there is no concept of a subfolder, you can't delete one the way you would on a local computer. However, there is a way: blob storage provides a way to list blobs whose names start with a certain prefix. So first list all blobs that start with the prefix (projectID in your case), and then delete each blob returned by the listing operation.
Take a look at sample code below:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    CloudConfigurationManager.GetSetting("StorageConnectionString"));
var container = storageAccount.CreateCloudBlobClient().GetContainerReference("container");

BlobContinuationToken token = null;
do
{
    var listingResult = container.ListBlobsSegmented("blob-prefix (projectID in your case)", true, BlobListingDetails.None, 5000, token, null, null);
    token = listingResult.ContinuationToken;
    var blobs = listingResult.Results;
    foreach (var blob in blobs)
    {
        (blob as ICloudBlob).DeleteIfExists();
        Console.WriteLine(blob.Uri.AbsoluteUri + " deleted.");
    }
}
while (token != null);
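The prefix-is-just-a-string idea can be illustrated without any Azure calls. Here is a small self-contained sketch (blob names and the helper are hypothetical) of selecting which blobs a "folder" delete would touch:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class VirtualFolders
{
    // A "folder" delete in blob storage is really: list all names with the
    // prefix, then delete each matching blob one at a time.
    public static List<string> BlobsUnder(IEnumerable<string> blobNames, string prefix)
        => blobNames.Where(n => n.StartsWith(prefix, StringComparison.Ordinal)).ToList();
}
```

Deleting projectID/ then reduces to calling DeleteIfExists on each name this returns, exactly as the listing loop above does.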

Create Shared Access Token with Microsoft.WindowsAzure.Storage returns 403

I have a fairly simple method that uses the new Storage API to create a SAS and copy a blob from one container to another.
I am trying to use this to copy blobs between storage accounts: I have two storage accounts with exactly the same containers, and I am trying to copy a blob from one storage account's container into the other storage account's container.
I don't know if the SDK is built for that, but it seems like it would be a common scenario.
Some additional information:
I create the token on the destination container.
Does the token need to be created on both the source and the destination? Does it take time for the token to register? Do I need to create it for each request, or only once per token "lifetime"?
I should mention that 403 is the Forbidden HTTP status code.
private static string CreateSharedAccessToken(CloudBlobClient blobClient, string containerName)
{
    var container = blobClient.GetContainerReference(containerName);
    var blobPermissions = new BlobContainerPermissions();

    // The shared access policy provides read/write access to the container for 1 hour:
    blobPermissions.SharedAccessPolicies.Add("SolutionPolicy", new SharedAccessBlobPolicy()
    {
        // To ensure the SAS is valid immediately we don't set a start time,
        // so we can avoid failures caused by small clock differences:
        SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1),
        Permissions = SharedAccessBlobPermissions.Write |
                      SharedAccessBlobPermissions.Read
    });
    blobPermissions.PublicAccess = BlobContainerPublicAccessType.Blob;
    container.SetPermissions(blobPermissions);

    return container.GetSharedAccessSignature(new SharedAccessBlobPolicy(), "SolutionPolicy");
}
Down the line I use this token to call a copy operation, which returns a 403:
var uri = new Uri(srcBlob.Uri.AbsoluteUri + blobToken);
destBlob.StartCopyFromBlob(uri);
My version of Azure.Storage is 2.1.0.2.
Here is the full copy method in case that helps:
private static void CopyBlobs(
    CloudBlobContainer srcContainer, string blobToken,
    CloudBlobContainer destContainer)
{
    var srcBlobList
        = srcContainer.ListBlobs(string.Empty, true, BlobListingDetails.All); // set to None in prod (for perf)

    //// get the SAS token to use for all blobs
    //string token = srcContainer.GetSharedAccessSignature(
    //    new SharedAccessBlobPolicy(), "SolutionPolicy");

    bool pendingCopy = true;
    foreach (var src in srcBlobList)
    {
        var srcBlob = src as ICloudBlob;

        // Determine BlobType:
        ICloudBlob destBlob;
        if (srcBlob.Properties.BlobType == BlobType.BlockBlob)
        {
            destBlob = destContainer.GetBlockBlobReference(srcBlob.Name);
        }
        else
        {
            destBlob = destContainer.GetPageBlobReference(srcBlob.Name);
        }

        // Determine copy state:
        if (destBlob.CopyState != null)
        {
            switch (destBlob.CopyState.Status)
            {
                case CopyStatus.Failed:
                    log.Info(destBlob.CopyState);
                    break;
                case CopyStatus.Aborted:
                    log.Info(destBlob.CopyState);
                    pendingCopy = true;
                    destBlob.StartCopyFromBlob(destBlob.CopyState.Source);
                    return;
                case CopyStatus.Pending:
                    log.Info(destBlob.CopyState);
                    pendingCopy = true;
                    break;
            }
        }

        // Copy using only the policy ID:
        var uri = new Uri(srcBlob.Uri.AbsoluteUri + blobToken);
        destBlob.StartCopyFromBlob(uri);

        //// copy using src blob as SAS
        //var source = new Uri(srcBlob.Uri.AbsoluteUri + token);
        //destBlob.StartCopyFromBlob(source);
    }
}
And finally the account and client (vetted) code:
var credentials = new StorageCredentials("BAR", "FOO");
var account = new CloudStorageAccount(credentials, true);
var blobClient = account.CreateCloudBlobClient();
var sasToken = CreateSharedAccessToken(blobClient, "content");
When I use a REST client this seems to work... any ideas?
Consider also this problem:
var uri = new Uri(srcBlob.Uri.AbsoluteUri + blobToken);
You are probably calling the ToString method of Uri, which produces a "human-readable" version of the URL. If the blobToken contains special characters, for example "+", this will cause a malformed-token error on the storage server, which will refuse to give you access.
Use this instead:
string uri = srcBlob.Uri.AbsoluteUri + blobToken;
Shared Access Tokens are not required for this task. I ended up with two accounts and it works fine:
var accountSrc = new CloudStorageAccount(credsSrc, true);
var accountDest = new CloudStorageAccount(credsSrc, true);
var blobClientSrc = accountSrc.CreateCloudBlobClient();
var blobClientDest = accountDest.CreateCloudBlobClient();

// Set permissions on the containers.
var permissions = new BlobContainerPermissions { PublicAccess = BlobContainerPublicAccessType.Blob };
srcContainer.SetPermissions(permissions);
destContainer.SetPermissions(permissions);

// Grab the blobs.
var sourceBlob = srcContainer.GetBlockBlobReference("FOO");
var destinationBlob = destContainer.GetBlockBlobReference("BAR");

// Start the copy.
destinationBlob.StartCopyFromBlob(sourceBlob);
Since both CloudStorageAccount objects point to the same account (note that credsSrc is passed to both), copying without a SAS token works just fine, as you also mentioned.
On the other hand, you need either a publicly accessible blob or a SAS token to copy from another account. So what you tried was correct, but you established a container-level access policy, which can take up to 30 seconds to take effect, as documented on MSDN. During this interval, a SAS token associated with the stored access policy will fail with status code 403 (Forbidden) until the access policy becomes active.
One more thing I would like to point out: when you call Get*BlobReference to create a new blob object, the CopyState property will not be populated until you perform a GET/HEAD operation on it, such as FetchAttributes.
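Since a freshly added stored access policy can take up to about 30 seconds to become active, one pragmatic workaround (a sketch, not from the thread; the helper name is hypothetical) is a small retry loop around the first SAS-authenticated call:

```csharp
using System;
using System.Threading;

static class PolicyRetry
{
    // Retry an operation that may fail while a stored access policy propagates.
    // Returns the attempt number that succeeded; throws once the budget is exhausted.
    public static int Until(Func<bool> attempt, int maxAttempts, TimeSpan delay)
    {
        for (int i = 1; i <= maxAttempts; i++)
        {
            if (attempt()) return i;       // e.g. a copy/read that no longer returns 403
            if (i < maxAttempts) Thread.Sleep(delay);
        }
        throw new TimeoutException("Stored access policy still not active after retries.");
    }
}
```

In the copy scenario above, the attempt delegate would wrap StartCopyFromBlob and return false when a StorageException with status 403 is caught.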

Windows Azure Blob

I've been trying to create a Windows Azure blob containing an image file. I followed these tutorials: http://www.nickharris.net/2012/11/how-to-upload-an-image-to-windows-azure-storage-using-mobile-services/ and http://www.windowsazure.com/en-us/develop/mobile/tutorials/upload-images-to-storage-dotnet/. The following code is a merge of the two. On the last line, however, an exception is raised:
An exception of type 'System.TypeLoadException' occurred in
mscorlib.ni.dll but was not handled in user code
Additional information: A binding for the specified type name was not
found. (Exception from HRESULT: 0x80132005)
Even though the container and the table are created, it doesn't work properly.
private async void SendPicture()
{
    StorageFile media = await StorageFile.GetFileFromPathAsync("fanny.jpg");
    if (media != null)
    {
        // Add a todo item to trigger the insert operation, which returns item.SAS.
        var todoItem = new Imagem()
        {
            ContainerName = "mypics",
            ResourceName = "Fanny",
            ImageUri = "uri"
        };
        await imagemTable.InsertAsync(todoItem);

        // Upload the image directly to blob storage using the SAS and the Storage Client library for Windows CTP.
        // Get a stream of the image just taken.
        using (var fileStream = await media.OpenStreamForReadAsync())
        {
            // Our credential for the upload is our SAS token.
            StorageCredentials cred = new StorageCredentials(todoItem.SasQueryString);
            var imageUri = new Uri(todoItem.SasQueryString);

            // Instantiate a blob container based on the info in the returned item.
            CloudBlobContainer container = new CloudBlobContainer(
                new Uri(string.Format("https://{0}/{1}",
                    imageUri.Host, todoItem.ContainerName)), cred);

            // Upload the new image as a blob from the stream.
            CloudBlockBlob blobFromSASCredential =
                container.GetBlockBlobReference(todoItem.ResourceName);
            await blobFromSASCredential.UploadFromStreamAsync(fileStream.AsInputStream());
        }
    }
}
Please use the Assembly Binding Log Viewer (Fuslogvw.exe) to see which assembly load is failing. As also mentioned in the article, the common language runtime's failure to locate an assembly typically shows up as a TypeLoadException in your application.