Our software uses Azure blob & Azure table storage.
I would like developers to be able to look through our production data with the Microsoft Azure Storage Explorer, but not be allowed to accidentally edit its data.
I don't want to allow anonymous (read-only) access to the data, as suggested here.
What would be a good way to achieve this?
Make use of the Shared Access Signature (SAS) option to connect to Azure Blob Storage from Storage Explorer.
Find more details about SAS here.
Find more details about SAS in Storage Explorer here.
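As a minimal sketch of what such a read-only SAS could look like with the Python SDK (the account name and key below are placeholders, and azure-storage-blob covers the Blob service; azure-data-tables offers a similar account-SAS helper for Table storage):

```python
from datetime import datetime, timedelta, timezone
from azure.storage.blob import generate_account_sas, ResourceTypes, AccountSasPermissions

# Placeholder account name/key; read + list only, so developers cannot modify data.
sas_token = generate_account_sas(
    account_name="myprodaccount",
    account_key="<account-key>",
    resource_types=ResourceTypes(service=True, container=True, object=True),
    permission=AccountSasPermissions(read=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(days=30),
)
print(sas_token)  # hand this token to developers to use when connecting Storage Explorer via SAS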
I have some e-mail attachments being saved to Azure Blob.
I am now trying to write an Azure Functions app that would connect to that blob storage, run some scripts and re-save the file.
However, when selecting a storage account for the function, I couldn't select my blob storage account.
I went on the website and it said this:
When creating a function app, you must create or link to a general-purpose Azure Storage account that supports Blob, Queue, and Table storage. Some storage accounts don't support queues and tables. These accounts include blob-only storage accounts and Azure Premium Storage.
I'm wondering, is there any workaround for this? And if not, perhaps any other suggestions? I'm becoming a little lost in all the options and which one to actually choose.
Thanks!
EDIT: Might I add, I am writing the function in Python.
I think you are overlooking the fact that you can have multiple storage accounts. For an Azure Function to work you need a storage account. That storage account is used to store runtime information of the Azure Function for internal purposes like state management. This storage account is subject to the restrictions you already found out about. There is no workaround for that.
However, if the function you are writing needs to access another storage account, it is free to do so. You just have to provide the details to connect to that specific storage account. In that case you also have a clear separation between the storage account that the Azure Function uses for its internal operations and the storage account your application needs to connect to, over which you have total control without having to worry that you break things by deleting internally used blobs/tables/queues.
You can have a blob-triggered function that fires when changes occur in your specific blob storage. That doesn't need to be the storage account the Azure Function uses internally, which is the one created/selected when creating the Azure Function.
Here is a sample that shows how to add a blob-triggered Azure Function in Python (a rough sketch in the same spirit is shown below). MyStorageAccountAppSetting refers to an app setting that holds the connection string to the storage account that you use for your data.
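As a rough illustration (not the linked sample itself), a blob-triggered function in the Python v2 programming model might look like the following; the container name email-attachments and the function name are assumptions for this sketch:

```python
import logging
import azure.functions as func

app = func.FunctionApp()

# "MyStorageAccountAppSetting" is an app setting holding the connection string
# of the storage account that actually contains the e-mail attachments.
@app.blob_trigger(arg_name="attachment",
                  path="email-attachments/{name}",   # assumed container name
                  connection="MyStorageAccountAppSetting")
def process_attachment(attachment: func.InputStream):
    logging.info("Processing %s (%d bytes)", attachment.name, attachment.length)
    # ... run your scripts on the blob contents here ...
```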
The snippet from the website you are quoting is about the storage account used to store the function app code itself and any related modules. It does not pertain to what your function can access when its code executes.
When your function executes, it will need to use the Azure Blob Storage SDK/modules to connect to your blob storage account and read the e-mail attachments. Here's a quickstart guide for using Azure Storage with Python: Quickstart with Azure Storage Blobs SDK for Python
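As a minimal sketch of that read/re-save flow (the connection string, container and blob names below are placeholder assumptions), using azure-storage-blob:

```python
from azure.storage.blob import BlobServiceClient

# Placeholder connection string and names, for illustration only.
service = BlobServiceClient.from_connection_string("<attachments-account-connection-string>")
blob = service.get_blob_client(container="email-attachments", blob="message-001.eml")

data = blob.download_blob().readall()        # read the attachment
processed = data                             # ... run your scripts here ...
blob.upload_blob(processed, overwrite=True)  # re-save the file
```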
General-purpose v2 storage accounts support the latest Azure Storage features and incorporate all of the functionality of general-purpose v1 and Blob storage accounts (see here).
There are more integration options with GPv2 accounts including Azure Function Triggers. See: Azure Blob storage bindings for Azure Functions
Further refer: Types of storage accounts
If you use a Blob storage account, you can choose an access tier based on how frequently the data (the e-mail attachments) is accessed; see Access tiers for Azure Blob Storage - hot, cool, and archive. If you use a general-purpose storage account, it uses the standard performance tier.
I have created a blob storage container in Azure, and I want to share this container with my colleagues, who may or may not have an Azure account, so they can upload files to this blob storage without being given access to the resource group or storage account. Is this possible with blob storage, or is there another alternative where they can drop their files from time to time and I can access them from my Azure pipeline? I have tried a Shared Access Signature, but it is not working; maybe I don't know enough about it yet. I created a SAS URL by right-clicking the container and shared the link with my colleagues.
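For context on what such a link needs: a container SAS has to carry write (and ideally list) permissions for colleagues to be able to upload. A minimal sketch with the Python SDK, where the account, key and container names are placeholders:

```python
from datetime import datetime, timedelta, timezone
from azure.storage.blob import generate_container_sas, ContainerSasPermissions

# Placeholder names; write lets colleagues upload, list lets them see existing files.
sas = generate_container_sas(
    account_name="mystorageaccount",
    container_name="dropbox",
    account_key="<account-key>",
    permission=ContainerSasPermissions(write=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(days=7),
)
upload_url = f"https://mystorageaccount.blob.core.windows.net/dropbox?{sas}"
```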
I've developed an application where users can upload their files and share them with each other. The files are private files for each user, plus public files like profile pictures. I'm storing the files in Azure File Storage.
Assume that I have a method to retrieve a file by its id. I've implemented the permission checks in the file access methods in the WebApi controllers.
Is Azure File Storage is proper storage type for this scenario?
What is the best way to retrieve the files from Azure Storage? Should I read the files server-side (using the Azure .NET SDK) and stream them to the clients? Is there any way to avoid streaming the file through WebApi so clients can access the file directly from Azure File Storage (while still respecting the permissions)?
Thanks
Azure File Storage exists mainly to allow lift and shift of legacy applications to the cloud.
I would recommend using Blob Storage combined with SAS tokens for your problem. Using SAS tokens you can control access permissions at the blob level, and this avoids the need to get your files onto the web server first before relaying them to the end users.
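A minimal sketch of that pattern (shown with the Python SDK since the other examples here use Python; account, container and blob names are placeholders): the API checks the user's permission, issues a short-lived read SAS for the specific blob, and returns the URL so the client downloads directly from storage.

```python
from datetime import datetime, timedelta, timezone
from azure.storage.blob import generate_blob_sas, BlobSasPermissions

# Placeholder names; call this only after your own permission check has passed.
sas = generate_blob_sas(
    account_name="myfilesaccount",
    container_name="user-files",
    blob_name="user-42/report.pdf",
    account_key="<account-key>",
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(minutes=15),
)
download_url = (
    "https://myfilesaccount.blob.core.windows.net/user-files/user-42/report.pdf?" + sas
)
# Return download_url to the client; the file is then served by Blob Storage, not your WebApi.
```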
SAS and REST access are supported in both Azure Files and Azure Blob Storage. Azure Files supports other key scenarios besides lift & shift. Although a bit old, this article (https://blogs.msdn.microsoft.com/windowsazurestorage/2014/05/12/introducing-microsoft-azure-file-service/) explains the differences between Azure Blobs, Azure Files and Azure Disks. You should also consider factors such as the size limit of your share, folder structure, the ability to natively mount to a VM, maximum file/object size, throughput requirements, pricing, SMB/REST support, etc. If you still have questions, please send an email to azurefiles AT microsoft.com and we will be happy to review your scenario and recommend the option that suits your usage.
Aung
As I know, you can use Azure Blob Storage to store your files. Azure Blob Storage containers provide three access levels: full public read access, public read access for blobs only, and no public read access. Refer to this article for more details. For your scenario, please save private files in a container with "no public read access" and public files in a container with "public read access for blobs only", so that users cannot access your private files but can read your public files. If you want to share private files with others, please try SAS as LoekD mentioned. If you want to be able to expire the SAS token from the server side, please try using a stored access policy (SAS policy). Read this article for more details.
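A minimal sketch of that revocation pattern with the Python SDK (the connection string, container name and policy name are placeholder assumptions): the SAS references a stored access policy on the container, so removing or changing the policy later invalidates every SAS issued against it.

```python
from datetime import datetime, timedelta, timezone
from azure.storage.blob import (
    AccessPolicy, ContainerClient, ContainerSasPermissions, generate_container_sas
)

# Placeholder connection string and container name.
container = ContainerClient.from_connection_string("<connection-string>", "private-files")

# Stored access policy: the expiry lives on the policy, not on the SAS itself.
policy = AccessPolicy(
    permission=ContainerSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(days=7),
)
container.set_container_access_policy(signed_identifiers={"read-share": policy})

# SAS bound to the policy; deleting the "read-share" policy revokes this SAS.
sas = generate_container_sas(
    account_name=container.account_name,
    container_name=container.container_name,
    account_key="<account-key>",
    policy_id="read-share",
)
```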
My question is about Microsoft Azure Blob Storage.
Can I manage blob snapshots from the user console (Azure portal)? Take a snapshot, delete, recover, etc.
Thanks
No, currently the portal does not offer this functionality. You may want to check out the Azure storage explorers available on the market; most of them support managing blob snapshots.
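For reference, those tools drive the same snapshot operations that the storage SDK exposes; a minimal sketch with the Python SDK (connection string, container and blob names are placeholders):

```python
from azure.storage.blob import BlobServiceClient

# Placeholder connection string and names.
service = BlobServiceClient.from_connection_string("<connection-string>")
blob = service.get_blob_client("my-container", "data.csv")

snapshot = blob.create_snapshot()                    # take a snapshot
snapshot_blob = service.get_blob_client(
    "my-container", "data.csv", snapshot=snapshot["snapshot"]
)
content = snapshot_blob.download_blob().readall()    # "recover" by reading the snapshot
blob.delete_blob(delete_snapshots="only")            # delete snapshots, keep the base blob
```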
How do I build a rich storage ACL policy system with Azure storage?
I want to have a blob container that has the following users:
public - read-only against some set of blobs
Uploader - read-write against some subset of blob names, these keys are shared out to semi-trusted build machines
shared admin - full capabilities against this blob subset
Ideally these users are accounts driven through Azure AD, so I can use the full directory service power with them... :)
My understanding of shared access keys is that they are (1) time-limited and (2) have to be created with hand-tooled code. My desire is that I can do something similar to AWS IAM policies on S3... :-)
Something like AWS IAM policies for S3 does not exist for Azure Blob Storage today. Azure recently introduced Role-Based Access Control (RBAC), and it is available for Azure Storage, but it is limited to management activities only, like creating storage accounts. It is not yet available for data management activities like uploading blobs.
You may want to look at Azure Rights Management Service (Azure RMS) and see if it is the right solution for your needs. If you search for Azure RMS Blob you will find one of the search results linking to a PDF file that talks about securing blob storage with this service (the link directly downloads the PDF file and hence I could not include it here).
If you're looking for a 3rd-party service to do this, do take a look at the "Team Edition" of Cloud Portam (a service I am currently building). We recently released the Team Edition. In short, Cloud Portam is a browser-based Azure explorer that supports managing Azure Storage, Search Service and DocumentDB accounts. The Team Edition uses your Azure AD for user authentication, and you can grant permissions (None, Read-Only, Read-Write and Read-Write-Delete) on the Azure resources you manage through the application.
Paul,
While Gaurav is correct that Azure Storage does not have AD integration today, I wanted to point out a couple of things about shared access signatures from your post:
My understanding of shared access keys is that they are (1) time-limited and (2) have to be created with hand-tooled code
1) A SAS token/URI does not need to have an expiry date on it (the field is optional when the SAS references a stored access policy that defines the expiry), so in that sense it is not time-limited and need not be regenerated unless you change the shared key with which you generated the token.
2) You can use PowerShell cmdlets to create one, for example: https://msdn.microsoft.com/en-us/library/dn806416.aspx. Some storage explorers also support creating SAS tokens/URIs without you having to write code for it.