Is there a way to tell which user uploaded a file to an Azure blob container? Do you have to add it to the metadata manually?
The comment is basically correct, but the log is not in the Activity log. If you use the storage account key to upload the blob, there is no way to know who uploaded the file, so in that case you could add the user manually to the metadata, as you mentioned.
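If you do record the uploader yourself, blob metadata travels as x-ms-meta-* headers on the Put Blob request. A minimal sketch of the headers involved (the "uploadedby" key name is just an example, not anything Azure defines):

```python
def put_blob_headers(uploaded_by: str, content_length: int) -> dict:
    """Headers for a Put Blob REST call that record who uploaded the file.

    Blob metadata is sent as x-ms-meta-<name> headers and can be read back
    later with Get Blob Properties. The "uploadedby" key is an arbitrary
    example name.
    """
    return {
        "x-ms-blob-type": "BlockBlob",
        "x-ms-version": "2021-08-06",   # example service version
        "Content-Length": str(content_length),
        "x-ms-meta-uploadedby": uploaded_by,  # custom metadata
    }

headers = put_blob_headers("alice@contoso.com", 1024)
```

Your middleware would fill in `uploaded_by` from whatever identity it authenticated, since the storage service itself won't record it for shared-key uploads.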
If you upload the blob via Azure AD auth (e.g. use an AAD auth flow to get a token, then use the token to call the REST API to upload the blob; most other upload methods essentially do this under the hood), then you can use Azure Storage analytics logging. Follow this to configure it, selecting logging version 2.0.
After configuring it, if you upload a blob via AAD auth you can find the log in the container named $logs. In the log there is a UserPrincipalName field, which identifies the user.
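Each line in $logs is a semicolon-delimited record. Rather than hard-coding the exact 2.0 column order, this sketch just pulls out any field that looks like a user principal name; the sample line is invented and heavily truncated, so treat this as a heuristic, not a full log parser:

```python
import re

# Matches a bare UPN-shaped value such as alice@contoso.com
UPN_RE = re.compile(r'^[^";@\s]+@[^";@\s]+\.[^";@\s]+$')

def extract_upns(log_line: str) -> list:
    """Pull UPN-looking fields out of a semicolon-delimited analytics log line.

    This avoids depending on the exact 2.0 column positions; a real parser
    would map fields by index per the Storage Analytics log format docs.
    """
    fields = (f.strip('"') for f in log_line.split(";"))
    return [f for f in fields if UPN_RE.match(f)]

# Invented, truncated sample line for illustration only:
sample = '2.0;2020-01-01T00:00:00.000Z;GetBlob;OAuthSuccess;200;"alice@contoso.com";...'
```

Shared-key requests simply won't carry a UserPrincipalName, which is why the heuristic returns nothing for them.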
Related
I want to authorize a backend (a Web API in Azure Active Directory) to distribute SAS tokens for uploading into Blob Storage.
It would be perfect if a SAS could be issued for only a single blob upload. I assume the client will send a request with the file name to upload, and the backend will return a SAS that can only be used to upload one blob with that file name.
I found the Storage Blob Delegator role, but as far as I know, roles can only be assigned to a user account (a person).
How should the backend be authorized in this case?
Shared key
Fake account
Long SAS
Or is there another way to solve this problem? I have checked a lot of articles in the docs, but all of them solve slightly different problems.
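For what it's worth, the token shape the question describes (a short-lived SAS scoped to one blob with create/write only) looks like this as a URL; the signature itself is assumed to be computed elsewhere, e.g. by an SDK helper, with whichever credential the backend ends up using:

```python
from datetime import datetime, timedelta, timezone
from urllib.parse import urlencode

def build_upload_sas_url(account, container, blob, signature, minutes=10):
    """URL shape of a single-blob, write-only service SAS (sketch only).

    sr=b scopes the token to one blob and sp=cw limits it to create/write,
    so the caller can upload that blob and nothing else. `signature` is a
    placeholder for the real HMAC computed with the storage credentials.
    """
    expiry = (datetime.now(timezone.utc)
              + timedelta(minutes=minutes)).strftime("%Y-%m-%dT%H:%M:%SZ")
    params = {
        "sv": "2021-08-06",  # service version (example value)
        "sr": "b",           # resource type: a single blob
        "sp": "cw",          # permissions: create + write only
        "se": expiry,        # expiry keeps the token short-lived
        "sig": signature,
    }
    return f"https://{account}.blob.core.windows.net/{container}/{blob}?{urlencode(params)}"

url = build_upload_sas_url("myaccount", "uploads", "photo.png", "FAKESIG")
```

A short expiry plus blob-level scope gets most of the way to "one upload only"; the storage service does not count uses, so the time window is what actually bounds the token.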
I would like to check my understanding of Azure SAS tokens.
We can access Blob Storage using a SAS token instead of Azure AD authentication. Does this mean that a person who does not have an Azure account can access Blob Storage?
Or is it only a person who has an Azure account who can use a SAS token to access Blob Storage?
Yes, a person or script that has a SAS token can access Blob Storage, according to the permissions set in the token. That person or script does not need an Azure account. Of course, they would not be able to use the Azure portal to see the blob container, but they can access the storage account programmatically using the Azure API. They can also fetch blobs using HTTP GET requests.
As an example, I have a build script that pushes to storage and a deploy script to read from storage. These scripts contain the access token so they can run from any machine.
If I wanted to revoke the privileges of that access token, I would need to regenerate the key I used to generate the token.
I have a PHP (CodeIgniter) application that I want to migrate from AWS S3 to Blob Storage. The application uploads all media files to an S3 bucket, and S3 generates a link, stored in the database, from which the media file can be accessed. I want to do the same with Azure Blob Storage. I'm facing technical hindrances because I can't find the right resources (libraries/code samples) to achieve this goal. I tried the Azure PHP SDK, but it didn't work out.
Actually, there is a detailed sample for using the Azure Storage PHP SDK. You may refer to: https://github.com/Azure/azure-storage-php/blob/master/samples/BlobSamples.php
To run that sample, you just need to replace the following placeholders with your own values:
$connectionString = 'DefaultEndpointsProtocol=https;AccountName=<yourAccount>;AccountKey=<yourKey>';
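The connection string is just semicolon-separated key=value pairs; a quick sketch of pulling it apart (shown in Python for brevity, but the structure is the same one the PHP SDK parses):

```python
def parse_connection_string(cs: str) -> dict:
    """Split an Azure Storage connection string into its key=value parts.

    splitting on the first '=' only matters because base64 account keys
    routinely end in '=' padding.
    """
    return dict(part.split("=", 1) for part in cs.split(";") if part)

# Illustrative values, not real credentials:
parts = parse_connection_string(
    "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=abc123=="
)
```

The AccountName/AccountKey pair is what the SDK uses to sign every request, which is why the connection string must stay server-side.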
Suggestion:
I see that you want to generate an access URL and store it in the database. I am not familiar with AWS S3, but with Azure Storage you may need to set the public access level on the container or blob. Otherwise, you cannot access the blob directly; you would need to create a SAS token.
I need to figure out how to log/retrieve information about who (which Azure AD user) has read or written blobs in our Azure blob storage.
I know you can turn on logging at the storage account level using this:
I can see in the logs the different API calls that have been performed on the blobs, but when I opened some of the blobs myself via the Azure portal, I could not see that activity recorded in the logs. Any ideas on how to monitor this? I need it for auditing purposes.
When you enable Storage Analytics in the portal, you will have a $logs container on your Blob service containing the storage logs.
When you are using Azure AD authentication, you need to configure version 2.0 logs. Use the UserPrincipalName column to identify the user, and parse the JSON AuthorizationDetail column's action to identify what the user did on storage, e.g. Microsoft.Storage/storageAccounts/blobServices/containers/read for listing the blobs in a container.
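Assuming the AuthorizationDetail column carries a JSON array of entries with an "action" key (the structure here is inferred from the example above; verify it against a real 2.0 line from your own $logs), extracting the actions could look like:

```python
import json

def actions_from_detail(detail_json: str) -> list:
    """Pull the RBAC action strings out of an AuthorizationDetail JSON field.

    The schema assumed here (a JSON array of objects with an "action" key)
    is illustrative; check a real 2.0 log line before relying on it.
    """
    entries = json.loads(detail_json)
    return [e["action"] for e in entries if "action" in e]

# Hypothetical field value shaped like the example in the answer:
sample = '[{"action": "Microsoft.Storage/storageAccounts/blobServices/containers/read"}]'
```

Pairing this with the UserPrincipalName column gives you a who-did-what record per request, which is the audit trail the question asks for.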
You will not capture OAuth/Azure AD authenticated requests with log format 1.0.
On Azure Storage UserVoice there is also a request for integration with Log Analytics to simplify log monitoring; the private preview should start this month.
We are building an app that allows users to upload their own files (ex. images) in Azure Storage.
We are using Azure Storage and plan to use containers to separate the content of each user.
I am a bit lost on the security part. What would be the best way to secure each user's container? For example, would each container have a different key?
And if I want to display an image, do I point directly to the Azure Storage URL, or do I need a middle API service that gets it from Azure Storage and returns it?
You have a key per storage account, not per container. You basically have two options, and each requires a middleware:
1: Upload the files through your middleware. The client sends the files to your middleware, which knows the storage account credentials and stores each file in the desired container.
2: Direct upload to Azure Storage. The second option is to upload the files directly to the storage account. Since you don't want to expose storage account credentials to your clients, you will need some kind of middleware that gives your app a temporary SAS token allowing it to upload the requested file (known as the Valet Key pattern). Further information: File upload in Cloud Applications: The Options.
If you want to display the images, you can point directly to the Azure Storage URL (if you want them to be publicly readable), or you can again return URLs with a temporary SAS token for each authorized user.
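Option 2, the Valet Key pattern, can be sketched as one small middleware function: authenticate the caller, then hand back a short-lived upload URL scoped to that user's container. The `sign` callable and the account name are placeholders standing in for real SAS signing (e.g. an SDK helper) so that the flow itself is the point here:

```python
from datetime import datetime, timedelta, timezone

def issue_upload_url(user, filename, sign, account="myaccount", ttl_minutes=15):
    """Valet-key sketch: return a short-lived upload URL for one blob.

    `sign` is injected and stands in for real SAS signing with the storage
    account credentials; `account` and the container naming scheme are
    illustrative assumptions.
    """
    if not user:
        raise PermissionError("caller must be authenticated first")
    container = f"user-{user}"  # one container per user, as in the question
    expiry = datetime.now(timezone.utc) + timedelta(minutes=ttl_minutes)
    token = sign(container=container, blob=filename,
                 permissions="cw", expiry=expiry)
    return f"https://{account}.blob.core.windows.net/{container}/{filename}?{token}"

# Fake signer, just to show the flow end to end:
url = issue_upload_url("42", "cat.png", lambda **kw: "sig=FAKE")
```

The client then PUTs the file straight to that URL, so the large payload never passes through your middleware.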
In your case, the best way, I think, is to authorize access to the storage with Azure AD and assign RBAC roles to the different users at the container level. Then they will only be able to access their own containers; see this link.
To let users upload files from an app via Azure AD auth, you could refer to this doc: Authorize access to blobs and queues with Azure Active Directory from a client application.
To display the image, just click the ... next to your image (blob) in the portal -> Generate SAS -> Generate blob SAS token and URL -> copy the Blob SAS URL; you can access it directly in the browser.