Event Grid dead letter container security - Azure

I currently have several Event Subscriptions configured on an Event Grid Domain Topic, some of which have dead lettering configured to various Storage Account Blob Containers. At the moment this Storage Account allows public access on the Access Control List.
I am looking to improve our security posture on this Storage Account and was wondering if changing this to "Enabled from selected virtual networks and IP addresses" and selecting the "Allow Azure services on the trusted services list to access this storage account" exception will allow Event Grid to write to the Storage Account Blob containers.
The "Trusted access for resources registered in your subscription" section of the documentation does not make it clear to me whether publishing to Blob containers is a supported scenario. Has anyone configured Storage Account network access rules to support this scenario?

TLDR
Securing the storage account by selecting "Enabled from selected virtual networks and IP addresses" on the Networking tab of the Storage Account restricts all access from public networks. The important part of this process is to enable the "Allow Azure services on the trusted services list to access this storage account" network exception rule so that Event Grid can still communicate with the account.
I will close this ticket and raise another story/task to action the change in the normal release cycle.
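For reference, the same change can be scripted rather than clicked. Below is a minimal sketch using the azure-mgmt-storage SDK (subscription, resource group and account names are placeholders, not our real ones): it sets the default network action to Deny and keeps the trusted-services bypass so Event Grid can still write dead-lettered events.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import NetworkRuleSet, StorageAccountUpdateParameters

# Placeholder names for illustration only.
subscription_id = "<subscription-id>"
resource_group = "rg-eventgrid-deadletter"
account_name = "stdeadletterdemo"

client = StorageManagementClient(DefaultAzureCredential(), subscription_id)

# Deny public access by default, but keep the bypass for services on the
# Azure trusted services list (which includes Event Grid) so dead-lettered
# events can still be written to the blob container.
client.storage_accounts.update(
    resource_group,
    account_name,
    StorageAccountUpdateParameters(
        network_rule_set=NetworkRuleSet(default_action="Deny", bypass="AzureServices")
    ),
)
```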
The long explanation
I created a new resource group containing a new Event Grid Domain, Topic and Subscription, with a Function App as the subscription endpoint. I used Postman on my machine to publish events to this topic. A publicly accessible storage account was created with a blob container for the dead letter queue.
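If you would rather script the publishing step than use Postman, a rough equivalent with the azure-eventgrid Python SDK looks like this (the endpoint, access key and topic name are placeholders):

```python
from azure.core.credentials import AzureKeyCredential
from azure.eventgrid import EventGridEvent, EventGridPublisherClient

# Placeholder domain endpoint, key and topic name.
endpoint = "https://<your-domain>.<region>-1.eventgrid.azure.net/api/events"
client = EventGridPublisherClient(endpoint, AzureKeyCredential("<domain-access-key>"))

client.send(
    EventGridEvent(
        topic="demo-topic",          # the domain topic the subscription is attached to
        subject="deadletter/test",
        event_type="Demo.TestEvent",
        data={"message": "hello"},
        data_version="1.0",
    )
)
```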
Round 1
With all services publicly accessible, I published several messages to the topic. From the Function App logs I could see these messages were reaching the target successfully. Checking the dead letter container after 5 minutes resulted in no entries. Checking the Event Grid subscription showed successful delivery.
Round 2
I stopped the subscription endpoint Function App and published 10 events; the storage account was still publicly accessible.
Event Grid showed the matched subscriptions and failed deliveries. After 5 minutes Event Grid showed the dead lettered events. Checking the storage account, I could see the dead letter blob container contained a date-based folder structure and JSON files containing the failed messages.
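Checking the container can also be done programmatically; a minimal sketch with azure-storage-blob (the account and container names are placeholders):

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# Placeholder account and container names.
service = BlobServiceClient(
    "https://stdeadletterdemo.blob.core.windows.net",
    credential=DefaultAzureCredential(),
)
container = service.get_container_client("deadletter")

# Dead-lettered events are written under a date-based folder structure,
# e.g. <topic>/<subscription>/<yyyy>/<mm>/<dd>/<hh>/<guid>.json
for blob in container.list_blobs():
    print(blob.name, blob.last_modified)
```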
Round 3
I enabled the network restrictions and the "Allow Azure services on the trusted services list to access this storage account" exception on the storage account.
I published 10 events to the Event Grid topic, which showed the matched subscriptions and failed deliveries. After 5 minutes Event Grid showed the dead lettered events.
Checking the storage account, new dead letter folders and JSON files were present.
Round 4
As these network changes can sometimes take several minutes to fully propagate, I reran the same test over the next hour and again after lunch.
This again successfully created new folders and JSON files for the events that failed delivery.
The why?
Why did we need to do the test this way? Well, the documentation is unclear on what the Azure trusted services list encompasses; specifically, the Event Grid entry reads "Enable Blob Storage event publishing and allow Event Grid to publish to storage queues. Learn about blob storage events and publishing to queues."
This leaves it unclear, to me anyway, whether publishing to dead letter blob containers is covered by this functionality.
I will raise a change with MS to update the documentation.

Related

Azure Storage Account Event subscription: Event not getting delivered to subscription

The requirement is that when a file gets uploaded to a storage account container, logic should run to encrypt the file and place it in another container, without affecting the source file. For this I set up an event subscription on the storage account, which publishes events to an Event Grid system topic, which in turn triggers an Azure Function. Below is a screenshot of the event subscription setup.
The issue is that the event gets published to the Event Grid system topic but is not delivered to the subscription. I set up diagnostic settings for the Event Grid system topic. Below is the error that I found in the logs. Can anyone tell me what I am doing wrong?
outcome=Forbidden,deliveryResponse=Forbidden, errorCode=Forbidden, HttpRequestMessage: httpVersion=1.1, HttpResponseMessage: HttpVersion=1.1, StatusCode=Forbidden(Forbidden), StatusDescription=Ip Forbidden, ConnectionInfo=defaultConnectionLimit=1024, reusePortSupported=True, reusePort=True,
On the Azure Function App, add an inbound traffic rule under the Networking option to allow traffic from Azure Event Grid (the AzureEventGrid service tag).
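If you prefer to script it rather than use the portal, something like the following azure-mgmt-web sketch should add that rule (subscription, resource group and app names are placeholders, and the exact rule shape is an assumption, not a verified configuration):

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.web import WebSiteManagementClient
from azure.mgmt.web.models import IpSecurityRestriction

# Placeholder names for illustration only.
subscription_id = "<subscription-id>"
resource_group = "rg-functions"
app_name = "func-encrypt-blobs"

client = WebSiteManagementClient(DefaultAzureCredential(), subscription_id)

config = client.web_apps.get_configuration(resource_group, app_name)
config.ip_security_restrictions = (config.ip_security_restrictions or []) + [
    IpSecurityRestriction(
        name="AllowEventGrid",
        action="Allow",
        priority=100,
        ip_address="AzureEventGrid",  # service tag name goes in ip_address
        tag="ServiceTag",
    )
]
client.web_apps.update_configuration(resource_group, app_name, config)
```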

Getting Azure Blob Store Eventgrid Notifications from another Azure Account

We work with another company that has just proposed to provide us with an Azure Blob Storage SAS token. But we would like to use the events triggered by the Blob store and provided by the Azure Event Grid system.
Is this possible?
The reason we have to do this on our Azure account is billing: we need the events and they don't, so we would have to pay for them through our account.
I hope someone can lead me in the right direction.
Azure Blob Storage as an Event Grid source works for Microsoft.Storage.BlobCreated and Microsoft.Storage.BlobDeleted, and when an event is triggered, the Event Grid service sends data about that event to the subscribing endpoint. Those Event Grid subscriptions exist in the same Azure subscription as the resources.
Webhook event delivery is one of the many ways to receive events from Azure Event Grid, and the webhook endpoint is something you can host in your own Azure subscription (or even outside of Azure).
Given the price per million operations is only $0.60 on the Event Grid Basic tier, receiving the events via a webhook you host means the costs you take on are negligible.
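To illustrate the webhook handler side, here is a minimal sketch of an endpoint (Flask, purely as an example) that completes the Event Grid subscription validation handshake and then receives BlobCreated events; the route name and field handling are my own assumptions, not the only way to do it:

```python
# Minimal Flask webhook: completes the Event Grid subscription validation
# handshake, then logs BlobCreated notifications. Route name is arbitrary.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/events", methods=["POST"])
def events():
    for event in request.get_json():
        if event.get("eventType") == "Microsoft.EventGrid.SubscriptionValidationEvent":
            # Echo the validation code back to prove we own this endpoint.
            return jsonify({"validationResponse": event["data"]["validationCode"]})
        if event.get("eventType") == "Microsoft.Storage.BlobCreated":
            print("Blob created:", event["data"]["url"])
    return "", 200
```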

No Event Grid events triggering when uploading files to Azure Blob Storage -- why?

I set up a simple scenario in Azure using a Storage Account, a Function App, and an Event Grid System Trigger. Blob uploads into the Storage Account should cause the Event Grid System Trigger to send a BlobCreated event to trigger the Azure Function.
I can see that the Event Grid System Topic appears to be configured for the correct storage account according to the overview page in the Azure Portal:
I have a subscription created for the Event Grid System Topic, and it subscribes to all of the events the storage account can generate as I can see in the Azure Portal. This shows all 6 event types enabled, so I'm not filtering them out.
Despite this, when I upload blobs into a container I created in my storage account and watch for the events to show up in the metrics on my Event Grid System Topic, or see my Azure Function trigger, no events appear to ever be generated. Some interesting points about my storage account which may be worth mentioning are:
I am using a premium storage account
I am using a private vnet for my storage account
I suspected the network, but to rule that out I changed my storage account back to public and tried again but it didn't change the behavior. From everything I can tell from documentation, this should be working. Any ideas why it isn't?
I work at MS in the SDK team, and I reached out to an Event Grid team member directly for their opinion:
I looked into our service logs for last two weeks and I could not find
any events for this topic/event-subscription.
Can you please provide specific time and region when you are
uploading/deleting/editing the blobs to help investigating? Also, is
this specific to this storage account? Was this working before or this
scenario working for other storage accounts? Can you please open a
support ticket to handle this properly.
Thanks! If in any doubt about the process, feel free to reply to me; we'll monitor this thread.
[Edit: more info from Storage team]
We communicated with Azure Storage team and they confirmed that the behavior as described is by design and expected. Here are some additional details from Azure Storage Team:
The issue is that the customer is using a Premium_LRS StorageV2
account. These accounts only support premium page blobs and premium
disks.
If the customer wants to store block blobs in the premium tier, they
need to create a BlockBlobStorage account.
See subscript 5 in this table:
https://learn.microsoft.com/en-us/azure/storage/common/storage-account-overview
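To illustrate the fix, here is a rough sketch of creating a BlockBlobStorage account with azure-mgmt-storage (the resource group, account name and region are placeholders):

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import Sku, StorageAccountCreateParameters

# Placeholder subscription, resource group, account name and region.
client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

# A BlockBlobStorage account stores premium block blobs and raises
# BlobCreated/BlobDeleted events; a Premium_LRS StorageV2 account only
# supports premium page blobs and disks.
poller = client.storage_accounts.begin_create(
    "rg-events-demo",
    "stpremiumblockblob",
    StorageAccountCreateParameters(
        sku=Sku(name="Premium_LRS"),
        kind="BlockBlobStorage",
        location="westeurope",
    ),
)
print(poller.result().provisioning_state)
```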

Send Azure Blob Storage event notifications to Event Hubs on another account

In Azure, I'm trying to send event notifications from a Storage Account in one Active Directory to an Event Hub in another Active Directory.
I'm having trouble figuring out how to share/link the resource.
In AWS, I was able to accomplish this by creating a role in the receiver account, adding the source account by ID, adding the SQS writer resource permission, and adding the SQS queue ARN as the bucket notification destination. I'm guessing something similar is possible in Azure.
At the moment, I am looking at Active Directory IAM, which appears to have the EventGrid EventSubscription Contributor role. In the destination account I have added the source account as a contributor, and I received a notification in the source account that I had permissions in the destination account, but when I try to create an event subscription in the source account, the Event Hubs in the destination account don't show up as an option.
How can I write event notifications to Event Hubs in one account from a Storage Account in another?
Absolutely yes. I think there are many ways to do that across different subscriptions, such as the two below.
Solution 1: use Azure Functions. You can use an Azure Function with a Blob Trigger to get the event notifications of blob changes, and then call another Azure Function with an HttpTrigger via PUT/POST to transfer the event message with the blob information, such as a blob URL with a SAS token, for access from the other subscription.
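A rough sketch of the first function in that chain, using the Azure Functions Python v2 programming model (the container path, target URL and payload shape are placeholders/assumptions):

```python
import azure.functions as func
import requests  # assumed to be listed in requirements.txt

app = func.FunctionApp()

# Placeholder container path and receiving endpoint in the other subscription.
@app.blob_trigger(arg_name="blob", path="incoming/{name}",
                  connection="AzureWebJobsStorage")
def forward_blob_event(blob: func.InputStream):
    # Forward minimal blob metadata to the HttpTrigger function on the other side.
    requests.post(
        "https://receiver-func.azurewebsites.net/api/notify",
        json={"uri": blob.uri, "name": blob.name, "length": blob.length},
        timeout=10,
    )
```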
Solution 2: use Azure Logic Apps. You can use the logic flow below to get the blob change events and send the notification message to the Event Hub in the other subscription, because Azure Logic Apps lets you enter the connection information manually, as shown below.
Fig 1. The logic flow to get events from Blob Storage and send to EventHub
Fig 2. Click the Manually enter connection information to configure for a service in other subscriptions.
Basically, the Azure Event Grid Pub/Sub model supports two ways of delivering events across a multi-tenant environment:
Tightly coupled delivery of the event messages to the subscriber resource, based on RBAC. At the subscriber (destination) resource, you can
add a built-in role assignment such as EventGrid EventSubscription Contributor for an Azure AD user, etc. (a scripted sketch follows below),
or add a co-administrator at the Azure subscription level.
The following screen snippet shows an example of the case where I am a co-administrator of two Azure subscriptions, Stage and Development.
Creating an event subscription for the event-driven blob storage topic in the AEG provider in the Stage Azure account, and delivering its notification events across the account boundary to a subscriber such as an Event Hub located in the Development Azure account, is straightforward:
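For the RBAC option mentioned above, a rough sketch of assigning the built-in role with azure-mgmt-authorization (the scope, subscription IDs and principal object ID are placeholders, and the principal must already exist, e.g. as a guest user, in the destination directory):

```python
import uuid
from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

# Placeholder IDs and scope (the Event Hub namespace in the destination subscription).
destination_subscription = "<destination-subscription-id>"
scope = (f"/subscriptions/{destination_subscription}/resourceGroups/rg-dev"
         "/providers/Microsoft.EventHub/namespaces/ehns-dev")

client = AuthorizationManagementClient(DefaultAzureCredential(), destination_subscription)

# Look up the built-in role definition by name.
role = next(client.role_definitions.list(
    scope, filter="roleName eq 'EventGrid EventSubscription Contributor'"))

client.role_assignments.create(
    scope,
    str(uuid.uuid4()),
    RoleAssignmentCreateParameters(
        role_definition_id=role.id,
        principal_id="<object-id-of-the-user-or-app>",  # must exist in the destination directory
    ),
)
```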
Loosely decoupled delivery of the event messages to the subscribers across the multi-tenant boundary, based on a WebHook event handler endpoint. For Pub/Sub integration across the tenant boundary, an EventGridTrigger function with an output binding to the Event Hub resource can be used. The following screen snippet shows this example:
The above solution is very straightforward, with the capability to mediate (pre-process) an event message on its way to the Event Hub resource.
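A minimal sketch of such a relay function using the Azure Functions Python v2 programming model (the event hub name, connection setting name and forwarded payload shape are assumptions):

```python
import json
import azure.functions as func

app = func.FunctionApp()

# Placeholder event hub name; "EventHubConnection" is an app setting holding
# the connection string of the Event Hub in the other subscription/tenant.
@app.event_grid_trigger(arg_name="event")
@app.event_hub_output(arg_name="out", event_hub_name="blob-events",
                      connection="EventHubConnection")
def relay_event(event: func.EventGridEvent, out: func.Out[str]):
    # Optionally mediate/pre-process the event before forwarding it.
    out.set(json.dumps({
        "id": event.id,
        "subject": event.subject,
        "eventType": event.event_type,
        "data": event.get_json(),
    }))
```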
To distribute the events to further subscribers in a fan-out manner, Azure Event Grids can be cascaded, as shown in the following screen snippet:
In the above solution, each tenant has its own Azure Event Grid provider, and they are cascaded via the "plumbing" WebHook event handler endpoint and custom topic endpoint.
More details about the AEG cascading implementation can be found here.

How to track user activity, like who is creating which resources, in Azure for a specific subscription?

In my company we have one Azure subscription, and there are two or three users who are added to the same subscription and have the right to create any resource in Azure.
Since these users are working on the same subscription and are independently creating resources, I want to keep track of, or see, which user created which resource in the subscription.
Please let me know if there is any way to see this tracking/activity detail per user.
Currently all users have administrator role/permission.
You are looking for the Activity Log:
The Azure Activity Log is a log that provides insight into the
operations that were performed on resources in your subscription
The Activity Log provides customers with a Portal and REST API experience to see who performed which management operations (PUT/DELETE/POST) through Azure Resource Manager (ARM) for the past 90 days.
For anything older than 90 days, you have the option to archive the data to a storage account or stream the data to an Event Hub if you would like to ingest it into your own system.
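As an example of the REST/SDK route, here's a rough sketch querying the Activity Log with the azure-mgmt-monitor package (the subscription ID and time window are placeholders):

```python
from datetime import datetime, timedelta, timezone
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient

client = MonitorManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Who did what over the last 7 days (Activity Log retention is 90 days).
start = (datetime.now(timezone.utc) - timedelta(days=7)).isoformat()
for entry in client.activity_logs.list(
    filter=f"eventTimestamp ge '{start}'",
    select="caller,operationName,resourceId,eventTimestamp",
):
    op = entry.operation_name.value if entry.operation_name else ""
    print(entry.event_timestamp, entry.caller, op, entry.resource_id)
```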
The Activity Log data is also available through the Operations Management Suite.
http://www.deployazure.com/management/operations-management-suite/azure-activity-log-analytics-alerts-with-operations-management-suite/
