Azure Event Grid: Deliver events if blob contains metadata

I created an event subscription for BlobCreated event. I'm using Azure Functions with an EventGridTrigger to receive the events. Right now, events are firing every time a new blob is created. Is it possible to create an advanced filter in the event subscription so that the events are delivered only when a blob contains metadata?

No, this is not supported by Event Grid advanced filtering; see the event schema for Azure Blob Storage as an Event Grid source here.
As you can see in that document, the blob's metadata is not part of the event data object.
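Since the metadata isn't in the event payload, a common workaround is to filter inside the function itself: look up the blob's properties when the event arrives and ignore blobs without metadata. Below is a minimal sketch, assuming a Python Azure Function with an EventGridTrigger, the azure-storage-blob and azure-identity packages, and an identity that can read the blob (none of these details come from the original question):

```python
import azure.functions as func
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobClient


def main(event: func.EventGridEvent):
    # The BlobCreated event data carries the blob URL
    blob_url = event.get_json()["url"]

    # Look up the blob's properties to inspect its metadata
    # (assumes the function's identity has read access, e.g. Storage Blob Data Reader)
    blob = BlobClient.from_blob_url(blob_url, credential=DefaultAzureCredential())
    metadata = blob.get_blob_properties().metadata

    if not metadata:
        # No metadata on this blob -> ignore the event
        return

    # ... process only blobs that carry metadata ...
```

Keep in mind that metadata written after the blob is created won't be visible if the event is processed before the metadata is set.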

Related

How to save the data from a GZIPPED file to ADLS by converting it to JSON

I have an Event Hub which will receive gzipped JSON data as messages. Any idea on how to catch these messages and save the JSON to ADLS?
You can try the Event Hubs Capture feature.
Azure Event Hubs enables you to automatically capture the streaming data in Event Hubs in an Azure Blob storage or Azure Data Lake Storage Gen 1 or Gen 2 account of your choice.
You can follow the official MS docs for further details: Setting up Event Hubs Capture.
To capture data to Azure Data Lake Storage Gen 2 using the Azure portal
(this is not available in the Basic pricing tier; choose at least the Standard plan):
1. Once you have the Event Hubs namespace, create an event hub and select Enable Capture while creating it.
2. Later you can find and configure it at
Event Hubs instance > Features > Capture
Note:
If you enable the Capture feature for an existing event hub, the
feature captures events that arrive at the event hub after the feature
is turned on. It doesn't capture events that existed in the event hub
before the feature was turned on.
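Note that Capture writes the captured events in Avro format, so if you need the decompressed JSON itself sitting in ADLS you will still need a small processing step. Purely as an illustration (this is not part of the original answer), here is a minimal sketch that consumes the events directly with the azure-eventhub SDK, gunzips each body, and writes the JSON to an ADLS Gen2 account; all connection strings, names, and paths are placeholders:

```python
import gzip
import uuid

from azure.eventhub import EventHubConsumerClient
from azure.storage.filedatalake import DataLakeServiceClient

# Placeholder connection details -- replace with your own
EVENTHUB_CONN_STR = "<event-hub-namespace-connection-string>"
EVENTHUB_NAME = "<event-hub-name>"
ADLS_ACCOUNT_URL = "https://<account>.dfs.core.windows.net"
ADLS_CREDENTIAL = "<account-key-or-credential>"
FILESYSTEM = "raw-events"

adls = DataLakeServiceClient(account_url=ADLS_ACCOUNT_URL, credential=ADLS_CREDENTIAL)
fs = adls.get_file_system_client(FILESYSTEM)


def body_bytes(event):
    # EventData.body may be bytes or an iterable of byte chunks
    body = event.body
    if isinstance(body, (bytes, bytearray)):
        return bytes(body)
    return b"".join(body)


def on_event(partition_context, event):
    # Decompress the gzipped JSON payload
    payload = gzip.decompress(body_bytes(event))

    # Write one JSON file per event (the naming scheme here is arbitrary)
    path = f"{partition_context.partition_id}/{uuid.uuid4()}.json"
    fs.get_file_client(path).upload_data(payload, overwrite=True)


consumer = EventHubConsumerClient.from_connection_string(
    EVENTHUB_CONN_STR, consumer_group="$Default", eventhub_name=EVENTHUB_NAME
)
with consumer:
    consumer.receive(on_event=on_event, starting_position="-1")
```

For production you would add a checkpoint store and batch small events into larger files rather than writing one file per message.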

How to move events from Azure Event Hub to Azure Blob Storage using a Logic App?

I want to move events from Azure Event Hub to Azure Blob storage using a Logic App. Can anyone suggest how to do this with the Logic App connector, triggers, and actions, with a designer example?
Note:
Events are JSON events and need to be stored in blob storage.
You can start with a blank Logic App and use the search assistant to find what you're looking for.
Typing "event hub" brings up the Event Hubs trigger, where you can set up the connection by providing the event hub name.
Save the content in a variable.
You can then use an action such as SaveInitialJsonToBlobStorage (an Azure Blob Storage "Create blob" action) to store this JSON in blob storage.

How to ingest blobs created by Azure Diagnostics into Azure Data Explorer by subscribing to Event Grid notifications

I want to send Azure Diagnostics to Kusto tables.
The idea is to get logs and metrics from various Azure resources by sending them to a storage account.
I'm following both Ingest blobs into Azure Data Explorer by subscribing to Event Grid notifications and Tutorial: Ingest and query monitoring data in Azure Data Explorer,
trying to get the best of both worlds: cheap intermediate storage for logs, and using Event Hub only for notifications about new blobs.
The problem is that only part of the data is being ingested.
I think the problem is with the append blobs that monitoring creates: when Kusto receives the "Created" notification, only part of the blob has been written, and the rest of the events are never ingested as the blob is appended to.
My question is, how do I make this scenario work? Is it possible at all, or should I stick with sending logs to Event Hub without using blobs with Event Grid?
Append blobs do not work nicely with Event Grid ADX ingestion, as they generate multiple BlobCreated events.
If you are able to cause a blob rename on update completion, that would solve the problem.
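Blob storage has no true rename; a rename is effectively a copy to a new blob (which raises its own BlobCreated event), optionally followed by a delete. As a minimal sketch of that workaround, assuming you can tell when an hourly append blob is complete (container names and paths below are placeholders, not from the original thread):

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://<account>.blob.core.windows.net",
    credential=DefaultAzureCredential(),
)

# The completed append blob written by Azure Diagnostics (placeholder path)
source = service.get_blob_client("insights-logs-diagnostics", "resourceId=<...>/PT1H.json")

# Copy it into the container that the Event Grid subscription / ADX data connection watches.
# The copy creates a new blob, so a fresh BlobCreated event fires for the complete content.
dest = service.get_blob_client("adx-ingest", "PT1H-<unique-suffix>.json")
dest.start_copy_from_url(source.url)  # the source URL may need a SAS if AAD auth alone can't read it

# Optionally delete the original append blob afterwards to finish the "rename".
```

The tricky part remains detecting when the append blob is finished; for hourly Diagnostics blobs, one option is simply to wait until the hour has rolled over before copying.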

Sending data from Table Storage to Event Hub

It so happens that the system we have to work with has an immutable batch layer in Table Storage. We would like to forward new records added to Table Storage to Event Hub so that we can process them further.
Is there a way to continuously forward new records written to Table Storage to Event Hub?
Otherwise, is it possible to use either the Python or Java SDK of Azure Storage to read newly added records?
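As a rough illustration of that second option (reading newly added records with the Python SDK and pushing them to Event Hub), here is a minimal polling sketch. Everything in it (table name, connection strings, using the server-side Timestamp as the watermark, the polling interval) is an assumption, not something stated in the question:

```python
import json
import time
from datetime import datetime, timezone

from azure.data.tables import TableClient
from azure.eventhub import EventData, EventHubProducerClient

# Placeholder connection details
table = TableClient.from_connection_string("<storage-connection-string>", table_name="<table-name>")
producer = EventHubProducerClient.from_connection_string(
    "<event-hub-namespace-connection-string>", eventhub_name="<event-hub-name>"
)

watermark = datetime(1970, 1, 1, tzinfo=timezone.utc)

while True:
    cutoff = datetime.now(timezone.utc)

    # Table Storage has no change feed, so poll on the server-side Timestamp property.
    # A simple watermark like this can miss or duplicate rows near the boundary;
    # a production version needs a more careful high-water mark.
    odata_filter = "Timestamp gt datetime'{}'".format(watermark.strftime("%Y-%m-%dT%H:%M:%SZ"))

    batch = producer.create_batch()
    found_any = False
    for entity in table.query_entities(odata_filter):
        batch.add(EventData(json.dumps(dict(entity), default=str)))
        found_any = True
    if found_any:
        producer.send_batch(batch)

    watermark = cutoff
    time.sleep(30)
```

If a batch grows beyond the Event Hubs size limit, batch.add raises an error, so a real implementation would flush and start a new batch.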

Azure Blob creation not firing Azure Data Factory event trigger

I have an event trigger in Azure Data Factory that should fire when a new blob is created in Azure Blob storage, but it is not firing on blob creation.
I followed the link below but am stuck at the point mentioned afterwards:
Azure Data Factory: event not starting pipeline.
Environment details:
Event Grid is registered, ADF is v2, and parameters are being passed to the pipeline.
My question is: do I need to have an Azure Storage event subscription activated? If so, what should my event handler be (which in my case is the ADF pipeline)?
Please advise:
Is an Azure Storage event subscription mandatory to fire blob creation triggers? If yes, what should the event handler option be?
If it is not mandatory (as per my research, only Event Grid has to be registered), what is causing my event trigger not to fire?
You must be using a V2 storage account.
The trigger name must only contain letters, numbers and the '-' character (this restriction will soon be removed).
The trigger makes the following properties available: @triggerBody().folderPath and @triggerBody().fileName. To use these in your pipeline you must map them to pipeline parameters and use them as such: @pipeline().parameters.parameterName.
Have you followed the above guidelines? If yes, maybe you could consider creating a support ticket.
