How can I use Pub/Sub notifications for Cloud Storage - Node.js

I need to use Pub/Sub notifications for Cloud Storage. I have files in Firebase Storage; I run some processing on them and store the results in other fields in Firebase as well. Sometimes the processing fails, so I need Pub/Sub notifications for Cloud Storage to publish OBJECT_FINALIZE events.
To implement https://cloud.google.com/storage/docs/pubsub-notifications, I need to create the topic and the subscription for the action by each user.
Where do I create the topic? I am confused by the documentation.

When you run the command to create the GCS notification, the topic is created automatically for you if it does not already exist. So your first step is to create a GCS notification, passing the topic name and the OBJECT_FINALIZE event type as parameters.
Example:
gsutil notification create -t <TOPIC_NAME> -f json -e OBJECT_FINALIZE gs://BUCKET_NAME
Note the parameters:
-t = Topic name
-f = Payload format (JSON)
-e = Event type (OBJECT_FINALIZE)
Once this part is done, your next step is to create a subscription to receive the notifications sent to Pub/Sub. The subscription is attached to the topic created above.
This documentation explains how to create a subscription and also gives a Node.js example with push delivery: https://cloud.google.com/pubsub/docs/admin#pubsub_create_push_subscription-nodejs
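For reference, here is a minimal sketch in Node.js/TypeScript of creating such a push subscription with the @google-cloud/pubsub client. The subscription name and push endpoint below are placeholders I chose for illustration, not values from the question:

import {PubSub} from '@google-cloud/pubsub';

const pubsub = new PubSub(); // picks up the project from your credentials

async function createPushSubscription(): Promise<void> {
  const topicName = 'TOPIC_NAME';                  // the -t value used with gsutil
  const subscriptionName = 'gcs-finalize-sub';     // hypothetical name
  const pushEndpoint = 'https://example.com/push'; // your HTTPS handler for notifications

  // Attach the subscription to the topic that the gsutil command created.
  await pubsub.topic(topicName).createSubscription(subscriptionName, {
    pushConfig: {pushEndpoint},
  });
  console.log(`Subscription ${subscriptionName} created.`);
}

createPushSubscription().catch(console.error);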

Related

Azure: Event Grid notification when an AzCopy command completes

Is there a way for Azure Event Grid to trigger when an AzCopy command completes?
We have clients who use AzCopy to transfer hundreds of files and subfolders into our Azure storage. The number of files is variable, and the AzCopy command copies a single root folder on their local machine containing those files and subfolders.
We want to raise an Event Grid notification when the AzCopy run is complete and successful.
An alternative would be a second AzCopy command in a batch file that transfers a single flag file once the initial command has fully executed successfully. We would then monitor for this single file as the signal to proceed with further processing.
Perhaps if AzCopy cannot raise the event, it could add a verification file signaling the end of the transfer?
You can have Event Grid notifications on an individual blob (or directory, when using ADLS). AzCopy is essentially creating individual blobs, so you'd get individual notifications. Azure Storage doesn't provide a transactional batch of uploads, so you can't get a single notification.
If you wanted a single notification, you'd have to manage this yourself. You mentioned a "flag" file, but you can also create custom topics, use an Azure Function, a Service Bus message, etc. How you ultimately implement this is up to you (and your clients that are uploading content), but tl;dr: no, you can't get a single completion event for a batch of uploads.
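If you go the custom-topic route, here is a minimal sketch (TypeScript, @azure/eventgrid) of publishing a single "batch complete" event after AzCopy exits successfully. The topic endpoint, key setting, and event shape are all assumptions for illustration:

import { EventGridPublisherClient, AzureKeyCredential } from "@azure/eventgrid";

// Hypothetical custom topic endpoint and access key.
const client = new EventGridPublisherClient(
  "https://<your-custom-topic>.<region>-1.eventgrid.azure.net/api/events",
  "EventGrid", // publish using the Event Grid event schema
  new AzureKeyCredential(process.env.TOPIC_KEY!)
);

async function publishBatchComplete(batchId: string, fileCount: number): Promise<void> {
  await client.send([
    {
      eventType: "Contoso.Storage.BatchUploadCompleted", // custom event type
      subject: `uploads/${batchId}`,
      dataVersion: "1.0",
      data: { batchId, fileCount },
    },
  ]);
}

// For example, call this from the script that wraps AzCopy, after it returns exit code 0.
publishBatchComplete("client42-batch-01", 347).catch(console.error);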
I tried to reproduce this in my environment and received the notification successfully.
For an Event Grid notification, the endpoint that will receive the notification can be a Function App or a Logic App. I created the endpoint in a Function App.
In your Function App -> Functions -> Create -> select the Azure Event Grid trigger -> Create.
Once it is created, create the event subscription in the storage account.
When you select the endpoint, the function is shown on the right side by default; confirm the selection and create the subscription.
Once AzCopy completes and the files are uploaded to the container, you will receive a notification.

Azure Function Storage Container Blob Trigger

In one of our use cases, I am looking for an Azure Function trigger for any activity in storage account containers, with the following conditions:
Container with a specific naming convention (name like xxxx-input)
It should automatically detect if a new container (with the specific naming convention) is created
Currently, the following events are supported, per the documentation:
BlobCreated
BlobDeleted
BlobRenamed
DirectoryCreated (Data Lake Gen2)
DirectoryRenamed (Data Lake Gen2)
DirectoryDeleted (Data Lake Gen2)
This means that it is not possible to subscribe to such an event, but you can try to change the approach (if feasible for your use case) from 'push' to 'pull'.
I suggest writing a timer-triggered function that checks whether containers with the given naming scheme have been created, as in the sketch below. You can leverage the Blob Storage v12 SDK for this task and get the list of containers.
Save the list to some database (for example Cosmos DB), and every time the function is triggered, compare the current state with the last saved state from the DB.
If there is a difference, push a message to Event Hubs, which triggers another function that actually reacts to this 'new event type'.
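A minimal sketch of this pull approach in TypeScript with the Blob Storage v12 SDK (@azure/storage-blob); the connection-string setting, the "-input" suffix check, and the persistence step are assumptions:

import { BlobServiceClient } from "@azure/storage-blob";

async function findInputContainers(): Promise<string[]> {
  const service = BlobServiceClient.fromConnectionString(
    process.env.AZURE_STORAGE_CONNECTION_STRING! // hypothetical app setting
  );
  const names: string[] = [];
  // listContainers() pages through every container in the account.
  for await (const container of service.listContainers()) {
    if (container.name.endsWith("-input")) { // the xxxx-input convention
      names.push(container.name);
    }
  }
  return names;
}

// Call this from the timer-triggered function, diff the result against the
// snapshot saved in e.g. Cosmos DB, and publish an event for each new name.
findInputContainers().then(console.log).catch(console.error);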
Alternatively, you can use Azure Event Grid, subscribing to the resource group of your storage account, and use (for example) advanced filtering on the following; a CLI sketch follows the filter:
"operationName":"Microsoft.Storage/storageAccounts/blobServices/containers/write",
"subject":"/subscriptions/<yourId>/resourcegroups/<yourRG>/providers/Microsoft.Storage/storageAccounts/<youraccount>/blobServices/default/containers/xxxx-input",
"eventType":"Microsoft.Resources.ResourceWriteSuccess",

Azure Storage Queue message to Azure Blob Storage

I have access to an Azure Storage Queue via a connection string that was provided to me (it is not a queue I created). Messages are sent to it once every minute. I want to take all the messages and place them in Azure Blob Storage.
My issue is that I haven't been successful in getting the messages from the attached Storage Queue. What is the "easiest" way of doing this?
I've tried accessing the external queue using Logic Apps and then placing the messages in my own queue before moving them to Blob Storage, but without luck.
If you want to access an external storage account in a Logic App, you will need the name of the storage account and its key.
Choose the trigger for Azure Queues and then click "Manually enter connection information".
In the next step you will be able to choose the queue you want to listen to.
However, I recommend using an Azure Function instead, something like in this article:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob-output?tabs=csharp
First, you can try only reading the messages, and then add the output binding that creates your blob:
[FunctionName("GetMessagesFromQueue")]
public void GetMessagesFromQueue(
    [QueueTrigger("%ExternalStorage.QueueName%", Connection = "ExternalStorage.StorageConnection")] ModelMessage modelmessage,
    [Blob("%YourStorage.ContainerName%/{id}", FileAccess.Write, Connection = "YourStorage.StorageConnection")] Stream myBlob)
{
    // Write the contents of modelmessage into the myBlob stream.
    // {id} in the blob path is resolved from the Id property of the incoming queue message.
}
You can bind to a lot of types, not only Stream; the link above has all the details.
I hope I've helped.
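If you prefer Node.js over C#, a rough equivalent using the Azure Functions v4 programming model (@azure/functions); queue, container, and app-setting names are placeholders:

import { app, output } from "@azure/functions";

// Blob output binding; {rand-guid} gives each message its own blob name.
const blobOutput = output.storageBlob({
  path: "incoming-messages/{rand-guid}.json", // hypothetical container
  connection: "YourStorageConnection",        // app setting for your own account
});

app.storageQueue("GetMessagesFromQueue", {
  queueName: "external-queue",              // the queue you were given access to
  connection: "ExternalStorageConnection",  // app setting holding the provided connection string
  extraOutputs: [blobOutput],
  handler: (message, context) => {
    // Persist the raw queue message as a JSON blob.
    context.extraOutputs.set(blobOutput, JSON.stringify(message));
  },
});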

Azure Event Hub - Can't understand Java flow

According to the Microsoft Event Hubs Java SDK examples (https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-java-get-started-send), these are the steps needed to consume messages from an Event Hub via the Java SDK:
1. Create a storage account.
2. Create a new class called EventProcessorSample. Replace the placeholders with the values used when you created the event hub and storage account:
String consumerGroupName = "$Default";
String namespaceName = "----NamespaceName----";
String eventHubName = "----EventHubName----";
String sasKeyName = "----SharedAccessSignatureKeyName----";
String sasKey = "----SharedAccessSignatureKey----";
String storageConnectionString = "----AzureStorageConnectionString----";
String storageContainerName = "----StorageContainerName----";
String hostNamePrefix = "----HostNamePrefix----";
ConnectionStringBuilder eventHubConnectionString = new ConnectionStringBuilder()
.setNamespaceName(namespaceName)
.setEventHubName(eventHubName)
.setSasKeyName(sasKeyName)
.setSasKey(sasKey);
There are several things I don't understand about this flow:
A. Why is a storage account required? Why does it need to be created only when creating a consumer and not when creating the event hub itself?
B. What is 'hostNamePrefix' and why is it required?
C. More of a generalization of A, but I am failing to understand why this flow is so complicated and needs so much configuration. Event Hub is the default and only way of exporting metrics/monitoring data from Azure, which is a pretty straightforward flow: Azure -> Event Hub -> Java application. Am I missing a simpler way or a simpler client option?
All your questions are about consuming events from an event hub.
Why is a storage account required?
Read each event only once: whenever your application reads events from the event hub, it needs to store the offset (an identifier for how much of the stream has already been read) somewhere. Storing this information is known as 'checkpointing', and it is stored in the storage account.
Read the events from the start every time your app connects: in this case, your application will re-read events from the very beginning whenever it starts.
So the storage account is required to store the offset value while consuming events from the event hub, in case you want to read each event only once (see the sketch at the end of this answer).
Why does it need to be created only when creating a consumer and not when creating the event hub itself?
Because it depends on the scenario: whether you want to read your events only once, or from the start every time your app starts. That's why a storage account is not required when creating the event hub itself.
What is 'hostNamePrefix' and why is it required?
As the name states, 'hostNamePrefix' is the name for your host, where the host means the application that consumes the events. It's good practice to use a GUID as the hostNamePrefix. The hostNamePrefix is required by the event hub to manage the connection with the host. For example, if you have 32 partitions and have deployed 4 instances of the same application, 8 partitions will be assigned to each of the 4 instances, and the host name is what lets the event hub track which partitions are connected to which host.
I suggest you read this article on Event Hubs for a clearer picture of the event processor host.
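To make the checkpointing idea concrete, here is a minimal sketch in TypeScript using the newer @azure/event-hubs SDK, which persists offsets in a blob container the same way the Java EventProcessorHost does. All connection strings and names are placeholders:

import { EventHubConsumerClient } from "@azure/event-hubs";
import { ContainerClient } from "@azure/storage-blob";
import { BlobCheckpointStore } from "@azure/eventhubs-checkpointstore-blob";

// The storage container is where offsets (checkpoints) are stored.
const containerClient = new ContainerClient("<storage-connection-string>", "checkpoints");
const checkpointStore = new BlobCheckpointStore(containerClient);

const consumer = new EventHubConsumerClient(
  "$Default",                        // consumer group
  "<event-hub-connection-string>",
  "<event-hub-name>",
  checkpointStore
);

consumer.subscribe({
  processEvents: async (events, context) => {
    for (const event of events) {
      console.log(`partition ${context.partitionId}:`, event.body);
    }
    // Checkpoint the last event so a restart resumes here instead of at the start.
    if (events.length > 0) {
      await context.updateCheckpoint(events[events.length - 1]);
    }
  },
  processError: async (err, context) => {
    console.error(`error on partition ${context.partitionId}:`, err);
  },
});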

Azure Functions - Event Hub not triggering Functions

I have an Azure infrastructure:
2 HTTP Functions -> Event Hub -> 2 Functions -> Table Storage
(so two http functions sending messages to event hub, and two functions triggered by messages in Event Hub, one of them saving message in table storage)
The infrastructure is automatically created every day by Azure ARM templates, using the Azure CLI. I haven't changed the logic in the last two months, but since the beginning of April I have noticed new, weird behaviour.
At the end of setup, E2E tests are executed automatically. They send some messages and, after some time, check whether the messages are in table storage.
And here is the problem: since the beginning of April these tests almost always fail! And I did not change anything in the function logic or the template.json files for the infrastructure.
It looks like the functions that should be triggered by Event Hub are not executed at all! I have already found a workaround: if I go to the Azure portal and run these functions manually (the "Run" button above the code editor), the functions finally start to work!
Has anybody else encountered this problem?
Is there some way to automatically and directly run a non-HTTP-triggered function, e.g. via the Azure CLI or a REST interface?
It seems the problem is already quite well known:
https://github.com/Azure/Azure-Functions/issues/210
I'm currently using the workaround from this issue, i.e. calling the Azure CLI to synchronize function triggers after creating the infrastructure and zip-pushing the functions:
az resource invoke-action --resource-group <resourceGrouName> --action syncfunctiontriggers --name <functionAppName> --resource-type Microsoft.Web/sites
