Azure EventHub and Durable Functions - azure

I am trying to figure out something I am not good at.
I have read the Durable Functions overview here - https://learn.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-overview.
There is a topic on using bindings with an Event Hub trigger, but there is no working example. I followed what was there and came up with this binding in my function.json:
{
  "bindings": [
    {
      "type": "eventHubTrigger",
      "name": "myEventHubMessage",
      "direction": "in",
      "path": "testinhub",
      "connection": "Endpoint=sb://dev-testingeventhubinns.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=lLassdff5Y/esH8/CaXDOWH0jF2JtZBQhQeFoCtfqYs=",
      "consumerGroup": "$Default"
    },
    {
      "type": "eventHub",
      "name": "outputEventHubMessage",
      "connection": "Endpoint=sb://dev-testingeventhuboutns.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=4yuasdff7Lzu+mQJFVlnlozUItqFY1L3WW/kJnpTjq8=",
      "path": "testouthub",
      "direction": "out"
    }
  ],
  "disabled": false,
  "entryPoint": "EventTriggerFunction.EventHubTriggerClass.Run"
}
My code in its entirety is as follows:
using System.Collections.Generic;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Microsoft.Azure.EventHubs;
using System;

namespace EventTriggerFunction
{
    public static class EventHubTriggerClass
    {
        [FunctionName("EventHubTrigger")]
        public static async Task<List<string>> Run([OrchestrationTrigger] DurableOrchestrationContext context)
        {
            await context.CallActivityAsync<string>("EventHubTrigger_Send", "Hello World");
            return null;
        }

        [FunctionName("EventHubTrigger_Send")]
        public static void SendMessages([EventHubTrigger("testinhub", Connection = "ConnectionValue")] EventData[] eventHubMessages, ILogger log)
        {
            var exceptions = new List<Exception>();

            foreach (EventData message in eventHubMessages)
            {
                try
                {
                    log.LogInformation($"C# Event Hub trigger function processed a message: {Encoding.UTF8.GetString(message.Body)}");
                }
                catch (Exception e)
                {
                    // We need to keep processing the rest of the batch - capture this exception and continue.
                    // Also, consider capturing details of the message that failed processing so it can be processed again later.
                    exceptions.Add(e);
                }
            }
        }
    }
}
If I send a message using the function, I cannot see it on my testouthub event hub. I am not really sure how Durable Functions and the Event Hub trigger work hand in hand.

I think you are mixing it up a bit. When using attributes like FunctionName and EventHubTrigger you do not need to supply a function.json; it's either function.json or attributes, not both.
If you are trying to receive messages from one event hub and pass them on to the next, then something like the code below will do the trick.
There's no need to use Durable Functions for this; the Azure Functions runtime will scale by itself if there are many messages, see this
[FunctionName("EventHubTriggerCSharp")]
[return: EventHub("outputEventHubMessage", Connection = "EventHubConnectionAppSetting")]
public static void Run([EventHubTrigger("samples-workitems", Connection = "EventHubConnectionAppSetting")] string myEventHubMessage, ILogger log)
{
log.LogInformation($"C# Event Hub trigger function processed a message: {myEventHubMessage}");
return myEventHubMessage;
}
An additional tip: I would not paste your full connection strings into Stack Overflow. It would be wise to immediately create new access keys for your event hubs.
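If you do want Durable Functions in the picture, the usual shape is an Event Hub-triggered client function that starts an orchestration, an orchestrator, and an activity with an Event Hub output binding. A minimal sketch, assuming Durable Functions 2.x; note that Connection must be the name of an app setting holding the connection string (InputEventHubConnection and OutputEventHubConnection here are illustrative setting names), not the connection string itself:
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.EventHubs;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Extensions.Logging;

public static class EventHubOrchestration
{
    // Client function: triggered by the input hub, starts one orchestration per event.
    [FunctionName("EventHub_Start")]
    public static async Task Start(
        [EventHubTrigger("testinhub", Connection = "InputEventHubConnection")] EventData[] events,
        [DurableClient] IDurableOrchestrationClient starter,
        ILogger log)
    {
        foreach (var e in events)
        {
            string body = Encoding.UTF8.GetString(e.Body.Array, e.Body.Offset, e.Body.Count);
            string instanceId = await starter.StartNewAsync("EventHub_Orchestrator", null, body);
            log.LogInformation($"Started orchestration {instanceId} for message: {body}");
        }
    }

    // Orchestrator: calls an activity that forwards the message to the output hub.
    [FunctionName("EventHub_Orchestrator")]
    public static async Task Orchestrate([OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        string message = context.GetInput<string>();
        await context.CallActivityAsync("EventHub_Forward", message);
    }

    // Activity: the Event Hub output binding sends the returned value to testouthub.
    [FunctionName("EventHub_Forward")]
    [return: EventHub("testouthub", Connection = "OutputEventHubConnection")]
    public static string Forward([ActivityTrigger] string message, ILogger log)
    {
        log.LogInformation($"Forwarding message to output hub: {message}");
        return message;
    }
}
For simple hub-to-hub forwarding this adds storage round-trips and cost compared to the plain trigger/output binding above, so only reach for it when you actually need orchestration features such as chaining, retries or fan-out.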

Related

Azure Durable Functions - OrchestrationTrigger stuck

I am new to Azure Durable Functions. I have created a sample durable function using VS 2019. I am running the default generated durable function template code locally with the Azure Storage emulator, and when I run the durable function the OrchestrationTrigger gets stuck and is not able to resume.
The hub name is samplehubname. There are pending records in the samplehubnameInstances Azure table, but there are no records in the samplehubnameHistory Azure table.
There are no exceptions and no errors in the code.
SampleFunction.cs
public static class SampleFunction
{
    [FunctionName("SampleFunction")]
    public static async Task<List<string>> RunOrchestrator(
        [OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        var outputs = new List<string>();

        // Replace "hello" with the name of your Durable Activity Function.
        outputs.Add(await context.CallActivityAsync<string>("SampleFunction_Hello", "Tokyo"));

        // returns ["Hello Tokyo!", "Hello Seattle!", "Hello London!"]
        return outputs;
    }

    [FunctionName("SampleFunction_Hello")]
    public static string SayHello([ActivityTrigger] string name, ILogger log)
    {
        log.LogInformation($"Saying hello to {name}.");
        return $"Hello {name}!";
    }

    [FunctionName("SampleFunction_HttpStart")]
    public static async Task<HttpResponseMessage> HttpStart(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post")] HttpRequestMessage req,
        [DurableClient] IDurableOrchestrationClient starter,
        ILogger log)
    {
        // Function input comes from the request content.
        string instanceId = await starter.StartNewAsync("SampleFunction", null);

        log.LogInformation($"Started orchestration with ID = '{instanceId}'.");

        return starter.CreateCheckStatusResponse(req, instanceId);
    }
}
local.settings.json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "AzureWebJobsSecretStorageType": "files", //files
    "MyTaskHub": "samplehubname"
  }
}
host.json
{
  "version": "2.0",
  "logging": {
    "applicationInsights": {
      "samplingSettings": {
        "isEnabled": true,
        "excludedTypes": "Request"
      }
    }
  },
  "extensions": {
    "durableTask": {
      "hubName": "%MyTaskHub%"
    }
  }
}
samplehubname-control-03 Message Queue
{"$type":"DurableTask.AzureStorage.MessageData","ActivityId":"72b75a34-e403-4772-aed0-fbb10039795a","TaskMessage":{"$type":"DurableTask.Core.TaskMessage","Event":{"$type":"DurableTask.Core.History.ExecutionStartedEvent","OrchestrationInstance":{"$type":"DurableTask.Core.OrchestrationInstance","InstanceId":"f8d0499a4297480c8bdf4a56954861d3","ExecutionId":"2e46b87e4cf74c2dab572d92e012bded"},"EventType":0,"ParentInstance":null,"Name":"Function1","Version":"","Input":"null","Tags":null,"EventId":-1,"IsPlayed":false,"Timestamp":"2021-09-21T15:41:35.0156514Z"},"SequenceNumber":0,"OrchestrationInstance":{"$type":"DurableTask.Core.OrchestrationInstance","InstanceId":"f8d0499a4297480c8bdf4a56954861d3","ExecutionId":"2e46b87e4cf74c2dab572d92e012bded"}},"CompressedBlobName":null,"SequenceNumber":1,"Sender":{"$type":"DurableTask.Core.OrchestrationInstance","InstanceId":"","ExecutionId":""}}
Any help will be appreciated.
If your orchestration code contains loops, follow the guidance in the Eternal orchestrations documentation.
Your code does not really loop, so if an instance is stuck you can use the TerminateAsync (.NET) method of the orchestration client binding to stop it.
Connect your durable function to Application Insights to get a clearer view of the issue; it may help you find and fix the cause.
Check the similar issue here.
You can also try the fan out/fan in approach described in the Durable Functions patterns documentation.
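If you just need to clear the stuck instance while you investigate, here is a minimal sketch of an HTTP-triggered helper that terminates it through the same orchestration client binding used in your HttpStart function (the function name and route are illustrative):
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class TerminateStuckInstance
{
    [FunctionName("TerminateStuckInstance")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post", Route = "terminate/{instanceId}")] HttpRequest req,
        [DurableClient] IDurableOrchestrationClient client,
        string instanceId,
        ILogger log)
    {
        // Look up the instance first so we only terminate something that actually exists.
        var status = await client.GetStatusAsync(instanceId);
        if (status == null)
        {
            return new NotFoundObjectResult($"No orchestration found with ID '{instanceId}'.");
        }

        // Terminate the stuck instance with a reason that is recorded with the instance.
        await client.TerminateAsync(instanceId, "Terminated manually while investigating a stuck instance.");
        log.LogInformation($"Terminated orchestration {instanceId} (runtime status was {status.RuntimeStatus}).");
        return new OkObjectResult($"Terminated {instanceId}.");
    }
}
Terminating only clears the instance; it does not fix whatever kept the orchestrator from being picked up, so still check the control queues and Application Insights traces for the underlying cause.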

Azure Function: How to add a row to a cloud table when a blob is created?

I'm developing an Azure Function which should add a new row to an Azure table when a new blob is added. The application has many containers in Blob Storage, and my Azure Function should process blobs from all containers.
I tried to implement event handling with Event Grid, but it gives an error.
My Azure function:
#r "D:\home\site\wwwroot\BlobCreatedFunction\Microsoft.Azure.EventGrid.dll"
#r"D:\home\site\wwwroot\BlobCreatedFunction\Microsoft.WindowsAzure.Storage.dll"
using Microsoft.Azure.EventGrid.Models;
using Microsoft.WindowsAzure.Storage.Table;
using System;
public class TemporaryBlobEntity : TableEntity
{
public TemporaryBlobEntity(string partitionKey, string rowKey)
{
this.PartitionKey = partitionKey;
this.RowKey = rowKey;
}
public string BlobUrl { get; set; }
public DateTime BlobUploaded { get; set; }
}
public static TemporaryBlobEntity Run(EventGridEvent eventGridEvent, ILogger log)
{
if (eventGridEvent.Data is StorageBlobCreatedEventData eventData)
{
log.LogInformation(eventData.Url);
log.LogInformation(eventGridEvent.Data.ToString());
var temporaryBlob = new TemporaryBlobEntity("blobs", eventData.Url)
{
BlobUrl = eventData.Url,
BlobUploaded = DateTime.UtcNow
};
return temporaryBlob;
}
return null;
}
Here is my integration JSON:
{
  "bindings": [
    {
      "type": "eventGridTrigger",
      "name": "eventGridEvent",
      "direction": "in"
    },
    {
      "type": "table",
      "name": "$return",
      "tableName": "temporaryBlobs",
      "connection": "AzureWebJobsStorage",
      "direction": "out"
    }
  ]
}
In my Azure Function settings, I added the value for AzureWebJobsStorage.
When I press Run in the test section, logs show:
2019-07-08T13:52:16.756 [Information] Executed 'Functions.BlobCreatedFunction' (Succeeded, Id=6012daf1-9b98-4892-9560-932d05857c3e)
Looks good, but there is no new item in the cloud table. Why?
Then I tried to connect my function to an Event Grid topic. I filled in the new subscription form, selected "Web Hook" as the endpoint type, and set the subscriber endpoint to: https://<azure-function-service>.azurewebsites.net/runtime/webhooks/EventGrid?functionName=<my-function-name>. Then I got the following error message:
Deployment has failed with the following error: {"code":"Url validation","message":"The attempt to validate the provided endpoint https://####.azurewebsites.net/runtime/webhooks/EventGrid failed. For more details, visit https://aka.ms/esvalidation."}
As far as I can understand, the application needs some kind of request validation. Do I really need to implement validation in each of my Azure Functions? Or should I use another endpoint type?
When you enter a webhook into Event Grid it sends out a request to verify that you actually have permissions on that endpoint. The easiest way to connect a Function to Event Grid is to create the subscription from the Functions app instead of the Event Grid blade.
Opening the Function in the portal, you should find a link at the top to "Add Event Grid subscription". Even if the Functions app was created locally and published to Azure, so the code isn't viewable in the portal, the link will still be available.
This will open up the screen for creating an Event Grid subscription. The difference is that instead of the Event Grid topic info being prefilled, the web hook info is prepopulated for you. Fill in the info about the Event Grid topic to finish creating the subscription.
If you decide you want to implement the validation response for whatever reason, it is possible to do this by checking the type of the message.
// Validate whether EventType is of "Microsoft.EventGrid.SubscriptionValidationEvent"
if (eventGridEvent.EventType == "Microsoft.EventGrid.SubscriptionValidationEvent")
{
    var eventData = (SubscriptionValidationEventData)eventGridEvent.Data;

    // Do any additional validation (as required) such as validating that the Azure resource ID of the topic matches
    // the expected topic and then return back the below response
    var responseData = new SubscriptionValidationResponse()
    {
        ValidationResponse = eventData.ValidationCode
    };

    if (responseData.ValidationResponse != null)
    {
        return Ok(responseData);
    }
}
else
{
    // Your code here
}
There is also an option to validate the link manually by getting the validation link out of the validation message and navigating to it in your browser. This method is primarily for 3rd party services where you can't add the validation code.
The following are changes in your EventGridTrigger function:
#r "Microsoft.WindowsAzure.Storage"
#r "Microsoft.Azure.EventGrid"
#r "Newtonsoft.Json"
using System;
using Newtonsoft.Json.Linq;
using Microsoft.Azure.EventGrid.Models;
using Microsoft.WindowsAzure.Storage.Table;
public static TemporaryBlobEntity Run(EventGridEvent eventGridEvent, ILogger log)
{
log.LogInformation(eventGridEvent.Data.ToString());
var eventData = (eventGridEvent.Data as JObject)?.ToObject<StorageBlobCreatedEventData>();
if(eventData?.Api == "PutBlob")
{
log.LogInformation(eventData.Url);
return new TemporaryBlobEntity("blobs", eventData.Sequencer)
{
BlobUrl = eventData.Url,
BlobUploaded = DateTime.UtcNow
};
}
return null;
}
public class TemporaryBlobEntity : TableEntity
{
public TemporaryBlobEntity(string partitionKey, string rowKey)
{
this.PartitionKey = partitionKey;
this.RowKey = rowKey;
}
public string BlobUrl { get; set; }
public DateTime BlobUploaded { get; set; }
}
Notes:
You don't need to validate an EventGridTrigger function for the AEG subscription webhook endpoint. This validation is built into the preprocessing of the EventGridTrigger function.
The eventGridEvent.Data property is a JObject and must be converted (deserialized) to a StorageBlobCreatedEventData object, see here.
For RowKey (and PartitionKey), see the restricted characters here; that is why I changed it to the Sequencer value in this example.
The AEG subscription webhook endpoint for the EventGridTrigger function has the following format:
https://{azure-function-service}.azurewebsites.net/runtime/webhooks/EventGrid?functionName={my-function-name}&code={systemkey}

read SB message with azure functions service bus trigger

I just created a simple Azure Function using a Service Bus trigger. I am using the default example provided. I can read the message in the code below:
public static void Run(string mySbMsg, TraceWriter log)
{
    log.Info($"C# ServiceBus topic trigger function processed message: {mySbMsg}");
}
I'm struggling to find code showing how to read the JSON message that was posted.
Thanks for your help.
You can use a BrokeredMessage parameter to get the message body in an Azure Functions Service Bus trigger.
The message body is returned by the BrokeredMessage.GetBody<T>() method.
Get more information here.
In the Azure portal, add a "project.json" under View Files. This pulls in the library which contains the BrokeredMessage object.
The project.json should look like this:
{
  "frameworks": {
    "net46": {
      "dependencies": {
        "WindowsAzure.ServiceBus": "4.1.2"
      }
    }
  }
}
When you save, you can see the package gets restored.
Inside the Run method, add BrokeredMessage as a parameter. The method should look like this:
public static void Run(BrokeredMessage message, TraceWriter log)
{
    string messageBody = message.Properties["Message"].ToString();
    string messageId = message.Properties["Id"].ToString();
    log.Info($"message - " + messageBody + " Id " + messageId);
}
Don't forget to add using Microsoft.ServiceBus.Messaging in Run.csx and change the name property to message in function.json.
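If the JSON was sent as the raw message body rather than as custom properties, here is a minimal sketch of reading and deserializing it; this assumes the sender wrote the JSON text directly into the body as bytes/stream, and the Order class is just a stand-in for your own payload type:
using System.IO;
using Microsoft.ServiceBus.Messaging;
using Newtonsoft.Json;

public class Order
{
    public string Id { get; set; }
    public string Description { get; set; }
}

public static void Run(BrokeredMessage message, TraceWriter log)
{
    // Read the raw body; works when the sender posted the JSON text/bytes directly.
    string json;
    using (var stream = message.GetBody<Stream>())
    using (var reader = new StreamReader(stream))
    {
        json = reader.ReadToEnd();
    }

    // Deserialize the JSON payload into your own type.
    var order = JsonConvert.DeserializeObject<Order>(json);
    log.Info($"Received order {order.Id}: {order.Description}");
}
If the sender created the message with new BrokeredMessage(someString) instead, the body is DataContract-serialized, so call message.GetBody<string>() to get the original string back and deserialize that.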

Why Azure function is not writing to Service Bus Topic

My function is like
[FunctionName("MyFunctionName")]
[return: ServiceBus("mytopic", Connection = "ServiceBusConnectionString")]
public static async Task<string> MyFunctionAsync([QueueTrigger("my-input-queue")] string msgIn, TraceWriter log)
{
My local.settings.json has
{
"IsEncrypted": false,
"Values": {
"ServiceBusConnectionString": "[my connection string]"
}
}
where [my connection string] is copy-pasted from a Primary Connection String under one of the Shared access policies with a Send claim.
This just silently fails: messages get stuck in my-input-queue and no errors are written to log streaming. However, I'm 100% sure the attribute is the issue because I've deployed 100 different combinations of this to try and make it work :).
Any ideas?
Based on my test, it should work with the ServiceBus attribute. The following is my test code.
[return: ServiceBus("topicName",Connection = "ServiceBusConnectionString", EntityType = EntityType.Topic)]
public static async Task<string>Run([QueueTrigger("queueName")]string myQueueItem, TraceWriter log)
{
...
return myQueueItem; // write to the Topic.
}
local.settings.json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "xxxxxx",
    "AzureWebJobsDashboard": "xxxxxxxx",
    "ServiceBusConnectionString": "xxxxxx"
  }
}
You could get more information about the Azure Service Bus output binding from this tutorial. You could also do it the following way:
[FunctionName("ServiceBusOutput")]
public static void Run([[QueueTrigger("queueName")]string myQueueItem,
TraceWriter log,
[ServiceBus("topicName",Connection = "ServiceBusConnectionString", EntityType = EntityType.Topic)]out string queueMessage)
{
log.Info("Azure Function Demo - Azure Service Bus Queue Topic");
queueMessage = myQueueItem;
}
You are missing the required settings for your QueueTrigger, so your function isn't triggering on new items in the queue. You should have values for AzureWebJobsStorage and AzureWebJobsDashboard, and your QueueTrigger should have a value for the Connection field.
For more information about how to wire up QueueTriggers and test locally, see this answer.
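For illustration, here is a minimal sketch of the trigger plus output binding once the settings are wired up; note that QueueTrigger falls back to the AzureWebJobsStorage setting when no Connection is given, so the key point is that this setting exists and points at the storage account that holds my-input-queue (the names below are taken from the question):
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Azure.WebJobs.ServiceBus;

public static class ForwardQueueToTopic
{
    [FunctionName("MyFunctionName")]
    [return: ServiceBus("mytopic", Connection = "ServiceBusConnectionString", EntityType = EntityType.Topic)]
    public static string Run(
        // Connection tells the trigger which storage account setting to read the queue from;
        // omit it to use the default AzureWebJobsStorage setting.
        [QueueTrigger("my-input-queue", Connection = "AzureWebJobsStorage")] string msgIn,
        TraceWriter log)
    {
        log.Info($"Forwarding queue message to the topic: {msgIn}");
        // The returned string is published to the "mytopic" Service Bus topic by the output binding.
        return msgIn;
    }
}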

Azure Functions and DocumentDB triggers

Is it possible to specify that DocumentDB should fire triggers when writing to DocumentDB?
I have an Azure function that pulls JSON messages off a Service Bus Queue and puts them into DocumentDB like so:
using System;
using System.Threading.Tasks;

public static string Run(string myQueueItem, TraceWriter log)
{
    log.Info($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
    return myQueueItem;
}
This inserts new documents into the database as they are added to the Service Bus queue; however, I need DocumentDB to process them as they are added and add attachments. This cannot be done with the present setup, so I would like to tell DocumentDB to fire a trigger.
I have tried something like this:
using System;
using System.Threading.Tasks;

public static string Run(string myQueueItem, TraceWriter log)
{
    log.Info($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
    return "x-ms-documentdb-post-trigger-include: addDocument\n" + myQueueItem;
}
It doesn't work and gives me errors like this:
Exception while executing function: Functions.ServiceBusQueueTriggerCSharp1. Microsoft.Azure.WebJobs.Host: Error while handling parameter _return after function returned:. Newtonsoft.Json: Unexpected character encountered while parsing value: x. Path '', line 0, position 0.
I like this setup because I can saturate the queue with requests to add records and they just buffer until the database can deal with them, which handles spikes in demand. It also allows data to be offloaded from the client machine as fast as the network can carry it, and the queue/database combination catches up again when demand drops.
You could refer to the following code sample to create a document with the trigger enabled in Azure Functions.
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;

public static void Run(string myQueueItem, TraceWriter log)
{
    string EndpointUri = "https://{documentdb account name}.documents.azure.com:443/";
    string PrimaryKey = "{PrimaryKey}";
    DocumentClient client = new DocumentClient(new Uri(EndpointUri), PrimaryKey);

    client.CreateDocumentAsync(
        UriFactory.CreateDocumentCollectionUri("{databaseid}", "{collectionid}"),
        new MyChunk { MyProperty = "hello" },
        new RequestOptions
        {
            // Ask DocumentDB to run this pre-trigger for the write.
            PreTriggerInclude = new List<string> { "YourTriggerName" },
        }).Wait();

    log.Info($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
}

public class MyChunk
{
    public string MyProperty { get; set; }
}
Note: for using Microsoft.Azure.DocumentDB NuGet package in a C# function, please upload a project.json file to the function's folder in the function app's file system.
project.json
{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Microsoft.Azure.DocumentDB": "1.13.1"
      }
    }
  }
}
Besides, please make sure you have created the triggers in your DocumentDB; for details about creating triggers, please refer to this article.
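If you have not created the trigger yet, it can also be registered programmatically with the same DocumentClient. A minimal sketch, assuming a pre-trigger named YourTriggerName whose JavaScript body is read from a local file (the file name addDocument.js and the trigger logic are illustrative):
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;

public static class RegisterTrigger
{
    public static async Task RegisterAsync(DocumentClient client)
    {
        var collectionUri = UriFactory.CreateDocumentCollectionUri("{databaseid}", "{collectionid}");

        var trigger = new Trigger
        {
            Id = "YourTriggerName",
            // The trigger body is JavaScript that DocumentDB executes server-side before the write.
            Body = File.ReadAllText("addDocument.js"),
            TriggerOperation = TriggerOperation.Create,
            TriggerType = TriggerType.Pre
        };

        // Registers the trigger on the collection; it still has to be requested per operation
        // via RequestOptions.PreTriggerInclude, as shown in the function above.
        await client.CreateTriggerAsync(collectionUri, trigger);
    }
}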
