I've tried to build a solution in which I have multiple Function Apps deployed (in multiple Azure regions) that all share the same Task Hub (set via host.json) storage account. My idea was that since each of them should listen for new work for the Activity Functions, the load should get somewhat distributed. But that's not happening, at least with what I have tried so far. It looks like the Function App for the Activity Functions is already determined by which Orchestrator is picked. (I had to deploy the Orchestrator together with the Activity Functions or else it wouldn't even kick off.)
So my question is: Would such a scenario be possible to achieve using Durable Functions?
host.json
{
  "version": "2.0",
  "extensions": {
    "durableTask": {
      "hubName": "MyTaskHub",
      "storageProvider": {
        "connectionStringName": "DurableStorage",
        "useAppLease": false
      }
    }
  }
}
Thank you Anand Sowmithiran. Posting your suggestion as an answer so that it will be helpful for other community members who face similar issues.
My idea was that since each of them should listen for new work for the Activity Functions, the load should get somewhat distributed. But that's not happening, at least with what I have tried so far. It looks like the Function App for the Activity Functions is already determined by which Orchestrator is picked.
As per the Microsoft documentation, the task hub should be different for each durable Function App, i.e. each Function App should have its own task hub. Below is a sample task hub configuration:
{
  "version": "2.0",
  "extensions": {
    "durableTask": {
      "hubName": "%MyTaskHub%"
    }
  }
}
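The %MyTaskHub% syntax resolves the task hub name from an app setting, so each Function App can point at its own hub while still sharing the storage account. A minimal sketch of the corresponding app settings (the setting values here are illustrative; DurableStorage matches the connectionStringName from the first host.json above):

{
  "IsEncrypted": false,
  "Values": {
    "MyTaskHub": "MyTaskHubEastUS",
    "DurableStorage": "<shared storage account connection string>"
  }
}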
In addition to host.json, task hub names can also be configured in orchestration client binding metadata.
[FunctionName("HttpStart")]
public static async Task<HttpResponseMessage> Run(
[HttpTrigger(AuthorizationLevel.Function, methods: "post", Route = "orchestrators/{functionName}")] HttpRequestMessage req,
[DurableClient(TaskHub = "%MyTaskHub%")] IDurableOrchestrationClient starter,
string functionName,
ILogger log)
{
// Function input comes from the request content.
object eventData = await req.Content.ReadAsAsync<object>();
string instanceId = await starter.StartNewAsync(functionName, eventData);
log.LogInformation($"Started orchestration with ID = '{instanceId}'.");
return starter.CreateCheckStatusResponse(req, instanceId);
}
Related
I have a CRM system; when a contact is added, I want to add them to an accounting system.
I have setup a webhook in the CRM system that passes the contact to an Azure Function. The Azure function connects to the accounting system API and creates them there.
There is some other processing I need to do before the user can be added to the accounting system.
I need about a 5 minute delay after receiving the webhook before I can add the user to the accounting system.
I would rather not add a pause or delay statement in the Azure Function since there is a timeout limit, and it's also on a consumption plan, so I want each function to complete quickly.
I am using PowerShell Core.
Is a Service Bus Queue the best way to do this?
You could use a timer in a Durable Function for this; then you won't need an extra component like a queue. A Durable Function is all you need. For example (warning: I haven't compiled this):
Note: Durable Functions do support PowerShell, but I don't ;-) so the C# code below is just to illustrate the concept.
[FunctionName("Orchestration_HttpStart")]
public static async Task<HttpResponseMessage> HttpStart(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", "post")] HttpRequestMessage req,
[DurableClient] IDurableOrchestrationClient starter,
ILogger log)
{
// Function input comes from the request content.
string content = await req.Content.ReadAsStringAsync();
string instanceId = await starter.StartNewAsync("Orchestration", content);
log.LogInformation($"Started orchestration with ID = '{instanceId}'.");
return starter.CreateCheckStatusResponse(req, instanceId);
}
[FunctionName("Orchestration")]
public static async Task Run(
[OrchestrationTrigger] IDurableOrchestrationContext context)
{
var requestContent = context.GetInput<string>();
DateTime waitAWhile = context.CurrentUtcDateTime.Add(TimeSpan.FromMinutes(5));
await context.CreateTimer(waitAWhile, CancellationToken.None);
await context.CallActivityAsync("ProcessEvent", requestContent);
}
[FunctionName("ProcessEvent")]
public static string ProcessEvent([ActivityTrigger] string requestContent, ILogger log)
{
// Do something here with requestContent
return "Done!";
}
I would rather not add a pause or delay statement in the Azure Function as there is a timeout limit, and also It's a consumption plan so I want each function to action quickly.
The 5-minute delay introduced by the timer won't count as active time, so you won't run out of time on the consumption plan during those minutes.
Is a Service Bus Queue the best way to do this?
You can use one, but an Azure Storage queue is cheaper for your scenario.
What you can do is create a timer-triggered function (0 */5 * * * *) that checks for messages in a queue. If the time between the execution and the time the message was created is greater than 5 minutes, process and complete the message; otherwise, don't complete the message and it will return to the queue for the next execution.
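A minimal sketch of that timer-plus-queue approach, written in C# like the earlier answer (the question uses PowerShell); it assumes the Azure.Storage.Queues SDK, and the queue name "contacts" and the connection setting are illustrative:

using System;
using Azure.Storage.Queues;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class DelayedContactProcessor
{
    [FunctionName("DelayedContactProcessor")]
    public static void Run(
        [TimerTrigger("0 */5 * * * *")] TimerInfo timer, // every 5 minutes
        ILogger log)
    {
        // "contacts" is an assumed queue name; the webhook function would enqueue into it.
        var queue = new QueueClient(
            Environment.GetEnvironmentVariable("AzureWebJobsStorage"), "contacts");

        foreach (var message in queue.ReceiveMessages(maxMessages: 32).Value)
        {
            // Only handle messages that have been waiting at least 5 minutes.
            if (DateTimeOffset.UtcNow - message.InsertedOn >= TimeSpan.FromMinutes(5))
            {
                log.LogInformation($"Processing contact: {message.MessageText}");
                // ... create the contact in the accounting system here ...
                queue.DeleteMessage(message.MessageId, message.PopReceipt);
            }
            // Messages left uncompleted become visible again after their visibility
            // timeout and get picked up by a later execution.
        }
    }
}

Alternatively, the webhook function could enqueue the message with a five-minute visibilityTimeout (SendMessage has an overload for this), in which case a plain queue-triggered function would only see the message once the delay has elapsed.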
I have created an Azure Function App. The function app connects to a SQL DB and has the following features:
1. Returns all the records in a table
2. Returns the records based on the column name, using the code below
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
    ILogger log)
{
    string loan_id = req.Query["loanid"];
    string loan_amount = req.Query["loanamount"];

    if (string.IsNullOrEmpty(loan_id))
    {
        // Do something when no loan id is given.
    }
    else if (string.IsNullOrEmpty(loan_amount))
    {
        // Do something when no loan amount is given.
    }

    return new OkObjectResult("This is a test.");
}
I would like to document the function app using API Management/Swagger. Can you please let me know how this can be achieved?
Thanks in advance
You just need to create an API Management service instance from the portal and add the function endpoint using its OpenAPI definition.
You can follow this documentation on how to do the same.
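If you'd rather have the Function App expose an OpenAPI (Swagger) definition itself, which API Management can then import, one option (an assumption, not something the question mentions) is the Microsoft.Azure.WebJobs.Extensions.OpenApi package. A sketch based on the function above; the function name and attribute values are illustrative:

using System.Net;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Extensions.OpenApi.Core.Attributes;
using Microsoft.OpenApi.Models;

public static class GetLoans
{
    [FunctionName("GetLoans")]
    [OpenApiOperation(operationId: "GetLoans", tags: new[] { "loans" })]
    [OpenApiParameter(name: "loanid", In = ParameterLocation.Query, Required = false, Type = typeof(string))]
    [OpenApiParameter(name: "loanamount", In = ParameterLocation.Query, Required = false, Type = typeof(string))]
    [OpenApiResponseWithBody(statusCode: HttpStatusCode.OK, contentType: "text/plain", bodyType: typeof(string))]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", Route = null)] HttpRequest req)
    {
        // The extension serves swagger.json from the function app,
        // which API Management can import as an API definition.
        return new OkObjectResult("This is a test.");
    }
}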
I'm developing an application where IoT devices are connected to Azure IoT Hub and their real-time data can be viewed on the web. However, I'm facing an error: I'm trying to bind the Azure Function data to SignalR, but when I run the application I receive the following error message.
The listener for function 'SignalR' was unable to start. Microsoft.Azure.EventHubs.Processor: Encountered error while fetching the list of EventHub PartitionIds. System.Private.CoreLib: The link address '$management' did not match any of the expected formats.
Error Description Image
I've tried everything to fix it but failed every time. I'd really appreciate if someone would help me find the solution to this problem.
Here is the script I'm using from this link
Here is my SignalR.cs class
public static class SignalR
{
    [FunctionName("SignalR")]
    public static async Task Run(
        [IoTHubTrigger("messages/events", Connection = "IoTHubTriggerConnection", ConsumerGroup = "$Default")] EventData message,
        [SignalR(HubName = "broadcast")] IAsyncCollector<SignalRMessage> signalRMessages,
        ILogger log)
    {
        var deviceData = JsonConvert.DeserializeObject<DeviceData>(Encoding.UTF8.GetString(message.Body.Array));
        deviceData.DeviceId = Convert.ToString(message.SystemProperties["iothub-connection-device-id"]);
        log.LogInformation($"C# IoT Hub trigger function processed a message: {JsonConvert.SerializeObject(deviceData)}");

        await signalRMessages.AddAsync(new SignalRMessage()
        {
            Target = "notify",
            Arguments = new[] { JsonConvert.SerializeObject(deviceData) }
        });
    }
}
Here is my SignalRConnection.cs class
public static class SignalRConnection
{
    [FunctionName("SignalRConnection")]
    public static SignalRConnectionInfo Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = null)] HttpRequest req,
        [SignalRConnectionInfo(HubName = "broadcast")] SignalRConnectionInfo info,
        ILogger log) => info;
}
Here is my local.settings.json file
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "AzureSignalRConnectionString": "",
    "MSDEPLOY_RENAME_LOCKED_FILES": "1",
    "IoTHubTriggerConnection": ""
  },
  "Host": {
    "LocalHttpPort": 7071,
    "CORS": "*"
  }
}
For IoTHubTriggerConnection, I'm using the connection string of iothubjohnsoncontrol (displayed in the image below).
IOT Hub Keys Image
For AzureSignalRConnectionString, I'm using the connection string of signalrjohnsoncontrol (displayed in the image below).
SignalR Keys Image
Could you please check whether you have given the Event Hub-compatible name and the Event Hub-compatible connection string from the IoT Hub's Built-in endpoints blade?
Please try replacing messages/events with the Event Hub-compatible name, and use the Event Hub-compatible endpoint from the portal as IoTHubTriggerConnection.
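Concretely, the trigger from the question would look something like this (the %EventHubCompatibleName% app setting name is illustrative; its value comes from the Built-in endpoints blade):

public static class SignalR
{
    [FunctionName("SignalR")]
    public static async Task Run(
        // The first argument is the Event Hub-compatible name, not "messages/events".
        [IoTHubTrigger("%EventHubCompatibleName%", Connection = "IoTHubTriggerConnection", ConsumerGroup = "$Default")] EventData message,
        [SignalR(HubName = "broadcast")] IAsyncCollector<SignalRMessage> signalRMessages,
        ILogger log)
    {
        // ... same body and usings as in the question's SignalR.cs ...
    }
}

// In local.settings.json, IoTHubTriggerConnection should then hold the Event Hub-compatible
// endpoint, e.g. "Endpoint=sb://...;SharedAccessKeyName=...;SharedAccessKey=...;EntityPath=..."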
Almost similar discussion here :
https://github.com/Azure/azure-event-hubs-dotnet/issues/103
I have a similar use case pushing IoT data to Azure Data Explorer, and this is what my function looks like.
The IoT Hub connection string used is the Event Hub-compatible one.
Hope this helps.
We are switching our Azure WebJobs over to Azure Functions (for multiple reasons beyond this post). But internally we can't really agree on the architecture for those functions.
Currently we have one WebJob that handles a single task from A to Z.
E.g. status emails (triggered by the scheduler):
1. Looks up all recipients
2. Sends email to all of those
3. Logs success/failure for each individual recipient
4. Logs success/failure for the whole run
And we have multiple WebJobs that do similar tasks.
Now we have 3 ways we can implement this in the future:
1. One-to-one conversion: move the complete WebJob functionality into one Azure Function.
2. Split the above process into 4 different Azure Functions, e.g. one that looks up the recipients and then calls another function that sends out the email, and so on.
3. Combine all WebJobs into one Azure Function.
Personally, I would tend towards solution 3, but some team members tend towards 1. What do you think?
I would go with option 3 too. You can use Durable Functions, letting an orchestrator control the workflow and creating activities for each step (1-4).
[FunctionName("Chaining")]
public static async Task<object> Run(
[OrchestrationTrigger] IDurableOrchestrationContext context)
{
try
{
var recipients = await context.CallActivityAsync<object>("GetAllRecipients", null);
foreach(var recipient in recipients)
{
//maybe return a complex object with more info about the failure
var success = await context.CallActivityAsync<object>("SendEmail", recipient);
if (! success)
{
await context.CallActivityAsync<object>("LogError", recipient);
}
}
return await context.CallActivityAsync<object>("NotifyCompletion", null);
}
catch (Exception ex)
{
// Error handling or compensation goes here.
await context.CallActivityAsync<object>("LogError", ex);
}
}
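For completeness, hypothetical activity stubs matching the names used above (signatures and bodies are illustrative):

using System.Collections.Generic;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class Activities
{
    [FunctionName("GetAllRecipients")]
    public static List<string> GetAllRecipients([ActivityTrigger] object input, ILogger log)
    {
        // Illustrative: look up recipients from your data store.
        return new List<string> { "a@example.com", "b@example.com" };
    }

    [FunctionName("SendEmail")]
    public static bool SendEmail([ActivityTrigger] string recipient, ILogger log)
    {
        // Illustrative: call your email provider and report success/failure.
        log.LogInformation($"Sending status email to {recipient}");
        return true;
    }
}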
more info: https://learn.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-overview
I want to make an Azure Function that takes a JSON body passed to it and inserts this document into an Azure Cosmos DB instance.
To do so, I created this function in the Azure Functions Portal:
And implemented the function like so:
#r "Newtonsoft.Json"
using Newtonsoft.Json.Linq;
using System.Net;
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log, object outputDocument)
{
var requestContent = await req.Content.ReadAsStringAsync();
log.Verbose($#"Received request:\n{requestContent}");
var newDoc = JObject.Parse(requestContent);
newDoc["Id"] = Guid.NewGuid().ToString();
newDoc["shardKey"] = newDoc.Value<string>(#"Id").Substring(8);
outputDocument = newDoc;
return req.CreateResponse(System.Net.HttpStatusCode.Created);
}
In the portal, I put in a simple sample doc:
{
  "prop1": 2,
  "prop2": "2017-02-20"
}
and clicked 'Run'.
I'm immediately met with an error overlay in the portal IDE, along with
{
  "id": "145ee924-f824-4064-8364-f96dc12ab138",
  "requestId": "5a27c287-2c91-40f5-be52-6a79c7c86bc2",
  "statusCode": 500,
  "errorCode": 0,
  "message": "'UploadDocumentToCosmos' can't be invoked from Azure WebJobs SDK. Is it missing Azure WebJobs SDK attributes?"
}
in the log area.
There seems to be nothing I can do to fix the issue, yet I sure feel like I'm doing everything right.
What do I need to do to simply take a JSON object as an HTTP request to an Azure Function, and insert/upsert said object into an Azure Cosmos DB instance?
For async functions you should use IAsyncCollector:
public static async Task<HttpResponseMessage> Run(
    HttpRequestMessage req, TraceWriter log,
    IAsyncCollector<object> outputDocuments)
{
    ...
    await outputDocuments.AddAsync(newDoc);
}
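Folded into the original function, that looks roughly like this (the Cosmos DB output binding itself stays as configured in the portal):

#r "Newtonsoft.Json"

using Newtonsoft.Json.Linq;
using System.Net;

public static async Task<HttpResponseMessage> Run(
    HttpRequestMessage req, TraceWriter log,
    IAsyncCollector<object> outputDocuments)
{
    var requestContent = await req.Content.ReadAsStringAsync();
    log.Verbose($"Received request:\n{requestContent}");

    var newDoc = JObject.Parse(requestContent);
    newDoc["Id"] = Guid.NewGuid().ToString();
    newDoc["shardKey"] = newDoc.Value<string>("Id").Substring(8);

    // Add the document via the collector instead of assigning the parameter.
    await outputDocuments.AddAsync(newDoc);

    return req.CreateResponse(HttpStatusCode.Created);
}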
Can you try adding out to the dynamic outputDocument parameter? Note that out parameters can't be used in async methods, so the function would need to be synchronous:
public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log, out dynamic outputDocument)