How to write an Azure Function that is triggered when an item appears in an Azure Service Bus queue using C#?

The new Azure Functions preview contains a few templates for C#, but there is no Service Bus queue template for C#. There is a trigger template for Node with Service Bus, but on close inspection it only supports Notification Hubs, not Service Bus queues. Is it even possible to write an Azure Function that is triggered only when an item appears in an Azure Service Bus queue? If it is not possible now, will there be such a template in the near future?
Thanks.
Raghu/..

Update: The below steps and information still hold; however, we now have a "ServiceBusQueueTrigger - C#" template live in the portal, so the workaround steps are no longer necessary :)
ServiceBus IS supported already for C#, we just need to add a template for it (we'll add very soon). In general, templates are just starting points - you can always modify templates by adding additional bindings, or start from the empty template and build up your own Function.
Until we get the template uploaded, you can get this working yourself by starting from the C# Empty template. For example, you can enter binding info like the following in the Advanced Editor on the Integrate tab:
{
    "bindings": [
        {
            "type": "serviceBusTrigger",
            "name": "message",
            "direction": "in",
            "queueName": "samples-input",
            "connection": "myServiceBus"
        }
    ]
}
Make sure your Function App has an AppSetting matching the name of the connection property, containing your ServiceBus connection string. It looks like we currently have some issues with the connection string picker for ServiceBus (which will also be fixed very soon), but you can use "Function app settings"/"Go to App Service Settings"/"Application Settings" to add this app setting. Then you can use the corresponding Function code:
using System;
using Microsoft.Azure.WebJobs.Host;

public static void Run(string message, TraceWriter log)
{
    log.Verbose($"C# ServiceBus Queue function processed message: {message}");
}
This function will then be invoked whenever new messages are added to the ServiceBus queue samples-input.

Per https://azure.microsoft.com/en-us/documentation/articles/functions-reference/, there is no binding for Service Bus. Rather than building something that may already be in progress within the product group, the best approach is to submit your idea on UserVoice: https://feedback.azure.com/forums/355860-azure-functions .

Related

Disabled Azure Function Still Pulls Messages off of Azure Storage Queue

I have a basic QueueTrigger Azure Function. When I disable the function in the Azure portal, it is still pulling messages off the storage queue (when I look at the queue in the Azure Queue Storage Explorer, the queue is empty, and if I add a message it is immediately pulled off).
Here is the code:
[FunctionName("ProcessMessage")]
public static void Run([QueueTrigger("queue-name", Connection = "queue-connection")] Models.Message message, TraceWriter log)
{
    log.Info($"C# Queue trigger function processed: {message}");
}
I noticed that when I stop the whole functions app it stops processing messages off the queue, but I was hoping that I could disable queue processing temporarily without stopping the whole function app. How does one do that?
Thanks!
Disabling a V1 function created in Visual Studio does not work from the Azure portal. You should use the Disable attribute instead:
https://learn.microsoft.com/en-us/azure/azure-functions/disable-function#functions-1x---c-class-libraries
(see important section)
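Based on that doc, the attribute-based approach for a V1 C# class library looks roughly like this (a sketch; the function name, queue names, and app setting name are placeholders):

```csharp
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static class ProcessMessage
{
    [FunctionName("ProcessMessage")]
    // The Disable attribute checks the named app setting;
    // the function is disabled while that setting is set to "true" or "1".
    [Disable("ProcessMessage.Disabled")]
    public static void Run(
        [QueueTrigger("queue-name", Connection = "queue-connection")] string message,
        TraceWriter log)
    {
        log.Info($"C# Queue trigger function processed: {message}");
    }
}
```

Flipping the ProcessMessage.Disabled app setting in the portal then pauses and resumes queue processing without stopping the whole function app.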

Azure Functions - how to set up IoTHubTrigger for my IoTHub messages?

How do I setup and configure an IoTHubTrigger correctly to trigger an Azure Function (C#) for my IoTHub messages? Where and how do I plug in my IoTHub's connection string?
Steps using Visual Studio 2017:
First make sure you have the latest version of the Azure Functions and Web Jobs Tools
Go to File->New->Project->Azure Functions and select "IoT Hub Trigger"
Select Functions V1 or V2 (learn about their differences here), and enter an arbitrary name that will serve as the key for your connection string configuration.
Open local.settings.json and enter a key/value pair for your connection string:
{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "UseDevelopmentStorage=true",
        "AzureWebJobsDashboard": "UseDevelopmentStorage=true",
        "ConnectionString": "<your connection string>"
    }
}
IMPORTANT
If using Functions V1, use your IoTHub connection string obtained from the portal.
If using Functions V2, use your IoTHub's Event Hub-compatible endpoint obtained from the portal.
Now set a breakpoint in your function and hit F5. You will see your messages flowing from IoTHub to your Azure Function (assuming you have devices or simulators connected that are sending data).
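The generated function body then looks roughly like this (a sketch of the V2 template output; "ConnectionString" is the key you entered above, and "messages/events" is the built-in IoTHub endpoint the template uses):

```csharp
using System.Text;
using Microsoft.Azure.EventHubs;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class IoTHubFunction
{
    [FunctionName("IoTHubFunction")]
    public static void Run(
        [IoTHubTrigger("messages/events", Connection = "ConnectionString")] EventData message,
        ILogger log)
    {
        // Each device-to-cloud message arrives as an EventData payload
        log.LogInformation($"C# IoT Hub trigger function processed a message: {Encoding.UTF8.GetString(message.Body.Array)}");
    }
}
```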
Steps using the Azure Portal
Create a new Function App resource and select the EventHub Trigger template
Hit "New" for EventHub Connection and select IotHub and your desired hub
Edit and save your function code - you are now up and running!
Switch to "Monitor" to see your events flowing in.
More options to create IoTHub Trigger Azure Functions
a) Using VS Code with the Azure Functions Extension
b) From the command line using Azure Functions Core Tools
I also needed to install the NuGet package Microsoft.Azure.WebJobs.Extensions.EventHubs.
I would like to add: if you want to publish the function to Azure, you must also add the connection string on the portal side.
Updating the thread for the cloud deployment issue: if you don't find an option to pass the connection string through the publish pop-up window in Visual Studio 2022, you can provide the configuration from the Azure portal UI.
Navigate to the Azure Function resource in the Azure portal and click Configuration under the Settings section. Click the New application setting button at the top, add the connection string parameter name (the same value set to Connection in the Azure Function declaration), and provide the Event Hub endpoint connection string as the value.

Azure Functions - Queue trigger consumes message on fail

This issue only occurs when I use the Azure Portal Editor. If I upload from Visual Studio, this issue does not occur, but I cannot upload from Visual Studio due to this unrelated bug: Azure Functions - only use connection string in Application Settings in cloud for queue trigger.
When using the Azure Portal Editor, if I throw an exception from C# or use context.done(error) from JavaScript, Application Insights shows an error occurred, but the message is simply consumed. The message is not retried, and it does not go to a poison queue.
The same code for C# correctly retries when uploaded from Visual Studio, so I believe this is a configuration issue. I have tried modifying the host.json file for the Azure Portal Editor version to:
{
    "queues": {
        "visibilityTimeout": "00:00:15",
        "maxDequeueCount": 5
    }
}
but the message was still getting consumed instead of retried. How do I fix this so that I can get messages to retry when coding with the Azure Portal Editor?
Notes:
In JavaScript, context.bindingData.dequeueCount returns 0.
Azure Function runtime version: 1.0.11913.0 (~1).
I'm using a Consumption App Plan.
I was using the manual trigger from the Azure Portal Editor, which has different behavior from creating a message in the queue. When I put a message in the queue, the Azure Function worked as expected.
For local development, if your function is async, use Task as the return type:
public async Task Run
instead of void:
public async void Run
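A complete sketch of the Task-returning form, assuming a queue named "myqueue" (an async void function swallows exceptions, so failures never reach the runtime's retry and poison-queue logic; returning Task lets the exception fault the invocation):

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static class ProcessQueueMessage
{
    [FunctionName("ProcessQueueMessage")]
    public static async Task Run(
        [QueueTrigger("myqueue")] string message,
        TraceWriter log)
    {
        // Any exception thrown here now faults the invocation, so the
        // message becomes visible again and is retried up to maxDequeueCount.
        await Task.Delay(100); // placeholder for real async work
        log.Info($"Processed: {message}");
    }
}
```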

Azure Function to listen to any Service Bus topic/subscription

Using the template to create an Azure Function, one can only create a function listening to a particular topic/subscription pair:
{
    "bindings": [
        {
            "name": "mySbMsg",
            "type": "serviceBusTrigger",
            "direction": "in",
            "topicName": "ftopic1",
            "subscriptionName": "mysub",
            "connection": "collosysazfuncsb_RootManageSharedAccessKey_SERVICEBUS",
            "accessRights": "Manage"
        }
    ],
    "disabled": false
}
and then in run.csx you just receive the message
public static void Run(string message, TraceWriter log)
{
    log.Info($"message: {message}");
}
Is there a way to listen to any topic/subscription using an Azure Function, and then receive topicName and subscriptionName as parameters in the Run method? Using * as the topic name does not help, and it also does not provide the topic name in Run.
Azure Functions only allows listening to a single queue or subscription per trigger. Listening to multiple entities (queues or subscriptions) isn't possible because the Azure Service Bus client doesn't support it. Instead, as was pointed out, you could leverage the Auto Forwarding feature of Azure Service Bus: the broker will forward any messages to the destination topic/queue, and you'll have a single queue for the Azure Function to feed off.
It's important to note that auto-forwarded messages will not carry any information that would allow identifying the queue/subscription they originated from. This is only possible with dead-lettered messages.
Since you're interested in topics, you could work around this by adding an action on the rules of your subscriptions that stamps messages with a custom property. For example, with 3 topics, each with a default subscription and a default filter with the rule action
set [x-source] = 'topic-N'
where N is a topic identifier, all auto-forwarded messages would contain an x-source custom property with a value corresponding to the topic they originated from.
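On the receiving side, you could then bind to BrokeredMessage instead of string to read that property (a sketch for a V1 run.csx; the queue wiring stays as in the binding above):

```csharp
#r "Microsoft.ServiceBus"

using Microsoft.ServiceBus.Messaging;

public static void Run(BrokeredMessage message, TraceWriter log)
{
    // x-source was stamped by the subscription's rule action before auto forwarding
    object source;
    message.Properties.TryGetValue("x-source", out source);
    log.Info($"message originated from: {source ?? "unknown"}");
}
```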

How to get all blobs (new or updated) using an Azure Functions timer trigger

I have a requirement where I need to get all the blobs that were updated or added after a particular time.
Example: in a container I have a list of zip files as blobs, and I need to get all the updated or newly added blobs at a given interval, e.g. every hour.
I have used an Azure Function with a timer trigger, but I was not able to get all the (updated or newly added) blobs.
Could anyone let me know how I can solve this problem?
function.json file
{
    "bindings": [
        {
            "name": "myTimer",
            "type": "timerTrigger",
            "direction": "in",
            "schedule": "0 */2 * * * *"
        },
        {
            "type": "blob",
            "name": "myBlob",
            "path": "*****/******.zip",
            "connection": "***************",
            "direction": "in"
        }
    ],
    "disabled": false
}
index.js
module.exports = function (context, myTimer, myBlob) {
    context.log(myBlob);
    // it's also available on context.bindings
    context.log(context.bindings.myBlob); // logs the same thing as above
    context.done();
};
Thanks in Advance.
Ideally, Functions work best if you use them in a reactive way, i.e. when the Function runs on a blob-change event directly (or via Event Grid).
If you have to stick to a timer and then find all changed blobs, Azure Function bindings won't help you. In that case, remove the input binding you were trying to declare and search for changed blobs with the Blob Storage API. I believe the Azure Storage SDK for Node.js supports listing blobs, but I haven't used it.
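For comparison, with the .NET Storage SDK the timer function body could look roughly like this (a sketch only; "MyStorageConnection" and "mycontainer" are placeholder names, and the one-hour cutoff matches the interval in the question):

```csharp
using System;
using System.Linq;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static void Run(TimerInfo myTimer, TraceWriter log)
{
    var account = CloudStorageAccount.Parse(
        Environment.GetEnvironmentVariable("MyStorageConnection"));
    var container = account.CreateCloudBlobClient().GetContainerReference("mycontainer");

    // List all blobs and keep those modified within the last hour
    var cutoff = DateTimeOffset.UtcNow.AddHours(-1);
    var changed = container.ListBlobs(useFlatBlobListing: true)
        .OfType<CloudBlockBlob>()
        .Where(b => b.Properties.LastModified >= cutoff);

    foreach (var blob in changed)
        log.Info($"New/updated blob: {blob.Name}");
}
```

Note that this scans the whole container on every run, which is exactly why the reactive (Event Grid) approach below scales better.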
Your scenario is a good candidate for an Azure Event Grid (now in preview) solution with an event-driven blob storage publisher. More details here.
Basically, there is no limitation on the number of containers, blob storage accounts, or Azure subscriptions. If your blob storage is subscribed for an event of interest, such as a blob being created or deleted, the custom-filtered event message can be delivered to the subscriber, for instance an EventGridTrigger Function.
The original post includes a screen snippet of the event-driven blob storages, and the function logs showing an event message received when a blob was deleted.
Note that the event message sent by the blob storage publisher can be filtered in the subscription based on the subject and/or eventType properties. In other words, each subscription can tell Event Grid its event source interest.
For streaming and analyzing events, Event Grid can also deliver to an Event Hub subscriber.
All events from the source of interest will be ingested into the Event Hub, which represents the entry point of the stream pipeline. The stream of events, such as event messages for created/deleted blobs across accounts and/or Azure subscriptions, is analyzed by an ASA (Stream Analytics) job based on your needs. The output of the ASA job then triggers an Azure Function to complete the business requirements.
More details about the Event Hub as a destination for Event Grid are here.
