Azure Function not working with serviceBusTrigger NodeJs

I have been trying to put a message in an Azure Service Bus topic. My Azure Function test says that the message has been accepted (202), but there is no message on the subscription side. Could you please help me with this POC? Here is my code snippet. It's a sample that was generated by VS Code, and I am using serviceBusTrigger.
const { ServiceBusClient, ReceiveMode } = require("@azure/service-bus");
module.exports = async function(context, mySbMsg) {
context.log('JavaScript ServiceBus topic trigger function processed message', mySbMsg);
context.done();
};
Is there any way that I can check if the service bus topic is working as expected?

I do not see a connection string or a binding associated with the code. In order to save the message to the queue, you need to have the queue connection string in the settings file. Follow the docs:
{
"bindings": [
{
"schedule": "0/15 * * * * *",
"name": "myTimer",
"runsOnStartup": true,
"type": "timerTrigger",
"direction": "in"
},
{
"name": "outputSbQueue",
"type": "serviceBus",
"queueName": "testqueue",
"connection": "MyServiceBusConnection",
"direction": "out"
}
],
"disabled": false
}

Okay, so the thing was, I tried calling the function after pushing messages onto the topic. But the purpose of the ServiceBusTrigger is the reverse: the function with this trigger processes a message when a message arrives on the topic or queue, so there is no need to call the function separately. So, basics: just create the function with your available topic and subscription, and then try putting a message onto the topic. You can see the message in the log of the function.
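For reference, a Service Bus topic trigger needs its own binding in function.json, with the namespace connection string stored as an app setting (in local.settings.json when running locally). A minimal sketch, assuming the topic, subscription, and setting names shown here (replace them with your own):
{
  "bindings": [
    {
      "name": "mySbMsg",
      "type": "serviceBusTrigger",
      "direction": "in",
      "topicName": "mytopic",
      "subscriptionName": "mysubscription",
      "connection": "MyServiceBusConnection"
    }
  ]
}
With this in place, dropping a message onto the topic should invoke the function, and the message should appear in its logs.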

Related

Azure Event Grid triggers Azure Function App multiple times instead of once

Here is the structure: a data-drift-detected event in the ML workspace sends an event to Event Grid, which triggers a function in an Azure Function App. I want it to run only once after the data drift is detected. However, the function runs every ~20 seconds, several times.
Here is my host.json:
{
"version": "2.0",
"logging": {
"applicationInsights": {
"samplingSettings": {
"isEnabled": true,
"excludedTypes": "Request"
}
}
},
"extensionBundle": {
"id": "Microsoft.Azure.Functions.ExtensionBundle",
"version": "[3.*, 4.0.0)"
}
}
and function.json:
{
"scriptFile": "__init__.py",
"bindings": [
{
"type": "eventGridTrigger",
"name": "event",
"direction": "in"
}
]
}
I tried changing the default options in the "singleton" field in host.json, but nothing changed.
Do you have any idea?
When you create an Event Grid trigger, the event subscription has a retry policy where you can change the maximum delivery attempts to 1.
The Event Grid trigger waits for a response; if it doesn't get one, it delivers the event again until it gets a response, so change the maximum delivery attempts to 1 to only trigger once.
In other words, if Event Grid doesn't get a response, it retries the delivery after some interval of time.
If that is not the case and you are sending responses yet it still triggers, check the responses your endpoint returns so Event Grid does not keep redelivering.
Reference:
Azure Event Grid delivery and retry - Azure Event Grid | Microsoft Learn
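For context, the retry policy lives on the Event Grid event subscription itself, not in host.json. A minimal sketch of the relevant fragment of an event subscription definition (ARM-style properties; the values here are assumptions):
{
  "properties": {
    "retryPolicy": {
      "maxDeliveryAttempts": 1,
      "eventTimeToLiveInMinutes": 1440
    }
  }
}
With maxDeliveryAttempts set to 1, Event Grid delivers the event once and does not retry if it receives no success response.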

Azure Function not picking up Eventhub events

I started playing around with Azure Functions and am running into the issue that my Function is not being triggered by events entering my Event Hub.
This is the code for my Function:
host.json:
"version": "2.0",
"logging": {
"applicationInsights": {
"samplingSettings": {
"isEnabled": true,
"excludedTypes": "Request"
}
}
},
"extensionBundle": {
"id": "Microsoft.Azure.Functions.ExtensionBundle",
"version": "[2.*, 3.0.0)"
}
}
function.json:
"scriptFile": "__init__.py",
"bindings": [
{
"type": "eventHubTrigger",
"name": "events",
"direction": "in",
"eventHubName": "eventhub",
"connection": "eventhub_connection",
"cardinality": "many",
"consumerGroup": "$Default",
"dataType": "stream"
}
]
}
__init__.py:
from typing import List
import logging
import azure.functions as func

def main(events: List[func.EventHubEvent]):
    for event in events:
        logging.info('Python EventHub trigger processed an event: %s',
                     event.get_body().decode('utf-8'))
        logging.info(f'Function triggered to process a message: {event.get_body().decode()}')
        logging.info(f' EnqueuedTimeUtc = {event.enqueued_time}')
        logging.info(f' SequenceNumber = {event.sequence_number}')
        logging.info(f' Offset = {event.offset}')

# def main(event: func.EventHubEvent):
#     logging.info(f'Function triggered to process a message: {event.get_body().decode()}')
#     logging.info(f' EnqueuedTimeUtc = {event.enqueued_time}')
#     logging.info(f' SequenceNumber = {event.sequence_number}')
#     logging.info(f' Offset = {event.offset}')
#     # Metadata
#     for key in event.metadata:
#         logging.info(f'Metadata: {key} = {event.metadata[key]}')
"IsEncrypted": false,
"Values": {
"FUNCTIONS_WORKER_RUNTIME": "python",
"AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=storageaccount;AccountKey=storageacciuntaccesskey=;EndpointSuffix=core.windows.net",
"eventhub_connection": "Endpoint=sb://eventhub01.servicebus.windows.net/;SharedAccessKeyName=function;SharedAccessKey=0omitted;EntityPath=eventhub"
}
}
I started out with the basic Event Hub Python code provided by the Azure Functions Core Tools, and have been testing different pieces of code found in online examples from people's blogs and the Microsoft docs.
When switching to cardinality: one, I switch to the code which is currently commented out. I don't know if that is how it's supposed to go; it just feels right to me.
In any case, regardless of the cardinality setting, or the dataType being changed between binary, stream, or string, my Function simply does not trigger.
I can query my eventhub and see/read the events. So I know my policy, and the sharedkey and such, work fine. I am also only using the $Default consumer group.
I also tried setting up a HTTP triggered function, and this function gets triggered from Azure Monitor. I can see in the logs each request entering the function.
Am I doing something wrong in the code for my eventhub function?
Am I missing some other configuration setting perhaps? I already checked the access rules on the function, but that really doesn't matter, does it? The function is pulling the event from the Event Hub; it's not being sent data by an initiator.
Edit: Added the local.settings.json file configuration and updated the function.json
Edit 2: solution to my specific issue is in the comments of the answer.
Update:
__init__.py of the function:
from typing import List
import logging
import azure.functions as func

def main(events: List[func.EventHubEvent]):
    for event in events:
        logging.info('Python EventHub trigger processed an event: %s',
                     event.get_body().decode('utf-8'))
Send message to event hub:
import asyncio
from azure.eventhub.aio import EventHubProducerClient
from azure.eventhub import EventData

async def run():
    # Create a producer client to send messages to the event hub.
    # Specify a connection string to your event hubs namespace and
    # the event hub name.
    producer = EventHubProducerClient.from_connection_string(conn_str="Endpoint=sb://testbowman.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=xxxxxx;EntityPath=test", eventhub_name="test")
    async with producer:
        # Create a batch.
        event_data_batch = await producer.create_batch()

        # Add events to the batch.
        event_data_batch.add(EventData('First event '))
        event_data_batch.add(EventData('Second event'))
        event_data_batch.add(EventData('Third event'))

        # Send the batch of events to the event hub.
        await producer.send_batch(event_data_batch)

loop = asyncio.get_event_loop()
loop.run_until_complete(run())
And please make sure you give the right event hub name.
It seems your function.json has a problem: the connection string should not be put directly in the binding item.
It should be like below:
function.json
{
"scriptFile": "__init__.py",
"bindings": [
{
"type": "eventHubTrigger",
"name": "events",
"direction": "in",
"eventHubName": "test",
"connection": "testbowman_RootManageSharedAccessKey_EVENTHUB",
"cardinality": "many",
"consumerGroup": "$Default",
"dataType": "binary"
}
]
}
local.settings.json
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=0730bowmanwindow;AccountKey=xxxxxx;EndpointSuffix=core.windows.net",
"FUNCTIONS_WORKER_RUNTIME": "python",
"testbowman_RootManageSharedAccessKey_EVENTHUB": "Endpoint=sb://testbowman.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=xxxxxx;EntityPath=test"
}
}
Check the configuration of the function app and the Event Hub: the pre-warmed instance count of the function app should be less than or equal to the partition count of the Event Hub. This worked for me, and I was able to receive events properly after this configuration.
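As a rough illustration of that last point only (the property path follows the Microsoft.Web/sites ARM schema, and the value is an assumption), the pre-warmed instance count of a Premium-plan function app can be pinned in its site configuration so that it stays at or below the Event Hub's partition count:
{
  "properties": {
    "siteConfig": {
      "preWarmedInstanceCount": 1
    }
  }
}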

Setting Azure Function Service Bus Topic and Subscription Output Binding via C#

I have a simple HTTP trigger Azure Function with multiple Service Bus output bindings. All the bindings point to the same topic, but they have different subscriptions. If I were to set up this function through function.json, it would be pretty straightforward:
{
"bindings": [
{
"authLevel": "function",
"name": "req",
"type": "httpTrigger",
"direction": "in",
"methods": [
"get",
"post"
]
},
{
"name": "$return",
"type": "http",
"direction": "out"
},
{
"type": "serviceBus",
"connection": "SERVICEBUS",
"name": "output",
"topicName": "outtopic",
"subscriptionName": "sub",
"direction": "out"
},
{
"type": "serviceBus",
"connection": "SERVICEBUS",
"name": "output",
"topicName": "outtopic",
"subscriptionName": "sub2",
"direction": "out"
}
],
"disabled": false
}
But I publish my functions via Visual Studio, and therefore my Azure Functions are read-only in the portal and function.json is automatically generated by VS upon publishing.
The problem is that I cannot figure out how to set up multiple output bindings pointing to different subscriptions. Currently I have something like this:
[FunctionName("Function2")]
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
[ServiceBus("outtopic", entityType:EntityType.Topic)] IAsyncCollector<string> output,
[ServiceBus("outtopic", entityType: EntityType.Topic)] IAsyncCollector<string> output2,
ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
await output.AddAsync(requestBody);
return new OkObjectResult("OK");
}
As you can see, output and output2 point to the same topic, but there is no option to specify a subscription.
At this point I am pretty confident that this has not been implemented yet, but I hope there is a workaround.
Try this: add the Connection property in the binding definition, per this example (if by subscription you mean the Azure subscription):
public static void Run([BlobTrigger("inputvideo/{customername}/{date}/{filename}", Connection = "AzureWebJobsStorage")]Stream myBlob,
string customername,
string date,
string filename,
[ServiceBus("detectobjectsqueue",EntityType.Queue, Connection="ServiceBusConnectionString")] IAsyncCollector<string> output,
ILogger log)
Update
Per your comment, I understood that by subscription you mean a topic subscription. In that case, the idea of a topic is that all subscriptions receive the message: you have one publisher, and whoever subscribes to the topic receives the message. If you would like to make sure that a specific subscriber receives the message, either implement message filtering (per type, for example) on the receiving endpoint or use a dedicated queue per subscriber.
Also, conceptually, the publisher should not know who the subscribers are, and the subscribers should not know who the publisher is. If you know who the subscriber is, why not use, for example, a REST call to the receiving endpoint?
It is not possible to put messages directly into a topic subscription; rather, every message has to come through the topic.
To make sure only a particular subscription receives a message, you need to configure a topic subscription rule. You can read more about rules in the blog post here.
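For illustration, a subscription rule with a SQL filter can be declared alongside the subscription, for example in an ARM template. A minimal sketch, assuming the topic and subscription names from the question, a hypothetical namespace name, and a hypothetical MessageType application property set by the sender:
{
  "type": "Microsoft.ServiceBus/namespaces/topics/subscriptions/rules",
  "apiVersion": "2021-11-01",
  "name": "mynamespace/outtopic/sub2/OnlySub2Messages",
  "properties": {
    "filterType": "SqlFilter",
    "sqlFilter": {
      "sqlExpression": "MessageType = 'sub2'"
    }
  }
}
The sender would then set that MessageType application property on each outgoing message, and only the subscription whose rule matches receives it.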

Azure IoT hub cloud to device message using Stream Analytics

I am sending data through the path as below.
Android -> IoT Hub -> Stream Analytics -> SQL
I am calling a machine learning function in the Stream Analytics query. Now I want to return the result of the machine learning to the Android device. Receiving cloud-to-device messages on the Android device is already done and tested with Device Explorer. I am looking at the official tutorials, 1, 2, but still have no clue how to send a cloud-to-device message using Stream Analytics. It was suggested to use a service bus and a function app, but without the details. I am new to Azure and hoping someone will give me some guidance or a link so I can understand more about how to implement this. Thanks in advance.
You can use an Azure Function (Preview) output in an ASA job to send a cloud-to-device message via the Azure IoT Hub service-facing endpoint.
The following is an example of such a function.
run.csx:
#r "Newtonsoft.Json"
using System.Configuration;
using System.Text;
using System.Net;
using Microsoft.Azure.Devices;
using Newtonsoft.Json;
// create proxy
static Microsoft.Azure.Devices.ServiceClient client = ServiceClient.CreateFromConnectionString(ConfigurationManager.AppSettings["myIoTHub"]);
public static async Task<HttpResponseMessage> Run(string input, HttpRequestMessage req, TraceWriter log)
{
log.Info($"ASA Job: {input}");
var data = JsonConvert.DeserializeAnonymousType(input, new[] { new { xyz = "", IoTHub = new { ConnectionDeviceId = ""}}});
if(!string.IsNullOrEmpty(data[0]?.IoTHub?.ConnectionDeviceId))
{
string deviceId = data[0].IoTHub.ConnectionDeviceId;
log.Info($"Device: {deviceId}");
// cloud-to-device message
var msg = JsonConvert.SerializeObject(new { temp = 20.5 });
var c2dmsg = new Microsoft.Azure.Devices.Message(Encoding.ASCII.GetBytes(msg));
// send AMQP message
await client.SendAsync(deviceId, c2dmsg);
}
return req.CreateResponse(HttpStatusCode.NoContent);
}
function.json:
{
"bindings": [
{
"authLevel": "function",
"name": "input",
"type": "httpTrigger",
"direction": "in"
},
{
"name": "$return",
"type": "http",
"direction": "out"
}
],
"disabled": false
}
project.json:
{
"frameworks": {
"net46":{
"dependencies": {
"Microsoft.Azure.Devices": "1.3.2"
}
}
}
}
Appendix A
For testing purposes, take the following steps:
Add the setting myIoTHub in your Azure Function App for your connection string to the Azure IoT Hub.
Create an HttpTrigger function; let's call it HttpASA_1.
Update the run.csx, function.json, and project.json files with the above contents.
Use the following sample for the test request body:
[
{
"time": "2017-11-26T12:52:23.4292501Z",
"counter": 57,
"windSpeed": 8.7358,
"temperature": 16.63,
"humidity": 79.42,
"EventProcessedUtcTime": "2017-11-26T12:52:21.3568252Z",
"PartitionId": 2,
"EventEnqueuedUtcTime": "2017-11-26T12:52:22.435Z",
"IoTHub": {
"MessageId": null,
"CorrelationId": null,
"ConnectionDeviceId": "Device1",
"ConnectionDeviceGenerationId": "636189812948967054",
"EnqueuedTime": "2017-11-26T12:52:21.562Z",
"StreamId": null
}
}
]
Change the value Device1 to your actual deviceId.
Now the AF is ready to test. Press the Run button to run the sample and watch the log output.
You should see a C2D message sent by the AF on your device, or you can download a small tester, Azure IoT Hub Tester, to simulate your MQTT devices.
Now we can go to the ASA job to invoke this HttpASA_1 function. Note that the ASA job can invoke only an HttpTrigger function; the next step is adding an output for our Azure Function.
You should see this function in the combo box once you have selected your subscription in the Import option combo box. Once you are done, press the Save button and watch the notification message on the screen. The ASA will send a validation message to your AF, and its response status should be a 2xx code.
Finally, you can go to the Query and generate an output for your AF, for example a simple query that sends all telemetry data to the AF.
Note that inpsim is my IoT Hub input.
The ASA outputs the payload for the HttpTrigger function in the following format; see my example:
[
{
"time": "2017-11-26T12:52:23.4292501Z",
"counter": 57,
"windSpeed": 8.7358,
"temperature": 16.63,
"humidity": 79.42,
"EventProcessedUtcTime": "2017-11-26T12:52:21.3568252Z",
"PartitionId": 2,
"EventEnqueuedUtcTime": "2017-11-26T12:52:22.435Z",
"IoTHub": {
"MessageId": null,
"CorrelationId": null,
"ConnectionDeviceId": "Device1",
"ConnectionDeviceGenerationId": "636189812948967054",
"EnqueuedTime": "2017-11-26T12:52:21.562Z",
"StreamId": null
}
}
]
Note that my telemetry data are counter, temperature, humidity, and the timestamp time; the other properties are created implicitly by the ASA job. Based on the query, you can create any business properties for the C2D message.

Azure Function App: Can't bind Queue to type 'Microsoft.WindowsAzure.Storage.Queue.CloudQueue' (IBinder)

My scenario for an Azure function:
HTTP trigger.
Based on HTTP parameters I want to read messages from an appropriate storage queue and return data back.
Here is the code of the function (F#):
let Run(request: string, customerId: int, userName: string, binder: IBinder) =
    let subscriberKey = sprintf "%i-%s" customerId userName
    let attribute = new QueueAttribute(subscriberKey)
    let queue = binder.Bind<CloudQueue>(attribute)
    () //TODO: read messages from the queue
The compilation succeeds (with proper NuGet references and opening packages), but I get the runtime exception:
Microsoft.Azure.WebJobs.Host:
Can't bind Queue to type 'Microsoft.WindowsAzure.Storage.Queue.CloudQueue'.
My code is based on an example from this article.
What am I doing wrong?
Update: now I realize I haven't specified Connection Name anywhere. Do I need a binding for the IBinder-based queue access?
Update 2: my function.json file:
{
"bindings": [
{
"type": "httpTrigger",
"name": "request",
"route": "retrieve/{customerId}/{userName}",
"authLevel": "function",
"methods": [
"get"
],
"direction": "in"
}
],
"disabled": false
}
I suspect that you're having versioning issues because you're bringing in a conflicting version of the Storage SDK. Instead, use the built-in one (without bringing in any NuGet packages). This code works with no project.json:
#r "Microsoft.WindowsAzure.Storage"
open Microsoft.Azure.WebJobs;
open Microsoft.WindowsAzure.Storage.Queue;
let Run(request: string, customerId: int, userName: string, binder: IBinder) =
async {
let subscriberKey = sprintf "%i-%s" customerId userName
let attribute = new QueueAttribute(subscriberKey)
let! queue = binder.BindAsync<CloudQueue>(attribute) |> Async.AwaitTask
() //TODO: read messages from the queue
} |> Async.RunSynchronously
This will bind to the default storage account (the one we created for you when your Function App was created). If you want to point to a different storage account, you'll need to create an array of attributes, and include a StorageAccountAttribute that points to your desired storage account (e.g. new StorageAccountAttribute("your_storage")). You then pass this array of attributes (with the queue attribute first in the array) into the BindAsync overload that takes an attribute array. See here for more details.
However, if you don't need to do any sophisticated parsing/formatting to form the queue name, I don't think you even need to use Binder for this. You can bind to the queue completely declaratively. Here's the function.json and code:
{
"bindings": [
{
"type": "httpTrigger",
"name": "request",
"route": "retrieve/{customerId}/{userName}",
"authLevel": "function",
"methods": [
"get"
],
"direction": "in"
},
{
"type": "queue",
"name": "queue",
"queueName": "{customerId}-{userName}",
"connection": "<your_storage>",
"direction": "in"
}
]
}
And the function code:
#r "Microsoft.WindowsAzure.Storage"
open Microsoft.Azure.WebJobs;
open Microsoft.WindowsAzure.Storage.Queue;
let Run(request: string, queue: CloudQueue) =
async {
() //TODO: read messages from the queue
} |> Async.RunSynchronously
