I started playing around with Azure Functions and am running into the issue that my Function is not being triggered by events arriving in my Event Hub.
This is the code for my Function:
host.json:
{
"version": "2.0",
"logging": {
"applicationInsights": {
"samplingSettings": {
"isEnabled": true,
"excludedTypes": "Request"
}
}
},
"extensionBundle": {
"id": "Microsoft.Azure.Functions.ExtensionBundle",
"version": "[2.*, 3.0.0)"
}
}
function.json:
{
"scriptFile": "__init__.py",
"bindings": [
{
"type": "eventHubTrigger",
"name": "events",
"direction": "in",
"eventHubName": "eventhub",
"connection": "eventhub_connection",
"cardinality": "many",
"consumerGroup": "$Default",
"dataType": "stream"
}
]
}
__init__.py:
from typing import List
import logging
import azure.functions as func
def main(events: List[func.EventHubEvent]):
for event in events:
logging.info('Python EventHub trigger processed an event: %s',
event.get_body().decode('utf-8'))
logging.info(f'Function triggered to process a message: {event.get_body().decode()}')
logging.info(f' EnqueuedTimeUtc = {event.enqueued_time}')
logging.info(f' SequenceNumber = {event.sequence_number}')
logging.info(f' Offset = {event.offset}')
# def main(event: func.EventHubEvent):
# logging.info(f'Function triggered to process a message: {event.get_body().decode()}')
# logging.info(f' EnqueuedTimeUtc = {event.enqueued_time}')
# logging.info(f' SequenceNumber = {event.sequence_number}')
# logging.info(f' Offset = {event.offset}')
# # Metadata
# for key in event.metadata:
# logging.info(f'Metadata: {key} = {event.metadata[key]}')
"IsEncrypted": false,
"Values": {
"FUNCTIONS_WORKER_RUNTIME": "python",
"AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=storageaccount;AccountKey=storageacciuntaccesskey=;EndpointSuffix=core.windows.net",
"eventhub_connection": "Endpoint=sb://eventhub01.servicebus.windows.net/;SharedAccessKeyName=function;SharedAccessKey=0omitted;EntityPath=eventhub"
}
}
I started out with the basic Event Hub Python code provided by the Azure Functions Core Tools, and I have been testing different pieces of code found in online examples from people's blogs and the Microsoft docs.
When switching to cardinality: one, I switch to the code that is currently commented out. I don't know whether that is how it is supposed to work; it just feels right to me.
In any case, regardless of the cardinality setting, or the dataType being changed between binary, stream, or string, my Function simply does not trigger.
I can query my Event Hub and see/read the events, so I know my policy, shared key, and so on work fine. I am also only using the $Default consumer group.
I also tried setting up an HTTP-triggered function, and that function does get triggered from Azure Monitor; I can see each request entering the function in the logs.
Am I doing something wrong in the code for my Event Hub function?
Am I missing some other configuration setting, perhaps? I already checked the access rules on the function, but that really shouldn't matter, should it? The function is pulling the events from the Event Hub; it isn't being sent data by an initiator.
Edit: Added the local.settings.json file configuration and updated the function.json
Edit 2: solution to my specific issue is in the comments of the answer.
Update:
__init__.py of the function:
from typing import List
import logging
import azure.functions as func
def main(events: List[func.EventHubEvent]):
for event in events:
logging.info('Python EventHub trigger processed an event: %s',
event.get_body().decode('utf-8'))
Send message to event hub:
import asyncio
from azure.eventhub.aio import EventHubProducerClient
from azure.eventhub import EventData
async def run():
# Create a producer client to send messages to the event hub.
# Specify a connection string to your event hubs namespace and
# the event hub name.
producer = EventHubProducerClient.from_connection_string(conn_str="Endpoint=sb://testbowman.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=xxxxxx;EntityPath=test", eventhub_name="test")
async with producer:
# Create a batch.
event_data_batch = await producer.create_batch()
# Add events to the batch.
event_data_batch.add(EventData('First event '))
event_data_batch.add(EventData('Second event'))
event_data_batch.add(EventData('Third event'))
# Send the batch of events to the event hub.
await producer.send_batch(event_data_batch)
loop = asyncio.get_event_loop()
loop.run_until_complete(run())
And please make sure you give the right event hub name:
It seems your function.json has a problem: the connection string should not be put directly in the binding item.
It should be like below:
function.json
{
"scriptFile": "__init__.py",
"bindings": [
{
"type": "eventHubTrigger",
"name": "events",
"direction": "in",
"eventHubName": "test",
"connection": "testbowman_RootManageSharedAccessKey_EVENTHUB",
"cardinality": "many",
"consumerGroup": "$Default",
"dataType": "binary"
}
]
}
local.settings.json
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=0730bowmanwindow;AccountKey=xxxxxx;EndpointSuffix=core.windows.net",
"FUNCTIONS_WORKER_RUNTIME": "python",
"testbowman_RootManageSharedAccessKey_EVENTHUB": "Endpoint=sb://testbowman.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=xxxxxx;EntityPath=test"
}
}
Check the configuration of the Function App and the Event Hub: the pre-warmed instance count of the Function App should be less than or equal to the partition count of the Event Hub. This worked for me, and I was able to receive events properly after this configuration change.
Related
Here is the setup: a data-drift-detected event in the ML workspace sends an event into Event Grid, which triggers a function in an Azure Function App. I want it to run only once after the data drift detection. However, I got this:
image
It runs every ~20s, a few times :/
Here is my host.json:
{
"version": "2.0",
"logging": {
"applicationInsights": {
"samplingSettings": {
"isEnabled": true,
"excludedTypes": "Request"
}
}
},
"extensionBundle": {
"id": "Microsoft.Azure.Functions.ExtensionBundle",
"version": "[3.*, 4.0.0)"
}
}
and function.json:
{
"scriptFile": "__init__.py",
"bindings": [
{
"type": "eventGridTrigger",
"name": "event",
"direction": "in"
}
]
}
I tried changing the default options in the "singleton" field in host.json, but nothing changed.
Do you have any idea?
When you create an Event Grid subscription for the trigger, you can configure its retry policy; set the maximum number of delivery attempts to 1 so the event is only delivered once.
Event Grid waits for a response from the endpoint; if it doesn't get one, it retries the delivery at intervals until it does. So if your function doesn't return a success response promptly, Event Grid keeps triggering it.
Either limit the retry policy to a single attempt, or make sure your endpoint returns a success response so Event Grid stops redelivering the event.
References taken from:
Azure Event Grid delivery and retry - Azure Event Grid | Microsoft Learn
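As a sanity check, a minimal Event Grid-triggered Python function that completes cleanly (and therefore reports success, so Event Grid does not retry the delivery) could look like this sketch:
import logging
import azure.functions as func

def main(event: func.EventGridEvent):
    logging.info('Event Grid event received: id=%s, type=%s',
                 event.id, event.event_type)
    # Finishing without raising an exception signals success to the Functions
    # host, so Event Grid does not redeliver the event.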
I want to create an Azure Function using Python that will read data from Azure Event Hub.
Fortunately, Visual Studio Code provides a way to create an Azure Functions skeleton that can be edited according to the requirements.
I am able to create a demo HTTP-triggered Azure Function with the help of the Microsoft documentation, but I don't know what changes I should make to the function below so that it can read data from the Event Hub and write it to Azure Blob Storage.
Also, I would appreciate it if someone could suggest a blog with more details on Azure Functions and standard practices.
UPDATE:
I tried to update my code based on the suggestion from @Stanley, but it possibly still needs changes.
I have written the following code in my Azure Function.
local.settings.json
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "Storage account connection string",
"FUNCTIONS_WORKER_RUNTIME": "python",
"EventHub_ReceiverConnectionString": "Endpoint Connection String of the EventHubNamespace",
"Blob_StorageConnectionString": "Storage account connection string"
}
}
function.json
{
"scriptFile": "__init__.py",
"bindings": [
{
"authLevel": "function",
"type": "eventHubTrigger",
"direction": "in",
"name": "event",
"eventHubName": "pwo-events",
"connection": "EventHub_ReceiverConnectionString",
"cardinality": "many",
"consumerGroup": "$Default",
"dataType": "binary"
}
]
}
__init__.py
import logging
import azure.functions as func
from azure.storage.blob import BlobClient
storage_connection_string='Storage account connection string'
container_name = ''
def main(event: func.EventHubEvent):
logging.info(f'Function triggered to process a message: {event.get_body().decode()}')
logging.info(f' SequenceNumber = {event.sequence_number}')
logging.info(f' Offset = {event.offset}')
blob_client = BlobClient.from_connection_string(storage_connection_string,container_name,str(event.sequence_number) + ".txt")
blob_client.upload_blob(event.get_body().decode())
The following is a screenshot of my blob container:
After executing the above code, something got written to the blob container,
but instead of a .txt file it got saved in some other format. Also, if I trigger the Azure Function multiple times, the files get overwritten.
I want to perform an append operation instead of an overwrite.
Also, I want to save my file in a user-defined location, for example: container/Year=/month=/date=
Thanks !!
If you want to read data from Azure Event Hub, using the Event Hub trigger will be much easier. This is my test code (it reads the data and writes it into storage):
import logging
import azure.functions as func
from azure.storage.blob import BlobClient
import datetime
storage_connection_string=''
container_name = ''
today = datetime.datetime.today()
def main(event: func.EventHubEvent):
logging.info(f'Function triggered to process a message: {event.get_body().decode()}')
logging.info(f' SequenceNumber = {event.sequence_number}')
logging.info(f' Offset = {event.offset}')
blob_client = BlobClient.from_connection_string(
storage_connection_string,container_name,
str(today.year) +"/" + str(today.month) + "/" + str(today.day) + ".txt")
blob_client.upload_blob(event.get_body().decode(),blob_type="AppendBlob")
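If the Year=/month=/date= layout mentioned in the question is wanted, the blob name could be built along the lines of the following sketch (the exact prefix format, connection string, and container name are placeholders/assumptions):
import datetime
import azure.functions as func
from azure.storage.blob import BlobClient

storage_connection_string = '<storage account connection string>'  # placeholder
container_name = '<container name>'  # placeholder

def main(event: func.EventHubEvent):
    today = datetime.datetime.utcnow()
    # Partition-style path, e.g. Year=2022/month=07/date=15/events.txt
    blob_name = f"Year={today.year}/month={today.month:02d}/date={today.day:02d}/events.txt"
    blob_client = BlobClient.from_connection_string(
        storage_connection_string, container_name, blob_name)
    # AppendBlob so repeated triggers append instead of overwriting.
    blob_client.upload_blob(event.get_body().decode(), blob_type="AppendBlob")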
I use the code below to send events to the event hub:
import asyncio
from azure.eventhub.aio import EventHubProducerClient
from azure.eventhub import EventData
async def run():
# Create a producer client to send messages to the event hub.
# Specify a connection string to your event hubs namespace and
# the event hub name.
producer = EventHubProducerClient.from_connection_string(conn_str="<conn string>", eventhub_name="<hub name>")
async with producer:
# Create a batch.
event_data_batch = await producer.create_batch()
# Add events to the batch.
event_data_batch.add(EventData('First event '))
event_data_batch.add(EventData('Second event'))
event_data_batch.add(EventData('Third event'))
# Send the batch of events to the event hub.
await producer.send_batch(event_data_batch)
loop = asyncio.get_event_loop()
loop.run_until_complete(run())
My local.settings.json:
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "<storage account conn str>",
"FUNCTIONS_WORKER_RUNTIME": "python",
"testhubname0123_test_EVENTHUB": "<event hub conn str>"
}
}
My function.json is just as this doc indicated:
{
"scriptFile": "__init__.py",
"bindings": [{
"type": "eventHubTrigger",
"name": "event",
"direction": "in",
"eventHubName": "test01(this is my hubname, pls palce yours here)",
"connection": "testhubname0123_test_EVENTHUB"
}]
}
Result
Run the function and send data to the event hub using the code above:
Data has been saved into storage successfully:
Downloading the .txt file and checking its content, we can see that the content of the three events has been written:
Building on an earlier question: the following code is an HTTP trigger that listens to GIS layer edits and updates, and it logs the URL payload into the queue. I do not want the payload in the queue but a specific, repetitive message, so that it is overwritten every time, because I do not want to dequeue every now and then. How can I go about this?
import logging
import azure.functions as func
def main(req: func.HttpRequest,msg: func.Out[str]) -> func.HttpResponse:
logging.info('Python HTTP trigger function processed a request.')
input_msg = req.params.get('message')
logging.info(input_msg)
msg.set(req.get_body())
return func.HttpResponse(
"This is a test.",
status_code=200
)
function.json
{
"scriptFile": "__init__.py",
"bindings": [
{
"authLevel": "anonymous",
"type": "httpTrigger",
"direction": "in",
"name": "req",
"methods": [
"get",
"post"
]
},
{
"type": "http",
"direction": "out",
"name": "$return"
},
{
"type": "queue",
"direction": "out",
"name": "msg",
"queueName": "outqueue1",
"connection": "AzureStorageQueuesConnectionString"
}
]
}
I do not want the payload loaded but a specific repetitive message so
that it is overwritten everytime for I do not want to dequeue every
now and then.
No. When you put in the same message, it will not be overwritten; it just gets queued in the queue storage.
If you want to process the messages in the queue, just use a QueueClient, or use the queue trigger of Azure Functions (the queue trigger is based on the queue client; they are basically the same).
This is the API reference of queue:
https://learn.microsoft.com/en-us/python/api/azure-storage-queue/azure.storage.queue?view=azure-python
You can use it to process messages in the queue with Python code.
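For example, with the azure-storage-queue (v12) package, draining the outqueue1 queue used by the binding above could look roughly like this (the connection string is a placeholder):
from azure.storage.queue import QueueClient

# Placeholder connection string; the queue name matches the output binding above.
queue_client = QueueClient.from_connection_string(
    "<storage account connection string>", "outqueue1")

for message in queue_client.receive_messages():
    print(message.content)
    # Delete the message once it has been processed.
    queue_client.delete_message(message)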
And this is the queue trigger for Azure Functions (it is already integrated and can be used directly):
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-queue-trigger?tabs=python
I have been trying to put a message on an Azure Service Bus topic. My Azure Function test says that the message has been accepted (202), but there is no message on the subscription side. Could you please help me with this POC? Here is my code snippet; it's the sample generated by VS Code, and I am using a serviceBusTrigger.
const { ServiceBusClient, ReceiveMode } = require("@azure/service-bus");
module.exports = async function(context, mySbMsg) {
context.log('JavaScript ServiceBus topic trigger function processed message', mySbMsg);
context.done();
};
Is there any way that I can check if the service bus topic is working as expected?
I do not see a connection string or a binding associated with the code. In order to save the message to the queue, you need to have the queue connection string in the settings file. Follow the docs:
{
"bindings": [
{
"schedule": "0/15 * * * * *",
"name": "myTimer",
"runsOnStartup": true,
"type": "timerTrigger",
"direction": "in"
},
{
"name": "outputSbQueue",
"type": "serviceBus",
"queueName": "testqueue",
"connection": "MyServiceBusConnection",
"direction": "out"
}
],
"disabled": false
}
Okay, so the thing was, I tried calling the function after pushing messages onto the topic, but the purpose of a ServiceBusTrigger is the reverse: the function with this trigger processes a message when there is a message on the queue or topic. There is no need to call the function separately. So, basically, just create the function with your available topic and subscription, and then try putting a message onto the topic. You can see the message in the function's log.
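For example, to push a test message onto the topic from outside the function, a short sketch using the azure-servicebus (v7) Python SDK could look like this (connection string and topic name are placeholders):
from azure.servicebus import ServiceBusClient, ServiceBusMessage

conn_str = "<service bus namespace connection string>"  # placeholder
topic_name = "<topic name>"  # placeholder

with ServiceBusClient.from_connection_string(conn_str) as client:
    with client.get_topic_sender(topic_name=topic_name) as sender:
        # Once this message lands on the topic, the serviceBusTrigger
        # function should fire and log it.
        sender.send_messages(ServiceBusMessage("test message"))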
I have an Azure Python function: it gets triggered by HTTP, responds with an HTTP response, and puts a message in an Azure Service Bus queue.
function.json for the outbound Azure Service Bus binding:
{
"type": "serviceBus",
"connection": "ServiceBus",
"name": "outputMessage",
"queueName": "testqueue",
"accessRights": "send",
"direction": "out"
}
I have the function defined as:
def main(req: func.HttpRequest, outputMessage: func.Out[func.ServiceBusMessage]) -> str:
I get the error below:
Result: Failure
Exception: FunctionLoadError: cannot load the HttpTrg function: type of outputMessage binding in function.json "serviceBus" does not match its Python annotation "ServiceBusMessage"
Question:
1. What should the Python annotation be for an Azure Service Bus output binding?
def main( , outputMessage: func.Out[func.ServiceBusMessage])
2. Can I keep -> str for Azure Service Bus?
func.Out[func.ServiceBusMessage]) -> str
3. Can I use the set method to send the message, like:
outputMessage.set(text)
"To produce multiple outputs, use the set() method provided by the azure.functions.Out interface to assign a value to the binding" -> will this work?
Thanks
Sandeep
Another example of doing an output binding to Service Bus with Python:
// function.json
{
"scriptFile": "__init__.py",
"bindings": [
{
"authLevel": "function",
"type": "httpTrigger",
"direction": "in",
"name": "req",
"methods": [
"get",
"post"
]
},
{
"type": "http",
"direction": "out",
"name": "$return"
},
{
"type": "serviceBus",
"direction": "out",
"connection": "AzureWebJobsServiceBusConnectionString",
"name": "msg",
"queueName": "outqueue"
}
]
}
# __init__.py
import azure.functions as func
def main(req: func.HttpRequest, msg: func.Out[str]) -> func.HttpResponse:
msg.set(req.get_body())
return 'OK'
When you say Azure Service Queue, you probably mean Azure Storage Queue.
The serviceBus binding is specifically for Service Bus, not Storage queues.
For a Storage queue, you need to use the queue binding type (with a queueName) instead.
Here's an example of output binding. Firstly, we have the function.json file:
{
"scriptFile": "__init__.py",
"bindings": [
{
"authLevel": "function",
"type": "httpTrigger",
"direction": "in",
"name": "req",
"methods": [
"get",
"post"
]
},
{
"type": "http",
"direction": "out",
"name": "$return"
},
{
"type": "queue",
"direction": "out",
"name": "msg",
"queueName": "outqueue",
"connection": "AzureWebJobsStorage"
}
]
}
Now we can use it like so:
import logging
import azure.functions as func
def main(req: func.HttpRequest, msg: func.Out[func.QueueMessage]) -> str:
name = req.params.get('name')
if not name:
try:
req_body = req.get_json()
except ValueError:
pass
else:
name = req_body.get('name')
if name:
msg.set(name)
return func.HttpResponse(f"Hello {name}!")
else:
return func.HttpResponse(
"Please pass a name on the query string or in the request body",
status_code=400
)
Output Binding with Python to Service Bus
Here's your function.json code:
{
"name": "outputSbQueue",
"type": "serviceBus",
"queueName": "testqueue",
"connection": "MyServiceBusConnection",
"direction": "out"
}
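A matching __init__.py sketch, assuming the binding above is paired with an ordinary HTTP trigger named req and using str as the output annotation (func.ServiceBusMessage is not supported as an output type):
import azure.functions as func

def main(req: func.HttpRequest, outputSbQueue: func.Out[str]) -> func.HttpResponse:
    # set() assigns the message body that the serviceBus output binding sends.
    outputSbQueue.set(req.get_body().decode())
    return func.HttpResponse("Message queued.", status_code=200)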
Also, the following references may be helpful for you:
Azure Function - Python - ServiceBus Output Binding - Setting Custom Properties
https://github.com/yokawasa/azure-functions-python-samples/blob/master/docs/quickstart-v2-python-functions.md
https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-python
https://azure.microsoft.com/en-us/blog/taking-a-closer-look-at-python-support-for-azure-functions/
https://unofficialism.info/posts/quick-start-with-azure-function-v2-python-preview/
Sending messages to service bus queue without an explicit binding
from azure.servicebus import QueueClient, Message
# Create the QueueClient
queue_client = QueueClient.from_connection_string(
"<CONNECTION STRING>", "<QUEUE NAME>")
# Send a test message to the queue
msg = Message(b'Test Message')
queue_client.send(msg)
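The snippet above appears to use the older azure-servicebus (pre-v7) API; with the current v7 SDK the equivalent would be roughly the following (connection string and queue name are placeholders):
from azure.servicebus import ServiceBusClient, ServiceBusMessage

with ServiceBusClient.from_connection_string("<CONNECTION STRING>") as client:
    with client.get_queue_sender(queue_name="<QUEUE NAME>") as sender:
        sender.send_messages(ServiceBusMessage("Test Message"))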
Using the Azure ServiceBusMessage class as an outbound parameter is not supported:
Attributes are not supported by Python.
Use the Azure Service Bus SDK rather than the built-in output binding.
Azure Service Bus output binding for Azure Functions
ServiceBusMessage can only be used for Trigger binding:
import azure.functions as func
def main(msg: func.ServiceBusMessage):
The queue message is available to the function via a parameter typed as func.ServiceBusMessage. The Service Bus message is passed into the function as either a string or JSON object.
Azure Service Bus trigger for Azure Functions
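For reference, a minimal function.json for such a Service Bus queue trigger might look like this sketch (queue name and connection setting name are placeholders):
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "type": "serviceBusTrigger",
      "direction": "in",
      "name": "msg",
      "queueName": "<queue name>",
      "connection": "AzureWebJobsServiceBusConnectionString"
    }
  ]
}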