I was listening to this talk, which says that Event Hubs is compatible with the Kafka protocol, and that if an app is writing to or reading from Kafka topics, it's possible to use an Event Hubs broker in place of a Kafka broker.
But does that also mean that we can use Kafka connectors with Event Hubs? For example, if I want to bring data from a Postgres database into a Kafka topic using the Postgres Kafka connector, can I simply change the broker address to that of an Event Hubs broker to land the data in an Event Hubs topic instead?
Yes, it is possible to use Kafka connectors with Azure Event Hubs endpoints. I am not familiar with the PostgreSQL connector's configuration, but I can point you to this Kafka Connect sample - https://github.com/Azure/azure-event-hubs-for-kafka/tree/master/tutorials/connect. The PostgreSQL connector is most likely configured in a similar way.
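The heart of that sample is the Connect worker configuration, which simply points Kafka Connect at the Event Hubs Kafka endpoint. A sketch, with <NAMESPACE> and the connection string as placeholders (in the tutorial the same SASL settings are repeated under the producer. and consumer. prefixes as well):

```properties
# connect-distributed.properties (excerpt, sketch) - point Kafka Connect at Event Hubs
bootstrap.servers=<NAMESPACE>.servicebus.windows.net:9093
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://<NAMESPACE>.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...";
```

With the worker configured this way, a source connector such as the Postgres one should only need its own connector-specific settings; the broker address and auth come from the worker.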
We have native Kafka running on a Linux server in Azure (not Confluent or Event Hubs). We are trying to create a Kafka trigger for this native instance in an Azure Function app.
The Microsoft documentation only mentions two types of event providers, i.e. Event Hubs and Confluent, and says nothing about native Kafka.
Microsoft documentation : https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-kafka-trigger?tabs=in-process%2Cevent-hubs&pivots=programming-language-csharp
So, is there any way we can connect a native Kafka instance (Kafka trigger) to an Azure Function app?
Looking forward to your support; please let me know if you need additional details.
Tried: searched for samples connecting a native Kafka instance running in a VM inside Azure to an Azure Function app, with no luck.
The documentation only mentions Kafka hosted in Azure Event Hubs or Confluent Cloud, with no information about native Kafka.
Expectation: the Azure Function app should be triggered by the native Kafka instance.
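For what it's worth, the Kafka extension behind that trigger is not limited to Event Hubs or Confluent; those are just the examples the docs walk through. A hedged sketch of what a function.json binding against a plain self-hosted broker might look like (the broker address, topic, and consumer group here are placeholders, not values from the question):

```json
{
  "bindings": [
    {
      "type": "kafkaTrigger",
      "direction": "in",
      "name": "kafkaEvent",
      "brokerList": "10.0.0.4:9092",
      "topic": "orders",
      "consumerGroup": "functions",
      "protocol": "plaintext"
    }
  ]
}
```

Note that, per the linked documentation, the Kafka extension only runs on the Premium and Dedicated (App Service) plans, not on the Consumption plan.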
Upon reading about Azure Event Hubs,
I note that we can send data via:
HTTP(S)
AMQP
Kafka
As I am not an integration (messaging) expert, I have the following questions:
Can I use both AMQP and HTTP(S) to write to the same Event Hub (topic),
and can a single Azure Function subsequently read from that same single Event Hub (topic) regardless of how it was written to?
My understanding is that Kafka will always need a separate Event Hub (topic).
The Azure Event Hubs Kafka-compatible API means that if you send, say, JSON payloads using all three protocols, they all map to the same Event Hub (= topic), and you can then read that Event Hub in Kafka mode, for example.
This is a good read: https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-exchange-events-different-protocols - but I also checked with a more experienced person to confirm.
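To make that concrete, here is a minimal sketch assuming the Python azure-eventhub and confluent-kafka packages, an Event Hub named my-hub, and placeholder connection strings: an event written over AMQP is read back over the Kafka endpoint of the same hub.

```python
from azure.eventhub import EventHubProducerClient, EventData
from confluent_kafka import Consumer

# Write over AMQP using the Event Hubs SDK
producer = EventHubProducerClient.from_connection_string(
    "<event-hubs-connection-string>", eventhub_name="my-hub")
batch = producer.create_batch()
batch.add(EventData('{"temperature": 21}'))
producer.send_batch(batch)
producer.close()

# Read the same hub over the Kafka protocol
consumer = Consumer({
    "bootstrap.servers": "<namespace>.servicebus.windows.net:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    "sasl.username": "$ConnectionString",
    "sasl.password": "<event-hubs-connection-string>",
    "group.id": "$Default",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["my-hub"])  # the Kafka topic name is the Event Hub name
msg = consumer.poll(10.0)
if msg is not None and msg.error() is None:
    print(msg.value().decode())
consumer.close()
```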
Could you please provide some suggestions on consuming Azure Service Bus streaming messages using Python?
I found there is no Spark Structured Streaming source for Azure Service Bus. In that case, can I read the Azure Service Bus messages using the provided Python client, write each message into a Kafka topic, and then apply Spark Structured Streaming to that Kafka topic?
My use case is to consume the Azure Service Bus message stream, transform and write each message into a time-series database (InfluxDB or Prometheus), and show a real-time dashboard of business metrics in Grafana.
I am thinking of reading the Azure Service Bus messages with a Python program that acts as a Kafka producer, writing the data into a Kafka topic, and then consuming that topic from Spark Structured Streaming.
Am I going in the right direction? Any suggestion will be appreciated.
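If the data does land in a Kafka topic, the Spark side is the standard Kafka source. A minimal sketch, with the broker address and topic name as placeholders, assuming the spark-sql-kafka connector package is on the classpath:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sb-metrics").getOrCreate()

# Standard Kafka source for Structured Streaming
df = (spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "<broker>:9092")
      .option("subscribe", "metrics")
      .load())

# Kafka rows arrive as binary key/value; cast before transforming
values = df.selectExpr("CAST(value AS STRING) AS json")
query = values.writeStream.format("console").start()
query.awaitTermination()
```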
Looks like there is no readily available connector, since Service Bus is not designed with this in mind, unlike Event Hubs (which provides the Kafka protocol). But it should be possible to write your own receiver (like this one).
Another alternative would be to immediately forward messages from Service Bus to a compatible source like Event Hubs (or Kafka) using something simple like Azure Functions.
With Azure Functions and its bindings for both Service Bus and Event Hubs / Kafka, you could implement this forwarding service with almost no code. If you prefer, using the Python SDKs for both in your own client will do the trick too, and that client could itself be an Azure Function.
-- From my original answer on Microsoft Q&A
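A rough sketch of such a forwarding client, assuming the azure-servicebus and azure-eventhub Python SDKs and placeholder queue, hub, and connection-string names:

```python
from azure.servicebus import ServiceBusClient
from azure.eventhub import EventHubProducerClient, EventData

sb_client = ServiceBusClient.from_connection_string("<service-bus-connection-string>")
eh_producer = EventHubProducerClient.from_connection_string(
    "<event-hubs-connection-string>", eventhub_name="metrics")

with sb_client, eh_producer:
    receiver = sb_client.get_queue_receiver(queue_name="business-events")
    with receiver:
        for msg in receiver:                  # blocks, yielding messages as they arrive
            batch = eh_producer.create_batch()
            batch.add(EventData(str(msg)))    # str(msg) returns the message body
            eh_producer.send_batch(batch)
            receiver.complete_message(msg)    # settle only after a successful forward
```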
I'm new to MQTT and currently trying to set up MQTT to send data from a gateway device to Azure IoT Hub. The problem I'm facing is that I couldn't figure out how to receive and store data in IoT Hub once I have published my data to the MQTT broker. The textbook way is to subscribe to the MQTT broker using Azure IoT Hub, but how should I do that?
Assume I am testing using a laptop:
Read data stored in a JSON file -> publish to topic "data/device1" -> data stored in Azure IoT Hub
I tried reading the Azure IoT Hub MQTT connection documentation, but it didn't work out for me. Please help.
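For the laptop test described above, the simplest device-side route is the azure-iot-device Python SDK, which speaks MQTT to IoT Hub under the hood. A sketch, with the file name and device connection string as placeholders:

```python
import json
from azure.iot.device import IoTHubDeviceClient, Message

# The SDK connects to IoT Hub over MQTT by default
client = IoTHubDeviceClient.create_from_connection_string("<device-connection-string>")
client.connect()

with open("data.json") as f:
    payload = json.load(f)

# Telemetry lands on IoT Hub's built-in Event Hub-compatible endpoint
client.send_message(Message(json.dumps(payload)))
client.shutdown()
```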
By default, Azure IoT Hub makes incoming telemetry messages available on its Event Hub-compatible endpoint: https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-messages-read-builtin. It does not matter over which protocol (MQTT, AMQP, or HTTPS) you send the messages to IoT Hub - they will all land on that endpoint.
From there you can read the information using HTTPS or AMQP. I would recommend using the Event Hubs SDK, or a stream-processing service like Azure Stream Analytics or Spark Streaming, which support Event Hubs directly.
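A minimal receive sketch with the Python Event Hubs SDK, assuming the Event Hub-compatible connection string and name are copied from the IoT Hub's Built-in endpoints blade:

```python
from azure.eventhub import EventHubConsumerClient

client = EventHubConsumerClient.from_connection_string(
    "<event-hub-compatible-connection-string>",
    consumer_group="$Default",
    eventhub_name="<event-hub-compatible-name>",
)

def on_event(partition_context, event):
    print(event.body_as_str())  # the telemetry you published, whatever the protocol

with client:
    client.receive(on_event=on_event, starting_position="-1")  # "-1" = from the beginning
```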
We would like to connect our producers and consumers to our Kafka-enabled Event Hub using the Confluent Kafka .NET API. This works fine if we keep and pass the SAS keys and secrets, but it would be much better if we could use the Managed Service Identity capabilities of Event Hub.
Using MSI to connect to Event Hubs seems pretty easy with the Event Hubs client, but how can we do it when using Kafka clients?
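One approach, sketched here with the Python confluent-kafka client for brevity (the equivalent OAUTHBEARER settings exist in the .NET client), is to plug an Azure AD token callback into the Kafka client. The namespace and the token scope below are assumptions to verify against the Event Hubs documentation:

```python
from azure.identity import DefaultAzureCredential
from confluent_kafka import Producer

credential = DefaultAzureCredential()  # picks up the managed identity when running in Azure

def oauth_cb(_config):
    # Fetch an AAD token for the Event Hubs namespace (scope is an assumption to verify);
    # confluent-kafka expects (token string, expiry as epoch seconds) back
    token = credential.get_token("https://<namespace>.servicebus.windows.net/.default")
    return token.token, token.expires_on

producer = Producer({
    "bootstrap.servers": "<namespace>.servicebus.windows.net:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "OAUTHBEARER",
    "oauth_cb": oauth_cb,
})
producer.produce("my-hub", b"hello")  # topic name == Event Hub name
producer.flush()
```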