The requirement is that we have an Azure Service Bus topic which we want to set up as a push subscriber of a Google Pub/Sub topic. This way, any message published to the Google Pub/Sub topic will be pushed to the Azure Service Bus topic without any intermediate layer involved.
On paper this should work, because messages can be published to an Azure Service Bus topic via its REST API, and Google Pub/Sub can be configured with an HTTP endpoint as a push subscriber.
I have gone through the following articles but couldn't make this linking work:
Azure SB as API: https://learn.microsoft.com/en-us/rest/api/servicebus/send-message-to-queue
Google PubSub Push Subscriptions: https://cloud.google.com/pubsub/docs/push
Has anyone done this kind of linking before?
Thanks in advance.
It would be nice to create a Pub/Sub push subscription that pushes to an Azure Event Hub. I ran into the same problem when configuring this setup.
Pub/Sub push subscriptions currently do not support custom Authorization headers, which are required per the Event Hubs documentation:
POST https://your-namespace.servicebus.windows.net/your-event-hub/messages?timeout=60&api-version=2014-01 HTTP/1.1
Authorization: SharedAccessSignature sr=your-namespace.servicebus.windows.net&sig=your-sas-key&se=1403736877&skn=RootManageSharedAccessKey
Content-Type: application/atom+xml;type=entry;charset=utf-8
Host: your-namespace.servicebus.windows.net
{ "DeviceId":"dev-01", "Temperature":"37.0" }
So these are the only two options I see here:
Push setup: Create a cloud function or dataflow job in Google Cloud. Push pub/sub events to this endpoint and then pass on the event to Azure Event hub with the appropriate headers.
Pull setup: Poll a pub/sub pull subscription from the Azure side with an Azure function or WebJob.
Both options require extra compute resources, so this is definitely not the preferred way of doing it. I would always try a push setup first, since then you don't have to keep a polling job running continuously in the background.
I have hopes that Pub/Sub push subscriptions will support custom headers at some point in the future. Lately some other useful features have been added, like dead lettering, message ordering, and partitioning (Lite topics). Hopefully they will add custom headers as well.
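The push setup described above could be sketched roughly as follows: a Python Cloud Function that builds a Service Bus SAS token (the standard SharedAccessSignature scheme) and re-posts the Pub/Sub payload to the Event Hubs REST endpoint shown in the request above. The namespace, hub name, key name, and key are placeholders; treat this as a sketch, not a hardened implementation.

```python
import base64
import hashlib
import hmac
import time
import urllib.parse
import urllib.request


def make_sas_token(resource_uri, key_name, key, ttl_seconds=3600):
    """Build a Service Bus SharedAccessSignature token (standard SAS scheme)."""
    expiry = str(int(time.time()) + ttl_seconds)
    encoded_uri = urllib.parse.quote_plus(resource_uri)
    to_sign = (encoded_uri + "\n" + expiry).encode("utf-8")
    signature = base64.b64encode(
        hmac.new(key.encode("utf-8"), to_sign, hashlib.sha256).digest())
    return ("SharedAccessSignature sr=" + encoded_uri +
            "&sig=" + urllib.parse.quote_plus(signature) +
            "&se=" + expiry + "&skn=" + key_name)


def forward_event(event, context):
    """Cloud Function entry point: re-post a Pub/Sub event to Event Hubs.
    All names below (namespace, hub, key) are placeholders."""
    uri = "https://your-namespace.servicebus.windows.net/your-event-hub"
    token = make_sas_token(uri, "RootManageSharedAccessKey", "your-sas-key")
    body = base64.b64decode(event["data"])  # the original Pub/Sub payload
    req = urllib.request.Request(
        uri + "/messages", data=body, method="POST",
        headers={"Authorization": token,
                 "Content-Type":
                     "application/atom+xml;type=entry;charset=utf-8"})
    urllib.request.urlopen(req, timeout=10)
```

The function receives the base64-encoded Pub/Sub payload in `event["data"]` (the background-function event format) and forwards it with the Authorization header that the push subscription itself cannot set.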
I have a workaround for this problem.
1. You can create a Cloud Function on Google Cloud that pushes the data to Azure Service Bus for you.
2. You can develop a WebJob on Azure that runs continuously and checks the Google Pub/Sub topic, using the provided connection string and the Google Pub/Sub client libraries.
With either of the above solutions you can get data from Google Cloud and push it to your Azure Service Bus.
I have 3 Azure Service Bus topics; 3 Azure App Services subscribe to these topics' messages, perform some actions, and return results.
I want to push a message to another topic with the collective result (failure/success) from the above 3 app services.
How can this be achieved?
The sample flow chart below:
One of the options to solve this is to leverage the Azure Service Bus Message Sessions feature. It allows handling messages and keeping the state using a single service. You'd have 3 message types (events) published by the upstream services (Payment, Email, and Notification), and a fourth message type, a timeout. This is also known as a saga pattern. I've described the implementation with Azure Service Bus in detail in this blog post.
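The aggregation state such a saga/session handler keeps can be illustrated with a small sketch. The three service names and the completion rule below are assumptions based on the answer; with Service Bus sessions, all three result events would share a SessionId so they reach the same handler, which persists this state between messages.

```python
# Names of the upstream services whose results we wait for (assumed).
EXPECTED = {"payment", "email", "notification"}


class SagaState:
    """Tracks which upstream results have arrived for one saga instance."""

    def __init__(self):
        self.results = {}

    def record(self, service, succeeded):
        """Record one upstream result; return the summary once all arrived."""
        self.results[service] = succeeded
        if set(self.results) == EXPECTED:
            outcome = "success" if all(self.results.values()) else "failure"
            return {"outcome": outcome, "details": dict(self.results)}
        return None  # still waiting for the remaining services
```

When `record` returns a non-None summary, the handler would publish that message to the downstream topic; the timeout message mentioned above would force a "failure" summary if one service never reports back.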
I have a GKE application that currently is driven by Notifications from a Google Cloud Storage bucket. I want to convert this node.js application to be triggered instead by PubSub notifications. I've been crawling through Google documentation pages most of the day, and do not have a clear answer. I see some python code that might do it, but it's not helping much.
The code as currently written works: an image landing in my GCS bucket triggers a notification to my GKE pod(s), and my function runs. I'm trying to understand what I need to do inside my function to subscribe to a Pub/Sub topic to trigger the processing. Any and all suggestions welcome.
Firstly, thanks! I didn't know about the notification capability of GCS!
The principle is similar, but you use Pub/Sub as an intermediary. Instead of notifying your application directly with a watchbucket command, you notify a Pub/Sub topic.
From there, the notifications arrive in the Pub/Sub topic, and you have to create a subscription. Two types are possible:
Push: you specify an HTTP URL that is called with a POST request, and the body contains the notification message.
Pull: your application needs to open a connection to the Pub/Sub subscription and read the messages.
Pros and cons
Push requires authentication from the Pub/Sub push subscription to your application. And if you use internal IPs, you can't use this solution (the URL endpoint must be publicly accessible). The main advantages are the scalability and the simplicity of the model.
Pull requires authentication of the subscriber (here, your application), so even if your application is privately deployed, you can use a pull subscription. Pull is recommended for high throughput but requires more skill in processing and concurrency/multi-threading programming. You don't scale on request rate (as with the push model) but according to the number of messages that you read. And you need to acknowledge the messages manually.
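For the pull model, the manual acknowledgement mentioned above might look like the callback sketch below. The wiring to the subscription (e.g. `subscriber.subscribe(subscription_path, callback=handle_message)` with the google-cloud-pubsub client library) is omitted, and the GCS notification field `name` is assumed from the storage notification payload.

```python
import json


def handle_message(message):
    """Streaming-pull callback; `message` is a received Pub/Sub message.
    For GCS notifications, the JSON payload's `name` field is the
    path of the changed object."""
    notification = json.loads(message.data.decode("utf-8"))
    print("object changed:", notification.get("name"))
    message.ack()  # acknowledge manually, or the message will be redelivered
```

If processing fails, call `message.nack()` (or simply skip the ack) so Pub/Sub redelivers the message later.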
The data model is mentioned here. Your Pub/Sub message looks like this:
{
  "data": string,
  "attributes": {
    string: string,
    ...
  },
  "messageId": string,
  "publishTime": string,
  "orderingKey": string
}
The attributes are described in the documentation, and the payload (base64-encoded, be careful) has this format. Very similar to what you get today.
So, why the attributes? Because you can use Pub/Sub's filter feature to create a subscription that receives only a subset of the messages.
You can also shift gears and use CloudEvents (based on Knative Events) if you use Cloud Run for Anthos in your GKE cluster. Here, the main advantage is the portability of the solution, because the messages comply with the CloudEvents format and are not specific to GCP.
I'm working with Azure resources like Azure Service Bus, Azure Functions, and IoT Hub. I'm trying to send queue messages from Azure Service Bus to IoT Hub using Azure Functions and then display those messages on my local device (cloud-to-device). I'm able to read the messages in the Azure Function using a Service Bus queue trigger, and I tried sending them to IoT Hub as the function's output. When I run the Azure Function, it sends the messages to IoT Hub as output, but they are not delivered to the client device. Can you please suggest how to solve this?
As far as I know there is currently no way to send a cloud-to-device (C2D) message as an Azure Functions output.
You also cannot use the Event Hub output, as it does not support C2D messages, as described here.
I can think of two methods to accomplish C2D messaging in Azure Functions:
Use the Azure IoT SDK as described in this answer and shown in this channel9 video from 2017 (might be out of date).
Use the Azure IoT Hub REST API. You can find general configuration options here and the API endpoint to use would be senddevicecommand.
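As an illustration of the REST approach, the helper below builds the URL and body for invoking a direct method via the IoT Hub service REST API (the `/twins/{deviceId}/methods` endpoint). The hub name, device ID, method name, and api-version are placeholders/assumptions; sending the request and authenticating with a SAS token are left out of the sketch.

```python
import json


def build_direct_method_request(hub_name, device_id, method_name, payload,
                                api_version="2018-06-30"):
    """Build the URL and JSON body for a direct-method invocation.
    (Hypothetical helper; all names passed in are placeholders.)"""
    url = (f"https://{hub_name}.azure-devices.net/twins/{device_id}"
           f"/methods?api-version={api_version}")
    body = json.dumps({
        "methodName": method_name,
        "responseTimeoutInSeconds": 10,  # how long the device may take to reply
        "payload": payload,
    })
    return url, body
```

You would POST this body to the URL with an Authorization header carrying an IoT Hub SAS token, e.g. from your Service Bus-triggered function.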
Unfortunately there is currently no output binding for IoT Hub from Functions (you could write a new custom binding, though ;) )
To talk from a Function to your devices, you need the Azure IoT Hub service SDK. Then you can use either cloud-to-device messages (asynchronous) or direct methods (synchronous). You can find an example of the latter in my GitHub repo here: https://github.com/sebader/iotedge-end2end/blob/master/CloudFunctions/DirectMethodCaller.cs
The important pieces being:
// Create a service client from the IoT Hub owner connection string
ServiceClient _iothubServiceClient = ServiceClient.CreateFromConnectionString(config["iothubowner_cs"]);
// Build the direct method request (10 s response and connection timeouts)
var methodRequest = new CloudToDeviceMethod("YourDirectMethodName", TimeSpan.FromSeconds(10), TimeSpan.FromSeconds(10));
// Invoke the method on the target device/module and await the result
var result = await _iothubServiceClient.InvokeDeviceMethodAsync(device, module, methodRequest).ConfigureAwait(false);
An implementation using C2D messages would look pretty much the same.
In my Python application, if any bad/good event happens, I want to send the event details as a notification message to the email addresses or phone numbers of users who have subscribed to this application. So I am looking for a publisher-subscriber model in the Azure cloud.
It looks like multiple Azure services achieve similar goals with only thin lines of difference between them. Event Hubs and Notification Hubs seem promising. So my questions are as follows:
Can an email ID/phone number be subscribed to Azure Event Hubs and receive the messages being sent/produced to the event hub?
If not Event Hubs, what is the correct option? Can I achieve this with Service Bus or Notification Hubs?
In AWS, there is a service called SNS (Simple Notification Service) where one can subscribe an email address or phone number and opt in to receiving event messages about an application. I am looking for the equivalent in Azure.
You can use Azure Logic Apps / Azure Functions with Event Hubs to achieve this easily.
Using Logic Apps you can do it simply, as in the image below.
Logic Apps has many built-in connectors for almost all Azure services; you can use Event Hubs, Service Bus, SQL, etc.
You can find the full list of available connectors here.
Update 1
Once you have connected Event Hubs to a Send Email connector, you will automatically get all the available source data from Event Hubs in the email task. See below:
You can achieve this by using Azure Application Insights. With it, you can monitor your application and receive alerts during application unavailability, failures, or even performance issues.
Check this https://learn.microsoft.com/en-us/azure/application-insights/app-insights-tutorial-alert
I'm simply trying to work out how best to retrieve messages as quickly as possible from an Azure Service Bus Queue.
I was shocked that there isn't some way to properly subscribe to the queue for notifications and that I'm going to have to poll (unless I'm wrong, in which case the documentation is terrible).
I got long polling working, but checking for a single message every 60 seconds looks like it'll cost around £900 per month (again, unless I've misunderstood). And if I add a redundant/second service to poll, it'll double.
So I'm wondering what the best/most cost efficient way of doing it is.
Essentially I just want to take a message from the queue, perform an API lookup on some internally held data (perhaps using hybrid services?) and then perhaps post a message back to a different queue with some additional information.
I looked at worker roles; is that something that could do it?
I should mention that I've been looking at doing this with node.js.
Check out these videos from Scott Hanselman and Mark Simms on Azure Queues.
It's C# but you get the idea.
https://channel9.msdn.com/Search?term=azure%20queues%20simms#ch9Search
Touches on:
Storage Queues vs. Service Bus Queues
Grabbing messages in bulk vs. one by one (chunky vs. chatty)
Dealing with poison messages (bad actors)
Misc implementation details
Much more stuff I can't remember now
As for your compute, you can either do a VM, a Worker Role (Cloud Services), App Service Webjobs, or Azure Functions.
The WebJobs SDK and Azure Functions both have a way to subscribe to queue events (notify on message).
(Listed from IaaS to PaaS to FaaS - Azure Functions - if such a thing exists).
Azure Functions already has sample code provided as templates to do all that with Node. Just make a new Function and follow the wizard.
If you need to touch data on-premises, you either need to look at integrating with a VNET that has site-to-site connectivity back to your premises, or Hybrid Connections (App Service only!). Azure Functions can't do that yet, but every other compute option is a go.
https://azure.microsoft.com/en-us/documentation/articles/web-sites-hybrid-connection-get-started/
(That tutorial is Windows only but you can pull data from any OS. The Hybrid Connection Manager has to live on a Windows box, but then it acts as a reverse proxy to any host on your network).
To deal with an Azure Service Bus queue easily, the best option seems to be an Azure WebJob.
There is a ServiceBusTrigger that lets you get messages from an Azure Service Bus queue.
For Node.js integration, you should have a look at Azure Functions. It is built on top of the WebJobs SDK and has Node.js integration:
Azure Functions NodeJS developer reference
Azure Functions Service Bus triggers and bindings for queues and topics
In the second article, there is an example of how to get messages from a queue using Azure Functions and Node.js:
module.exports = function(context, myQueueItem) {
    context.log('Node.js ServiceBus queue trigger function processed message', myQueueItem);
    context.done();
};