Worker Role with Service Bus Queue - multiple queues - Azure

I need to create a worker role to process messages from multiple queues. I noticed that there is a cloud project template called Worker Role with Service Bus Queue. Can I create multiple QueueClients inside this one role, or should I split the work across N Worker Roles?

There is no restriction on how many queues you can process from a single Worker Role: you can create several QueueClients and then either kick off Receive calls in parallel or register an OnMessage callback on each.
From an application perspective you also need to think about isolation and scaling: if your queues carry different types of workloads or different priorities, then having separate processing back ends gives you more flexibility to scale and deploy them independently.
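For reference, here is a minimal sketch of the single-role approach, assuming the classic WindowsAzure.ServiceBus SDK; the queue names, MaxConcurrentCalls value, and class shape are placeholders rather than recommendations:

```csharp
using System.Collections.Generic;
using Microsoft.ServiceBus.Messaging;

public class MultiQueueProcessor
{
    private readonly List<QueueClient> _clients = new List<QueueClient>();

    public void Start(string connectionString, IEnumerable<string> queueNames)
    {
        foreach (var name in queueNames)
        {
            var client = QueueClient.CreateFromConnectionString(connectionString, name);

            // OnMessage starts an internal message pump per client, so one worker
            // role instance can drain several queues concurrently.
            client.OnMessage(message =>
            {
                // ... process the BrokeredMessage here, e.g. message.GetBody<string>() ...
            },
            new OnMessageOptions { AutoComplete = true, MaxConcurrentCalls = 4 });

            _clients.Add(client);
        }
    }

    public void Stop()
    {
        foreach (var client in _clients)
        {
            client.Close();
        }
    }
}
```

In the role's OnStart/Run you would call Start with the Service Bus connection string from configuration and the list of queue names, and call Stop in OnStop so the message pumps shut down cleanly.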

Related

Multiple kubernetes pods with Azure Eventhubs subscription redundancy

I am working in a microservice architecture and deploying the services in Kubernetes. For communication among the different applications, we are using Azure Event Hubs to publish and subscribe to events. My question is: if multiple instances (pods) of an application are running, will the subscribed event callback be triggered in a single pod or in every pod?
Do I need to segregate the pods into different consumer groups?
At any given time, only one processor should be reading events from a given consumer group. Do not share a consumer group between different receivers! Also important: receivers have to actively read; there is no push-style "callback".
So if you have multiple consumers that should each receive every event, then yes, you need multiple consumer groups, one for each pod in your case.
But since you are talking about publish-subscribe, something like Azure Service Bus Topics might actually be better suited for your scenario.
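To make the consumer-group-per-pod idea concrete, here is a rough sketch using the Azure.Messaging.EventHubs.Processor package; the environment variable, blob container, and connection-string placeholders are assumptions about how the pods would be configured:

```csharp
using System;
using System.Threading.Tasks;
using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Processor;
using Azure.Storage.Blobs;

public static class PodEventConsumer
{
    public static async Task RunAsync()
    {
        // Each pod is started with its own consumer group name, e.g. "pod-a", "pod-b".
        // With separate consumer groups, every pod receives its own copy of the stream.
        var consumerGroup = Environment.GetEnvironmentVariable("CONSUMER_GROUP");

        // Blob container used by the processor to store checkpoints and ownership.
        var checkpointStore = new BlobContainerClient("<storage-connection-string>", "checkpoints");

        var processor = new EventProcessorClient(
            checkpointStore,
            consumerGroup,
            "<event-hubs-connection-string>",
            "<event-hub-name>");

        processor.ProcessEventAsync += async args =>
        {
            Console.WriteLine(args.Data.EventBody.ToString());
            await args.UpdateCheckpointAsync();   // record progress for this consumer group
        };

        processor.ProcessErrorAsync += args =>
        {
            Console.WriteLine(args.Exception);
            return Task.CompletedTask;
        };

        await processor.StartProcessingAsync();
    }
}
```

With a separate consumer group per pod, every pod receives its own full copy of the event stream, which is the publish-subscribe behaviour the question is after.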

Running WebJobs within an Azure Worker Role

I have an Azure Worker Role that receives SMTP messages on TCP ports and pushes them to queues. Other threads pick these messages up from the queues and process them. Currently, the processing threads contain their own queue-polling logic: they simply check the queues and increase the wait interval when the queues are empty.
I want to simplify the queue logic and make use of the other WebJobs functionality in this Worker Role.
Is it possible to start a WebJobs thread in this Worker Role and let that thread handle the details? Are there any limitations I need to know about?
Azure Worker Roles are a feature of Azure Cloud Services. Azure WebJobs are a feature of Azure App Service. They are both built to provide a similar ability to run background processing tasks within the context of your application. However, since they are features of different Azure services, they can't be run together in the nested fashion you are asking about.
Is it possible to start a WebJobs thread in this Worker Role and let that thread handle the details?
I agree with Chris Pietschmann: you can't start a WebJobs thread directly inside an Azure Worker Role.
Other threads pick these messages up from the queues and process them. Currently, the processing threads contain their own queue-polling logic: they simply check the queues and increase the wait interval when the queues are empty.
I want to simplify the queue logic and make use of the other WebJobs functionality in this Worker Role.
If you'd like to complete this task using WebJobs, you could write a program and run it as a WebJob in your Azure App Service. The WebJobs (Kudu) API also provides a way to dynamically start and stop WebJobs via REST, so you could call it from your Worker Role to manage those WebJobs.
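As a rough sketch of that last point, the Kudu WebJobs API can be called with basic auth and the site's deployment credentials; the site name, job name, and credential parameters below are placeholders, and the endpoint path is the documented Kudu route to the best of my knowledge:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

public static class WebJobsControl
{
    public static async Task StartContinuousWebJobAsync(string siteName, string jobName,
                                                        string deployUser, string deployPassword)
    {
        // Kudu's WebJobs API uses basic auth with the site's deployment credentials.
        var credentials = Convert.ToBase64String(
            Encoding.UTF8.GetBytes(deployUser + ":" + deployPassword));

        using (var http = new HttpClient())
        {
            http.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Basic", credentials);

            // POST /api/continuouswebjobs/{name}/start starts a continuous WebJob;
            // /stop stops it, and /api/triggeredwebjobs/{name}/run runs a triggered one.
            var url = string.Format(
                "https://{0}.scm.azurewebsites.net/api/continuouswebjobs/{1}/start",
                siteName, jobName);

            var response = await http.PostAsync(url, null);
            response.EnsureSuccessStatusCode();
        }
    }
}
```

Either way, the WebJob itself still runs inside the App Service, not inside the Worker Role; the role only triggers or stops it remotely.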

Azure worker role + number of message queue consumers

I am trying to understand the best practice for hosting message queue consumers in an Azure Worker Role. I have many different types of message consumers that subscribe to different Azure Service Bus subscriptions (or queues, if you prefer to call them that). I am wondering whether I should run multiple threads, one per consumer, in a single Worker Role, or deploy a separate Worker Role for each consumer.
This is really dependent on your app and workload. If you have tasks that are I/O-bound, then you should be running several threads; otherwise you'll have a virtual machine instance that isn't being used efficiently. If the work is primarily CPU-bound, you may find that you run efficiently with a lower number of threads.
You should only scale out your worker instances if you can't handle the capacity in a single instance (or if you need high availability, in which case you'd need at least two instances). Just remember that a Worker Role instance is a full VM, so adding one VM per queue consumer scales up your cost, and you still might not see great throughput in an I/O-bound app (or one that blocks on other things, such as a web service call).
You'll need to do a bit of experimenting to see how many threads work best on the worker side.
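As a starting point for that experimentation, here is an illustrative sketch of hosting several subscription consumers in one role instance with the classic SDK, choosing the concurrency per workload; the multipliers and names are assumptions to tune, not recommendations:

```csharp
using System;
using Microsoft.ServiceBus.Messaging;

public static class ConsumerHost
{
    public static SubscriptionClient StartConsumer(string connectionString, string topicPath,
                                                   string subscriptionName, bool cpuBound)
    {
        // CPU-bound handlers gain little beyond roughly one call per core, while
        // I/O-bound handlers spend most of their time waiting and can overlap more.
        int maxConcurrentCalls = cpuBound
            ? Environment.ProcessorCount
            : Environment.ProcessorCount * 8;

        var client = SubscriptionClient.CreateFromConnectionString(
            connectionString, topicPath, subscriptionName);

        client.OnMessage(message =>
        {
            // ... process the BrokeredMessage for this consumer type ...
        },
        new OnMessageOptions { AutoComplete = true, MaxConcurrentCalls = maxConcurrentCalls });

        return client;
    }
}
```

You would call StartConsumer once per subscription from the role's Run/OnStart, keep the returned clients, and Close them on shutdown; measure before settling on the numbers.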

Windows Azure Inter-Role communication

I want to create an Azure application which does the following:
User is presented with an MVC 4 website (web role) which shows a list of commands.
When the user selects a command, it is broadcast to all worker roles.
Worker roles process the task, store the results, and notify the web role.
The web role displays the combined results of the worker roles.
From what I've been reading, there seem to be two ways of doing this: the Windows Azure Service Bus or Azure Storage queues. Each worker role also stores its results in the database.
The Service Bus seems more appropriate with its publish/subscribe model, so all worker roles would get the same command at roughly the same time. Queues seem easier to use, though.
Can the Service Bus be used locally with the emulator when developing? I am using a free trial and cannot keep the application running constantly whilst still developing. Also, when using queues, how can you notify the web role that processing is complete?
I agree that Service Bus is a better choice for this messaging requirement. You could, with some effort, do the same with queues, but you'll be writing a lot of code to implement things that Service Bus already gives you.
There is no local emulator for Service Bus like there is for the Azure Storage service (queues/tables/blobs). However, you can still use Service Bus for messaging between roles while they are running locally in your development environment.
As for your last question about notifying the web role that processing is complete, there are several ways to go here. Just a few thoughts (not an exhaustive list)...
Table storage, where the web role can periodically check the status of the unit of work (sketched after this list).
Another Service Bus queue or topic for completed work.
Internal endpoints. You'll need logic to know whether a message is just an update from worker role N or whether it indicates a completed unit of work across all worker roles.
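To make the first option concrete, here is a minimal sketch of tracking completion in Table storage with the classic Microsoft.WindowsAzure.Storage SDK; the table name, entity shape, and the idea of keying on a command ID plus role instance ID are assumptions for illustration:

```csharp
using System.Linq;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

public class WorkItemStatus : TableEntity
{
    public WorkItemStatus() { }
    public WorkItemStatus(string commandId, string instanceId)
    {
        PartitionKey = commandId;   // one partition per broadcast command
        RowKey = instanceId;        // one row per worker role instance
    }
    public bool Completed { get; set; }
}

public static class CompletionTracker
{
    static CloudTable GetTable(string connectionString)
    {
        var table = CloudStorageAccount.Parse(connectionString)
            .CreateCloudTableClient()
            .GetTableReference("workstatus");
        table.CreateIfNotExists();
        return table;
    }

    // Called by each worker role when it finishes its share of the command.
    public static void MarkDone(string connectionString, string commandId, string instanceId)
    {
        GetTable(connectionString).Execute(TableOperation.InsertOrReplace(
            new WorkItemStatus(commandId, instanceId) { Completed = true }));
    }

    // Polled by the web role; compare the result with the expected worker count.
    public static int CountDone(string connectionString, string commandId)
    {
        var filter = TableQuery.GenerateFilterCondition(
            "PartitionKey", QueryComparisons.Equal, commandId);
        var query = new TableQuery<WorkItemStatus>().Where(filter);
        return GetTable(connectionString).ExecuteQuery(query).Count(s => s.Completed);
    }
}
```

The web role would poll CountDone and compare the result with the number of workers it expects, which leads directly into the caveats raised in the next answer.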
I agree with Rick's answer, but would also add the following things to think about:
If you choose the Service Bus topic approach, then each worker role will need to create a subscription to the topic as it comes online. You'll also need to think about subscription maintenance: what happens when one of the workers fails and is recycled, or any of the other reasons a stale subscription may be left out there.
Telling the web role that all the workers are complete is interesting. The options Rick provides are good ones, but you'll need to think through a few things. The web role needs to know how many workers are out there, or have some other mechanism for deciding when all of them have reported done. You could have five worker roles receive a message and start working, and then one of them begins to repeatedly fail its processing. The other four report their completion, but now the web role is waiting on the fifth. How long do you wait for a reply? Can you continue? What if you told the system to scale down, and while the web role thinks there are five workers there are now only four? These are things you'll need to think about, and they all depend on your requirements.
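On the subscription-maintenance point, one possible sketch has each worker role instance create its own subscription when it starts; the topic name, naming scheme, and AutoDeleteOnIdle value are assumptions (and AutoDeleteOnIdle requires a reasonably recent Service Bus SDK and service version):

```csharp
using System;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;

public static class CommandSubscription
{
    public static SubscriptionClient EnsureSubscription(string connectionString, string instanceId)
    {
        const string topicPath = "commands";            // hypothetical topic name
        var subscriptionName = "worker-" + instanceId;  // one subscription per role instance

        var ns = NamespaceManager.CreateFromConnectionString(connectionString);
        if (!ns.SubscriptionExists(topicPath, subscriptionName))
        {
            var description = new SubscriptionDescription(topicPath, subscriptionName)
            {
                // Let the service remove subscriptions abandoned by recycled instances.
                AutoDeleteOnIdle = TimeSpan.FromHours(1)
            };
            ns.CreateSubscription(description);
        }

        return SubscriptionClient.CreateFromConnectionString(
            connectionString, topicPath, subscriptionName);
    }
}
```

The role would then register OnMessage on the returned SubscriptionClient, and idle subscriptions left behind by recycled instances would eventually be cleaned up by the service.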
Based on your question, you could use either queue service and get good results, but each of them has different challenges to overcome as well as different advantages.
Some advantages of Service Bus queues are that they provide blocking receive over a persistent connection (up to 100 concurrent connections), they can monitor messages for completion, and they can carry larger messages (256 KB).
Some advantages of Storage queues over the Service Bus solution are that they're slightly faster (if 15 ms matters to you), you can use a single storage system (since you'll probably be using Storage for blob and table services anyway), and they make auto-scaling simple. If you need to auto-scale your worker roles based on load, passing the requests through a Storage queue makes auto-scaling trivial: you just set up auto-scaling in the Azure Cloud Service UI under the Scale tab.
A more in-depth comparison of the two Azure queue services can be found here: http://msdn.microsoft.com/en-us/library/hh767287.aspx
Also, when using queues how can you notify the web role that processing is complete?
For the Azure Storage Queues solution, I've written a library that can help: https://github.com/brentrossen/AzureDistributedService.
It provides a proxy layer that facilitates RPC-style communication from web roles to worker roles and back through Storage queues.

Multiple Threads in an Azure Worker Role and Storage Queue Access

I am planning to implement an Azure Worker Role that starts multiple threads. Each thread may want to read from or write to a Storage queue. Race conditions between different worker role instances are not the issue here. But is it safe to access the same queue simultaneously from different threads running inside the same worker role?
The Azure Queue service is thread safe and is accessed through a REST API, so each operation is an independent request. MSDN has some reference documentation on this.
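As a concrete illustration, here is a small sketch of several threads in one role instance polling the same Storage queue with the classic Microsoft.WindowsAzure.Storage SDK; the queue name, thread count, and back-off interval are placeholders:

```csharp
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

public static class ParallelQueueWorkers
{
    public static void Run(string connectionString, int threadCount)
    {
        var account = CloudStorageAccount.Parse(connectionString);

        var workers = Enumerable.Range(0, threadCount).Select(_ => Task.Run(() =>
        {
            // CloudQueue references are cheap, so giving each thread its own
            // sidesteps any question about sharing client-side objects; every
            // Get/Delete is an independent REST call against the same queue.
            var queue = account.CreateCloudQueueClient().GetQueueReference("work-items");
            queue.CreateIfNotExists();

            while (true)
            {
                var message = queue.GetMessage();
                if (message == null)
                {
                    Thread.Sleep(TimeSpan.FromSeconds(5));   // back off when the queue is empty
                    continue;
                }

                // ... process message.AsString here ...
                queue.DeleteMessage(message);
            }
        })).ToArray();

        Task.WaitAll(workers);
    }
}
```

Giving each thread its own CloudQueue reference is simply the conservative idiom; the queue service itself handles the concurrent Get/Delete calls.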
