Notification in a SaaS application over Azure

We are working on a SaaS-based application (built on Azure). In this application the web server and app server are shared among all tenants, but their databases are separate (SQL Azure).
Now there is a need to implement a notification service which can generate notifications based on event subscriptions. The system can generate different kinds of events (like account locked and many others), and users can configure notification rules on these events. Notifications can be in the form of email and SMS.
We are planning to implement a queue for events. An event notifier will push an event onto this queue. A notification engine will subscribe to this queue. Whenever it receives a new event, it will check whether there is a notification rule configured for this type of event. If yes, it will create a notification, which will result in emails/SMS. These emails/SMS can be stored in a database or pushed to another queue. A separate background process (worker role) can process these emails.
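The event-queue-to-email-queue pipeline described above can be sketched with plain in-memory queues. This is only an illustration under assumed names (the rule table, event shapes, and queue contents are all hypothetical); in production the two queues would be Azure Service Bus or RabbitMQ queues.

```python
import queue

# In-memory stand-ins for the two queues in the proposed design.
event_queue = queue.Queue()
email_queue = queue.Queue()

# Hypothetical notification rules: event type -> recipients who subscribed.
notification_rules = {
    "account_locked": ["admin@tenant1.example"],
}

def notification_engine():
    """Drain the event queue, turning rule-matched events into email jobs."""
    while not event_queue.empty():
        event = event_queue.get()
        for recipient in notification_rules.get(event["type"], []):
            email_queue.put({
                "tenant": event["tenant"],
                "to": recipient,
                "subject": f"Event: {event['type']}",
            })

# An event notifier pushes events onto the queue...
event_queue.put({"tenant": "tenant1", "type": "account_locked"})
event_queue.put({"tenant": "tenant1", "type": "password_changed"})  # no rule

notification_engine()

# ...and only the rule-matched event produced an email job.
print(email_queue.qsize())  # 1
```

A worker role would then consume `email_queue` and do the actual sending, which is what the question's options 2-4 differ on.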
Here are my queries.
Should we keep one single queue (for events) for all tenants or create a separate queue for each tenant? If we keep a single queue, we can have a shared subscriber service which subscribes to this queue. We can easily scale this service in and out.
Since we have a different database for each tenant, we can store their emails in their respective databases and, using some service, poll each database and send emails after a defined interval. But I am not sure how we would share the subscriber code in this case.
We can store mails in a NoSQL database (like Table storage in Azure). A subscriber (Windows service/worker role) can poll this table and send mails after a defined interval. Again, scaling can be a challenge here too.
We can store emails in a queue (RabbitMQ, for instance). A worker role can subscribe to this queue. Scaling of the worker role should not be an issue if we keep a single queue for all tenants.
Please provide your inputs on these points.
Thanks In Advance

I would separate queues not by tenant but by function, so that queue handlers are specific to the type of message they are processing.
E.g.: an order processing queue, an account setup queue, etc.
Creating queues per tenant is a headache to manage when you want to scale based on them and presumably need to sync/add/remove them as customers come and leave. So I would avoid this scenario.
Ultimately, scaling based on multiple queues will be harder without auto-scaling services such as CloudMonix (a commercial product I helped build).
HTH
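As a tiny illustration of the function-based split suggested above (queue names and message shape are assumptions, not anything prescribed by Azure): the tenant travels inside the message, and routing keys only on the message's function.

```python
# Hypothetical queue names split by function, shared by all tenants.
QUEUES = {
    "order": "order-processing-queue",
    "account": "account-setup-queue",
}

def route(message):
    """Pick the destination queue by message function, never by tenant."""
    return QUEUES[message["function"]]

msg = {"function": "order", "tenant": "tenant42", "payload": {"order_id": 1}}
print(route(msg))  # order-processing-queue
```

Adding or removing a tenant then changes nothing about the queue topology, which is the management win the answer is pointing at.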

Related

Azure Service Bus Topic Subscriber receiving messages not in order

I am using an Azure Service Bus topic and I have enabled sessions for its subscription.
Within my logic app I am inserting the data coming from the topic using a SQL transaction.
I am using a topic subscription (peek-lock), and
at the subscriber level concurrency is set to the default, as follows.
According to my understanding, my logic app (subscriber) should read ALL the messages and process them in FIFO order.
My logic app is like
which means it should insert data into the table in an ordered manner.
However, when I checked the trigger log it shows the correct order, but at the database level you can see the ordering is not preserved.
Message ordering is a delicate business. You can have either message ordering or concurrent processing, but not both. The moment you make message ordering a must, you lose the ability to have concurrent processing. This is true for both Azure Service Bus sessions and Logic Apps concurrency control. You could process multiple sessions, but each session would still be restricted to a single processor. Here's a post about it.
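The session trade-off described above can be made concrete with a small sketch (message shapes and session ids are made up for illustration): FIFO holds only *within* a session, and each session must be pinned to a single processor, so parallelism exists only *across* sessions.

```python
from collections import defaultdict

# A hypothetical batch of messages as a session-enabled subscription might
# deliver them, interleaved across two sessions.
messages = [
    {"session_id": "A", "seq": 1},
    {"session_id": "B", "seq": 1},
    {"session_id": "A", "seq": 2},
    {"session_id": "B", "seq": 2},
]

# Group by session: sessions can be processed concurrently, but all messages
# of one session go to one processor, in arrival order.
by_session = defaultdict(list)
for m in messages:
    by_session[m["session_id"]].append(m["seq"])

print(by_session["A"])  # [1, 2]
```

If the logic app's concurrency is above 1 and sessions aren't honored end-to-end (including the SQL insert step), the database order can diverge from the trigger order, which matches the symptom in the question.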

Azure Service Bus Queues vs Topics for one-to-many (unique)

I have an online service hosted on Azure, that asynchronously sends data to on-premise clients.
Each client is identified by an unique code.
Currently there is a single topic, with a subscription for each client which has a filter on the unique code, sent as a property in the message. No message will ever be broadcast to all the clients.
I feel that using topic this way is wrong.
The alternative that comes to my mind is to use a dedicated queue for each client, that is created on first contact
Could this be a better approach?
Thanks
In my opinion using Topics and Subscriptions is the right way to go. Here's the reason why:
Currently the routing logic (which message needs to go to which subscription) is handled by Azure Service Bus based on the rules you have configured. If you go with queues, the routing logic will need to move into your hosted service: you'll need to ensure that the queue exists before sending each message. I think it will increase the complexity at your service level.
Furthermore, topics and subscriptions would enable you to build an audit-trail kind of functionality (not sure if you're looking for that). You can create a separate subscription with a rule that delivers all messages (a True SQL rule) to that subscription, alongside the client-specific subscriptions.
Creating a separate queue for each client is not advisable. This is the problem solved by topics.
If you have a separate queue for each client, then you need to send messages to multiple queues from the server. This becomes tedious as the number of clients increases.
Having a single topic with multiple subscriptions is easy to manage, as messages are sent to only a single topic from the server.
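The routing argument above can be sketched in a few lines (subscription names and the property key are assumptions for illustration): each subscription's rule filters on a `client_code` message property, the way a SqlFilter does, and one extra audit subscription matches everything.

```python
# A tiny in-memory model of topic/subscription routing.
subscriptions = {
    "client-abc": lambda props: props.get("client_code") == "ABC",
    "client-xyz": lambda props: props.get("client_code") == "XYZ",
    "audit":      lambda props: True,  # a "True" SQL rule: receives everything
}

def publish(topic_message):
    """Return the subscriptions whose rules match the message properties."""
    return [name for name, rule in subscriptions.items()
            if rule(topic_message["properties"])]

delivered = publish({"properties": {"client_code": "ABC"}, "body": "hello"})
print(delivered)  # ['client-abc', 'audit']
```

The sender always publishes to one topic; adding a client means adding a subscription with a rule, not teaching the sender about a new queue.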

Azure Service Bus synchronize all masterdata

Let's say I've got an azure service bus in a microservice scenario.
One microservice pushes master data changes to the other services with a subscription.
Now let's say a new service is introduced and subscribes to the master data service. How can I make sure that the new service receives all necessary data?
Do I have to resend all master data from the master data service, or does Azure Service Bus (or an alternative) provide some feature for that?
As far as I know there is no way to achieve what you want within the capabilities of Azure Service Bus. Also, I don't think this is what Service Bus is there for.
Of course there is a configurable "time to live" value for messages within queues and topics, which could probably be set to some really high value, but this would still not make your master data be infinitely available for future services. And - but this is just my opinion and I'm far from being an expert - I wouldn't want to load up my service bus with potentially thousands or even millions of messages (depending on what you're doing) without them being processed quickly.
For your specific concern I'd rather implement something like a "master data import service" without any service bus integration. Details of this, however, depend on your environment and specific requirements.
Couple of points:
1) This is not possible with Azure Service Bus. Even if you set the TTL at the topic level, the messages will only be delivered to the subscriptions that exist at that point in time; you can't read messages directly from a topic.
2) You can consider the Event Hubs option, where you can create a new consumer group with an offset from which you want to start reading messages, but Event Hubs has a maximum retention period of 7 days. If you need message retention beyond 7 days, enabling Event Hubs Capture on your event hub pulls the data from your event hub into a Storage account. But in this case you would need additional logic to read from this storage account to replay the messages.
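The difference the answer draws can be sketched abstractly (the log contents and offset handling here are illustrative, not the Event Hubs API): Event Hubs is a retained log that a newly created consumer group can read from any offset within the retention window, whereas a Service Bus topic only delivers to subscriptions that existed when the message arrived.

```python
# A retained event log, as Event Hubs keeps within its retention period.
log = ["masterdata-v1", "masterdata-v2", "masterdata-v3"]

def read_from(offset):
    """A new consumer group picks an offset and replays everything after it."""
    return log[offset:]

# A service introduced later can still replay the full history:
print(read_from(0))  # ['masterdata-v1', 'masterdata-v2', 'masterdata-v3']
```

Beyond the retention window, the same replay would have to run against the Capture output in the Storage account, which is the "additional logic" mentioned above.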

Microservices for job/cron tasks

For example, I want to have a microservice to send notifications (email, SMS, push notifications). Everything is OK for a few users. After some time our application has a lot of users, so this microservice cannot keep up, and emails are sent after a 1-hour delay.
So, how do I handle this situation? Deploy another instance of the microservice? But then how do I ensure that only one microservice instance processes each email, so that users don't receive multiple emails?
You need to set up messaging for that.
It's common to use a persistent queue such as RabbitMQ. The microservice responsible for sending emails then consumes the messages from the queue and handles them appropriately.
If you run into the problem of a single instance of the email microservice not being enough, you can simply spin up another instance and deploy it instantly. This works because when a message from the message queue is consumed it's gone, unless you tell it to return (to be requeued). I.e. any successfully sent email consumes the message, hence the request to send an email is no longer within the system.
1) You can create a coordinating service that schedules tasks for senders using persistent storage, like a database table. This service adds send-job records to the table, and the sender services scan the table in a loop, take a job, and mark it as processing so other instances will not pick up the same job.
2) You can use a queue like Azure Service Bus to send jobs from the coordinating service.
Also, if you are using microservices, I suggest separating the sending services by transport so you can scale them independently.
I can see the following structure:
NotificationSenderService - the send coordinator; you usually need only one instance of this. The responsibility of this service is to receive send-notification requests and create jobs using a queue or database.
EmailNotificationService, SMSNotificationService, PushNotificationService - the actual senders. You can run as many instances of each as you need. They need access to the database or queue of NotificationSenderService.
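The coordinator/sender structure above can be sketched with one queue per transport (the service and queue names follow the answer; the request shape is a made-up example). Each sender type consumes only its own queue, and since a consumed message is gone, many instances of one sender can run without a user being notified twice.

```python
import queue

# One job queue per transport, as in the proposed structure.
transport_queues = {
    "email": queue.Queue(),
    "sms": queue.Queue(),
    "push": queue.Queue(),
}

def notification_sender_service(request):
    """The single coordinator: route a send request to its transport queue."""
    transport_queues[request["transport"]].put(request)

notification_sender_service({"transport": "email", "to": "user@example.com"})
notification_sender_service({"transport": "sms", "to": "+15550100"})

# EmailNotificationService instances would all consume transport_queues["email"];
# SMSNotificationService instances consume transport_queues["sms"]; and so on.
print(transport_queues["email"].qsize())  # 1
```

Scaling then means adding consumers on whichever transport queue is backing up, without touching the coordinator.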

Azure Service Bus Queue grouped messages

I have a web API application which performs different types of actions on a domain entity called Application. Actions like "Copy", "Update", "Recycle", "Restore", etc.
These actions need to be executed, per Application, in first-in-first-out order, not randomly or simultaneously. However, the system can process two actions simultaneously as long as they are for two separate Applications.
It is some kind of queue, but not one big queue for all the requests; rather a queue of actions for each Application in the database.
Knowing this, I think that an Azure Service Bus queue is a good solution for this scenario.
However, the only solution I can think of right now is to programmatically create a queue for each Application I have in the database, and start listening to each queue.
Is it possible to get messages from the queue based on a filter (using the FIFO principle), so that I only have to subscribe to one queue (instead of having a queue for each Application, which is very hard to maintain)?
What you want is Azure Service Bus Topics/Subscriptions.
Subscriptions allow you to filter messages that are published to a topic using a SqlFilter on the message headers.
The article linked above should provide enough examples to meet your needs.
I think you can solve this by using sessions.
I just came across this very clear article: https://dev.to/azure/ordered-queue-processing-in-azure-functions-4h6c which explains in detail how Azure Service Bus queue sessions work.
In short: by defining a SessionId on the messages you can force the ordering of the processing within a session; the downside is that there is no parallelization between multiple consumers of the queue for the messages within a single session.
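Applied to this question, the SessionId approach above means one queue where the Application id is used as the session key (the message shapes here are invented for illustration): actions for the same Application stay FIFO within their session, while different Applications can be processed in parallel.

```python
from collections import defaultdict

# Hypothetical action messages; SessionId = the Application id.
actions = [
    {"app": "app-1", "action": "Copy"},
    {"app": "app-2", "action": "Update"},
    {"app": "app-1", "action": "Recycle"},
]

# Group by session key: each Application's actions form one ordered session.
sessions = defaultdict(list)
for msg in actions:
    sessions[msg["app"]].append(msg["action"])

# app-1's actions keep their FIFO order within their session:
print(sessions["app-1"])  # ['Copy', 'Recycle']
```

This gives the per-Application FIFO the question asks for without creating a queue per Application.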
