I have a Service Bus topic ('telemetry-topic') that ingests telemetry messages emitted by one of my servers, at a rate of approximately 1 message every 10 seconds.
I have a web front end that subscribes to the 'telemetry-subscription' subscription on 'telemetry-topic'.
What I observe is that the more front ends I run, the lower the message frequency each one receives. It sounds like my topic is acting as a queue…
What I expect is that every front end, no matter how many of them there are, should receive approximately 1 message every 10 seconds.
Note: my stack is Node.
The receiver:
const {ServiceBusClient, delay} = require('@azure/service-bus')

async function serviceBusTelemetrySubscribe(props) {
  const module = '🚌 Service bus telemetry'
  const {enter, leave, log, warn, error} = SDK.logFunctions(module, {
    verboseLevel: 1,
    verboseEnterLeave: true,
  })
  try {
    enter()
    log('Subscribing with given processMessage', props.processMessage, 'processError', props.processError)
    const {processMessage, processError} = props
    // == Connection string to your Service Bus namespace
    const connectionString = 'XXXX'
    // == Topic & subscription names
    const topicName = 'telemetry-topic'
    const subscriptionName = 'telemetry'
    // == Create a Service Bus client using the connection string to the Service Bus namespace
    const sbClient = new ServiceBusClient(connectionString)
    log('Service bus client', sbClient)
    // == Create a receiver for the telemetry subscription
    const receiver = sbClient.createReceiver(topicName, subscriptionName)
    log('Service bus receiver for subscription name "', subscriptionName, '" is:', receiver)
    // == Message handler; provide a default one if none was given
    const myMessageHandler = processMessage || (async (messageReceived) => {
      log(`received message: ${messageReceived.body}`)
    })
    // == Error handler; the parameter is named `err` so it does not shadow the `error` log function
    const myErrorHandler = processError || (async (err) => {
      error(err)
    })
    log('Subscribing with actual processMessage', myMessageHandler, 'processError', myErrorHandler)
    // == Subscribe and specify the message and error handlers
    const sbSubscription = receiver.subscribe({
      processMessage: myMessageHandler,
      processError: myErrorHandler,
    })
    log('Service bus subscription', sbSubscription)
    // == Return cleanup method
    return async () => {
      log('Closing Service bus…')
      // Wait long enough for in-flight messages to settle before closing
      await delay(5000)
      log('Closing Service bus subscription…')
      await sbSubscription.close()
      log('Closing Service bus receiver…')
      await receiver.close()
      log('Closing Service bus client…')
      await sbClient.close()
    }
  }
  catch (err) {
    error(err)
  }
  finally {
    leave()
  }
}
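For reference, the subscribe-then-return-cleanup shape used above can be exercised like this. This is a minimal sketch with a stubbed subscribe function, since the real one needs a live Service Bus namespace; `fakeSubscribe`, `demo`, and `received` are illustrative names, not part of any SDK:

```javascript
// Stub with the same shape as serviceBusTelemetrySubscribe: it starts
// delivering messages to processMessage and returns an async cleanup function.
async function fakeSubscribe({processMessage}) {
  const timer = setInterval(() => processMessage({body: 'tick'}), 10);
  return async () => clearInterval(timer); // cleanup stops delivery
}

async function demo() {
  const received = [];
  const cleanup = await fakeSubscribe({
    processMessage: async (msg) => received.push(msg.body),
  });
  await new Promise((resolve) => setTimeout(resolve, 50));
  await cleanup(); // after this, no more messages arrive
  return received;
}
```

The caller holds on to the returned cleanup function and invokes it on shutdown, which is exactly how the real function above is meant to be consumed.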
Related
An Azure Storage queue does not listen for messages automatically when we push to the queue; we have to write a custom listener in order to fetch messages from the queue.
import {
  QueueServiceClient,
} from "@azure/storage-queue";
// Create a QueueServiceClient object
const queueServiceClient = QueueServiceClient.fromConnectionString("AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;DefaultEndpointsProtocol=http;BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;QueueEndpoint=http://127.0.0.1:10001/devstoreaccount1;TableEndpoint=http://127.0.0.1:10002/devstoreaccount1;");
// Create a new queue
const queueName = "test-local-queue";
const queueClient = queueServiceClient.getQueueClient(queueName);
queueClient.create().then(() => {
  console.log(`Queue "${queueName}" was created successfully`);
});
// Add a message to the queue
const message = "Hello, world!";
queueClient.sendMessage(message).then(() => {
  console.log(`Message "${message}" was added to the queue`);
});
// Set the number of messages to retrieve (up to 32)
const maxMessages = 10;
// Set the visibility timeout (in seconds) for the messages
const visibilityTimeout = 60;
// Receive messages from the queue (the option is named numberOfMessages, not maxMessages)
queueClient
  .receiveMessages({ numberOfMessages: maxMessages, visibilityTimeout })
  .then(response => {
    const messages = response.receivedMessageItems;
    console.log(`Received ${messages.length} messages from the queue`);
    // Process the messages
    messages.forEach(message => {
      console.log(message);
      // Do something with the message...
    });
  })
  .catch(error => {
    console.error(`Failed to receive messages from the queue: ${error}`);
  });
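The "custom listener" mentioned above is typically just a polling loop around a receive call. Here is a minimal, hedged sketch with the receive function injected, since a real run needs Azurite or a storage account; `makePoller` and `fetchBatch` are illustrative names, not SDK APIs:

```javascript
// Repeatedly invokes fetchBatch and hands each message to onMessage,
// until the returned stop function is called.
function makePoller(fetchBatch, onMessage, intervalMs = 20) {
  let stopped = false;
  (async () => {
    while (!stopped) {
      const batch = await fetchBatch();
      batch.forEach(onMessage);
      await new Promise((resolve) => setTimeout(resolve, intervalMs));
    }
  })();
  return () => { stopped = true; };
}
```

With the real SDK, `fetchBatch` would wrap `queueClient.receiveMessages(...)` and `onMessage` would finish with `queueClient.deleteMessage(...)` so processed messages are actually removed before their visibility timeout expires.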
Does Azure Service Bus solve the above stated problem?
Well, instead of the receiveMessages function, try peekMessages.
By default it peeks at a single message, but you can configure it the way you want. Here I have configured it to peek 3 messages.
const peekedMessages = await queueClient.peekMessages({ numberOfMessages: 3 });
Now the peekedMessages const contains an array called peekedMessageItems, which holds the peeked messages.
Here I have just iterated over that array and read the messages using the messageText property.
Complete Code:
const { QueueServiceClient } = require("@azure/storage-queue");
const queueServiceClient = QueueServiceClient.fromConnectionString(conn);
const queueClient = queueServiceClient.getQueueClient(qname);
const peekedMessages = await queueClient.peekMessages({ numberOfMessages: 3 });
for (let i = 0; i < peekedMessages.peekedMessageItems.length; i++) {
  // Display the peeked message
  console.log("Peeked message: ", peekedMessages.peekedMessageItems[i].messageText);
}
Also, as the name suggests, this will peek at messages, not dequeue them.
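The peek-versus-receive distinction can be illustrated with an in-memory stand-in. This is purely illustrative; `TinyQueue` is not an SDK type, it just mimics the semantics:

```javascript
// In-memory stand-in for a storage queue, showing the semantics:
// peek returns messages but leaves them in place; receive removes them.
class TinyQueue {
  constructor() { this.items = []; }
  send(text) { this.items.push(text); }
  peek(n) { return this.items.slice(0, n); }     // non-destructive
  receive(n) { return this.items.splice(0, n); } // removes from the queue
}
```

So if the goal is that a message is processed exactly once, peekMessages alone is not enough: you still need receiveMessages followed by deleteMessage.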
I have a GCP Pub/Sub messaging event configured for my Node.js application. In other words, there is a topic, there is a subscription assigned to it, and the subscription delivery type is set to Push. This is how the subscription is set up in GCP,
and an example Node app API is as follows:
group.post(
  '/test-data',
  testFunction
);

export const testFunction = (
  req: Request,
  res: Response,
  next: NextFunction
) => {
  const service = (req as any).service as ServiceContainer;
  const available = service.UserDao.find(req.body.id);
  if (available.length > 0) {
    return res.json({ found: 1 });
  }
  return service.UserDao
    .saveIds(req.body.ids)
    .then((response) => {
      const deviceMap: { [id: string]: Device } = {};
      (response || []).forEach((v) => {
        deviceMap[v._id] = v;
      });
      res.json(deviceMap);
    })
    .catch((err) => {
      next(err);
    });
};
Now, before one message is processed, another message is sent to the URL from the subscription. So what I want is to process the requests sequentially: one message should be completely processed before the next one gets processed.
According to Google:
And this option is only available for the Pull delivery type:
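Since ordered delivery is not offered for Push subscriptions, one hedged workaround on the application side is to serialize handling in-process with a promise chain. This is a minimal sketch; `serialize` is an illustrative helper, not part of any Google library, and it only serializes within a single instance (Pub/Sub may still redeliver or fan out across instances):

```javascript
// Wraps an async handler so invocations run strictly one after another,
// even if they arrive concurrently.
function serialize(handler) {
  let chain = Promise.resolve();
  return (...args) => {
    const result = chain.then(() => handler(...args));
    // keep the chain alive even if one invocation rejects
    chain = result.catch(() => {});
    return result;
  };
}
```

An Express route could then wrap its handler, e.g. `const serialized = serialize(processPush); app.post('/test-data', (req, res, next) => serialized(req, res).catch(next));`, where `processPush` is your own handler.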
I am trying to use groups within Azure Web PubSub, but it appears that either my publisher and subscriber are not joining the same group, or my serverless functions are not handling the broadcast after the message gets published. The service works if I publish without groups, but once I add groups I can see messages hitting the live trace tool on Azure, yet no messages are sent out afterwards. I suspect I am missing something in my Azure functions, but I am not sure what that would be.
Publisher code:
const { WebPubSubServiceClient } = require('@azure/web-pubsub');

const hub = "simplechat";
let service = new WebPubSubServiceClient("Endpoint=endpointURL", hub);
// by default it uses `application/json`; specify contentType as `text/plain` if you want plain text
const group = service.group("myGroup");
group.sendToAll('Hello World', { contentType: "text/plain" });
Subscriber code:
const WebSocket = require('ws');
const { WebPubSubServiceClient } = require('@azure/web-pubsub');
var printer = require("printer/lib");
var util = require('util');

async function main() {
  const hub = "simplechat";
  let service = new WebPubSubServiceClient("EndpointEndpointURL", hub);
  const group = service.group("myGroup");
  let token = await service.getClientAccessToken();
  let ws = new WebSocket(token.url, 'json.webpubsub.azure.v1');
  ws.on('open', () => console.log('connected'));
  ws.on('message', data => {
    console.log('Message received: %s', data);
  });
}
main();
I think you missed the part where your subscriber joins the group.
The simplest way is to give the connection a user name and call addUser to add that user to the group once the connection is connected:
async function main() {
  const hub = "simplechat";
  let service = new WebPubSubServiceClient("EndpointEndpointURL", hub);
  const group = service.group("myGroup");
  let token = await service.getClientAccessToken({ userId: "user1" });
  // with this approach, the WebSocket actually does not need the 'json.webpubsub.azure.v1'
  // subprotocol; a simple WebSocket connection also works
  let ws = new WebSocket(token.url, 'json.webpubsub.azure.v1');
  ws.on('open', () => {
    console.log('connected');
    group.addUser("user1");
  });
  ws.on('message', data => {
    console.log('Message received: %s', data);
  });
}
Or you can wait until you receive the Connected response to get the connectionId of the connection, and call addConnection to add the subscriber to the group.
Another way, since you are already using the json.webpubsub.azure.v1 protocol, is for your subscriber to send a joinGroup request:
async function main() {
  const hub = "simplechat";
  let service = new WebPubSubServiceClient("EndpointEndpointURL", hub);
  // make sure you set the joinLeaveGroup role for the group
  let token = await service.getClientAccessToken({
    roles: ['webpubsub.joinLeaveGroup.myGroup']
  });
  let ws = new WebSocket(token.url, 'json.webpubsub.azure.v1');
  let ackId = 0;
  ws.on('open', () => {
    console.log('connected');
    ws.send(JSON.stringify({
      type: 'joinGroup',
      group: 'myGroup',
      ackId: ++ackId,
    }));
  });
  ws.on('message', data => {
    console.log('Message received: %s', data);
  });
}
When your subscriber receives the ack message for this joinGroup action, it has successfully joined the group.
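To act on that ack, the subscriber's message handler has to tell the joinGroup ack apart from actual group messages. A hedged sketch follows; the JSON shapes are those of the json.webpubsub.azure.v1 subprotocol, and `handleFrame` is an illustrative name:

```javascript
// Classifies a raw frame from a json.webpubsub.azure.v1 connection.
// Returns 'joined' for a successful ack, 'group-message' for group data,
// and 'other' for anything else (system events, failed acks, ...).
function handleFrame(raw) {
  const frame = JSON.parse(raw);
  if (frame.type === 'ack' && frame.success) return 'joined';
  if (frame.type === 'message' && frame.from === 'group') return 'group-message';
  return 'other';
}
```

In the subscriber above, you would call this inside `ws.on('message', ...)` and only start treating the connection as subscribed once `'joined'` has been seen.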
I am using Pub/Sub to pull messages when someone buys a subscription from Google Play.
const { PubSub } = require('@google-cloud/pubsub');
const grpc = require('grpc');

// Instantiates a client
const pubSubClient = new PubSub({ grpc });

const pubsub = () => {
  const projectId = process.env.GOOGLE_PUB_SUB_PROJECT_ID; // Your Google Cloud Platform project ID
  const subscriptionName = process.env.GOOGLE_PUB_SUB_SUBSCRIBER_NAME; // Name of our subscription
  const timeout = 60;
  const maxInProgress = 10;
  const subscriberOptions = {
    flowControl: {
      maxMessages: maxInProgress,
    },
  };
  // Get our created subscription
  const subscriptionPub = pubSubClient.subscription(subscriptionName, subscriberOptions);
  console.log(`subscription ${subscriptionPub.name} found.`);
  // Create an event handler to handle messages
  let messageCount = 0;
  const messageHandler = message => {
    console.log(`Received message: ${message.id}`);
    console.log(`Data: ${message.data}`);
    console.log(`Attributes: ${JSON.stringify(message.attributes)}`);
    // todo: you can update your backend here using the purchase token
    messageCount += 1;
    // "Ack" (acknowledge receipt of) the message
    message.ack();
  };
  // Create an event handler to handle errors
  const errorHandler = function (error) {
    console.log(`GOOGLE PUB SUB ERROR: ${error}`);
    throw error;
  };
  // Listen for new messages/errors until timeout is hit
  subscriptionPub.on('message', messageHandler);
  subscriptionPub.on('error', errorHandler);
  setTimeout(() => {
    subscriptionPub.removeListener('message', messageHandler);
    subscriptionPub.removeListener('error', errorHandler);
    console.log(`${messageCount} message(s) received.`);
  }, timeout * 1000);
};

module.exports = pubsub;
The above file is called from main.js, and every 1 minute I am receiving the log subscription ${subscriptionPub.name} found.
Also, I have commented out the setTimeout code for now, but I want to understand why removeListener is important to remove the listeners every minute.
I have created a topic and subscription in Azure. When I try to push my message to the topic and retrieve it with the subscription, I cannot get the messages. Are the messages stored in the queue? Are my messages not getting published?
Push in the topic code
const { ServiceBusClient } = require("@azure/service-bus"); // v1 SDK

const topicName = 'xxxxxxxxxxxxx';

async function main() {
  const sbClient = ServiceBusClient.createFromConnectionString(connectionString);
  const topicClient = sbClient.createTopicClient(topicName);
  const sender = topicClient.createSender();
  try {
    const message = {
      body: req.body.message,
      label: `test`,
    };
    console.log(`Sending message: ${message.body}`);
    await sender.send(message);
    await topicClient.close();
    res.send(message.body);
  } finally {
    await sbClient.close();
  }
}

main()
  .catch((err) => {
    console.log("Error occurred: ", err);
  });
Getting Message via subscription code
const { ServiceBusClient, ReceiveMode } = require("@azure/service-bus"); // v1 SDK

const topicName = 'xxxxxxxxx';
const subscriptionName = "subsTest1";

async function main() {
  const sbClient = ServiceBusClient.createFromConnectionString(connectionString);
  const subscriptionClient = sbClient.createSubscriptionClient(topicName, subscriptionName);
  const receiver = subscriptionClient.createReceiver(ReceiveMode.receiveAndDelete);
  try {
    const messages = await receiver.receiveMessages(10);
    res.send(messages);
    console.log("Received messages:");
    console.log(messages.map(message => message.body));
    await subscriptionClient.close();
  } finally {
    await sbClient.close();
  }
}

main().catch((err) => {
  console.log("Error occurred: ", err);
});
I tested your code; in my test I deleted the request and response parts, and I could send and receive messages. Since in your test you don't know whether the send succeeded, you could use Service Bus Explorer to view the messages. And remember that receiving messages from a subscription can be slow.
Below is my test result. Checking my log, you can see the interval: it does not receive the message immediately.
Run your code to push the message to the topic, then look at the subscription in the Azure portal to see if the message is there. That will at least confirm whether your code is sending the message properly.