I have created a topic and subscription in Azure. When I try to push a message to the topic and retrieve it with the subscription, I cannot get the messages. Are the messages stored in the queue? Are my messages not getting published?
Code to push to the topic:
const { ServiceBusClient } = require("@azure/service-bus"); // v1.x of the package
const connectionString = "xxxxxxxxxxxxx";
const topicName = "xxxxxxxxxxxxx";

async function main() {
  const sbClient = ServiceBusClient.createFromConnectionString(connectionString);
  const topicClient = sbClient.createTopicClient(topicName);
  const sender = topicClient.createSender();
  try {
    const message = {
      body: req.body.message, // req/res come from the surrounding Express handler
      label: "test",
    };
    console.log(`Sending message: ${message.body}`);
    await sender.send(message);
    await topicClient.close();
    res.send(message.body);
  } finally {
    await sbClient.close();
  }
}

main().catch((err) => {
  console.log("Error occurred: ", err);
});
Code to receive messages via the subscription:
const { ServiceBusClient, ReceiveMode } = require("@azure/service-bus"); // v1.x of the package
const connectionString = "xxxxxxxxx";
const topicName = "xxxxxxxxx";
const subscriptionName = "subsTest1";

async function main() {
  const sbClient = ServiceBusClient.createFromConnectionString(connectionString);
  const subscriptionClient = sbClient.createSubscriptionClient(topicName, subscriptionName);
  const receiver = subscriptionClient.createReceiver(ReceiveMode.receiveAndDelete);
  try {
    const messages = await receiver.receiveMessages(10);
    res.send(messages); // res comes from the surrounding Express handler
    console.log("Received messages:");
    console.log(messages.map((message) => message.body));
    await subscriptionClient.close();
  } finally {
    await sbClient.close();
  }
}

main().catch((err) => {
  console.log("Error occurred: ", err);
});
I tested your code (in my test I removed the request/response parts) and I could send and receive messages. Since your test doesn't tell you whether the send succeeded, you could use Service Bus Explorer to view the messages. And remember that receiving messages from a subscription can be slow.
Below is my test result. If you check my log, you can see from the intervals that a message is not received immediately.
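Because of that delay, it may help to give the receiver an explicit wait time rather than expecting messages immediately. A minimal sketch against the v1 @azure/service-bus API used above, where (if I recall the signature correctly) the second argument of receiveMessages is an idle timeout in seconds:

const receiver = subscriptionClient.createReceiver(ReceiveMode.receiveAndDelete);
// Wait up to 20 seconds for up to 10 messages instead of returning right away
const messages = await receiver.receiveMessages(10, 20);
console.log(messages.map((message) => message.body));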
Run your code to push the message to the topic, then look at the subscription in the Azure portal to see whether the message is there. That will at least confirm whether your code is sending the message properly.
An Azure storage queue does not listen for messages automatically when we push to the queue; we have to write a custom listener in order to fetch messages from the queue.
import {
  QueueServiceClient,
} from "@azure/storage-queue";

// Create a QueueServiceClient object
const queueServiceClient = QueueServiceClient.fromConnectionString("AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;DefaultEndpointsProtocol=http;BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;QueueEndpoint=http://127.0.0.1:10001/devstoreaccount1;TableEndpoint=http://127.0.0.1:10002/devstoreaccount1;");

// Create a new queue
const queueName = "test-local-queue";
const queueClient = queueServiceClient.getQueueClient(queueName);
queueClient.create().then(() => {
  console.log(`Queue "${queueName}" was created successfully`);
});

// Add a message to the queue
const message = "Hello, world!";
queueClient.sendMessage(message).then(() => {
  console.log(`Message "${message}" was added to the queue`);
});

// Set the number of messages to retrieve (up to 32)
const maxMessages = 10;
// Set the visibility timeout (in seconds) for the messages
const visibilityTimeout = 60;

// Receive messages from the queue
queueClient
  .receiveMessages({ numberOfMessages: maxMessages, visibilityTimeout })
  .then((response) => {
    const messages = response.receivedMessageItems;
    console.log(`Received ${messages.length} messages from the queue`);
    // Process the messages
    messages.forEach((message) => {
      console.log(message);
      // Do something with the message...
    });
  })
  .catch((error) => {
    console.error(`Failed to receive messages from the queue: ${error}`);
  });
Does Azure Service Bus solve the above-stated problem?
Well, instead of the receiveMessages function, try peekMessages.
By default it will peek at a single message, but you can configure it however you want. Here I have configured it to peek 3 messages.
const peekedMessages = await queueClient.peekMessages({ numberOfMessages: 3 });
Now the peekedMessages result contains an array called peekedMessageItems, which holds the peeked messages.
Here I have just iterated over that array and read each message via the messageText property.
Complete Code:
const { QueueServiceClient } = require("@azure/storage-queue");

async function main() {
  // conn and qname are assumed to be defined elsewhere (connection string and queue name)
  const queueServiceClient = QueueServiceClient.fromConnectionString(conn);
  const queueClient = queueServiceClient.getQueueClient(qname);
  const peekedMessages = await queueClient.peekMessages({ numberOfMessages: 3 });
  for (let i = 0; i < peekedMessages.peekedMessageItems.length; i++) {
    // Display the peeked message
    console.log("Peeked message: ", peekedMessages.peekedMessageItems[i].messageText);
  }
}
Also, as the name suggests, this will peek at the messages, not dequeue them.
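If the goal is to actually consume messages rather than just inspect them, the usual @azure/storage-queue pattern is to receive them (which hides them for the visibility timeout) and then delete each one using its messageId and popReceipt. A minimal sketch, reusing the queueClient from the snippet above inside the same async function:

const received = await queueClient.receiveMessages({ numberOfMessages: 3 });
for (const msg of received.receivedMessageItems) {
  console.log("Processing message: ", msg.messageText);
  // deleteMessage dequeues the message for good; until then it would reappear
  // after the visibility timeout expires
  await queueClient.deleteMessage(msg.messageId, msg.popReceipt);
}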
I am working on a job bidding app.
Each user has a field "User job notifications preferences".
This array field stores the job types the user would like to receive notifications for.
For example:
Person A has the setting to receive a notification when a job of type 'plumbing' is created.
Person B has the setting to receive a notification when a job of type 'electrical' is created.
Person C creates a plumbing job.
Person A should receive a notification to let them know a new job of type 'plumbing' has been created.
Here is the code snippet:
// when a job is updated from new to open
// send notifications to the users that signed up for that job type notification
exports.onJobUpdateFromNewToOpen = functions.firestore
  .document("job/{docId}")
  .onUpdate(async (change, eventContext) => {
    const beforeSnapData = change.before.data();
    const afterSnapData = change.after.data();
    const jobType = afterSnapData["Job type"];
    const afterJobState = afterSnapData["Job state"];
    const beforeJobState = beforeSnapData["Job state"];
    console.log("job updated");
    // only consider jobs switching from new to open
    if (beforeJobState == "New" && afterJobState == "Open") {
      console.log("job updated from new to open");
      console.log("jobType: " + jobType);
      console.log("job id: " + change.after.id);
      // get users that contain the matching job type
      const usersWithJobTypePreferenceList = await admin.firestore().collection("user").where("User job notifications preferences", "array-contains-any", jobType).get();
      // get their userIds
      const userIdsList = [];
      usersWithJobTypePreferenceList.forEach((doc) => {
        const userId = doc.data()["User id"];
        userIdsList.push(userId);
      });
      // get their user tokens
      const userTokenList = [];
      for (const user in userIdsList) {
        const userId = userIdsList[user];
        const userToken = (await admin.firestore().collection("user token").doc(userId).get()).data()["token"];
        userTokenList.push(userToken);
      }
      // send message
      const messageTitle = "new " + jobType + " has been created";
      for (const token in userTokenList) {
        const userToken = userTokenList[token];
        const payload = {
          notification: {
            title: messageTitle,
            body: messageTitle,
            sound: "default",
          },
          data: {
            click_action: "FLUTTER_NOTIFICATION_CLICK",
            message: "Sample Push Message",
          },
        };
        await admin.messaging().sendToDevice(userToken, payload);
      }
    }
  });
I think the issue is in the following line, because I am getting the error 'Error: 3 INVALID_ARGUMENT: 'ARRAY_CONTAINS_ANY' requires an ArrayValue':
const usersWithJobTypePreferenceList = await admin.firestore().collection("user").where("User job notifications preferences", "array-contains-any", jobType).get();
Below is the full error:
Error: 3 INVALID_ARGUMENT: 'ARRAY_CONTAINS_ANY' requires an ArrayValue.
    at Object.callErrorFromStatus (/workspace/node_modules/@grpc/grpc-js/build/src/call.js:31:19)
    at Object.onReceiveStatus (/workspace/node_modules/@grpc/grpc-js/build/src/client.js:352:49)
    at Object.onReceiveStatus (/workspace/node_modules/@grpc/grpc-js/build/src/client-interceptors.js:328:181)
    at /workspace/node_modules/@grpc/grpc-js/build/src/call-stream.js:188:78
at processTicksAndRejections (node:internal/process/task_queues:78:11)
I interpret the error as follows: no value is being passed to 'jobType'. But that can't be right, because I am printing the value.
I found the following related questions, but I don't think I am having the same issue:
Getting firestore data from a Google Cloud Function with array-contains-any
Firestore: Multiple 'array-contains'
So I am not sure what the issue is here. Any ideas?
Here is how the data looks in Firebase:
I looked at similar questions and printed the values being passed to the function that was raising the error. The 'jobType' value is a single string, while array-contains-any expects an array of values to match against, which is why Firestore complained that it 'requires an ArrayValue'. I updated the line that was giving me the issue, and now everything works :) :

const usersWithJobTypePreferenceList = await admin.firestore().collection("user").where("User job notifications preferences", "array-contains", jobType).get();
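For contrast, a minimal sketch of the case array-contains-any is meant for: it matches documents whose array field contains any value from a list, so its third argument must itself be an array (the job types below are purely illustrative):

// Matches users whose preferences array contains "plumbing" OR "electrical"
const users = await admin.firestore()
  .collection("user")
  .where("User job notifications preferences", "array-contains-any", ["plumbing", "electrical"])
  .get();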
I'm new to RabbitMQ and trying to figure out how to delete a queue after a message has been received. Any help appreciated. Here is the consumer script:
const amqp = require("amqplib");

let result = connect();

async function connect() {
  try {
    const amqpServer = "amqp://localhost";
    const connection = await amqp.connect(amqpServer);
    const channel = await connection.createChannel();
    await channel.assertQueue("jobs");
    channel.consume("jobs", (message) => {
      const input = JSON.parse(message.content.toString());
      console.log(`Received job with input ${input}`);
    });
    console.log("Waiting for messages...");
  } catch (ex) {
    console.error(ex);
  }
}
According to the assertQueue docs, you can pass an autoDelete option when creating the queue, which will clean it up after the number of consumers drops to 0.
const amqp = require("amqplib");

let result = connect();

async function connect() {
  try {
    const amqpServer = "amqp://localhost";
    const connection = await amqp.connect(amqpServer);
    const channel = await connection.createChannel();
    await channel.assertQueue("jobs", { autoDelete: true });
    channel.consume("jobs", (message) => {
      const input = JSON.parse(message.content.toString());
      console.log(`Received job with input ${input}`);
    });
    console.log("Waiting for messages...");
  } catch (ex) {
    console.error(ex);
  }
}
You could then call cancel on the channel to stop consuming messages. Note that cancel takes the consumerTag returned by consume, not the queue name:
const { consumerTag } = await channel.consume("jobs", (message) => { /* ... */ });
await channel.cancel(consumerTag);
Lastly, you could forcefully delete the queue with deleteQueue, although this might have some strange side effects if you call it from inside the consume callback.
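Tying this back to the original question (delete the queue after a message has been received), here is a minimal sketch, assuming the same "jobs" queue and channel as above and that we are inside an async function; it waits for one message to be handled before deleting, to avoid calling deleteQueue from inside the callback:

const handled = new Promise((resolve) => {
  channel.consume("jobs", (message) => {
    const input = JSON.parse(message.content.toString());
    console.log(`Received job with input ${input}`);
    channel.ack(message);
    resolve();
  });
});
await handled;
// Deleting the queue also cancels any consumer still attached to it
const { messageCount } = await channel.deleteQueue("jobs");
console.log(`Deleted "jobs" (${messageCount} messages were still queued)`);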
I am trying to publish messages via the Google Pub/Sub batch publishing feature. The batch publishing code looks like below.
const grpc = require("grpc");
const { PubSub } = require("@google-cloud/pubsub");

const createPublishEventsInBatch = (topic) => {
  const pubSub = new PubSub({ grpc });
  const batchPublisher = pubSub.topic(topic, {
    batching: {
      maxMessages: 100,
      maxMilliseconds: 1000,
    },
  });
  return async (logTrace, eventData) => {
    console.log("Publishing batch events for", eventData);
    try {
      await batchPublisher.publish(Buffer.from(JSON.stringify(eventData)));
    } catch (err) {
      console.error("Error in publishing", err);
    }
  };
};
And this batch publisher is called from a service like this:
const publishEventsInBatch1 = publishEventFactory.createPublishEventsInBatch("topicName1");
const publishEventsInBatch2 = publishEventFactory.createPublishEventsInBatch("topicName2");

events.forEach((event) => {
  publishEventsInBatch1(logTrace, event);
  publishEventsInBatch2(logTrace, event);
});
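Incidentally, the forEach above fires the async publish functions without awaiting the promises they return, so the caller moves on before the batches are flushed and publish failures surface only in the console. A minimal sketch of awaiting them, assuming the surrounding function is async:

await Promise.all(
  events.flatMap((event) => [
    publishEventsInBatch1(logTrace, event),
    publishEventsInBatch2(logTrace, event),
  ])
);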
I am using a push subscription to receive the messages, with the below settings:
Acknowledgement deadline: 600 seconds
Retry policy: Retry immediately
The issue I am facing: if the total number of events/messages is 250k, the push subscription should receive at most 250k messages. But in my case I am getting 3-4M records on the subscription, and the number varies.
My Fastify and Pub/Sub configuration is:
fastify: 3.10.1
@google-cloud/pubsub: 2.12.0
Here is the subscription code:
fastify.post("/subscription", async (req, reply) => {
const message = req.body.message;
let event;
let data;
let entityType;
try {
let payload = Buffer.from(message.data, "base64").toString();
event = JSON.parse(payload);
data = event.data;
entityType = event.entityType;
if (entityType === "EVENT") {
if (event.version === "1.0") {
console.log("Processing subscription");
await processMessage(fastify, data);
} else {
console.error("Unknown version of stock event, being ignored");
}
} else {
console.error("Ignore event");
}
reply.code(200).send();
} catch (err) {
if (err.status === 409) {
console.error("Ignoring stock update due to 409: Conflict");
reply.code(200).send();
} else {
console.error("Error while processing event from subscription");
reply.code(500).send();
}
}
});
Can anyone guide me on where I am making a mistake? It's a simple Fastify application. Am I making a mistake in the code or in the configuration?
Previously, I was using the azure-sb package to handle Service Bus messages in NodeJS, with the sample code below:
let message = {
  body: JSON.stringify(body),
  customProperties: {
    userId: userId
  }
};
However, after changing to the @azure/service-bus package, I needed to change it a little to get the body in my C# code, as below:
let signMessage = {
  body: body,
  customProperties: { // tried to use userProperties but it did not work either
    userId: userId
  }
};
However, I still cannot get the userProperties successfully in C# or in Service Bus Explorer.
In the current @azure/service-bus package (v7), custom metadata goes on the message's applicationProperties field rather than customProperties or userProperties. Simple code:
const { ServiceBusClient } = require("@azure/service-bus");

const connectionString = "Endpoint=sb://bowman1012.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=xxxxxx";
const topicName = "test";

const messages = [
  {
    body: "Albert Einstein",
    applicationProperties: {
      userId: "userId"
    }
  }
];

async function main() {
  // create a Service Bus client using the connection string to the Service Bus namespace
  const sbClient = new ServiceBusClient(connectionString);

  // createSender() can also be used to create a sender for a queue.
  const sender = sbClient.createSender(topicName);

  try {
    // Tries to send all messages in a single batch.
    // Will fail if the messages cannot fit in a batch.
    // await sender.sendMessages(messages);

    // create a batch object
    let batch = await sender.createMessageBatch();
    for (let i = 0; i < messages.length; i++) {
      // for each message in the array
      // try to add the message to the batch
      if (!batch.tryAddMessage(messages[i])) {
        // if it fails to add the message to the current batch
        // send the current batch as it is full
        await sender.sendMessages(batch);

        // then, create a new batch
        batch = await sender.createMessageBatch();

        // now, add the message that failed to be added to the previous batch to this batch
        if (!batch.tryAddMessage(messages[i])) {
          // if it still can't be added to the batch, the message is probably too big to fit in a batch
          throw new Error("Message too big to fit in a batch");
        }
      }
    }

    // Send the last created batch of messages to the topic
    await sender.sendMessages(batch);
    console.log(`Sent a batch of messages to the topic: ${topicName}`);

    // Close the sender
    await sender.close();
  } finally {
    await sbClient.close();
  }
}

// call the main function
main().catch((err) => {
  console.log("Error occurred: ", err);
  process.exit(1);
});
It works fine on my side.
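For reference, a minimal sketch of reading the property back on the JavaScript side with the same v7 package, assuming the sbClient and topicName from the snippet above and a subscription named "sub1" (a placeholder):

const receiver = sbClient.createReceiver(topicName, "sub1");
const received = await receiver.receiveMessages(1, { maxWaitTimeInMs: 5000 });
for (const msg of received) {
  // Custom metadata round-trips via applicationProperties
  console.log(msg.body, msg.applicationProperties.userId);
  await receiver.completeMessage(msg);
}
await receiver.close();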
This is the API reference:
https://learn.microsoft.com/en-us/javascript/api/@azure/service-bus/servicebusmessage?view=azure-node-latest