RabbitMQ and Node.js: delete queue after message has been received

I'm new to RabbitMQ and trying to figure out how to delete a queue after a message has been received. Any help appreciated. Here is my consumer script:
const amqp = require("amqplib");

let result = connect();

async function connect() {
  try {
    const amqpServer = "amqp://localhost";
    const connection = await amqp.connect(amqpServer);
    const channel = await connection.createChannel();
    await channel.assertQueue("jobs");
    channel.consume("jobs", (message) => {
      const input = JSON.parse(message.content.toString());
      console.log(`Received job with input ${input}`);
    });
    console.log("Waiting for messages...");
  } catch (ex) {
    console.error(ex);
  }
}

According to the assertQueue docs, you can pass an autoDelete option when creating the queue, which will clean it up after the number of consumers drops to zero.
const amqp = require("amqplib");

let result = connect();

async function connect() {
  try {
    const amqpServer = "amqp://localhost";
    const connection = await amqp.connect(amqpServer);
    const channel = await connection.createChannel();
    await channel.assertQueue("jobs", { autoDelete: true });
    channel.consume("jobs", (message) => {
      const input = JSON.parse(message.content.toString());
      console.log(`Received job with input ${input}`);
    });
    console.log("Waiting for messages...");
  } catch (ex) {
    console.error(ex);
  }
}
You could then call cancel on the channel to stop consuming messages. Note that cancel expects the consumer tag returned from consume (also available as message.fields.consumerTag), not the queue name:
const { consumerTag } = await channel.consume("jobs", (message) => { /* ... */ });
await channel.cancel(consumerTag);
Lastly, you could forcefully delete the queue with deleteQueue, although this might have some strange side effects if you do it inside a consumer callback.
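Putting the pieces together, a minimal sketch of "delete the queue after the first message" might cancel the consumer and then delete the queue (assuming it is acceptable to drop any messages still queued):

channel.consume("jobs", async (message) => {
  const input = JSON.parse(message.content.toString());
  console.log(`Received job with input ${input}`);
  channel.ack(message);
  // Stop this consumer first (its tag is available on the message itself),
  // then remove the queue entirely.
  await channel.cancel(message.fields.consumerTag);
  await channel.deleteQueue("jobs");
});

deleteQueue also accepts { ifUnused, ifEmpty } options if you would rather have the delete fail than silently drop pending messages.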

Related

Using Node-Redis throws an error saying 'Error: Socket already opened'

I am using Redis for a notification service; the library is Node-Redis, latest version 4.5.1.
const Redis = require('redis');
const redisClient = Redis.createClient({ url: 'redis://127.0.0.1' });

class NotificationService {
  async getNotificationCount(userId) {
    let llen = 0;
    try {
      await redisClient.connect();
      const notificationKey = `user:notification:${userId}`;
      llen = await redisClient.lLen(notificationKey);
      redisClient.quit();
    } catch (err) {
      console.log("Err:", err);
    }
    return llen;
  }

  async getNotifications(userId, pageNum, pageSize) {
    let offset = (pageNum - 1) * pageSize;
    let lpopList1 = [];
    try {
      await redisClient.connect();
      const notificationKey = `user:notification:${userId}`;
      lpopList1 = await redisClient.lRange(notificationKey, 0, -1);
      redisClient.quit();
    } catch (err) {
      console.log("Err:", err);
    }
    return lpopList1;
  }
  // ...
}
As noted, there is only one client, and it is reused for every request: each call opens a new connection with await redisClient.connect() and closes it with redisClient.quit().
However, the app sometimes throws 'Err: Error: Socket already opened'.
Why is this happening, and is there a better way to deal with it?
Update:
The error happens when getNotifications() and getNotificationCount() are called at the same time.
When I call only one function at a time, the error does not happen.
So it seems one Redis client in this library cannot handle concurrency. Is there a best practice?
Connecting to Redis can be nontrivial, especially when dealing with multi-node topologies, so you should reuse your connections as much as possible. Instead of connecting and disconnecting on every call, just leave the client connected: node-redis handles reconnections automatically in the event of a network failure, so you don't have to worry about that.
Of course, whether you bind the lifespan of the connection to the entire process or, perhaps, to your NotificationService class is up to you and your business needs.
const Redis = require('redis');
const redisClient = Redis.createClient({ url: 'redis://127.0.0.1' });

// ...
await redisClient.connect();
// ...

class NotificationService {
  async getNotificationCount(userId) {
    let llen = 0;
    try {
      const notificationKey = `user:notification:${userId}`;
      llen = await redisClient.lLen(notificationKey);
    } catch (err) {
      console.log("Err:", err);
    }
    return llen;
  }

  async getNotifications(userId, pageNum, pageSize) {
    let offset = (pageNum - 1) * pageSize;
    let lpopList1 = [];
    try {
      const notificationKey = `user:notification:${userId}`;
      lpopList1 = await redisClient.lRange(notificationKey, 0, -1);
    } catch (err) {
      console.log("Err:", err);
    }
    return lpopList1;
  }
  // ...
}
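If you would rather connect lazily, a minimal sketch (an addition, not from the original answer; getRedis is a hypothetical helper) is to memoize the connect() promise so concurrent callers share one connection attempt instead of racing to open the socket:

let connectPromise = null;

// Concurrent callers all await the same connect() promise,
// so the socket is only ever opened once.
function getRedis() {
  if (!connectPromise) {
    connectPromise = redisClient.connect().then(() => redisClient);
  }
  return connectPromise;
}

Each method would then start with const client = await getRedis(); and never call quit().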

Socket connection congests whole Node.js application

I have a socket connection using zmq.js client:
// routerSocket.ts
const zmqRouter = zmq.socket("router");

zmqRouter.bind(`tcp://*:${PORT}`);

zmqRouter.on("message", async (...frames) => {
  try {
    const { measurementData, measurementHeader } =
      await decodeL2Measurement(frames[frames.length - 1]);
    addHeaderInfo(measurementHeader);
    // Add cell id to the list
    process.send(
      { measurementData, measurementHeader, headerInfoArrays },
      (e: any) => {
        return;
      },
    );
  } catch (e: any) {
    return;
  }
});
I run this socket connection within a forked process in index.ts:
// index.ts
const zmqProcess = fork("./src/routerSocket");

zmqProcess.on("message", async (data: ZmqMessage) => {
  if (data !== undefined) {
    const { measurementData, measurementHeader, headerInfoArrays } = data;
    headerInfo = headerInfoArrays;
    emitHeaderInfo(headerInfoArrays);
    // Emit the message to subscribers of the rnti
    const a = performance.now();
    io.emit(
      measurementHeader.nrCellId,
      JSON.stringify({ measurementData, measurementHeader }),
    );
    // Emit the message to the all channel
    io.emit("all", JSON.stringify({ measurementData, measurementHeader }));
    const b = performance.now();
    console.log("time to emit: ", b - a);
  }
});
Data comes in rapidly (about one message per millisecond) to the zmqRouter object, which processes it and sends it on to the main process, where I use socket.io to distribute it to clients. But as soon as the stream begins, Node can't do anything else: even a setInterval log stops firing.
Thank you for your help!
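The thread has no answer here; one common mitigation for this kind of event-loop saturation, sketched below as an assumption rather than a confirmed fix, is to batch the per-message work in the child process so process.send (and io.emit downstream) run a few times per second instead of once per millisecond:

// routerSocket sketch: buffer decoded measurements and flush them in batches
const pending = [];

zmqRouter.on("message", async (...frames) => {
  try {
    pending.push(await decodeL2Measurement(frames[frames.length - 1]));
  } catch (e) {
    return; // drop frames that fail to decode
  }
});

// Flush on a fixed interval; splice empties the buffer and returns its contents.
setInterval(() => {
  if (pending.length === 0) return;
  process.send({ batch: pending.splice(0, pending.length) });
}, 50);

The parent would then emit once per batch rather than once per message, which leaves the event loop free to service timers and other I/O.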

Google Pub/Sub pull method restarts Express server every 1 min

I am using Pub/Sub to pull messages when someone buys a subscription from Google Play.
const { PubSub } = require('@google-cloud/pubsub');
const grpc = require('grpc');

// Instantiates a client
const pubSubClient = new PubSub({ grpc });

const pubsub = () => {
  const projectId = process.env.GOOGLE_PUB_SUB_PROJECT_ID; // Your Google Cloud Platform project ID
  const subscriptionName = process.env.GOOGLE_PUB_SUB_SUBSCRIBER_NAME; // Name of our subscription
  const timeout = 60;
  const maxInProgress = 10;

  const subscriberOptions = {
    flowControl: {
      maxMessages: maxInProgress,
    },
  };

  // Get our created subscription
  const subscriptionPub = pubSubClient.subscription(subscriptionName, subscriberOptions);
  console.log(`subscription ${subscriptionPub.name} found.`);

  // Create an event handler to handle messages
  let messageCount = 0;
  const messageHandler = (message) => {
    console.log(`Received message: ${message.id}`);
    console.log(`Data: ${message.data}`);
    console.log(`Attributes: ${JSON.stringify(message.attributes)}`);
    // todo: you can update your backend here using the purchase token
    messageCount += 1;
    // "Ack" (acknowledge receipt of) the message
    message.ack();
  };

  // Create an event handler to handle errors
  const errorHandler = function (error) {
    console.log(`GOOGLE PUB SUB ERROR: ${error}`);
    throw error;
  };

  // Listen for new messages/errors until the timeout is hit
  subscriptionPub.on('message', messageHandler);
  subscriptionPub.on('error', errorHandler);

  setTimeout(() => {
    subscriptionPub.removeListener('message', messageHandler);
    subscriptionPub.removeListener('error', errorHandler);
    console.log(`${messageCount} message(s) received.`);
  }, timeout * 1000);
};

module.exports = pubsub;
The file above is called from main.js, and every minute I see the log subscription ${subscriptionPub.name} found.
I have commented out the setTimeout code for now, but I want to understand why removeListener matters: why remove the listener every minute?

Batch commit being called before batch finishes

I'm getting "Cannot modify a WriteBatch that has been committed." in this snippet of code, although I'm not sure why batch.commit() is not waiting for the forEach to finish.
const db = admin.firestore();
const batch = db.batch();
const channelIds = [];

const messages = data
  .map((item) => {
    if (!item || !item.phone_number)
      return null;
    const msg = pupa(message, item);
    if (!channelIds.includes(item.channel.id))
      channelIds.push(item.channel.id);
    return {
      ...item,
      message: msg
    };
  })
  .filter((msg) => msg);

logger.info(`Creating messages/${messageId}/sms entries. [Count = ${messages.length}]`);

// For every channel referenced in the messages array, fetch its remaining sms credits.
channelIds.forEach(async (channelId) => {
  const subscriptionDetails = (await admin.firestore()
    .collection('channels')
    .doc(channelId)
    .collection('subscription')
    .doc('details')
    .get()).data();
  const creditsRemaining = subscriptionDetails.limits.snapshot.sms_notifications - subscriptionDetails.limits.used.sms_notifications;

  // Queue messages for this channel, capped at its remaining credits.
  messages
    .filter((item) => item.channel.id === channelId)
    .slice(0, creditsRemaining)
    .forEach((msg) => {
      batch.set(db.collection('messages')
        .doc(messageId)
        .collection('sms')
        .doc(), {
          phone_number: msg.phone_number,
          message: msg.message
        });
    });
});

await batch.commit();
EDIT: I fixed this issue by wrapping the forEach in a Promise. Thanks!
It seems the async callbacks passed to channelIds.forEach are never awaited (forEach ignores the promises they return), which means your await batch.commit() runs before all the batch.set() calls inside them have completed.
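A minimal sketch of the fix the asker describes, wrapping the per-channel work in promises and awaiting them all before committing:

// map returns one promise per channel; Promise.all waits until every
// batch.set() has been registered before the batch is committed.
await Promise.all(channelIds.map(async (channelId) => {
  const subscriptionDetails = (await db
    .collection('channels')
    .doc(channelId)
    .collection('subscription')
    .doc('details')
    .get()).data();
  const creditsRemaining =
    subscriptionDetails.limits.snapshot.sms_notifications -
    subscriptionDetails.limits.used.sms_notifications;

  messages
    .filter((item) => item.channel.id === channelId)
    .slice(0, creditsRemaining)
    .forEach((msg) => {
      batch.set(db.collection('messages').doc(messageId).collection('sms').doc(), {
        phone_number: msg.phone_number,
        message: msg.message,
      });
    });
}));

await batch.commit();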

Writing a unit test case in Node.js using mocha to mock an Azure Service Bus queue to receive messages

I have written a unit test case, but it is giving an error.
Please find the code below.
index.js
const { ServiceBusClient, ReceiveMode } = require('@azure/service-bus');

module.exports = async function (context, myTimer) {
  // Define connection string and related Service Bus entity names here
  const connectionString = process.env['serviceBusConnectionString'];
  const queueName = process.env['serviceBusQueueName'];
  const sbClient = ServiceBusClient.createFromConnectionString(connectionString);
  const queueClient = sbClient.createQueueClient(queueName);
  // const receiver = queueClient.createReceiver(ReceiveMode.receiveAndDelete);
  const receiver = queueClient.createReceiver(ReceiveMode.peekLock);
  const messages = await receiver.receiveMessages(1);
  try {
    let payloads = [];
    messages.forEach((msg) => {
      payloads.push(msg.body);
    });
    await queueClient.close();
  } catch (err) {
    context.log('Queue message status settle: abandon');
    await messages[0].abandon();
    console.log('Error ', err);
  } finally {
    await sbClient.close();
    context.done();
  }
};
This is the unit test file, and I am getting an error (shown in a screenshot in the original post). Please let me know why.
indexTest.js:
beforeEach(() => {
  const sbClientStub = {
    createQueueClient: sinon.stub().returnsThis(),
    createReceiver: sinon.stub().returnsThis(),
    receiveMessages: sinon.stub(),
    close: sinon.stub(),
  };
  sinon.stub(ServiceBusClient, 'createFromConnectionString').callsFake(() => sbClientStub);

  const ctx = {};
  // const actual = await pushToQueue(message, ctx);
  // sinon.assert.match(actual, 2);

  sinon.assert.calledWithExactly(ServiceBusClient.createFromConnectionString, undefined);
  sinon.assert.calledWithExactly(sbClientStub.createQueueClient, undefined);
  sinon.assert.calledOnce(sbClientStub.createReceiver, undefined);
  // sinon.assert.calledWithExactly(sbClientStub.send.firstCall, { body: 'a' });
  // sinon.assert.calledWithExactly(sbClientStub.send.secondCall, { body: 'b' });
  sinon.assert.calledTwice(sbClientStub.close);
});
You could replace every sinon.stub() with sinon.spy(): stubs prevent the original method implementations from being called, while spies call through to them, and the two have basically the same API.
In order to call the original methods of @azure/service-bus, make sure its resources are ready: environment variables, a service account, the queue, and so on.
If you do that, however, the tests are no longer isolated; in fact, they are no longer unit tests, but integration tests, or e2e tests.
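If you keep the stubs instead, two things stand out (assumptions, since the error screenshot is not shown): the call to the function under test is commented out, so the sinon.assert checks run against stubs that were never invoked; and receiveMessages as stubbed resolves to undefined, so messages.forEach in index.js would throw. A minimal sketch of a stub set that lets the function run to completion (the message shape here is an assumption):

const sinon = require('sinon');
const { ServiceBusClient } = require('@azure/service-bus');

const sbClientStub = {
  createQueueClient: sinon.stub().returnsThis(),
  createReceiver: sinon.stub().returnsThis(),
  // Resolve an array so `await receiver.receiveMessages(1)` yields something
  // iterable; the abandon stub covers the error path in index.js.
  receiveMessages: sinon.stub().resolves([{ body: 'a', abandon: sinon.stub().resolves() }]),
  close: sinon.stub().resolves(),
};
sinon.stub(ServiceBusClient, 'createFromConnectionString').returns(sbClientStub);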
