I am trying to create a Kafka consumer health check using an event emitter. I have tried the following code, which checks the heartbeat and the consumer.crash event:
const isHealthy = async function () {
  const { HEARTBEAT } = consumer.events;
  let lastHeartbeat;
  let crashVal;
  console.log(consumer);
  consumer.on(HEARTBEAT, ({ timestamp }) => {
    console.log("Inside consumer on HeartbeatVal: " + timestamp);
  });
  console.log("consumer after binding heartbeat: " + JSON.stringify(consumer));
  consumer.on('consumer.crash', event => {
    const error = event?.payload?.error;
    crashVal = error;
    console.log("Hello error: " + JSON.stringify(error));
  });
  console.log("diff: " + (Date.now() - lastHeartbeat));
  if (Date.now() - lastHeartbeat < SESSION_TIMEOUT) {
    return true;
  }
  // Consumer has not heartbeat, but maybe it's because the group is currently rebalancing
  try {
    console.log("Inside Describe group");
    const flag = await consumer.describeGroup();
    const { state } = await consumer.describeGroup();
    console.log("state: " + state);
    if (state === 'Stable') {
      return true;
    }
    return ['CompletingRebalance', 'PreparingRebalance'].includes(state);
  } catch (ex) {
    return false;
  }
};
Could you help provide a solution for the above?
I have a socket connection using a zmq.js client:
// routerSocket.ts
const zmqRouter = zmq.socket("router");
zmqRouter.bind(`tcp://*:${PORT}`);
zmqRouter.on("message", async (...frames) => {
  try {
    const { measurementData, measurementHeader } =
      await decodeL2Measurement(frames[frames.length - 1]);
    addHeaderInfo(measurementHeader);
    // Add cell id to the list
    process.send(
      { measurementData, measurementHeader, headerInfoArrays },
      (e: any) => {
        return;
      },
    );
  } catch (e: any) {
    return;
  }
});
I run this socket connection within a forked process in index.ts:
// index.ts
const zmqProcess = fork("./src/routerSocket");
zmqProcess.on("message", async (data: ZmqMessage) => {
  if (data !== undefined) {
    const { measurementData, measurementHeader, headerInfoArrays } = data;
    headerInfo = headerInfoArrays;
    emitHeaderInfo(headerInfoArrays);
    // Emit the message to subscribers of the rnti
    const a = performance.now();
    io.emit(
      measurementHeader.nrCellId,
      JSON.stringify({ measurementData, measurementHeader }),
    );
    // Emit the message to the all channel
    io.emit("all", JSON.stringify({ measurementData, measurementHeader }));
    const b = performance.now();
    console.log("time to emit: ", b - a);
  }
});
Data comes in rapidly, about one message per millisecond, to the zmqRouter object, which processes each message and sends it on to the main process, where I use socket.io to distribute the data to clients. But as soon as the stream begins, Node can't do anything else; even a setInterval log stops firing.
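One mitigation worth sketching, assuming the bottleneck is the two JSON.stringify + io.emit calls per incoming message: buffer messages as they arrive and flush them in batches on a timer, so the event loop gets room between bursts. `onZmqMessage`, the 50 ms interval, and the fake emitter below are illustrative, not from the original code:

```javascript
// Batching sketch: queue incoming messages cheaply, then emit one
// serialized batch per flush instead of two emits per message.
const buffer = [];

function onZmqMessage(msg) {
  buffer.push(msg); // cheap: just queue the decoded message
}

function flush(emit) {
  if (buffer.length === 0) return;
  const batch = buffer.splice(0, buffer.length); // drain the queue
  emit("all", JSON.stringify(batch)); // one stringify/emit per batch
}

// In the real app this would run on a timer, e.g.:
// setInterval(() => flush((ch, payload) => io.emit(ch, payload)), 50);

// Tiny demonstration with a fake emitter:
const sent = [];
onZmqMessage({ n: 1 });
onZmqMessage({ n: 2 });
flush((channel, payload) => sent.push([channel, payload]));
console.log(sent.length);
```

Clients then receive arrays of messages rather than single messages, so the client-side handler would need a small change to iterate over each batch.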
Thank you for your help!
Trying out my first Node.js cloud function, so far unsuccessfully, despite it working fine in VS Code. I'm getting the following error:
Function cannot be initialized. Error: function terminated.
Looking through the logs, I see some potential issues:
Detailed stack trace: ReferenceError: supabase_public_url is not defined
Provided module can't be loaded (doesn't specify)
Thoughts: am I using Secret Manager wrong, or using Pub/Sub incorrectly?
My code, index.js:
import { createClient } from '@supabase/supabase-js'
import sgMail from "@sendgrid/mail"
import { SecretManagerServiceClient } from '@google-cloud/secret-manager'

// activate cloud secret manager
const client = new SecretManagerServiceClient()
const supabaseUrl = client.accessSecretVersion(supabase_public_url)
const supabaseKey = client.accessSecretVersion(supabase_service_key)
const sendgridKey = client.accessSecretVersion(sendgrid_service_key)

sgMail.setApiKey(sendgridKey)
const supabase = createClient(supabaseUrl, supabaseKey)
// get data from supabase where notification coins are true
const supabaseNotifications = async () => {
  let { data, error } = await supabase
    .from('xxx')
    .select('*, xxx!inner(coin, xx, combo_change, combo_signal, combo_prev_signal), xxx!inner(email)')
    .eq('crypto_signals.combo_change', true)
  if (error) {
    console.error(error)
    return
  }
  return data
}
// create an array of user emails from supabase data
const userEmail = (data) => {
  try {
    const emailList = []
    for (let i of data) {
      if (!emailList.includes(i.profiles.email)) {
        emailList.push(i.profiles.email)
      }
    }
    return emailList
  } catch (e) {
    console.log(e)
  }
}
// take the email list and supabase data and generate emails to users
const sendEmail = (e, data) => {
  try {
    for (let i of e) {
      const signalList = []
      for (let x of data) {
        if (i == x.profiles.email) {
          signalList.push(x)
        }
      }
      // create msg and send from my email to the user
      const msg = {
        to: i,
        from: "xxxx",
        subject: "Coin notification alert from CryptoOwl",
        text: "One or more of your coins have a new signal",
        html: signalList.toString()
      }
      sgMail.send(msg)
      console.log(i)
    }
  } catch (e) {
    console.log(e)
  }
}
// main function combines all 3 functions (supabase is awaited)
async function main() {
  let supabaseData = await supabaseNotifications();
  let supabaseEmails = userEmail(supabaseData);
  let sendgridEmails = sendEmail(supabaseEmails, supabaseData);
}

exports.sendgridNotifications = (event, context) => {
  main()
};
My package.json, with "type": "module" so I can use import above:
{
  "type": "module",
  "dependencies": {
    "@sendgrid/mail": "^7.6.1",
    "@supabase/supabase-js": "1.30.0",
    "@google-cloud/secret-manager": "^3.11.0"
  }
}
I'm not at all versed in Google Secret Manager, but a quick look at the Node.js library documentation shows (if I'm not mistaken) that accessSecretVersion() is an asynchronous method.
As a matter of fact, the docs contain examples like the following one:
async function accessSecretVersion() {
  const [version] = await client.accessSecretVersion({
    name: name,
  });

  // Extract the payload as a string.
  const payload = version.payload.data.toString();

  // WARNING: Do not print the secret in a production environment - this
  // snippet is showing how to access the secret material.
  console.info(`Payload: ${payload}`);
}
See https://cloud.google.com/secret-manager/docs/samples/secretmanager-access-secret-version#secretmanager_access_secret_version-nodejs
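Concretely, the question's initialization could be rewritten along these lines. This is a sketch: the `loadSecret` helper and the resource name are illustrative, and the stub client below only stands in for a real SecretManagerServiceClient so that the shape of the call is visible (`accessSecretVersion` takes `{ name }` and resolves to an array whose first element carries the payload):

```javascript
// Sketch: accessSecretVersion() is async and takes the full resource
// name, so the value must be awaited before building the API clients.
async function loadSecret(client, name) {
  const [version] = await client.accessSecretVersion({ name });
  return version.payload.data.toString();
}

// Stub standing in for SecretManagerServiceClient, for demonstration only.
const client = {
  accessSecretVersion: async ({ name }) => [
    { payload: { data: Buffer.from(`value-of-${name}`) } },
  ],
};

// With the real client the name would look like
// "projects/<project>/secrets/supabase_public_url/versions/latest".
loadSecret(client, "projects/demo/secrets/supabase_public_url/versions/latest")
  .then((url) => console.log(url));
```

In the question's code, `sgMail.setApiKey(...)` and `createClient(...)` would then move inside an async initializer that awaits the three `loadSecret` calls first.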
I am trying to publish messages via the Google Pub/Sub batch publishing feature. The batch publishing code looks like this:
const grpc = require("grpc");
const { PubSub } = require("@google-cloud/pubsub");

const createPublishEventsInBatch = (topic) => {
  const pubSub = new PubSub({ grpc });
  const batchPublisher = pubSub.topic(topic, {
    batching: {
      maxMessages: 100,
      maxMilliseconds: 1000,
    },
  });
  return async (logTrace, eventData) => {
    console.log("Publishing batch events for", eventData);
    try {
      await batchPublisher.publish(Buffer.from(JSON.stringify(eventData)));
    } catch (err) {
      console.error("Error in publishing", err);
    }
  };
};
And this batch publisher is called from a service like this:
const publishEventsInBatch1 = publishEventFactory.createPublishEventsInBatch(
"topicName1"
);
const publishEventsInBatch2 = publishEventFactory.createPublishEventsInBatch(
"topicName2"
);
events.forEach((event) => {
publishEventsInBatch1(logTrace, event);
publishEventsInBatch2(logTrace, event);
});
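One thing worth noting in the loop above: the publish calls are fired without being awaited, so the caller never observes failures or backpressure. A hedged sketch of collecting the promises instead; `publishAll` and the stub publisher are illustrative, not part of the original code:

```javascript
// Sketch: gather every publish promise and report how many failed,
// instead of firing-and-forgetting inside forEach.
async function publishAll(events, publishFn) {
  const results = await Promise.allSettled(
    events.map((event) => publishFn(event))
  );
  return results.filter((r) => r.status === "rejected").length;
}

// Demonstration with a stub publisher that rejects on falsy events.
const stubPublish = async (e) => {
  if (!e) throw new Error("empty event");
  return e;
};

publishAll([{ id: 1 }, null, { id: 2 }], stubPublish).then((failed) =>
  console.log("failed publishes:", failed)
);
```

In the real service, `publishFn` would be `publishEventsInBatch1` (or 2) with `logTrace` bound in.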
I am using a push subscription to receive the messages, with the below settings:
Acknowledgement deadline: 600 Seconds
Retry policy: Retry immediately
The issue I am facing: if the total number of events/messages is 250k, the push subscription should receive at most 250k messages. But in my case I am receiving 3-4M records on the subscription, and the number varies.
My Fastify and Pub/Sub versions are:
fastify: 3.10.1
@google-cloud/pubsub: 2.12.0
Adding the subscription code:
fastify.post("/subscription", async (req, reply) => {
  const message = req.body.message;
  let event;
  let data;
  let entityType;
  try {
    let payload = Buffer.from(message.data, "base64").toString();
    event = JSON.parse(payload);
    data = event.data;
    entityType = event.entityType;
    if (entityType === "EVENT") {
      if (event.version === "1.0") {
        console.log("Processing subscription");
        await processMessage(fastify, data);
      } else {
        console.error("Unknown version of stock event, being ignored");
      }
    } else {
      console.error("Ignore event");
    }
    reply.code(200).send();
  } catch (err) {
    if (err.status === 409) {
      console.error("Ignoring stock update due to 409: Conflict");
      reply.code(200).send();
    } else {
      console.error("Error while processing event from subscription");
      reply.code(500).send();
    }
  }
});
Can anyone point out where I am making mistakes? It's a simple Fastify application. Am I making a mistake in the code or in the configuration?
I'm using the MQTT.js module in a Node app to subscribe to an MQTT broker.
Upon receiving new messages, I want to store them in MongoDB with async functions.
My code is something like:
client.on('message', (topic, payload, packet) => {
  (async () => {
    await msgMQTT.handleMQTT_messages(topic, payload, process.env.STORAGE,
      MongoDBClient)
  })
})
But I can't understand why it does not work: it executes the async function, but the MongoDB queries return without being executed, and apparently no error is issued.
What am I missing?
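One thing to check, if the snippet is written exactly as above: the async arrow function is created but never invoked (there is no trailing `()` after the closing parenthesis), so its body never runs. A minimal sketch of an invoked IIFE with error handling; `handler` is just a stand-in for `msgMQTT.handleMQTT_messages`:

```javascript
// Stand-in for the real async message handler.
const handler = async (value) => `handled:${value}`;

let result;
// The trailing () is what actually invokes the async IIFE.
const done = (async () => {
  result = await handler("msg");
})().catch(console.error); // surface errors instead of swallowing them

done.then(() => console.log(result));
```

Without the `.catch`, any rejection inside the IIFE would also be silently lost, which matches the "no error is issued" symptom.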
I modified the code to:
client.on('message', (topic, payload, packet) => {
  try {
    msgMQTT.handleMQTT_messages(topic, payload, process.env.STORAGE,
      MongoDBClient, db)
  } catch (error) {
    console.error(error)
  }
})
Where:
exports.handleMQTT_messages = (topic, payload, storageType, mongoClient, db) => {
  const dateFormat = 'YYYY-MMM-dddd HH:mm:ss'
  // topic is in the form
  //
  const topics = topic.split('/')
  // location info is at second position after splitting by /
  const coord = topics[2].split(",")
  // build up station object containing GeoJSON + station name
  //`Station_long${coord[0]}_lat${coord[1]}`
  const stationObj = getStationLocs(coord.toString())
  const msg = JSON.parse(payload)
  // what follows report/portici/
  const current_topic = topics.slice(2).join()
  let data_parsed = null
  // parse only messages having a 'd' property
  if (msg.hasOwnProperty('d')) {
    console.log(`${moment().format(dateFormat)} - ${stationObj.name} (topic:${current_topic})\n `)
    data_parsed = parseMessages(msg)
    // date rounded down to the nearest hour
    // https://stackoverflow.com/questions/17691202/round-up-round-down-a-momentjs-moment-to-nearest-minute
    const dateISO_String = moment(data_parsed.t).startOf('hour').toISOString();
    // remove AQ from station name using regex
    let station_number = stationObj.name.match(/[^AQ]/).join('')
    let data_to_save = {
      id: set_custom_id(stationObj.name, dateISO_String),
      //`${station_number}${moment(dateISO_String).format('YMDH')}`,
      date: dateISO_String,
      station: stationObj,
      samples: [data_parsed]
    }
    switch (storageType) {
      case 'lowdb':
        update_insertData(db, data_to_save, coll_name)
        break;
      case 'mongodb': // MongoDB Replicaset
        (async () => {
          await updateIoTBucket(data_to_save, mongoClient, db_name, coll_name)
        })()
        break;
      default: // ndjson format
        (async () => {
          await fsp.appendFile(process.env.PATH_FILE_NDJSON,
            JSON.stringify(data_to_save) + '\n')
        })()
        //saveToFile(JSON.stringify(data_to_save), process.env.PATH_FILE_NDJSON)
        break;
    }
    // show raw messages (not parsed)
    const show_raw = true
    const enable_console_log = true
    if (msg && enable_console_log) {
      if (show_raw) {
        console.log('----------RAW data--------------')
        console.log(JSON.stringify(msg, null, 2))
        console.log('--------------------------------')
      }
      if (show_raw && data_parsed) {
        console.log('----------PARSED data-----------')
        console.log(JSON.stringify(data_parsed, null, 2))
        console.log('--------------------------------')
      }
    }
  }
}
Only updateIoTBucket(data_to_save, mongoClient, db_name, coll_name) is executed asynchronously, using the MongoDB driver.
I am writing a Node.js service that receives messages using RabbitMQ, but I am facing an issue when I send concurrent requests to it.
Here is the AMQP subscriber I have written:
const amqp = require('amqplib/callback_api')

let AmqpConnection = {
  // some other methods to make the connection
  // ....
  // ....
  subscribe: function () {
    this.withChannel((channel) => {
      let defaultQueueName = "my_queue";
      channel.assertQueue(defaultQueueName, { durable: true }, function (err, _ok) {
        if (err) throw err;
        channel.consume(defaultQueueName, AmqpConnection.processMessage);
        Logger.info("Waiting for requests..");
      });
    })
  },

  processMessage: function (payload) {
    debugger
    try {
      Logger.info("received" + (payload.content.toString()))
    } catch (error) {
      Logger.error("ERROR: " + error.message)
      //Channel.ack(payload)
    }
  }
}
And now I am trying to publish messages to it using this publisher:
const amqp = require('amqplib/callback_api')

let Publisher = {
  // some other methods to make the connection
  // ....
  // ....
  sendMessage: function (message) {
    this.withChannel((channel) => {
      let exchangeName = 'exchange';
      let exchangeType = 'fanout';
      let defaultQueueName = 'my_queue';
      channel.assertExchange(exchangeName, exchangeType)
      channel.publish(exchangeName, defaultQueueName, Buffer.from(message));
    })
  }
}

let invalidMsg = JSON.stringify({ "content": "" })
let correctMsg = JSON.stringify({ "content": "Test message" })

setTimeout(function () {
  for (let i = 0; i < 2; i++) {
    Publisher.sendMessage(correctMsg)
    Publisher.sendMessage(invalidMsg)
  }
}, 3000)
But when I execute both the publisher and the subscriber, I get the following output on the subscriber side:
2017-02-18T11:27:55.368Z - info: received{"content":""}
2017-02-18T11:27:55.378Z - info: received{"content":""}
2017-02-18T11:27:55.379Z - info: received{"content":""}
2017-02-18T11:27:55.380Z - info: received{"content":""}
It seems like concurrent requests are overriding the received message. Can someone help here?
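One detail worth checking alongside this: with amqplib, channel.consume defaults to { noAck: false }, so every delivery must be acknowledged or it remains unacknowledged and can be redelivered, and the ack in processMessage above is commented out. A sketch of a consumer callback that acks on success and rejects on failure; `makeProcessMessage` is illustrative, and the stub objects below only stand in for a real amqplib channel and logger so the sketch is runnable:

```javascript
// Sketch: build the consumer callback with the channel in scope so each
// delivery can be acknowledged (channel.ack) or rejected (channel.nack).
function makeProcessMessage(channel, log) {
  return function processMessage(payload) {
    try {
      log("received " + payload.content.toString());
      channel.ack(payload); // acknowledge on success
    } catch (error) {
      log("ERROR: " + error.message);
      channel.nack(payload, false, false); // reject, do not requeue
    }
  };
}

// Demonstration with stubs standing in for the amqplib channel and logger.
const acked = [];
const logs = [];
const handle = makeProcessMessage(
  { ack: (p) => acked.push(p), nack: () => {} },
  (m) => logs.push(m)
);
handle({ content: Buffer.from('{"content":"Test message"}') });
console.log(logs[0]);
```

In the real subscriber this would replace the bare `AmqpConnection.processMessage` passed to `channel.consume`, e.g. `channel.consume(defaultQueueName, makeProcessMessage(channel, Logger.info))`.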