How can I discover that the connection with Mongo was reestablished? - node.js

I need to emit an event when the connection is established and another when it is disconnected. My problem is detecting when the connection has been reestablished.
The Mongo driver emits an event when the connection is disconnected (on error), but it doesn't emit an event when the connection is reestablished.
MongoClient.connect(mongoUrl, function (err, db) {
    if (err) {
        console.log('Error on connecting to mongo!');
        console.error(err);
        reconnect();
        return;
    }
    console.log('Mongo connected!');
    emit('connected', db); // This event is emitted only on the first connection.
    db.on('error', function (err) {
        console.log('Mongo connection broken!');
        console.error(err);
        emit('disconnected');
    });
});
Analyzing the driver's code, I discovered the serverConfig object:
db.serverConfig.on('reconnect', function () {
    console.log('DB reconnected');
});
Is it a good practice to use this internal object?
Thanks.

According to Mongo driver developers, it's a good practice to use db.serverConfig to subscribe to the 'reconnect' event.
More details in:
Database object emits a 'reconnect' event
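For reference, a minimal sketch wiring all three events together, assuming the same emit and reconnect helpers as the question (the 'reconnected' event name is just a suggestion):
MongoClient.connect(mongoUrl, function (err, db) {
    if (err) {
        console.error(err);
        reconnect();
        return;
    }
    emit('connected', db);

    db.on('error', function (err) {
        console.error(err);
        emit('disconnected');
    });

    // serverConfig is the driver's internal topology object; it fires
    // 'reconnect' once the driver has restored a dropped connection.
    db.serverConfig.on('reconnect', function () {
        emit('reconnected', db);
    });
});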

Related

How to avoid a broken connection with ORACLEDB? Nodejs

I have this database connection. Inside the function where the comment is located there is a data-update loop for a REST API. The updates work, but when the data in the Oracle database changes, the connection can fail, and after that every subsequent update returns undefined. How can I connect to the database properly so that there are no failures?
oracledb.getConnection(
    {
        user: db.user,
        password: db.password,
        connectString: db.connectString
    },
    connExecute
);

function connExecute(err, connection) {
    if (err) {
        console.error(err.message);
        return;
    }
    var sql = `SELECT * FROM db.test`;
    connection.execute(sql, {}, { outFormat: oracledb.OBJECT },
        function (err, db) {
            if (err) {
                console.error(err.message);
                connRelease(connection);
                return;
            }
            // data update loop
            connRelease(connection);
        });
}

function connRelease(connection) {
    connection.close(
        function (err) {
            if (err) {
                console.error(err.message);
            }
        });
}
You should be using a connection pool. Connection pools have built-in logic to detect connections with issues and create new connections transparently. See this series on creating a REST API for more details: https://jsao.io/2018/03/creating-a-rest-api-with-node-js-and-oracle-database/
Keep in mind that issues can still happen, so you have to handle errors as needed for your application.
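A minimal sketch of the pooled approach with node-oracledb, in promise style (the pool sizes are arbitrary, and the db config object is carried over from the question):
const oracledb = require('oracledb');

async function init() {
    // Create the pool once at startup; the pool validates connections and
    // replaces broken ones, so a dead connection is not handed out.
    await oracledb.createPool({
        user: db.user,
        password: db.password,
        connectString: db.connectString,
        poolMin: 1,
        poolMax: 4
    });
}

async function runQuery() {
    let connection;
    try {
        connection = await oracledb.getConnection(); // borrow from the default pool
        const result = await connection.execute(
            `SELECT * FROM db.test`, {}, { outFormat: oracledb.OBJECT }
        );
        return result.rows;
    } finally {
        if (connection) {
            await connection.close(); // returns the connection to the pool
        }
    }
}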
Mostly, you add a listener on the connection object and, on disconnection or failure, create the connection again. With minor changes you can adopt this approach: use listeners to check whether the connection is still available and, if not, connect again. There can be several reasons for a connection to close, so it is better to handle exceptions, check whether you are still connected, and reconnect on error.
Or you can try this NPM package, which will do the reconnection for you:
https://www.npmjs.com/package/oracledb-autoreconnect
Ping me if you need clarification.
var dbConfig = {
    host: '----',
    user: '----',
    password: '----',
    database: '----',
    port: ----
};

var connection;

function handleDisconnect() {
    // Recreate the connection, since the old one cannot be reused.
    connection = <obj>.getConnection(dbConfig);

    connection.connect(function onConnect(err) {
        // The server is either down or restarting (takes a while sometimes).
        if (err) {
            console.log('error when connecting to db:', err);
            // We introduce a delay before attempting to reconnect, to avoid
            // a hot loop, and to allow our node script to process
            // asynchronous requests in the meantime. If you're also serving
            // http, display a 503 error.
            setTimeout(handleDisconnect, 10000);
        }
    });

    connection.on('error', function onError(err) {
        console.log('db error', err);
        // The connection is usually lost due to either a server restart, or a
        // connection idle timeout (the wait_timeout server variable configures this).
        if (err.code == 'PROTOCOL_CONNECTION_LOST') {
            handleDisconnect();
        } else {
            throw err;
        }
    });
}

handleDisconnect();

Mongoose connection printing out both console.log when mongo server not running

I have a mongoose connection to a mongodb server. When the server is running and it tries to connect, it works fine and prints only the single statement to the console. But when I haven't started the mongo server yet, it prints both statements in the order they appear in the code. I know this is not a huge error, but I would like the health check not to show 'up' when the server is actually down.
Mongoose connection code:
mongoose.connect(config.db, {autoReconnect: true}, () => console.log('MongoDB has connected successfully.'));
mongoose.connection.on('error', function() {
    console.error('MongoDB Connection Error. Make sure MongoDB is running.');
});
The connect callback receives an error parameter you can check:
mongoose.connect(config.db, {autoReconnect: true}, (err) => {
    if (!err) console.log('MongoDB has connected successfully.');
});
You can also separately handle the 'connected' event in the same way you're handling the 'error' event:
mongoose.connection.on('connected', function() {
    console.log('MongoDB has connected successfully');
});
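If the goal is an accurate health check, one option is to track the connection state from Mongoose's connection events. A minimal sketch (the isHealthy flag and the Express /health route are assumptions, not part of the question):
let isHealthy = false;

mongoose.connection.on('connected', () => { isHealthy = true; });
mongoose.connection.on('reconnected', () => { isHealthy = true; });
mongoose.connection.on('disconnected', () => { isHealthy = false; });

// Hypothetical Express endpoint driven by the flag above.
app.get('/health', (req, res) => {
    res.status(isHealthy ? 200 : 503).send(isHealthy ? 'up' : 'down');
});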

Client is trying to get data with Socket.io but doesn't seem to connect

I have set up sockets on my client and server, but I can't seem to get my data to come into my client. It seems they are connecting properly, I just can't get any data to come through. There are no error messages either.
Here is my server code:
io.on('connection', function (socket) {
    console.log('a user connected');
    socket.on('custom-message', function () {
        console.log("Hitting messages socket");
        Message.find(function(err, messages){
            if(err){
                socket.emit('custom-message', err)
            } else {
                socket.emit('custom-message', messages);
            }
        })
    });
});
Here is the function in the client that connects to the socket:
loadMessagesFromServer: function(){
    console.log("About to load messages")
    socket.on('custom-message', function(msg){
        console.log("connected in client", msg)
    });
},
Like I said, it is a pretty simple example; I just can't seem to get the data in loadMessagesFromServer. And there are no errors, so the only way I have been debugging is trying different things.
Your event names have to match on both ends: you are emitting on one event and listening on another, so nothing arrives. Moreover, in your server-side code the socket only emits the messages after it receives an event from the client, so the client has to send that event first. In the rewrite below, the client emits a "post" event, the server listens for it, and the results come back on a "message" event (errors on an "error" event). The client-side socket.on('custom-message') listener is no longer needed with this naming.
Server-side code
io.on('connection', function (socket) {
    /* Here a client connection is established. The on("connection")
     * event will be triggered as many times as `io.connect("the-uri")`
     * is triggered (and succeeded) in the client.
     */

    // Listening for the post event
    socket.on("post", function(messages){
        console.log("client posted data:", messages)
        // Find messages and emit result
        Message.find(function(err, messages){
            if(err){
                socket.emit('error', err)
            } else {
                socket.emit("message", messages);
            }
        });
    });
});
Client-side code
registerOnMessageFromServerListener: function(){
    socket.on("message", function(msg){
        console.log("received message:", msg);
    });
},
registerOnErrorFromServerListener: function(){
    socket.on("error", function(error){
        console.log("an error occurred:", error);
    });
},
Then, to kick things off:
registerOnMessageFromServerListener();
registerOnErrorFromServerListener();
socket.emit("post", "a-message");
Also make sure that you call loadMessagesFromServer before you establish the socket connection, so the listener is registered before any messages arrive.
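Roughly, the client startup order might look like this (a sketch; the server URL is a placeholder):
var socket = io('http://localhost:3000'); // placeholder URL

// Register listeners first so no events are missed...
registerOnMessageFromServerListener();
registerOnErrorFromServerListener();

// ...then ask the server for data once the connection is up.
socket.on('connect', function () {
    socket.emit('post', 'a-message');
});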

Improving performance of inserting into Mongo from ActiveMQ

The basic idea of the following code is I read messages off an ActiveMQ Artemis installation and insert them into a MongoDB instance.
It works well for up to a hundred or so messages per second but crashes if I throw a few thousand at it. My first guess would be the constant opening and closing of database connections. Should I also think about using an in-memory store and doing bulk database inserts?
The code is all running in node using the mqtt and mongodb npm packages. The code below, the database and the queue are all running in docker containers if it makes any difference.
var mqtt = require('mqtt'),
    client = mqtt.connect('mqtt://mq:1883', {
        username: "*************",
        password: "*************"
    }),
    MongoClient = require('mongodb').MongoClient,
    ObjectId = require('mongodb').ObjectID,
    assert = require('assert'),
    url = 'mongodb://db:27017/uo-readings';

client.on('connect', function () {
    client.subscribe('readings');
});

client.on('error', function(error){
    console.log(error)
});

client.on('message', function (topic, message) {
    console.log(message.toString());
    MongoClient.connect(url, function(err, db) {
        assert.equal(null, err);
        console.log("Connected correctly to server.");
        db.collection('readings').insertOne(JSON.parse(message.toString()), function(err, result) {
            assert.equal(err, null);
            console.log("Inserted a document into the readings collection.");
        });
        client.end(function(){
            console.log("Closing Connection.");
            db.close();
        });
    });
});
See @Jonathan Muller's comment above.
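The asker's own guess is the likely culprit: a MongoClient.connect per message is expensive, and the connections pile up under load. A minimal sketch of connecting once at startup and reusing the handle (same names as the question's code):
MongoClient.connect(url, function (err, db) {
    assert.equal(null, err);
    var readings = db.collection('readings');

    // Reuse the single db handle for every incoming message.
    client.on('message', function (topic, message) {
        readings.insertOne(JSON.parse(message.toString()), function (err) {
            if (err) console.error('insert failed:', err);
        });
    });
});
For higher throughput still, messages could be buffered in memory and flushed in batches with insertMany, as the question suggests.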

Maintaining Postgre connection while waiting for new tweet from Twitter Streaming API

I'm writing an app which catches tweets with a specific hashtag using the Twitter Streaming API. Each tweet caught must be inserted into my db and I need to perform another query to my db.
This is not a trending hashtag, so let's assume around 400 tweets are caught per hour. I currently open a new connection to my db each time I catch a tweet, process my queries, then close the connection.
t.on('tweet', function(tweet) {
    client.connect(function(error) {
        if (error) {
            console.error('Error while connecting to PostgreSQL DB: ', error);
        }
        else {
            client.query('<MyQuery>',
                <MyParameters>,
                function (err, result) {
                    if (err) {
                        return console.error('error running query', err);
                    }
                    else {
                        client.query('<MyQuery>', function (err, result) {
                            client.end();
                            if (err) {
                                return console.error('error running count query', err);
                            }
                            else {
                                io.emit('infos', {infos: {}});
                            }
                            return console.log('Tweet added');
                        });
                    }
                });
        }
    });
});
I post a tweet after starting the server and it is processed fine, but when I post another one about 2 minutes later, I get the following output:
Tweet added
events.js:85
throw er; // Unhandled 'error' event
^
Error: write EPIPE
at exports._errnoException (util.js:746:11)
at WriteWrap.afterWrite (net.js:775:14)
Can you tell me the best way to keep my app listening for new tweets and inserting them into the DB? Should I keep creating a new connection each time (though I don't understand why that crashes), or open a single connection at launch and keep it open for weeks (which I assume is not a good option)?
I finally ended up instantiating a whole new client each time, rather than reusing one client with the connect() and end() methods:
t.on('tweet', function(tweet) {
client = new pg.Client(conString);
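Filling in the rest of that handler for completeness (a sketch; the placeholder query and parameters are kept from the question):
t.on('tweet', function (tweet) {
    var client = new pg.Client(conString);
    client.connect(function (error) {
        if (error) {
            return console.error('Error while connecting to PostgreSQL DB: ', error);
        }
        client.query('<MyQuery>', <MyParameters>, function (err, result) {
            client.end(); // dispose of this client; a fresh one is created per tweet
            if (err) {
                return console.error('error running query', err);
            }
            io.emit('infos', {infos: {}});
            console.log('Tweet added');
        });
    });
});
Note that node-postgres also ships a built-in connection pool (new pg.Pool(...)) that handles exactly this churn; pool.query() checks a client out and returns it automatically.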
