I need an example of how to use promises in Node.js. I have a variable connection that must be closed after a call to a function finishes. Here is the flow of how my program should run:
var connection = {
    /* create connection */
}
/* call to a function */
/* close connection after the function finishes */
connection.close();
This is an example using bluebird (http://bluebirdjs.com/docs/getting-started.html)
let con = undefined

Promise.try(() => createConnection()) // create connection
    .then(_con => {
        con = _con // assign connection
        // TO DO STUFF
    })
    .then(() => {
        con.close() // close connection
    })
    .catch(e => {
        // handle exception here
    })
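A variation on the same flow, as a sketch (assuming, as above, that createConnection() resolves to an object with a close() method): moving close() into finally guarantees the connection is closed even if the work in between throws.
let con = undefined

Promise.try(() => createConnection())   // create connection (hypothetical factory, as above)
    .then(_con => {
        con = _con                      // keep a reference so it can be closed later
        // TO DO STUFF
    })
    .catch(e => {
        // handle exception here
    })
    .finally(() => {
        if (con) con.close()            // always close, even if the work above threw
    })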
I need my script to end/exit after it is finished, and after several tests I thought the problem was my MongoDB class, whose connection I never closed. (When I commented out the class and its usage, the script ran through and exited as I wanted.)
But even after implementing a closing method, my script still stays alive and I don't know why.
This is my mongo class:
const MongoClient = require('mongodb').MongoClient

class MongodbClient {
    constructor(cfg) {
        // CONNECT TO MONGO-ENGINE
        this.client = new MongoClient(cfg.mongoUrl, { useUnifiedTopology: true });
        this.client.connect();
        // CONNECT TO DB
        this.db = this.client.db(cfg.mongoDbName);
    }

    // Close connection
    async end() {
        this.client.close()
        return new Promise((resolve, reject) => {
            resolve(true)
        })
    }

    // ... some methods
}

module.exports = {
    MongodbClient: MongodbClient
}
In my main script I call a function dosomething(), at the end of which the script needs to exit:
parser.dosomething().then(async () => {
    await mongo.end()
})
But the script still lives. Why is that?
Promise-Ception 😯😯
Your end method returns another Promise within a Promise
async end() {
    /* ... */
    return true
}
👆 This async function returns a Promise by itself. For async functions it's important to return something at some point.
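For illustration, a minimal sketch (not from the original code) of how an async function's return value gets wrapped:
async function end() {
    return true // automatically wrapped, so callers receive a Promise
}

console.log(end() instanceof Promise) // true
end().then(value => console.log(value)) // logs: true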
In your dosomething method you do the correct thing and use await to resolve the Promise.
await mongo.end();
However, it doesn't stop there. The first Promise (async end) returns another Promise:
// ...
return new Promise((resolve, reject) => {
    return resolve(true)
});
// ...
To completely resolve everything, your dosomething method should eventually do this:
const anotherPromise = await mongo.end();
await anotherPromise;
By this time you will realize that client.close() also returns a Promise and should be resolved. The whole thing is a bit messy IMHO.
Simplify things
Try this
async end() {
    try {
        await this.client.close();
        return true;
    } catch (ex) {
        throw ex;
    }
}
parser.dosomething().then(async () => {
    try {
        const closed = await mongo.end();
        console.log("connection closed");
    } catch (ex) {
        console.log(ex.message);
    }
});
Remember to use try ... catch blocks when using async/await. If the result is still the same 👆 then the problem probably lies somewhere else.
Simplify some more
end() { return this.client.close() }
Now your end method just returns the unresolved Promise from client.close. Please note that I removed the async prefix from the end method, as it is not needed.
await mongo.end();
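Putting the two simplifications together, a sketch reusing the names from the question (MongodbClient, mongo, parser.dosomething()):
class MongodbClient {
    /* ... constructor and other methods as in the question ... */

    // Just hand back the Promise from the driver
    end() {
        return this.client.close();
    }
}

// In the main script: resolve it once and log when done
parser.dosomething()
    .then(() => mongo.end())
    .then(() => console.log("connection closed"))
    .catch(ex => console.log(ex.message));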
I am using a TCP server to receive and process packets in Node.js. It should receive 2 packets:
"create" for creating an object in a database. It first checks if the object already exists and then creates it. (→ takes some time to process)
"update" for updating the newly created object in the database
For the sake of simplicity, we'll just assume the first step always takes longer than the second. (which is always true in my original code)
This is an MWE:
const net = require("net");

const server = net.createServer((conn) => {
    conn.on('data', async (data) => {
        console.log(`Instruction ${data} received`);
        await sleep(1000);
        console.log(`Instruction ${data} done`);
    });
});
server.listen(1234);

const client = net.createConnection(1234, 'localhost', async () => {
    client.write("create");
    await sleep(10); // just a cheap workaround to "force" sending 2 packets instead of one
    client.write("update");
});

// Just to make it easier to read
function sleep(ms) {
    return new Promise((resolve) => {
        setTimeout(resolve, ms);
    });
}
If I run this code I get:
Instruction create received
Instruction update received
Instruction create done
Instruction update done
But I want the "create" instruction to block the conn.on('data', ...) handler until the last callback returns asynchronously. The current code tries to update an entry before it is created in the database, which is not ideal.
Is there an (elegant) way to achieve this? I suspect I need some kind of buffer which stores the data and a worker loop of some kind which processes it. But how do I avoid running an infinite loop that blocks the event loop? (Event loop is the correct term, isn't it?)
Note: I have a lot more logic to handle fragmentation, etc., but this explains the issue I'm having.
I managed to get it to work with the package async-fifo-queue.
It's not the cleanest solution, but it should do what I want and be as efficient as possible (using async/await instead of just looping infinitely).
Code:
const net = require("net");
const afq = require("async-fifo-queue");

const q = new afq.Queue();

const server = net.createServer((conn) => {
    conn.on('data', q.put.bind(q));
});
server.listen(1234);

const client = net.createConnection(1234, 'localhost', async () => {
    client.write("create");
    await sleep(10);
    client.write("update");
});

(async () => {
    while (server.listening) {
        const data = await q.get();
        console.log(`Instruction ${data} received`);
        await sleep(1000);
        console.log(`Instruction ${data} done`);
    }
})();

function sleep(ms) {
    return new Promise((resolve) => {
        setTimeout(resolve, ms);
    });
}
You can pause the socket when you get the "create" event. After it finishes, you can resume the socket. Example:
const server = net.createServer((conn) => {
    conn.on('data', async (data) => {
        if (data.toString() === 'create') { // 'data' is a Buffer, so compare its string form
            conn.pause()
        }
        console.log(`Instruction ${data} received`);
        await sleep(1000);
        console.log(`Instruction ${data} done`);
        if (data.toString() === 'create') {
            conn.resume()
        }
    });
});
server.listen(1234);

const client = net.createConnection(1234, 'localhost', async () => {
    client.write("create");
    await sleep(10); // just a cheap workaround to "force" sending 2 packets instead of one
    client.write("update");
});

// Just to make it easier to read
function sleep(ms) {
    return new Promise((resolve) => {
        setTimeout(resolve, ms);
    });
}
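For completeness, the same serialization can also be sketched without an extra package and without pause/resume, by chaining each handler onto a per-connection Promise. This is an alternative to the answers above, not taken from them; handleInstruction is a hypothetical stand-in for the real database work.
const net = require("net");

const server = net.createServer((conn) => {
    // Each chunk's handler is queued behind the previous one,
    // so "update" only starts after "create" has finished.
    let queue = Promise.resolve();
    conn.on('data', (data) => {
        queue = queue
            .then(() => handleInstruction(data))
            .catch((err) => console.error(err)); // keep the chain alive on errors
    });
});
server.listen(1234);

// Hypothetical async worker standing in for the real database logic
async function handleInstruction(data) {
    console.log(`Instruction ${data} received`);
    await sleep(1000);
    console.log(`Instruction ${data} done`);
}

function sleep(ms) {
    return new Promise((resolve) => setTimeout(resolve, ms));
}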
In Node.js I had code like the following:
mongoose.connect(dbURI, dbOptions)
    .then(() => {
        console.log("ok");
    },
    err => {
        console.log('error: ' + err)
    }
);
Now I want to do it with async/await syntax. So I could start with var mcResult = await mongoose.connect(dbURI, dbOptions);. As far as I know, it will wait for the operation until it ends with some result (much like calling the C function read() or fread() in synchronous mode).
But what should I write then? What gets returned to the mcResult variable, and how do I check for an error or success? Basically I want a similar snippet, but written with proper async/await syntax.
Also, I wonder because I have auto-reconnect among my dbOptions:
dbOptions: {
    autoReconnect: true,
    reconnectTries: 999999999,
    reconnectInterval: 3000
}
Would it "stuck" on await forever, in case if database connection is unavailble? I hope you can give me a clue on what would happen and how that would work.
Basically I want a similar snippet, but written with proper async/await syntax.
(async () => {
    try {
        await mongoose.connect(dbURI, dbOptions)
    } catch (err) {
        console.log('error: ' + err)
    }
})()
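As for what ends up in mcResult: in recent Mongoose versions mongoose.connect() resolves to the mongoose instance itself, and on failure the Promise rejects, so the catch block is where errors surface. A small sketch (exact behaviour depends on your Mongoose version):
(async () => {
    try {
        const mcResult = await mongoose.connect(dbURI, dbOptions);
        // readyState 1 means the default connection is open
        console.log('connected:', mcResult.connection.readyState === 1);
    } catch (err) {
        // a failed connect rejects, so mcResult is never assigned
        console.log('error: ' + err);
    }
})()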
Please try this. The code below covers the basics of DB connectivity and a query:
const mongoose = require('mongoose');
const Schema = mongoose.Schema;

let url = 'mongodb://localhost:27017/test';

const usersSchema = new Schema({
    any: {}
}, {
    strict: false
});
const Users = mongoose.model('users', usersSchema, 'users');
/** We've created a schema, as in Mongoose you need schemas for your collections to do operations on them */

const dbConnect = async () => {
    let db = null;
    try {
        /** In a real project you'd split the DB connection (into another file) away from the DB calls */
        await mongoose.connect(url, { useNewUrlParser: true }); // await makes the process wait until this step is done or errors out
        db = mongoose.connection;
        let dbResp = await Users.find({}).lean(); /** Gets all documents out of the users collection.
                                                      Using .lean() to convert MongoDB documents to plain JS objects for further use. */
        db.close(); // Needs to close the connection. In general you don't close & re-create often, but it's needed for test scripts - you might use connection pooling in real applications.
        return dbResp;
    } catch (err) {
        (db) && db.close(); /** Needs to close the connection -
                                only if mongoose.connect() succeeded and something failed after it, as the DB connection is established by then. */
        console.log('Error at dbConnect ::', err)
        throw err;
    }
}

dbConnect().then(res => console.log('Printing at callee ::', res)).catch(err => console.log('Err at Call ::', err));
As we're talking about async/await, a few things worth mentioning: await requires its enclosing function to be declared async, otherwise it throws an error. And it's recommended to wrap async/await code inside a try/catch block.
const connectDb = async () => {
    await mongoose.connect(dbUri, dbOptions).then(
        () => {
            console.info(`Connected to database`)
        },
        error => {
            console.error(`Connection error: ${error.stack}`)
            process.exit(1)
        }
    )
}

connectDb().catch(error => console.error(error))
Let's assume the use of then() is prohibited; you can resort to this:
const connectDb = async () => {
    try {
        await mongoose.connect(dbConfig.url, dbConfigOptions)
        console.info(`Connected to database on Worker process: ${process.pid}`)
    } catch (error) {
        console.error(`Connection error: ${error.stack} on Worker process: ${process.pid}`)
        process.exit(1)
    }
}
I would like to add event listeners to a MongoDB connection to run something when the connection drops, on each reconnection attempt, and on a successful reconnection.
I read all the official docs and the API, but I can't find a solution.
Currently, I have this, but only the timeout event works.
// If we didn't already initialize a 'MongoClient', initialize one and save it.
if (!this.client) this.client = new MongoClient();

this.connection = await this.client.connect(connectionString, this.settings);

this.client.server.on('connect', event => {
    console.log(event);
});
this.client.server.on('error', event => {
    console.log(event);
});
this.client.server.on('reconnect', event => {
    console.log(event);
});
this.client.server.on('connections', event => {
    console.log(event);
});
this.client.server.on('timeout', event => {
    console.log(event);
});
this.client.server.on('all', event => {
    console.log(event);
});
I tried the events listed here, and they work, but there is no "reconnect" event:
http://mongodb.github.io/node-mongodb-native/2.2/reference/management/sdam-monitoring/
Sure you can. Basically, though, you need to tap into the EventEmitter at a lower level than the MongoClient itself.
You can clearly see that such things exist since they are visible in "logging", which can be turned on in the driver via the setting:
{ "loggerLevel": "info" }
From there it's really just a matter of tapping into the actual source emitter. I've done that in the following listing, as well as including a little trick for getting the enumerated events from a given emitter, which I admittedly used myself in tracking this down:
const MongoClient = require('mongodb').MongoClient;

function patchEmitter(emitter) {
    var oldEmit = emitter.emit;

    emitter.emit = function() {
        var emitArgs = arguments;
        console.log(emitArgs);
        oldEmit.apply(emitter, arguments);
    }
}

(async function() {
    let db;

    try {
        const client = new MongoClient();
        client.on('serverOpening', () => console.log('connected'));

        db = await client.connect('mongodb://localhost/test', {
            //loggerLevel: 'info'
        });

        //patchEmitter(db.s.topology);
        db.s.topology.on('close', () => console.log('Connection closed'));
        db.s.topology.on('reconnect', () => console.log('Reconnected'));

    } catch(e) {
        console.error(e)
    }
})()
So those two listeners defined:
db.s.topology.on('close', () => console.log('Connection closed') );
db.s.topology.on('reconnect', () => console.log('Reconnected') );
are going to fire when the connection drops and when a reconnect is achieved. There are also other events, such as reconnect attempts, on that emitter, just as you would see with the loggerLevel setting turned on.
I have a Mongoose connection, and at some point in my program I need to close it.
After logging the mongoose object several times, I have found that the following works:
mongoose.connection.base.connections[1].close();
Is there a cleaner way to do this?
To close all connections in the Mongoose connection pool:
mongoose.disconnect();
Docs here.
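mongoose.disconnect() returns a Promise, so in a long-running process it can be awaited during shutdown. A small sketch (the SIGINT handler is just one possible place to call it):
process.on('SIGINT', async () => {
    await mongoose.disconnect(); // closes every connection in the Mongoose pool
    console.log('Mongoose connections closed');
    process.exit(0);
});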
I used a singleton in my code.
Exporting a promise
In one file I export a promise...
connection.js
const { createConnection } = require('mongoose');

function getConnection() {
    return createConnection(...).asPromise();
}

const connPromise = getConnection();

module.exports = connPromise;
other files...
const connPromise = require('./connection.js');

async function handler() {
    // Connection starts on first module import
    // This gets faster when the connection is done being created
    const connection = await connPromise;

    // Do stuff with the connection
    return {
        data: '',
        and: '',
        stuff: '',
    };
}
When everything is shutting down, like in a lambda, also in connection.js:
process.on('SIGTERM', async () => {
    console.info('[runtime] SIGTERM received');

    const conn = await connPromise;
    await conn.close();
    console.info('[runtime] closed connection');

    process.exit(0);
});
So by storing the connection in a promise, I can get it everywhere. I don't export the connection itself because getting it is async.
Exporting a getter function
I could have a function like:
const { createConnection } = require('mongoose');

let connection = null;

async function getConnection() {
    if (!connection) {
        connection = await createConnection(...).asPromise();
    }
    return connection;
}

module.exports = getConnection;
other files...
const getConnection = require('./connection.js');

async function handler() {
    // Gets faster after the first call.
    // Connection created on first call.
    const connection = await getConnection();

    // Do stuff with the connection
    return {
        data: '',
        and: '',
        stuff: '',
    };
}
Then your shutdown would become
process.on('SIGTERM', async () => {
    console.info('[runtime] SIGTERM received');

    if (connection) {
        await connection.close();
        console.info('[runtime] closed connection');
    }

    process.exit(0);
});
Summary
Overall, I like exporting the promise, since in every lambda function, I'm always using the database.
Since the exported promise means the connection starts as soon as the module is imported, it's ready to go sooner than if I waited until getConnection() is invoked in an event handler.
For having only one connection, this is a good way to get ahold of it.