Knex event on transaction success - node.js

I have a function that takes a transaction object as an argument. Can this function subscribe to an event that fires when the transaction is committed?
function createUser (data, trx) {
  trx.on('success', .. )
  return User.create(data, { transacting: trx })
}
I don't see anything like that in the source; if it doesn't exist, maybe inner/outer transactions can be used somehow.
https://github.com/tgriesser/knex/blob/master/src/transaction.js

I never found a better solution. But the transaction is an event emitter, so you can override the default knex functions to emit your own custom event.
Override commit to fire the event:
knex = require('knex')({...});

const _transaction = knex.transaction;
knex.transaction = (cb) => {
  return _transaction(trx => {
    const _commit = trx.commit;
    trx.commit = async (conn, value) => {
      const out = await _commit(conn, value);
      trx.emit('commit');
      return out;
    };
    return cb(trx);
  });
};
Listen to the commit event anywhere in your code:
knex.transaction(async trx => {
  trx.on('commit', async () => {
    // fired after commit is done
  });
  await trx.select().from('users').update({...});
});
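With that override in place, the createUser function from the question can subscribe to the custom event directly (a sketch; User.create with { transacting: trx } is the question's own Bookshelf-style call):

function createUser (data, trx) {
  // 'commit' is the custom event emitted by the patched trx.commit above
  trx.on('commit', () => {
    console.log('user is committed and visible to other connections');
  });
  return User.create(data, { transacting: trx });
}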

Related

Node.js TCP server: process packets in order

I am using a TCP server to receive and process packets in Node.js. It should receive 2 packets:
"create" for creating an object in a database. It first checks if the object already exists and then creates it. (-> this step takes some time to process)
"update" for updating the newly created object in the database
For the sake of simplicity, we'll just assume the first step always takes longer than the second. (which is always true in my original code)
This is an MWE:
const net = require("net");

const server = net.createServer((conn) => {
  conn.on('data', async (data) => {
    console.log(`Instruction ${data} received`);
    await sleep(1000);
    console.log(`Instruction ${data} done`);
  });
});
server.listen(1234);

const client = net.createConnection(1234, 'localhost', async () => {
  client.write("create");
  await sleep(10); // just a cheap workaround to "force" sending 2 packets instead of one
  client.write("update");
});

// Just to make it easier to read
function sleep(ms) {
  return new Promise((resolve) => {
    setTimeout(resolve, ms);
  });
}
If I run this code I get:
Instruction create received
Instruction update received
Instruction create done
Instruction update done
But I want the "create" instruction to block conn.on('data', func) until its callback has finished asynchronously. As it stands, the code tries to update an entry before it has been created in the database, which is not ideal.
Is there an (elegant) way to achieve this? I suspect I need some kind of buffer which stores the data, and a worker loop of some kind which processes it. But how do I avoid running an infinite loop that blocks the event loop? (Event loop is the correct term, isn't it?)
Note: I have a lot more logic to handle fragmentation, etc., but this illustrates the issue I'm having.
I managed to get it to work with the package async-fifo-queue.
It's not the cleanest solution, but it does what I want about as efficiently as possible (using async/await instead of just looping infinitely).
Code:
const net = require("net");
const afq = require("async-fifo-queue");

const q = new afq.Queue();

const server = net.createServer((conn) => {
  conn.on('data', q.put.bind(q));
});
server.listen(1234);

const client = net.createConnection(1234, 'localhost', async () => {
  client.write("create");
  await sleep(10);
  client.write("update");
});

(async () => {
  while (server.listening) {
    const data = await q.get();
    console.log(`Instruction ${data} received`);
    await sleep(1000);
    console.log(`Instruction ${data} done`);
  }
})();

function sleep(ms) {
  return new Promise((resolve) => {
    setTimeout(resolve, ms);
  });
}
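As an aside: since Node 10, sockets are async iterables, so the same in-order processing can be had with a for await loop and no extra queue package; the next chunk isn't pulled until the loop body finishes. A sketch, reusing the sleep helper from above:

const net = require("net");

const server = net.createServer(async (conn) => {
  // The loop body fully completes (including awaits) before the
  // next 'data' chunk is read from the socket.
  for await (const data of conn) {
    console.log(`Instruction ${data} received`);
    await sleep(1000);
    console.log(`Instruction ${data} done`);
  }
});
server.listen(1234);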
You can pause the socket when you get the "create" instruction. After it finishes, you can resume the socket. Example:
const server = net.createServer((conn) => {
  conn.on('data', async (data) => {
    // 'data' is a Buffer, so convert it before comparing to a string
    const instruction = data.toString();
    if (instruction === 'create') {
      conn.pause();
    }
    console.log(`Instruction ${instruction} received`);
    await sleep(1000);
    console.log(`Instruction ${instruction} done`);
    if (instruction === 'create') {
      conn.resume();
    }
  });
});
server.listen(1234);

const client = net.createConnection(1234, 'localhost', async () => {
  client.write("create");
  await sleep(10); // just a cheap workaround to "force" sending 2 packets instead of one
  client.write("update");
});

// Just to make it easier to read
function sleep(ms) {
  return new Promise((resolve) => {
    setTimeout(resolve, ms);
  });
}
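Note that pause() only stops 'data' events from being emitted; bytes arriving while the socket is paused are buffered and delivered after resume(), so nothing is lost. Keep in mind, though, that TCP is a byte stream: two write() calls can still arrive coalesced into a single 'data' chunk, which is why the fragmentation handling mentioned in the question is still needed.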

Why is async Firebase fetching not working? (Node.js)

Building a Node.js REST API.
I'm trying to load data from a Firebase collection and then send it to the user (as the API response).
It looks like the problem is that the code doesn't wait for the Firebase fetch to resolve, but sends back a response without the collection data (I tried to use async/await, but it's not working).
exports.getChatMessages = async (req, res, next) => {
  const chatId = req.params.chatId
  const getChatData = () => {
    db
      .collection('chats')
      .doc(chatId)
      .collection('messages')
      .orderBy('timeStamp', 'asc')
      .onSnapshot((snapshot) => {
        snapshot.docs.forEach(msg => {
          console.log(msg.data().messageContent)
          return {
            authorID: msg.data().authorID,
            messageContent: msg.data().messageContent,
            timeStamp: msg.data().timeStamp,
          }
        })
      })
  }
  try {
    const chatData = await getChatData()
    console.log(chatData)
    res.status(200).json({
      message: 'Chat Has Found',
      chatData: chatData
    })
  } catch (err) {
    if (!err.statusCode) {
      err.statusCode = 500
    }
    next(err)
  }
}
As you can see, I used 2 console.logs to figure out what the problem is. The terminal logs look like:
[] (from console.log(chatData))
All messages (from console.log(msg.data().messageContent))
Is there any way to block the code until the Firebase data has really been fetched?
If I understand correctly, you want to send back an array of all the documents present in the messages subcollection. The following should do the trick:
exports.getChatMessages = async (req, res, next) => {
  const chatId = req.params.chatId;
  const collectionRef = db
    .collection('chats')
    .doc(chatId)
    .collection('messages')
    .orderBy('timeStamp', 'asc');
  try {
    const chatsQuerySnapshot = await collectionRef.get();
    const chatData = [];
    chatsQuerySnapshot.forEach((msg) => {
      console.log(msg.data().messageContent);
      chatData.push({
        authorID: msg.data().authorID,
        messageContent: msg.data().messageContent,
        timeStamp: msg.data().timeStamp,
      });
    });
    console.log(chatData);
    res.status(200).json({
      message: 'Chat Has Found',
      chatData: chatData,
    });
  } catch (err) {
    if (!err.statusCode) {
      err.statusCode = 500; // assign the property, don't call it
    }
    next(err);
  }
};
The asynchronous get() method returns a QuerySnapshot, on which you can call forEach() to enumerate all of its documents.
You can only await a Promise. Currently, getChatData() does not return a Promise, so awaiting it is pointless. You are trying to await a fixed value, so it resolves immediately and jumps to the next line. console.log(chatData) happens. Then, later, your (snapshot) => callback happens, but too late.
const getChatData = () => new Promise(resolve => { // Return a Promise, so it can be awaited
  db.collection('chats')
    .doc(chatId)
    .collection('messages')
    .orderBy('timeStamp', 'asc')
    .onSnapshot(resolve) // Equivalent to .onSnapshot((snapshot) => resolve(snapshot))
})

const snapshot = await getChatData();
console.log(snapshot)

// Keep your transform logic out of the function that calls the DB. A function
// should only do one thing if possible: call or transform, not both.
// Note: a QuerySnapshot has no .map(), but its .docs property is a plain array.
const chatData = snapshot.docs.map(msg => ({
  authorID: msg.data().authorID,
  messageContent: msg.data().messageContent,
  timeStamp: msg.data().timeStamp,
}));

res.status(200).json({
  message: 'Chat Has Found',
  chatData
})
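One caveat with resolving from inside onSnapshot: the listener stays attached after the Promise resolves. Since onSnapshot returns an unsubscribe function, a sketch that detaches after the first result could look like:

const getChatData = () => new Promise(resolve => {
  const unsubscribe = db.collection('chats')
    .doc(chatId)
    .collection('messages')
    .orderBy('timeStamp', 'asc')
    .onSnapshot(snapshot => {
      unsubscribe(); // stop listening after the first snapshot
      resolve(snapshot);
    });
});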
Right now, getChatData is this (short version):
const getChatData = () => {
  db
    .collection('chats')
    .doc(chatId)
    .collection('messages')
    .orderBy('timeStamp', 'asc')
    .onSnapshot((snapshot) => {}) // some things inside
}
What that means is that the getChatData function calls some db query, and then returns void (nothing). I bet you'd want to return the db call (hopefully it's a Promise), so that your await does some work for you. Something along the lines of:
const getChatData = async () =>
  db
    .collection('chats')
    // ...

Which is the same as const getChatData = async () => { return db... }
Update: Now that I've reviewed the docs once again, I see that you use onSnapshot, which is meant for updates and can fire multiple times. The first call actually makes a request, but then it continues to listen for updates. Since this looks like a regular request/response and you want it to happen only once, use .get() (docs) instead of .onSnapshot(). Otherwise those listeners would stay around and cause trouble. .get() returns a Promise, so the sample fix mentioned above works perfectly and you don't need to change other pieces of the code.
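Putting that advice together, a minimal sketch of the fixed helper (assuming the same db handle and chatId as in the question):

const getChatData = () =>
  db.collection('chats')
    .doc(chatId)
    .collection('messages')
    .orderBy('timeStamp', 'asc')
    .get(); // returns a Promise<QuerySnapshot> and settles exactly once

const snapshot = await getChatData();
const chatData = snapshot.docs.map(msg => msg.data());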

How to promise / await a callback being called, when the callback is passed to another function

I have a TypeScript method that calls a method, on(), which takes a callback. I'd like myConnect() to wait until the callback is executed. I assume this needs a Promise, but I'm not sure how to write the method so it waits until the callback is called.
myConnect(): void {
  this.innerProducer.connect();
  this.innerProducer.on("ready", () => {
    Logger.info("producer is ready to produce for topic {}", this.topic);
    this.isReadyToProduce = true;
  });
}
I attempted to use util.promisify, but it's not quite right:
async connect2() {
  this.innerProducer.connect();
  const util = require("util");
  const readyCallbackFunc = util.promisify(this.innerProducer.on);
  await readyCallbackFunc("ready", () => {
    Logger.info("producer is ready to produce for topic {}", this.topic);
    this.isReadyToProduce = true;
  });
}
error: (node:23144) UnhandledPromiseRejectionWarning: TypeError: Cannot read property '_events' of undefined
API is defined here:
https://github.com/Blizzard/node-rdkafka/blob/129cb733f5b3271523fb27cd38c08de0f20e0515/index.d.ts#L196
on<E extends Events>(event: E, listener: EventListener<E>): this;
You need to wrap the on call in a Promise. (The promisify attempt fails because passing this.innerProducer.on as a bare function reference detaches it from its receiver: this is undefined when the method runs, hence the error about reading _events of undefined.)
async myConnect(): Promise<void> {
  this.innerProducer.connect();
  await new Promise<void>((resolve, reject) => {
    // TODO: Add reject call on some error.
    this.innerProducer.on("ready", () => {
      Logger.info("producer is ready to produce for topic {}", this.topic);
      this.isReadyToProduce = true;
      resolve();
    });
  });
}
Then, wherever you need it, use await myConnect() or myConnect().then(() => { /* some code here! */ });
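As an aside, Node's built-in once helper from the events module (Node 11.13+) wraps a single event emission in a Promise for you, and it also rejects if the emitter fires 'error' first. A shorter sketch of the same method:

import { once } from "events";

async myConnect(): Promise<void> {
  this.innerProducer.connect();
  await once(this.innerProducer, "ready"); // resolves on the first 'ready' emission
  Logger.info("producer is ready to produce for topic {}", this.topic);
  this.isReadyToProduce = true;
}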

How to abort an async function when a 'stop' event has been emitted

I am trying to make a Puppeteer bot that can pause and resume its work.
In general, I have a class with a dozen async methods, an event emitter, and a property called 'state' with a setter to change it. When the 'stop' event fires, I want some async functions to be aborted. How can I achieve this?
I thought I needed to observe when this.state becomes 'stop' and then run return;, but I haven't found any solution.
Then I decided to try setting a handler on the event that changes state to 'stop', but I cannot abort async functions from the handler on the stop event.
constructor() {
  this.state = 'none';
  this.emiter = new events.EventEmitter();
  this.setHandler('stop', () => this.stop());
  this.setHandler('resume', () => this.resume());
  this.setHandler('onLoginPage', () => this.passAuth());
  // ...
  // And a dozen other states with their handlers
}

stop = () => this.setState('stoped', true);
resume = () => this.setState(this.getPreviousState());
getPreviousState = () => ...

// Just an example of a state handler. It has async calls as well.
// I want to abort this function when the 'stop' event is emitted.
#errorCatcher()
async passAuth() {
  const { credentials } = Setup.Instance;
  await this.page.waitForSelector(LOGIN);
  await typeToInput(this.page, EMAIL_INPUT, credentials.login);
  await typeToInput(this.page, PWD_INPUT, credentials.pass);
  await Promise.all([
    await this.page.click(LOGIN),
    this.page.waitForNavigation({ timeout: 600000 }),
  ]);
  await this.page.waitFor(500);
  await DomMutations.setDomMutationObserver(this.page, this.socketEmitter);
}

// ...
// And a dozen of handlers for the corresponding states

setState(nextState, resume) {
  // Avoid changing state while paused.
  // But resume() can force setState with argument resume = true.
  if (this.state === 'stoped' && !resume) return false;
  console.log(`\nEmitted FSM#${nextState}`);
  this.emiter.emit(`FSM#${nextState}`);
}

setHandler(state, handler) {
  this.emiter.on(`FSM#${state}`, async () => {
    this.state = state;
    console.log(`State has been changed: ${this.getPreviousState()} ==> ${this.state}. Running handler.\n`);
    //
    // On the next line, we run the corresponding handler func,
    // like passAuth() for state 'onLoginPage'. It has to be aborted
    // if emiter gets the 'FSM#stoped' event.
    //
    await handler();
  });
}
}
I expect the async functions to be aborted when the event emitter emits 'stop'.
It is impossible to do this natively.
There are, however, two other ways to do it.
Check your state after each await, for example:
class Stated {
  async run() {
    await foo()
    if (this.stopped) return
    await bar()
    if (this.stopped) return
    await done()
  }
}

const s = new Stated()
s.run()
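For what it's worth, Node 15+ (and modern browsers) ship AbortController, which standardizes this same cooperative check; the function still has to test the signal itself. A sketch, with foo/bar as the placeholders from above:

async function run(signal) {
  await foo();
  if (signal.aborted) return; // cooperative check, same idea as this.stopped
  await bar();
}

const ac = new AbortController();
run(ac.signal);
// later, e.g. from the 'stop' event handler:
ac.abort();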
Use a generator with a custom wrapper rather than async/await:
// the wrapper
function co(gen, isStopped = () => false) {
  return new Promise((resolve, reject) => {
    if (!gen || typeof gen.next !== 'function') return resolve(gen)
    onFulfilled()

    function onFulfilled(res) {
      let ret
      try {
        ret = gen.next(res)
      } catch (e) {
        return reject(e)
      }
      next(ret)
    }

    function onRejected(err) {
      let ret
      try {
        ret = gen.throw(err)
      } catch (e) {
        return reject(e)
      }
      next(ret)
    }

    function next(ret) {
      if (ret.done || isStopped()) return resolve(ret.value)
      Promise.resolve(ret.value).then(onFulfilled, onRejected)
    }
  });
}

// the following is your code:
class Stated {
  * run() {
    yield foo()
    yield bar()
    yield done()
  }
}

const s = new Stated()
co(s.run(), () => s.stopped)
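Either way, cancellation is cooperative rather than preemptive: whatever async step is already in flight settles normally, and stopping only prevents the chain from progressing past the next checkpoint. Nothing in JavaScript forcibly kills an in-flight async operation.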

kafka-node asynchronous consumer handler

That's how my consumer is initialised:
const client = new kafka.Client(config.ZK_HOST)
const consumer = new kafka.Consumer(client, [{ topic: config.KAFKA_TOPIC, offset: 0 }], {
  autoCommit: false
})
The consumer then handles messages with consumer.on('message', message => applyMessage(message)).
The thing is, applyMessage talks to the database using knex, and the code looks something like:
async function applyMessage(message: kafka.Message) {
  const usersCount = await db('users').count()
  // just assume we ABSOLUTELY need to calculate a number of users,
  // so we need previous state
  await db('users').insert(inferUserFromMessage(message))
}
The code above causes applyMessage to execute in parallel for all the messages from Kafka. So, given that there are no users in the database yet, usersCount will ALWAYS be 0, even for the second message from Kafka, where it should already be 1 since the first call to applyMessage inserts a user.
How do I "synchronise" the code so that all the applyMessage calls run sequentially?
You'll need to implement some sort of mutex: basically, a class which queues up tasks to execute sequentially. Example:
var Mutex = function() {
  this.queue = [];
  this.locked = false;
};

Mutex.prototype.enqueue = function(task) {
  this.queue.push(task);
  if (!this.locked) {
    this.dequeue();
  }
};

Mutex.prototype.dequeue = function() {
  this.locked = true;
  const task = this.queue.shift();
  if (task) {
    this.execute(task);
  } else {
    this.locked = false;
  }
};

Mutex.prototype.execute = async function(task) {
  try { await task(); } catch (err) { }
  this.dequeue();
};
In order for this to work, your applyMessage function (whichever function handles Kafka messages) needs to return a Promise. Notice also that the async keyword has moved from the outer function to the executor passed into the Promise:
function applyMessage(message: kafka.Message) {
  return new Promise(async function(resolve, reject) {
    try {
      const usersCount = await db('users').count()
      // just assume we ABSOLUTELY need to calculate a number of users,
      // so we need previous state
      await db('users').insert(inferUserFromMessage(message))
      resolve();
    } catch (err) {
      reject(err);
    }
  });
}
Finally, each invocation of applyMessage needs to be added to the Mutex queue instead of being called directly:
var mutex = new Mutex();
consumer.on('message', message => mutex.enqueue(function() { return applyMessage(message); }))
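For comparison, a lighter-weight sketch of the same serialization idea: since async functions already return Promises, each message can simply be chained onto a single Promise, and the original async applyMessage works unchanged:

let chain = Promise.resolve();

consumer.on('message', (message) => {
  // each handler starts only after the previous one has settled
  chain = chain
    .then(() => applyMessage(message))
    .catch(err => console.error('applyMessage failed', err));
});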
