I've got the following code in my nodejs app as part of an expressjs service:
let db = sqlite.open(conf.database, { Promise }).then((dba) => {
  P.all([dba.exec(accesslog_ddl), dba.exec(tokens_ddl)]).then(() => {
    console.log("Database initialized ...");
  });
  return dba;
});
let upsertToken = function (token, expire, customer) {
  return db.then(d => {
    console.log(`${token}, ${expire}, ${customer}`);
    return d.exec("insert into tokens (tokenid, name, email, cell, expire) values (?,?,'none','none','never')",
      [token, expire]);
  });
};
// ... expressjs setup ...
router.post('/tokens', (req, res) => {
  tokens.addToken(req.body.token, req.body.expires, req.body.name)
    .then(() => { res.status(201).send(); })
    .catch((e) => { res.status(500).send(e); });
});
The output in the logs is as follows:
bluhbleh, never, fred flintstone
{ [Error: SQLITE_CONSTRAINT: NOT NULL constraint failed: tokens.name] errno: 19, code: 'SQLITE_CONSTRAINT' }
POST /services/tokens 500 20.287 ms - 2
This shows that token, expire, and customer are all present before the exec() call, yet the call still fails with a NOT NULL constraint violation.
A few odd things about this code:
I'm using the promisified node-sqlite package.
I'm using a promisified expressjs router.
I really am new to promises (and nodejs generally), and am probably not doing something right.
My full code is here:
https://bitbucket.org/highaltitudearchery/locker/src/master/
You're using exec wrong. The promisified version of Database#exec only takes a single argument, the SQL to execute.
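A minimal sketch of the fix, assuming the promisified sqlite package the question mentions: switch from exec() to run(), which does support parameter binding, and keep the SQL and parameter order exactly as in the question. Ignoring the parameters is presumably why name ended up NULL.

let upsertToken = function (token, expire, customer) {
  return db.then(d =>
    // run() accepts the bound parameters (passed variadically here); exec()
    // ignores everything after the SQL string.
    d.run(
      "insert into tokens (tokenid, name, email, cell, expire) values (?,?,'none','none','never')",
      token, expire
    )
  );
};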
I set up my Node.js app with QLDB to implement a wallet service. I wrote some tests, a mix of success cases and expected-error cases, and once in a while the error 'BadRequestException: no open transaction' would occur and cause my tests to fail. If I run the tests again, they pass, and then the error shows up again at some later, unpredictable point. I noticed that when I commented out my expected-error tests, the error did not happen, or happened far less often, and that it hits not only the expected-error tests but the successful ones as well.
This is how my tests look:
describe('createWallet()', () => {
  it('should return an object with wallet Id', async () => {
    let result6 = await controller.createWallet({ body: mocks.walletInfo6.info });
    documentId6 = result6.walletId;
    expect(result6).to.have.property('walletId').that.is.a.uuid;
  });

  it('One player should have only one active wallet for each currency', async () => {
    try {
      let res = await controller.createWallet({ body: mocks.walletInfo1.info });
      assert.fail('expected error was not thrown');
    } catch (e) {
      expect(e.message).to.equal('Player already owns an active wallet in this currency.');
    }
  });
});

describe('suspendWallet()', () => {
  it('should change wallet status to suspend', async () => {
    let res = await controller.suspendWallet({ documentId: documentId3 });
    await controller.suspendWallet({ documentId: documentId5 });
    expect(res).to.be.a.string;
    expect(res).to.equal(documentId3);
  });

  it('should not change wallet status if wallet Id is invalid', async () => {
    try {
      let res = await controller.suspendWallet({ documentId: mocks.invalidWalletId });
      assert.fail('expected error was not thrown');
    } catch (e) {
      expect(e.message).to.equal('Did not find any record with this document Id.');
    }
  });
});
It's hard to be certain how your application is running into this error without seeing how the driver is being used to execute transactions.
The driver APIs (for example, execute) return promises. One way the application could be seeing the "no open transaction" error is if a promise is not resolved before further commands are sent.
Cookbook - refer to the QLDB JS driver cookbook, which lists code samples for CRUD operations. Note how the samples use await inside the transactions to wait for the promises to resolve. Not waiting for the promises returned by execute can cause the driver to commit the transaction before the execute call is processed, and hence a "no open transaction" error.
Sample code for executing transactions:
const qldb = require('amazon-qldb-driver-nodejs');
const qldbDriver = new qldb.QldbDriver("vehicle-registration");

(async function () {
  await qldbDriver.executeLambda(async (txn) => {
    await txn.execute("CREATE TABLE Person");
  });
})();
In case you still face issues, please share the code snippet where you use the driver to execute transactions.
Update on this issue: I am using the Node.js driver, version 2.1.0.
My team and I found out that the problem was caused by rollbacks that happen after the expected-error tests; we don't know when those rollbacks finish. While the rollback from the previous test is still running, its transaction is still open, so when the next test tries to open a new transaction it conflicts and cannot open one. To fix this, we simply stopped throwing errors inside the transaction, which prevents the rollbacks from happening. This works for our code, but a better solution would be a way for the driver to report when a rollback is done, so we could wait for the transaction to close before opening a new one.
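For illustration, a rough sketch of that workaround (the table, field, and function names here are made up, not taken from our code): the lambda returns an error marker instead of throwing, so the driver commits rather than rolls back, and the error is thrown only after executeLambda has returned.

async function createWalletWithoutRollback(qldbDriver, walletInfo) {
  const outcome = await qldbDriver.executeLambda(async (txn) => {
    const existing = await txn.execute(
      "SELECT * FROM Wallet WHERE owner = ? AND currency = ?",
      walletInfo.owner, walletInfo.currency
    );
    if (existing.getResultList().length > 0) {
      // Return a marker instead of throwing, so no rollback is triggered.
      return { error: 'Player already owns an active wallet in this currency.' };
    }
    await txn.execute("INSERT INTO Wallet ?", walletInfo);
    return {};
  });
  if (outcome.error) {
    // Thrown only after executeLambda has returned, i.e. after the commit.
    throw new Error(outcome.error);
  }
}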
When working with a big application that has several tables and many DB operations, it's very difficult to keep track of which transactions are occurring. To work around this, we started by passing around a trx object.
This has proven to be very messy.
For example:
async getOrderById(id: string, trx?: Knex.Transaction) { ... }
Depending on the caller, getOrderById will either be passed a trx object or not. The function above uses trx if it is not null.
This seems simple at first, but it leads to mistakes: if you're in the middle of a transaction in one function and call another function that does NOT use that transaction, Knex will hang with the famous "Knex: Timeout acquiring a connection. The pool is probably full." error.
async getAllPurchasesForUser(userId: string) {
  // ..
  const trx = await knex.transaction();
  try {
    // ..
    getPurchaseForUserId(userId); // Forgot to make this consume trx, hence Knex times out acquiring a connection.
    // ..
  }
}
Based on that, I'm assuming this is not a best practice, but I would love it if someone from the Knex developer team could comment.
To improve this, we're considering instead using knex.transactionProvider(), accessed throughout the app wherever we perform DB operations.
The example on the website seems incomplete:
// Does not start a transaction yet
const trxProvider = knex.transactionProvider();

const books = [
  { title: 'Canterbury Tales' },
  { title: 'Moby Dick' },
  { title: 'Hamlet' }
];

// Starts a transaction
const trx = await trxProvider();
const ids = await trx('catalogues')
  .insert({ name: 'Old Books' }, 'id');
books.forEach((book) => book.catalogue_id = ids[0]);
await trx('books').insert(books);

// Reuses same transaction
const sameTrx = await trxProvider();
const ids2 = await sameTrx('catalogues')
  .insert({ name: 'New Books' }, 'id');
books.forEach((book) => book.catalogue_id = ids2[0]);
await sameTrx('books').insert(books);
In practice here's how I'm thinking about using this:
SingletonDBClass.ts:
const trxProvider = knex.transactionProvider();
export default trxProvider;
Orders.ts
import trx from '../SingletonDBClass';
// ..

async getOrderById(id: string) {
  const trxInst = await trx;
  try {
    const order = await trxInst<Order>('orders').where({ id });
    trxInst.commit();
    return order;
  } catch (e) {
    trxInst.rollback();
    throw new Error(`Failed to fetch order, error: ${e}`);
  }
}
// ..
Am I understanding this correctly?
Another example function where a transaction is actually needed:
async cancelOrder(id: string) {
  const trxInst = await trx;
  try {
    trxInst('orders').update({ status: 'CANCELED' }).where({ id });
    trxInst('active_orders').delete().where({ orderId: id });
    trxInst.commit();
  } catch (e) {
    trxInst.rollback();
    throw new Error(`Failed to cancel order, error: ${e}`);
  }
}
Can someone confirm whether I'm understanding this correctly and, more importantly, whether this is a good way to do it? Or is there a best practice I'm missing?
Appreciate your help knex team!
No. You cannot have a global singleton class returning the transaction for all of your internal functions. Otherwise you are always trying to use the same transaction for all the concurrent users doing different things in the application.
Also, once you commit or roll back the transaction returned by the provider, it will no longer work for other queries. A transaction provider can give you only a single transaction.
A transaction provider is useful in a case where you have, for example, middleware that provides a transaction for request handlers, but the transaction should not be started right away, since it might not be needed and you don't want to allocate a connection for it from the pool yet.
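For example, that middleware case might look roughly like this (a sketch only; Express, the route, and the column names are my assumptions, not something from this thread):

const express = require('express');
const knexLib = require('knex');

const knex = knexLib({ client: 'pg', connection: process.env.DATABASE_URL });
const app = express();

// Attach a provider per request; no connection is taken from the pool yet.
app.use((req, res, next) => {
  req.trxProvider = knex.transactionProvider();
  next();
});

app.post('/orders/:id/cancel', async (req, res) => {
  const trx = await req.trxProvider(); // the transaction starts here, on first use
  try {
    await trx('orders').update({ status: 'CANCELED' }).where({ id: req.params.id });
    await trx.commit();
    res.status(204).send();
  } catch (e) {
    await trx.rollback();
    res.status(500).send();
  }
});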
A good way to do your stuff is to pass the transaction (or some request context or user session) around, so that each concurrent user can have their own separate transactions.
for example:
async cancelOrder(trxInst, id: string) {
  try {
    await trxInst('orders').update({ status: 'CANCELED' }).where({ id });
    await trxInst('active_orders').delete().where({ orderId: id });
    await trxInst.commit();
  } catch (e) {
    await trxInst.rollback();
    throw new Error(`Failed to cancel order, error: ${e}`);
  }
}
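A caller might then look roughly like this (a sketch; handleCancelRequest is a made-up name and knex is the instance from earlier in the thread), with each request owning its own transaction:

async function handleCancelRequest(orderId) {
  const trx = await knex.transaction();
  await cancelOrder(trx, orderId); // commits or rolls back inside cancelOrder
}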
Depending on the caller, getOrderById will either be passed a trx object or not. The function above uses trx if it is not null.
This seems simple at first, but it leads to mistakes: if you're in the middle of a transaction in one function and call another function that does NOT use that transaction, Knex will hang with the famous "Knex: Timeout acquiring a connection. The pool is probably full." error.
We usually do it in a way where, if trx is null, the query throws an error, so that you need to explicitly pass either knex or trx to be able to execute the method; in some methods trx is actually required to be passed.
Anyhow, if you really want to force everything to go through a single transaction per session by default, you could create your API modules so that for each user session you create an API instance which is initialized with a transaction:
const dbForSession = new DbService(trxProvider);
const users = await dbForSession.allUsers();
and .allUsers() does something like return this.trx('users');
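One way to wire that up, sketched from the two lines above (DbService and allUsers are names from this comment, not part of the Knex API):

class DbService {
  constructor(trxProvider) {
    this.trxProvider = trxProvider;
  }

  async allUsers() {
    const trx = await this.trxProvider(); // starts the session's transaction on first use
    return trx('users');
  }
}

// one instance per user session, as in the usage above:
// const dbForSession = new DbService(knex.transactionProvider());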
I've got a function setClaims which simply sets claims for the current transaction. I need to create a helper function that checks whether a uuid was provided, sets the claims, and returns the transaction object. Something like this:
export const useAuthTransaction = async (
  kx: Knex,
  setClaims: (tx: Knex.Transaction, uuid?: string) => Promise<any>,
  uuid?: string,
): Promise<Knex.Transaction> => {
  if (!uuid) {
    throw { errorCodes: [RemoveQuestionErrorCode.Unauthorized] };
  }
  return kx.transaction(async tx => {
    await setClaims(tx);
    return tx;
  });
};
But when I try to use it in my resolver:
useAuthTransaction(kx, setClaims, claims?.uid).then(async tx =>
dataSources.questionAPI.update(tx, input));
it says:
Transaction query already complete
How do I resolve transaction context without closing it?
Returning a promise from the transaction handler automatically commits or rolls back the transaction once that promise settles.
Since you are using an async tx => {...} style of handler, it implicitly always returns a promise.
What you want to do is use the knex.transactionProvider() feature (check the examples in the documentation); then you can pass that transaction around and say explicitly when it should be committed or rolled back.
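A rough sketch of that suggestion applied to the helper from the question (TypeScript annotations dropped and the error shape simplified; the caller, not the helper, decides when to commit or roll back):

const useAuthTransaction = async (kx, setClaims, uuid) => {
  if (!uuid) {
    throw new Error('Unauthorized');
  }
  const trxProvider = kx.transactionProvider(); // nothing is started yet
  const tx = await trxProvider();               // the transaction starts here
  await setClaims(tx, uuid);
  return tx;                                    // still open; no handler callback to auto-commit it
};

// In the resolver:
// const tx = await useAuthTransaction(kx, setClaims, claims?.uid);
// try {
//   await dataSources.questionAPI.update(tx, input);
//   await tx.commit();
// } catch (e) {
//   await tx.rollback();
//   throw e;
// }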
I am using Dialogflow to build an Action for Google Assistant. Everything works, except the fulfillment of my intent.
I am using the Inline Editor (powered by Cloud Functions for Firebase) to fulfill the intent. My function in itself runs, since I can send text to the Assistant from the function.
But for some reason, code execution never enters the function that fetches data from my collection on Firebase Firestore, although it does execute commands before and after it.
Here is the code in my index.js.
'use strict';

const admin = require('firebase-admin');
const functions = require('firebase-functions');
admin.initializeApp(functions.config().firebase);
let db = admin.firestore();

const {dialogflow} = require('actions-on-google');
const app = dialogflow({debug: true});

app.intent('INTENT', (conv, {ENTITY}) => {
  conv.add("Hello."); //THIS IS DISPLAYED
  db.collection("COLLECTION").orderBy("FIELD", "desc").get().then(snapshot => {
    conv.add("Hey!"); //THIS IS NOT DISPLAYED
    snapshot.forEach(doc => {
      conv.add("Hi?"); //NOR IS THIS
    });
    conv.add("Hmm..."); //NEITHER THIS
  }).catch(error => {
    conv.add('Error!'); //NOT EVEN THIS
  });
  conv.add("Bye."); //THIS IS DISPLAYED TOO
});

exports.dialogflowFirebaseFulfillment = functions.https.onRequest(app);
Clearly, this shows that execution never really entered the db.collection(...) block, and hence the function didn't even throw an error.
Here are the logs from Firebase.
Function execution started
Billing account not configured...
Warning, estimating Firebase Config based on GCLOUD_PROJECT. Initializing firebase-admin may fail
Request {...}
Headers {...}
Conversation {...}
Response {...}
Function execution took 1681 ms, finished with status code: 200
I know that the Firestore call fetches data asynchronously, but there seems to be no way I can execute anything even inside its .then(...) block.
I also tried returning a Promise from the .then(...) block and chaining a second .then(...) onto it, which again didn't work.
var fetch = db.collection("COLLECTION").orderBy("FIELD", "desc").get().then(snapshot => {
  conv.add("Hey!"); //NOT DISPLAYED
  var responseArr = [];
  snapshot.forEach(doc => {
    conv.add("Hi?"); //NOT DISPLAYED
    responseArr.push(doc);
  });
  conv.add("Hmm..."); //NOT DISPLAYED
  return Promise.resolve(responseArr);
}).then(fetch => {
  conv.add("Here?"); //NOT DISPLAYED
}).catch(error => {
  conv.add('Error!'); //NOT DISPLAYED
});
Finally, I also tried putting the Firestore call in a separate function, like this.
function getData() {
  return db.collection("COLLECTION").orderBy("FIELD", "desc").get().then(snapshot => {
    snapshot.forEach(doc => {
      // ...
    });
    return data; // Manipulated from above. 'data' can be a string.
  }).catch(error => {
    return error;
  });
}

app.intent('INTENT', (conv, {ENTITY}) => {
  conv.add("Hello."); //THIS IS DISPLAYED
  conv.add(getData()); //THIS IS NOT DISPLAYED
  conv.add("Bye."); //THIS IS DISPLAYED
});
The problem is that you're doing an asynchronous operation (the call to get()), but you're not returning a Promise from the Intent Handler itself. The library requires you to return a Promise so it knows that there is an async operation taking place.
Returning a Promise from inside the then() portion isn't enough - that doesn't return a value from the handler, it just returns a value that is passed to the next then() function or (if it was the last one) as the return value of the entire Promise chain.
In your original code, this can be done just by returning the get().then().catch() chain, i.e. by putting return in front of the line that starts it:
return db.collection("COLLECTION").orderBy("FIELD", "desc").get() // etc etc
In your second example, the fetch parameter in your then() block is not the fetch variable you think it is, and it only confuses matters. Structured that way, you would need to return fetch (the whole assigned promise chain) from the handler.
Your third example is more complicated. The line
conv.add(getData());
doesn't even seem like it would work, on the surface, because it is returning a Promise, but you can't add a promise to the conv object. You would need to rewrite that part as
return getData()
.then( data => conv.add( data ) );
But that doesn't address how the "Bye" line would work. If you actually wanted "Bye" after the data, you would have to include it as part of the then() block.
In short, when dealing with async data, you need to
Make sure you understand how Promises work and make sure all async work is done using Promises.
Add all your data inside the then() portion of a Promise
Return a Promise correctly
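Putting those points together, the original handler could look roughly like this (a sketch reusing the question's placeholders; each document's FIELD is assumed to hold a string the Assistant can speak):

app.intent('INTENT', (conv, {ENTITY}) => {
  conv.add("Hello.");
  // Return the promise chain so the library waits for the Firestore query.
  return db.collection("COLLECTION").orderBy("FIELD", "desc").get()
    .then(snapshot => {
      snapshot.forEach(doc => {
        conv.add(doc.get("FIELD"));
      });
      conv.add("Bye."); // added inside then(), so it comes after the data
    })
    .catch(error => {
      conv.add('Error!');
    });
});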
I had the following code, which worked:
let sdk = new SDK({ name: "somevalue" });

module.exports = () => {
  sdk.dothing();
};
I then needed to change the parameter to use data from an async function:
let asyncfunc = require('asyncfunc');

let sdk = new SDK({
  name: (() => {
    return asyncfunc()
      .then((data) => {
        return data.value;
      });
  })()
});

module.exports = () => {
  sdk.dothing();
};
Following the change, the call to new SDK fails because the parameter passed is {}, as the asyncfunc promise has not yet resolved.
I'm getting back into Node after a year away and am new to promises. What is the proper way to do this?
As you've found, you can't pass in a promise to something that's expecting a string. You need to wait for the asynchronous operation to complete.
This means that your SDK won't be ready right away, so you have two options:
Change your module so it returns a promise for the needed value. Anyone who needs to use your module would need to use the returned promise.
Example:
let pSdk = asyncFunc()
  .then(data => new SDK({ name: data.value }));

module.exports = () => pSdk.then(sdk => sdk.dothing());
Store an sdk value that's not populated immediately. Users of your module can obtain the SDK instance directly, but it might not be ready when they need it.
Example:
let sdk;
asyncFunc()
  .then(data => sdk = new SDK({ name: data.value }));

module.exports = () => {
  if (!sdk) { throw new Error("The SDK is not ready yet!"); }
  return sdk.dothing();
};
If any bit of code in Node is asynchronous, the next bit of code is executed immediately; it doesn't matter whether the asynchronous code is wrapped in a promise or not. (For code wrapped in a promise, a pending promise is returned, to be resolved or rejected later, and execution proceeds to the next bit of code.) When you create the object using new SDK({ }), name holds a reference to a pending promise that has not yet settled, which is why your code fails to do what you want. You can resolve the problem this way:
asyncfunc()
  .then((data) => {
    return new SDK({ name: data.value });
  })
  .then(function (sdk) {
    // do your work here using sdk
  });
One important point to note here is that you can't return from .then() to assign the value to an outer variable the way you are doing. The value returned from a .then() is accessible in the next chained .then(), not via an outside global variable. Since you are exporting sdk.dothing(), you need to do that work inside the last .then().
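Tying that back to the module from the question, one way to keep the exported work inside the chain (essentially the same shape as option 1 in the answer above):

const sdkReady = asyncfunc()
  .then((data) => new SDK({ name: data.value }));

// The exported function waits on the chain instead of touching sdk directly.
module.exports = () => sdkReady.then((sdk) => sdk.dothing());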