UPDATE: SOLUTION FOUND. ARANGODB CLUSTERS DO NOT SUPPORT TRANSACTIONS; THEY ARE ONLY SUPPORTED ON SINGLE INSTANCES.
I am trying to use transactions via the arangoJS library. The function below is just a dummy that inserts two records and then tries to get a document that doesn't exist. Fetching the nonexistent document generates an error, so the transaction should roll back. Indeed, the error is generated after trying to get the missing document; however, the database does not roll back, and the two inserted documents remain in the database. Does anyone know how to solve this?
"updateCustomer" : function (options, cb) {
const action = String(function (params) {
// This code will be executed inside ArangoDB!
const db = require('#arangodb').db;
const aql = require('#arangodb').aql;
const customer = db._collection('customer');
try{
//insert two documents
db._query(aql`INSERT ${params.user} INTO ${customer} Return NEW`);
db._query(aql`INSERT ${params.customer} INTO ${customer} Return NEW`);
//Get a document that doesn't exist
customer.document('does-not-exist');
}catch(e){
throw new Error("Everything is bad");
}
});
let opts = {
collections : {
read : ["customer"],
write : ["customer"]
},
action : action,
params : {user: options, customer: options},
lockTimeout : 5
};
Arango.transaction(opts,(err, result) => {
console.log("err: " + err);
console.log("result: " + JSON.stringify(result));
return cb(err, result);
});
}
"transaction" : function (options, cb) {
utils.dbConnect().transaction(options.collections, options.action, options.params, options.lockTimeout, cb);
}
UPDATE: I tried this transaction on a single instance ArangoDB and it worked. However, it did not work on a cluster. Is there no support for transactions on ArangoDB clusters?
Single-document operations are atomic in ArangoDB clusters. Multi-document operations are not, as of now. We are currently working on ACID for multi-document operations.
I am using Mongoose to access my database. I need to use transactions to make an atomic insert-update.
95% of the time my transaction works fine, but 5% of the time this error shows up:
"Given transaction number 1 does not match any in-progress transactions"
It's very difficult to reproduce this error, so I really want to understand where it comes from in order to get rid of it.
I could not find a very clear explanation of this type of behaviour.
I have tried using the async/await keywords on various functions; I don't know if an operation is not finishing in time or is running too soon.
Here is the code I am using:
export const createMany = async function (req, res, next) {
    if (!isIterable(req.body)) {
        res.status(400).send('Wrong format of body')
        return
    }
    if (req.body.length === 0) {
        res.status(400).send('The body is well formed (an array) but empty')
        return
    }
    const session = await mongoose.startSession()
    session.startTransaction()
    try {
        const packageBundle = await Package.create(req.body, { session })
        const options = []
        for (const key in packageBundle) {
            if (Object.prototype.hasOwnProperty.call(packageBundle, key)) {
                options.push({
                    updateOne: {
                        filter: { _id: packageBundle[key].id },
                        update: {
                            $set: {
                                custom_id_string: 'CAB' + packageBundle[key].custom_id.toLocaleString('en-US', {
                                    minimumIntegerDigits: 14,
                                    useGrouping: false
                                })
                            }
                        },
                        // note: upsert is an option of updateOne, not part of the update document
                        upsert: true
                    }
                })
            }
        }
        await Package.bulkWrite(
            options,
            { session }
        )
        for (const key in packageBundle) {
            if (Object.prototype.hasOwnProperty.call(packageBundle, key)) {
                packageBundle[key].custom_id_string = 'CAB' + packageBundle[key].custom_id.toLocaleString('en-US', {
                    minimumIntegerDigits: 14,
                    useGrouping: false
                })
            }
        }
        res.status(201).json(packageBundle)
        await session.commitTransaction()
    } catch (error) {
        res.status(500).end()
        await session.abortTransaction()
        throw error
    } finally {
        session.endSession()
    }
}
I expect my code to insert and then update the packages atomically, so that the database is never left in an unstable state.
This works perfectly most of the time, but I need to be sure this bug stops showing up.
You should use the session.withTransaction() helper function to perform the transaction, as pointed out in the Mongoose documentation. It takes care of starting, committing, and retrying the transaction in case it fails.
const session = await mongoose.startSession();
await session.withTransaction(async () => {
    // Your transaction methods
});
Explanation:
The multi-document transactions in MongoDB are relatively new and might be a bit unstable in some cases, such as the one described here. It has certainly also been reported in Mongoose here. Your error is most probably a TransientTransactionError due to a write conflict happening when the transaction is committed.
However, this is a known and expected issue in MongoDB, and these comments explain their reasoning behind why they decided it should behave like this. Moreover, they state that the user should handle write-conflict cases and retry the transaction when they happen.
Therefore, looking at your code, the Package.create(...) call seems to be what triggers the error, since this method executes a save() for every document in the array (from the Mongoose docs).
A quick solution might be to use Package.insertMany(...) instead of create(), since Model.insertMany() "only sends one operation to the server, rather than one for each document" (from the Mongoose docs).
However, MongoDB provides the helper function session.withTransaction() (available since release v3.2.1), which takes care of starting and committing the transaction and retrying it in case of an error. Hence, this should be your preferred way to work with transactions; it is, of course, available in Mongoose through the Node.js API.
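Putting both suggestions together, here is a minimal sketch of how the createMany handler from the question could look; it assumes the same Package model and Express setup as above, and is meant as an illustration rather than a drop-in replacement:
export const createMany = async function (req, res, next) {
    const session = await mongoose.startSession()
    try {
        let packageBundle
        // withTransaction starts the transaction, commits it, and retries it
        // on transient errors such as the write conflict described above
        await session.withTransaction(async () => {
            // insertMany sends one operation instead of one save() per document
            packageBundle = await Package.insertMany(req.body, { session })
            const options = packageBundle.map(pkg => ({
                updateOne: {
                    filter: { _id: pkg.id },
                    update: {
                        $set: {
                            custom_id_string: 'CAB' + pkg.custom_id.toLocaleString('en-US', {
                                minimumIntegerDigits: 14,
                                useGrouping: false
                            })
                        }
                    },
                    upsert: true
                }
            }))
            await Package.bulkWrite(options, { session })
        })
        res.status(201).json(packageBundle)
    } catch (error) {
        res.status(500).end()
    } finally {
        session.endSession()
    }
}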
The accepted answer is great. In my case, I was running multiple transactions serially within a session and was still facing this issue every now and then, so I wrote a small helper to resolve it.
File 1:
// do some work here
await session.withTransaction(() => {});
// ensure the earlier transaction is completed
await ensureTransactionCompletion(session);
// do some more work here
await session.withTransaction(() => {});
Utils File:
async ensureTransactionCompletion(session: ClientSession, maxRetryCount: number = 50) {
    // When we are trying to split our operations into multiple transactions,
    // we sometimes get an error that the earlier transaction is still in progress.
    // To avoid that, we ensure the earlier transaction has finished.
    let count = 0;
    while (session.inTransaction()) {
        if (count >= maxRetryCount) {
            break;
        }
        // Add a delay so that the transaction can be committed
        await new Promise(r => setTimeout(r, 100));
        count++;
    }
}
MongoDB version 3.6.7, MongoDB Node.js driver version 3.1.10.
I have a function which should add some data to the DB in an unordered bulk operation. When bulk.execute() is called, the data is inserted into the DB (I have manually checked that this is the case); however, the BulkWriteResult object which should be generated as a result of the execution isn't being returned.
We need that object in other functions to determine the number of insertions etc. So when the function in question is chained in other functions it returns undefined, and adding bulk.execute().then(console.log) also logs nothing to the terminal.
The function takes 3 parameters: the MongoClient, the name of the collection, and the documents to be inserted into the DB, which is an array of documents.
I have also tried adding err and result callbacks, with no luck; none of the console.logs are reached.
batch.execute((err, result) => {
    console.log('RESULT INSERTED:', result.nInserted)
    console.log('RESULT ERRORS:', result.getWriteErrorCount())
    console.log('RESULT INSIGHTS:', result.getWriteErrors())
    console.log('ERROR:', err)
})
Any ideas why the BulkWriteResult would not be returned even though the bulk insertion is successful? The function was working and returning the expected object prior to upgrading the MongoDB Node driver to 3.1.10.
const insertManyMissingEntries = (database, collectionName, documents) => {
    const db = database.db('data')
    const collection = db.collection(collectionName)
    const batch = collection.initializeUnorderedBulkOp()
    documents.forEach(doc => {
        batch
            .find({ year: doc.year, month: doc.month, code: doc.code })
            .upsert()
            .updateOne({ '$setOnInsert': doc })
    })
    return batch.execute()
}
Here is your function, rewritten to use collection.bulkWrite() instead of the legacy bulk API:
const insertManyMissingEntries = (database, collectionName, documents) => {
    if (documents.length === 0)
        return Promise.reject('documents length is zero')
    const operations = documents.map((doc) => {
        const filter = {year: doc.year, month: doc.month, code: doc.code}
        const update = {$setOnInsert: doc}
        return {updateOne: {'filter': filter, 'update': update, 'upsert': true}}
    })
    const db = database.db('data')
    const collection = db.collection(collectionName)
    const bulkOptions = {ordered: false}
    return collection.bulkWrite(operations, bulkOptions)
}
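For illustration, a hedged usage sketch (client is assumed to be an already-connected MongoClient): collection.bulkWrite() resolves with a BulkWriteResult whose counters replace nInserted and the getWriteError* helpers of the legacy bulk API.
// 'entries' is a placeholder collection name
insertManyMissingEntries(client, 'entries', documents)
    .then(result => {
        console.log('upserted:', result.upsertedCount)
        console.log('matched:', result.matchedCount)
    })
    .catch(err => console.error('bulk write failed:', err))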
If the documents parameter is empty (i.e. length = 0), the collection bulkWrite function will always fail, so watch out for that.
You also need to make sure you have an index on the fields year, month, and code, or your bulk operation will execute slowly, because every update will otherwise perform a collection scan to find the document.
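For example, a sketch of creating such an index once at setup time, reusing the names from the function above:
// a compound index lets the { year, month, code } filter use an index scan
// instead of a full collection scan
database.db('data')
    .collection(collectionName)
    .createIndex({ year: 1, month: 1, code: 1 }, { background: true })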
Remember, the driver docs are your best friend:
http://mongodb.github.io/node-mongodb-native/3.1/api/Collection.html
There wasn't a way to resolve this issue, though we believe it was caused by a Docker Mongo image that was not the official one or was out of date. Even the folks at MongoDB weren't able to replicate it or explain any circumstances in which a promise should fail to either resolve or reject within the timescales the functions were working to. So, like many things, it remains a mystery that we had to work around: we used a ternary to handle the undefined resulting from the promise error, without ever actually solving the underlying issue. Not ideal or best practice, but in the absence of other options it's what we did.
Thanks for everyone's help and comments in trying to resolve it; much appreciated.
I have a basic NodeJS Couchbase script straight from their documentation. It just inserts a document and immediately runs a N1QL query for the inserted document.
var couchbase = require('couchbase')
var cluster = new couchbase.Cluster('couchbase://localhost/');
cluster.authenticate('admin', 'admini');
var bucket = cluster.openBucket('application');
var N1qlQuery = couchbase.N1qlQuery;
bucket.manager().createPrimaryIndex(function() {
    bucket.upsert('user:king_arthur', {
        'email': 'kingarthur@couchbase.com',
        'interests': ['Holy Grail', 'African Swallows']
    },
    function (err, result) {
        bucket.get('user:king_arthur', function (err, result) {
            console.log('Got result: %j', result.value);
            bucket.query(
                N1qlQuery.fromString('SELECT * FROM application WHERE $1 IN interests LIMIT 1'),
                ['African Swallows'],
                function (err, rows) {
                    console.log("Got rows: %j", rows);
                });
        });
    });
});
This is what it returns:
bash-3.2$ node nodejsTest.js
Got result: {"email":"kingarthur@couchbase.com","interests":["Holy Grail","African Swallows"]}
Got rows: []
I was expecting the inserted document in the "rows" array.
Any idea why this very basic nodeJS starter script is not working?
Key/value read writes are always consistent (that is, if you write a document, then retrieve it by ID, you will always get back what you just wrote). However, updating an index for N1QL queries takes time and can affect performance.
As of version 5.0, you can control your consistency requirements to balance the trade-off between performance and consistency. By default, Couchbase uses the Not_bounded mode, in which queries execute immediately, without waiting for indexing to catch up. This is what causes the issue you're seeing: your query executes before the index has been updated with the mutation you just made.
You can read about this further here: https://developer.couchbase.com/documentation/server/current/indexes/performance-consistency.html
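For instance, a sketch against the 2.x Node SDK used above: requesting REQUEST_PLUS consistency on that query makes it wait for the upsert to be indexed before executing, at the cost of some latency.
var query = N1qlQuery
    .fromString('SELECT * FROM application WHERE $1 IN interests LIMIT 1')
    .consistency(couchbase.N1qlQuery.Consistency.REQUEST_PLUS);

bucket.query(query, ['African Swallows'], function (err, rows) {
    // with REQUEST_PLUS the row for user:king_arthur should now show up
    console.log("Got rows: %j", rows);
});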
I am working on a nodeJS application that inserts documents into a mongoDB collection. The document has a createdAt field that records the time of creation. I also have a function that logs the most recent match of a query to the console.
Code:
function getResult(player) {
    let timeList = [];
    MongoClient.connect(url, (err, db) => {
        if (err) {
            console.log(err);
        }
        else {
            let count = db.collection('PokeBook').count({ 'player1': player }); //count function returns a promise
            count.then((val) => {
                db.collection('PokeBook').find({ 'player1': player }).forEach((doc) => {
                    timeList.push(doc.createdAt);
                    console.log(doc.createdAt);
                    if (val === timeList.length) {
                        myEmitter.emit('full', timeList);
                    }
                });
            });
        }
    });
}
//code for the emitter:
myEmitter.on('full', (arr) => {
    console.log('full emitted');
    console.log(moment.max(arr));
});
The code returns an error saying moments[i].isValid is not a function. Commenting out the moment.max line results in successfully logging "full emitted" to the console.
Any advice on why this happens and how to fix it will be much appreciated. :)
moment.max() expects an array of Moment objects, but doc.createdAt is (probably) a regular Date object. You can try to replace:
timeList.push(doc.createdAt);
with:
timeList.push(moment(doc.createdAt));
so arr would be an array of Moment objects.
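A quick self-contained illustration of the difference (the dates here are made up):
const moment = require('moment');
const dates = [new Date('2018-01-05'), new Date('2018-03-01')];
// moment.max(dates) would throw "moments[i].isValid is not a function",
// because the array holds plain Dates; wrapping each one first works:
const wrapped = dates.map(d => moment(d));
console.log(moment.max(wrapped).format()); // logs the 2018-03-01 moment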
Alternatively, you can implement max yourself without using moment, assuming doc.createdAt is either a Date or a Number. Note that sort() needs a numeric comparator here, since the default sort compares string representations:
console.log(arr.sort((a, b) => a - b)[arr.length - 1]);
I have a mongodb Relationships collection that stores the user_id and the followee_id (the person the user is following). If I query against the user_id, I can find all the individuals the user is following. Next, I need to query the Users collection against all of the returned followee ids to get their personal information. This is where I am confused. How would I accomplish this?
NOTE: I know I can embed the followees in the individual user's document and use an $in operator, but I do not want to go this route. I want to maintain as much flexibility as I can.
You can use an $in query without denormalizing the followees on the user. You just need to do a little bit of data manipulation:
Relationship.find({user_id: user_id}, function(error, relationships) {
    var followee_ids = relationships.map(function(relationship) {
        return relationship.followee_id;
    });
    User.find({_id: { $in: followee_ids }}, function(error, users) {
        // voila
    });
});
If I got your problem right (I think so), you need to query each of the "individuals the user is following".
That means running multiple queries against the database, one per followee, and collecting the data.
Because queries in Node.js (I assume you are using Mongoose) are asynchronous, you need to structure your code asynchronously for this task.
If you are not familiar with the async module in Node.js, it's about time to get to know it.
See npm async for docs.
I made you a sample of what your query needs to look like:
/*array of followee_id from the last query*/
function query(followee_id_arr, callback) {
    var async = require('async')
    var allResults = [];
    // eachSeries runs one findOne at a time, in input order
    async.eachSeries(followee_id_arr, function (f_id, done) {
        db.userCollection.findOne({_id : f_id}, {_id : 1, personalData : 1}, function (err, data) {
            if (err) { done(err) }
            else {
                allResults.push(data);
                done()
            }
        })
    }, function (err) {
        // called once, after every followee has been fetched
        callback(err, allResults);
    })
}
You can even run all the queries in parallel (for better performance) by using async.map, as sketched below.
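A minimal sketch of the parallel version, assuming the same db.userCollection as above:
function queryParallel(followee_id_arr, callback) {
    var async = require('async')
    // async.map runs all the findOne calls in parallel and collects
    // the results in the same order as the input ids
    async.map(followee_id_arr, function (f_id, done) {
        db.userCollection.findOne({_id : f_id}, {_id : 1, personalData : 1}, done)
    }, function (err, allResults) {
        callback(err, allResults)
    })
}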