Sequelize Query Execution In Loop - node.js

Below, I am calling addUpdateDailyLeads with an array like:
[{
"yyyymmdd": "20191124",
"admin_login":"rasheed.s",
"category":"PO",
"amount":10,
"office_id":10000,
"new_leads_attempted":10
},
{
"yyyymmdd": "20191124",
"admin_login":"rasheed.s",
"category":"PO",
"amount":10,
"office_id":10000,
"new_leads_attempted":10
},
{
"yyyymmdd": "20191125",
"admin_login":"prajeed.av",
"category":"FHL",
"amount":10,
"office_id":10000,
"new_leads_attempted":10
}
]
So key 0 should insert,
key 1 should update because of the duplicate key constraint,
key 2 should insert,
but I'm getting a duplicate key constraint error on key 1, because Array.prototype.map does not wait for the query to be executed.
const addUpdateDailyLeads = async (req, res) => {
  let admin_login, category, office_id, new_leads_attempted, yyyymmdd, where, values;
  let data = req.body;
  req.body.map(async function (item, i) {
    admin_login = item.admin_login;
    category = item.category;
    office_id = item.office_id;
    new_leads_attempted = item.new_leads_attempted;
    yyyymmdd = item.yyyymmdd;
    where = { yyyymmdd: yyyymmdd, admin_login: admin_login, category: category };
    values = { yyyymmdd: yyyymmdd, admin_login: admin_login, category: category, office_id: office_id, new_leads_attempted: new_leads_attempted, update_date: moment().format('YYYYMMDDHHmmss') };
    console.log("calling ", i);
    let chck = await addUpdateDailyLeadsCollection({ where: where, values: values });
    console.log("");
    console.log("called");
  });
  res.json({ code: '200', message: `Advisor Daily Leads Updated ${admin_login}` });
}
const addUpdateDailyLeadsCollection = async data => {
  let transaction;
  let where = data.where;
  let values = data.values;
  var Sequelize = require("sequelize");
  console.log("startef 1");
  await AdvisorLeads.findOne({ where: where }, { useMaster: true }).then(async (data) => {
    console.log("waited");
    if (data) {
      await data.update({ new_leads_attempted: Sequelize.literal('new_leads_attempted + ' + values.new_leads_attempted) }).then(data => {
        console.log("updated");
        return Promise.resolve(1);
      });
    } else {
      AdvisorLeads.create(values).then(data => {
        console.log("inserted");
        return Promise.resolve(1);
      });
    }
  });
};
Final output on the console:
calling 0
startef 1
waiting 1
calling 1
startef 1
waiting 1
calling 2
startef 1
waiting 1
waited
waited
waited
called
called
called
inserted
inserted
My expected output is:
calling 0
startef 1
waiting 1
waited
inserted
called
calling 1
startef 1
waiting 1
waited
updated
called
calling 2
startef 1
waiting 1
waited
inserted
called
Finally, what I need is to wait for each item, execute all of its queries, and only then process the next item.
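The underlying behaviour can be reproduced without Sequelize: Array.prototype.map fires every async callback immediately, while a plain for loop with await runs them one at a time. A minimal sketch (the mock query function is mine, standing in for the DB call):

```javascript
// A stand-in for the DB call: resolves after a short delay.
const fakeQuery = (i, log) =>
  new Promise((resolve) => setTimeout(() => { log.push(`done ${i}`); resolve(); }, 10));

async function withMap(items) {
  const log = [];
  items.map(async (item, i) => {           // all callbacks start at once
    log.push(`calling ${i}`);
    await fakeQuery(i, log);
  });
  return log;                              // returns before any query finishes
}

async function withFor(items) {
  const log = [];
  for (let i = 0; i < items.length; i++) { // one item at a time
    log.push(`calling ${i}`);
    await fakeQuery(i, log);
  }
  return log;
}

withMap([1, 2, 3]).then((log) => console.log(log));
// [ 'calling 0', 'calling 1', 'calling 2' ]  -- no 'done' yet
withFor([1, 2, 3]).then((log) => console.log(log));
// [ 'calling 0', 'done 0', 'calling 1', 'done 1', 'calling 2', 'done 2' ]
```

This is exactly why the console shows all three "calling" lines before any "waited": map kicks off all the promises and the function returns without awaiting them.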

I think you can solve this by using await on the update and create statements.
But also take a look at the UPSERT method, which could simplify your code quite a bit. From the Sequelize API reference: "Insert or update a single row. An update will be executed if a row which matches the supplied values on either the primary key or a unique key is found."
Addendum: for synchronizing async/await there are many ways to do this, as detailed in this post. Here's some code I set up following the ES7 method:
let params = [{ id: 1, sal: 10 }, { id: 44, sal: 30 }, { id: 1, sal: 20 }];

async function doUpsertArrayInSequence(myParams) {
  let results = [];
  for (let i = 0; i < myParams.length; i++) {
    let x = await User.findByPk(myParams[i].id).then(async (u) => {
      if (u != null) {
        await u.update({ sal: u.sal + myParams[i].sal });
      } else {
        await User.create({ id: myParams[i].id, sal: myParams[i].sal });
      }
    });
    results.push(x);
  }
  return results;
}

await doUpsertArrayInSequence(params).then(function (result) {
  User.findAll().then(proj => {
    res.send(proj);
    next();
  });
})
From the log, I can see
a) a SELECT, followed by an UPDATE or INSERT for each row (in sequence).
b) the 2nd occurrence of id=1 reflects the update of the 1st occurrence.
c) the final findAll reflects all inserts and updates.
HTH

Related

Node-sqlite3: Efficiently select rows by various ids, in a single query

I'd like to write a wrapper function select(db: any, ids: number[]): Promise<Cat[]> that returns an array of Cat rows fetched from the DB by ID. The function should return the entire array of rows.
Below is one approach I've written. Instead of calling db.each on every ID in a for-loop as I do below, is it possible to pass my entire ids: number[] array as a parameter to a db.all / db.each query function?
// dbmethods.ts
async function select(db: any, ids: number[]): Promise<Cat[]> {
  let query = "SELECT * FROM cats_table WHERE id = ?;";
  let cats_back: Cat[] = [];
  for (let i = 0; i < ids.length; i++) {
    let cat: Promise<Cat> = new Promise(async function (resolve, reject) {
      await db.get(query, ids[i], (err: Error, row: any) => {
        if (err) {
          reject(err);
        } else {
          let cat: Cat = {
            index: row.id,
            cat_type: row.cat_type,
            health: row.health,
            num_paws: row.num_paws
          };
          resolve(cat);
        }
      });
    });
    cats_back.push(await cat);
  }
  return cats_back;
}
and
// index.ts
let ids = create_many_ids(10_000); // returns an array of unique ordered ints between 0 and 10K
let res = await select(db, ids);
console.log(res); // successfully prints my cats
Benchmarks on my select function above suggest that it takes 300ms to select 10_000 rows by ID. That seems a little long; 10K rows shouldn't take that much time with sqlite's select-by-ID. How can I be more efficient?
SELECT * FROM cats_table WHERE id IN (SELECT value FROM json_each(?));
The query parameter is a string representing a JSON array of ids, e.g., '[1, 2, 4]'
See this tutorial for further details
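A minimal sketch of how the single-query form is invoked (the helper name is mine; the sqlite3 call at the end is shown as a comment since it needs an open database):

```javascript
// The whole ids array travels as ONE bound parameter: a JSON string,
// which sqlite unpacks server-side with json_each().
const query =
  "SELECT * FROM cats_table WHERE id IN (SELECT value FROM json_each(?));";

// Hypothetical helper: produce the [sql, params] pair for db.all().
function selectManyArgs(ids) {
  return [query, [JSON.stringify(ids)]];
}

const [sql, params] = selectManyArgs([1, 2, 4]);
console.log(params[0]); // '[1,2,4]'
// Usage with node-sqlite3: db.all(sql, params, (err, rows) => { ... });
```

This replaces 10,000 round trips through the prepared-statement machinery with a single query, which is where the bulk of the 300ms was going.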

MongoDB transaction doesn't work as expected - Incorrect total price on “order” document when its "order items" are being updated concurrently

I have 2 collections: order and orderitems where an order consists of order item(s).
The order has field price which should be the sum of the price of the order items only if the status is “new”.
For example, order ABC has 3 order items :
Price: 5, status: “new”
Price: 10, status: “paid”
Price: 15, status: “new”
This order should have the price = 20.
What happens is sometimes the order price does not calculate correctly when the order items’ status is updated concurrently from “new” to “paid” or vice versa.
For example, when order item 1 and 3 are updated to “paid”, the order price is still 20 instead of 0.
My update order item API (Node JS) looks like this:
const orderitem = await services
  .query("orderitem")
  .findOne({ _id: orderitemId });

const currStatus = orderitem.status;
const newStatus = request.status ?? currStatus;
const currPrice = orderitem.price;
const newPrice = request.price ?? currPrice;

const statusNew = "new";
let inc = 0.0;
if (newStatus === statusNew && currStatus === statusNew) {
  inc = newPrice - currPrice;
} else if (newStatus === statusNew && currStatus !== statusNew) {
  inc = newPrice;
} else if (newStatus !== statusNew && currStatus === statusNew) {
  inc = currPrice * -1;
}

const session = services.common.startDbSession();
try {
  await session.withTransaction(async () => {
    if (inc !== 0.0) {
      await services.order.updateOutstandingBalance(
        orderitem.order._id,
        inc,
        session
      );
    }
    await services
      .query("orderitem")
      .model.updateOne(params, request)
      .session(session);
  });
} finally {
  await session.endSession();
}
And here is the updateOutstandingBalance function:
updateOutstandingBalance: async (_id, inc = 0.0, session = null) => {
  await services
    .query("order")
    .model.updateOne(
      { _id: _id },
      {
        $inc: { price: inc },
        updatedAt: new Date(),
        myLock: { appName: "myApp", pseudoRandom: new ObjectID() },
      }
    )
    .session(session);
}
It seems like the update sometimes just doesn’t $inc correctly.
P.S.: The transaction is using write concern "majority" and read concern "local".
What I have tried so far:
Calculate the total of the order item price using $sum (the read is also inside the transaction) then update the order
Write-lock all the order's orderitems but this caused a lot of spikes on the database operations because of the write retry
Use write concern "majority" and read concern "majority" or “snapshot”
And, this issue still happens.
Any help is really appreciated, thank you.
What happens is sometimes the order price does not calculate correctly when the order items’ status is updated concurrently from “new” to “paid” or vice versa.
For example, when order item 1 and 3 are updated to “paid”, the order price is still 20 instead of 0.
So, you have process A, which changes an item's status, and process B, which calculates the order price.
These operations are linearizable, with B executing first and A executing second, so B's total reflects the state from before A's change.
The fix: trigger another price update after the status change.
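For reference, the branching that computes the increment can be factored into a pure function (the function name is mine) and unit-tested independently of the transaction machinery:

```javascript
// Pure version of the increment logic from the question:
// the order price only tracks items whose status is "new".
function computeIncrement(currStatus, currPrice, newStatus, newPrice) {
  const statusNew = "new";
  if (newStatus === statusNew && currStatus === statusNew) {
    return newPrice - currPrice;  // price changed while counted
  } else if (newStatus === statusNew && currStatus !== statusNew) {
    return newPrice;              // item starts being counted
  } else if (newStatus !== statusNew && currStatus === statusNew) {
    return -currPrice;            // item stops being counted
  }
  return 0.0;                     // never counted either way
}

console.log(computeIncrement("new", 5, "paid", 5));   // -5
console.log(computeIncrement("paid", 10, "new", 10)); // 10
```

The logic itself is sound; the race is between the findOne that reads currStatus/currPrice (outside the transaction) and a concurrent writer, which is what makes the later $inc apply a stale delta.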

Append Collections together

I have two collections, both have a structure like this:
id  trips_in
1   5
2   10

id  trips_out
1   6
2   8
My question is how can I combine them into a single collection like such:
id  trips_in  trips_out
1   5         6
2   10        8
I found out about mapReduce, but its functionality looks like more than what I need. I wrote the following query:
tripsInMap = function() {
  var values = {
    trips_in: this.trips_in
  };
  emit(this._id, values);
};

tripsOutMap = function() {
  var values = {
    trips_out: this.trips_out
  };
  emit(this._id, values);
};

reduce = function(key, values) {
  var result = {
    "trips_out": "",
    "trips_in": ""
  };
  values.forEach(function(value) {
    if (value.trips_out !== null) { result.trips_out = value.trips_out; }
    if (value.trips_in !== null) { result.trips_in = value.trips_in; }
  });
  return result;
}

db.tripsIn.mapReduce(tripsInMap, reduce, {"out": {"reduce": "joined"}});
db.tripsOut.mapReduce(tripsOutMap, reduce, {"out": {"reduce": "joined"}});
However, I end up with "trips_in": undefined. I wonder if there is a better method.
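A likely cause of the undefined fields, for reference: values emitted by only one of the two map functions simply lack the other field, and since undefined !== null is true, the reduce overwrites good values with undefined. A guarded reduce (plain JS, runnable outside the mongo shell; the sample invocation at the end is mine) would be:

```javascript
// Reduce that only copies fields that are actually present on the value.
var reduce = function (key, values) {
  var result = { trips_out: "", trips_in: "" };
  values.forEach(function (value) {
    if (value.trips_out !== undefined) { result.trips_out = value.trips_out; }
    if (value.trips_in !== undefined) { result.trips_in = value.trips_in; }
  });
  return result;
};

// Simulating what mongod would pass for _id 1 from both collections:
console.log(reduce(1, [{ trips_in: 5 }, { trips_out: 6 }]));
// { trips_out: 6, trips_in: 5 }
```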
While this may not be the fastest way, you could try something like this:
// Create the new collection with data from the tripsIn collection
db.tripsIn.find().forEach(function(trip) {
  db.tripsJoined.insert({ _id: trip._id, trips_in: trip.trips_in, trips_out: 0 });
})

// Update the trips_out field in the tripsJoined collection,
// using upsert:true to insert records that are not found
db.tripsOut.find().forEach(function(trip) {
  db.tripsJoined.update(
    { _id: trip._id },
    { $inc: { trips_in: 0, trips_out: trip.trips_out } },
    { upsert: true });
})
The first loop iterates through each document in the tripsIn collection and inserts a corresponding document into the tripsJoined collection, with trips_out initialized to 0.
The second loop iterates over the tripsOut collection and, for each document, updates the corresponding tripsJoined document with the trips_out value.
Note that I added {$inc: {trips_in: 0, ...}} and upsert:true. This was done so that if any trip documents exist in the tripsOut collection without a corresponding _id in the tripsIn collection, the document is inserted and the trips_in field is initialized to 0.

Copying data from one DB to another with node-sqlite - formatting the 'insert' statement

I'm writing a small utility to copy data from one sqlite database file to another. Both files have the same table structure - this is entirely about moving rows from one db to another.
My code right now:
let tables: Array<string> = [
  "OneTable", "AnotherTable", "DataStoredHere", "Video"
]

tables.forEach((table) => {
  console.log(`Copying ${table} table`);
  sourceDB.each(`select * from ${table}`, (error, row) => {
    console.log(row);
    destDB.run(`insert into ${table} values (?)`, ...row) // this is the problem
  })
})
row here is a js object, with all the keyed data from each table. I'm certain that there's a simple way to do this that doesn't involve escaping stringified data.
If your database driver has not blocked ATTACH, you can simply tell the database to copy everything:
ATTACH '/some/where/source.db' AS src;
INSERT INTO main.MyTable SELECT * FROM src.MyTable;
You could iterate over the row and setup the query with dynamically generated parameters and references.
let tables: Array<string> = [
  "OneTable", "AnotherTable", "DataStoredHere", "Video"
]

tables.forEach((table) => {
  console.log(`Copying ${table} table`);
  sourceDB.each(`select * from ${table}`, (error, row) => {
    console.log(row);
    const keys = Object.keys(row);   // ['column1', 'column2']
    const columns = keys.join(',');  // 'column1,column2'
    // Named placeholders: '$column1,$column2'
    const values = keys.map((r) => '$' + r).join(',');
    // Named parameters: { $column1: 'foo', $column2: 'bar' }
    let parameters = {};
    keys.forEach((r) => {
      parameters['$' + r] = row[r];
    });
    // SQL: insert into OneTable (column1,column2) values ($column1,$column2)
    // Parameters: { $column1: 'foo', $column2: 'bar' }
    destDB.run(`insert into ${table} (${columns}) values (${values})`, parameters);
  })
})
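The placeholder-building step can be isolated and tested without a database (the helper name is mine):

```javascript
// Build the SQL text and named-parameter object for one row.
function buildInsert(table, row) {
  const keys = Object.keys(row);
  const sql = `insert into ${table} (${keys.join(',')}) ` +
              `values (${keys.map((k) => '$' + k).join(',')})`;
  const parameters = {};
  for (const k of keys) parameters['$' + k] = row[k];
  return { sql, parameters };
}

const { sql, parameters } = buildInsert('OneTable', { column1: 'foo', column2: 'bar' });
console.log(sql);
// insert into OneTable (column1,column2) values ($column1,$column2)
```

Note that only the values travel as bound parameters; the table and column names are interpolated into the SQL text, so this is only safe when the schema names come from trusted code, as in the table list above.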
Tried editing the answer by @CL., but it was rejected. So, adding on to that answer, here's the JS code to achieve the same:
let sqlite3 = require('sqlite3-promise').verbose();

let sourceDBPath = '/source/db/path/logic.db';
let tables = ["OneTable", "AnotherTable", "DataStoredHere", "Video"];
let destDB = new sqlite3.Database('/your/dest/logic.db');

await destDB.runAsync(`ATTACH '${sourceDBPath}' AS sourceDB`);
await Promise.all(tables.map(table =>
  destDB.runAsync(`
    CREATE TABLE ${table} AS
    SELECT * FROM sourceDB.${table}`
  ).catch(e => {
    console.error(e);
    throw e;
  })
));

Nested transactions with pg-promise

I am using NodeJS, PostgreSQL and the amazing pg-promise library. In my case, I want to execute three main queries:
Insert one tweet in the table 'tweets'.
In case there are hashtags in the tweet, insert them into another table 'hashtags'.
Then link both the tweet and the hashtags in a third table 'hashtagmap' (a many-to-many relational table).
Here is a sample of the request's body (JSON):
{
  "id": "12344444",
  "created_at": "1999-01-08 04:05:06 -8:00",
  "userid": "#postman",
  "tweet": "This is the first test from postman!",
  "coordinates": "",
  "favorite_count": "0",
  "retweet_count": "2",
  "hashtags": {
    "0": { "name": "test", "relevancetraffic": "f", "relevancedisaster": "f" },
    "1": { "name": "postman", "relevancetraffic": "f", "relevancedisaster": "f" },
    "2": { "name": "bestApp", "relevancetraffic": "f", "relevancedisaster": "f" }
  }
}
All the fields above should go into the table "tweets", except hashtags, which should go into the table "hashtags".
Here is the code I am using, based on the Nested transactions section of the pg-promise docs, inside a NodeJS module. I guess I need nested transactions because I need to know both tweet_id and hashtag_id in order to link them in the hashtagmap table.
// Columns
var tweetCols = ['id','created_at','userid','tweet','coordinates','favorite_count','retweet_count'];
var hashtagCols = ['name','relevancetraffic','relevancedisaster'];

// pgp Column Sets
var cs_tweets = new pgp.helpers.ColumnSet(tweetCols, {table: 'tweets'});
var cs_hashtags = new pgp.helpers.ColumnSet(hashtagCols, {table: 'hashtags'});

return {
  // Transactions
  add: body =>
    rep.tx(t => {
      return t.one(pgp.helpers.insert(body, cs_tweets) + " ON CONFLICT(id) DO UPDATE SET coordinates = " + body.coordinates + " RETURNING id")
        .then(tweet => {
          var queries = [];
          for (var i = 0; i < body.hashtags.length; i++) {
            queries.push(
              t.tx(t1 => {
                return t1.one(pgp.helpers.insert(body.hashtags[i], cs_hashtags) + "ON CONFLICT(name) DO UPDATE SET fool ='f' RETURNING id")
                  .then(hash => {
                    t1.tx(t2 => {
                      return t2.none("INSERT INTO hashtagmap(tweetid,hashtagid) VALUES(" + tweet.id + "," + hash.id + ") ON CONFLICT DO NOTHING");
                    });
                  });
              }));
          }
          return t.batch(queries);
        });
    })
}
The problem is that with this code I am able to successfully insert the tweet, but nothing happens after that. I cannot insert the hashtags nor link them to the tweet.
Sorry, but I am new to coding, so I guess I didn't understand how to properly return from the transaction and how to perform this simple task. Hope you can help me.
Thank you in advance.
Jean
Improving on Jean Phelippe's own answer:
// Columns
var tweetCols = ['id', 'created_at', 'userid', 'tweet', 'coordinates', 'favorite_count', 'retweet_count'];
var hashtagCols = ['name', 'relevancetraffic', 'relevancedisaster'];

// pgp Column Sets
var cs_tweets = new pgp.helpers.ColumnSet(tweetCols, {table: 'tweets'});
var cs_hashtags = new pgp.helpers.ColumnSet(hashtagCols, {table: 'hashtags'});

return {
  /* Tweets */

  // Add a new tweet and update the corresponding hash tags
  add: body =>
    db.tx(t => {
      return t.one(pgp.helpers.insert(body, cs_tweets) + ' ON CONFLICT(id) DO UPDATE SET coordinates = ' + body.coordinates + ' RETURNING id')
        .then(tweet => {
          var queries = Object.keys(body.hashtags).map((_, idx) => {
            return t.one(pgp.helpers.insert(body.hashtags[idx], cs_hashtags) + ' ON CONFLICT(name) DO UPDATE SET fool = $1 RETURNING id', 'f')
              .then(hash => {
                return t.none('INSERT INTO hashtagmap(tweetid, hashtagid) VALUES($1, $2) ON CONFLICT DO NOTHING', [+tweet.id, +hash.id]);
              });
          });
          return t.batch(queries);
        });
    })
      .then(data => {
        // transaction was committed;
        // data = [null, null, ...] as per t.none('INSERT INTO hashtagmap...
      })
      .catch(error => {
        // transaction rolled back
      })
},
NOTES:
As per my notes earlier, you must chain all queries, or else you will end up with loose promises
Stay away from nested transactions, unless you understand exactly how they work in PostgreSQL (read this, and specifically the Limitations section).
Avoid manual query formatting, it is not safe, always rely on the library's query formatting.
Unless you are passing the result of transaction somewhere else, you should at least provide the .catch handler.
P.S. For the syntax like +tweet.id, it is the same as parseInt(tweet.id), just shorter, in case those are strings ;)
For those who face a similar problem, I will post the answer.
Firstly, my mistakes:
In the for loop: body.hashtags.length doesn't exist, because I am dealing with an object, not an array (very basic mistake here). Changed it to Object.keys(body.hashtags).length.
Why use so many transactions? Following the answer by vitaly-t in Interdependent Transactions with pg-promise, I removed the extra transactions. It's still not clear to me how you can open one transaction and use the result of one query in another query within the same transaction.
Here is the final code:
// Columns
var tweetCols = ['id','created_at','userid','tweet','coordinates','favorite_count','retweet_count'];
var hashtagCols = ['name','relevancetraffic','relevancedisaster'];

// pgp Column Sets
var cs_tweets = new pgp.helpers.ColumnSet(tweetCols, {table: 'tweets'});
var cs_hashtags = new pgp.helpers.ColumnSet(hashtagCols, {table: 'hashtags'});

return {
  /* Tweets */

  // Add a new tweet and update the corresponding hashtags
  add: body =>
    rep.tx(t => {
      return t.one(pgp.helpers.insert(body, cs_tweets) + " ON CONFLICT(id) DO UPDATE SET coordinates = " + body.coordinates + " RETURNING id")
        .then(tweet => {
          var queries = [];
          for (var i = 0; i < Object.keys(body.hashtags).length; i++) {
            queries.push(
              t.one(pgp.helpers.insert(body.hashtags[i], cs_hashtags) + "ON CONFLICT(name) DO UPDATE SET fool ='f' RETURNING id")
                .then(hash => {
                  t.none("INSERT INTO hashtagmap(tweetid,hashtagid) VALUES(" + tweet.id + "," + hash.id + ") ON CONFLICT DO NOTHING");
                })
            );
          }
          return t.batch(queries);
        });
    }),
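The object-vs-array pitfall from the first bullet is easy to check in plain Node (the sample data mirrors the request body shown above):

```javascript
// body.hashtags is a plain object keyed by "0", "1", ..., not an array.
const hashtags = {
  "0": { name: "test" },
  "1": { name: "postman" },
  "2": { name: "bestApp" }
};

console.log(hashtags.length);               // undefined - objects have no .length
console.log(Object.keys(hashtags).length);  // 3

// Indexing with a number still works, because 0 is coerced to the key "0":
console.log(hashtags[0].name);              // test
```

This is why the original for loop never ran: its bound was undefined, so the condition i < undefined was false on the first iteration and the tweet was inserted with no hashtag queries batched.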
