Knex Inserting Same Record Twice when Query Executed Asynchronously - node.js

In my /api/v1/chats POST route I make an async call to user_chat_queries.addUserChat for each user_id passed in the request body. The idea is that if lots of user_ids come in I don't want each insertion to await the previous one; instead I'd like to dispatch all the insertions asynchronously, so I do:
Asynchronous Route handler (https://github.com/caseysiebel/lang-exchange/blob/master/src/server/routes/chats.js#L57):
await Promise.all(user_ids.map((user_id) => {
  return user_chat_queries.addUserChat(user_id, chat.id)
}));
As opposed to,
Synchronously:
for (let user_id of user_ids) {
  await user_chat_queries.addUserChat(user_id, chat.id)
}
And in the user_chat_queries (https://github.com/caseysiebel/lang-exchange/blob/master/src/server/db/queries/user_chat.js#L5):
addUserChat: (async (user_id, chat_id) => {
  const user_chat = await userChats
    .insert({ user_id, chat_id })
    .returning('*')
  const data = await db('user_chat').select('*') // debug: dump the whole table
  return user_chat;
}),
Now the route is accessed from my test file: (https://github.com/caseysiebel/lang-exchange/blob/master/test/routes.chats.test.js#L83)
it('should add 2 user_chats', (done) => {
  chai.request(server)
    .post('/api/v1/chats')
    .send({
      created_at: Date.now(),
      user_ids: [ 2, 4 ]
    })
    .end((err, res) => {
      should.not.exist(err);
      res.status.should.equal(201);
      res.type.should.equal('application/json');
      res.body.status.should.eql('success');
      const chat = res.body.data;
      chat.should.include.keys('id', 'created_at');
      knex('user_chat')
        .select('*')
        .then((data) => console.log('data', data))
      done();
    });
});
The log shows that the { user_id: 4, chat_id: 3 } record is inserted into the user_chat table twice.
The expected result (and the result when executed synchronously) is one { user_id: 2, chat_id: 3 } record and one { user_id: 4, chat_id: 3 } record are inserted into the user_chat table.
I can't track down what is causing this. It seems to me that addUserChat should insert a record made from the inputs it is passed each time, regardless of when it resolves.
Full code base: https://github.com/caseysiebel/lang-exchange
Debugging console output: https://gist.github.com/caseysiebel/262997efdd6467c72304ee783dadd9af#file-console-L5

You need a mapSeries-type mechanism, which is not available in the default Promise API, so you may need a third-party library like Bluebird.
More details: Bluebird's mapSeries.

You don't show what userChats is in your code examples, but it looks like you may have a global instance of knex that you reuse across different queries, such as:
const userChats = knex('user_chats')
This is not safe because each query building operation mutates the query builder. You may get away with it if you're running in series but when you run in parallel, both your addUserChat calls are using the same userChats query builder at the same time and weird stuff happens.
Rule of thumb: call knex() once for every query you build.
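If that's the case, a toy model makes the failure mode concrete. The ToyBuilder below is a hypothetical stand-in for a query builder (not knex's actual API): like knex, it defers execution, so a builder shared across parallel calls reads whatever state the last caller wrote - exactly the duplicated { user_id: 4 } insert described in the question. A fresh builder per call behaves correctly.

```javascript
// Toy stand-in for a mutable, deferred-execution query builder (NOT knex).
class ToyBuilder {
  constructor(table) { this.table = table; this.row = null; }
  insert(row) { this.row = row; return this; }   // mutates shared state
  async run() {
    await Promise.resolve();                     // defer, like a real round-trip
    return { table: this.table, row: this.row }; // reads state at run time
  }
}

// Two "parallel" inserts reuse one builder: the second insert() overwrites
// the first before either deferred run() reads the state.
async function sharedBuilderDemo() {
  const shared = new ToyBuilder('user_chat');
  const results = await Promise.all([
    shared.insert({ user_id: 2, chat_id: 3 }).run(),
    shared.insert({ user_id: 4, chat_id: 3 }).run(),
  ]);
  return results.map(r => r.row.user_id); // both rows carry user_id 4
}

// The fix mirrors the rule of thumb: one builder per query.
async function freshBuilderDemo() {
  const results = await Promise.all([
    new ToyBuilder('user_chat').insert({ user_id: 2, chat_id: 3 }).run(),
    new ToyBuilder('user_chat').insert({ user_id: 4, chat_id: 3 }).run(),
  ]);
  return results.map(r => r.row.user_id); // one 2, one 4, as expected
}
```

Running in series hides the bug because each run() completes before the next insert() mutates the builder; Promise.all interleaves them.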

Related

Delay when removing row in postgres with knex

I have a local postgres database running on my machine. I use node.js to access it. I have a table called 'friends' where every row represents a friendship between a user and a friend. I also have a table called 'users' where every row has all the basic info about a user (e.g. name and such).
When I want to remove a friendship between two users I have to remove two rows from the 'friends' table. I do this with this function:
const removeFriend = async (clientId, friendId) => {
  // initiate transaction
  return db
    .transaction((trx) => {
      // remove friendship from client
      trx('friends')
        .where({ user_id: clientId, friend_id: friendId })
        .del()
        .then(() => {
          // remove friendship from friend
          return trx('friends').where({ user_id: friendId, friend_id: clientId }).del();
        })
        // if all good then commit
        .then(trx.commit)
        // if bad then rollback
        .catch(trx.rollback);
    })
    .catch(() => 'error');
};
I call the removeFriend function like this: removeFriend(clientId, friendId)
Then when I want to get a list of all friends with their names from the database I use this function:
const getUserFriends = async (clientId) => {
  // get friends
  return db('friends')
    .where({ user_id: clientId })
    .join('users', 'users.id', 'friends.friend_id')
    .select('friends.friend_id', 'users.name')
    .then((friends) => friends)
    .catch(() => 'error');
};
I call the getUserFriends function like this: await getUserFriends(clientId)
The problem is that when I use the removeFriend function and then directly use the getUserFriends function, I get a list where the users are still friends. However, if I look in the database the rows have been deleted, so naturally I should get a list where the users are not friends. Am I using await wrong or something?
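One thing worth ruling out (an assumption, since the calling code isn't shown): if removeFriend(clientId, friendId) is fired without await, the subsequent getUserFriends query can overtake the delete and return the not-yet-deleted rows. A minimal illustration with an in-memory stand-in for the database (no knex involved):

```javascript
// In-memory stand-in for the 'friends' table (hypothetical, not postgres).
let friends = [{ user_id: 1, friend_id: 2 }, { user_id: 2, friend_id: 1 }];

// Simulated async delete of both rows of a friendship.
const removeFriend = async (a, b) => {
  await Promise.resolve(); // defer, like a real round-trip to the database
  friends = friends.filter(f => !(f.user_id === a && f.friend_id === b));
  friends = friends.filter(f => !(f.user_id === b && f.friend_id === a));
};

const getUserFriends = async (clientId) =>
  friends.filter(f => f.user_id === clientId);

async function demo() {
  removeFriend(1, 2);                    // fire-and-forget: delete not awaited
  const stale = await getUserFriends(1); // read runs before the delete lands
  await removeFriend(1, 2);              // awaited: delete completes first
  const fresh = await getUserFriends(1); // read now sees the deletion
  return { staleCount: stale.length, freshCount: fresh.length };
}
```

The un-awaited call returns the "still friends" row; awaiting the delete before reading does not.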

How can I lookup which objects in a MongoDB collection contain a field with specific information in it?

I have a web application which allows user to bring up a page, called "Cruise Ship". On that page, I would like users to be able to see other cruise ships from the same cruise line. My MongoDB model for the ships has cruiseLine as a field. This is how I have my view controller structured:
exports.getShip = catchAsync(async (req, res, next) => {
  const ship = await Ship.findOne({ slug: req.params.slug }).populate({
    path: 'reviews',
    fields: 'review rating user displayDate ratingsQuantity',
  });
  const reviewCount = await Review.count();
  const allShips = await Ship.find();
  if (!ship) {
    return next(new AppError('Page does not exist. Please try again.', 404));
  }
  res.status(200).render('ship', {
    title: `${ship.shipName} Reviews`,
    ship,
    allShips,
    reviewCount,
  });
});
I tried including something like this, but it returns undefined:
const cruiseLineInfo = await Ship.find({ cruiseLine: ship.cruiseLine })
In my attempt, I was hoping that ship.cruiseLine would be interpreted as the cruise line for the specific ship page (for example, "Carnival Cruise Line") and then cruiseLineInfo would contain all of the ship objects that matched the find query. But alas, it has not worked. Any suggestions would be appreciated.
I was able to accomplish this in Pug using inline Javascript, a loop and an "if" statement.
- var thisCruiseLine = ship.cruiseLine
- var thisShipName = ship.shipName
each ship in allShips
  if ship.cruiseLine == thisCruiseLine && ship.shipName != thisShipName
    p #{ship.shipName}
In Mongoose, .find() only builds a query, but that doesn’t run the query immediately. You could build a query first and then run it some time later.
To execute your query, you can either add a callback as a parameter to .find() or chain an .exec() right behind it:
const cruiseLineInfo = await Ship.find({ cruiseLine: ship.cruiseLine }, (error, response) => {
  // do something with the response
})
In your case, the .exec() seems more appropriate:
const cruiseLineInfo = await Ship.find({ cruiseLine: ship.cruiseLine }).exec()

How to insert bulk data to postgresql db from CSV file?

I have to insert more than 100 records, which are present in a CSV file, into a PostgreSQL db. The code below reads the data from the file but is unable to insert it into the PostgreSQL table. Is there another way to do this, e.g. with csvtojson?
const csv = require('csv');
var csvParser = require('csv-parse');

Controller.uploadCsv = async(data) => {
  fs.createReadStream(data.path)
    .pipe(csvParser({
      delimiter: '\t',
      endLine: '\n',
      escapeChar: '"',
      enclosedChar: '"'
    }))
    .on('data', function(data) {
      console.log(data) // returning in console mentioned below
      console.log(data.name) // is undefined
      const add = {
        name: data.name,
        address: data.address,
        phoneNo: data.phoneNumber,
        email: data.email,
        created_at: new Date(),
        updated_at: new Date()
      };
      const result = await models.table.create(add);
    })
    .on('end', function(data) {
      console.log('reading finished')
    })
}
router.js
router.post('/file', upload.single('file'), (req, res, next) => {
  Controller.uploadCsv(req.file)
    .then((result) => res.json(result))
    .catch(next)
})
console data
[ 'name',
  'address',
  'phoneNumber',
  'email',
  'created_at',
  'updated_at' ]
[ 'aaa',
  'delhi',
  '1102558888',
  'test#gmail.com',
  '2017-10-08T06:17:09.922Z',
  '2018-10-08T06:17:09.922Z' ]
[ 'Oreo',
  'bgl',
  '1112589633',
  'test123#gmail.com',
  '2017-10-08T06:17:09.922Z',
  '2018-10-08T06:17:09.922Z' ]
Insert the async keyword on the 'data' handler function. Remember, it's not sequential execution, so the records may be inserted in a completely different order from one program run to another.
Replace:
.on('data', function(data) {
With:
.on('data', async function(data) {
TL;DR: Your code has a minor error that may be causing your problem - where you use await, you'd need to put async before the function in the data handler for it to run. That may work for small files, but please read on - it's not the right solution; I added one of the proper ways below.
ES6 async/await is a language construct that allows you to await the resolution of a Promise and continue executing the code in an async function. In your code you do have an async function declaration, but you added await in a non-async function. To clarify - the await keyword is only allowed when the closest enclosing function is async - in your case it's not.
I actually don't think your code would even compile, and after some changes you'd fall straight into a problem mentioned in this question - this is because you're trying to run an asynchronous operation in a synchronous event handler in node. The asynchronous inserts into the database will get run, but the end event will fire before the operations are completed.
In order to do this correctly, you could use a transform stream, or abandon streaming altogether and simply use an array from the CSV (there are more than enough good modules for that). I am, however, the author of the scramjet framework and I also think this should simply work as you wrote it, or maybe even simpler.
Here's a code that will do what you want:
const { StringStream } = require('scramjet');

Controller.uploadCsv = async (data) =>
  fs.createReadStream(data.path)
    .pipe(new StringStream('utf-8'))
    .CSVParse({
      delimiter: '\t',
      newline: '\n',
      escapeChar: '"',
      quoteChar: '"'
    })
    .map(data => ({
      name: data.name,
      address: data.address,
      phoneNo: data.phoneNumber,
      email: data.email,
      created_at: new Date(),
      updated_at: new Date()
    }))
    .each(async entry => await models.table.create(entry))
    .each(result => log(result)) // if it's worth logging
    .run();
Scramjet simply uses streams (all classes extend built-in node.js streams) underneath, but exposes an interface similar to synchronous ones on Array etc. You can run your async operations and it returns a Promise from run operation.
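For around 100 rows, the non-streaming route mentioned above can also be sketched without any CSV library at all. The parseTsv helper below is hypothetical (it assumes a tab-delimited file with a header row and no quoted fields); the resulting array could then be handed to a single bulk insert (e.g. Sequelize's bulkCreate, assuming models.table is a Sequelize model) instead of one create() per 'data' event:

```javascript
// Minimal TSV-to-objects parser (hypothetical helper; assumes a header row,
// tab delimiters, and no quoted or escaped fields).
function parseTsv(text) {
  const lines = text.split('\n').filter(line => line.trim() !== '');
  const headers = lines[0].split('\t');
  return lines.slice(1).map(line => {
    const cells = line.split('\t');
    const row = {};
    headers.forEach((h, i) => { row[h] = cells[i]; }); // header -> cell value
    return row;
  });
}

// Usage sketch: read the whole file, parse it, then do ONE bulk insert.
// const rows = parseTsv(fs.readFileSync(data.path, 'utf-8'));
// await models.table.bulkCreate(rows.map(r => ({
//   name: r.name, address: r.address, phoneNo: r.phoneNumber, email: r.email,
//   created_at: new Date(), updated_at: new Date(),
// })));
```

Because the whole array exists before any insert starts, there is no 'end'-fires-early problem to work around.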

How to update some data based on array value in Mongoose?

I'd like to update some data in Mongoose by using an array value that I've found before.
Company.findById(id_company, function(err, company) {
  if (err) {
    console.log(err);
    return res.status(500).send({message: "Error, check the console. (Update Company)"});
  }
  const Students = company.students;
  User.find({'_id': {"$in": Students}}, function(err, users) {
    console.log(Students);
    // WANTED QUERY : Update company = null from Users where _id = Students[];
  });
});
Students is an array of users' _ids, and I use it to find the user objects. Then I want to set a field inside those user objects to null; that field is named "company". How can I do that? Thank you.
From what you posted (I took the liberty to use Promises but you can roughly achieve the same thing with callbacks), you can do something like:
User.find({'_id': {"$in": Students}})
  .then(users => {
    return Promise.all(users.map(user => {
      user.company = null;
      return user.save()
    }));
  })
  .then(() => {
    console.log("yay");
  })
  .catch(e => {
    console.log("failed");
  });
Basically, what I'm doing here is making sure all the user models returned by the .find() call are saved properly, by checking the Promised value returned by .save()ing each of them.
If one of these fails for some reason, Promise.all() returns a rejection you can catch afterwards.
However, in this case each item is mapped to its own query to your database, which is not good. A better strategy would be to use Model.update(), which achieves the same thing in intrinsically fewer database queries.
User.update({
  '_id': {"$in": Students}
}, {
  'company': <Whatever you want>
})
.then()
Use .update but make sure you pass the option {multi: true}, something like:
User.update(query, {company: null}, {multi: true}, function(err, result) { ... });

Sequelize query inside then()

Trying to:
get a list of users
from the user details get the trips created by the users
and based on the output performing some actions
The following is the code I am trying to run.
models.user.findAll({})
  .then(function (users) {
    for (var i = 0; i < users.length; i++) {
      var userName = users[i].name;
      var userEmail = users[i].email;
      models.trip.findOne({ attributes: [ [userName, 'name'], [userEmail, 'email'], 'id' ], where: { userId: users[i].id } })
        .then(function (trip) {
          if (trip == null) {
            // Send Emails
          }
        });
    }
  })
  .catch(function (error) {
    console.log(">>>>>", error);
  });
Due to the callbacks, the second Sequelize query does not run correctly.
Can you please advise on how to approach this issue? Is it by using async/await or co/yield?
You should debug this by using console.log. First try to print your first callback result - maybe the database connection is not properly configured, maybe the table is empty; there are many reasons. Also, it's more convenient to use the .forEach method instead of a 'for' loop:
array.forEach((item, i, arr) => {
  ///....
})
Don't run async methods inside a for loop (or any other loop). It is better to use Promise.all or an async library.
The code will be like that.
var tasks = []
users.forEach(function (user) {
  tasks.push(models.trip.findOne({ /* some attrs */ }).then(function (trip) {
    if (trip) return Promise.resolve()
    return sendEmailPromise(user)
  }))
})
Promise.all(tasks).then(function () {
  // done
}).catch(errorHandler)
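The collect-then-await pattern above can be exercised with plain promises; findTrip and sendEmail here are hypothetical stand-ins for the Sequelize lookup and the mailer:

```javascript
// Hypothetical stand-ins: user 1 has a trip, everyone else does not.
const findTrip = async (userId) => (userId === 1 ? { id: 10 } : null);
const sendEmail = async (user) => `emailed:${user.email}`;

async function notifyUsersWithoutTrips(users) {
  // Build the whole task list first; every lookup starts immediately.
  const tasks = users.map(user =>
    findTrip(user.id).then(trip => {
      if (trip) return null;    // user has a trip: nothing to do
      return sendEmail(user);   // no trip found: send the email
    })
  );
  // Promise.all resolves once every task has settled, in input order.
  return (await Promise.all(tasks)).filter(Boolean);
}
```

Unlike the for-loop version, any rejection anywhere in the list surfaces through the single .catch on Promise.all.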
If the models had associations, you could just include the other model and add a where clause to it.
Sequelize Docs
User.hasMany(Trips, { foreignKey: 'userId' });

User.findAll({
  include: {
    model: Trips,
    where: { userId: id }
  }
});
