I'm currently working on a project in Express, and I'm using Knex.js to handle migrations and queries.
I'm still trying to grasp the concept of promises and how I can run multiple queries with Knex.
I have the following code, which inserts a new record into my database; this is located in my Unit model file.
this.addUnit = function(unit_prefixV, unit_nameV, unit_descriptionV, profile_id) {
    return new Promise(function(resolve, reject) {
        knex.insert({ unit_prefix: unit_prefixV, unit_name: unit_nameV, unit_description: unit_descriptionV })
            .into('units').then(function(unit) {
                resolve(unit)
            }).catch(function(error) {
                reject(error)
            })
    })
}
In my routes.js file I then call this on a post request, like so:
app.post('/dashboard/unit/add', ensureAuthenticated, function(req, res) {
    let postErrors = []
    if (req.body.unit_name.trim() == "") {
        postErrors.push('Unit name cannot be empty.')
    }
    if (req.body.unit_prefix.trim() == "") {
        postErrors.push('Unit prefix cannot be empty.')
    }
    if (req.body.unit_description.trim() == "") {
        postErrors.push('Unit description cannot be empty.')
    }
    if (postErrors.length > 0) {
        res.render('addUnit', { errors: postErrors, user: req.user })
    } else {
        unitModel.addUnit(req.body.unit_prefix.trim(), req.body.unit_name.trim(), req.body.unit_description.trim(), req.session.passport.user.id).then(function(unit) {
            res.redirect('/dashboard')
        })
    }
})
This successfully inserts a new record into my units table. However, I would like to select the user id from the users table with the matching profile_id and then insert another record into my users_units table, all within the this.addUnit function.
For reference my users table consists of:
id
google_id
my users_units table consists of:
user_id
unit_id
I've made an attempt to chain the queries, but it only executed the initial insert query and not the others. Here is that rather ugly attempt:
this.addUnit = function(unit_prefixV, unit_nameV, unit_descriptionV, profile_id) {
    return new Promise(function(resolve, reject) {
        knex.insert({ unit_prefix: unit_prefixV, unit_name: unit_nameV, unit_description: unit_descriptionV })
            .into('units').then(function(unit) {
                knex('users').where({ "google_id": profile_id }).select('id').then(function(uid) {
                    knex.insert({ user_id: uid, unit_id: unit }).into('users_units').then(function(user_units) {
                        resolve(user_unit)
                    }).catch(function(error) {
                        reject(error)
                    })
                    resolve(uid)
                })
                console.log(unit)
                resolve(unit)
            }).catch(function(error) {
                reject(error)
            })
    })
}
Any help will be greatly appreciated!
You're nearly there. There are just a few simple points to grasp:
A Promise can be resolved only once.
An explicit Promise is not needed anyway, because a naturally occurring promise can be returned.
Return a Promise at each stage ...
... until the innermost stage, from which the returned value is the finally delivered result.
Errors needn't be explicitly handled unless you want to inject your own custom error messages or take remedial action.
Having taken all that on board, you might write:
this.addUnit = function(unit_prefixV, unit_nameV, unit_descriptionV, profile_id) {
    return knex.insert({ 'unit_prefix':unit_prefixV, 'unit_name':unit_nameV, 'unit_description':unit_descriptionV }).into('units')
//  ^^^^^^
    .then(function(unit) {
        return knex('users').where({ 'google_id':profile_id }).select('id')
//      ^^^^^^
        .then(function(uid) {
            return knex.insert({ 'unit_id':unit, 'user_id':uid }).into('users_units')
//          ^^^^^^
            .then(function(user_units) {
                return { 'unit_id':unit, 'user_id':uid, 'user_units':user_units };
//              ^^^^^^
            });
        });
    });
}
If the caller is interested only in the success/failure of the process, and not the full { unit, uid, user_units } object, then the innermost .then() can be omitted:
this.addUnit = function(unit_prefixV, unit_nameV, unit_descriptionV, profile_id) {
    return knex.insert({ 'unit_prefix':unit_prefixV, 'unit_name':unit_nameV, 'unit_description':unit_descriptionV }).into('units')
    .then(function(unit) {
        return knex('users').where({ 'google_id':profile_id }).select('id')
        .then(function(uid) {
            return knex.insert({ 'unit_id':unit, 'user_id':uid }).into('users_units');
        });
    });
}
The promise returned by .addUnit() will still deliver user_units, which the caller can use or ignore.
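For completeness, a hedged sketch of how the route might consume .addUnit(), with a .catch so a failed insert doesn't leave the request hanging; re-rendering the form with the error message is an assumed choice, not part of the original code:

unitModel.addUnit(req.body.unit_prefix.trim(), req.body.unit_name.trim(), req.body.unit_description.trim(), req.session.passport.user.id)
    .then(function(unit) {
        res.redirect('/dashboard')
    })
    .catch(function(error) {
        // assumed handling: surface the failure back on the same form
        res.render('addUnit', { errors: [error.message], user: req.user })
    })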
There's a major proviso to these solutions (and others): a multi-stage update like this should really be wrapped in a transaction, i.e. something that allows earlier stages to be rolled back. Otherwise a failure part way through is likely to leave the database in an indeterminate state. This answer is as good a starting point as any; a sketch follows below.
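For illustration, a minimal sketch of the same flow wrapped in knex.transaction. Inside the callback, the trx object replaces knex; if any step rejects, knex rolls the whole transaction back. Note that .select('id') resolves with an array of rows and .insert with an array of ids, so the [0] indexing here is an assumption about your schema and driver:

this.addUnit = function(unit_prefixV, unit_nameV, unit_descriptionV, profile_id) {
    return knex.transaction(function(trx) {
        return trx.insert({ unit_prefix: unit_prefixV, unit_name: unit_nameV, unit_description: unit_descriptionV })
            .into('units')
            .then(function(unit) {
                return trx('users').where({ google_id: profile_id }).select('id')
                    .then(function(uid) {
                        // unit is an array of inserted ids, uid an array of rows (assumed)
                        return trx.insert({ unit_id: unit[0], user_id: uid[0].id }).into('users_units');
                    });
            });
    });
}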
Related
I'm currently implementing the admin dashboard of an online shopping app. I want to implement a method to delete a user and temporarily store that deleted user's data in another collection.
(Copy user data -> save it in another collection -> delete the original data)
As an example, my user data is currently in a collection called users; after deleting a user, that particular user's data must be available in another collection, let's say a deleted_users collection. Is there an easy way to do that? Thanks!
You will have to modify some of this code, but this is the basic logic:
Use aggregation to copy collections over.
Refer here for the aggregate function using the mongo client.
So the function looks like this:
public aggregation(collectionName: string, pipelines: Object[]): Promise<Array<any>>
{
    return new Promise((resolve, reject) =>
    {
        let cursor: mongodb.AggregationCursor<any> = null;
        // Here you will use your own getCollection method to fetch the collection
        this.getCollection(collectionName)
            .then((collection: mongodb.Collection) =>
            {
                cursor = collection.aggregate(pipelines);
                return cursor.toArray();
            })
            .then((result: Array<any>) =>
            {
                return resolve(result);
            })
            .catch((error: any) =>
            {
                return reject(error);
            });
    });
}
public dropCollection(collectionName: string): Promise<any>
{
    return new Promise((resolve, reject) =>
    {
        this.getCollection(collectionName)
            .then((collection: mongodb.Collection) =>
            {
                collection.drop((err: Error, result: any) =>
                {
                    if (err)
                    {
                        return reject(err);
                    }
                    return resolve(result);
                });
            })
            .catch(reject);
    });
}
public async backupAndDrop()
{
    // $match: {} selects every document; $out writes them all to the target collection
    const pipeline = [ { $match: {} }, { $out: "DeletedCollection" } ];
    try
    {
        await this.aggregation("originalCollection", pipeline);
        await this.dropCollection("originalCollection");
    }
    catch (e)
    {
        throw e;
    }
}
Also, try running this in your mongo shell:
db.originalCollection.aggregate([ { $match: {} }, { $out: "Backup" } ])
Why don't you add a flag like isDeleted, which is false by default, and then set it to true when the user is deleted?
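A minimal sketch of that soft-delete idea (the collection name, field names, and the userId variable are assumptions for illustration):

// Flip a flag instead of moving the document anywhere
db.collection('users').updateOne(
    { _id: userId },
    { $set: { isDeleted: true, deletedAt: new Date() } }
);
// Normal reads would then filter with { isDeleted: { $ne: true } }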
You can do something like this...
Client.connect(connection_string, function(err, db) {
    if (err) {
        console.log(err);
    }
    else {
        db.collection(CollectionA).find().forEach(function(d) { db.collection(CollectionB).insert(d); });
    }
});
Try it out and see if it works.
This can help too:
How to properly reuse connection to Mongodb across NodeJs application and modules
You can first find the record to be deleted, create a document with that data in the new collection, and then delete the original record.
db.collection(CollectionA).findOne({ _id: userIdToDelete }, function(err, res) {
    db.collection(CollectionB).insertOne(res, function() {
        db.collection(CollectionA).deleteOne({ _id: userIdToDelete });
    });
});
I need to query my database for users based on an array of emails and then execute a function for each result, which I do with eachAsync:
mongoose.model('User')
    .find({ email: { $in: ['foo@bar.com', 'bar@foo.com'] } })
    /* -- Run side effects before continuing -- */
    .cursor()
    .eachAsync((doc) => {
        // do stuff
    });
The problem I'm having is that I need to return a 404 status if any of the users with the given emails do not exist.
I've been looking through the mongoose docs but I can't seem to find a way of running "side effects" when working with queries. Simply "resolving" the DocumentQuery with .then doesn't work since you can't turn it into a cursor afterwards.
How can I achieve this?
You could try implementing it as shown below. I hope it helps.
// Function using async/await
getCursor: async (_, res) => {
    try {
        const result = []; // To hold the result of the cursor
        const searchArray = ['foo@bar.com', 'bar@foo.com'];
        let hasError = false; // to track an error when an email isn't in the array
        const cursor = await mongoose.model('User').find({ email: { $in: searchArray } }).cursor();
        // NOTE: Use cursor.on('data') to read the stream of data passed
        cursor.on('data', (cursorChunk) => {
            // NOTE: Run your side effect before continuing
            if (searchArray.indexOf(cursorChunk.email) === -1) {
                if (!hasError) { // guard so the 404 is only sent once
                    hasError = true;
                    res.status(404).json({ message: 'Resource not found!' });
                }
            } else {
                // NOTE: Push the chunk to the result array if you need it
                result.push(cursorChunk);
            }
        });
        // NOTE: listen to cursor.on('end')
        cursor.on('end', () => {
            // Do stuff or return the result to the client
            if (!hasError) {
                res.status(200).json({ result, success: true });
            }
        });
    } catch (error) {
        // Do error log and/or return to client
        res.status(404).json({ error, message: 'Resource not found!' });
    }
}
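An alternative sketch, assuming each email matches at most one user and that this runs inside an async route handler: count the matches up front with countDocuments, send the 404 before streaming, and then iterate with eachAsync as in the question:

// countDocuments and eachAsync are standard Mongoose query APIs
const emails = ['foo@bar.com', 'bar@foo.com'];
const count = await mongoose.model('User').countDocuments({ email: { $in: emails } });
if (count < emails.length) {
    // at least one email has no matching user
    return res.status(404).json({ message: 'One or more users do not exist' });
}
await mongoose.model('User')
    .find({ email: { $in: emails } })
    .cursor()
    .eachAsync(async (doc) => {
        // do stuff
    });
res.status(200).json({ success: true });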
Hello, I am new to PostgreSQL and I wanted to learn how one handles a query that returns 0 results, since an error is thrown. Essentially I want to get a user if one exists, return null if one doesn't, and have an error handler. Below is the current code I am using. Any tips on a better way to do this are appreciated!
var options = {
    // Initialization Options
    promiseLib: promise
};
var pgp = require('pg-promise')(options);
var connectionString = 'postgres://localhost:5432/myDbName';
var db = pgp(connectionString);

function getUser(id) {
    let user = new Promise(function(resolve, reject) {
        try {
            db.one('select * from users where loginName = $1', id).then(function(data) {
                console.log(data);
                resolve(data);
            }).catch(function(e) {
                console.log('error: ' + e);
                reject(e);
            });
        }
        catch (e) {
            console.log('error: ' + e);
            reject(e);
        }
    });
    return user;
}
output in console:
error: QueryResultError {
code: queryResultErrorCode.noData
message: "No data returned from the query."
received: 0
query: "select * from users where loginName = 'someUserName'"
}
I am the author of pg-promise.
In the realm of promises one uses .then to handle all normal situations and .catch to handle all error situations.
Translated into pg-promise, which adheres to that rule, you execute a database method that resolves with results that represent all the normal situations, so anything else ends up in .catch.
Case in point, if returning one or no rows is a normal situation for your query, you should be using method oneOrNone. It is only when returning no row is an invalid situation that you would use method one.
As per the API, method oneOrNone resolves with the data row found, or with null when no row is found, which you can then check:
db.oneOrNone('select * from users where loginName = $1', id)
    .then(user => {
        if (user) {
            // user found
        } else {
            // user not found
        }
    })
    .catch(error => {
        // something went wrong;
    });
If, however, you have a query for which returning no data does represent an error, the proper way of checking for returning no rows would be like this:
var QRE = pgp.errors.QueryResultError;
var qrec = pgp.errors.queryResultErrorCode;
db.one('select * from users where loginName = $1', id)
    .then(user => {
        // normal situation;
    })
    .catch(error => {
        if (error instanceof QRE && error.code === qrec.noData) {
            // found no row
        } else {
            // something else is wrong;
        }
    });
Similar considerations are made when choosing method many vs manyOrNone (method any is a shorter alias for manyOrNone).
Type QueryResultError has a very friendly console output, just like all other types in the library, to give you a good idea of how to handle the situation.
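For illustration, a quick sketch of that choice (the query and table are assumptions):

// db.many rejects when zero rows are returned;
// db.any (alias of manyOrNone) resolves with an empty array instead.
db.any('select * from users where active = $1', [true])
    .then(users => {
        // users is [] when nothing matched; no error is thrown
    });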
In your catch handler for the query, just test for that error. Looking at the pg-promise source code, the code for noData is 0. So just do something like this:
db.one('select * from users where loginName = $1', id).then(function(data) {
    console.log(data);
    resolve(data);
}).catch(function(e) {
    if (e.code === 0) {
        return resolve(null); // return here so we don't fall through and also reject
    }
    console.log('error: ' + e);
    reject(e);
});
I would like to know if it's possible to run a series of SQL statements and have them all committed in a single transaction.
The scenario I am looking at is where an array has a series of values that I wish to insert into a table, not individually but as a unit.
I was looking at the following item, which provides a framework for transactions in node using pg. The individual queries appear to be nested within one another, so I am unsure of how this would work with an array containing a variable number of elements.
https://github.com/brianc/node-postgres/wiki/Transactions
var pg = require('pg');

var rollback = function(client, done) {
    client.query('ROLLBACK', function(err) {
        //if there was a problem rolling back the query
        //something is seriously messed up. Return the error
        //to the done function to close & remove this client from
        //the pool. If you leave a client in the pool with an unaborted
        //transaction weird, hard to diagnose problems might happen.
        return done(err);
    });
};

pg.connect(function(err, client, done) {
    if (err) throw err;
    client.query('BEGIN', function(err) {
        if (err) return rollback(client, done);
        //as long as we do not call the `done` callback we can do
        //whatever we want...the client is ours until we call `done`
        //on the flip side, if you do call `done` before either COMMIT or ROLLBACK
        //what you are doing is returning a client back to the pool while it
        //is in the middle of a transaction.
        //Returning a client while it's in the middle of a transaction
        //will lead to weird & hard to diagnose errors.
        process.nextTick(function() {
            var text = 'UPDATE account SET money = money + $1 WHERE id = $2';
            client.query(text, [100, 1], function(err) {
                if (err) return rollback(client, done);
                client.query(text, [-100, 2], function(err) {
                    if (err) return rollback(client, done);
                    client.query('COMMIT', done);
                });
            });
        });
    });
});
My array logic is:
banking.forEach(function(batch) {
    client.query(text, [batch.amount, batch.id], function(err, result) {
        // handle err / result for each element
    });
});
pg-promise offers very flexible support for transactions. See Transactions.
It also supports partial nested transactions, aka savepoints.
The library implements transactions automatically, which is what should be used these days, because too many things can go wrong if you try to organize a transaction manually, as in your example.
See a related question: Optional INSERT statement in a transaction
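For illustration, a minimal sketch of the array scenario using pg-promise's db.tx and t.batch (the banking array and account table come from the question; the exact SQL is an assumption):

db.tx(t => {
    // build one query per array element; all of them commit or roll back together
    const queries = banking.map(batch =>
        t.none('UPDATE account SET money = money + $1 WHERE id = $2', [batch.amount, batch.id])
    );
    return t.batch(queries); // resolves when every query in the array succeeds
})
    .then(() => {
        // transaction committed
    })
    .catch(error => {
        // transaction rolled back
    });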
Here's a simple TypeScript solution that avoids pg-promise:
import { PoolClient } from "pg"
import { pool } from "../database"
const tx = async (callback: (client: PoolClient) => Promise<void>) => {
    const client = await pool.connect();
    try {
        await client.query('BEGIN')
        try {
            await callback(client)
            await client.query('COMMIT')
        } catch (e) {
            await client.query('ROLLBACK')
            throw e // rethrow so the caller knows the transaction failed
        }
    } finally {
        client.release()
    }
}
export { tx }
Usage:
...
let result;
await tx(async client => {
const { rows } = await client.query<{ cnt: string }>('SELECT COUNT(*) AS cnt FROM users WHERE username = $1', [username]);
result = parseInt(rows[0].cnt) > 0;
});
return result;
Here is my code:
server.get(url_prefix + '/user/:user_id/photos', function(req, res, next) {
    if (!req.headers['x-session-id']) {
        res.send({
            status: {
                error: 1,
                message: "Session ID not present in request header"
            }
        })
    } else {
        User.findOne({
            session_id: req.headers['x-session-id']
        }, function(err, user) {
            if (user) {
                var user_id = req.params.user_id
                Album.find({ userId: user_id })
                    .populate('images')
                    .exec(function(err, albums) {
                        if (albums) {
                            albums.forEach(function(album, j) {
                                var album_images = album.images
                                album_images.forEach(function(image, i) {
                                    Like.findOne({ imageID: image._id, userIDs: user._id }, function(err, like) {
                                        if (like) {
                                            albums[j].images[i].userLike = true;
                                        }
                                    })
                                })
                            })
                            return res.send({
                                status: {
                                    error: 0,
                                    message: "Successful"
                                },
                                data: {
                                    albums: albums
                                }
                            })
                        } else
                            return notify_error(res, "No Results", 1, 404)
                    })
            }
            else {
                res.send({
                    status: {
                        error: 1,
                        message: "Invalid Session ID"
                    }
                })
            }
        })
    }
})
I am trying to add an extra value (albums[j].images[i].userLike = true;) to my images array, which is inside the album array.
The problem is that return res.send({...}) sends the data before we get a response from the forEach.
How can I make it work, so that the return only happens after the forEach has completed all of its iterations?
You will have to wait to invoke res.send until you have fetched all the likes for all the images in each of the albums. E.g.
var pendingImageLikes = album_images.length;
album_images.forEach(function(image, i) {
    Like.findOne({ imageID: image._id, userIDs: user._id }, function(err, like) {
        if (like) {
            albums[j].images[i].userLike = true;
        }
        if (!--pendingImageLikes) {
            // we fetched all likes
            res.send(
                // ...
            );
        }
    });
});
You might need to special case for album_images.length === 0.
Also, this does not take into account that you have multiple albums with multiple images each. You would have to delay res.send there in a very similar way to make this actually work. You might want to consider using a flow control library like first (or any other of your preference; just search for "flow control library") to make this a bit easier.
Also, you might want to consider not relying on semicolon insertion and manually type your semicolons. It prevents ambiguous expressions and makes the code easier to read.
Since you need your code to wait until all of the find operations have completed, I'd suggest you consider using the async package, and specifically something like each (reference). It makes using async loops cleaner, especially when dealing with MongoDB documents and queries. There are lots of nice features, including the ability to sequentially perform a series of functions or waterfall (when you want to perform a series, but pass the results from step to step).
> npm install async
Add to your module:
var async = require("async");
Your code would look something like this:
async.each(albums, function(album, albumDone) {
    async.each(album.images, function(image, imageDone) {
        Like.findOne({ imageID: image._id, userIDs: user._id }, function(err, like) {
            if (!err && like) {
                image.userLike = true;
            }
            imageDone(err); // callback that this image has finished
        });
    }, albumDone); // called when all of this album's images are done
}, function(err) { // called when every album has called albumDone()
    if (!err) {
        return res.send({
            status: {
                error: 0,
                message: "Successful"
            },
            data: {
                albums: albums
            }
        });
    }
    return notify_error(res, "No Results", 1, 404);
});