Knex.js: Create table and insert data - node.js

Given that I have a Knex.js script like this:
exports.up = function(knex, Promise) {
  return knex.schema.createTable('persons', function(table) {
    table.increments('id').primary();
    table.string('name').notNullable();
  });
};
which currently creates a table.
How do I add subsequent insert statements to this script?
What I want to do is add a line like this (or similar):
knex.insert({id: 1, name: 'Test'}).into('persons')
I'm not sure I understand how this promise-based approach works. Am I supposed to write another script with insert statements? Or can I somehow append them to my existing script?
Unfortunately, I can't find any complete example of create + insert in the Knex.js documentation.

The then method returns a Promise, which you can use to implement insertion after you have created the table. For example:
exports.up = function (knex, Promise) {
  return Promise.all([
    knex.schema.createTableIfNotExists("payment_paypal_status", function (table) {
      table.increments(); // integer id
      table.string('name');
      table.string('description');
    }).then(function () {
      return knex("payment_paypal_status").insert([
        {name: "A", description: "A"},
        {name: "B", description: "BB"},
        {name: "C", description: "CCC"},
        {name: "D", description: "DDDD"}
      ]);
    })
  ]);
};

exports.down = function (knex, Promise) {
  return Promise.all([
    knex.schema.dropTableIfExists("payment_paypal_status")
  ]);
};

With modern JavaScript's async/await keywords, you could do it like this:
exports.up = async function(knex) {
  await knex.schema.createTable('persons', function(table) {
    table.increments('id').primary();
    table.string('name').notNullable();
  });
  // You could replace "return" with "await" here if you wish.
  return knex.insert({id: 1, name: 'Test'}).into('persons');
};
It's basically the same, except using async/await instead of then.

As in the first answer, the then method returns a Promise, which you can use to chain the insertion after the table has been created. For example:
exports.up = (knex) => {
  return knex.schema
    .createTable("payment_paypal_status", (table) => {
      table.increments()
      table.string("name")
      table.string("description")
    })
    .then(() =>
      knex("payment_paypal_status").insert([
        {name: "A", description: "A"},
        {name: "B", description: "BB"},
        {name: "C", description: "CCC"},
        {name: "D", description: "DDDD"},
      ])
    )
}

exports.down = (knex) => {
  return knex.schema.dropTableIfExists("payment_paypal_status")
}
PS:
Since .createTableIfNotExists actually just generates a plain "CREATE TABLE IF NOT EXISTS..." query, it will not work correctly if any alter-table queries are generated for the columns afterwards. To avoid breaking old migrations this function is left untouched for now, but it should not be used when writing new code, and it has been removed from the documentation.
I used Fareed Alnamrouti's example/code and followed Jonny Rathbone's suggestion to discard Promise.all.
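Given the deprecation note above, here's a hedged sketch of the replacement pattern: check knex.schema.hasTable before creating (table and column names reused from the example above):
exports.up = async function (knex) {
  // hasTable resolves to a boolean; only create the table when it is absent
  const exists = await knex.schema.hasTable('payment_paypal_status');
  if (!exists) {
    await knex.schema.createTable('payment_paypal_status', (table) => {
      table.increments();
      table.string('name');
      table.string('description');
    });
  }
};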

Related

How to update some data based on array value in Mongoose?

I'd like to update some data in Mongoose by using an array value that I've found before.
Company.findById(id_company, function(err, company) {
  if (err) {
    console.log(err);
    return res.status(500).send({message: "Error, check the console. (Update Company)"});
  }
  const Students = company.students;
  User.find({'_id': {"$in": Students}}, function(err, users) {
    console.log(Students);
    // WANTED QUERY : Update company = null from Users where _id = Students[];
  });
});
Students returns the users' _id values in an array (with objects inside), and I use that to find the user objects. Then I want to set a field named "company" inside each user object to null. How can I do that? Thank you.
From what you posted (I took the liberty of using Promises, but you can roughly achieve the same thing with callbacks), you can do something like:
User.find({'_id': {"$in": Students}})
  .then(users => {
    return Promise.all(users.map(user => {
      user.company = null;
      return user.save();
    }));
  })
  .then(() => {
    console.log("yay");
  })
  .catch(e => {
    console.log("failed");
  });
Basically, what I'm doing here is making sure all the user models returned by the .find() call are saved properly, by checking the Promise returned by each .save().
If one of them fails for some reason, Promise.all() returns a rejection you can catch afterwards.
However, in this case each item is mapped to a separate query to your database, which is not good. A better strategy is to use Model.update(), which achieves the same thing in, intrinsically, fewer database queries.
User.update({
  '_id': {"$in": Students}
}, {
  'company': <Whatever you want>
})
.then()
Use .update, but make sure you pass the option {multi: true}, something like:
User.update({'_id': {"$in": Students}}, {company: null}, {multi: true}, function(err, result) { ... });
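Note: newer Mongoose versions deprecate update() in favour of updateMany(), which updates every matching document without needing {multi: true}. A minimal sketch:
// updateMany applies the change to all documents matching the filter
User.updateMany({ '_id': { "$in": Students } }, { company: null })
  .then(result => console.log(result))
  .catch(err => console.log(err));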

Knex Inserting Same Record Twice when Query Executed Asynchronously

In my /api/v1/chats POST route I make an async call to user_chat_queries.addUserChat for each user_id passed in the request body. The idea is that if lots of user_id's come in, I don't want to have each insertion await; instead I'd like to dispatch all the insertions asynchronously, so I do:
Asynchronous Route handler (https://github.com/caseysiebel/lang-exchange/blob/master/src/server/routes/chats.js#L57):
await Promise.all(user_ids.map((user_id) => {
  return user_chat_queries.addUserChat(user_id, chat.id);
}));
As opposed to,
Synchronously:
for (let user_id of user_ids) {
  await user_chat_queries.addUserChat(user_id, chat.id);
}
And in the user_chat_queries (https://github.com/caseysiebel/lang-exchange/blob/master/src/server/db/queries/user_chat.js#L5):
addUserChat: (async (user_id, chat_id) => {
  const user_chat = await userChats
    .insert({ user_id, chat_id })
    .returning('*');
  const data = await db('user_chat').select('*');
  return user_chat;
}),
Now the route is accessed from my test file: (https://github.com/caseysiebel/lang-exchange/blob/master/test/routes.chats.test.js#L83)
it('should add 2 user_chats', (done) => {
  chai.request(server)
    .post('/api/v1/chats')
    .send({
      created_at: Date.now(),
      user_ids: [2, 4]
    })
    .end((err, res) => {
      should.not.exist(err);
      res.status.should.equal(201);
      res.type.should.equal('application/json');
      res.body.status.should.eql('success');
      const chat = res.body.data;
      chat.should.include.keys('id', 'created_at');
      knex('user_chat')
        .select('*')
        .then((data) => console.log('data', data));
      done();
    });
});
The log shows that { user_id: 4, chat_id: 3 } is inserted into the user_chat table twice.
The expected result (and the result when executed synchronously) is one { user_id: 2, chat_id: 3 } record and one { user_id: 4, chat_id: 3 } record are inserted into the user_chat table.
I can't track down what is causing this. It seems to me that addUserChat should insert a record made from the inputs it is passed each time, regardless of when it resolves.
Full code base: https://github.com/caseysiebel/lang-exchange
Debugging console output: https://gist.github.com/caseysiebel/262997efdd6467c72304ee783dadd9af#file-console-L5
You need to use a mapSeries-type mechanism (which is not available in the default Promise API), so you may need to use a third-party library like Bluebird.
More details: Bluebird mapSeries.
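For example, a sketch of the route handler using Bluebird's mapSeries, which runs the inserts strictly one at a time (assuming bluebird is installed):
const Promise = require('bluebird');

// Inside the async route handler: each addUserChat call starts only
// after the previous one has resolved.
await Promise.mapSeries(user_ids, (user_id) =>
  user_chat_queries.addUserChat(user_id, chat.id)
);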
You don't show what userChats is in your code examples, but it looks like you may have a global instance of knex that you reuse across different queries, such as:
const userChats = knex('user_chats')
This is not safe because each query building operation mutates the query builder. You may get away with it if you're running in series but when you run in parallel, both your addUserChat calls are using the same userChats query builder at the same time and weird stuff happens.
Rule of thumb: call knex() once for every query you build.
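Applied to the code above, a sketch of addUserChat that builds a fresh query per call (assuming db is the knex instance already used in that query file):
addUserChat: (async (user_id, chat_id) => {
  // db('user_chat') returns a brand-new query builder on every call, so
  // concurrent inserts no longer share and mutate the same builder.
  const user_chat = await db('user_chat')
    .insert({ user_id, chat_id })
    .returning('*');
  return user_chat;
}),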

Bookshelf.js - Passing models to collection.attach()

The Bookshelf documentation indicates that I should be able to pass an array of models into collection.attach(), and it demonstrates this with the following code:
var admin1 = new Admin({username: 'user1', password: 'test'});
var admin2 = new Admin({username: 'user2', password: 'test'});
Promise.all([admin1.save(), admin2.save()])
  .then(function() {
    return Promise.all([
      new Site({id: 1}).admins().attach([admin1, admin2]),
      new Site({id: 2}).admins().attach(admin2)
    ]);
  })
It doesn't seem to work in my case, however. My save operation works fine if I pass in an array of ids:
export function create({ blocks, tags, ...rest }) {
  const attributes = {
    blocks: JSON.stringify(blocks),
    ...rest
  };
  return Post.forge(attributes).save().then(post => {
    return post.tags().attach(tags.map(t => t.id)).then(() => post.refresh());
  });
}
However, if I try to pass in the tags instead, like this:
return post.tags().attach(tags).then(() => post.refresh())
Then I receive an error:
Unhandled rejection error: column "id" of relation "posts_tags" does not exist
Am I misreading the documentation? Should I not be able to do this?

Mongoose/Express: how to retrieve only the values of the array

I am new to Node.js and want to do the following: write a query to fetch the annotations key (array values) from MongoDB and pass only those array values as an argument to the second query.
Here is my code
// create the carousel based on the associated stills using keyword annotations
function findAssociatedArchivalStills(carousels, taskCb) {
  async.waterfall([
    // 1st query
    function findAssociatedAnnotations(archiveId, taskCb) {
      Archive.findAnnotations(archiveId, function onResult(err, annotations) {
        console.log(annotations);
        taskCb(err, annotations);
      });
    },
    // 2nd query
    function findAssociatedStills(annotations, taskCb) {
      Still.findAssociatedStills(annotations, function onResult(err, stills) {
        taskCb(err, stills);
      });
    },
    function buildCarousel(stills, taskCb) {
      return taskCb(null, new Carousel({
        title: 'Related Stills',
        items: stills,
      }));
    },
  ], function onFinish(err) {
    taskCb(null, carousels);
  });
},
// done building the associated Episodes carousel
], function onFinish(err, carousels) {
  handleResponse(err, res, carousels);
});
});
The methods are defined as follows
1st query definition in model
schema.statics.findAnnotations = function findAnnotations(archiveId, cb) {
  this.findOne()
    .where('_id', types.ObjectId(archiveId))
    .select({'keywords': 1, '_id': 0})
    .exec(cb);
};
2nd query definition in model
schema.statics.findAssociatedStills = function findAssociatedStills(Annotations, cb) {
  this.find()
    .where({$text: {$search: Annotations}}, {score: {$meta: "textScore"}})
    .sort({score: {$meta: "textScore"}})
    .limit(2)
    .exec(cb);
};
THE PROBLEM IS
When I run the 1st query, it returns the following:
{ keywords:
   [ 'IRELAND',
     'ELECTIONS',
     'PARTY_POLITICAL_BROADCASTS',
     'FINE_GAEL' ] }
But the input to the next query should be only the values, such as
'IRELAND', 'ELECTIONS', 'PARTY_POLITICAL_BROADCASTS', 'FINE_GAEL'
How can I filter only the values of the array, without the key, from the result?
I know what the equivalent query would be in the MongoDB shell:
db.archives.episodes.find({_id:ObjectId("577cd9558786332020aff74c")}, {keywords:1, _id:0}).forEach( function(x) { print(x.keywords); } );
Is it better to filter in the query itself, or is the right way to filter in the returned result?
Please advise. Thanks for your time.
You're using series, not waterfall. And your archiveId cannot be set in the first function; you need to set it up before async.waterfall.
Here's the right syntax (with waterfall):
function findAssociatedArchivalStills(carousels, masterCallback) {
  var archiveId = 'yourArchiveId';
  async.waterfall([
    // 1st query
    function findAssociatedAnnotations(taskCallback) {
      Archive.findAnnotations(archiveId, taskCallback);
    },
    // 2nd query
    function findAssociatedStills(annotations, taskCallback) {
      Still.findAssociatedStills(annotations, taskCallback);
    },
    function buildCarousel(stills, taskCallback) {
      return taskCallback(null, new Carousel({
        title: 'Related Stills',
        items: stills
      }));
    }
  ], function onFinish(err) {
    if (err) {
      return masterCallback(err);
    }
    masterCallback(null, carousels);
  });
}
Documentation : http://caolan.github.io/async/docs.html#.waterfall
PS: Always use different names for your function's callback and your async tasks' callbacks.
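To address the original question of passing only the array values, here is a hedged tweak of the first task, assuming findAnnotations resolves to the { keywords: [...] } document shown above:
function findAssociatedAnnotations(taskCallback) {
  Archive.findAnnotations(archiveId, function (err, annotations) {
    // Forward only the keywords array, not the wrapping document
    taskCallback(err, annotations && annotations.keywords);
  });
}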

Omitting column names / inserting objects directly into node-postgres

I'd like to pass dictionaries with column names as keys, thus avoiding declaring the column names within the query itself (typing them directly).
Assume I have a table User with 2 column names:
idUser(INT)
fullName(VARCHAR)
To create a record using node-postgres, I need to declare the column names within the query, like so:
var idUser = 2;
var fullName = "John Doe";
var query = 'INSERT INTO User(idUser, fullName) VALUES ($1, $2)';
database.query(query, [idUser, fullName], function(error, result) {
  callback(error, result.rows);
  database.end();
});
I'd prefer it if there were a way to just pass a dictionary and have it infer the column names from the keys; if there's an easy trick, I'd like to hear it.
E.g. something like this:
var values = {
  idUser: 2,
  fullName: "John Doe"
};
var query = 'INSERT INTO User VALUES ($1)';
database.query(query, [values], function(error, result) {
  callback(error, result.rows);
  database.end();
});
A complete example of doing it with pg-promise:
const pgp = require('pg-promise')(/*options*/);
const cn = 'postgres://username:password@host:port/database';
const db = pgp(cn);

const values = {
  idUser: 2,
  fullName: 'John Doe'
};

// generating the insert query:
const query = pgp.helpers.insert(values, null, 'User');
//=> INSERT INTO "User"("idUser","fullName") VALUES(2,'John Doe')

db.none(query)
  .then(data => {
    // success;
  })
  .catch(error => {
    // error;
  });
And with focus on high performance it would change to this:
// generating a set of columns from the object (only once):
const cs = new pgp.helpers.ColumnSet(values, {table: 'User'});
// generating the insert query:
const query = pgp.helpers.insert(values, cs);
//=> INSERT INTO "User"("idUser","fullName") VALUES(2,'John Doe')
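As a side note, the same ColumnSet also accepts an array of objects and then produces a single multi-row insert; a sketch with a hypothetical second record:
const rows = [
  {idUser: 2, fullName: 'John Doe'},
  {idUser: 3, fullName: 'Jane Roe'} // hypothetical second row
];
const multiInsert = pgp.helpers.insert(rows, cs);
//=> INSERT INTO "User"("idUser","fullName") VALUES(2,'John Doe'),(3,'Jane Roe')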
There's no support for key-value objects in the insert statement, so it cannot be done with plain SQL.
However, the node-postgres extras page mentions multiple SQL generation tools, and for example Squel.js parameters can be used to construct SQL in a way very close to what you're looking for:
squel.insert()
  .into("User")
  .setFieldsRows([
    { idUser: 2, fullName: "John Doe" }
  ])
  .toParam()
// => { text: 'INSERT INTO User (idUser, fullName) VALUES (?, ?)',
//      values: [ 2, 'John Doe' ] }
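To execute this with node-postgres, the placeholders need PostgreSQL's numbered form ($1, $2); a sketch assuming Squel's numberedParameters option (also used in the next answer) and the database object from the question:
const q = squel.insert({ numberedParameters: true })
  .into("User")
  .setFieldsRows([{ idUser: 2, fullName: "John Doe" }])
  .toParam();
// => text: 'INSERT INTO User (idUser, fullName) VALUES ($1, $2)'
database.query(q.text, q.values, function (error, result) {
  callback(error, result.rows);
});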
My case was a bit special, as I had a field named order in the JSON object, which is a keyword in SQL. Therefore I had to wrap everything in quotes using a JSONify() function.
Also note the numberedParameters argument as well as the double quotes around the 'Messages' string.
import { pool } from './connection';
// Assumed import: the original snippet did not show where insert() comes from;
// it matches Squel's API, e.g. squel.insert()
import squel from 'squel';

function JSONify(obj: Record<string, any>) {
  const o: Record<string, any> = {};
  for (const i in obj) {
    o['"' + i + '"'] = obj[i]; // wrap each column name in double quotes
  }
  return o;
}

// I have a table named "Messages" with the columns order and name.
// I also supply the createdAt and updatedAt timestamps just in case.
const messages = [
  {
    order: 0,
    name: 'Message with index 0',
    createdAt: new Date().toISOString(),
    updatedAt: new Date().toISOString(),
  }
];

// Create the insert statement
const insertStatement = squel.insert({ numberedParameters: true })
  .into('"Messages"')
  .setFieldsRows(messages.map((message) => JSONify(message)))
  .toParam();

console.log(insertStatement);
// Notice the quotes wrapping the table and column names
// => { text: 'INSERT INTO "Messages" ("order", "name", "createdAt", "updatedAt") VALUES ($1, $2, $3, $4)',
//      values: [ 0, 'Message with index 0', '2022-07-22T13:51:27.679Z', '2022-07-22T13:51:27.679Z' ] }

// Create
await pool.query(insertStatement.text, insertStatement.values);
See the Squel documentation for more details.
And this is how I create the pool object if anyone is curious.
import { Pool } from 'pg';
import { DB_CONFIG } from './config';
export const pool = new Pool({
user: DB_CONFIG[process.env.NODE_ENV].username,
host: DB_CONFIG[process.env.NODE_ENV].host,
database: DB_CONFIG[process.env.NODE_ENV].database,
password: DB_CONFIG[process.env.NODE_ENV].password,
port: DB_CONFIG[process.env.NODE_ENV].port,
});
