Node.js Testing with Mongoose: unique gets ignored

I'm having a little trouble with an integration test for my mongoose application. The problem is that my unique setting gets constantly ignored. The schema looks more or less like this (so no fancy stuff in there):
const RealmSchema:Schema = new mongoose.Schema({
  Title : {
    type : String,
    required : true,
    unique : true
  },
  SchemaVersion : {
    type : String,
    default : SchemaVersion,
    enum: [ SchemaVersion ]
  }
}, {
  timestamps : {
    createdAt : "Created",
    updatedAt : "Updated"
  }
});
It looks like basically all the rules set in the schema are being ignored. I can pass in a Number/Boolean where a String was required. The only thing that does work is that fields which have not been declared in the schema won't be saved to the db.
First probable cause:
I have the feeling that it might have to do with the way I test. I have multiple integration tests. After each one my database gets dropped (so I have the same conditions for every test, and precondition the database in that test).
Is it possible that the reason is my indices being dropped with the database and not being re-created when the next test creates the database and collection again? And if this is the case, how could I make sure that after every test I get an empty database that still respects all my schema settings?
Second probable cause:
I'm using TypeScript in this project. Maybe there is something wrong with how I define the Schema and the Model. This is what I do:
1. Create the Schema (code from above)
2. Create an interface for the model (where IRealmM extends the interface for use with mongoose)
import { SpecificAttributeSelect } from "../classes/class.specificAttribute.Select";
import { SpecificAttributeText } from "../classes/class.specificAttribute.Text";
import { Document } from "mongoose";

interface IRealm {
  Title : String;
  Attributes : (SpecificAttributeSelect | SpecificAttributeText)[];
}

interface IRealmM extends IRealm, Document {
}

export { IRealm, IRealmM }
3. Create the model
import * as mongoose from 'mongoose';
import { Model } from 'mongoose';
import { RealmSchema } from '../schemas/schema.Realm';
import { IRealmM } from '../interfaces/interface.realm';

// Apply Authentication Plugin and create Model
const RealmModel:Model<IRealmM> = mongoose.model('realm', RealmSchema);

// Export the Model
export { RealmModel }

The unique option is not a validator. Check out this link from the Mongoose docs.
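In other words, uniqueness is enforced by a MongoDB index at write time, not by a Mongoose validator. A minimal sketch of how a violation surfaces once the index actually exists (model from the question; the Title value is made up):

// Inside an async test: the second insert is rejected by MongoDB itself
// with a duplicate key error, not by Mongoose validation.
try {
  await RealmModel.create({ Title: "Duplicate" });
  await RealmModel.create({ Title: "Duplicate" });
} catch (err) {
  console.log(err.code); // 11000 - MongoDB duplicate key error, no ValidationError
}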

OK, I finally figured it out. The key issue is described here:
Mongoose Unique index not working!
Solstice333 states in his answer that ensureIndex is deprecated (a warning I had been getting for some time; I thought it was still working, though).
After adding .createIndexes() to the model, leaving me with the following code, it works (at least as long as I'm not testing; more on that after the code):
// Apply Authentication Plugin and create Model
const RealmModel:Model<IRealmM> = mongoose.model('realm', RealmSchema);
RealmModel.createIndexes();
Now the problem with this is that the indexes are set when your connection is first established, but not if you drop the database later in your process (which, at least for me, happens after every integration test).
So in my tests the resetDatabase function looks like this, to make sure all the indexes are set:
const resetDatabase = done => {
  if (mongoose.connection.readyState === 1) {
    mongoose.connection.db.dropDatabase(async () => {
      await resetIndexes(mongoose.models);
      done();
    });
  } else {
    mongoose.connection.once('open', () => {
      mongoose.connection.db.dropDatabase(async () => {
        await resetIndexes(mongoose.models);
        done();
      });
    });
  }
};

const resetIndexes = async (Models: Object) => {
  let indexesReset: any[] = [];
  for (let key in Models) {
    indexesReset.push(Models[key].createIndexes());
  }
  // Return the combined promise so the caller's `await` actually waits for
  // all indexes to be rebuilt (the original fired it off without waiting).
  return Promise.all(indexesReset);
}
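A minimal sketch of how this can be hooked into a suite (assuming a mocha/jest-style hook with a done callback, as used above):

// Before each test: empty database, but with all schema indexes re-created.
beforeEach(done => {
  resetDatabase(done);
});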


usage of classMethods vs instanceMethods in sequelize.js?

I am new to sequelize.js and am basically trying to refactor code that I've written in the controller, and I came across classMethods and instanceMethods. I see instance methods defined like so:
/lib/model/db/users.js
module.exports = function(sequelize, DataTypes) {
  var instance_methods = get_instance_methods(sequelize);
  var User = sequelize.define("User", {
    email : {
      type : DataTypes.STRING,
      allowNull : false
    },
  }, {
    classMethods: class_methods,
    instanceMethods : instance_methods,
  });
  return User;
};
var crypto = require('crypto');
var config = require('config');

function get_instance_methods(sequelize) {
  return {
    is_my_password : function( password ) {
      return sequelize.models.User.hashify_password( password ) === this.password;
    },
  };
}

function get_class_methods(sequelize) {
  return {
    hashify_password : function( password ) {
      return crypto
        .createHash('md5')
        .update(
          password + config.get('crypto_secret'),
          (config.get('crypto_hash_encoding') || 'binary')
        )
        .digest('hex');
    },
  };
}
My understanding of the above is that classMethods are generic functions defined for the whole model, and instanceMethods basically operate on a given row in a table/model. Am I right in assuming this? That would be my primary question.
Also, I don't see any reference to classMethods and instanceMethods in the docs HERE. I only found this previous answer HERE, which provides a somewhat comprehensive understanding of the difference between instanceMethods and classMethods.
Basically, I'm just trying to confirm whether my understanding matches the intended usage of class vs instance methods; links to the official docs for these would also be highly appreciated.
The official way to add both static and instance methods is using classes like this:
class User extends Model {
  static classLevelMethod() {
    return 'foo';
  }
  instanceLevelMethod() {
    return 'bar';
  }
  getFullname() {
    return [this.firstname, this.lastname].join(' ');
  }
}

User.init({
  firstname: Sequelize.TEXT,
  lastname: Sequelize.TEXT
}, { sequelize });
See Models as classes
Your understanding is correct. In short: classes can have instances. Models are classes. So, Models can have instances. When working with an instance method, you will notice the this keyword, which is the context that refers to that particular class/model instance.
Hence, if you have a User model that has:
an instance method called is_my_password
a class method called hashify_password
User.hashify_password('123') will return the hashed version of 123. The User instance is not needed here; hashify_password is a general function attached to the User model (class).
Now, if you'd like to call is_my_password() you do need a User instance:
User.findOne({...}).then(function (user) {
  if (user.is_my_password('123')) {
    // ^ Here we call `is_my_password` as a method of the user instance.
    ...
  }
}).catch(console.error)
In general, when you have functions that do not need the particular model instance data, you will define them as class methods. They are static methods.
And when the function works with the instance's data, you define it as an instance method to make it easier and nicer to call.
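Tying both answers together, here is a minimal sketch of the question's two methods in the class style shown above (the md5 recipe is copied from the question; the secret string stands in for config.get('crypto_secret')):

const crypto = require('crypto');
const { Model } = require('sequelize');

class User extends Model {
  // Class (static) method: needs no instance, works on plain input.
  static hashify_password(password) {
    return crypto.createHash('md5').update(password + 'some-secret').digest('hex');
  }
  // Instance method: compares against this particular row's password.
  is_my_password(password) {
    return User.hashify_password(password) === this.password;
  }
}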

How to generate a unique number from PostgreSQL in jest tests running concurrently

I have an issue with a unique constraint on one of my fields.
I'm adding records to the database to be able to check via tests whether my code works as expected.
One of the table's fields is a unique number that is provided from outside (it's not related to some other table in the same database). I need to generate this unique number for each test, but I ran into a unique constraint issue.
I have following function:
export const findMinUniqueUserId = async (): Promise<number> => {
  const subscriptions = await prisma.$queryRaw<Subscription[]>(`
    SELECT "userId"
    FROM public."Subscriptions"
    ORDER BY "userId" DESC
    LIMIT 1
  `);
  const firstFreeUserId = (subscriptions[0]?.userId || 0) + 1;
  return firstFreeUserId;
};
which returns the first free "userId" value.
I have also the following tests:
describe("Test 1", () => {
it("should do something", async () => {
const draftSub = {
userId: await findMinUniqueUserId()
...some other fields
}
await prisma.subscription.create({
data: draftSub
})
...some other test stuff
})
})
And the second one:
describe("Test 2", () => {
it("should do something", async () => {
const draftSub = {
userId: await findMinUniqueUserId()
...some other fields
}
await prisma.subscription.create({
data: draftSub
})
...some other test stuff
})
})
Sometimes I get an error:
Unique constraint failed on the fields: (`userId`)
I've heard that each test suite (describe block) runs on a separate worker, so I was trying to prepare some kind of singleton class that could help me. But I think each instance of the class is created in a separate worker, so the generated userId is not unique.
This is what I was trying with the singleton class:
export class UserIdManager {
  private static instance: UserIdManager
  private static userIdShiftBeforeDatabaseCall = 0
  private static minFreeUserIdAfterDatabaseCall = 0

  private constructor() {
    return;
  }

  private static async init() {
    this.minFreeUserIdAfterDatabaseCall = await findMinUniqueUserId();
  }

  public static async reserveMinFreeUserId() {
    let minFreeUserId = UserIdManager.userIdShiftBeforeDatabaseCall;
    UserIdManager.userIdShiftBeforeDatabaseCall++;
    if (!UserIdManager.instance) {
      UserIdManager.instance = new UserIdManager();
      await this.init();
    }
    minFreeUserId += UserIdManager.minFreeUserIdAfterDatabaseCall;
    return minFreeUserId;
  }
}
But I realized that it doesn't help me with multithreading. I used it like this, but with the same result:
....
const draftSub = {
  userId: await UserIdManager.reserveMinFreeUserId(),
  // ...some other fields
}
....
So, the question is how to generate a unique number for each test. When I pass the --runInBand option to jest everything works correctly, but it takes much more time.
What you are using is the typical MAX()+1 method of assigning unique values. Unfortunately, this is a virtual guarantee that you will get duplicates for your supposedly unique value. This is a result of the Multi-Version Concurrency Control (MVCC) nature of Postgres: in an MVCC database, the actions taken by one session cannot be seen by another session until the first session commits. Thus, when multiple sessions read max()+1, they each get the same result. The first one to commit succeeds; the second fails.
The solution is to create a sequence and let Postgres assign the unique value: it will not assign the same value twice, regardless of how many sessions access the sequence concurrently. The cost is that your values will contain gaps - accept it, get over it, and move on. You can have the sequence generated by defining your userId as a generated identity (Postgres 10 or later) or as serial for older versions.
create table subscriptions ( id integer generated always as identity ...) -- for Postgres 10 or later
or
create table subscriptions ( id serial ...) -- for versions prior to Postgres 10
With either of those in place, get rid of your findMinUniqueUserId function. You may also want to look into the insert ... returning ... functionality.
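As a sketch of where that leads (assuming the Prisma schema now marks userId as an autoincrement/identity column, so it disappears from the create input): the database assigns the value, and create() returns the full row, mirroring SQL's insert ... returning:

// No findMinUniqueUserId() and no race: each jest worker gets its own
// value straight from the Postgres sequence.
const sub = await prisma.subscription.create({
  data: {
    // ...some other fields, but no userId
  },
})
console.log(sub.userId) // value assigned by the sequence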

Optional parameters on sequelize query

Good morning.
I'm quite new to the NodeJS / sequelize world and I'm currently facing a problem while trying to display a dashboard on screen.
This dashboard has three filters: two dates (a period), client name, and employee name. The user can select none, one, two, or all of the filters, and my database query needs to work accordingly.
That being said, my problem is with Sequelize, because I don't know how to handle parameters that are not "always" there.
I've seen this question:
Sequelize optional where clause parameters?
but this answer doesn't work anymore. I also tried another way of building the where clause, but I failed at that as well (mainly due to sequelize operators).
The last thing I tried was to make a single query with all parameters included, and to find some value (or flag) that would make sequelize ignore a parameter for the case when it is not there*, but it looks like Sequelize doesn't have anything like that.
* I've read a question here with an answer saying that {} would do the trick; I tried that as well, but it didn't work.
In summary: I need to make a query that can "change" over time, for example:
Foo.findAll({
  where: {
    id : 1,
  }
});

Foo.findAll({
  where: {
    id: {
      [Op.in] : [1,2,3,4,5]
    },
    name: "palmeiira",
  }
});
Do you know a way of doing this without needing a lot of if/switch statements?
I'm currently using Sequelize v. 5.5.1.
Update
I tried doing as suggested by @Anatoly and created a function to build the parameters. It was something like this (I tried a "smaller" version just to test):
async function test() {
  const where = {};
  where[Op.and] = [];
  where[Op.and].push({
    id: {
      [Op.in]: [1,2,3]
    }
  });
  return where;
}
I set the return value to a const:
const query = await test()
and tried console.log(query).
The result was { [Symbol(and)]: [ { id: [Object] } ] }, which made me believe that the problem was in parsing the Op part, so I tried using the strings 'Op.and' and 'Op.in' to avoid that. That fixed the Symbol output but led to another error in sequelize that said Invalid value.
Do you have any idea where my error is?
P.S.: @Anatoly, very nice idea you gave me in the original answer. Thank you very much.
If these three conditions should work together then you can use Op.and with an array of conditions:
const where = {}
if (datesFilter || clientNameFilter || employeenameFilter) {
  where[Op.and] = []
  if (datesFilter) {
    where[Op.and].push({
      dateField: {
        [Op.between]: [datesFilter.start, datesFilter.finish]
      }
    })
  }
  if (clientNameFilter) {
    where[Op.and].push({
      name: {
        [Op.iLike]: `%${clientNameFilter.value}%`
      }
    })
  }
  if (employeenameFilter) {
    where[Op.and].push({
      employeeName: {
        [Op.iLike]: `%${employeenameFilter.value}%`
      }
    })
  }
}
const dashboardItems = await DashboardItem.findAll({
  where,
  // some options here
})
If the conditions should work as alternatives then just replace Op.and with Op.or
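That is, only the initialization changes in the sketch above:

// Join the optional filters with OR instead of AND:
where[Op.or] = []
// ...then push each filter into where[Op.or] exactly as before.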

graphql: Must provide Source. Received: {kind: "Document", definitions: ...}

I'm really new to GraphQL (since just yesterday, actually). I am "playing" around and trying the various tools of the ecosystem (apollo-server, graphql.js, etc.).
For the sake of experimenting, I am trying to call a query from within nodejs (and not from a client in the browser, such as a react application).
First of all, this is my simple schema along with its resolvers:
export const mySchema = gql`
  type User {
    id: ID!
    name: String
    surname: String
  }

  # root query has been defined in another file
  extend type Query {
    users: [User]
    test: [User]
  }
`

export const myResolvers = {
  users: () => [ __array_of_users__ ],
  test: () => { /* this is where I would like to re-invoke the 'users' query */ }
}
Using the makeExecutableSchema function, I create a schema object from my types and my resolvers, and I export this schema into the apollo-server application. Everything works fine so far.
Now, following this stackoverflow suggested solution, I created a helper function which should allow me to invoke a query defined in my schema, as follows:
import { graphql } from "graphql";
import { schema } from "./my-schema";

export const execute = str => {
  return graphql(schema, str);
};
With this helper function, my resolvers become:
import { gql } from "apollo-server-express";
import { execute } from '__path_to_helper_function__';

export const myResolvers = {
  users: () => [ __array_of_users__ ],
  test: () => execute( gql`
    query users {
      name
    }
  `)
}
But in the playground, when I try the query:
{
  test {
    name
  }
}
I get the following error: Must provide Source. Received: {kind: "Document", definitions: ...}
I don't even know if what I am trying to do (calling a query from within node) can be done. Any suggestion will be greatly appreciated.
Thanks!
graphql-tag takes a string and parses it into a DocumentNode object. This is effectively the same as passing a string to the parse function. Some functions exported by the graphql module, like execute, expect to be passed a DocumentNode object; the graphql function does not. It should be passed just a plain string as the request, as you can see from its signature:
graphql(
  schema: GraphQLSchema,
  requestString: string,
  rootValue?: ?any,
  contextValue?: ?any,
  variableValues?: ?{[key: string]: any},
  operationName?: ?string
): Promise<GraphQLResult>
So, just drop the gql tag. You can see an (incomplete) API reference here.
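So the helper stays as it is; only the call site changes. A minimal sketch of the fixed resolver (note the inner query should also select users { name } rather than a bare name field; the user array is a placeholder):

import { execute } from '__path_to_helper_function__';

export const myResolvers = {
  users: () => [ /* __array_of_users__ */ ],
  // Plain string, no gql tag: graphql() parses it itself and resolves to an
  // ExecutionResult, so we unwrap data.users for the test field.
  test: () => execute(`{ users { name } }`).then(result => result.data.users)
}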

Optional but non-nullable fields in GraphQL

In an update to our GraphQL API, only the model's _id field is required, hence the ! in the SDL below. Other fields, such as name, don't have to be included in an update, but they also cannot have a null value. Currently, excluding the ! from the name field means the end user doesn't have to pass a name in an update, but it allows them to pass a null value for the name, which cannot be allowed.
A null value lets us know that a field needs to be removed from the database.
Below is an example of a model where this would cause a problem - the Name custom scalar doesn't allow null values but GraphQL still allows them through:
type language {
  _id: ObjectId
  iso: Language_ISO
  auto_translate: Boolean
  name: Name
  updated_at: Date_time
  created_at: Date_time
}

input language_create {
  iso: Language_ISO!
  auto_translate: Boolean
  name: Name!
}

input language_update {
  _id: ObjectId!
  iso: Language_ISO!
  auto_translate: Boolean
  name: Name
}
When a null value is passed in, it bypasses our scalars, so we cannot throw a user-input validation error if null isn't an allowed value.
I am aware that ! means non-nullable and that the lack of ! means the field is nullable; however, it is frustrating that, as far as I can see, we cannot specify the exact allowed values for a field that is optional / not required. This issue only occurs on updates.
Are there any ways to work around this issue through custom scalars, without having to start hardcoding logic into each update resolver, which seems cumbersome?
EXAMPLE MUTATION THAT SHOULD FAIL
mutation tests_language_create( $input: language_update! ) {
  language_update( input: $input ) {
    name
  }
}
Variables
input: {
  _id: "1234",
  name: null
}
UPDATE 9/11/18: for reference, I can't find a way around this as there are issues with using custom scalars, custom directives and validation rules. I've opened an issue on GitHub here: https://github.com/apollographql/apollo-server/issues/1942
What you're effectively looking for is custom validation logic. You can add any validation rules you want on top of the "default" set that is normally included when you build a schema. Here's a rough example of how to add a rule that checks for null values on specific types or scalars when they are used as arguments:
const { ApolloServer } = require('apollo-server')
const { specifiedRules } = require('graphql/validation')
const { GraphQLError } = require('graphql/error')

const typesToValidate = ['Foo', 'Bar']

// This returns a "Visitor" whose properties get called for
// each node in the document that matches the property's name
function CustomInputFieldsNonNull(context) {
  return {
    Argument(node) {
      const argDef = context.getArgument();
      const checkType = typesToValidate.includes(argDef.astNode.type.name.value)
      if (checkType && node.value.kind === 'NullValue') {
        context.reportError(
          new GraphQLError(
            `Type ${argDef.astNode.type.name.value} cannot be null`,
            node,
          ),
        )
      }
    },
  }
}

// We're going to override the validation rules, so we want to grab
// the existing set of rules and just add on to it
const validationRules = specifiedRules.concat(CustomInputFieldsNonNull)

const server = new ApolloServer({
  typeDefs,
  resolvers,
  validationRules,
})
EDIT: The above only works if you're not using variables, which isn't going to be very helpful in most cases. As a workaround, I was able to utilize a FIELD_DEFINITION directive to achieve the desired behavior. There's probably a number of ways you could approach this, but here's a basic example:
const { SchemaDirectiveVisitor } = require('graphql-tools')
const { defaultFieldResolver } = require('graphql')
const _ = require('lodash')

class NonNullInputDirective extends SchemaDirectiveVisitor {
  visitFieldDefinition(field) {
    const { resolve = defaultFieldResolver } = field
    const { args: { paths } } = this
    field.resolve = async function (...resolverArgs) {
      const fieldArgs = resolverArgs[1]
      for (const path of paths) {
        if (_.get(fieldArgs, path) === null) {
          throw new Error(`${path} cannot be null`)
        }
      }
      return resolve.apply(this, resolverArgs)
    }
  }
}
Then in your schema:
directive @nonNullInput(paths: [String!]!) on FIELD_DEFINITION

input FooInput {
  foo: String
  bar: String
}

type Query {
  foo (input: FooInput!): String @nonNullInput(paths: ["input.foo"])
}
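For context, a minimal sketch of how the directive class gets registered (this assumes Apollo Server 2's schemaDirectives option, which maps the directive name from the SDL to the visitor class):

const server = new ApolloServer({
  typeDefs,
  resolvers,
  schemaDirectives: {
    nonNullInput: NonNullInputDirective,
  },
})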
Assuming that the "non null" input fields are the same each time the input is used in the schema, you could map each input's name to an array of field names that should be validated. So you could do something like this as well:
const nonNullFieldMap = {
  FooInput: ['foo'],
}

class NonNullInputDirective extends SchemaDirectiveVisitor {
  visitFieldDefinition(field) {
    const { resolve = defaultFieldResolver } = field
    const visitedTypeArgs = this.visitedType.args
    field.resolve = async function (...resolverArgs) {
      const fieldArgs = resolverArgs[1]
      visitedTypeArgs.forEach(arg => {
        const argType = arg.type.toString().replace("!", "")
        // Guard against argument types that have no entry in the map
        const nonNullFields = nonNullFieldMap[argType] || []
        nonNullFields.forEach(nonNullField => {
          const path = `${arg.name}.${nonNullField}`
          if (_.get(fieldArgs, path) === null) {
            throw new Error(`${path} cannot be null`)
          }
        })
      })
      return resolve.apply(this, resolverArgs)
    }
  }
}
And then in your schema:
directive @nonNullInput on FIELD_DEFINITION

type Query {
  foo (input: FooInput!): String @nonNullInput
}
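With either variant in place, an operation that explicitly nulls a guarded field is rejected by the wrapped resolver. A hypothetical request against the FooInput example above:

query {
  foo(input: { foo: null, bar: "ok" })
}
# => error: "input.foo cannot be null"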
