Using @Type discriminator with class-validator and class-transformer not working in tandem - nestjs

I have a class with a property that can be one of several classes, depending on a discriminator. @Type is obviously perfect for this, but the issue is that the discriminator does not exist on the object itself; it exists on the parent.
Consider the following:
class Parent {
  type: 'a' | 'b'

  @Type(() => ?, {
    discriminator: {
      property: 'type',
      subTypes: [
        { value: TypeA, name: 'a' },
        { value: TypeB, name: 'b' },
      ]
    }
  })
  data: TypeA | TypeB
}
Naturally I can't do this. I tried a custom decorator which does something like:
const TypeOnParent: () => PropertyDecorator = () => {
  const __class__ = class {}
  const prop = '__type'
  return (target, key) => {
    Transform(({ value, obj }) => {
      value[prop] = obj.type
      return value
    })(target, key)
    Type(() => __class__, {
      keepDiscriminatorProperty: true,
      discriminator: {
        property: prop,
        subTypes: [
          { name: 'a', value: TypeA },
          { name: 'b', value: TypeB },
        ],
      },
    })(target, key)
  }
}
class Parent {
  type: 'a' | 'b'

  @TypeOnParent()
  data: TypeA | TypeB
}
The goal here is to copy the parent's discriminator value onto the child, so that the @Type discriminator can do its job. However, the discriminator prop that I pass onto the child 'data' object doesn't seem to work; transformation just defaults to using the fallback class. I've tried changing the order of the decorators.
The result is an identical object no matter what. If I include the value manually in the payload, it works fine; if I rely on the Transform, it never works.
Am I missing something? Does class-transformer ALWAYS run @Type before any other decorators? Or is there a better way to achieve this?
I am using the NestJS global validation pipe, if that helps.

I have solved this not by using discriminator, but by using the type function itself.
@Type always runs before @Transform, so Transform is not an option.
However, class-transformer passes an options object into the type function, and that object exposes the parent object being transformed, so we can simply use it:
@Type((opts) => opts.object.type === 'a' ? TypeA : TypeB)
data: TypeA | TypeB
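To see the mechanism in isolation, here is a dependency-free sketch. TypeA, TypeB, and resolveDataClass are stand-ins invented for this example: resolveDataClass plays the role of the arrow function passed to @Type, and the plain payload plays the role of opts.object.

```typescript
class TypeA { kind: 'a' = 'a'; }
class TypeB { kind: 'b' = 'b'; }

interface ParentPlain {
  type: 'a' | 'b';
  data: Record<string, unknown>;
}

// class-transformer calls the type function with opts.object set to the
// plain *parent* object, which is why the parent's discriminator is reachable.
function resolveDataClass(parentObject: ParentPlain): new () => TypeA | TypeB {
  return parentObject.type === 'a' ? TypeA : TypeB;
}

const payload: ParentPlain = { type: 'b', data: {} };
const DataCtor = resolveDataClass(payload);
const instance = new DataCtor(); // an instance of TypeB
```

This is only the selection logic; in the real code class-transformer performs the construction and property mapping for you.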

Related

How to infer property values from arrays of objects in TypeScript?

I'd like the TypeScript compiler to infer the names of the properties on an object so that it's easier to invoke the corresponding functions later. However, TypeScript doesn't give any IntelliSense for the functions I can invoke.
Taking into account the following example:
interface InputBase {
  alias: string;
}

const input: Array<InputBase> = [
  { alias: 'getNotes' },
  { alias: 'getComments' },
];

const client = input.reduce((acc, curr: InputBase) => {
  return {
    ...acc,
    [curr.alias]: () => {
      return curr.alias + ' call';
    },
  };
}, {});

client.getNotes();
My problem is really the lack of inference in the client variable: its type is currently {}, even though the terminal shows {getNotes: ƒ, getComments: ƒ}.
How can I resolve this?
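No answer is recorded here, but one possible approach (a sketch, not from the original thread; makeClient is an invented helper) is to carry the alias strings through a generic type parameter and accumulate into a Record keyed by them:

```typescript
// Parameterize the alias so literal types like 'getNotes' are preserved.
interface AliasedInput<A extends string = string> {
  alias: A;
}

// The accumulator is typed as Record<A, () => string>, so the union of
// alias literals becomes the set of keys on the resulting client.
function makeClient<A extends string>(input: ReadonlyArray<AliasedInput<A>>) {
  return input.reduce((acc, curr) => {
    acc[curr.alias] = () => curr.alias + ' call';
    return acc;
  }, {} as Record<A, () => string>);
}

// `as const` keeps the alias values as literal types instead of string.
const client = makeClient([
  { alias: 'getNotes' },
  { alias: 'getComments' },
] as const);

client.getNotes();    // typed; IntelliSense now lists getNotes/getComments
// client.getOther(); // compile-time error: property does not exist
```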

Getting a value of a sibling field in a field resolver

Say, I have these typeDefs (just an example):
type CityInfo {
  CityState: {
    City: String!
    State: String!
  }
  Zip: String!
}

type Query {
  CitiesStatesZips: [CityInfo]
}
Now, say there is a REST API that gives me a list of Zip codes, and another REST API that returns City/State by Zip code. I can write this resolver:
const resolvers = {
  Query: {
    CitiesStatesZips: (parent, args, { dataSources }) => dataSources.ZipApi.getZipCodes()
  },
  CityInfo: {
    CityState: (parent, args, { dataSources }) => dataSources.CityStateApi(** HOW DO I PASS ZIP HERE **)
  }
}
In the CityState field resolver I need the value of the Zip field on the same object. How do I access it?
If CitiesStatesZips returns an array of objects (of type CityInfo) that include the Zip field, then Zip is already resolved by the time the CityState resolver is called, so you can simply use parent.Zip as an argument for the data source call.
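A minimal sketch of that idea with a mocked data source (getCityStateByZip and the mock return values are invented for the example; the real call would hit the City/State REST API):

```typescript
interface CityState { City: string; State: string }

const dataSources = {
  CityStateApi: {
    // Stand-in for the REST lookup by Zip code.
    getCityStateByZip(zip: string): CityState {
      return zip === '62704'
        ? { City: 'Springfield', State: 'IL' }
        : { City: 'Unknown', State: 'NA' };
    },
  },
};

const resolvers = {
  CityInfo: {
    // `parent` is one element of the array CitiesStatesZips resolved to,
    // so its Zip field is already available here.
    CityState: (
      parent: { Zip: string },
      _args: unknown,
      ctx: { dataSources: typeof dataSources },
    ): CityState => ctx.dataSources.CityStateApi.getCityStateByZip(parent.Zip),
  },
};

// Simulate the GraphQL executor calling the field resolver:
const result = resolvers.CityInfo.CityState({ Zip: '62704' }, {}, { dataSources });
```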

TypeScript: 'number | undefined' is not assignable to parameter of type 'number'

I have a model with an Activity interface and a Mongoose schema, and a function that takes an ID and logs it.
Then in my route I'm querying all activities for a given customer where externalId is not undefined. With the returned array I call the function defined in my model, passing externalId as a param.
But TypeScript keeps throwing: Argument of type 'number | undefined' is not assignable to parameter of type 'number'. Type 'undefined' is not assignable to type 'number'.
I already tried to wrap the call in an if statement checking externalId && typeof externalId === 'number', but I can't make the error go away.
Model:
import { Schema, model } from 'mongoose';

interface IActivity {
  customer: string,
  externalId?: number,
}

const activitySchema: Schema = new Schema({
  customer: {
    type: String,
    required: true,
    minlength: 5,
    maxlength: 50,
  },
  externalId: { type: Number },
});

export function updateExternal(externalId: number): void {
  console.log(externalId);
}

export const Activity = model<IActivity>('Activity', activitySchema);
Route:
const completeActivity = await Activity
  .find({
    'externalId': { '$ne': undefined },
    'customer': 'Foo Customer',
  });

completeActivity.forEach((activity) => {
  updateExternal(activity.externalId); // This is where the error appears
});
It's interesting to think about what should happen when we read the value at a key that may not actually hold a value. activity.externalId has type number | undefined, so what do we do to get from number | undefined to number?
We need to 'remove' that part of the type; that is, we need some way to convert the undefined part of number | undefined into number.
Often we do this by providing a default: whenever the value is undefined, don't use it, use the default instead. It is difficult to give this a good type in TypeScript, though. We can't specify that the object's type has the (possibly undefined) property we're looking for, so we would have to fall back to any, and then the function could only return any, as in:
interface IX {
  a?: number;
}

function propOr<K extends string | number, U, O extends Record<K, U>>(k: K, defaultValue: U, obj: O) {
  return obj[k] ?? defaultValue;
}

const testFunction1 = (x: IX) => x.a; // has type number | undefined
const testFunction2 = (x: IX) => propOr('a', -1, x); // type error:
The type error we get here is:
Argument of type 'IX' is not assignable to parameter of type 'Record<"a", number>'.
  Types of property 'a' are incompatible.
    Type 'number | undefined' is not assignable to type 'number'.
      Type 'undefined' is not assignable to type 'number'.(2345)
We might look at other libraries to see what they do. Sanctuary, for instance, types it as:
prop(p: string): (q: any) => any;
So when typing such a function we're a little stuck: we're forced to lose the type information (unless someone else can jump in with a better idea).
The best solution I can think of is to just use an or-default to always get a number:
interface IX {
  a?: number;
}

const a: IX = {};
const b = a.a || 0;
Now b has the type number.
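Two standard refinements are worth noting (both are plain TypeScript features, not part of the answer above): the nullish coalescing operator ?? only falls back on null/undefined (unlike ||, which also replaces 0), and a user-defined type guard lets a filtered array keep the narrowing that a plain if check loses:

```typescript
interface IActivity {
  customer: string;
  externalId?: number;
}

function updateExternal(externalId: number): number {
  return externalId;
}

const activities: IActivity[] = [
  { customer: 'Foo Customer', externalId: 7 },
  { customer: 'Foo Customer' }, // externalId omitted
];

// Nullish coalescing: a legitimate 0 would survive; only null/undefined
// get replaced by the default.
const withDefaults = activities.map((a) => updateExternal(a.externalId ?? -1));

// Type-guard filter: the predicate's return type tells the compiler that
// every element of the filtered array has a defined externalId.
const defined = activities
  .filter((a): a is IActivity & { externalId: number } => a.externalId !== undefined)
  .map((a) => updateExternal(a.externalId)); // no error: narrowed to number
```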

How to pass in the done() parameter on an async jest test.each case

I'm trying to write a Jest test case that tests an async method. I want to pass in the done parameter so Jest waits for it to be fired before ending the test; however, I'm not sure where to put it.
Any ideas?
const testcases = [
  ['Crew', [1, 2, 3], Enum.Level1],
  ['Staff', [4, 5, 6], Enum.Level2],
];

test.each(testcases)(
  'Should be able to load differing cases %p',
  (
    typeName: string,
    initalVals: number[],
    type: LevelType
  ) => {
    // some call that updates mobx store state
    when(
      () => mobxstoreProperty.length == initalVals.length,
      () => {
        // my assertions
        done();
      }
    );
  }
);
For a single jest test I can do this:
test('my single test', done => {
  // some call that updates mobx store state
  when(
    () => mobxstoreProperty.length == initalVals.length,
    () => {
      // my assertions
      done();
    }
  );
});
Just unsure how to do it for when I use the test.each method.
I use named parameters, and I can add the done callback as the last function parameter. For example:
const testcases: {
  typeName: string;
  initalVals: number[];
  type: LevelType;
}[] = [
  {
    typeName: 'Crew',
    initalVals: [1, 2, 3],
    type: Enum.Level1,
  },
  {
    typeName: 'Staff',
    initalVals: [4, 5, 6],
    type: Enum.Level2,
  },
];

test.each(testcases)(
  'Should be able to load differing cases %p',
  // Must use `any` for `done`, as TypeScript infers the wrong type:
  ({ typeName, initalVals, type }, done: any) => {
    // some call that updates mobx store state
    when(
      () => mobxstoreProperty.length == initalVals.length,
      () => {
        // my assertions
        done();
      }
    );
  }
);
I haven't tested whether you can just add the done callback as the last parameter with array-style arguments, but maybe that works, too.
To pass and evaluate done, the done callback must be the very last argument of the test function, after the test case arguments.
Also, here's how to deal with the typings when you use the test.each method in TypeScript:
// found at https://github.com/DefinitelyTyped/DefinitelyTyped/issues/34617
it.each<number | jest.DoneCallback>([1, 2, 3])(
  'dummy: %d',
  (num: number, done: jest.DoneCallback) => {
    done();
  },
);
There is no perfect answer at the moment, as there is an issue in the Jest typings with the templated types of test.each().
So for now, all solutions to achieve what you want require some tricks with the types.
A solution for complex test parameters, with the light array syntax definition:
test.each<Array<string | number[] | LevelType | jest.DoneCallback>>([
  ['Crew', [1, 2, 3], LevelType.Level1],
  ['Staff', [4, 5, 6], LevelType.Level2],
])(
  'Should be able to load differing cases %p',
  (
    typeName: string,
    initalVals: number[],
    type: LevelType,
    done: jest.DoneCallback
  ) => {
    // some test code
  }
);
The trick in this solution is passing an explicit array union type parameter to test.each, which bypasses the templated type issue.
For more information, see the open issues https://github.com/DefinitelyTyped/DefinitelyTyped/issues/34617 and https://github.com/facebook/jest/issues/8518
Edit:
This solution has been edited to replace the initial any type with a more precise one.
Thanks to Alejandro Moreno for the idea in the comments.

Optional but non-nullable fields in GraphQL

In an update to our GraphQL API, only the model's _id field is required, hence the ! in the SDL below. Other fields, such as name, don't have to be included in an update, but they also cannot have a null value. Currently, excluding the ! from the name field lets the end user omit name in an update, but it also lets them pass null for name, which cannot be allowed.
A null value lets us know that a field needs to be removed from the database.
Below is an example of a model where this causes a problem: the Name custom scalar doesn't allow null values, but GraphQL still lets them through:
type language {
  _id: ObjectId
  iso: Language_ISO
  auto_translate: Boolean
  name: Name
  updated_at: Date_time
  created_at: Date_time
}

input language_create {
  iso: Language_ISO!
  auto_translate: Boolean
  name: Name!
}

input language_update {
  _id: ObjectId!
  iso: Language_ISO!
  auto_translate: Boolean
  name: Name
}
When a null value is passed in, it bypasses our scalars, so we cannot throw a user-input validation error when null isn't an allowed value.
I am aware that ! means non-nullable and that omitting ! makes a field nullable; however, it is frustrating that, as far as I can see, we cannot constrain the allowed values of a field that is optional. This issue only occurs on updates.
Are there any ways to work around this through custom scalars, without hardcoding logic into each update resolver, which seems cumbersome?
EXAMPLE MUTATION THAT SHOULD FAIL
mutation tests_language_create($input: language_update!) {
  language_update(input: $input) {
    name
  }
}
Variables:
input: {
  _id: "1234",
  name: null
}
UPDATE 9/11/18: for reference, I can't find a way around this, as there are issues with using custom scalars, custom directives, and validation rules. I've opened an issue on GitHub: https://github.com/apollographql/apollo-server/issues/1942
What you're effectively looking for is custom validation logic. You can add any validation rules you want on top of the "default" set that is normally included when you build a schema. Here's a rough example of how to add a rule that checks for null values on specific types or scalars when they are used as arguments:
const { ApolloServer } = require('apollo-server')
const { specifiedRules } = require('graphql/validation')
const { GraphQLError } = require('graphql/error')

const typesToValidate = ['Foo', 'Bar']

// This returns a "Visitor" whose properties get called for
// each node in the document that matches the property's name
function CustomInputFieldsNonNull(context) {
  return {
    Argument(node) {
      const argDef = context.getArgument()
      const checkType = typesToValidate.includes(argDef.astNode.type.name.value)
      if (checkType && node.value.kind === 'NullValue') {
        context.reportError(
          new GraphQLError(
            `Type ${argDef.astNode.type.name.value} cannot be null`,
            node,
          ),
        )
      }
    },
  }
}

// We're going to override the validation rules, so we want to grab
// the existing set of rules and just add on to it
const validationRules = specifiedRules.concat(CustomInputFieldsNonNull)

const server = new ApolloServer({
  typeDefs,
  resolvers,
  validationRules,
})
EDIT: The above only works if you're not using variables, which isn't very helpful in most cases. As a workaround, I was able to use a FIELD_DEFINITION directive to achieve the desired behavior. There are probably a number of ways to approach this, but here's a basic example:
const _ = require('lodash')
const { defaultFieldResolver } = require('graphql')
const { SchemaDirectiveVisitor } = require('graphql-tools')

class NonNullInputDirective extends SchemaDirectiveVisitor {
  visitFieldDefinition(field) {
    const { resolve = defaultFieldResolver } = field
    const { args: { paths } } = this
    field.resolve = async function (...resolverArgs) {
      const fieldArgs = resolverArgs[1]
      for (const path of paths) {
        if (_.get(fieldArgs, path) === null) {
          throw new Error(`${path} cannot be null`)
        }
      }
      return resolve.apply(this, resolverArgs)
    }
  }
}
Then in your schema:
directive @nonNullInput(paths: [String!]!) on FIELD_DEFINITION

input FooInput {
  foo: String
  bar: String
}

type Query {
  foo(input: FooInput!): String @nonNullInput(paths: ["input.foo"])
}
Assuming that the "non null" input fields are the same each time the input is used in the schema, you could map each input's name to an array of field names that should be validated. So you could do something like this as well:
const nonNullFieldMap = {
  FooInput: ['foo'],
}

class NonNullInputDirective extends SchemaDirectiveVisitor {
  visitFieldDefinition(field) {
    const { resolve = defaultFieldResolver } = field
    const visitedTypeArgs = this.visitedType.args
    field.resolve = async function (...resolverArgs) {
      const fieldArgs = resolverArgs[1]
      visitedTypeArgs.forEach(arg => {
        const argType = arg.type.toString().replace('!', '')
        const nonNullFields = nonNullFieldMap[argType]
        nonNullFields.forEach(nonNullField => {
          const path = `${arg.name}.${nonNullField}`
          if (_.get(fieldArgs, path) === null) {
            throw new Error(`${path} cannot be null`)
          }
        })
      })
      return resolve.apply(this, resolverArgs)
    }
  }
}
And then in your schema:
directive @nonNullInput on FIELD_DEFINITION

type Query {
  foo(input: FooInput!): String @nonNullInput
}
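The core check both directive variants perform can be exercised in isolation. In this sketch, getPath is a minimal stand-in for lodash's _.get, and assertNonNullPaths is an invented helper mirroring the loop inside field.resolve:

```typescript
// Walks a dot-separated path, returning undefined if any segment is missing.
function getPath(obj: unknown, path: string): unknown {
  return path.split('.').reduce<unknown>(
    (acc, key) => (acc == null ? undefined : (acc as Record<string, unknown>)[key]),
    obj,
  );
}

// Rejects explicit nulls at the listed paths; omitted fields (undefined) pass,
// which is exactly the "optional but non-nullable" behavior the question wants.
function assertNonNullPaths(args: Record<string, unknown>, paths: string[]): void {
  for (const path of paths) {
    if (getPath(args, path) === null) {
      throw new Error(`${path} cannot be null`);
    }
  }
}

assertNonNullPaths({ input: { foo: 'x' } }, ['input.foo']); // ok: value present
assertNonNullPaths({ input: {} }, ['input.foo']);           // ok: field omitted
// assertNonNullPaths({ input: { foo: null } }, ['input.foo']); // would throw
```

The distinction between "absent" (undefined) and "explicitly null" is what makes this work: GraphQL delivers omitted variables as missing keys, so only a deliberate null trips the check.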