I have five years of experience in programming, but I am very new to Node.js and TypeScript.
According to their documentation, the normal way to use AJV for synchronous validation in TypeScript is to use the return value of Ajv#compile as both a function and an object.
import Ajv from 'ajv';
// I am loading my schema from an adjacent file
import * as create_search_response from './create_search.schema.json';
const ajv = new Ajv();
const validator = ajv.compile(create_search_response);
function throwUnlessObjectIsValid(obj: any) {
  if (!validator(obj)) {
    // assume at least one error
    throw validator.errors?.pop();
  }
}
My understanding is that the error info is stored on and read back from the shared validator.errors property, similar to errno in C.
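One variant I'm considering is snapshotting the errors into a local variable immediately after the call, so nothing overwrites them before I read them. A sketch of that variant (the helper name is mine, and it assumes AJV v8's ErrorObject type):
import { ErrorObject } from 'ajv';

function throwUnlessObjectIsValidSnapshot(obj: any) {
  if (!validator(obj)) {
    // copy the errors right away; AJV replaces validator.errors on the next call
    const errors: ErrorObject[] = validator.errors ?? [];
    throw errors[0] ?? new Error('validation failed without error details');
  }
}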
What if I perform validation in two threads simultaneously? Wouldn't the errors get mixed up? I have looked at their asynchronous validation system, and it would solve my problem with a promise-based API, but I'm very confused about the details of its keywords. It seems to require that I change the JSON schema itself.
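For reference, the change the asynchronous mode seems to require is just a top-level "$async": true keyword in the schema, after which the compiled validator returns a promise that rejects with a ValidationError carrying its own errors array. A sketch based on my reading of the docs, not code I have running:
// assumes "$async": true has been added to create_search.schema.json
const asyncValidator = ajv.compile(create_search_response);

async function throwUnlessObjectIsValidAsync(obj: any): Promise<void> {
  // rejects with an Ajv ValidationError whose .errors belongs to this call only
  await asyncValidator(obj);
}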
What is the best practice here?
Background
A data structure in my database consists of "sections": lists of custom objects. The number of sections may grow in the future, so to keep my code as DRY as possible, I wanted the section from which an item is added/updated/deleted to be defined dynamically as a parameter.
I quickly realised that doing something like @Body() section: SectionA | SectionB | SectionC... disables validation, so I needed a single DTO, Section, that could encompass all sections. To do that, I need to define dynamically which validators to apply, as I have several @IsNotEmpty constraints.
So I came across this post whose selected answer recommends the usage of groups.
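In other words, the single Section DTO would carry per-section groups on its constraints, roughly like this (a sketch; the property and group names are made up for illustration):
import { IsNotEmpty } from 'class-validator';

export class SectionDto {
  // only checked when validating with the 'experience' group
  @IsNotEmpty({ groups: ['experience'] })
  jobTitle?: string;

  // only checked when validating with the 'education' group
  @IsNotEmpty({ groups: ['education'] })
  schoolName?: string;
}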
This posed the following challenges:
I now have to write a custom validation pipe. I relied heavily on this.
I wanted to override the global validation pipe that I already had running and use my custom one for just that method. Outcome: it didn't work; I had to start defining the pipe on every controller method, a trade-off I am willing to accept. It looks like there is no simple alternative.
However, I was then faced with the final problem: how to use the parameters in the request to define these groups in the validator. Another brick wall; no simple solution.
Solution
This question has been asked here but no satisfactory solution was actually given.
Option one recommended redefining the scope of the pipe to "request" level but didn't explain how, and solutions found online didn't work.
The second solution, using a custom decorator to perform the validation instead, did work, and very well in fact. Here is a simplified version of the code:
import { BadRequestException, ExecutionContext, createParamDecorator } from '@nestjs/common';
import { plainToInstance } from 'class-transformer';
import { validate } from 'class-validator';
// SectionDto and defaultOptions are defined elsewhere in the project

export const ProfileSectionData = createParamDecorator(
  async (data: unknown, ctx: ExecutionContext) => {
    const request = ctx.switchToHttp().getRequest();
    // I don't need to access the metatype from the request because I know
    // what type I need, but I'm sure I could if need be.
    const object = plainToInstance(SectionDto, request.body);
    const groups = [request.params.profileSection];
    const validatorOptions = { groups, ...defaultOptions };
    const errors = await validate(object, validatorOptions);
    if (errors.length > 0) {
      throw new BadRequestException();
    }
    return request.body;
  },
);
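For context, the decorator then replaces @Body() on the controller method, something like this (a sketch; the controller, route, service-free body, and import paths are illustrative):
import { Controller, Param, Patch } from '@nestjs/common';
// illustrative paths
import { ProfileSectionData } from './profile-section-data.decorator';
import { SectionDto } from './section.dto';

@Controller('profile')
export class ProfileController {
  @Patch(':profileSection')
  updateSection(
    @Param('profileSection') profileSection: string,
    @ProfileSectionData() sectionData: SectionDto,
  ) {
    // by the time we get here, request.body has been validated against the
    // group named by the :profileSection route parameter
    return { profileSection, sectionData };
  }
}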
Implications?
Here's my question. When Jay McDoniel recommended using a custom decorator, they warned: "Do note, that this could impact how the ValidationPipe is functioning if that is bound globally, at the class, or method level."
What does this mean?
Are there any vulnerabilities or performance drawbacks associated with this solution?
Obviously, one drawback is that you are performing validation outside a validation pipe, which is not ideal from the point of view of ordering and single responsibility, but I can't think of tangible inconveniences beyond aesthetics and maintainability.
Knowing the background, would you have approached the problem in a completely different way?
I have questions about the Object.setPrototypeOf(this, new.target.prototype) function because of this MDN warning:
Warning: Changing the [[Prototype]] of an object is, by the nature of how modern JavaScript engines optimize property accesses, currently a very slow operation in every browser and JavaScript engine. In addition, the effects of altering inheritance are subtle and far-flung, and are not limited to simply the time spent in the Object.setPrototypeOf(...) statement, but may extend to any code that has access to any object whose [[Prototype]] has been altered.
Because this feature is a part of the language, it is still the burden on engine developers to implement that feature performantly (ideally). Until engine developers address this issue, if you are concerned about performance, you should avoid setting the [[Prototype]] of an object. Instead, create a new object with the desired [[Prototype]] using Object.create().
So what would be the best way to restore the prototype chain in TypeScript (Node.js) without using Object.setPrototypeOf(this, new.target.prototype) (but still using classes)? This is all because of an error-management middleware implemented in Express, in which I need to use instanceof to determine the origin of the error and thus return a proper response (a rough sketch of that middleware is at the end of this post). Whenever I make a class that extends Error, error instanceof Error returns true, but error instanceof CustomError returns false. Doing a little research, I found this in the official TypeScript documentation:
The new.target meta-property is new syntax introduced in ES2015. When an instance of a constructor is created via new, the value of new.target is set to be a reference to the constructor function initially used to allocate the instance. If a function is called rather than constructed via new, new.target is set to undefined.
new.target comes in handy when Object.setPrototypeOf or __proto__ needs to be set in a class constructor. One such use case is inheriting from Error in NodeJS v4 and higher
// Example
class CustomError extends Error {
  constructor(message?: string) {
    super(message); // 'Error' breaks prototype chain here
    Object.setPrototypeOf(this, new.target.prototype); // restore prototype chain
  }
}
// Results
var CustomError = (function (_super) {
  __extends(CustomError, _super);
  function CustomError() {
    var _newTarget = this.constructor;
    var _this = _super.apply(this, arguments); // 'Error' breaks prototype chain here
    _this.__proto__ = _newTarget.prototype; // restore prototype chain
    return _this;
  }
  return CustomError;
})(Error);
And I thought okay, everything is perfect, since I assumed that when the code was compiled, the compiler would take care of that slow function and replace it with a more efficient procedure. But to my surprise that's not the case, and I'm a little worried about the performance (I'm working on a big project).
I'm using TypeScript version 3.9.6 on Node 14.5.0. These are the tests I did (the screenshots showed TypeScript with Node, the TypeScript Playground, and the TypeScript compiler results).
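For completeness, the error-management middleware I mentioned looks roughly like this (a sketch; the import path for CustomError and the response shapes are just illustrative):
import { NextFunction, Request, Response } from 'express';
import { CustomError } from './errors/custom-error'; // illustrative path

// Express treats a 4-argument function as an error handler
export function errorHandler(err: unknown, req: Request, res: Response, next: NextFunction) {
  if (err instanceof CustomError) {
    // this branch is only reached when the prototype chain has been restored
    return res.status(400).json({ message: err.message });
  }
  if (err instanceof Error) {
    return res.status(500).json({ message: 'Internal server error' });
  }
  return res.status(500).end();
}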
I tried to migrate my Express server code from JS to TypeScript and ran into issues when trying to access fields I added to the request object in my own middleware functions.
For example, I am doing something like this:
app.use((req, res, next) => {
  const account = checkAuthentication(req);
  if (account) {
    req.authenticated = true;
    req.account = account;
    next();
  } else {
    // without this branch the request would hang when authentication fails
    res.status(401).end();
  }
});
Now I want to access those properties later on in my other middleware functions, but the .d.ts file from @types/express of course defines the req object the same way for all middleware functions.
Now I came up with two ways to deal with this, but both seem bad:
Use a type guard in every middleware function. But then I add useless code to the JS output, because my routes are set up in a way such that I know at "compile" time what shape my req object has in each middleware function.
Use a type assertion in every middleware function. But then I don't get the benefits of TypeScript, despite writing my code so that all types are known at compile time, because I basically disable type checking for the req object.
So my question is: How, if at all, can I utilize typechecking in that scenario? If I can't, does that mean a framework like express is impossible in a statically typed language (like C#)?
You can use module augmentation to extend the Request type defined in the express module
declare global {
  namespace Express {
    export interface Request {
      account: Account;
      authenticated: boolean;
    }
  }
}
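With that augmentation in place (typically in a declaration file, e.g. types/express.d.ts, that your tsconfig picks up), the middleware from the question type-checks without assertions; a sketch, where checkAuthentication is the question's own helper and its import path is illustrative:
import { RequestHandler } from 'express';
import { checkAuthentication } from './auth'; // illustrative path

const authenticate: RequestHandler = (req, res, next) => {
  const account = checkAuthentication(req);
  if (account) {
    req.authenticated = true; // typed via the augmented Request
    req.account = account;    // typed as Account
    return next();
  }
  res.status(401).end();
};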
I am trying to convert a Node.js project to TypeScript, and while I mostly did not face really difficult obstacles in the process, the codebase has a few gotchas like this, mostly in startup code:
function prepareConfiguration() {
  let cloudConfigLoader = require('../utils/cloud-config');
  return cloudConfigLoader.ensureForFreshConfig().then(function() {
    //do some stuff
  });
}
I may just need advice on which refactoring approach requires the fewest code changes to make this work in TypeScript fashion.
In response to comments, more details:
That require loads a Node module, not a JSON file. From that module, the ensureForFreshConfig function contacts a cloud service to load a list of values used to rebuild a configuration state object.
The problem is that the module was written in the standard Node style of "a module is a singleton object", and its dependencies include an auth component that will be ready only by the time the shown require call is made. I know this is not the best way to do it.
TypeScript does not allow importing that module later, except with a dynamic import, which is problematic as mentioned in the comments.
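For reference, the dynamic-import variant mentioned above would look roughly like this; it keeps the lazy loading, but the function becomes asynchronous and, depending on how the module is compiled, ensureForFreshConfig may sit on the namespace object or on its default export:
async function prepareConfiguration() {
  // defer loading until the auth component the module depends on is ready
  const cloudConfigLoader = await import('../utils/cloud-config');
  await cloudConfigLoader.ensureForFreshConfig();
  // do some stuff
}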
The "good" approach is to refactor that part of startup and make the ensureForFreshConfig and its dependency to initiate its ntenras on demand via cnstructors.. I just hoped ofr some soluiton to reduce things to be remade durng this transition to the TypeScript
import { cloudConfigLoader } from '../utils/cloud-config'

async function prepareConfiguration() {
  await cloudConfigLoader.ensureForFreshConfig()
  // do some stuff
  // return some-stuff
}
The function is to be used as follows.
await prepareConfiguration()
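This assumes '../utils/cloud-config' exposes a cloudConfigLoader named export. A minimal sketch of what that module could look like after the refactor (the internals are placeholders):
// ../utils/cloud-config.ts (sketch)
class CloudConfigLoader {
  async ensureForFreshConfig(): Promise<void> {
    // contact the cloud service here and rebuild the configuration state on demand
  }
}

export const cloudConfigLoader = new CloudConfigLoader();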
I'm developing an application in which I need to have some abstraction.
I mean that I would like to "simulate" interface behaviour, creating a contract that my concrete classes follow.
Dealing with users, I have a UserMongoRepository class with the contract implemented:
getAll() returns the full list of users via a promise
getById(id) returns the user concerned via a promise
save(user) saves the user via a promise
... etc
I have the same methods implemented inside UserMysqlRepository (allowing me to switch behaviour when a change is needed).
Problem
My problem is that I'm dealing with Mongoose, which doesn't act like a data mapper but more like an active record.
It means that my implementation of save(user) would be a bit weird, like the following:
save(user) {
  let mongooseUser = this.convert(user);
  return mongooseUser.save();
}
The convert method allows me to switch from a standard model to a specific Mongoose model. It allows me, again, to have some abstraction and not have to rewrite my full application's data access.
My real problem is when I try to unit test my full class:
import MongooseUser from '../../auth/mongooseModel/MongooseUser';

/**
 * UserMongoRepository class
 */
export default class UserMongoRepository {
  /**
   * Create a UserMongoRepository
   */
  constructor() {
  }

  /**
   * Convert a User to a MongooseUser
   */
  convert(user) {
    return new MongooseUser({ email: user.mail, password: user.password, firstname: user.firstName, lastname: user.lastName });
  }

  findById(id) {
    return MongooseUser.find({ id: id });
  }

  save(user) {
    // convert first so the Mongoose model, not the plain user, is persisted
    return this.convert(user).save();
  }
}
In a standard design, I would inject my DAO via the constructor and be able to mock it.
In the case of Mongoose, it's a bit awkward, because the element that does the job isn't an instantiated object (which I could mock) but a class definition imported at the top of the file.
Solutions
Should I pass the MongooseUser class definition as a parameter to the constructor?
Implying that I would have this code inside the convert method:
let user = new this.MongooseUser({})
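Put together, that would look something like this (a sketch of what I have in mind; FakeMongooseUser is only there to show the test side):
export default class UserMongoRepository {
  /**
   * Create a UserMongoRepository backed by an injectable model class
   */
  constructor(MongooseUserModel) {
    this.MongooseUser = MongooseUserModel;
  }

  /**
   * Convert a User to an instance of the injected model
   */
  convert(user) {
    return new this.MongooseUser({ email: user.mail, password: user.password, firstname: user.firstName, lastname: user.lastName });
  }
}

// production wiring
const repository = new UserMongoRepository(MongooseUser);
// unit-test wiring
const testRepository = new UserMongoRepository(FakeMongooseUser);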
Do you have a better idea for abstracting Mongoose's behaviour in a data-mapper way?
I don't want to use another module; Mongoose is, in my opinion, the most advanced one for Node.js...
I'm not familiar with the import syntax (nor ECMAScript 6), but since you say you're using Node.js, I'd recommend using the proxyquire package. The idea is that the package allows you to require an external package while stubbing the requirements that that package would use. So in your case, you could do something like:
proxyquire('../my/class/that/uses/mongoose', {
  mongoose: MyTestMongooseImplementation
})
This would allow you to use your own Mongoose implementation while still using your MongooseUser as you have defined it in your package. Alternatively, you could just override the MongooseUser class (the path is relative to the file whose requirements you are stubbing):
proxyquire('/path/to/UserMongooseRepository', {
  '../../auth/mongooseModel/MongooseUser': MyTestMongooseUser
})
Documentation: https://www.npmjs.com/package/proxyquire
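For example, a unit test could wire it up roughly like this (written in the CommonJS style the snippets above assume; if the repository file is transpiled from ES modules you may need to unwrap a .default export, and the paths and stub shape are illustrative):
const proxyquire = require('proxyquire');

// a minimal stand-in for the Mongoose model class
function MyTestMongooseUser(fields) {
  this.fields = fields;
  this.save = () => Promise.resolve(this);
}
MyTestMongooseUser.find = () => Promise.resolve([]);

// load the repository with its MongooseUser dependency replaced
const UserMongoRepository = proxyquire('../src/UserMongoRepository', {
  '../../auth/mongooseModel/MongooseUser': MyTestMongooseUser
});

const repository = new UserMongoRepository();
const converted = repository.convert({ mail: 'a@b.c', password: 'secret', firstName: 'Ada', lastName: 'Lovelace' });
// converted is an instance of MyTestMongooseUser, so save() can be observed without a database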