TSLint: Backbone get() called outside of owning model meaning - node.js

I am using Microsoft's tslint-microsoft-contrib tslint configuration and I am really happy with it. However, there is one rule which warns me about my code, and I don't understand the rule description text or how I could solve this more elegantly.
[tslint] Backbone get() called outside of owning model:
this.client.get('locations') (no-backbone-get-set-outside-model)
Code:
import * as Redis from 'ioredis';
import config from './config';

export class RedisWrapper {
    private client: Redis.Redis;

    constructor(redisUrl: string) {
        this.client = new Redis(redisUrl);
    }

    public async getLocations(): Promise<ILocation[]> {
        const locationsResponse: string = await this.client.get('locations');
        return JSON.parse(locationsResponse);
    }
}
In this line the tslint warning pops up: const locationsResponse: string = await this.client.get('locations')
The question:
Originally I faced this issue at a different place in my project, and I thought I was supposed to write wrapper methods with typedefs, but I wasn't able to make tslint happy with that either. Can someone enlighten me as to what this rule means and how I could solve it?

I will quote HamletDRC (from the Microsoft team) who explained the rule itself very well:
The point of the no-backbone-get-set-outside-model rule is to make sure that you don't invoke dynamically dispatched methods that the compiler cannot enforce correctness on. For example, the compiler will not complain if you type route.params.get('id'), route.params.get('ID'), route.params.get('Id') but only one of those invocations will actually work at runtime. The design advice is to define a statically typed "getId(): number" method on the RouteParams object so the compiler can enforce these calls. So, in my opinion the rule actually has found an issue in your code that you should fix (but see my second point :) )
Source: https://github.com/Microsoft/tslint-microsoft-contrib/issues/123
In this specific case one could extend the Redis class like this:
import * as Redis from 'ioredis'; // or: import Redis from 'ioredis'; depending on your tsconfig/ioredis version

export class RedisWrapper extends Redis {
    public async getLocations(): Promise<ILocation[]> {
        const response: string = await this.get('locations');
        if (response == null || response.length === 0) { return []; }
        return <ILocation[]>JSON.parse(response);
    }
}
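For completeness, a hedged usage sketch of the extended class (the redisUrl value and the surrounding async function are assumptions, not part of the original question):

// Somewhere inside an async function; redisUrl comes from your config.
const redis = new RedisWrapper(redisUrl);

// The compiler now enforces the call site: a typo such as getLocatoins()
// fails to compile, unlike a mistyped key passed to a dynamic get().
const locations: ILocation[] = await redis.getLocations();

This is exactly the trade the rule asks for: one untyped get('locations') hidden inside the owning class, and statically typed methods everywhere else.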

Related

Jest mocking private members

In a Node/Express server, we use a repository that needs to be unit-tested using Jest.
// Private things
let products;

function loadProducts() {
    if (!products)
        products = fetchProductsFromSomeDbOrServiceOrWhatever();
}

function saveProducts() {
    persistPrivateProductsToADbOrServiceOrWhatever();
}

// Exported/public things
export function read() {
    loadProducts();
    return products;
}

export function add(product) {
    loadProducts();
    products.push(product);
    saveProducts();
}
We want to unit test like this:
import { read, add } from './productRepo';

it('can read products', () => {
    expect(read().length).toBe(5);
});

it('can add a product', () => {
    const oldNum = read().length;
    add({ id: 0, name: 'test prod', moreProps });
    expect(read().length).toBe(oldNum + 1);
});
You get the idea. It's not a class so we can't mess with the prototype.
Problem: How do I mock the private products and/or loadProducts and/or saveProducts so that it isn't reading from the actual data source?
Presumably these private functions call out to other pieces of functionality you've written yourself or imported from libraries.
function loadProducts() {
    if (!products)
        products = fetchProductsFromSomeDbOrServiceOrWhatever();
}

function saveProducts() {
    persistPrivateProductsToADbOrServiceOrWhatever();
}
Let's take fetchProductsFromSomeDbOrServiceOrWhatever as the example. One basic architectural consideration to make the code properly encapsulated and testable is to put this functionality in a separate module. So I would expect an import at the head of the file:
import fetchProductsFromSomeDbOrServiceOrWhatever from './fetchProductsFromSomeDbOrServiceOrWhatever'
So in this case just mock it in your test file:
jest.mock('./fetchProductsFromSomeDbOrServiceOrWhatever');
If the functionality is not extracted into a separate module this makes your code less testable; this on its own is a good reason to refactor.
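Putting it together, a sketch of what the test could look like once the loader lives in its own module (the module path and its default export are assumptions carried over from the import above; this also assumes ts-jest or babel-jest with CommonJS interop):

import { read, add } from './productRepo';
import fetchProductsFromSomeDbOrServiceOrWhatever from './fetchProductsFromSomeDbOrServiceOrWhatever';

// Auto-mock the data-source module so no real DB/service is hit.
jest.mock('./fetchProductsFromSomeDbOrServiceOrWhatever');

(fetchProductsFromSomeDbOrServiceOrWhatever as jest.Mock).mockReturnValue([
    { id: 1 }, { id: 2 }, { id: 3 }, { id: 4 }, { id: 5 },
]);

it('can read products', () => {
    expect(read().length).toBe(5);
});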
Note: the other replies on this thread are correct when they say that private functions of classes should not be tested, but I think that is a slightly different issue from the one you are asking.
First, start initializing products to an empty array, otherwise the tests are doomed to fail because of the null value; also change the null check accordingly.
Then parametrize your loader and saver functions so your functions become testable. Lastly, write tests for your loader and saver functions outside of this repo module.
// assumed imports
fetchProductsFromSomeDbOrServiceOrWhatever = () => {};
persistPrivateProductsToADbOrServiceOrWhatever = () => {};

// Private things
let products = [];

function loadProducts(loader) {
    loader = loader || fetchProductsFromSomeDbOrServiceOrWhatever;
    if (products.length === 0)
        products = loader();
}

function saveProducts(saver) {
    saver = saver || persistPrivateProductsToADbOrServiceOrWhatever;
    saver();
}

// Exported/public things
export function read(loader) {
    loadProducts(loader);
    return products;
}

export function add(product, loader, saver) {
    loadProducts(loader);
    products.push(product);
    saveProducts(saver);
}
Both exported functions can now use the fetch/persist functions either via the imports or as arguments.
What remains is mocking the loader and saver functions. The saver does not change anything, so it can be left out or empty, but if you want to check that it is called, you need to mock it.
import { jest } from '@jest/globals';
import { read, add } from './productRepo';

it('can read products', () => {
    const loader = jest.fn().mockReturnValue([{ id: 7 }, { id: 42 }]);
    expect(read(loader).length).toBe(2);
    expect(loader).toBeCalledTimes(1);
});

it('can add a product', () => {
    const loader = jest.fn().mockReturnValue([{ id: 7 }, { id: 42 }]);
    const saver = jest.fn();
    const oldNum = read(loader).length;
    add({ id: 0, name: 'test prod' }, loader, saver);
    expect(read(loader).length).toBe(oldNum + 1);
    expect(loader).toBeCalledTimes(0);
    expect(saver).toBeCalledTimes(1);
});
There is a "gotcha" here. Since productRepo is imported once, loader is called in the first test but will not be called again in the second test since the first has already changed the products. Thus subsequent tests must take this into account when using non-class packages.
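If you need full isolation instead, one option (a sketch, assuming CommonJS-style requires so resetting the module cache actually takes effect) is to reload the module before every test:

let read, add;

beforeEach(() => {
    jest.resetModules();                         // drop the cached productRepo with its populated products
    ({ read, add } = require('./productRepo'));  // re-import a fresh copy for this test
});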
You should not access private properties or methods anyway.
Instead, you can provide setters and getters for your properties.
For methods, I believe you can break them into private parts (for your actual data source) and public parts that can also be used in tests.
I suggest implementing an initialize method on productRepo.js.
export function init(data) {
    products = data;
}
Then, you can init products with mocked data.
Also, if you can't change the file, you could use the rewire library, which lets you access non-exported functions and variables.
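For the rewire route, a hedged sketch (rewire only works on CommonJS output and cannot rebind compiled ES-module imports, so adjust to your build setup):

const rewire = require('rewire');

it('can read products without touching the real data source', () => {
    const productRepo = rewire('./productRepo');

    // Swap out the private loader; __set__ patches module-level bindings.
    productRepo.__set__('fetchProductsFromSomeDbOrServiceOrWhatever', () => [{ id: 7 }]);

    expect(productRepo.read().length).toBe(1);
});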

Multiple routes with same path - use next route (e.g. feature toggle, a/b testing)

Does anybody know if there is an option to implement multiple routes in NestJS with the same path?
Let's say:
/checkout in BasketControllerNew
/checkout in BasketController
At the first controller, there is a guard which decides whether the user can access /checkout on BasketControllerNew.
Returning false will throw a ForbiddenException, fine.
If I throw a custom exception, the second route will also not be executed.
Filters also seem unable to jump to the next route.
I want to implement feature toggle functionality: in this case, for example, a new checkout process with a completely new controller file.
The new checkout process should be tested by some users, for example, or be enabled and disabled by feature toggles.
My preferred behaviour would be to throw a custom exception like DisabledRouteException or something similar, meaning "please try the next matching route".
Another possible use case could be:
A CMS with a wildcard route which tries to find pages, because pages do not have a specific prefix in their path.
There will be a module which uses routes prefixed with moduleA/.
As long as the module is not enabled for a specific customer, moduleA/* should be handled by the wildcard route, because there can be pages at this path.
Does anybody have an idea how to implement such things?
Thanks a lot for your support and stay healthy.
Daxi
I have found a solution which works for me.
Summary:
Multiple routes with the same path
A guard which throws an exception
A filter which catches the exception and executes the next route
Example code for the controller:
@Controller()
export class TestController {
    @Get()
    @UseGuards(TestGuard)
    getHello(): string {
        return "Hello";
    }

    @Get()
    @UseGuards(TestGuard)
    getHello2(): string {
        return "Hello2";
    }

    @Get()
    getHello3(): string {
        return "Hello3";
    }
}
Example code for guard:
@Injectable()
export class TestGuard implements CanActivate {
    canActivate(
        context: ExecutionContext,
    ): boolean | Promise<boolean> | Observable<boolean> {
        // add logic here to switch between execution or skip
        throw new NotFoundException('Module not active');
    }
}
Example code for filter:
@Catch(NotFoundException)
export class TestFilter implements ExceptionFilter {
    catch(exception: NotFoundException, host: ArgumentsHost) {
        const ctx = host.switchToHttp();
        const next = ctx.getNext();
        next();
    }
}
Enable the filter globally in main.ts:
async function bootstrap() {
    const app = await NestFactory.create(TestModule);
    app.useGlobalFilters(new TestFilter());
    await app.listen(3000);
}
bootstrap();
To get proper handling, the code can be improved by creating a custom exception and filtering only that; currently every 404 will trigger the next route.
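A hedged sketch of that improvement (the exception and filter names are illustrative, not part of the original POC):

import { ArgumentsHost, Catch, ExceptionFilter, HttpException, HttpStatus } from '@nestjs/common';

// Thrown by a guard to mean "skip this handler, try the next matching route".
export class DisabledRouteException extends HttpException {
    constructor() {
        super('Route disabled', HttpStatus.NOT_FOUND);
    }
}

@Catch(DisabledRouteException)
export class DisabledRouteFilter implements ExceptionFilter {
    catch(exception: DisabledRouteException, host: ArgumentsHost) {
        // Hand the request to the next matching Express route; ordinary
        // NotFoundExceptions are no longer affected.
        host.switchToHttp().getNext()();
    }
}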
In this example the guard will always throw the exception, which results in the following behaviour:
the guard for getHello throws the NotFoundException
the filter triggers the next route
the guard for getHello2 also throws the NotFoundException
the filter again triggers the next route
getHello3 is executed (no guard active)
My code is only a small POC, which can and should be improved as described.
It was only a very small quick-and-dirty test, but if you ever need a similar solution, you can now start from the same point.
Kind regards,
Daxi

A better way to inject Typescript dependencies in Node.js?

I'm working on a backend solution in Node.js and Express, using TypeScript. I'm trying to do dependency injection similar to Angular, but lacking the @Injectable() decorator I'm doing this:
Dependency:
export class SomeDependency {
    public someMethod() {
        console.log('Some method is running');
    }
}
Parent:
import { SomeDependency } from './someDependency';

export class Whatever {
    constructor(
        private someDependency = new SomeDependency()
    ) {}

    public doSomething() {
        console.log('Look, I\'m doing something!');
        this.someDependency.someMethod();
    }
}
It works, but it may not be the best way; any suggestions on how to improve it are appreciated.
On the other hand, what do you think: is it better to pull in dependencies like this, in the constructor, or to create a new instance every time (or most of the time)? Like this:
import { SomeDependency } from './someDependency';

export class Whatever {
    public doSomething() {
        console.log('Look, I\'m doing something!');
        new SomeDependency().someMethod();
    }
}
As SomeDependency isn't a singleton, I wonder which one is less efficient: keeping an instance alive in the parent, or creating a new one every time and letting the garbage collector take care of it when the call finishes.
Your use case is absolutely valid. It is always good practice to keep your modules loosely coupled, so that in the future you can switch implementations without touching the main logic; however, JavaScript and TypeScript don't provide a dependency injection mechanism out of the box.
To achieve this you can use a library known as InversifyJS.
Inversify is a powerful and lightweight inversion of control (IoC) container for JavaScript and Node.js. It implements IoC and allows us to inject dependencies, decoupling our code from the implementation details.
It gives you functionality similar to Angular: you can annotate a class with the @injectable() decorator and perform dependency injection.
For reference you can go to: https://www.linkedin.com/pulse/how-use-dependency-injection-nodejs-typescript-projects-ribeiro
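A minimal sketch of what the example above could look like with InversifyJS (assuming inversify and reflect-metadata are installed and experimentalDecorators/emitDecoratorMetadata are enabled in tsconfig.json):

import 'reflect-metadata';
import { Container, injectable, inject } from 'inversify';

@injectable()
class SomeDependency {
    public someMethod() {
        console.log('Some method is running');
    }
}

@injectable()
class Whatever {
    constructor(@inject(SomeDependency) private someDependency: SomeDependency) {}

    public doSomething() {
        console.log('Look, I\'m doing something!');
        this.someDependency.someMethod();
    }
}

// Composition root: bind classes to themselves and resolve the graph.
const container = new Container();
container.bind(SomeDependency).toSelf();
container.bind(Whatever).toSelf();

container.get(Whatever).doSomething();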
You may want to look into Dime. It's a very simple library I made for dependency injection, similar to Angular. There are more details on the GitHub page and the wiki. It's still in early development, so there might be bugs.
Example:
import { ItemsService } from './items-service'; // ItemsService is an interface
import { Inject } from '@coined/dime';

class ItemsWidget {
    @Inject()
    private itemsService: ItemsService;

    render() {
        this.itemsService.getItems().subscribe(items => {
            // ...
        });
    }
}

// Setup
const appPackage = new Package("App", {
    token: "itemsService",
    provideClass: AmazonItemsService // Use any implementation
});

Dime.mountPackages(appPackage);

// Run the application
const widget = new ItemsWidget();
widget.render();

Firebase Firestore onCreate Cloud Function: Object possibly 'undefined' [duplicate]

I'm building a cloud function that will use the Stripe API to process payments. This is within a Firebase project. When I run firebase deploy I get the error "Object is possibly 'undefined'" on this line: const existingSource = customer.sources.data.filter((s) => s.id === source).pop();
I'm not sure how to resolve this.
Here is my xxx.ts where getOrCreateCustomer exists:
/** Read the Stripe customer ID from Firestore, or create a new one if missing */
export const getOrCreateCustomer = async (uid: string) => {
    const user = await getUser(uid);
    const customerId = user && user.stripeCustomerId;

    // if missing customerId, create it
    if (!customerId) {
        return createCustomer(uid);
    }
    else {
        return stripe.customers.retrieve(customerId);
    }
}
Based on the definitions and contents of your functions, TypeScript is unable to infer the return type of getOrCreateCustomer. It is making the assumption that it could return undefined, and its strict mode is calling you out on the fact that you could be referencing a property on an undefined object, which would result in an error at runtime.
What you need to do is declare the return type to be something that can't possibly be undefined, and make sure the code in the body of the function is correct on that guarantee (otherwise you'll get a new error).
If you can't do that (but you really should do that), you might want to instead disable strict mode in your tsconfig.json file, since that is what's enforcing this level of correctness in your code.
I suggest the first option, even if you have to write more lines of code, as it's a better use of TypeScript's typing system.
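A hedged sketch of that first option, declaring a return type that cannot be undefined (Stripe.Customer and Stripe.DeletedCustomer come from the stripe typings, and createCustomer/getUser are the project's own helpers, so adjust the exact types to what they really return):

export const getOrCreateCustomer = async (uid: string): Promise<Stripe.Customer> => {
    const user = await getUser(uid);
    const customerId = user && user.stripeCustomerId;

    // if missing customerId, create it
    if (!customerId) {
        return createCustomer(uid);
    }

    const customer = await stripe.customers.retrieve(customerId);
    // retrieve() can also yield a deleted customer; treat that as "missing".
    if ((customer as Stripe.DeletedCustomer).deleted) {
        return createCustomer(uid);
    }
    return customer as Stripe.Customer;
};

With that in place, the customer value itself can no longer be undefined at the call site; customer.sources may still need its own check, as the next answer shows.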
As @Doug mentioned, but you could also write your logic to make sure that every part of customer.sources.data is not undefined, i.e.:
const { sources } = customer;
if (sources) {
    const { data } = sources;
    if (data) {
        // then filter / etc etc ...
    }
}
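On TypeScript 3.7+ the same guards can be collapsed with optional chaining:

// Equivalent to the nested checks above: the chain short-circuits to undefined
// when sources or data is missing, so strict mode stops complaining here, and
// existingSource itself must then be checked before use.
const existingSource = customer.sources?.data?.filter((s) => s.id === source).pop();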
7 months later, I figured out the best solution.
I simply wrapped the contents of the Firebase callable function in the following if/else statement. It's a bit redundant, but it works.
if (!context.auth) {
    // Throwing an HttpsError so that the client gets the error details.
    throw new functions.https.HttpsError('failed-precondition', 'The function must be called ' +
        'while authenticated.');
} else {
    // ...copy function code here
}
If you don't care about the authentication piece you can simply define the type of context as any.
(data, context:any)
Open tsconfig.json and add "strictNullChecks": false to the angularCompilerOptions object. It worked for me.
{
    ...
    "angularCompilerOptions": {
        "strictNullChecks": false,
        ...
    }
}

Export resolved promise result

Consider I have a.js with the following class:
class Connector {
    constructor(url) {
        this.url = url;
        this.connection = null;
    }

    async connect() {
        this.connection = await someThirdPartyModule.connect(this.url);
        return this;
    }
}

// here I would like to do something like
// export default new Connector().connect();
Then, in b.js, c.js, etc., I want to use the connection from the resolved connect method:
import Connector from 'a.js';
Connector.connection.callSomeMethod(); // here connection already exists after that promise resolved
As far as I am aware it is not possible to do this, but maybe some hack or workaround exists?
So after some tries I found the following:
Exporting the Promise is not a good idea, as on each import we will also get a Promise, even if it is already resolved.
It is better to export a class instance, call the connect method once, and then use the instance in all other files.
Sadly, the promise still moves up to the main entry-point file, where we have to write an async initialization function and wait inside it for the promise to resolve.
I also tried exporting the class with a static create method, but that leaves us with an instance on the far end which cannot be exported to the other files.
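A hedged sketch of that second point (the entry-point file name and the placeholder URL are assumptions):

// a.js
export const connector = new Connector('some://connection-url');

// main.js (entry point)
import { connector } from './a.js';

async function bootstrap() {
    await connector.connect();   // resolve the promise exactly once, up front
    // ...start the server here; other modules can now rely on the connection
}
bootstrap();

// b.js (runs only after bootstrap has awaited connect)
import { connector } from './a.js';
connector.connection.callSomeMethod();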
