Type hints of Flow not stripped by Babel - node.js

I have a React.JS project which uses a custom 'theme' with UI components.
This theme also provides build scripts (webpack config, babel configs, etc.).
I want to start using Flow in this project.
I installed the needed npm packages and added flow to Babel's presets, then I added props = {mytestprop: string} to one of my React classes.
Webpack compiled my code successfully, but the type hints were not stripped! Of course, the browser was not able to execute this code - when I try to run it, it raises ReferenceError: string is not defined.
The current list of presets from .babelrc is: ["es2015", "react", "stage-2", "flow"]. I'm sure that this is the actual list used by Babel, because if I delete any of the first 3 presets, compilation fails.
Do you have any ideas on what could lead to this behavior when stripping Flow types?

It's not that type annotations are not being stripped. It's that { mytestprop: string } is not a valid type annotation on the right-hand side of an assignment because it clashes with the syntax for defining an object.
Specifically, when Flow's parser sees the statement { mytestprop: string } it will interpret this as an attempt to create an object with a field named mytestprop with its value set to the value of the variable string, so it will leave the statement alone as it is, and you'll get the error you've seen in the browser.
The correct way to type object declarations is to type the left-hand side of the declaration.
For instance,
let myProps: { myTestProp: string } = { myTestProp: "testProp" };
If you aren't declaring your props separately, you could declare a custom type:
type myPropType = { myTestProp: string }
// ...
const myComponent = (props: myPropType) => //render your component
Since the type statement is exclusive to Flow and not a valid JavaScript statement, it will be stripped correctly.
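For the React class in the question, the usual Flow approach is to annotate the props on the class itself. A minimal sketch (the component name and render body are hypothetical illustrations):

import * as React from 'react';

type Props = { mytestprop: string };

class MyComponent extends React.Component<Props> {
  render() {
    return <span>{this.props.mytestprop}</span>;
  }
}

Since type Props and the <Props> type parameter appear only in type positions, the flow preset strips them cleanly.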

Order of keys in package.json exports

I believe I understand the basic functioning of the exports key in package.json files:
// package.json
{
  "exports": {
    ".": {
      // used by TypeScript
      "types": "./file_with_type_defs.d.ts",
      // used by ESM resolution
      "import": "./file_to_import.mjs",
      // used by CJS resolution
      "require": "./file_to_require.cjs",
      // used by ...others?
      "default": "./file_one_more.js"
    }
  }
}
Question: Does the order of the "types", "import", "require", and "default" keys matter? Normally I would think there's no way it could, since JSON object keys are unordered. From json.org:
An object is an unordered set of name/value pairs. An object begins with { ...
But the TypeScript documentation says that "types" must come first:
Entry-point for TypeScript resolution - must occur first!
"types": "./types/index.d.ts"
and the Node.js documentation says "default" should come last:
"default" - the generic fallback that always matches. Can be a CommonJS or ES module file. This condition should always come last.
So... Does the order of the exports keys matter? If not, what do the Node.js and TypeScript documentation mean when they talk about "first" and "last"?
Having swapped the order of "types" with other keys, its order seems not to matter.
Webpack, which also uses the exports field, explains it as follows:
Notes about ordering
In an object where each key is a condition, order of properties is significant. Conditions are handled in the order they are specified.
Rather than seeing this as a plain object, consider it as an if-else chain:
let file;
if (platform_supports('types')) {
  file = "./file_with_type_defs.d.ts";
} else if (platform_supports('import')) {
  file = "./file_to_import.mjs";
} else if (platform_supports('require')) {
  file = "./file_to_require.cjs";
} else if (true) { // default
  file = "./file_one_more.js";
}
If you were to swap the order, it might be like this:
let file;
if (true) { // default
  file = "./file_one_more.js";
} else if (platform_supports('types')) {
  file = "./file_with_type_defs.d.ts";
} else if (platform_supports('import')) {
  file = "./file_to_import.mjs";
} else if (platform_supports('require')) {
  file = "./file_to_require.cjs";
}
Even though TypeScript understands .d.ts files, it would use file_one_more.js since it matches first.
I have tried swapping the order of "types" with other keys. It seems not to matter.
It might be that TypeScript prioritizes the types condition over others. However, I'd stick with what the docs advise: "must occur first" is not "ought to occur first", after all.
As a side note: historically, JavaScript object keys were unordered, i.e. the order was not guaranteed. In practice browsers did preserve the key order, and this behavior was standardized in ES2015: non-integer keys are preserved in insertion order.
The JSON standard doesn't make the same promise, as there are many implementations of JSON decoders in different languages, and JSON predates the ES2015 standard.
JSON objects are an "unordered set" in the sense that keys do not have to follow any particular order (i.e. they never have to be sorted), not in the sense that they carry no order at all.
The parsed JS object DOES preserve the key order found in the JSON text.
Depending on whether the consuming code iterates with for (let k in json) or checks properties directly with if (json.types), the behaviour may differ.
So, whatever the reason, it's recommended to order the keys in the way some parsers may expect.
In your case I'd recommend trying to swap them, seeing that nothing breaks, and swapping them back.
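A quick way to observe the ES2015 ordering guarantee mentioned above (plain JavaScript; the keys are the ones from the question):

// String keys come back from JSON.parse in insertion order (ES2015+)
const parsed = JSON.parse('{"types": "a", "import": "b", "default": "c"}');
console.log(Object.keys(parsed)); // ["types", "import", "default"]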

Enum attribute in lit/lit-element

We are trying to build a component with a property variant that should only be set to "primary" or "secondary" (enum). Currently, we are just declaring the attribute as a String, but we were wondering if there is a better way for handling enums? For example, should we validate somehow that the current value is part of the enum? Should we throw an error if not?
I asked this question on Slack, and the answers I got lean towards declaring the property as a String and using hasChanged() to display a warning in the console if the property value is invalid.
Standard HTML elements accept any string as attribute values and don't throw exceptions, so web components should probably behave the same way.
This all sounds reasonable to me.
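A minimal sketch of that approach, assuming current Lit imports and a TypeScript build with decorators enabled (the element, its tag name, and the warning text are hypothetical):

import { LitElement, html } from 'lit';
import { property } from 'lit/decorators.js';

const VARIANTS = ['primary', 'secondary'];

class MyButton extends LitElement {
  @property({
    hasChanged(newVal: string, oldVal: string) {
      // Warn on values outside the enum, but accept them like a standard element would
      if (!VARIANTS.includes(newVal)) {
        console.warn(`my-button: unexpected variant "${newVal}"`);
      }
      return newVal !== oldVal; // still trigger an update on any change
    },
  })
  variant = 'primary';

  render() {
    return html`<button class=${this.variant}><slot></slot></button>`;
  }
}
customElements.define('my-button', MyButton);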
If you're using TypeScript I'd recommend just using strings. You can use export type MyEnum = 'primary' | 'secondary' to declare it and then use @property() fooBar: MyEnum to get build-time checking. You can use @ts-check to do this in plain JS with @type {MyEnum} too.
This works well if the enums are component options, or if they map to server-side enums that will get validated again.
However, if you want to validate user input against the enum, or loop through its values a lot, this works less well. As the JS runs, it has no visibility of the type. You need an object dictionary, something like:
const MyEnum = Object.freeze({
  primary: 'primary',
  secondary: 'secondary'
});
// Enforce the type in TS
let value: keyof typeof MyEnum;
// Validate
const validated = MyEnum[input.toLowerCase()];
// Loop
for (const enumVal of Object.keys(MyEnum)) ...
// Or convert to a different value type
const MyEnum = Object.freeze({
  primary: 1,
  secondary: 2
});
These are somewhat idiosyncratic. Again, if you're using TypeScript, it has an enum keyword that compiles to something like this, and I'd use that rather than rolling your own. Strings are the better option unless you need to validate, loop, or convert the values.
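For reference, a rough TypeScript string-enum equivalent of the dictionary above (names are illustrative):

enum Variant {
  Primary = 'primary',
  Secondary = 'secondary',
}

// Validate: string-enum members can be checked by value
function isVariant(v: string): v is Variant {
  return (Object.values(Variant) as string[]).includes(v);
}

// Loop
for (const variant of Object.values(Variant)) {
  console.log(variant); // "primary", "secondary"
}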

TS: Cannot invoke an expression whose type lacks a call signature when defined dynamically, but it works

I'm still quite new to TypeScript, so please be gentle if I'm doing something that makes no sense for this technology!
The problem I'm trying to solve is having a dynamic way to define how my application errors should be structured, while leaving users the ability to enrich the messages.
So I tried to create this logic in a module that could easily be extended from the application, but I'm currently facing this problem:
Error:(35, 18) TS2349: Cannot invoke an expression whose type lacks a call signature. Type 'ErrorMessage' has no compatible call signatures.
What I thought was a good idea (but please tell me if I'm wrong) was to use a register and a map, so I'd have the possibility to extend this mapping whenever I want. So I created my ErrorMessage interface to look like the following:
export interface ErrorMessage {
  enrichment1: string;
  enrichment2: string;
  originalErrorMessage?: string;
  toString: () => string;
}
and a register for these, called ErrorResponseRegister, as follows:
export enum defaultErrors {
  ExceptionA = 'ExceptionA',
  ExceptionB = 'ExceptionB',
}

export class ErrorResponseRegister {
  private mapping: Map<string, ErrorMessage>;

  constructor() {
    this.mapping = new Map()
      .set(defaultErrors.ExceptionA, exceptionAErrorMessage)
      .set(defaultErrors.ExceptionB, exceptionBErrorMessage);
  }
}
So, in the end, every ErrorMessage function should look like:
export function exceptionAErrorMessage(originalErrorMessage?: string): ErrorMessage {
  return {
    enrichment1: "Something happened",
    enrichment2: "in the application core",
    originalErrorMessage: originalErrorMessage,
    toString(): string {
      return `${this.enrichment1} ${this.enrichment2}. Original error message: ${originalErrorMessage}`;
    },
  };
}
Please note I haven't used classes for these, as they don't really need to be instantiated,
and I can have a bunch of them where the toString() method can vary. I just want to enforce that the errors have an enrichment1 and enrichment2 that highlight the problem in a better way for non-technical people.
So, now, back to the code. When I use exceptionAErrorMessage statically, I can't see any problem:
console.log(exceptionAErrorMessage(originalErrorMessage).toString())
But when I use it dynamically, through the map defined in ErrorResponseRegister, something weird happens:
// In ErrorResponseRegister
public buildFor(errorType: string, originalErrorMessage?: string): string {
  const errorMessageBuilder = this.mapping.get(errorType);
  if (errorMessageBuilder) {
    return errorMessageBuilder(originalErrorMessage).toString();
  }
  return "undefined - do something else";
}
The code works as expected, the error returned is in the right format, so the toString function is executed correctly.
BUT, the following error appears in the IDE:
Error:(32, 18) TS2349: Cannot invoke an expression whose type lacks a call signature. Type 'ErrorMessage' has no compatible call signatures.
The line that causes the problem is
errorMessageBuilder(originalErrorMessage).toString()
Can someone help me to understand what I'm doing wrong?
It looks like your problem is that you've mistyped mapping... it doesn't hold ErrorMessage values; it holds (x?: string) => ErrorMessage values:
private mapping: Map<string, (x?: string) => ErrorMessage>;
What's unfortunate is that you initialize this variable via new Map().set(...) instead of using the iterable constructor argument.
The former returns a Map<any, any>, which is trivially assignable to mapping despite the mistyping. That is, you ran smack into this known issue: the standard library's typing for the no-argument Map constructor produces Map<any, any>, which suppresses all kinds of otherwise useful error messages. Perhaps that will be fixed one day, but for now I'd suggest using the iterable constructor argument instead, whose type signature will infer reasonable types for the keys/values:
constructor() {
  this.mapping = new Map([
    [defaultErrors.ExceptionA, exceptionAErrorMessage],
    [defaultErrors.ExceptionB, exceptionBErrorMessage]
  ]); // inferred as Map<defaultErrors, (orig?: string) => ErrorMessage>
}
If you had done so, it would have flagged the assignment as an error with your original typing for mapping (e.g., Type 'Map<defaultErrors, (originalErrorMessage?: string | undefined) => ErrorMessage>' is not assignable to type 'Map<string, ErrorMessage>'.) Oh well!
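With those fixes in place, a quick usage sketch (the output follows the toString() format from the question; the error text is hypothetical):

const register = new ErrorResponseRegister();
// mapping is now Map<defaultErrors, (orig?: string) => ErrorMessage>,
// so the guarded call inside buildFor type-checks
console.log(register.buildFor(defaultErrors.ExceptionA, 'connection refused'));
// -> "Something happened in the application core. Original error message: connection refused"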
Once you make those changes, things should behave more reasonably for you. Hope that helps; good luck!

Missing type annotation error in Flow JS

I am using Flow for static type checking in my project, and I am getting errors while checking types.
Here are the steps I followed while setting up Flow in the project.
npm i flow-bin -SD
Added commands in package.json:
"scripts": {
"flow": "flow",
"flow:check": "flow check ./src/"
}
Now, while running npm run flow:check, I am getting this error:
Missing type annotation for fn.
6| module.exports = function( ds, schema, fn ) {
^^
That's because Flow needs you to tell it the type signature of that function.
Now, if that's code you don't control (code inside node_modules, for example), I suggest excluding it from Flow's typechecking; most libraries don't ship/bundle type definitions for Flow (the flow-typed repo might have them).
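For instance, a minimal .flowconfig [ignore] entry (the package name is hypothetical; ignoring all of node_modules can break module resolution, so targeting the offending package is safer):

[ignore]
.*/node_modules/some-untyped-lib/.*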
If that is code that you control (it's part of your app's code), then just add the types. For example (these are placeholder types; replace them with the correct ones):
module.exports = function( ds: string, schema: number, fn: (string) => boolean ): boolean {
// ...
};
In this example, the ds parameter has to be a string, schema has to be a number, and fn has to be a function that accepts a string as its only parameter and returns a boolean when called. The return type of the exported function is a boolean as well.

When I use definition instead of class in puppet, what's the best practice for parameters?

I realise that it's generally a good idea to create params.pp in the module, with a modulename::params class, and inherit that in the modulename class to handle parameters in a separate file. How do I do that if I am creating a definition instead of a class?
Just to clarify, I'm using a definition to be able to install multiple versions of the same application on the server.
Good question. Since there is no inheritance available for defined types in Puppet, the params.pp pattern cannot be reproduced in exactly the same way for defined types as for classes. There is another way, though.
The following code outputs 'hello world' via the Foo['bar'] defined type:
class params {
  $msg = 'hello world'
}

define foo($msg = $params::msg) {
  notify { $msg: }
}

foo { 'bar': }

include params
Now, for the above to function, it is necessary for params to be included. Otherwise the Puppet parser will complain that the class params has not been evaluated, and therefore the $params::msg variable cannot be resolved.
It is not necessary to provide ordering between the inclusion of params and the declaration of bar, since in Puppet classes are always evaluated before defined types. If this were not so, the above would likely hit the same evaluation problem and you would have to write:
foo { 'bar':
  require => Class['params'] # <- not necessary
}
include params
So for this to work in a module foo, you can simply add a params class as you are used to, and start your init.pp with:
include foo::params
define foo($x = $foo::params::x, $y = $foo::params::y, ...)
One important note
Before you happily proceed with the params.pp pattern, I advise you to read this blog post: the problem with params.pp
