What is the need to create .babelrc in a Preact application - preact

I am starting with a Preact application. The documentation says:
Instead of declaring the @jsx pragma in your code, it's best to configure it globally in a .babelrc:
For Babel 5 and prior:
{ "jsxPragma": "h" }
For Babel 6:
{
  "plugins": [
    ["transform-react-jsx", { "pragma": "h" }]
  ]
}
I am new to the Preact world and want to understand why we need to create this file and what a JSX pragma is.

The pragma is a comment (/** @jsx h */ here) placed at the top of a file containing some JSX, telling the JSX transformer which function you want to call to create each element of your virtual DOM. Preact uses hyperscript, which is why you need to use the h pragma. Preact suggests creating this file so Babel takes care of the pragma itself, without you needing to add it to every file. This way you can't forget to put it in a file.
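For example, with the pragma set to h, a tiny JSX snippet (the element and attribute here are just an illustration) gets compiled into plain h() calls:

// JSX source
const App = <div id="app">Hello</div>;

// Roughly what Babel emits with { "pragma": "h" }
const App = h("div", { id: "app" }, "Hello");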
I suggest you read WTF is JSX, which is a fundamental post if you want to know everything about JSX.

Related

What is the CORRECT working flow for typescript, nodejs, npm, esmodule and commonjs?

I'm writing an idle game with TypeScript. I've run into a situation where I need to handle some big-number computation logic on both the client and the server. The client does not support bigint well, but the server does. So, somewhat lazily, I chose to extract the shared logic, compute with JSBI (there's a Babel plugin to convert JSBI to bigint), and export it as a single npm package. The idea is that the client and the server each use it in their own environment, and it works.
So I got that done, and then I hit a problem. The client is an ES module, the server is CommonJS. I ran Babel on the whole package and got some compile problems -- some JSBI usages were not converted because the functions were invoked deeply and across other calls. I made a new package called jsbi-extension and changed all JSBI calls to go through jsbi-extension instead.
It works after Babel runs, but I got new problems. The package should export two different builds, with .cjs and .mjs extensions, and after Babel I got two different .d.ts files. I don't know how to configure package.json to export a different .d.ts for each target, and I think that if I want to make it universal, maybe there should be four exports for one single npm package, like bigint & cjs, bigint & mjs, JSBI & cjs, JSBI & mjs. I want to know if there's some CORRECT working flow for this situation.
The jsbi-extension is here: https://github.com/darklinden/jsbi-extension. In the end I built it to different branches to make it work, which I think is not CORRECT.
Is there any macro or anything else, like type BI = ifdef bigint then bigint else (import JSBI) endif, to define in the .d.ts file?
Or is there any key in package.json to define separate exports? Something like:
{
  "exports": {
    ".": [
      {
        "importerHasImported": "jsbi",
        "import-types": "...",
        "import": "...",
        "require-types": "...",
        "require": "..."
      },
      {
        "importerNotImported": "jsbi",
        "import-types": "...",
        "import": "...",
        "require-types": "...",
        "require": "..."
      }
    ]
  }
}
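For reference, the conditional exports that Node.js and TypeScript actually support distinguish the import and require conditions (each with a nested types entry), but there is no condition based on what the importer has installed; the dist file names below are placeholders, not taken from the question:

{
  "exports": {
    ".": {
      "import": {
        "types": "./dist/index.d.mts",
        "default": "./dist/index.mjs"
      },
      "require": {
        "types": "./dist/index.d.cts",
        "default": "./dist/index.cjs"
      }
    }
  }
}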

Importing typescript definitions for my bundled javascript library

I have a library I wrote in TypeScript. It contains multiple files and an index.ts file that contains all the exports of the library.
I used webpack to compile the entire library into a single index.js file, but I'm having trouble importing it WITH type definitions.
Let's say this is my library:
src
  -- index.ts
  -- items-dal.ts
  -- items-bl.ts
output
  -- index.js
  -- index.d.ts
  -- items-dal.d.ts
  -- items-bl.d.ts
webpack.config
ts.config
package.json
So I copied the output folder to my other project, but when I try to create a class that inherits from one of my library's classes I get an error:
// user-dal.ts
const { ItemsDAL } = require("./output");

class UsersDAL extends ItemsDAL {
    constructor() {
        super("users");
    }
}

export default new UsersDAL();

// usage
import usersDal from "./users-dal.ts";
usersDal.getAll() // <-- Property "getAll" doesn't exist on type usersDal
I know I can work around this by using require(), but I'd prefer having the actual typings in my other projects.
The reason I'm doing this is that I'm obfuscating the index.js file, but I don't mind exposing the names of the functions it contains. It may sound counter-productive, but it provides enough security for my needs.
Is there a way to make TypeScript detect the .d.ts files? (Or any other way to have obfuscated code with typings?)
Just add a "types" declaration to your package.json.
{
  ...
  "main": "./output/index.js",
  "types": "./output/index.d.ts",
  ...
}

Require.js Optimizer incorrectly ordering shim dependencies

I have a web application that uses Require in order to load dependencies. I have a set of JS libraries that are included using the Require config.shim object.
Two such example libraries are:
require.config({
    shim: {
        "libs/leaflet": {
            exports: "L"
        },
        "libs/leaflet-dvf": {
            deps: ["libs/leaflet"],
            exports: "L"
        }
    }
});
The second library, leaflet-dvf, requires the first, leaflet. The second is a plugin to the first that depends on the global-scope variable L that the first library defines.
When I run the application using Require normally, everything works fine. I can include either library from the shim, and everything works great. No problems.
The problem comes when I run this code through the Require r.js Optimizer. The Optimizer, when it builds the single optimized JS file, will incorrectly order the dependencies. In the built file, the leaflet-dvf code will come before the leaflet code. This causes a JS runtime error because the dependent plugin cannot find the L global-scope variable that it requires.
My build config looks like:
({
    baseUrl: "../js",
    paths: {
        "requireLib": "../js/libs/require"
    },
    include: ["requireLib"],
    name: "Main",
    out: "bin/Main-built.js",
    optimize: "none",
    wrapShim: true
})
When I run the Optimizer, using Rhino, it builds my output file. In the Main-built.js file, the code for the plugin will come before the required library. This causes an L undefined error.
How do I get the Optimizer to respect the dependency order of my Shims, in order to properly order the library files in my Optimized JS file?
I had a similar problem a while back with Knockout extensions, and shim didn't work correctly. This is how we solved it.
Create a module called leafletLib:
define(["libs/leaflet", "libs/leaflet-dvf"], function (leaflet, dvf) {
    return leaflet;
});
leafletLib has the main library and all of the extensions. In modules that have leaflet or leaflet-dvf as a dependency, you require leafletLib instead. It is kind of hacky, but it might work for you.
define(["leafletLib"], function (leafletLib) {});

Why do I need to add a "shim" for AngularJS when using Require.js?

I have seen several examples that use this:
main.js
/*global require*/
'use strict';

require.config({
    paths: {
        angular: './angular',
        app: './Content/app',
        ngAnimate: './Scripts/angular-animate',
        uiRouter: './Scripts/angular-ui-router'
    },
    shim: {
        angular: {
            exports: 'angular'
        }
    }
});

require(['angular', 'app'], function (angular) {
    angular.bootstrap(document, ['app']);
});
Can someone explain to me why the shim is needed? My application uses other modules such as angular-ui-router, jQuery, etc. Do I need to do something similar and add a shim for these?
The rule is pretty simple: if a library/script/package/plugin is AMD-aware, then you don't need a shim. (Actually, you must not use a shim for it.) If it is not AMD-aware, then you need a shim.
A library/etc is AMD-aware if it detects that an AMD loader is present and calls define to make itself known to the loader.
jQuery from about 1.8 onwards has not needed a shim because it calls define. Angular, on the other hand, does not call define.
To know whether a specific piece of code needs a shim, you can read its documentation, or, if the documentation is not clear on this, check the source code for a call to define. For instance, jQuery 1.11.0 has this code:
// Register as a named AMD module, since jQuery can be concatenated with other
// files that may use define, but not via a proper concatenation script that
// understands anonymous AMD modules. A named AMD is safest and most robust
// way to register. Lowercase jquery is used because AMD module names are
// derived from file names, and jQuery is normally delivered in a lowercase
// file name. Do this after creating the global so that if an AMD module wants
// to call noConflict to hide this version of jQuery, it will work.
if ( typeof define === "function" && define.amd ) {
    define( "jquery", [], function() {
        return jQuery;
    });
}
What it looks like will vary from one case to another, but the basic thing you want to look for is the check that define exists, is a function, and has the amd property set, followed by the call to define.
(Note that jQuery is a special case where they decided to hard code the name of the module in the define call (first parameter: jquery). Generally the name of the module won't be present in the define call but will be left for RequireJS to infer on the basis of the file name.)
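As an illustration, since Angular 1.x, angular-animate, and angular-ui-router do not call define, a shim covering the paths declared in the config above might look roughly like this; listing angular as a dep for the two plugins is an assumption based on the fact that both extend the angular global:

shim: {
    angular: {
        exports: 'angular'
    },
    ngAnimate: {
        deps: ['angular']  // plugin: assumed to need angular loaded first
    },
    uiRouter: {
        deps: ['angular']  // plugin: assumed to need angular loaded first
    }
}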

Do RequireJS modules "inherit" dependencies?

If I have a module that requires an application namespace, e.g.:
define(["app"], function(App){
[...]
});
... and the namespace requires libraries used by all of my modules, e.g.:
define(["jquery", "underscore", "backbone"], function($, _, Backbone){
[...]
});
... then all of my modules have access to the libraries required by the namespace, i.e. I can use $, _, and Backbone.
I like this behavior because I can avoid being repetitious, but I suspect that I'm cheating somehow, and that I should require libraries in each module.
Can anyone set me straight here?
Yeah, that's kinda hacky. You only have access to jQuery, Underscore, and Backbone because they're also defined on the global scope. Backbone and Underscore aren't real AMD modules; they have to use a shim config. jQuery declares itself both on the global scope and as an AMD module, so it works everywhere.
So, yes, it works like that, but it's not optimal. Real AMD modules (non-shimmed) won't work this way, as they need to be passed in as arguments of the define function, and you won't be able to pull out only one module to test it in a separate environment, etc. This way, you cannot load different versions of a script to work with different modules/app sections/pages.
The goal of AMD is to bring modularity to your code, so every module declares its own dependencies and works out of the box without relying on the global scope (which is a good thing to prevent name collisions and conflicts with third-party code or other devs working on the same project).
If you find it redundant to redeclare your base dependencies every time, create a boilerplate file that you just copy/paste when creating another module, as sketched below (it's better than nothing). And maybe some command-line tools can build the AMD module wrapper for you.
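Such a boilerplate might look roughly like this (the module body is only a placeholder):

define(["jquery", "underscore", "backbone"], function ($, _, Backbone) {
    // Each module declares its own dependencies and receives them as
    // arguments instead of reading the globals.
    var MyModule = {};
    return MyModule;
});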
Soooo, yes it works, but it won't scale if your project ever gets bigger or needs to be updated piece by piece.
Hope this helps!
Good news for the above answer: Underscore 1.6.0 is now wrapped as an AMD module :)
See "lib.chartjs" below for exporting globals from non-AMD-wrapped, "shimmed" JavaScript libraries:
requirejs.config({
    paths: {
        "moment": "PATH_TO/js/moment/2.5.0/moment.min",
        "underscore": "PATH_TO/js/underscore/1.6.0/underscore",
        "jquery": "PATH_TO/js/jquery/1.10.2/jquery.min",
        "lib.jssignals": "PATH_TO/js/jssignals/1.0.0-268/signals.min",
        // WORKAROUND: jQuery plugins + shims
        "lib.jquery.address": "PATH_TO/js/jqueryaddress/1.6/jquery-address",
        "lib.jquery.bootstrap": "PATH_TO/js/bootstrap/3.0.3/bootstrap",
        "lib.chartjs": "PATH_TO/js/chartjs/0.2/Chart.min"
    },
    shim: {
        "lib.jquery.address": {deps: ["jquery"]},
        "lib.jquery.bootstrap": {deps: ["jquery"]},
        "lib.chartjs": {deps: ["jquery"], exports: "Chart"}
    }
});
