babel-core 'transform' function cannot find plugin - node.js

I have a global node module that uses babel-core's transform function.
I have no .babelrc at the root of my module.
It takes a file and basically just uses transform to 'compile' it:
const { transformSync } = require('@babel/core');

const result = transformSync(content, {
  filename: src,
});
There is a .babelrc file alongside the said file, and it is indeed found:
{
  "presets": ["@babel/preset-env"]
}
but it complains about not finding '@babel/preset-env', which makes sense, because the preset is installed alongside my module and not alongside the file/.babelrc being transpiled.
I've tried many options in https://babeljs.io/docs/en/options but still can't make it work.
How can I configure transform so it gets plugins from my module while loading the Babel configuration from the folder of the transpiled file?

By design, Babel's plugin loader searches for plugins relative to the config file that references them, or uses the cwd for plugins passed directly in the transformSync options. Control of that isn't exposed to utilities calling Babel.
Changing those semantics would mean that a Babel config file would vary in behavior based on the tool that was loading it, which would be very inconsistent for users, especially considering that one of the main benefits of having the config file format is that config can easily be shared across multiple tools calling Babel, for instance one for tests and one for bundling.
If you want users to be able to customize your tool, it sounds like what you may actually want is your own entirely separate config file for your tool, so you can define whatever semantics you want for that.
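To illustrate that last point with a minimal sketch (the tool-level preset list is hypothetical, and content and src are the variables from the question; only documented Babel options are used): the tool can resolve preset names against its own installation with require.resolve and skip the per-file config lookup entirely.

const { transformSync } = require('@babel/core');

// Presets the tool itself ships with (hypothetical tool-level config).
const toolPresets = ['@babel/preset-env'];

const result = transformSync(content, {
  filename: src,
  babelrc: false,    // ignore the .babelrc next to the transpiled file
  configFile: false, // skip the babel.config.js lookup as well
  // Resolve preset names against this module's own node_modules.
  presets: toolPresets.map((name) => require.resolve(name)),
});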

Related

NodeJS import file with the same name as a folder

I have the following file structure: ([] are folders)
[MySQL]
  ConnectionPool.ts
  Connection.ts
MySQL.ts
In my code, I am using typescript to develop the app, which will be built to javascript for the production version. The development version is tested directly from the uncompiled typescript files, using babel with the following configurations:
{
  "presets": [
    ["@babel/preset-env", {
      "targets": {
        "node": "current"
      }
    }],
    "@babel/preset-typescript"
  ],
  "plugins": [
    "@babel/plugin-transform-runtime",
    [
      "module-resolver",
      {
        "alias": {
          ... list of some aliases
        }
      }
    ]
  ]
}
My problem is the following: I do most of my imports like this:
import ConnectionPool from 'MySQL/ConnectionPool'
which works for me: when I run my dev code, the compiler correctly resolves the file extension to .ts, and in the built version it correctly resolves it to .js.
But if I want to import my MySQL.ts file, I can't do it this way, as I will get the following error: Error: Cannot find module './'. If I specify the file extension in my import statement, everything works correctly, but then, when I build my code, there will be no more .ts files in there, so I will get errors there.
Strangely, on the frontend, where I use webpack, I get no complaints about importing files without an extension whose name is the same as a folder's name in the same directory. What solutions do I have for this issue that do not revolve around renaming my files or folders in the structure?
Rename your file MySQL.ts to index.ts and move it into the folder MySQL
In the meantime, I resolved my problem by explicitly specifying .ts as the file extension, and, when building the code, using babel-plugin-transform-rename-import to replace the .ts extensions with .js.
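For reference, the rename step in the production build config might look roughly like this (a sketch only; the original/replacement option names follow my reading of the babel-plugin-transform-rename-import README and the regex is illustrative, so verify against the plugin's docs):

{
  "plugins": [
    ["transform-rename-import", {
      "original": "^(.+?)\\.ts$",
      "replacement": "$1.js"
    }]
  ]
}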
I won't accept this as the correct answer, as this is too hacky for me, so if a better solution comes up, I am still open for it.
I don't believe what you want to do is possible: node will automatically include either the file or the directory, but if both of them are called the same thing, node can't tell which one you mean and will always prefer one over the other.
Using import Example from 'example' will have node search either for any files called example.js or directories called example with an index.js and it will only load one of them.
The reason webpack does it correctly is that webpack will never bundle a "directory": it will take what you've specified as an import and try all of the configured resolve extensions until it finds one, and then it will import that file, so you'll never have files conflicting with directories in webpack.
I thought maybe a solution would be to use aliasing in your typescript config so that you can use import Example from '@example' and differentiate between your directories and files that way. Then you could also do import Example from 'example' if you just want to load the file itself. But even then I don't think that will work, because once it gets compiled to javascript you'll have the same issue as before with conflicting paths.
While I understand being a little bit obsessed with naming conventions and small things, I really don't think you should be storing "mysql" outside of the "mysql" directory, for two reasons. The first reason is that it's part of that directory; that's what that directory is for, it contains the "mysql" stuff, so why would you want to store it outside? Secondly, using an "index.ts" communicates something to other developers: when I open up a new project or directory I'm immediately looking for something called "index.ts" or similar, otherwise I have no idea where to even begin. Using "index.ts" is a good way to communicate to anyone that "this file right here is the one you're looking for, everything starts here". That being said, just call it index.ts and store it where it belongs.
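To make that concrete, the restructured tree and imports would look like this (illustrative only, relying on the module-resolver aliases already configured in the question):

[MySQL]
  index.ts
  ConnectionPool.ts
  Connection.ts

import MySQL from 'MySQL';                          // resolves to MySQL/index.ts
import ConnectionPool from 'MySQL/ConnectionPool';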
You can try in your tsconfig.json to set moduleResolution to the classic strategy:
{
  "compilerOptions": {
    "moduleResolution": "classic"
  }
}
Unfortunately, looking at the code, this seems to be impossible in your setup:
import is handled by module-resolver
module-resolver calls the NPM package resolve
resolve.sync checks for the no-extension case before testing extensions.
Given that most systems are going to follow Node module resolution, I think you're out of luck, without some manual aliases or moving files around.
FWIW, NodeJS says that extensions in import are mandatory.

baseUrl and paths in typescript - how

I've started an application in typescript. I've learned about some features of TS that looked like something that could help me; however, it doesn't work like I would expect. The documentation of TS says something about Path mapping in the Module resolution chapter. This path mapping can, if I understand it correctly, save me a few or even a lot of double-dots in imports. So I have created an "inc" directory with one file (at this time), that will be included in multiple files in multiple directories. Into tsconfig I've written the following:
"baseUrl": ".",
"paths": {
"#inc/*": [ "inc/*" ]
}
Now, I would expect that using import { X } from "@inc/somefile" (even from a file resting somewhere deep in the folder tree) would result in importing the export X from ./inc/somefile.ts (or .js, when running).
However, the compiler/transpiler leaves the import statement intact, so when I try to run this code using node.js, it dies because there is no @inc/somefile - node doesn't read tsconfig and tsc doesn't create any mapping functions.
I can, of course, read and parse the path element by hand in some kind of require wrapper, but I believe there is either something I'm doing wrong or a better way to achieve this.
Thank you for your replies.
Remember that
if you then try to execute the compiled files with node (or ts-node), it will only look in the node_modules folders all the way up to the root of the filesystem and thus will not find the modules specified by paths in tsconfig. -- from the tsconfig-paths readme.
In conclusion, adding tsconfig-paths directly to the dependencies list and loading the tsconfig file at run time could be a quick solution.
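Concretely, per the tsconfig-paths README, the mapping can be loaded at runtime by requiring the register entry point at the top of the entry file (or via node's -r flag); the file names below are illustrative, and baseUrl/paths may need adjusting for compiled output:

// entry file, e.g. the compiled dist/index.js (illustrative path)
require('tsconfig-paths/register'); // applies baseUrl/paths from tsconfig.json

// later requires/imports can now use the mapped prefix
const { X } = require('@inc/somefile');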

Intellij IDEA resolve local require() path [ node.js ]

I'm trying to avoid relative require() calls in my express setup. I'd also like to avoid placing my code in the node_modules folder. In short, I'm trying to implement any of the methods described in this gist.
Any of those solutions will work fine for executing code with node or npm. However, I'm trying to find a solution that will also be supported by IntelliJ IDEA's code resolver, i.e. trying to make sure "go to declaration" and autocomplete hinting work.
I've tried the following:
Setting NODE_PATH in the run configuration.
Using a global prefix, i.e. require( global.__base + "mylib").
Adding a symlinked folder to node_modules/.
Adding a symlink from a lib/ folder to node_modules/lib/ does work, but comes with two caveats:
Changes to the source files aren't picked up automatically, so I have to manually "synchronize" node_modules/lib, and
When "going to declaration", IntelliJ (of course) opens node_modules/lib/mylib instead of lib/mylib. This can lead to confusion as the actual file and the symlinked file can be open in separate windows.
Instead of a different way to require local paths (all these methods do work with node after all), I'd be happy with a way to hint to IDEA that it should search the lib/ folder for sources.
So, I realised that if you add a library through Project Structure > Libraries, it won't actually be enabled.
Instead, go to Preferences > Languages & Frameworks > Javascript > Libraries and add a new library. Set the framework type to node_modules, Visibility to Project and add your lib folder.
After adding it, make sure the Enabled checkbox is checked.
That's it, Intellij can now resolve your require('mylib') paths.
Use whatever method from the gist mentioned in the question to actually get node to resolve the paths.

How to get grunt-closure-compiler to apply minimization to each file separately in a directory

Is there a way I can get grunt-closure-compiler to apply minimization to each file separately in a directory (overwriting the original) instead of producing a single file as the output? If I can't overwrite the originals, I am happy to place the output files in a separate output directory.
https://github.com/gmarty/grunt-closure-compiler
Normally the procedure would be like this, producing a single file:
grunt.initConfig({
  'closure-compiler': {
    frontend: {
      closurePath: '/src/to/closure-compiler',
      js: 'static/src/frontend.js',
      jsOutputFile: 'static/js/frontend.min.js',
      maxBuffer: 500,
      options: {
        compilation_level: 'ADVANCED_OPTIMIZATIONS',
        language_in: 'ECMASCRIPT5_STRICT'
      }
    }
  }
});
You can use the module option of the Closure Compiler to produce multiple output files. You would have to list each JavaScript file as its own module, so if you have many JavaScript files this could be pretty tedious.
The module option is not very well documented, but see the posts below to see how it works:
Using the Module Option in Closure Compiler to Create Multiple Output Files
How do I split my javascript into modules using Google's Closure Compiler?
As the other answer suggests, you need the modules option. However grunt-closure-compiler doesn't actually support this.
There's a fork of it which supports modules. It doesn't use the standard grunt file config so you can't use globbing patterns to get it to take all of the files in a folder. I've gotten around this by writing a grunt task to create the modules object and pass it into the config for the closure-compiler task.
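If the modules route feels too heavy, another option (my own sketch, not from the answers above; it uses only documented grunt APIs, and the paths mirror the question's illustrative config) is to generate one plain closure-compiler target per source file, so each file is minified separately:

module.exports = function (grunt) {
  // Build one closure-compiler target per file in static/src, so each file
  // is minified on its own into static/js/<name>.min.js.
  var targets = {};
  grunt.file.expand('static/src/*.js').forEach(function (src) {
    var name = src.replace(/^static\/src\//, '').replace(/\.js$/, '');
    targets[name] = {
      closurePath: '/src/to/closure-compiler',
      js: src,
      jsOutputFile: 'static/js/' + name + '.min.js',
      maxBuffer: 500,
      options: {
        compilation_level: 'ADVANCED_OPTIMIZATIONS',
        language_in: 'ECMASCRIPT5_STRICT'
      }
    };
  });

  grunt.initConfig({ 'closure-compiler': targets });
  grunt.loadNpmTasks('grunt-closure-compiler');
};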

How to use AMD code completion with webstorm and requirejs?

I have something like this
define(function(require) {
  var Router = require('./router');
  var Backbone = require('backbone');
  var Log = require('log');
  ...
Apparently Webstorm is meant to support AMD modules but I can't get it to work; instead I get a massive list of properties from every .js file in the project.
Has anyone had any luck getting Webstorm code completion / refactoring with requirejs modules?
Update: I was able to get it working if I follow the following construct
define(['backbone', './router', './log'], function(Backbone, Router, Log) {
however, all paths have to be relative. This is impractical for a path that is configured in require.config, so Backbone does not have code completion.
requirejs.config({
  baseUrl: 'js',
  paths: {
    'backbone' : '../bower_components/backbone/backbone-min',
    ...
Plus, the above syntax becomes ugly when there are many dependencies...
Update 2
The above does not work if you change the directory; for example, the Log below does not get code completion:
define(['backbone', './router', '../utils/log'], function(Backbone, Router, Log) {
As commented above, support for AMD and CommonJS modules is on the roadmap for WebStorm 8. The Early Access Program that started recently offers a preview of the AngularJS and Spy-js support only, but the AMD support is in progress and I hope that it'll appear in the next update.
In the meantime, you can try the Require.js plugin, which provides partial support for requirejs modules. You'll get path completion for the module dependencies in the define call, including recognition of requirejs plugins.
The code completion offers so many "false positives" that you'd better learn the interface of your objects and use the "IntelliSense" just as a hint or a help to finish long identifiers; I doubt that the plugin helps the IDE here.
Other features like finding usages or refactoring (rename file and rename object) do not work across the boundary of the module closure. You're better off with the Replace in Path feature...
Notes on the plugin: the path to the config file in the plugin settings is relative to the public directory (the base URL). Also, I'd recommend checking out the binary package from the github project site, which may offer a newer version than the WebStorm plugin manager. (The 0.13 downloaded from there fixed crashing of the plugin with my project, while the WebStorm IDE still offered the 0.12.)
UPDATE: The issue WEB-825 appears to be partially fixed in WebStorm-134.1081, downloadable from the EAP. The Find Usages feature recognizes formal parameters initialized from the requirejs dependencies and searches for the module references in the project instead of for the variable reference in the current closure.
Refactoring has improved. Renaming a file in the project does affect all module references, but it introduces a relative path to the requirejs base directory (URL). For example, renaming toolitems.js in the src/model directory to menuitems.js changes all references from "model/toolitems" to "../model/menuitems", provided the requirejs base directory is the src directory. Renaming a method works in the entire project. Renaming an object works only within the scope of the current closure. Should it work globally? You may intentionally choose different object names for the same module export in every closure with Require.js... Still, it is a good habit to use the same names for consistency, and an improvement here would be nice.
The Require.js plugin, which brings partial AMD support to WebStorm 7, seems not to be needed in WebStorm 8 anymore.
UPDATE II: The Require.js plugin still helps WebStorm 8 users. It recognizes baseUrl and paths from the Require.js configuration. You can map module path prefixes to different directory roots and still have the "Go To Location" feature working:
// file configured for the Require.js config file path
require.config({
  paths: {
    core: '../core/src',
    recman: '../recman/src',
    app: '.'
  }
});

// Example of a module from the app project
define([
  'core/model/node', 'recman/model/hold', 'app/view/manager'
], function (NodeModel, HoldModel, ManagerView) {
  ...
});

Resources