Consider this package.json:
{
  "main": "./index.js",
  "name": "@myorg/foo",
  "scripts": {
    "build": "babel src -d dist --delete-dir-on-start --source-maps inline --copy-files --no-copy-ignored",
    "check-release": "npm run release --dry-run",
    "postbuild": "cp package.json dist",
    "release": "npm run build && cd dist && npm publish"
  }
}
I'm trying to figure out the best practice for adjusting it so that it publishes the src folder (which contains ESM modules) in a way that allows tree-shaking in the final app (a CRA app).
Right now there are some issues with the above package.json:
npm publish won't behave as I'd like to - you need to use npm run release instead and that's not good as other tooling will rely on publish
To create the dist folder I'm using Babel, but that doesn't feel right - is it better to use another tool to bundle it instead? My goal is for the final app to be able to use all imports easily (with no subpath exports, index, aliases, ...)
How would you adjust the above package.json to fulfill the above requirements?
Below are the two points answered!
npm publish won't behave as I'd like to... you need to use npm run release instead and that's not good as other tooling will rely on publish.
If you look below, you can see that Babel has been removed. Due to this, there is no need for your release script anymore.
You can just run the publish command when it is ready.
$ npm publish --access=public
Remember, this only works if you did the below!
To create the dist folder I'm using Babel, but that doesn't feel right - is it better to use another tool to bundle it instead? My goal is for the final app to be able to use all imports easily (with no subpath exports, index, aliases, ...).
You can follow a few steps to stop using Babel and use an alternative:
In your package.json, add the following key:
{
  "type": "module"
}
With this, also change require() to import, and module.exports to export. For example:
// Before...
const foo = require("@myorg/foo");
module.exports = { func };

// After...
import foo from "@myorg/foo";
export { func };
Instead of importing and exporting everything from one file, you can do that in a simpler way in package.json. You just need to add something like this:
{
  "exports": {
    ".": "./dist/index.js",
    "./feature": "./dist/feature.js"
  }
}
For more information, see the documentation.
When someone downloads your module, they will import it like this (like Firebase does):
import index from "@myorg/foo"; // index.js
import feature from "@myorg/foo/feature"; // feature.js
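Putting the pieces together, a minimal sketch of an adjusted package.json could look like the following. The version number and the "sideEffects": false hint are assumptions on my part - the latter tells bundlers like webpack that the files are safe to tree-shake, so only keep it if your modules really have no import-time side effects. The ./dist paths follow the snippet above; if you publish the untranspiled src instead, point them at your source files.
{
  "name": "@myorg/foo",
  "version": "1.0.0",
  "type": "module",
  "exports": {
    ".": "./dist/index.js",
    "./feature": "./dist/feature.js"
  },
  "sideEffects": false
}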
Not sure if these fully answered your question, but they might give you a starting point!
Related
I'm playing around with Yarn 2, and I want to do something like this.
I have a monorepo of the structure:
/
  packages/
    shared-ts/
      package.json
      src/
      lib*/
    app-ts/
      package.json
      src/
      lib*/
    app-js/
      package.json
      src/
      lib*/
where lib* denotes that the folder is gitignored, but is where the compiled code will live.
In this example, I have a dependency library shared-ts that is used by two apps, app-ts and app-js.
The conventional approach
The conventional approach to configuring a monorepo like this, is that in shared-ts I would have a package.json like:
"main": "lib/index.js"
"scripts" : {
"build": "tsc"
}
Where the build script will build index.js and index.d.ts into the lib folder.
When both app-ts and app-js then resolve the package, they look in the lib folder and find the index.js and in app-ts's case - the index.d.ts.
This works fine, except that the developers need to remember to run the build script if they have made changes to shared-ts in order for the changes to propagate across.
Where this could potentially become problematic is where there are many layers of dependencies.
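For reference, a minimal tsconfig.json for shared-ts that would emit lib/index.js and lib/index.d.ts as described could look like this (the concrete option values are assumptions, not taken from the question):
{
  "compilerOptions": {
    "outDir": "lib",
    "rootDir": "src",
    "declaration": true,
    "module": "commonjs",
    "target": "es2019",
    "strict": true
  },
  "include": ["src"]
}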
Attempted workaround 1 - point main to src/index.ts.
I can change shared-ts package.json to
"main": "src/index.ts"
"scripts" : {
"build": "tsc"
}
This generally won't work: a plain Node process won't be able to parse the syntax in the .ts file (e.g. the import keyword).
Potential workaround - publishConfig
So something I'm considering, but haven't tried yet, is using the publishConfig field in package.json.
This field contains various settings that are only taken into consideration when a package is generated from your local sources (either through yarn pack or one of the publish commands like yarn npm publish).
"main": "src/index.ts",
"publishConfig": {
"main": "lib/index.js"
}
The idea being that:
When you publish a package to npm, lib/index.js will be used as main. 👍 Code is ready for consumption, no compilation required.
If the package is used directly in the monorepo, src/index.ts will be used as main. 😕 This kind of works, for example if you are running app-ts with ts-node.
However, where this starts breaking down is:
Running app-js in a development environment (where you don't have any additional syntax parsing set up).
Practical current best solution
My current best solution is to just give up on the 'no compile' aspiration: if a developer makes changes to some code, they need to re-run build for the changes to propagate across.
How about using this?
import someValue from 'some-package/src/index';
This is what I do in my monorepo.
I believe using Nx would be a good choice here. While it won't help you run the uncompiled code, it has pretty good features. In particular, you can automatically run affected:apps on certain changes: for example, if you have a start command, it will run the start command for all the affected apps.
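For example (a sketch that assumes an Nx workspace where the projects define a start target; affected:apps is the command name from older Nx versions):
nx affected:apps --base=main             # list the apps affected by changes since main
nx affected --target=start --base=main   # run the start target for every affected project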
I wanted the same thing but had to compromise on the "Automatic compilation on changes" option in my JetBrains IDE.
It allows me to debug with ts-node as well as run the code using the native node binary.
I have:
packages/
  models/
    package.json
    ...
  server/
    src/
      index.ts
    package.json
In my packages/server/package.json, I have:
"scripts": {
"dev": "ts-node src/index.ts"
},
"dependencies": {
"#myapp/models": "../models",
In my packages/server/src/index.ts, I have:
import { sequelize } from '@myapp/models'
In my packages/models/src/index.ts, I have:
export type UserAttributes = userAttr
export { sequelize } from './sequelize';
but it gives me an error:
Try `npm install @types/myapp__models` if it exists or add a new declaration (.d.ts) file containing `declare module '@myapp/models';`
import { sequelize } from '@myapp/models'
How do I get this to work properly?
Lerna will take care of the dependencies between your local packages; you just need to make sure you set them up correctly. The first thing I would suggest is to go to @myapp/models and make sure that its package.json contains the fields you will need: main and, more importantly, types (or typings if you prefer):
// packages/models/package.json
{
  // ...
  "main": "dist/index.js",
  "types": "dist/index.d.ts",
  // ...
}
As you can see, I made both of them point to a dist folder, which brings me to my second point: you will need to build every package as if it were a separate npm module outside of the monorepo. I am not saying you need the dist folder specifically - where you build it is up to you - you just need to make sure that, from the outside, @myapp/models exposes main and types and that these point to valid, existing .js and .d.ts files.
Now for the last piece of the puzzle: you need to declare your @myapp/models dependency as if it were a "real" package - you need to specify its version rather than point to a folder:
// packages/server/package.json
{
  "dependencies": {
    // ...
    "@myapp/models": "0.0.1" // Put the actual version from packages/models/package.json here
    // ...
  }
}
Lerna will notice that this is a local package and will install & link it for you.
I don't know Lerna, but a good tool to deal with monorepos is npm link.
cd packages/models
npm link
cd ../server
# restore the version in dependencies: "@myapp/models": "x.y.z"
npm link @myapp/models
It should be enough.
Hope this helps.
lerna bootstrap first, then yarn add <package_name> works.
Or: lerna bootstrap first, then add the package and the target version to package.json, then run yarn.
After you put in the dependencies in your server's package.json, you just need to run
lerna run --stream build
And your local package should be able to access it.
I am unable to import TS files that exist in my Angular project directories inside my Node server.
I've looked into various settings for a tsconfig.json file for the Node server specifically but have had no luck.
I run my node server via npm start like so "start": "nodemon --exec ts-node -- ./start.ts"
My project structure looks something like this...
....
node_modules/
src/
  app/
    shared/
      models/
        entity.model.ts
stub-server/
  start.ts
  src/
    data/
      entity-data.ts
angular.json
tsconfig.json
...
I expect to be able to import the relevant TypeScript classes/interfaces/enums etc. from entity.model.ts inside entity-data.ts so that I can enforce type safety within my mocked data.
You definitely need to use better tooling for your project.
Please check https://nx.dev/ and create your Angular app and Node app, move your code there, and then you will be able to create a library with your shared stuff, to be imported from both projects.
Nx will take care of the configuration and the tsconfig paths required to make it work. And speaking of tsconfig paths, you may try to set one up in your current project, even if the structure is not the cleanest:
"compilerOptions": {
"paths": {
"#shared/*": ["src/app/shared/*"]
}
}
and import your stuff from your server like this:
import { MyModel } from '@shared/models';
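One caveat (an assumption about the setup, since the server is started with ts-node): ts-node does not resolve tsconfig path aliases at runtime on its own, so the tsconfig-paths package is usually needed as well, e.g.:
npm install --save-dev tsconfig-paths
ts-node -r tsconfig-paths/register ./start.ts
(the same -r tsconfig-paths/register flag can be added to the ts-node command inside the nodemon --exec script.)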
Anyway, consider migrating your code to Nx and you will be able to enjoy a clean architecture and workflow. Happy coding!
OP EDIT: If anyone else comes across this: the app was created using create-react-app, which limits importing to within the src folder. However if you upgrade react-scripts to v1.0.11 it does let you access package.json.
I'm trying to get the version number from package.json in my app.
I've already tried these suggestions, but none of them have worked, as I can't access package.json from outside the src folder (might be due to React, I'm new to this). Moving package.json into src would mean I can't run npm install, npm version minor, and npm run build from my root folder. I've also tried using process.env.npm_package_version, but that results in undefined.
I'm using Jenkins, and I haven't set it up to push the commits up yet. The only other idea I have is to get the version from the tags in GitLab, but I have no idea how to do that, and it would add an unnecessary dependency to the repo, so I would really like to find an alternative.
EDIT:
My file structure is like:
--> RootAppFolder
    |--> build
    |--> node_modules
    |--> public
    |--> src
    |     |--> Components
    |           |--> Root.js
    |
    |--> package.json
So to access package.json from Root.js I have to do import packageJson from './../../package.json' and then I get the following error:
./src/components/Root.js
Module not found: You attempted to import
./../../package.json which falls outside of the project src/
directory. Relative imports outside of src/ are not supported. You can
either move it inside src/, or add a symlink to it from project's
node_modules/.
Solving this without importing and exposing package.json to the create-react-app
Requires: version 1.1.0+ of create-react-app
.env
REACT_APP_VERSION=$npm_package_version
REACT_APP_NAME=$npm_package_name
index.js
console.log(`${process.env.REACT_APP_NAME} ${process.env.REACT_APP_VERSION}`)
Note: the version (and many other npm config params) can be accessed this way because npm exposes package.json fields to the script environment as npm_package_* variables.
Note 2: changes to the .env file will be picked up only after you restart the development server.
From your edit I would suggest to try:
import packageJson from '/package.json';
You could also try to create a symlink:
# From the project root.
cd src; ln -s ../package.json package.alias.json
List contents of src directory and you'll see the symlink.
ls
#=> package.alias.json -> ../package.json
Adding the .alias helps reduce the "magic" for others and your future self when looking at this. Plus, it'll help text editors keep them apart. You'll thank me later. Just make sure you update your JS code to import from ./package.alias.json instead of ./package.json.
Also, please take a look at this question:
The create-react-app imports restriction outside of src directory
Try this.
// in package.json
"version": "1.0.0"
// in index.js
import packageJson from '../package.json';
console.log(packageJson.version); // "1.0.0"
I don't think getting the version by importing or requiring package.json is the right approach.
You can add a script to your package.json:
"start": "REACT_APP_VERSION=$npm_package_version react-app-script start",
You can then read it via process.env.REACT_APP_VERSION in any JS file.
It also works in build scripts, like this:
"build": "REACT_APP_VERSION=$npm_package_version react-app-script build",
import package.json
Generally speaking, importing package.json is not good. Reasons: security & bundle size concerns
Yes, the latest webpack (default config) + ES6 import does tree-shaking (i.e. it only includes the "version" value instead of the whole package.json) for both import packageJson from '../package.json' and import { version } from '../package.json'. But this is not guaranteed if you use CommonJS (require()), have altered your webpack config, or use another bundler/transpiler. It is risky to rely on the bundler's tree-shaking to hide your sensitive data. If you insist on importing package.json but do not want the whole package.json exposed, you may want to add some post-build checks to ensure the other values in package.json are removed.
However, the security concern remains theoretical for open-source projects, whose package.json is public after all. If neither security nor bundle size is a problem, or the non-guaranteed tree-shaking is good enough for you, then go ahead.
.env
The .env method, if it works for you, is fine, but if you don't use create-react-app you might need to install dotenv and do some additional configuration. There is also one small concern: it is not recommended to commit the .env file (here and here), but with this method it looks like you will have to commit it, as it likely becomes essential for your program to work.
Best practice (arguably)
(this is not primarily for create-react-app, but you still can either use react-app-rewired or eject cra in order to configure webpack in cra)
If you use webpack, then with DefinePlugin:
// in webpack.config.js
const webpack = require('webpack');

// ...
plugins: [
  new webpack.DefinePlugin({
    'process.env.VERSION': JSON.stringify(process.env.npm_package_version),
  }),
]
You can now use console.log(process.env.VERSION) in your front-end program (development or production).
(You could simply use VERSION instead of process.env.VERSION, but that usually requires additional configuration to satisfy linters: add globals: { VERSION: 'readonly' } in .eslintrc; for TypeScript, add declare var VERSION: string; in a .d.ts file.)
Although it's "npm_package_version", it works with yarn too. Here's a list of npm's exposed environment variables.
Other bundlers may have similar plugins, for example, @rollup/plugin-replace.
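For example, a minimal sketch with @rollup/plugin-replace (assuming a rollup.config.js; the entry and output paths are placeholders):
// rollup.config.js
import replace from '@rollup/plugin-replace';

export default {
  input: 'src/index.js',
  output: { file: 'dist/bundle.js', format: 'esm' },
  plugins: [
    replace({
      preventAssignment: true,
      'process.env.VERSION': JSON.stringify(process.env.npm_package_version),
    }),
  ],
};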
Open your package.json file and change this line. The problem is:
// in package.json
"scripts": {
  "dev": "REACT_APP_VERSION=local REACT_APP_VERSION_NUMBER=$npm_package_version react-scripts start",
  ...
}
The solution is:
// in package.json
"scripts": {
  "dev": "react-scripts start",
  ...
}
I have an issue with npm and the main field. I've read the documentation, and as I understand it, I can point main to a different entry point than ./index.js. I already tested the package with all the dist files inside the root folder: I ignore src and test during the pack phase using .npmignore, but I did not like that building and packing the project to verify the structure put all my files into the package root folder. So I changed the output to be dist instead.
If I use npm pack and extract the file, I get the following structure:
/
  dist/
    index.js
    moduleA/
      index.js
  package.json
  README.md
So far so good. But now I am forced to import it as follows:
import {moduleA} from "myNpmModule/dist/moduleA";
But I don't want to have the dist folder in my import. So I set main in package.json:
"main": "dist/index.js"
But it still does not work; the import only works if I include dist.
I use npm 3.10.7 and node 6.7.0.
Can anyone help?
Regards
It's hard to tell for sure without knowing the contents of your main index.js and moduleA, but it's usually done in a way that you don't import any specific file, but rather the directory containing the package.json - like:
import {moduleA} from "myNpmModule";
Now, the index.js referenced as "main" in package.json should import the rest of the modules, and export them as its own module.exports properties.
For example, in dist/index.js:
import {moduleA} from "./moduleA";
module.exports.moduleA = moduleA;
and in your main code:
import {moduleA} from "myNpmModule";
Something like that - with possible differences to suits your own module's structure.
Actually I wrote a module that automatically does something like that, importing modules in subdirectories and exporting them as properties. I haven't put it on npm because it was for my own use, when I publish it to npm I'll update this answer.
Update
Here is a working example of what I described above - with import changed to require() to avoid the need for a transpilation step.
Module
A module following my advice from this answer:
https://github.com/rsp/node-nested-project-structure-example
Project structure:
dist/
  index.js
  moduleA/
    index.js
package.json
moduleA.js
dist/index.js contents:
var {moduleA} = require('./moduleA');
module.exports.moduleA = moduleA;
dist/moduleA/index.js contents:
module.exports.moduleA = {
  info: 'This is what dist/moduleA/index.js exports as moduleA'
};
package.json contents:
{
  "name": "nested-project-structure-example",
  "version": "0.0.1",
  "description": "An example for a Stack Overflow answer",
  "main": "dist/index.js",
  "scripts": {
    "test": "node test.js"
  },
  // ...
}
moduleA.js contents:
module.exports = require('./dist/moduleA');
Usage
A project that uses this module:
https://github.com/rsp/node-nested-project-structure-usage
It can be imported like this:
Version 1
var {moduleA} = require('nested-project-structure-example');
console.error(moduleA.info);
This imports dist/moduleA/index.js via the dist/index.js file referenced in package.json. See test1.js for a working example.
Version 2
var {moduleA} = require('nested-project-structure-example/dist/moduleA');
console.error(moduleA.info);
This imports dist/moduleA/index.js directly, which requires knowing the internal path, including dist. See test2.js for a working example.
Version 3
var {moduleA} = require('nested-project-structure-example/moduleA');
console.error(moduleA.info);
This imports dist/moduleA/index.js via the moduleA.js file in the main project directory. That way the consumer doesn't need to know the internal project organization - the dist path is not needed. See test3.js for a working example.
The whole content of the moduleA.js in the project is:
module.exports = require('./dist/moduleA');
Without such a file in your project's root directory, you will not be able to import moduleA without either including dist in your path or importing it via the main JS file of your project referenced in package.json (dist/index.js in this case).
Those are 3 ways to achieve the goal of your question, two of which don't include the dist in the code that imports the module. I hope it answers your question.
Those are the only options that you have without splitting your module into a set of completely separate modules, each distributed separately.
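For completeness: on Node and npm versions newer than the ones mentioned in the question, the package.json "exports" field (shown earlier in this document) gives another route - it can map a bare subpath onto a file inside dist without a stub file. A minimal sketch:
{
  "name": "nested-project-structure-example",
  "main": "dist/index.js",
  "exports": {
    ".": "./dist/index.js",
    "./moduleA": "./dist/moduleA/index.js"
  }
}
With that in place, require('nested-project-structure-example/moduleA') resolves to dist/moduleA/index.js directly.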
So here is what I understand of how this works. I am not 100% sure that it is true; I gained this insight from plain observation and reasoning rather than from actually seeing it in the docs.
assumption 1 (package.json):
{
  // ...
  "name": "my-package",
  "main": "dist/index.js",
  "scripts": {
    "build": "babel src --out-dir dist",
    "prepublish": "npm run build"
  }
  // ...
}
assumption 2 (package structure):
/
  dist/
    moduleA/
      index.js
      moduleAA/
        index.js
    moduleB/
      index.js
doing the above you get:
var myPackage = require("my-package");
var moduleAA = myPackage.moduleA.moduleAA;
// or in short
var moduleAA = require("my-package").moduleA.moduleAA;
However, it seems that:
import moduleA from "my-package/moduleA/moduleAA";
is not equivalent to the require statement above. What you could do instead is:
import { moduleA } from "my-package";
const moduleAA = moduleA.moduleAA;
Assuming you still want a direct import of moduleAA with the project structure given above, you would need to do:
import moduleAA from "my-package/dist/moduleA/moduleAA";
So here is my conclusion and how I understand this.
... from "my-package/dist/moduleA/moduleAA"; does not look at the package from a JS/npm point of view (i.e. what is exported); instead, as soon as you use a / in the from clause, it looks at the file structure of the package.
This means that if you use
import { moduleA } from "my-package";
it will actually import all exports from dist/index.js, but if you import "my-package/moduleA", it looks inside the package to see whether the path "/moduleA" exists. In the above case it does not. If we omitted the dist folder and moved the structure into the package root, that statement would actually work the way you would expect.
Now, one might ask why I want this stuff in a dist folder at all. It is easy to ignore in git, and as I understand it, best practice with Node is to use the "prepublish" step to actually build your package. This means that if you make a fresh checkout of the code and run "npm install" (which executes "npm run prepublish" by design), it spams the folder structure with the packaged and transformed files.
After playing with this for a few hours I gave up and just accepted that "npm install" would potentially spam my folders. As an alternative, I could leave prepublish out of my package.json and just run "npm run build" prior to "npm publish" ("scripts": { "build": "babel src --out-dir ." }).
This is an old question and maybe there is something better now, but here is how I did it.
Add this to the scripts in your package.json:
"scripts": {
"link:publish": "tsc && cp package.json dist && cp README.md dist && cp tsconfig.json dist && cd dist && npm publish",
},
Basically, copy things into the dist folder and publish from there. A bit hacky, but in my case necessary.