I have a single file (either a .js file or a .node addon generated from C++; both work the same way) that I can use in Node.js by calling:
var addon = require("./addon");
It's not an official package or anything; it has no package.json (and I want to keep it that way).
The code above works fine if I run it in a simple Node.js application, but how do I include it in a Node.js library? For example:
exports.addon = require("./addon")
This doesn't seem to work. I also tried changing the package.json:
"dependencies": {
"addon": "file:./addon.node",
}
but when I use
require("addon");
later it says it can't be found. [EDIT: this happens after I run npm publish and then npm i mymodule in another project.]
Am I missing something?
Assuming you are writing your own library and want to include your add-on there, I would build a file structure like this:
index.js
addon/cpp-generated.js
addon/other.js
...
Then from index.js you can just do:
const cpp_generated = require('./addon/cpp-generated')
I've only used .js extensions. You need to change a configuration file to allow for additional extensions (like the .node that you mention).
There is no need to mention the file in your package.json for things to work. It may be needed there if you run a build, although it should get included because of the require() statement anyway.
So it looks like you're doing it right, except maybe for the extension (your example shows .node).
And to make sure it gets published, I would add it to the list of files:
"files": [
"index.js",
"addon/cpp-generated.js",
...
]
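For example, a minimal index.js that re-exports the add-on could look like this (the file names follow the structure above and are only illustrative):

// index.js
// Load the compiled add-on and expose it as part of the library's public API.
const cpp_generated = require('./addon/cpp-generated');

module.exports = {
  addon: cpp_generated,
};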
I'm playing around with Yarn 2, and I want to do something like this.
I have a monorepo of the structure:
/
packages/
shared-ts/
package.json
src/
lib*/
app-ts/
package.json
src/
lib*/
app-js/
package.json
src/
lib*/
where lib* denotes that the folder is gitignored, but is where the compiled code will live.
In this example, I have a dependency library shared-ts that is used by two apps, app-ts and app-js.
The conventional approach
The conventional approach to configuring a monorepo like this, is that in shared-ts I would have a package.json like:
"main": "lib/index.js"
"scripts" : {
"build": "tsc"
}
Where the build script will build index.js and index.d.ts into the lib folder.
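For reference, the build script above could be backed by a tsconfig.json in shared-ts roughly like this (the exact compiler options are an assumption, not taken from the question):

{
  "compilerOptions": {
    "rootDir": "src",      // compile src/index.ts ...
    "outDir": "lib",       // ... into lib/, matching "main": "lib/index.js"
    "declaration": true,   // also emit lib/index.d.ts for app-ts
    "module": "commonjs",
    "target": "es2019"
  },
  "include": ["src"]
}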
When both app-ts and app-js then resolve the package, they look in the lib folder and find index.js and, in app-ts's case, index.d.ts.
This works fine, except that the developers need to remember to run the build script if they have made changes to shared-ts in order for the changes to propagate across.
Where this could potentially become problematic is where there are many layers of dependencies.
Attempted workaround 1 - point main to src/index.ts
I can change shared-ts package.json to
"main": "src/index.ts"
"scripts" : {
"build": "tsc"
}
This generally won't work: a plain node process won't be able to parse the syntax in the .ts file (e.g. the import keyword).
Potential workaround - publishConfig
So something I'm considering, but haven't tried yet, is using the publishConfig field in the package.json.
This field contains various settings that are only taken into consideration when a package is generated from your local sources (either through yarn pack or one of the publish commands like yarn npm publish).
"main": "src/index.ts",
"publishConfig": {
"main": "lib/index.js"
}
The idea being that:
When you publish a package to npm, lib/index.js will be used as main. 👍 Code is ready for consumption, no compilation required.
If being used directly in the monorepo, src/index.ts will be used as main. 😕 This kind of works, as if you were running app-ts with ts-node, for example.
However, where this starts breaking down is:
Running app-js in a development environment (where you don't have any additional syntax parsing set up).
Practical current best solution
My current best solution is to just give up on the 'no compile' aspiration: if a developer makes changes to some code, they need to re-run the build for the changes to propagate across.
How about using this:
import someValue from 'some-package/src/index';
I can do this directly in my monorepo.
I believe using Nx would be a good choice here. While it won't help you run the uncompiled code, it has pretty good features. In particular, you can automatically run the affected:apps on certain changes. For example, if you have a start command, it will run the start command for all the affected apps.
I wanted the same thing but had to compromise on the "Automatic compilation on changes" option in my JetBrains IDE.
It allows me to debug with ts-node as well as run the code using the native node binary.
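As a rough sketch of what that workflow allows (the paths below are assumptions based on the monorepo layout above, not taken from the answer):

# during development: run the TypeScript sources directly via ts-node
npx ts-node packages/app-ts/src/index.ts

# once the IDE's automatic compilation (or tsc) has emitted JavaScript: run it with plain node
node packages/app-ts/lib/index.js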
I am working on a JavaScript library which is built with webpack. The project will be built for two targets, web and node. I followed the instructions from this link: https://webpack.js.org/concepts/targets/ to set up the multiple targets in webpack. It works fine and it generates two target files, build/test-web.js and build/test-node.js.
The file build/test-web.js is listed in the main attribute in package.json as below:
"name": "#my-org/test
"main": "build/test-web.js",
so I am able to import this file with require('@my-org/test'). I wonder how I can import the other file, build/test-node.js. I know I can import it via require('@my-org/test/build/test-node.js'), but I am looking for a better solution so that developers can import it more easily.
I checked this library: https://github.com/patrickhulce/generate-export-aliases but it doesn't work for scoped package names.
There is a great article, Setting up multi-platform npm packages, by Dr. Axel Rauschmayer, which explains how you can achieve what you want.
Node.js will only look at the main field of package.json to resolve a module, so you should put the path to build/test-node.js there. Bundlers for browser code (e.g. webpack with target: web, or Rollup) will additionally look at the browser field first, where you should specify the path to build/test-web.js. So your package.json should look like this:
{
...
"main": "build/test-node.j",
"browser": "build/test-web.j",
...
}
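With that in place, consumers on both platforms can use the same specifier (assuming the scoped name from the question):

// In Node.js this resolves to build/test-node.js via "main";
// bundlers that honour the "browser" field pick build/test-web.js instead.
const test = require('@my-org/test');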
I am working on a NodeJS (v. 8.12.0, ECMAScript 6) project whose structure is similar to:
project_root/
src/
utils/
protocol_messages/
helpers.js
tests/
unit/
utils/
protocol_messages/
helpers.js
I am writing tests using Mocha as a test framework.
Question
In the helpers.js under tests/unit/utils/protocol_messages/, what's the proper way of importing the module-under-test?
To elaborate:
I want to avoid the relative path in: require('../../../../../src/utils/protocol_messages/helpers').
It works, but it's ugly, and if the project structure changes, I would have to rewrite the test imports, as well.
(I am new to Javascript so I might be doing several things wrong.)
Update
Solutions provided in this question:
require.main.require: a comment on that answer notes that this solution will not work when the code is covered by unit tests (e.g. Mocha tests).
Extracting my utils to a node module doesn't make sense for me, since the code is very application specific.
Having an extra node_modules folder under src/ in a NodeJS project doesn't seem to make sense.
Using a JavaScript transpiler when I am using only features available in NodeJS and writing CommonJS projects seems like overkill.
If I am mistaken on any of the above points, please point it out, as I am at a loss. It seems to me like NodeJS doesn't provide a native way to import CommonJS modules with absolute paths.
You can use wavy npm package.
This module lets you turn things like require('../../../../foo') into something like require('~/foo'). The way it works is that, on postinstall, it creates a symlink at app/node_modules/~ pointing to app/.
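Applied to the project structure from the question, that would allow something like the following (a sketch assuming wavy's postinstall symlink is in place):

// tests/unit/utils/protocol_messages/helpers.js
// '~' is the symlink wavy creates in node_modules, pointing at the project root.
const helpers = require('~/src/utils/protocol_messages/helpers');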
Assume you need a config.js file that lives at your project's root, from a file at /routes/api/users/profile.js, and you do not want to import it as ../../../config.js.
Create a directory structure in your project's root directory as described below:
/Modules
index.js
package.json
Modules/index.js
module.exports.config = require('../config.js')
Modules/package.json
{
"name": "modules",
"main": "index.js",
"version": "1.0.0",
"dependencies": {}
}
Now run
npm install ./Modules
/routes/api/users/profile.js
const { config } = require('modules')
This way, your code editor's autocomplete will also work. There is no global variable pollution, no long relative imports, and no dependency on environment variables, and the best part is that it works with pm2 and nodemon.
I'm very new to using npm and TypeScript, so I'm hoping that I'm missing something obvious. We have authored a node package written in TypeScript for our own internal use. The source is written in TypeScript, and the file structure looks like this:
src/myModule.ts
myModule.ts
Where myModule.ts looks like this:
export * from "./src/myModule";
I then run tsc to generate .js and .d.ts files. So, the files then look like this:
src/myModule.js
src/myModule.ts
src/myModule.d.ts
myModule.js
myModule.ts
myModule.d.ts
All of this gets pushed to git and then our main app includes this package as a dependency via a git URL. When I first attempted this, I got an error like this:
export * from "./src/myModule";
^
ParseError: 'import' and 'export' may appear only with 'sourceType: module'
After some searching around, I found that the issue was with the .ts files that were getting loaded in. So I added the following to my .npmignore:
*.ts
!*.d.ts
After doing this, it brings in the node package without any problems.
The problem I am running into is that I want to be able to run off of local changes to the node package during active development. I want to be able to make changes in the node package, and then have the main app pick up these changes so that I can make sure everything works together before pushing to git. Everything I find says to use npm link. This makes it point to my local directory as expected, but the problem is that it now has all the .ts files, and the same errors show up.
I'm sure npm link works great when everything is written in JavaScript, but we are using TypeScript to write everything. Is there a way to make this work with TypeScript?
This makes it point to my local directory as expected, but the problem is that it now has all the .ts files, and the same errors show up.
The errors will only show up if you have those .ts files and a separate declaration for those files.
Recommended fix
Just use the separate .ts/.js/.d.ts files and steer clear of bundling a single .d.ts.
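As a sketch of what the package's package.json could look like with that layout (the field values are assumptions based on the file structure in the question, not taken from it):

{
  "name": "my-internal-package",
  "main": "myModule.js",
  "types": "myModule.d.ts"
}

With main pointing at the compiled entry and types at its declaration file, consumers resolve the .js entry and its .d.ts instead of parsing the .ts sources.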
Is there a way to write a single module/package that can be posted both to npm and Bower, without having to duplicate files?
Imagine you have a simple JS file with some code that is self-contained (i.e. it doesn't have any external dependencies).
An ideal directory would look something like:
/file.js
/package.json
/bower.json
The problem in this case is that, for file.js to work with npm, it would need a module.exports statement, which would not work with Bower.
So, is there a way to avoid producing two separate almost identical files?
This seems the best option so far (inspired by the Angular team).
Create an index.js file in the project root, with this content:
module.exports = require('./your-original-module.js');
Then, in package.json add this line:
"main": "index.js",
Simple, but effective!
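Concretely, the two manifests can then point at different entry files (a sketch using the file names from the question; Bower reads its own main field from bower.json):

package.json (for npm consumers):
{
  "main": "index.js"
}

bower.json (for Bower consumers):
{
  "main": "file.js"
}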
If your module doesn't depend on other npm modules,
you can provide a file (let's call it 'bowerify.js') with
window.MyUtility = require('./file');
to expose your utility as a global variable.
And then use browserify to package your code for the browser, for example with a build task configured like this:
src: 'bowerify.js',
dest: 'my_bower_module.js'
Now you can install my_bower_module.js with bower.
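For reference, the same packaging step with the plain browserify CLI might look like this (same file names as above):

browserify bowerify.js -o my_bower_module.js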