When writing custom components, is it better to publish the .vue files directly or to publish a compiled version built with webpack or another bundling tool?
Bonus: Is there an official document regarding conventions to follow when publishing custom components?
EDIT: What are the pros and cons of either method?
I've published a few open source projects and from experience I can say that it's better to publish your code - or rather, set the main entry point - as a compiled distributable for a few reasons:
Firstly, by outputting a UMD module you are creating a distributable that works across all environments (webpack, browserify, CDN, AMD) and it's as simple as adding the following to your webpack config:
output: {
  ...
  library: 'MyPackageName',
  libraryTarget: 'umd',
  umdNamedDefine: true
},
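Put together, a complete config for a UMD build might look roughly like the following. This is only a sketch: the entry file, output filename, and loader setup are placeholder assumptions rather than part of the original answer.

// webpack.config.js - minimal sketch of a UMD build (names and paths are examples)
var path = require('path');

module.exports = {
  // the file that exports your component(s)
  entry: './src/index.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'my-package-name.js',
    library: 'MyPackageName',
    libraryTarget: 'umd',
    umdNamedDefine: true
  },
  module: {
    rules: [
      { test: /\.vue$/, loader: 'vue-loader' },
      { test: /\.js$/, loader: 'babel-loader', exclude: /node_modules/ }
    ]
  }
};

With webpack 1 the module.rules section would be module.loaders instead.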
Secondly, most developers using webpack will exclude babel-loader from compiling scripts in their node_modules folder by doing something like:
{
  test: /\.js$/,
  loader: 'babel-loader',
  exclude: /node_modules/
}
So, if developers need to compile your code themselves and your project contains anything that uses ES2015 but is not a .vue file (e.g. a mixin), then you would need to tell them to apply babel-loader to your package folder in their webpack config.
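For illustration, the consumer-side exclude would have to be loosened to something like this (the package name is made up), which is exactly the kind of instruction you would rather not have to give:

{
  test: /\.js$/,
  loader: 'babel-loader',
  // still skip node_modules, except for this one package
  exclude: /node_modules\/(?!my-vue-package)/
}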
As for browserify, if developers have to compile your project you would need to add vueify and babelify as transforms in your package.json (they actually can't set this up themselves), declare them as dependencies, and get those developers to set up an appropriate .babelrc file.
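As a rough sketch of what that means for the published package.json (the package name, file layout, and version ranges are assumptions), the transforms go under a browserify field and the transform packages have to be listed as dependencies:

{
  "name": "my-vue-package",
  "main": "src/index.js",
  "browserify": {
    "transform": ["vueify", "babelify"]
  },
  "dependencies": {
    "vueify": "^9.0.0",
    "babelify": "^7.0.0"
  }
}

Consumers would still need an appropriate .babelrc, which is the part they have to set up themselves.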
All that setup can be a nightmare for devs. Many have little knowledge of their build process, so they won't know about excludes or transforms; they will just get a bunch of errors and either remove your package or create issues on your repo.
And that's just for the two most common build processes. You will probably still want a CDN build, and you will still want to support those using AMD modules, so a UMD module is the way to go.
That said, you should still distribute the .vue files themselves, which will also allow devs to compile your project if they have advanced configuration requirements.
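A sketch of what that looks like in package.json, assuming the compiled UMD build lives in dist/ and the raw sources (including the .vue files) in src/:

{
  "name": "my-package-name",
  "main": "dist/my-package-name.js",
  "files": [
    "dist",
    "src"
  ]
}

main points at the UMD build so the package works out of the box, while shipping src lets consumers with advanced setups compile the .vue files themselves.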
I have created a generic framework for building dashboards, which consists of multiple modules and is built with Angular CLI.
Some modules are completely independent. Developers using this library can add modules to their project on demand. A previous version of this framework was built with AngularJS 1.x, which I delivered as minified JavaScript files.
What do I have to take care of to keep this library private rather than public, and is there any way to package my modules separately and deliver them without npm?
This question boils down to two independent tasks: Creating the library package and publishing it internally.
Create the library package
In order to create a library with AOT support, you need to compile it using ngc -p src/tsconfig-aot.json.
tsconfig-aot.json is a copy of tsconfig.json with an additional section:
"files": [
"./app/index.ts"
],
"angularCompilerOptions": {
"annotateForClosureCompiler": true,
"genDir": "../dist/out-lib-tsc",
"skipMetadataEmit" : false,
"skipTemplateCodegen": true,
"strictMetadataEmit": true,
"flatModuleOutFile": "libname.js",
"flatModuleId": "libname"
}
You then need to fix the directory structure by moving the files from src/app to the root of the output directory, and copy src/app and src/assets to the output directory as well.
There are several detailed guides out there. For example Distributing an Angular Library - The Brief Guide
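The output directory also needs its own package.json whose entry points match the flatModuleOutFile name. A minimal sketch (the typings filename is assumed to follow the same base name):

{
  "name": "libname",
  "version": "1.0.0",
  "main": "libname.js",
  "typings": "libname.d.ts"
}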
Publish the library package
There are several options to publish private libraries:
reference a branch in a git repository (see the example after this list). Note: it is probably a good idea to use separate repositories for development (without compiler output) and publishing
npm offers private repositories for a fee
you can set up a local registry (for example Artifactory or Sinopia)
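For the first option, consumers reference the publishing branch directly in their package.json. A sketch with a made-up repository URL and branch name:

{
  "dependencies": {
    "my-dashboard-lib": "git+https://git.example.com/my-org/my-dashboard-lib.git#release"
  }
}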
I am looking for information on how to bundle dependencies with Webpack. I haven't been doing much front-end development recently and am behind on the latest trends.
(a) I would like to bundle x number of dependencies with Webpack, but I do not wish to specify an entry point, such that if the bundle was require'd, nothing would execute.
(b) This has probably nothing to do with (a) - ideally I could bundle them as AMD modules. Basically, I would like to take NPM modules and my code and convert things to AMD.
I am guessing that the above can be accomplished through some webpack.config.js configuration, but I haven't seen anything online demonstrating how you can bundle deps with Webpack without specifying an entry point.
You have to specify an entrypoint, otherwise Webpack won't be able to parse your modules and statically analyze the dependencies.
That said, you don't have to directly specify an entrypoint in your configuration. You can run webpack --entry path/to/entry.js $OTHER_ARGS and require all the dependencies from that entry file, or you can use the configuration and specify all the required modules:
{
  entry: ['react', 'foo', 'bar', './ours/a', './ours/b']
}
In any case, the way Webpack evaluates your modules at runtime does not make them readily available to other code. I suspect what you may actually be interested in is creating library targets, which are compiled independently and then reused in other Webpack builds.
Here is a nice article that explains the approach in detail and then you can refer to the official documentation.
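As a rough sketch of the library-target approach (file names and the module list are invented): bundle a small entry file that re-exports the packages, and tell Webpack to emit it as an AMD library, which also covers requirement (b):

// deps-entry.js - only re-exports the packages to bundle (names are examples)
module.exports = {
  react: require('react'),
  foo: require('foo'),
  bar: require('bar')
};

// webpack.config.js - sketch only
module.exports = {
  entry: './deps-entry.js',
  output: {
    path: __dirname + '/dist',
    filename: 'deps.bundle.js',
    library: 'deps',
    libraryTarget: 'amd'   // consumers load the bundle as a named AMD module
  }
};

Other builds, or an AMD loader, can then obtain the dependencies from the deps module.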
My application has a directory structure more or less like this:
src-program/ - contains frontend code including package.json and webpack.config.js
src-server/ - contains backend code including a different package.json and .babelrc
shared/foo.js - is JavaScript code that is needed by both the frontend and the backend
All code uses ES2015 syntax and thus is transpiled using Babel.
For the frontend the "transpilation" is done during the Webpack build by using the babel-loader.
For the backend it is done on-the-fly by babel-register.
shared/foo.js requires other modules, that are found in the package.json files of both the frontend and the backend.
Due to how NodeJS/Webpack resolve modules, the shared module isn't found normally.
For Webpack I solved this in a somewhat hacky way using this configuration:
resolve: {
  root: __dirname,
  fallback: [
    __dirname + "/../shared",
    __dirname + "/node_modules"
  ],
  extensions: ['', '.js', '.jsx']
},
The first fallback makes sure that the "shared" module is resolved and the second fallback makes sure that modules required by the shared module are still resolved to the frontend node_modules directory.
This makes including the shared module as simple as this:
import * as foo from 'foo';
However, I'm having difficulty making the backend (i.e. Node.js) resolve the shared module the same way.
I tried with app-module-path, which makes foo.js resolve, but then the file is either not processed by Babel or additional Babel modules like transform-runtime (indirectly needed by foo.js) cannot be resolved since they reside in src-server/node_modules...
I could probably work around the problem by pre-transpiling the code instead of using babel-register, but none of it feels right anyway.
So, what is a good way to share code between a Webpack build and the NodeJS server process?
Can you package the shared module up as an npm package (even a package just residing on your filesystem)? Then your src-program and src-server projects can add it as a dependency in their package.json, and it will get copied into their respective node_modules folders.
See: how to specify local modules as npm package dependencies
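A minimal sketch of that setup: give shared/ its own package.json (e.g. with "name": "shared" and "main": "foo.js"), then both src-program/package.json and src-server/package.json can declare it as a local dependency:

{
  "dependencies": {
    "shared": "file:../shared"
  }
}

After npm install, require('shared') resolves in both projects; note that Babel will still treat it as third-party code inside node_modules unless told otherwise.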
I am currently developing a project, which I want to test in different environments - including node.js and different browsers with karma/selenium - to avoid compatibility issues. (I think I will use browserify in browsers, but I am not familiar with it yet.)
I have a nested testing directory, something like this:
repo/
- project.js
- project.my.module.js
- spec/
-- helpers/
--- a.jasmine.helper.js
-- support/
--- jasmine.json
-- project.my.module/
--- ModuleClass.spec.js
-- project.MyClass.spec.js
-- project.OtherClass.spec.js
So far I have tested the project only with jasmine-npm (which is Jasmine 2.2 for Node.js). When testing, the working directory is repo/, where I run node.exe with jasmine.js. The jasmine.js loads the jasmine.json:
{
  "spec_dir": "spec",
  "spec_files": [
    "**/*[sS]pec.js"
  ],
  "helpers": [
    "helpers/**/*.js"
  ]
}
Now I have 2 problems here.
How can I avoid the long relative paths in require calls, for example require("../../project.my.module.js") in the ModuleClass.spec.js file? (I would rather use a short, constant name, like I could with a symlink.)
How can I do this in a way that is compatible with running the same test files in different browsers? (I want to keep the CommonJS module definitions in the tests.)
I checked some Node.js tutorials, and it seems like I have two options: I can use package.json (with some magic config options unknown to me), or I can move the files I want to load into node_modules/ (which I am sure I won't do). I am open to suggestions, because I can't see how to solve this issue...
edit:
karma-browserify appears to solve the testing problem; I probably have to add a Jasmine build for the browser, but that's okay. I don't have to change the CommonJS module definitions in the tests, so it is possible to test both in Node.js and in the browser with the long paths.
edit2:
I ended up adding the parent dir of my repo to NODE_PATH. This way I can require every project I am currently developing.
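For reference, a minimal sketch of that NODE_PATH setup as an npm test script (assuming a POSIX shell and that the repository folder is literally named repo):

{
  "scripts": {
    "test": "NODE_PATH=.. jasmine"
  }
}

The spec files can then use require("repo/project.my.module") instead of the long relative paths; on Windows a helper like cross-env would be needed for the environment variable assignment.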
What about symlinking your project directory under node_modules (e.g. as node_modules/project) and requiring like require("project/project.my.module.js")?
In the RequireJS data-main file, we write something like this:
paths: {
  jquery: 'lib/jquery',
  underscore: 'lib/underscore'
}
What I did was manually download the raw JS library files, create a "lib" folder, move the files into it, and rename them if necessary.
I use Node.js for the server, and I am wondering if there's any tool to create these client-side RequireJS paths automatically from the installed node modules. Browserify does a similar job if I don't use RequireJS (creating one JS file that I can call from the other browser JS files), but it seems like a Browserify bundle cannot be used as a path in RequireJS.
Any thoughts? Thanks.
An alternative solution (to Browserify, with which I'm not familiar) is to use Bower for managing client-side libraries. It is similar to node/npm, but geared towards browser libraries.
It will not copy or rename libraries, because that step isn't necessary. Instead the libraries will be placed in a directory called bower_components. The paths config would look like
paths: {
  jquery: "../../bower_components/jquery/dist/jquery",
  bootstrap: "../../bower_components/bootstrap/dist/js/bootstrap",
  ...
}
(the actual number of .. segments in the path depends on the values of other RequireJS options).
In development, when all dependencies are loaded asynchronously as separate files, they will be loaded from bower_components, and the RequireJS optimizer will find them there when generating the single optimized file.
Adding the dependency paths to the config file can be half-automated with the Grunt plugin grunt-bower-requirejs. The idea is that after a library is installed using bower install LIBRARY, its path can be added with grunt bower.
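A minimal Gruntfile sketch of that workflow; the config file path and the task alias are assumptions, and option names may vary between plugin versions:

// Gruntfile.js
module.exports = function (grunt) {
  grunt.initConfig({
    bowerRequirejs: {
      target: {
        // the RequireJS config file whose paths section should be updated
        rjsConfig: 'js/config.js'
      }
    }
  });

  grunt.loadNpmTasks('grunt-bower-requirejs');

  // so that "grunt bower" injects the paths after "bower install LIBRARY"
  grunt.registerTask('bower', ['bowerRequirejs']);
};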