Building monorepo babel-transpiled node JS application with dependencies - node.js

I am working on a project that is hosted as a monorepo. For simplification purposes let's say that inside there are three self-explanatory packages: server, a webapp client and library. The directory structure would be something like the following:
the-project
  packages
    server
      src
    webapp
      src
    library
      src
All packages use Flow type annotations and a few post-ES5 features and, for this reason, go through Babel transpilation. The key difference is that transpilation of the webapp package is done via webpack, whereas server uses a gulp task that runs the scripts through the gulp-babel package. library is transpiled automatically when webapp is built.
Now, the problem I have is that for server to build, library must be built first and its package.json must specify its (built) main JS file so that its transpiled artifacts can be resolved. As you can imagine, this quickly becomes problematic if the project contains multiple libraries that are actively being developed (which it does), as all of them would require building, along with any dependent packages (like server in this simple case).
As an attempt to overcome this annoyance, I initially thought of using webpack to build the server, which would take care of including whatever dependencies it requires into a bundle, but I ran into issues, as webpack apparently isn't meant to be used for Node.js applications.
What strategies are available for building a node JS application requiring Babel transpilation, such that the application's source files as well as any dependencies are built transparently and contained in a single output directory?
Annex A
Simplified gulp task for transpilation of scripts, as employed by server.
const gulp = require('gulp');
const babel = require('gulp-babel');

// Transpile only server's own sources into dist/
exports.transpile = () =>
  gulp.src(['src/**/*.js'], { allowEmpty: true })
    .pipe(babel({ sourceMap: true }))
    .pipe(gulp.dest('dist'));
As can be seen above, only server's own source files are included in the task. If src were to be changed to also include library, the task would emit the dependencies' artifacts in server's own output directory and any require('library') statements within would attempt to locate the built artifacts in packages/library and not packages/server/dist, thus resulting in import failures.

First of all, I am not sure what your server is doing. If it is handling database connections or doing some heavy calculations, then I would not recommend building it with webpack. If, however, your server is just doing server-side rendering and making some API calls to other servers, then I would recommend bundling it with webpack.
A lot of projects follow this philosophy. For example, you can take a look at something similar I have done in one of my personal projects, [Blubus]; specifically, you might be interested in webpack-server-config. You can also take a look at how big projects like spectrum do it.
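If you do go the webpack route for the server, webpack can target Node and bundle the workspace library while leaving third-party node_modules external. A rough sketch, assuming a yarn/lerna-style workspace layout and the webpack-node-externals package (both assumptions, not taken from the question):

// packages/server/webpack.config.js -- hypothetical sketch
const path = require('path');
const nodeExternals = require('webpack-node-externals');

module.exports = {
  target: 'node',                                  // don't shim Node built-ins
  entry: path.resolve(__dirname, 'src/index.js'),  // assumed entry point
  output: { path: path.resolve(__dirname, 'dist'), filename: 'server.js' },
  // bundle the workspace `library`, keep all other node_modules external
  externals: [nodeExternals({ allowlist: ['library'] })],
  module: {
    rules: [
      {
        test: /\.js$/,
        // transpile server's sources and the workspace library's sources
        include: [path.resolve(__dirname, 'src'), path.resolve(__dirname, '../library/src')],
        use: 'babel-loader',
      },
    ],
  },
  // point require('library') at its untranspiled source so no pre-build is needed
  resolve: { alias: { library: path.resolve(__dirname, '../library/src') } },
};

With a setup along these lines, a single webpack run produces one dist/server.js that node can execute, and library never has to be built separately.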

Related

NodeJS CI/CD Build Missing Files

I am setting up CI/CD for a Node.js project and occasionally a developer forgets to commit a file (module) to source control. I run npm ci and npm test without problems and the application gets deployed to my server. However, it errors out once executed because of the missing module.
Is there a best practice for ensuring that all files required by a Node application are available before allowing it to be deployed?
I don't know your exact configuration but here is how my teams have handled similar issues in the past:
Unit Tests. Normally, a CI system can catch this type of error before you deploy. If your CI tests aren't flagging your missing module before the code gets deployed, then a commonly used solution would be to write a test that ensures the module is present. That way the problem would be automatically caught when your unit tests are run. Something along these lines might work:
// mymodule.test.js using Mocha syntax:
import { expect } from 'chai';
import mymodule from './mymodule';

describe('my module', () => {
  it('should export something!', () => {
    expect(!!mymodule).to.be.true;
  });
});
Version Control workflow. It sounds like there is also an issue here with your team's version control workflow. Generally, all of the files required to build the production application should be kept under version control and developers should be committing frequently. In this situation, I would normally do some investigation to see what is happening -- it might require training or perhaps the app is structured in a way that is overly confusing or complex to engineers.
Use a lockfile for npm packages. If the missing module is an npm module, then there are a few things that could cause it to be missing. Generally, all npm modules should be listed in your package-lock.json or yarn.lock file. This ensures that the production version of the application stays in sync with what developers are using locally. I personally discourage developers from installing npm modules globally unless it is absolutely necessary. In that situation, your CI server (and perhaps your production server) will need to be updated to include the exact same versions of the global packages.
Automated build systems. I think you are indicating that your problem is caused by modules that aren't under source control. But I have also seen some situations where legacy build systems might omit a file that is important when building the app for production. Modern build tools like Webpack and Babel normally include every module referenced by the application but older solutions like grunt and gulp might require some fine-tuning to ensure the files are always included in an automated way (to avoid a situation where developers are expected to manually add the modules to the build system and often forget to do so).
The best way to prevent this is to have the developer compute the project's checksum and compare it with source control and/or your server. If the checksums match, then all of the files were transferred.
If you (or your deployment software/service) are using rsync in your deployment process and use the -C parameter then some directories might be getting filtered out.
I had a similar CI/CD issue where an npm package used the directory name "core" and was ignored due to the -C parameter.
Replace all of your require() calls with import statements that webpack can resolve, and have your build run webpack. At runtime, node will run the bundle instead of your regular entry point.
Webpack will catch any unresolvable imports at build time.
All this is assuming the missing files are modules (code) rather than resources (e.g. JSON files).
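A bare-bones version of that webpack setup might look like the following (the entry and output paths are placeholders):

// webpack.config.js -- hypothetical sketch
const path = require('path');

module.exports = {
  target: 'node',          // build a bundle meant to be run by node
  entry: './src/index.js', // assumed entry point
  output: { path: path.resolve(__dirname, 'dist'), filename: 'bundle.js' },
  // If a module referenced by an import/require is absent from the checkout,
  // webpack's resolution step fails and the build errors out before deployment.
};

Production then runs node dist/bundle.js instead of the original entry point.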

Why react should usually be a prod dependency and not a dev-dependency

Sorry if I'm missing some obvious thing, but I can't seem to understand why react (or react-dom) should be a dependency and not a dev-dependency on most projects.
Usually the source is written in ES6 and stored in /src or /client, for example, and when someone wants to build the project for production they will create /build or /dist where the finalized bundle.js will sit (using webpack and such). From there (at prod) it's simply a web server serving regular JS (the bundle) and an HTML file.
Am I missing something?
I don't understand why I would want react in prod (unless going for SSR, of course).
Thank you very much!
dependencies are required to run, devDependencies only to develop, e.g.: unit tests, CoffeeScript to JavaScript transpilation, minification, ...
React is a dependency because it is included in the final build.
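As a hypothetical package.json excerpt, the split usually ends up looking something like this (the version ranges are only illustrative):

{
  "dependencies": {
    "react": "^18.2.0",
    "react-dom": "^18.2.0"
  },
  "devDependencies": {
    "@babel/core": "^7.23.0",
    "babel-loader": "^9.1.0",
    "webpack": "^5.90.0"
  }
}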
In the case of a React app, all your JSX is converted to a syntax similar to React.createElement, and hence your app requires React at run time as well. Similarly, methods like setState and the lifecycle functions are all executed at run time.
As a matter of fact, React is a library that exposes methods and APIs to access and manipulate the DOM.
Consider it similar to jQuery or Express, which you add to the production build because they are used at run time.
When creating a React app, things like Babel, webpack, etc., which are only used to create the bundle, would be dev-dependencies, since once the packaged bundle is generated they are no longer needed at run time.
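To make the run-time dependency concrete, here is roughly what a tiny piece of JSX compiles down to with Babel's classic JSX transform (the component is just a placeholder):

import React from 'react';

// Source JSX:
//   const Greeting = () => <h1 className="title">Hello</h1>;
// What Babel emits:
const Greeting = () => React.createElement('h1', { className: 'title' }, 'Hello');
// React.createElement executes at run time, which is why react must be installed in production.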
I had the same question and found this article that explains it well.
To summarize, it shouldn't matter where you put the react dependencies, since webpack will bundle them into a self-contained file and the code will run fine in either case. However, separating development and production dependencies better communicates the purpose of these dependencies to other developers.
A dependency is something that is imported in the src/ or client/ module like you mentioned. The code that you run depends on it and hence is required in the production bundle.
If it were something like Babel it could be a devDependency because:
It is not imported in source code.
It is a transpiler and works on the code at compile time.
But that is not the case with React. Functions like setState or lifecycle hooks are executed at run time, meaning the library has to be present at run time.
React could very well be listed as a peer dependency in some of the popular libraries. This is because we assume that any project using such a library will already have React installed. Consider the case of react-bootstrap: it can only run inside React projects, so it declares react as a peer dependency rather than installing its own copy.
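For instance, such a library's package.json might declare the relationship roughly like this (the package name and version range are made up):

{
  "name": "my-react-widgets",
  "peerDependencies": {
    "react": ">=16.8.0"
  }
}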
If you need a framework that disappears at compile time, take a look at Svelte.

The best way to actively develop an NPM package that's consumed by another app running locally

I'm currently working on a React application that's consuming a React library we also develop.
Currently, the process is to copy over the dist folder of the library over to the node_modules folder of the application.
To resolve the tedious nature of this, I thought the solution would be simple: to npm link the package in our application, and have the JSX/React components run through the application's babel-loader. That way, we'd also get webpack's dev server to watch for changes in the library and refresh automatically.
The problem with this is that the library's babel settings are different from those of the consuming application. For instance, root imports in the library (e.g. import ~/some-module) are supposed to resolve from the root folder of the library, but instead, they resolve to the root folder of the application, resulting in errors, because the only babel configuration it uses is the .babelrc from the application.
I tried adding separate webpack config rules to make exceptions for the library, but it feels kind of hacky. In addition, the webpack dev server becomes incredibly slow to boot, presumably because it is also running a Babel transformation over the library.
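For illustration, the kind of per-library exception described above might look roughly like this inside the application's webpack module.rules (the package name, paths and options are placeholders, not the actual configuration):

// assumes `const path = require('path');` at the top of the webpack config
{
  test: /\.jsx?$/,
  include: path.resolve(__dirname, 'node_modules/our-ui-library'),
  use: {
    loader: 'babel-loader',
    options: {
      babelrc: false, // ignore the application's .babelrc for these files
      // use the library's own Babel settings instead
      configFile: path.resolve(__dirname, 'node_modules/our-ui-library/babel.config.js'),
    },
  },
}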
Is there an easier way to do this? Like telling webpack that "for this library in node_modules, use its own configuration file, and respect all of its own babel settings and relative imports?"

Socket.io and server-side webpack bundling

In a web project, the server side/backend is packed and bundled by webpack in node mode (https://webpack.js.org/configuration/node/) to produce one independent, bundled, distributable file, just as for the client/frontend side.
But there is one problem: that project depends on the Socket.io library, and the server-side part of socket.io contains the following line: https://github.com/socketio/socket.io/blob/master/lib/index.js#L110 , which implies loading part of the library at runtime.
Such behavior causes problems when bundling the server side with webpack, because the socket.io-client library is not required directly via a require call and therefore does not get compiled into the bundle.
Of course, it's potentially possible to develop one's own webpack plugin that searches for and handles require.resolve invocations, for example by resolving the target files and placing them in a memory file system or packing them into the bundle as resources. But that is not simple manual work, and most probably a well-known solution already exists.
For example, maybe there is already a bundled version of the library for webpack usage?
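One commonly used workaround (a sketch, not something taken from the question) is to leave socket.io out of the bundle entirely and let Node load it from node_modules at run time via webpack's externals option:

// webpack.config.js excerpt -- hypothetical
module.exports = {
  target: 'node',
  // ...entry and output as usual...
  externals: {
    // keep require('socket.io') as a plain CommonJS require at run time,
    // so its internal require.resolve of socket.io-client keeps working
    'socket.io': 'commonjs socket.io',
  },
};

The trade-off is that node_modules (or at least socket.io) must be present on the server alongside the bundle.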

Angular2 deploying to production environment questions

Some questions about putting an Angular 2 web project into a production environment:
We do development on lite-server, but what is best for production? Is it some other Node.js server module? Technically we can use any server (Apache, Tomcat, etc.).
How should we do source code management in the following context?
The browser must include JS files, so the project should contain JS files when deployed.
In a standard Java project we just commit .java files and use Jenkins (or other tools) to compile and produce the deployable structure.
Should we follow the same strategy here? That is, don't commit compiled JS files, and deploy using some Node-based compiler that takes the TS files and compiles them to JS.
What is the best way to minify/obfuscate the JS files?
I know a way using outDir and outFile with gulp, but I don't want every file to be included in one minified file because it defeats the concept of lazy loading.
Is there a way to minify and obfuscate the JS files at compile time only?
What does enableProdMode() do? How is it different from not using it?
Here are some answers to your questions:
Angular 2 applications consist only of static files, so they can be served by any static web server or by server applications that can define static folders (Express, ...).
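For example, a minimal Express server serving the built application might look like this (the dist folder name and port are assumptions):

// server.js -- minimal static file server (sketch)
const express = require('express');
const path = require('path');

const app = express();
app.use(express.static(path.join(__dirname, 'dist')));
// fall back to index.html so client-side routes still work on refresh
app.get('*', (req, res) => res.sendFile(path.join(__dirname, 'dist', 'index.html')));
app.listen(8080);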
Regarding source code management, you must have a packaging phase to optimize application loading (gather files, uglify, ...). Your source code should contain your TypeScript files (or JS files if using ES5 or ES6). Such packaging can be done using Gulp, for example. Your Jenkins server will be able to check out the source code, build it and execute the tests.
In fact, when not using the outFile property of the TypeScript compiler, you won't be able to gather all the compiled JS files into a single one, since anonymous modules would be created within each JS file.
See this question for more details of this:
How do I actually deploy an Angular 2 + Typescript + systemjs app?
Regarding prod mode, here is an extract of the documentation:
Disable Angular's development mode, which turns off assertions and other checks within the framework.
One important assertion this disables verifies that a change detection pass does not result in additional changes to any bindings (also known as unidirectional data flow).
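In practice, prod mode is switched on in the bootstrap file before the application starts; with the Angular CLI conventions (assumed here) it looks roughly like this:

// main.ts (sketch)
import { enableProdMode } from '@angular/core';
import { platformBrowserDynamic } from '@angular/platform-browser-dynamic';
import { AppModule } from './app/app.module';
import { environment } from './environments/environment';

if (environment.production) {
  // disables Angular's dev-mode assertions, including the second
  // change-detection pass used to verify unidirectional data flow
  enableProdMode();
}
platformBrowserDynamic().bootstrapModule(AppModule);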
