Lerna/webpack repository not building inside docker

I have a lerna repository that uses webpack to build. When I build directly in the repository with NODE_ENV=production npm run lerna run build --scope=@contuit/service-api-executor, the build works properly. However, in the docker build I never end up with an output file main.js.
If I build in the repository first and then do the docker build, the built file gets copied and the container runs properly. What am I missing here?
Dockerfile
FROM node:fermium as build
# Create app directory
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
# Install app dependencies
COPY package.json .
RUN npm install --loglevel notice
COPY packages/service-api-executor ./packages/service-api-executor
COPY packages/http ./packages/http
COPY packages/engine-sdk ./packages/engine-sdk
COPY packages/logger ./packages/logger
COPY packages/error ./packages/error
COPY packages/service ./packages/service
COPY packages/cli ./packages/cli
COPY lerna.json .
# Bundle app source
RUN yarn bootstrap
RUN NODE_ENV=production npm run lerna run build --scope=@contuit/service-api-executor
EXPOSE 7001
CMD [ "node", "packages/service-api-executor/dist/main.js" ]
webpack.config
const path = require('path');
const webpack = require('webpack');
const LernaProject = require('@lerna/project');
const StartServerPlugin = require('start-server-webpack-plugin');
const nodeExternals = require('webpack-node-externals');
const getBackendConfig = async (mode = 'development') => {
const project = new LernaProject(process.cwd());
const packages = await project.getPackages();
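// Each workspace package's own node_modules directory; these are handed to
// webpack-node-externals below so package-local dependencies are not bundled.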
const moduleDirs = packages.map((p) => path.resolve(p.location, 'node_modules'));
const plugins = [new webpack.DefinePlugin({ 'global.GENTLY': false })];
console.log({ mode });
if (mode === 'development') {
plugins.push(
new StartServerPlugin({
name: 'main.js',
signal: true,
nodeArgs: ['--inspect']
})
);
}
return {
mode,
target: 'node',
stats: { logging: 'verbose' },
entry: './src/drivers/webserver/index.js',
watchOptions: {
// eslint-disable-next-line require-unicode-regexp
ignored: ["/node_modules\/(?!#contuit)/"],
},
externals: [
nodeExternals({
modulesDir: path.resolve(process.cwd(), 'node_modules'),
additionalModuleDirs: moduleDirs,
}),
],
output: {
filename: 'main.js',
path: path.resolve(process.cwd(), 'dist'),
},
plugins,
};
};
module.exports = { getBackendConfig };
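For context, the per-package webpack.config.js (not shown in the question) presumably wraps this helper. A minimal sketch, assuming the root file above is webpack.config.js and the package sits two levels below it, might look like:
// Hypothetical packages/service-api-executor/webpack.config.js
// webpack accepts a config exported as a Promise, so the async helper can be
// returned directly; mode is driven by NODE_ENV, as in the build command.
const { getBackendConfig } = require('../../webpack.config');

module.exports = getBackendConfig(process.env.NODE_ENV || 'development');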

Related

How do I integrate NewRelic into a Node Typescript Express server bundled with Webpack?

Frankly, I've tried it all. I'm not a total whiz with Webpack, but I seem to have gotten along pretty well over the years configuring new projects.
What I cannot seem to do now is set up the New Relic service in an existing Node/TypeScript/Express/Webpack application.
As it stands, my app gets nicely bundled to a single file in my /dist folder and runs quick and nimble. It seems like this 'node agent' put out by New Relic doesn't play well with TypeScript imports.
Webpack Config
const path = require('path');
const webpack = require('webpack');
const nodeExternals = require('webpack-node-externals');
const NodemonPlugin = require ('nodemon-webpack-plugin');
module.exports = (env = {}) => {
const config = {
entry: ['./src/app.ts'],
mode: env.development ? 'development' : 'production',
target: 'node',
devtool: env.development ? 'inline-source-map' : false,
resolve: {
extensions: ['.ts', '.js'],
modules: ['node_modules', 'src', 'package.json'],
},
module: {
rules: [
{
test: /\.ts$/,
use: ['ts-loader', 'eslint-loader'],
// exclude: /node_modules/,
},
],
},
plugins: [],
externals: [ 'newrelic', nodeExternals() ]
};
if (env.nodemon) {
config.watch = true;
config.plugins.push(new NodemonPlugin())
}
return config;
};
There exists a standard /project_root/.newrelic file.
CircleCI picks this project up and runs the "build:ci" script from package.json ==> "webpack".
Output is /dist/main.js.
references
https://docs.newrelic.com/docs/agents/nodejs-agent/installation-configuration/install-nodejs-agent
https://docs.newrelic.com/docs/agents/nodejs-agent/installation-configuration/nodejs-agent-configuration
https://discuss.newrelic.com/t/node-agent-fails-with-webpack/24874
The first line of your app's entry point should be
import newrelic from 'newrelic';
Of course, run npm install newrelic --save first.
Then create a newrelic.js file at the root of the repo (outside of src).
Then put in the details, like:
'use strict'
exports.config = {
app_name: ['appName'],
license_key: '1234567890',
allow_all_headers: true,
attributes: {
exclude: [
'request.headers.cookie',
'request.headers.authorization',
'request.headers.proxyAuthorization',
'request.headers.setCookie*',
'request.headers.x*',
'response.headers.cookie',
'response.headers.authorization',
'response.headers.proxyAuthorization',
'response.headers.setCookie*',
'response.headers.x*'
]
}
}
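Putting it together, the top of the entry file (src/app.ts in the question) would presumably look like the sketch below; everything after the newrelic import is illustrative. Because 'newrelic' is already listed in the question's webpack externals, the agent is resolved from node_modules at runtime instead of being bundled.
// Sketch of the entry file: the newrelic import must come first so the agent
// can instrument modules (such as express) before they are loaded.
import newrelic from 'newrelic';
import express from 'express';

const app = express();
app.get('/', (req, res) => res.send('ok'));
app.listen(3000);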

Creating an npm package from CRA build

We are trying to create a micro-frontends app. We would like to take each micro app (created with CRA), run
npm run build
over it, take the /build folder created, make an npm package out of it, and publish it to our npm repo.
We don't want to eject the app projects just to be able to edit the webpack.config.
We would prefer to take the /build output and pass it through webpack again (even in a different project) in order to get an output that can be published as an npm package and imported correctly into a new project.
I'm trying to do this by taking the /build folder and running it through webpack with this configuration:
const path = require("path");
const UglifyJsPlugin = require("uglifyjs-webpack-plugin");
const glob = require("glob");
module.exports = {
entry: {
"bundle.js": glob
.sync("build/static/?(js|css)/main.*.?(js|css)")
.map(f => path.resolve(__dirname, f))
},
output: {
filename: "build/static/js/bundle.min.js",
libraryTarget: "commonjs2"
},
module: {
rules: [
{
test: /\.css$/,
use: ["style-loader", "css-loader"]
}
]
},
plugins: [new UglifyJsPlugin()],
resolve: {
alias: {
src: path.join(__dirname, "./src")
}
},
externals: {
react: "commonjs react"
}
};
The result is a single JS file. After publishing it to npm, I try to import it and get many errors.
I think there is something missing in my webpack.config, or maybe there is a different way to take the whole /build folder and combine it into something that can be published as an npm package and imported correctly?
Any help will be very much appreciated!
Thank you!

Docker + Webpack (Dev Server) + Yarnpkg incomplete builds

Problem
I am converting a webpack project that currently runs locally so that it runs inside docker containers. This work takes place in two git branches: develop and containers.
Local (No Container)
develop is the stable base, which runs locally via
$ yarn install && npm run dev given the following in package.json
"scripts": {
"start": "node .",
"env:dev": "cross-env NODE_ENV=development",
"env:prod": "cross-env NODE_ENV=production",
"predev": "npm run prebuild",
"dev": "npm run env:dev -- webpack-dev-server",
//[...]
}
The branch develop does include yarn.lock, though FWIW, $ rm yarn.lock && yarn install --force && npm run dev does start up the server correctly, i.e. GET http://localhost:3000 gives me the homepage as I expect to see it. The above all works the same after $ git checkout containers.
Docker
After shutting down the local dev server, I run $ git checkout containers, and this branch does NOT contain the yarn.lock or package.lock. I then run $ docker-compose up --build web (in a separate terminal, in a sibling directory that contains the following in the docker-compose.yaml)
web:
build:
context: ../frontend/
dockerfile: Dockerfile
env_file: ../frontend/.env
volumes:
- ../frontend/src:/code/src
ports:
- "3001:3000"
depends_on:
- api
networks:
- backend
The frontend/Dockerfile for the web service is as follows:
# Dockerfile
FROM node:latest
RUN mkdir /code
ADD . /code/
WORKDIR /code/
RUN yarn cache clean && yarn install --non-interactive --force && npm rebuild node-sass
CMD npm run dev --verbose
given
#frontend/.dockerignore
node_modules
deploy
.circleci
stories
.storybook
All seems to go well, and the final line of the startup is web_1 | Server is running at http://localhost:3000/.
Yet when I GET http://localhost:3001 (note the port mapping in docker-compose), the page that's returned does not contain the expected <style>...</style> tag in the <head>, which (as far as I understand) is supposed to be injected by webpack, given the configuration below:
// https://github.com/diegohaz/arc/wiki/Webpack
const path = require('path')
const devServer = require('@webpack-blocks/dev-server2')
const splitVendor = require('webpack-blocks-split-vendor')
const happypack = require('webpack-blocks-happypack')
const serverSourceMap = require('webpack-blocks-server-source-map')
const nodeExternals = require('webpack-node-externals')
const AssetsByTypePlugin = require('webpack-assets-by-type-plugin')
const ChildConfigPlugin = require('webpack-child-config-plugin')
const SpawnPlugin = require('webpack-spawn-plugin')
const ExtractTextPlugin = require('extract-text-webpack-plugin')
const {
addPlugins, createConfig, entryPoint, env, setOutput,
sourceMaps, defineConstants, webpack, group,
} = require('@webpack-blocks/webpack2')
const host = process.env.HOST || 'localhost'
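// The dev server port is derived as PORT + 1 when PORT is set, otherwise 3001.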
const port = (+process.env.PORT + 1) || 3001
const sourceDir = process.env.SOURCE || 'src'
const publicPath = `/${process.env.PUBLIC_PATH || ''}/`.replace('//', '/')
const sourcePath = path.join(process.cwd(), sourceDir)
const outputPath = path.join(process.cwd(), 'dist/public')
const assetsPath = path.join(process.cwd(), 'dist/assets.json')
const clientEntryPath = path.join(sourcePath, 'client.js')
const serverEntryPath = path.join(sourcePath, 'server.js')
const devDomain = `http://${host}:${port}/`
//[...]
const sass = () => () => ({
module: {
rules: [
{
test: /\.(scss|sass)$/,
use: [
{ loader: 'style-loader' },
{ loader: 'css-loader' },
{ loader: 'sass-loader'},
],
},
],
},
})
const extractSass = new ExtractTextPlugin({
filename: 'style.css',
})
const prodSass = () => () => ({
module: {
rules: [
{ test: /\.(scss|sass)$/,
use: extractSass.extract({
use: [
{ loader: 'css-loader', options: { minimize: true } },
{ loader: 'sass-loader' },
],
fallback: 'style-loader',
}),
},
],
},
})
const babel = () => () => ({
module: {
rules: [
{ test: /\.jsx?$/, exclude: /node_modules/, loader: 'babel-loader' },
],
},
})
const assets = () => () => ({
module: {
rules: [
{ test: /\.(png|jpe?g|svg|woff2?|ttf|eot)$/, loader: 'url-loader?limit=8000' },
],
},
})
const resolveModules = modules => () => ({
resolve: {
modules: [].concat(modules, ['node_modules']),
},
})
const base = () => group([
setOutput({
filename: '[name].js',
path: outputPath,
publicPath,
}),
defineConstants({
'process.env.NODE_ENV': process.env.NODE_ENV,
'process.env.PUBLIC_PATH': publicPath.replace(/\/$/, ''),
}),
addPlugins([
new webpack.ProgressPlugin(),
extractSass,
]),
apiInsert(),
happypack([
babel(),
]),
assets(),
resolveModules(sourceDir),
env('development', [
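// In development, emitted asset URLs use the dev server origin (devDomain) as publicPath.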
setOutput({
publicPath: devDomain,
}),
sass(),
]),
env('production', [
prodSass(),
]),
])
const server = createConfig([
base(),
entryPoint({ server: serverEntryPath }),
setOutput({
filename: '../[name].js',
libraryTarget: 'commonjs2',
}),
addPlugins([
new webpack.BannerPlugin({
banner: 'global.assets = require("./assets.json");',
raw: true,
}),
]),
() => ({
target: 'node',
externals: [nodeExternals()],
stats: 'errors-only',
}),
env('development', [
serverSourceMap(),
addPlugins([
new SpawnPlugin('npm', ['start']),
]),
() => ({
watch: true,
}),
]),
])
const client = createConfig([
base(),
entryPoint({ client: clientEntryPath }),
addPlugins([
new AssetsByTypePlugin({ path: assetsPath }),
new ChildConfigPlugin(server),
]),
env('development', [
devServer({
contentBase: 'public',
stats: 'errors-only',
historyApiFallback: { index: publicPath },
headers: { 'Access-Control-Allow-Origin': '*' },
host,
port,
}),
sourceMaps(),
addPlugins([
new webpack.NamedModulesPlugin(),
]),
]),
env('production', [
splitVendor(),
addPlugins([
new webpack.optimize.UglifyJsPlugin({ compress: { warnings: false } }),
]),
]),
])
module.exports = client
Interestingly, adding this line to package.json
"dev-docker": "npm run predev && npm run env:dev -- webpack --progress --watch --watch-poll",
and changing the last line of the Dockerfile to CMD npm run dev-docker does yield the desired effect...
Hypotheses
My current suspicion is that I am missing something about how the webpack dev server handles serving its loader output, and have not mapped some port properly, but that's a shot in the dark.
Alternatively, the webpack-dev-server version is a problem. Local is 4.4.2 whereas docker's shows 5.6.0, though this is probably not the issue, as the documentation for the latest version matches my own setup. I've confirmed that the package.json specification for the loader modules is the latest stable for each of them.
Apologia
Recognizing that this is a problem caused by the intersection of several technologies in a config-dependent and necessarily idiosyncratic way, I humbly ask your help in working through this dependency hell. If it seems like I do not understand how a given piece of the puzzle operates, I'm happy to hear it. Any ideas, leads, or suggestions, however tenuous, will be greatly appreciated and exploited to the best of my abilities.
Long shot here, but I was trying to run a grails-vue app in docker containers and had issues with the port mappings of webpack-dev-server not being exposed properly.
I found this issue on GitHub, https://github.com/webpack/webpack-dev-server/issues/547, which led me to add --host 0.0.0.0 to my dev task in package.json like so:
"dev": "webpack-dev-server --inline --progress --config build/webpack.dev.conf.js --host 0.0.0.0"
This solved my problem, maybe this will help you find your answer.
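If the dev server options live in the webpack config rather than on the CLI, the equivalent setting (a sketch, not taken from the question's config) would be:
// Bind to all interfaces so the port docker maps is reachable from the host.
module.exports = {
  devServer: {
    host: '0.0.0.0',
    port: 3000,
  },
};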
It's been a while, but coming back to this problem, I found the actual answer.
The webpack-dev-server uses two ports. Thus, by exposing only one port (3000) I was not getting the built files, which are served as client.js on localhost:3001. The clue was right there the whole time in the JS console: a connection-refused error on GET localhost:3001/client.js.
The solution is to expose both ports on the container, i.e.
docker run -it -p 3000:3000 -p 3001:3001 --rm --entrypoint "npm run env:dev -- webpack-dev-server" ${CONTAINER_REGISTRY}/${IMAGE_NAME}:${IMAGE_TAG}
It could be that your locally installed packages differ from the packages in the docker container.
To be sure that you have the same packages installed, you should include your yarn.lock and package-lock.json files. If you only use yarn, yarn.lock should suffice. Even if this does not solve your specific problem, it can prevent others, because now you have a deterministic build.

How to use webpack with a monorepo (yarnpkg workspaces)

I'm using yarn workspaces where the root directory has a packages directory with all my repos. Each repo has its own node_modules directory containing its dependencies. The root node_modules directory contains all the dev dependencies for the whole project as well as all other dev-related things such as webpack.config files. Webpack uses hot module reload for the express server package.
The problem I have is: how do I configure webpack externals to exclude all node_modules directories throughout the whole project, not just at the root?
webpack-node-externals doesn't seem to work given this scenario.
Error message:
WARNING in ./packages/servers/express/node_modules/colors/lib/colors.js
127:29-43 Critical dependency: the request of a dependency is an expression
WARNING in ./packages/servers/express/node_modules/express/lib/view.js
79:29-41 Critical dependency: the request of a dependency is an expression
Webpack config:
const webpack = require('webpack');
const path = require('path');
const nodeExternals = require('webpack-node-externals');
const StartServerPlugin = require('start-server-webpack-plugin');
module.exports = {
entry: [
'babel-polyfill',
'webpack/hot/poll?1000',
path.join(__dirname, '../packages/servers/express/server/index.js')
],
watch: true,
target: 'node',
externals: [
nodeExternals({
whitelist: ['webpack/hot/poll?1000']
})
],
resolve: {
alias: {
handlebars: 'handlebars/dist/handlebars.js'
}
},
module: {
rules: [
{
test: /\.js?$/,
use: 'babel-loader',
exclude: /node_modules/
}
]
},
plugins: [
new StartServerPlugin('server.js'),
new webpack.NamedModulesPlugin(),
new webpack.HotModuleReplacementPlugin(),
new webpack.NoEmitOnErrorsPlugin(),
new webpack.DefinePlugin({
'process.env': { BUILD_TARGET: JSON.stringify('server') }
})
],
output: {
path: path.join(__dirname, '../packages/servers/express/.build'),
filename: 'server.js'
}
};
If you're using yarn workspaces with webpack-node-externals, a better solution than setting modulesFromFile: true is to use the following externals setting in your webpack config:
externals: [
nodeExternals(),
nodeExternals({
modulesDir: path.resolve(__dirname, 'path/to/root/node_modules'),
}),
],
Essentially, you use two instances of nodeExternals: one for the package-level node_modules and one for the root node_modules.
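A minimal, self-contained version of that idea might look like the sketch below; the relative path to the root node_modules is an assumption about the repo layout:
// webpack.config.js (sketch): exclude both the workspace-local and the hoisted
// root node_modules from the server bundle.
const path = require('path');
const nodeExternals = require('webpack-node-externals');

module.exports = {
  target: 'node',
  externals: [
    // package-level node_modules (deps that could not be hoisted)
    nodeExternals(),
    // root node_modules created by yarn workspaces hoisting (path assumed)
    nodeExternals({
      modulesDir: path.resolve(__dirname, '../../node_modules'),
    }),
  ],
};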
Thanks to @blackxored I was able to fix it in my project.
In your webpack config file do the following:
import nodeExternals from 'webpack-node-externals'
Then add
externals: [
nodeExternals({
modulesFromFile: true,
}),
],
Yarn workspaces hoists compatible modules to the root node_modules directory, leaving any incompatible (different semver, etc.) modules in the dependent workspace's node_modules directory. If a package is requested without using a relative path, it is either native, from node_modules, or possibly a symlinked package from one of your workspaces. You probably want all of those packages to be external.
How do I configure webpack externals to exclude all node_modules directories throughout the whole project, not just at the root?
I would try using a function with webpack's externals option. You are passed the context of the require, the name of the module requested, and a callback to indicate whether this particular import (require) should be considered external.
externals: [
(ctx, req, cb) => {
if (!/node_modules/.test(ctx) && req[0] !== '.') {
// Assumes you have defined an "entries" variable
let notAnEntry = (path) => {
return Object.keys(entries).every((entry) => {
return entries[entry] !== path
});
};
if (notAnEntry(require.resolve(req))) {
// This module is external in a commonjs context
return cb(null, `commonjs ${req}`);
}
}
cb();
}
]
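For completeness, entries in the snippet above refers to the config's resolved entry paths; a minimal way to define it (a sketch, with the path taken from the question's config) would be:
// Build the "entries" map the externals function checks against, using
// absolute paths so they match what require.resolve(req) returns.
const path = require('path');

const entries = {
  server: path.join(__dirname, '../packages/servers/express/server/index.js'),
};
The same object can then be passed to webpack's entry option so the check and the build stay in sync.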
