Docker and Webpack hot reload not working - node.js

I'd like to use Docker for my upcoming React/webpack app, but I can't configure Webpack and/or Docker correctly for hot reload to work (webpack-dev-server).
I don't really understand why; the configuration seems fine to me. Maybe it's my "start" command that is wrong?
Here is the Dockerfile:
FROM node:11-alpine
WORKDIR /app
COPY package.json /app
RUN npm install
COPY . /app
CMD npm start
EXPOSE 8081
Here is webpack.config.js:
const HtmlWebPackPlugin = require("html-webpack-plugin");
const path = require('path');

module.exports = {
  entry: "./src/App.jsx",
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'bundle.js'
  },
  module: {
    rules: [
      {
        test: /\.(js|jsx)$/,
        exclude: /node_modules/,
        use: {
          loader: "babel-loader"
        }
      },
      {
        test: /\.s[ac]ss$/i,
        use: [
          // Creates `style` nodes from JS strings
          'style-loader',
          // Translates CSS into CommonJS
          'css-loader',
          'resolve-url-loader',
          // Compiles Sass to CSS
          'sass-loader',
        ]
      }
    ]
  },
  devServer: {
    historyApiFallback: true,
    port: 8081,
    host: '0.0.0.0',
    watchOptions: {
      aggregateTimeout: 500, // delay before reloading
      poll: 1000 // enable polling since fsevents are not supported in docker
    }
  },
  plugins: [new HtmlWebPackPlugin({ template: "./src/index.html" })]
};
Here is the npm "start" script:
"start": "webpack-dev-server --host 0.0.0.0 --config ./webpack.config.js --mode development",
Thank you!

From the comments, you are not binding a host volume to the container. You should bind-mount the host directory to make hot reloading work.
docker run -it --rm -v $PWD/host_app_code/:/app test
where $PWD/host_app_code/ is the path to the files on the host. Once you bind this path, your changes on the host will take effect inside the container, hot reload should work, and you will not need to rebuild the image every time.
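For completeness, here is a sketch that combines the bind mount with the port from the question's config; the anonymous node_modules volume is an extra assumption, added so the host directory does not shadow the dependencies installed inside the image:
# bind-mount the source, keep the image's node_modules, and publish the dev-server port
docker run -it --rm \
  -v $PWD/host_app_code/:/app \
  -v /app/node_modules \
  -p 8081:8081 \
  test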

As mentioned in @Adiii's answer, you will need to use a bind-mount volume so that the files change inside the container as you change them on your host, without having to rebuild the image.
I just wanted to add that Docker's getting-started tutorial explains using bind mounts for development. I would definitely recommend going through it if you haven't, as it gives a deeper understanding of how this works and why it is necessary.
Docker's getting-started tutorial is available on Docker Hub:
docker run -d -p 80:80 docker/getting-started

Related

Cannot use import statement outside a module

I'm facing a problem with my API on Heroku. I have a Node.js API, built with TypeScript and hosted on Heroku. Everything looks correct when I execute the local and build scripts, but when I run the start script, things don't work.
My configurations are:
Node: v18.12.1
NPM: v8.19.2
I have some scripts that transpile .ts files to .js files with Babel:
"build": "babel src --extensions \".js,.ts\" --out-dir dist --copy-files",
"dev:server": "ts-node-dev -r tsconfig-paths/register --inspect --transpile-only --ignore-watch node_modules src/shared/infra/http/server.ts",
"start": "node dist/shared/infra/http/server.js"
When I execute the dev:server and build scripts, everything runs successfully, but when I run the start script, I receive this error:
(screenshot of the "Cannot use import statement outside a module" error)
I checked some things, like my tsconfig.json and my babel.config.js, but everything looks correct.
module.exports = {
  presets: [
    ['@babel/preset-env', { targets: { node: 'current' } }],
    '@babel/preset-typescript',
  ],
  plugins: [
    [
      'module-resolver',
      {
        alias: {
          '@modules': './src/modules',
          '@config': './src/config',
          '@shared': './src/shared',
        },
      },
    ],
    'babel-plugin-transform-typescript-metadata',
    ['@babel/plugin-proposal-decorators', { legacy: true }],
    ['@babel/plugin-proposal-class-properties', { loose: true }],
    [
      'const-enum',
      {
        transform: 'removeConst',
      },
    ],
  ],
};
Because of this, when I deploy the API to Heroku, I receive the same error:
(screenshot of the same error on Heroku)
I have no idea why this occurs, because about a month ago the API was running perfectly on the Heroku production instance.
I'd appreciate it if anyone can help and give me some tips about this problem.
What I tried:
Checked and changed my npm and Node versions
Checked my Babel configs
Added "type": "module" in my package.json
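A quick check worth adding (a hypothetical diagnostic, not from the thread; the path is the one from the start script): since this error means Node hit an import statement in a CommonJS context, inspecting the first lines of the transpiled entry point shows whether Babel actually rewrote the module syntax.
# check whether the transpiled output still contains ES-module syntax
head -n 5 dist/shared/infra/http/server.js
# if `import ...` lines appear here, the transpile step left ES modules in place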

Generate standalone JS artifacts using Vite as a side effect of another build

I'm using Vite (vite@3.1.8) to build TypeScript artifacts for an SPA "site" using SolidJS (solid-js@1.6.0).
Here's my vite.config.ts:
import { defineConfig } from 'vite'
import solidPlugin from 'vite-plugin-solid'

export default defineConfig({
  plugins: [solidPlugin()],
  server: {
    port: 3000,
  },
  build: {
    target: 'esnext',
    outDir: '../htdocs',
    rollupOptions: {
      input: {
        index: "./index.html",
        dev: "./dev.html",
        test: "./test.ts",
      },
      output: {
        entryFileNames: `assets/[name].js`,
        chunkFileNames: `assets/[name].js`,
        assetFileNames: `assets/[name].[ext]`
      }
    },
  },
});
Currently, it builds the 2 HTML files (index.html and dev.html) and the artifacts needed to run them. It's great. Couldn't be happier.
I would also like the transpiler to emit test.js, so that I can run it to do some sanity checking before deploying to production.
I'm hoping to run vite build, then node ../htdocs/assets/test.js (or something similar), and have it block the final deployment if any of my sanity tests fail.
However, when I attempt to do this, running test.js produces an error complaining about my use of import statements:
Warning: To load an ES module, set "type": "module" in the package.json or use the .mjs extension.
Setting my package type to module in package.json doesn't fix it. Changing the test file to test.mjs doesn't fix it. I'm not really sure what to try next.
What I really wish it would do is resolve the "import"s as part of transpiling and produce one self-contained test.js that just runs. That seems to be what it does when building index.html and dev.html, so why won't it do that for my .ts file?
That should work. I just tried making a new repo with your vite.config.ts, one-line index.html, dev.html, and test.ts files, and vite, vite-plugin-solid, solid-js installed. In the end I got a ../htdocs/assets/test.js file.
You might also be interested in checking out Vitest, which makes testing like this easier to do and won't accidentally end up in your deployed htdocs.
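As a small illustration of that suggestion, a minimal Vitest sketch (the file name and assertion are invented; it assumes vitest is installed as a dev dependency):
// sanity.test.ts -- a hypothetical sanity check, run with `npx vitest run`
import { test, expect } from 'vitest'

test('sanity check', () => {
  expect(1 + 1).toBe(2)
})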
The best solution I could find was to make a separate config file for building the tests.
import { defineConfig } from 'vite'
import solidPlugin from 'vite-plugin-solid'

export default defineConfig({
  plugins: [solidPlugin()],
  server: {
    port: 3000,
  },
  build: {
    target: 'esnext',
    outDir: '../htdocs',
    lib: {
      entry: "./test-runner.ts",
      name: "test-runner",
      fileName: "test-runner"
    },
    rollupOptions: {
    },
  },
});
Then I updated my package.json so that my test script compiles and runs the output from that alternative Vite config:
"scripts": {
"start": "vite",
"dev": "vite",
"build": "vite build",
"serve": "vite preview",
"test": "vite build --config vite.config-tests.ts && node ../htdocs/test-runner.umd.js"
},
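One possible refinement, not part of the original answer: Vite's lib mode also accepts a formats option, so the test config could be narrowed to a single CommonJS output for Node to run (choosing 'cjs' here is an assumption; whether the emitted file ends in .cjs or .js depends on the package's type field):
// vite.config-tests.ts -- same idea as above, narrowed to one CommonJS bundle
import { defineConfig } from 'vite'
import solidPlugin from 'vite-plugin-solid'

export default defineConfig({
  plugins: [solidPlugin()],
  build: {
    target: 'esnext',
    outDir: '../htdocs',
    lib: {
      entry: './test-runner.ts',
      name: 'test-runner',
      fileName: 'test-runner',
      formats: ['cjs'], // emit only a CommonJS bundle that plain `node` can execute
    },
  },
})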

How to publish sitemap.xml file with webpack

I don't have much experience configuring Webpack. I am working on a React project and have created an external script that generates a sitemap.xml file. So far so good.
The project uses React 16 and Webpack 4.41. How can I make the sitemap.xml file available to be used in production? When I run npm run build locally, I can see the file is not being added to the /public folder, even after I added the xml rule (module/rules) to the webpack.config.js file:
mode: "production",
context: path.resolve(__dirname, "./app"),
entry: {
app: "./js/app.js",
styles: "./scss/main.scss",
},
output: {
filename: "[name].deploy.bundle.js",
path: path.resolve(__dirname, "./public/assets"),
publicPath: "/assets",
},
module: {
rules: [
{ test: /\.html/ },
{ test: /\.xml/ },
]
}
What am I missing?
Could it be because my file is currently outside the context folder?
The (simplified) project structure is as follows:
project
| sitemap.xml
| node-script-that-creates-sitemap.js
| webpack.config.js
|
└───app/
|
└───public/
I'd appreciate your help.
It turns out the solution didn't have anything to do with webpack. I managed to solve the issue by copying the file in the build step.
I added these scripts to my package.json file:
"scripts": {
"sitemap-to-build": "npm run generate-sitemap && cp ./sitemap.xml _site/sitemap.xml",
"build": "[other stuff...] && webpack --env.NODE_ENV=production --config webpack.config.js && npm run sitemap-to-build
}
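For reference, a webpack-native alternative to the copy script would be copy-webpack-plugin; a minimal sketch, assuming the plugin (v6+, which supports webpack 4) is installed:
// webpack.config.js (excerpt) -- copy sitemap.xml into /public at build time
const path = require("path");
const CopyWebpackPlugin = require("copy-webpack-plugin");

module.exports = {
  // ...existing config from the question...
  plugins: [
    new CopyWebpackPlugin({
      patterns: [
        {
          // absolute path, because the question's `context` points at ./app
          // while sitemap.xml lives at the project root
          from: path.resolve(__dirname, "sitemap.xml"),
          // `to` is resolved against output.path (./public/assets),
          // so ../ drops the file into ./public
          to: "../sitemap.xml",
        },
      ],
    }),
  ],
};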

Simple node.js workflow with docker

I'm using Docker on Windows for development purposes and I'm trying to create a simple workflow for a Node.js project.
I followed this tutorial https://nodejs.org/en/docs/guides/nodejs-docker-webapp/ so my Dockerfile looks like this:
FROM node:boron
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
COPY package.json .
# For npm#5 or later, copy package-lock.json as well
# COPY package.json package-lock.json ./
RUN npm install
# Bundle app source
COPY . .
EXPOSE 8080
CMD [ "npm", "start" ]
My "workflow" for each change would look like this
FIRST BUILD
docker build -t thomas/myApp DockerProjects/myApp ; docker run --name app -p 49160:8080 -d thomas/myApp
AFTER EACH CHANGE
docker build -t thomas/myApp DockerProjects/myApp ; docker stop app ; docker rm app ; docker run --name app -p 49160:8080 -d thomas/myApp
I don't want to have hundreds of containers after each change in the project, that's why i'm deleting it before creating another one.
I see several problems:
Each time there is a change and a new image is built, a new <none>:<none> image is created. These images are the same size as the original one. How can I avoid that?
Can I use nodemon somehow?
Can I launch this process automatically each time I change something in the code?
Docker is quite new to me and I'm still experimenting with it.
Thanks
You can use nodemon in your project to restart your app automatically while your source code directory is mounted as a volume.
For instance, with this directory structure (which uses Grunt from package.json to run nodemon):
app/
├── Dockerfile
├── package.json
├── Gruntfile.js
├── src/
│ └── app.js
└── docker-compose.yml
You can use docker-compose, which is a tool for running multiple containers. This can be useful if you want to add a database container your app talks to, or any additional services interacting with your app.
The following docker-compose config will mount the src folder at /usr/src/app/src in the container. With nodemon watching for changes inside src, you will be able to make changes on your machine that automatically restart the app in the container.
To use this you would do:
cd app
docker-compose up
The commands above will build the image from the Dockerfile and start the containers defined in docker-compose.yml.
docker-compose.yml:
version: '2'
services:
  your-app:
    build: .
    ports:
      - "8080:8080"
    restart: always
    container_name: app_container
    volumes:
      - ./src:/usr/src/app/src
    environment:
      - SERVER_PORT=8080
Dockerfile:
FROM node:latest
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY package.json .
COPY Gruntfile.js .
RUN npm install
CMD ["npm","start"]
Gruntfile.js:
var path = require('path');

module.exports = function (grunt) {
  grunt.initConfig({
    pkg: grunt.file.readJSON('package.json'),
    concurrent: {
      dev: {
        tasks: ['nodemon'],
        options: {
          logConcurrentOutput: true
        }
      }
    },
    nodemon: {
      dev: {
        script: 'src/app.js',
        options: {
          ignore: [
            'node_modules/**'
          ],
          ext: 'js'
        }
      }
    },
    clean: {}
  });

  grunt.loadNpmTasks('grunt-concurrent');
  grunt.loadNpmTasks('grunt-nodemon');
  grunt.registerTask('default', ['concurrent']);
};
package.json:
{
  "name": "your-app",
  "version": "1.0.0",
  "description": "service",
  "scripts": {
    "start": "grunt"
  },
  "author": "someone",
  "license": "MIT",
  "dependencies": {
    "express": "^4.14.0"
  },
  "devDependencies": {
    "grunt": "1.x.x",
    "grunt-cli": "1.x.x",
    "grunt-concurrent": "2.x.x",
    "grunt-nodemon": "0.4.x"
  }
}
Sample app.js:
'use strict';

const express = require('express');
const port = process.env.SERVER_PORT;

var app = express();

app.get('/', function(req, res) {
  res.send('Hello World');
});

app.listen(port, function() {
  console.log('listening on port ' + port);
});
To rebuild the image, you would run docker-compose build.
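If you want to rebuild and restart in one step, the --build flag combines both:
# rebuild the image and recreate the container in one command
docker-compose up --build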
Each time there is a change and a new image is built, a new <none>:<none> image is created. These images are the same size as the original one. How can I avoid that?
You can't. That <none>:<none> image is your previous image, which was replaced by your new one. So just delete it: docker image prune
Can I use nodemon somehow?
I'm not familiar with it, but it looks like it only restarts your server and doesn't run npm install.
Can I launch this process automatically each time I change something in the code?
I would use Jenkins and automatically build your new Docker image on each git commit.

Webpack not updating on Heroku rebuild

I have a Node app which builds React with Webpack and is hosted on Heroku. Whenever I push a newer version to Heroku master, the React files do not update. I have now pushed several newer versions, but the React files in webpack:// will not update and remain the originals from when I first deployed the app.
Here is my webpack.config.js file:
const webpack = require('webpack');
const path = require('path');

module.exports = {
  entry: {
    main: `${__dirname}/src/app.js`
  },
  output: {
    path: __dirname,
    filename: './public/bundle.js'
  },
  module: {
    loaders: [{
      loader: 'babel-loader',
      query: {
        presets: ['react', 'es2015', 'stage-2']
      },
      test: /\.jsx?$/,
      exclude: /(node_modules|bower_components)/
    }]
  },
  devtool: 'cheap-module-eval-source-map'
};
My package.json includes "heroku-postinstall": "webpack -p -w --config ./webpack.config.js --progress".
I also faced a similar issue.
(Just make sure that your webpack config file is correct and does not produce any errors when you run the webpack build locally.)
I modified my post-install script in the following way inside my package.json:
"scripts": {
"clean": "rimraf public/bundle.*",
"build": "cross-env NODE_ENV=production webpack --config ./webpack.prod.config.js --progress --colors",
"postinstall": "npm run clean && npm run build",
}
When I push my changes to Heroku, "postinstall" gets called and it performs the tasks one after another:
clean old build files
generate a new build
This way, old files get deleted from the cache.
But there are a few dependencies you need to install:
rimraf
npm install --save rimraf
You can choose another alternative to "rimraf" as well.
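For instance, if you would rather avoid the extra dependency, a sketch of a rimraf-free clean script (this assumes the build runs in a Unix-like environment, which Heroku's is; it will not work on Windows):
"clean": "rm -rf public/bundle.*"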
