How to publish sitemap.xml file with webpack - node.js

I don't have much experience configuring Webpack. I am working on a React project and have created an external script that generates a sitemap.xml file. So far so good.
The project uses React 16 and Webpack 4.41. How can I make the sitemap.xml file available in production? When I run npm run build locally, I can see the file is not being added to the /public folder, even after adding an xml rule (module/rules) to the webpack.config.js file:
mode: "production",
context: path.resolve(__dirname, "./app"),
entry: {
app: "./js/app.js",
styles: "./scss/main.scss",
},
output: {
filename: "[name].deploy.bundle.js",
path: path.resolve(__dirname, "./public/assets"),
publicPath: "/assets",
},
module: {
rules: [
{ test: /\.html/ },
{ test: /\.xml/ },
]
}
What am I missing?
Could it be because my file is currently outside the context folder?
The (simplified) project structure is as follows:
project
| sitemap.xml
| node-script-that-creates-sitemap.js
| webpack.config.js
|
└───app/
|
└───public/
I'd appreciate your help.

It turns out the solution didn't have to do with webpack. I managed to solve the issue by copying the file in the build step.
I added these scripts to my package.json file:
"scripts": {
"sitemap-to-build": "npm run generate-sitemap && cp ./sitemap.xml _site/sitemap.xml",
"build": "[other stuff...] && webpack --env.NODE_ENV=production --config webpack.config.js && npm run sitemap-to-build
}
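For reference, an alternative that keeps this inside webpack itself is copy-webpack-plugin, which can emit the file as part of the build. This is only a sketch of that alternative, not what the answer above used, and it assumes copy-webpack-plugin v6 or later (which supports webpack 4):
const path = require("path");
const CopyPlugin = require("copy-webpack-plugin");

module.exports = {
  // ...the existing mode/context/entry/output/module config from the question...
  plugins: [
    new CopyPlugin({
      patterns: [
        {
          from: path.resolve(__dirname, "sitemap.xml"),      // sitemap.xml at the project root
          to: path.resolve(__dirname, "public/sitemap.xml"), // copied next to the built site
        },
      ],
    }),
  ],
};
Because both paths are absolute, it does not matter that sitemap.xml lives outside the context folder.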

Related

Cannot use import statement outside a module

I'm facing a problem with my API on Heroku. I have a Node.js API, built with TypeScript and hosted on Heroku. Everything looks correct when I run the local dev script and the build script, but when I run the start script, things don't work.
My configurations are:
Node: v18.12.1
NPM: v8.19.2
I have some scripts that transpile the .ts files to .js files with Babel:
"build": "babel src --extensions \".js,.ts\" --out-dir dist --copy-files",
"dev:server": "ts-node-dev -r tsconfig-paths/register --inspect --transpile-only --ignore-watch node_modules src/shared/infra/http/server.ts",
"start": "node dist/shared/infra/http/server.js"
When I execute the dev:server and build scripts, everything runs successfully, but when I run the start script, I receive this error:
(screenshot of the error: "Cannot use import statement outside a module")
I checked some things, like my tsconfig.json and my Babel config, but everything looks correct:
module.exports = {
  presets: [
    ['@babel/preset-env', { targets: { node: 'current' } }],
    '@babel/preset-typescript',
  ],
  plugins: [
    [
      'module-resolver',
      {
        alias: {
          '@modules': './src/modules',
          '@config': './src/config',
          '@shared': './src/shared',
        },
      },
    ],
    'babel-plugin-transform-typescript-metadata',
    ['@babel/plugin-proposal-decorators', { legacy: true }],
    ['@babel/plugin-proposal-class-properties', { loose: true }],
    [
      'const-enum',
      {
        transform: 'removeConst',
      },
    ],
  ],
};
Because of this, when I deploy the API to Heroku, I receive this error:
(screenshot of the same error on Heroku)
I have no idea why this occurs, because about a month ago the API was running perfectly on the Heroku production instance.
I'd appreciate it if anyone can help and give me some tips about this problem.
What I tried:
Check and change my npm and Node versions
Check my Babel configs
Add "type": "module" to my package.json

Generate standalone js artifacts using Vite as a side effect of another build

I'm using Vite (vite@3.1.8) to build TypeScript artifacts for an SPA "site" using SolidJS (solid-js@1.6.0).
Here's my vite.config.ts:
import { defineConfig } from 'vite'
import solidPlugin from 'vite-plugin-solid'
export default defineConfig({
plugins: [solidPlugin()],
server: {
port: 3000,
},
build: {
target: 'esnext',
outDir: '../htdocs',
rollupOptions: {
input: {
index: "./index.html",
dev: "./dev.html",
test: "./test.ts",
},
output: {
entryFileNames: `assets/[name].js`,
chunkFileNames: `assets/[name].js`,
assetFileNames: `assets/[name].[ext]`
}
},
},
});
Currently, it actually builds the 2 html files (index.html and dev.html) and the artifacts needed to run them. It's great. Couldn't be happier.
I would like the transpiler to also emit test.js so that I can run it to do some sanity checking before deploying to production.
I'm hoping to run vite build, then run node ../htdocs/assets/test.js (or something similar), and have it block the final deployment if any of my sanity tests fail.
However, when I attempt to do this, running test.js gives an error complaining about my use of import statements:
Warning: To load an ES module, set "type": "module" in the package.json or use the .mjs extension.
Setting my package type to module in package.json doesn't fix it. Changing the test file to test.mjs doesn't fix it. I'm not really sure what to try next.
What I really wish it would do is resolve the "import" statements as part of transpiling and produce one self-contained test.js that just runs. That seems to be what it does when it builds index.html and dev.html, so why won't it do that for my ts file?
That should work. I just tried making a new repo with your vite.config.ts, one-line index.html, dev.html, and test.ts files, and vite, vite-plugin-solid, solid-js installed. In the end I got a ../htdocs/assets/test.js file.
You might also be interested in checking out Vitest, which makes testing like this easier to do and won't accidentally end up in your deployed htdocs.
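If you do go the Vitest route, a minimal sanity test could look like the sketch below (a hypothetical example, not from the original answer; it assumes vitest is installed and run via npx vitest run or a "test" script):
// sanity.test.js (hypothetical example)
import { describe, it, expect } from "vitest";

// stand-in for whatever checks test.ts actually performs
function add(a, b) {
  return a + b;
}

describe("sanity checks", () => {
  it("adds numbers", () => {
    expect(add(1, 2)).toBe(3);
  });
});
Vitest picks up the existing vite.config.ts by default, so the SolidJS plugin and TypeScript handling apply to the tests as well.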
The best solution I could find was to make a separate config file for building the tests.
import { defineConfig } from 'vite'
import solidPlugin from 'vite-plugin-solid'
export default defineConfig({
plugins: [solidPlugin()],
server: {
port: 3000,
},
build: {
target: 'esnext',
outDir: '../htdocs',
lib: {
entry: "./test-runner.ts",
name: "test-runner",
fileName: "test-runner"
},
rollupOptions: {
},
},
});
And then update my package.json so that my test script compiles and runs the output from that alternative Vite config:
"scripts": {
"start": "vite",
"dev": "vite",
"build": "vite build",
"serve": "vite preview",
"test": "vite build --config vite.config-tests.ts && node ../htdocs/test-runner.umd.js"
},

Docker and Webpack hot reload not working

I would like to use Docker for my future React/webpack app, but I can't configure Webpack and/or Docker correctly for hot reload to work (webpack-dev-server).
I don't really understand why; the configuration seems OK to me. Maybe my "start" command is not right?
Here is the Dockerfile:
FROM node:11-alpine
WORKDIR /app
COPY package.json /app
RUN npm install
COPY . /app
CMD npm start
EXPOSE 8081
Here is webpack.config.js:
const HtmlWebPackPlugin = require("html-webpack-plugin");
const path = require('path');
module.exports = {
entry: "./src/App.jsx",
output: {
path: path.resolve(__dirname, 'dist'),
filename: 'bundle.js'
},
module: {
rules: [
{
test: /\.(js|jsx)$/,
exclude: /node_modules/,
use: {
loader: "babel-loader"
}
},
{
test: /\.s[ac]ss$/i,
use: [
// Creates `style` nodes from JS strings
'style-loader',
// Translates CSS into CommonJS
'css-loader',
'resolve-url-loader',
// Compiles Sass to CSS
'sass-loader',
]
}
]
},
devServer: {
historyApiFallback: true,
port:8081,
host: '0.0.0.0',
watchOptions: {
aggregateTimeout: 500, // delay before reloading
poll: 1000 // enable polling since fsevents are not supported in docker
}
},
plugins: [new HtmlWebPackPlugin({ template: "./src/index.html" })]
};
Here is the npm start script:
"start": "webpack-dev-server --host 0.0.0.0 --config ./webpack.config.js --mode development",
Thank you!
From the comments, you are not binding a host volume to the container. You should bind a host volume to make hot reloading work:
docker run -it --rm -v $PWD/host_app_code/:/app test
where $PWD/host_app_code/ is the path to the host files. Once you bind this path, your changes on the host will take effect inside the container, hot reload should work, and you will not need to rebuild the image every time.
As mentioned in @Adiii's answer, you will need to use a bind-mount volume so that the files are changed inside the container as you change them on your host, without having to rebuild the image.
I just wanted to add that Docker's getting-started tutorial explains using bind mounts for development. I would definitely recommend going through it if you haven't, as it gives a deeper understanding of how this works and why it is necessary.
Docker's getting-started tutorial is available on Docker Hub:
docker run -d -p 80:80 docker/getting-started

bundle.js not getting created

I am trying to create a simple webpack project using VS Code.
It has two folders:
1. dist
2. src
I have an app.js file in the src folder and I need the bundle file to be created with the webpack command.
For this I am using the command below:
webpack ./src/app.js ./dist/bundle.js
but this command gives the error below:
ERROR in multi ./src/app.js ./dist/app.bundle.js
Module not found: Error: Can't resolve './dist/app.bundle.js' in 'D:\Webpack\WEBPACK-101'
@ multi ./src/app.js ./dist/app.bundle.js
It is probably some minor thing I am missing; it would be very helpful if anyone could figure out what exactly it is.
Thanks in advance!!!
If you don't have a webpack.config.js file, you can rely on the default configuration, but you need to do a few things first.
First, your "main" script has to be in src/ and it should be named index.js.
By default the output always goes to dist/.
Knowing that, you can simply run webpack and your bundle will be generated.
Or you could run webpack --entry ./src/app.js --output ./dist/bundle.js
You will need to create a webpack.config.js:
const path = require('path');
module.exports = {
entry: './src/app.js',
output: {
filename: 'bundle.js',
path: path.resolve(__dirname, 'dist')
}
};
Update package.json
{
"name": "webpack-demo",
"version": "1.0.0",
"description": "",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1",
+ "build": "webpack"
},
.....
}
And run:
npm run build
More Details
First of all, it is important that all the data in the package.json file is entered through the command line / terminal / Git shell. Then, on Windows 10, I resolved the problem by typing the following in the terminal:
$ npm install typescript

Webpack not updating on Heroku rebuild

I have a Node app which builds React with Webpack and is hosted on Heroku. Whenever I push a newer version to Heroku master, the React files do not update. I have now pushed several newer versions but the React files in webpack:// will not update and remain the originals from when I first deployed the app.
Here is my webpack.config.js file:
const webpack = require('webpack');
const path = require('path');
module.exports = {
entry: {
main: `${__dirname}/src/app.js`
},
output: {
path: __dirname,
filename: './public/bundle.js'
},
module: {
loaders: [{
loader: 'babel-loader',
query: {
presets: ['react', 'es2015', 'stage-2']
},
test: /\.jsx?$/,
exclude: /(node_modules|bower_components)/
}]
},
devtool: 'cheap-module-eval-source-map'
};
My package.json includes "heroku-postinstall": "webpack -p -w --config ./webpack.config.js --progress".
I also faced a similar issue.
(Just make sure that your webpack config file is correct and does not produce any errors when you run the webpack build locally.)
I modified my postinstall script in the following way inside my package.json:
"scripts": {
"clean": "rimraf public/bundle.*",
"build": "cross-env NODE_ENV=production webpack --config ./webpack.prod.config.js --progress --colors",
"postinstall": "npm run clean && npm run build",
}
When I push my changes to Heroku, "postinstall" gets called and it performs the tasks one after another:
clean old build files
generate new build
This way the old files get deleted from the cache.
But there are a few dependencies you need to install:
rimraf
npm install --save rimraf
You can choose any other alternative to "rimraf" as well.
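If you prefer to keep the cleanup inside webpack rather than in a separate rimraf step, clean-webpack-plugin is one such alternative. A minimal sketch (not from the answer above; it assumes clean-webpack-plugin v3+, where the plugin is a named export, and an output.path that points at a dedicated build folder rather than the project root used in the question):
const path = require("path");
const { CleanWebpackPlugin } = require("clean-webpack-plugin");

module.exports = {
  // ...entry, module, devtool as before...
  output: {
    path: path.resolve(__dirname, "public"), // dedicated build folder
    filename: "bundle.js",
  },
  plugins: [
    // Deletes everything inside output.path before each build,
    // so stale bundles cannot survive a redeploy.
    new CleanWebpackPlugin(),
  ],
};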
