I have two APIs in Node.js using Babel, and my package.json has the following commands to run the application:
"build": "del-cli dist/ && babel src -d dist --copy-files",
"serve": "cross-env NODE_ENV=production node dist/index.js",
"start:noupdate": "cross-env NODE_ENV=development babel-node src/index.js",
"start:serve": "cross-env NODE_ENV=production node dist/index.js",
I have two domains: one is https://api.website1.com.br and the other is https://website2.com.br/api.
They use the same env file names, .env.production and .env.development, but each project's files contain different data for its own database.
When I run "yarn build", Linux executes this command:
"build": "del-cli dist/ && babel src -d dist --copy-files",
This works fine. When I put the app into production mode on my real web servers, I go to the project folder and run this command to bring the app online with PM2:
pm2 start npm -- run-script start:serve NODE_ENV=production
That will make this command work:
"cross-env NODE_ENV=production node dist/index.js"
The app runs just fine, but I have a problem: PM2 only ever runs one app and doesn't create a new PM2 app, it just restarts the one I already started.
For example, if I go to the folder of https://api.website1.com.br and run this command first, that app starts. But when I then go to the other project, it doesn't start a new app; it reloads the app I started earlier instead of creating a new one. What am I doing wrong?
I managed to get this working using a PM2 ecosystem file, which I found in the documentation at http://pm2.keymetrics.io/docs/usage/application-declaration/
I configured the default file and gave my app a name:
module.exports = {
  apps: [{
    name: "app",
    script: "./app.js",
    env: {
      NODE_ENV: "development"
    },
    env_production: {
      NODE_ENV: "production"
    }
  }]
}
Then I used the command pm2 start ecosystem.config.js, and now it is working. I'm posting this here in case someone else has the same problem.
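As I understand it (my explanation, not from the PM2 docs quoted above), the restart happened because PM2 identifies processes by name, and starting both projects via pm2 start npm gives them the same default name, so the second start targets the existing process. The ecosystem file fixes this when each project declares a unique name. A minimal sketch for one of the two projects, with an illustrative name and the built entry point from the build script:
module.exports = {
  apps: [{
    name: "api-website1",       // pick a distinct name per project, e.g. "website2-api" in the other one
    script: "./dist/index.js",  // the output of "yarn build"
    env: {
      NODE_ENV: "development"
    },
    env_production: {
      NODE_ENV: "production"
    }
  }]
}
Starting each project with pm2 start ecosystem.config.js --env production then shows them as two separate apps in pm2 list.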
Hi, I have a Node.js and React application locally, developed from boilerplate code. In the boilerplate's package.json I have the following scripts:
"scripts": {
"client-install": "npm install --prefix client",
"start": "node server.js",
"server": "nodemon server.js",
"client": "npm start --prefix client",
"dev": "concurrently \"npm run server\" \"npm run client\"",
"heroku-postbuild": "NPM_CONFIG_PRODUCTION=false npm install --prefix client && npm run build --prefix client"
}
As far as I know, it can be deployed on Heroku with these scripts. Can I use the same scripts to deploy on Azure, or do I need to change anything here?
const express = require("express");
const path = require("path");

const app = express();

// Serve static assets if in production
if (process.env.NODE_ENV === "production") {
  // Set static folder
  app.use(express.static("client/build"));
  app.get("*", (req, res) => {
    res.sendFile(path.resolve(__dirname, "client", "build", "index.html"));
  });
}
const port = process.env.PORT || 5000;
app.listen(port, () => console.log(`Server started on port ${port}`));
And this is what I have in my server.js as a starting point. I don't have a config folder for different environments, for example:
// I don't have this folder structure and am not using webpack
-- config
|-- dev.json
|-- prod.json
Can someone suggest the best way to deploy it? Or can I use the same post-build script by changing its key, e.g. to azure-postbuild?
Edit: I think I should use postinstall instead of heroku-postbuild.
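That seems right: heroku-postbuild is a Heroku-specific hook, while postinstall is a standard npm lifecycle script that runs after npm install on any host, Azure included. A hedged sketch of that swap, reusing the commands from the original script:
"scripts": {
  ...
  "postinstall": "npm install --prefix client && npm run build --prefix client"
}
Whether the NPM_CONFIG_PRODUCTION=false override is still needed depends on how Azure's build pipeline handles devDependencies, so treat that part as something to verify.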
Hi, I've been struggling for a while to figure out how to make an app on an AWS Windows Server 2016 instance publicly accessible. My EC2 security group allows inbound traffic on port 8080, and netstat shows 8080 is listening. The app runs on localhost on the EC2 instance via "npm run dev", but it cannot be accessed publicly.
webpack.config.js:
const path = require('path');
const CopyWebpackPlugin = require('copy-webpack-plugin');
module.exports = {
  entry: './app/javascripts/app.js',
  output: {
    path: path.resolve(__dirname, 'build'),
    filename: 'app.js'
  },
  plugins: [
    // Copy our app's index.html to the build folder.
    new CopyWebpackPlugin([
      { from: './app/index.html', to: "index.html" },
      { from: './app/javascripts/browser-solc.min.js', to: "browser-solc.min.js" }
    ])
  ],
...
In package.json:
"scripts": {
"lint": "eslint ./",
"build": "webpack",
"dev": "webpack-dev-server",
...
I tried changing "dev": "webpack-dev-server" to "dev": "webpack-dev-server --inline --port 8080 --hot --host ec2_public_DNS_address". The console said it was running on that DNS with port 8080 open, but I still cannot access it publicly.
Please help!
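(A note for later readers, not a confirmed fix for this exact setup: webpack-dev-server binds to localhost by default, and the usual remedy is to bind it to all interfaces rather than a specific hostname, then browse to the public DNS:
"dev": "webpack-dev-server --inline --hot --port 8080 --host 0.0.0.0"
If it still isn't reachable, the Windows firewall on the instance is worth checking in addition to the EC2 security group.)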
I have a Node app which builds React with Webpack and is hosted on Heroku. Whenever I push a newer version to Heroku master, the React files do not update. I have now pushed several newer versions but the React files in webpack:// will not update and remain the originals from when I first deployed the app.
Here is my webpack.config.js file:
const webpack = require('webpack');
const path = require('path');
module.exports = {
  entry: {
    main: `${__dirname}/src/app.js`
  },
  output: {
    path: __dirname,
    filename: './public/bundle.js'
  },
  module: {
    loaders: [{
      loader: 'babel-loader',
      query: {
        presets: ['react', 'es2015', 'stage-2']
      },
      test: /\.jsx?$/,
      exclude: /(node_modules|bower_components)/
    }]
  },
  devtool: 'cheap-module-eval-source-map'
};
My package.json includes "heroku-postinstall": "webpack -p -w --config ./webpack.config.js --progress".
I also faced a similar issue.
(Just make sure that your webpack config file is correct and doesn't produce any errors when you run the webpack build locally.)
I modified my post-install script in the following way inside my package.json:
"scripts": {
"clean": "rimraf public/bundle.*",
"build": "cross-env NODE_ENV=production webpack --config ./webpack.prod.config.js --progress --colors",
"postinstall": "npm run clean && npm run build",
}
When I push my changes to Heroku, "postinstall" gets called and performs the tasks one after another:
clean old build files
generate a new build
This way, the old files get deleted from the cache.
There are a few dependencies you need to install, though:
rimraf
npm install --save rimraf
You can choose any other alternative to rimraf as well.
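For example (my addition, not part of the original answer), del-cli, which another script in this thread already uses, does the same job with the same kind of glob argument:
"clean": "del-cli public/bundle.*"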
Thanks to an excellent answer by @McMath, I now have webpack compiling both my client and my server. I'm now trying to make webpack --watch useful. Ideally I'd like it to spawn something like nodemon for my server process when that bundle changes, and some flavor of Browsersync when my client changes.
I realize webpack is a bundler/loader and not really a task runner, but is there some way to accomplish this? The lack of Google results seems to indicate I'm trying something new, but this must have been done before.
I could always have webpack output to another directory and use gulp to watch it, copy it, and Browsersync-ify it, but that seems like a hack. Is there a better way?
Install the following dependencies:
npm install npm-run-all webpack nodemon
Configure your package.json file as shown below:
package.json
{
  ...
  "scripts": {
    "start": "npm-run-all --parallel watch:server watch:build",
    "watch:build": "webpack --watch",
    "watch:server": "nodemon \"./dist/index.js\" --watch \"./dist\""
  },
  ...
}
After doing so, you can easily run your project by using npm start.
Don't forget to configure WatchIgnorePlugin so webpack ignores the ./dist folder.
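A minimal sketch of that plugin setup, using the webpack 4-style constructor (the path is my assumption based on the ./dist folder above; webpack 5 takes an options object, new webpack.WatchIgnorePlugin({ paths: [...] }), instead):
// webpack.config.js
const path = require('path');
const webpack = require('webpack');

module.exports = {
  // ...
  plugins: [
    // keep webpack's watcher from re-triggering on its own output
    new webpack.WatchIgnorePlugin([path.resolve(__dirname, 'dist')])
  ]
};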
Dependencies
npm-run-all - A CLI tool to run multiple npm-scripts in parallel or sequential.
webpack - webpack is a module bundler. Its main purpose is to bundle JavaScript files for usage in a browser, yet it is also capable of transforming, bundling, or packaging just about any resource or asset.
nodemon - Simple monitor script for use during development of a node.js app.
I faced the same problem and found another solution: webpack-shell-plugin. It allows you to run any shell commands before or after webpack builds.
So, these are my scripts in package.json:
"scripts": {
"clean": "rimraf build",
"prestart": "npm run clean",
"start": "webpack --config webpack.client.config.js",
"poststart": "webpack --watch --config webpack.server.config.js",
}
If I run the start script, it launches the following sequence: clean -> start -> poststart.
And here is the relevant part of webpack.server.config.js:
var WebpackShellPlugin = require('webpack-shell-plugin');
...
if (process.env.NODE_ENV !== 'production') {
  config.plugins.push(new WebpackShellPlugin({ onBuildEnd: ['nodemon build/server.js --watch build'] }));
}
...
"onBuildEnd" event fires only once after first build, rebuilds are not trigger "onBuildEnd", so nodemon works as intended
I like the simplicity of nodemon-webpack-plugin
webpack.config.js
const NodemonPlugin = require('nodemon-webpack-plugin');

module.exports = {
  plugins: [new NodemonPlugin()]
};
Then just run webpack with the watch flag:
webpack --watch
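If you need nodemon to run a specific file or watch a particular folder, the plugin also accepts an options object. The keys below mirror nodemon's own options, but treat the exact names and paths as assumptions to check against the plugin's README:
new NodemonPlugin({
  script: './dist/server.js', // what nodemon should run (hypothetical path)
  watch: './dist'             // what nodemon should watch for restarts
})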
In addition to @Ling's good answer:
If you want to build your project once before watching it with nodemon, you can use a webpack compiler hook. The plugin code below triggers nodemon in the done hook, once, after webpack has finished its first compilation (see also this helpful post).
const { spawn } = require("child_process")

function OnFirstBuildDonePlugin() {
  let isInitialBuild = true
  return {
    apply: compiler => {
      compiler.hooks.done.tap("OnFirstBuildDonePlugin", compilation => {
        if (isInitialBuild) {
          isInitialBuild = false
          spawn("nodemon dist/index.js --watch dist", {
            stdio: "inherit",
            shell: true
          })
        }
      })
    }
  }
}
webpack.config.js:
module.exports = {
  ...
  plugins: [
    ...
    OnFirstBuildDonePlugin()
  ]
};
package.json:
"scripts": {
"dev" : "webpack --watch"
},
Hope, it helps.
There's no need to use plugins here. You could try running multiple nodemon instances like below. Try modifying the following script for your use case, and see if it works for you:
"scripts": {
"start": "nodemon --ignore './public/' ./bin/www & nodemon --ignore './public/' --exec 'yarn webpack'",
"webpack": "webpack --config frontend/webpack.config.js"
}
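One caveat from me: the single & that backgrounds the first nodemon is shell-specific and won't work in Windows cmd. A portable variant could lean on npm-run-all --parallel, which an earlier answer already uses (the script names here are illustrative):
"scripts": {
  "start": "npm-run-all --parallel start:server start:webpack",
  "start:server": "nodemon --ignore './public/' ./bin/www",
  "start:webpack": "nodemon --ignore './public/' --exec 'yarn webpack'",
  "webpack": "webpack --config frontend/webpack.config.js"
}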
You don't need any plugins to use webpack and nodemon; just use these scripts in your package.json:
"scripts": {
"start": "nodemon --ignore './client/dist' -e js,ejs,html,css --exec 'npm run watch'",
"watch": "npm run build && node ./server/index.js",
"build": "rimraf ./client/dist && webpack --bail --progress --profile"
},
@Ling's answer is very close to being correct, but it errors the first time somebody runs watch. You'll need to modify the solution as follows to prevent errors.
Run npm install npm-run-all webpack nodemon
Create a file called watch-shim.js in your root. Add the following contents, which will create a dummy file and directory if they're missing.
var fs = require('fs');

if (!fs.existsSync('./dist')) {
  fs.mkdirSync('./dist'); // create the folder synchronously so the write below can't run before it exists
  fs.writeFileSync('./dist/bundle.js', '');
}
Set up your scripts as follows in package.json. This will only run the watchers if watch-shim.js runs successfully, thereby preventing nodemon from crashing due to missing files on the first run.
{
  ...
  "scripts": {
    "start": "npm run watch",
    "watch": "node watch-shim.js && npm-run-all --parallel watch:server watch:build",
    "watch:build": "webpack --progress --colors --watch",
    "watch:server": "nodemon \"./dist/bundle.js\" --watch \"./dist/*\""
  }
  ...
}
Assuming you run nodemon server.js, touch the server.js file in webpack's afterEmit hook:
// webpack.config.js
module.exports = {
  // ...
  plugins: [
    // ...,
    // 👇 an inline plugin: any object with an `apply` method works here
    {
      apply: (compiler) => {
        compiler.hooks.afterEmit.tap('AfterEmitPlugin', (compilation) => {
          require('child_process').execSync('touch server.js') // $ touch server.js
        });
      }
    }
  ]
}
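One caveat worth adding: touch is a Unix command, so the execSync call above fails in a plain Windows shell. A cross-platform variant (my sketch, same effect) bumps the file's mtime from Node instead:
// inside the afterEmit tap, instead of shelling out to touch:
const fs = require('fs');
const now = new Date();
fs.utimesSync('server.js', now, now); // update atime/mtime so nodemon sees a change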
I tried most of the solutions provided above. I believe the best one is to use nodemon-webpack-plugin.
It is very simple to use: just add
const NodemonPlugin = require('nodemon-webpack-plugin')
to your webpack config file, with
new NodemonPlugin() as your plugin.
Below are the scripts to use it:
"scripts": {
"watch:webpack-build-dev": "webpack --watch --mode development",
"clean-db": "rm -rf ./db && mkdir -p ./db",
"local-dev": "npm run clean-db && npm run watch:webpack-build-dev"
...
}
After this you can simply run npm run local-dev.
Adding a module to your development dependencies is usually not as bad as adding one to production, and you will mostly be using it for development anyway.
This also doesn't require any additional packages like nodemon or npm-run-all.
Also note that nodemon-webpack-plugin only works in watch mode.