Getting Docker to work with Gulp and Browsersync - node.js

I'm trying to get Docker working with Gulp / Browsersync and having a lot of trouble. My docker-compose file:
# compose file for local development build
version: "3"
services:
  web:
    build:
      context: ./webapp
      args:
        - NODE_ENV=development
    environment:
      - PORT=8080
    command: npm run start-dev
    restart: "no"
    volumes:
      - ./webapp:/app
      - /app/node_modules/
    ports:
      - "8080:8080"
      - "7000:7000"
The command runs a script in my package.json which is gulp & nodemon start.js. In theory, this should start gulp and nodemon simultaneously. I see them both starting up in the terminal, but changes to my watched files do not trigger an update.
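For context, the relevant package.json fragment would look something like this (a hypothetical reconstruction: only the gulp & nodemon start.js part is stated above, and the start body is inferred from the Dockerfile's CMD below):
"scripts": {
  "start": "node start.js",
  "start-dev": "gulp & nodemon start.js"
}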
The Dockerfile I'm referencing in the compose file is as follows. It essentially just copies my app over and installs gulp globally (NODE_ENV is set to development by docker-compose, and the start command is overridden to npm run start-dev):
FROM node:8.16-alpine
# env setup
ARG NODE_ENV=production
ENV NODE_ENV=${NODE_ENV}
WORKDIR /app
COPY . /app
# install npm packages
RUN npm install
RUN npm -g install nodemon@^1.19.1
RUN npm -g install gulp@^4.0.2
# compile scss (runs with browsersync in dev)
RUN if [ ${NODE_ENV} != "development" ]; then gulp; fi
# not used by heroku
EXPOSE 8080
CMD ["npm", "start"]
My gulpfile is below (this is where I have the least understanding, as it was written by someone else). My guess is that it has something to do with the proxy, but from what I've seen in other guides it should be correct.
const gulp = require('gulp')
const sass = require('gulp-sass')
const scsslint = require('gulp-scss-lint')
const size = require('gulp-size')
const csso = require('gulp-csso')
const autoprefixer = require('gulp-autoprefixer')
const browserSync = require('browser-sync')
const plumber = require('gulp-plumber')
const reload = browserSync.reload

const AUTOPREFIXER_BROWSERS = [
  'ie >= 10',
  'ie_mob >= 10',
  'ff >= 30',
  'chrome >= 34',
  'safari >= 7',
  'opera >= 23',
  'ios >= 7',
  'android >= 4.4',
  'bb >= 10'
]

const SOURCE = {
  scss: 'scss/**/*.scss',
  css: 'public/css',
  nunjucks: 'views/**/*.nunjucks',
  html: '*.html',
  js: ['/*.js', 'public/js/*.js']
}

// browser-sync task for starting the server.
gulp.task('browser-sync', function() {
  browserSync({
    proxy: "web:8080",
    files: ["public/**/*.*"],
    browser: "google chrome",
    port: 7000
  })
})

gulp.task('scss-lint', function(done) {
  gulp.src('/' + SOURCE.scss)
    .pipe(scsslint())
  done()
})

// Compile, lint, and automatically prefix stylesheets
gulp.task('sass', gulp.series('scss-lint', function(done) {
  let res = gulp.src(SOURCE.scss)
    .pipe(plumber())
    .pipe(sass({
      includePaths: ['node_modules/', 'public/lib/chartist-js/dist/scss/']
    }))
    .pipe(autoprefixer({
      browsers: AUTOPREFIXER_BROWSERS
    }))
    .pipe(csso(SOURCE.css))
    .pipe(gulp.dest(SOURCE.css))
    .pipe(size({
      title: 'CSS: '
    }))
  // livereload for development
  if (process.env.NODE_ENV === 'development') {
    res.pipe(reload({
      stream: true
    }))
  }
  done()
}))

gulp.task('bs-reload', function() {
  browserSync.reload()
})

// default task to be run with `gulp`
if (process.env.NODE_ENV === 'development') {
  // compile and start browsersync
  gulp.task('default', gulp.series('sass', 'browser-sync', function(done) {
    gulp.watch(SOURCE.scss, ['sass'])
    gulp.watch([SOURCE.js, SOURCE.nunjucks, SOURCE.html], ['bs-reload'])
    done()
  }))
} else {
  // just compile
  gulp.task('default', gulp.series('sass'))
}

module.exports = gulp
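Two things may be worth checking here, offered as hedged guesses rather than a diagnosis. The proxy target web:8080 itself looks right, since Compose's internal DNS resolves the web service name and port 7000 is published. But gulp 4, which the Dockerfile installs, dropped the gulp 3 style gulp.watch(globs, ['task']) signature, and file watchers on a Docker bind mount sometimes need polling to see edits made on the host. A sketch of the development block under those assumptions:
// sketch assuming gulp 4: watch takes a function or gulp.series, not a ['task'] array
gulp.task('default', gulp.series('sass', 'browser-sync', function(done) {
  // usePolling is passed through to chokidar and helps on Docker bind mounts
  gulp.watch(SOURCE.scss, { usePolling: true }, gulp.series('sass'))
  gulp.watch([SOURCE.js, SOURCE.nunjucks, SOURCE.html], { usePolling: true }, gulp.series('bs-reload'))
  done()
}))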

Related

How to build and start Next.js in production with a GraphQL Apollo Server?

I am trying to dockerize a Next.js TypeScript app which uses Express and Apollo GraphQL.
My server/index.ts looks like this:
app.prepare().then(() => {
  const server = express.default();
  const apolloServer = new ApolloServer({
    typeDefs,
    resolvers,
  });
  server.get("*", (req, res) => {
    return handle(req, res);
  });
  apolloServer.start().then((res) => {
    console.log(res);
    const graphqlHandler = apolloServer.createHandler({ path: "/" });
    server.use("/api/graphql", graphqlHandler);
    server.listen(process.env.PORT || 3000, (err: string | void) => {
      if (err) throw err;
      console.log(
        `>>> Listening on http://localhost:${process.env.PORT || 3000}`
      );
    });
  });
});
Apollo client:
const GRAPHQL_URL = process.env.NODE_ENV == 'development' ? 'http://localhost:3000/api/graphql': 'https://app1.com/api/graphql' ;
package.json:
"scripts": {
"build:next": "next build",
"build": "npm run build:next && npm run build:server",
"start": "next start",
"start:production": "node dist/index.js"
If I build with npm run build and then run npm run start:production, then after the first refresh I get the error ReferenceError: Cannot access 'data' before initialization. In this case the query request happens client-side (CSR) and not via getServerSideProps. The NODE_ENV environment variable here is still "development" and not "production".
If I build with next build and start with next start, then my Apollo server does not start and I get a 404 because the graphql API is not found.
I am starting the app in production in a docker container:
FROM node:16
ENV PORT 3000
# Create app directory
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
# Installing dependencies
COPY package*.json /usr/src/app/
RUN npm install
# Copying source files
COPY . /usr/src/app
# Building app
RUN npm run build
EXPOSE 3000
# Running the app
ENTRYPOINT [ "npm", "run", "start:production" ]
What am I doing wrong here?
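One observation, offered as a hedged guess rather than a confirmed fix: the Dockerfile above never sets NODE_ENV, and next start only runs Next's built-in server, so the custom Express/Apollo server compiled to dist/index.js is never started by it. If the start:production path is the intent, a minimal sketch of the missing Dockerfile line would be:
# sketch: set NODE_ENV so both the build and the GRAPHQL_URL switch see "production"
ENV NODE_ENV=production
placed before RUN npm run build; the existing ENTRYPOINT [ "npm", "run", "start:production" ] then runs the custom server.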

Dockerfile not working on backend with react

I'm trying to run a React app with two node servers: one for the front end and one for the back end, connected to a MySQL database.
I'm using Docker for the containers, and I managed to get the database and the front-end server up. However, when the back-end server is fired up it seems like it doesn't acknowledge the Dockerfile:
node_server | npm WARN exec The following package was not found and will be installed: nodemon
node_server | Usage: nodemon [nodemon options] [script.js[args]
node_server |
node_server | See "nodemon --help" for more.
node_server |
node_server exited with code 0
Dockerfile - client:
FROM node:latest
RUN mkdir -p /usr/src/app
WORKDIR /usr/scr/app
EXPOSE 3000
COPY package.json .
RUN npm install express body-parser nano nodemon cors
COPY . .
Dockerfile - server:
FROM node:latest
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
RUN npm init -y
RUN npm install express body-parser nano nodemon cors
EXPOSE 5000
CMD ["npx", "nodemon", "src/server.js"]
docker-compose
version: '3'
services:
  backend:
    build:
      context: ./server
      dockerfile: ./Dockerfile
    depends_on:
      - mysql
    container_name: node_server
    image: raff/node_server
    ports:
      - "5000:5000"
    volumes:
      - "./server:/usr/src/app"
  frontend:
    build:
      context: ./client
      dockerfile: ./Dockerfile
    container_name: node_client
    image: raff/node_client
    ports:
      - "3000:3000"
    volumes:
      - "./client:/usr/src/app"
  mysql:
    image: mysql:5.7.31
    container_name: db
    ports:
      - "3306:3306"
    environment:
      MYSQL_ROOT_PASSWORD: admin
      MYSQL_DATABASE: assignment
The server side is not done yet, but I don't believe it's causing this error.
Server.js
"use strict";
const path = require("path");
const express = require("express");
const app = express();
const bodyParser = require("body-parser");
app.use(bodyParser.urlencoded({ extended: true }));
app.use(express.json());
const mysql = require("mysql");
let con = mysql.createConnection({
host: "mysql",
port: "3306",
user: "root",
password: "admin",
});
const PORT = 5000;
const HOST = "0.0.0.0";
app.post("/posting", (req, res) => {
var topic = req.body.param1;
var data = req.body.param2;
sql_insertion(topic, data);
});
// Helper
const panic = (err) => console.error(err);
// Connect to database
con.connect((err) => {
if (err) {
panic(err);
}
console.log("Connected!");
con.query("CREATE DATABASE IF NOT EXISTS assignment", (err, result) => {
if (err) {
panic(err);
} else {
console.log("Database created!");
}
});
});
//select database
con.query("use assignment", (err, result) => {
if (err) {
panic(err);
}
});
// Create Table
let table =
"CREATE TABLE IF NOT EXISTS posts (ID int NOT NULL AUTO_INCREMENT, Topic varchar(255), Data varchar(255), Timestamp varchar(255), PRIMARY KEY(ID));";
con.query(table, (err) => {
if (err) {
panic(err);
} else {
console.log("Table created!");
}
});
app.get("*", (req, res) => {
res.sendFile(path.join(__dirname, "client/build" , "index.html"));
});
app.listen(PORT, HOST);
console.log("up!");
Change this line:
CMD ["npx", "nodemon", "src/server.js"]
to:
CMD ["npx", "nodemon", "--exec", "node src/server.js"]
Putting the command in the package.json scripts section is better, though.
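For example, a minimal sketch (the dev script name, and the matching CMD, are hypothetical, not taken from the question):
"scripts": {
  "dev": "nodemon --exec node src/server.js"
}
with the Dockerfile then ending in CMD ["npm", "run", "dev"].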
Your volumes: declarations are hiding everything that's in the image, including its node_modules directory. That's not normally required, and you should be able to trim the backend: container definition down to:
backend:
  build: ./server         # default `dockerfile:` location
  depends_on:
    - mysql
  image: raff/node_server # only if you plan to `docker-compose push`
  ports:
    - "5000:5000"
The image then contains a fixed copy of the application, so there's no particular need to use nodemon; just run the application directly.
FROM node:latest
WORKDIR /usr/src/app    # also creates the directory
COPY package.json package-lock.json ./
RUN npm ci              # do not `npm install` unmanaged packages
COPY . .                # CHECK: `.dockerignore` must include `node_modules`
EXPOSE 5000
CMD ["node", "src/server.js"]
This apparently isn't a problem for your frontend application because there's a typo in its WORKDIR: the image installs and runs its code in /usr/scr/app, but the bind mount is over /usr/src/app, so the actual application's /usr/scr/app/node_modules directory isn't hidden.
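For the CHECK note in that Dockerfile, a minimal .dockerignore would be:
node_modules
which keeps the host's node_modules out of the build context, so COPY . . cannot overwrite what npm ci installed in the image.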

Getting this error while running docker-compose up - TypeError: redis.createClient({}) is not a function

I wrote this script to count the users every time they visit.
The build process successfully downloads and installs the dependencies,
but when I run docker-compose up, the line redis.createClient({}) throws an error saying it is not a function.
Dockerfile:
FROM node:alpine
WORKDIR /app
COPY package.json .
RUN npm install
COPY . .
CMD ["npm", "start"]
docker-compose.yml:
version: '3'
services:
  redis-server:
    restart: always
    image: redis
  node-app:
    restart: on-failure
    build: .
    ports:
      - "4001:8081"
Application code:
const express = require('express');
const redis = require('redis');
const process = require('process');
const app = express();

const client = redis.createClient({
  host: 'redis-server',
  port: 6379
});

client.set('visits', 0);

app.get('/', (req, res) => {
  client.get('visits', (err, visits) => {
    res.send('Number of visits ' + visits);
    client.set('visits', parseInt(visits) + 1);
  });
});

app.listen(8081, () => {
  console.log('Listening on port 8081');
});
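A side note that may or may not apply here (the installed version would need checking to confirm): with a bare redis dependency, npm install can pull node-redis v4, where the v3-style host/port options and callback API shown above no longer work. A sketch of the v4 equivalent, under that assumption:
// sketch assuming node-redis v4+: url-style options plus an explicit connect()
const redis = require('redis');
const client = redis.createClient({ url: 'redis://redis-server:6379' });

async function main() {
  await client.connect();                    // v4 clients do not auto-connect
  await client.set('visits', 0);
  const visits = await client.get('visits'); // v4 commands return promises
  console.log('visits =', visits);
}

main().catch(console.error);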

process.env undefined from docker-compose

I have a node.js script built with webpack where I want to use environment variables from docker-compose, but the variables are always undefined.
This is a little piece of the docker-compose file:
container:
  image: "node:8-alpine"
  user: "node"
  working_dir: /home/node/app
  environment:
    - NODE_ENV=development
  volumes:
    - ./project:/home/node/app
    - ./conf:/home/node/conf
  command: "yarn start"
I have this webpack configuration:
const path = require('path');
const TerserPlugin = require('terser-webpack-plugin');
const DefinePlugin = require('webpack').DefinePlugin;

module.exports = {
  entry: './src-js/widget.js',
  mode: process.env.NODE_ENV || 'development',
  output: {
    filename: 'widget.js',
    path: path.resolve(__dirname, 'public')
  },
  optimization: {
    minimizer: [new TerserPlugin()]
  },
  plugins: [
    new DefinePlugin({
      ENV: JSON.stringify(process.env.NODE_ENV)
    })
  ]
};
In my node script I would like to use the NODE_ENV variable, so I have tried all of these, but every time the value is undefined:
console.log(process.env.NODE_ENV);
console.log(ENV);
console.log(process.env); // is empty
From inside the docker container I have printed the environment variables and NODE_ENV is there, but I can't use it in my node file. Why? I usually run yarn build or yarn watch to recompile.
Try this in your webpack configuration:
new DefinePlugin({
  'process.env.NODE_ENV': JSON.stringify(process.env.NODE_ENV)
})
Additionally, the official docs have a snippet about this.
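Some background on why this helps (per webpack's documented behavior): DefinePlugin does a direct text substitution at build time. Defining ENV only replaces the bare identifier ENV in the bundled code; it never populates process.env at runtime. With the 'process.env.NODE_ENV' key, code like the following is rewritten during the build:
// in the bundled source
if (process.env.NODE_ENV === 'development') {
  // after the build, the condition literally reads: if ('development' === 'development')
  console.log('development build');
}
Note also that the substitution happens when webpack runs, so yarn build must run where NODE_ENV is actually set, i.e. inside the container in this setup.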

Gulp-nodemon and watch task

I'm trying to create my build flow using gulp and nodemon. The objective is to watch sass files and compile them to css, and also to restart the node application when the server file changes.
My gulpfile.js:
// requires implied by the tasks below
const gulp = require('gulp')
const sass = require('gulp-sass')
const concat = require('gulp-concat')
const nodemon = require('gulp-nodemon')

gulp.task('sass', function(){
  return gulp.src(sassFilesTobeProcessed)
    .pipe(sass())
    .pipe(concat('ready_stylesheet.css'))
    .pipe(gulp.dest('express/public/stylesheets'))
})

gulp.task('watch', function(){
  return gulp.watch(allSassFiles, ['sass']);
})

gulp.task('serve', function(){
  return nodemon({
    script: 'express/app.js',
  }).on('start', ['watch'])
    .on('change', ['watch'])
    .on('restart', function(){
      console.log('restarted');
    })
})
The watch task works fine: files are compiled after a change. But changes to my app.js server file don't trigger a server restart. When I comment out the .on statements the server reloads, but then of course the sass files are no longer watched. I assume there is some conflict between the two that I cannot find. Appreciate any help! My OS is Windows 7, node 4.2.6, nodemon 1.9.1.
Use a task dependency instead of .on(event) to start your watch task:
gulp.task('serve', ['watch'], function(){
  return nodemon({
    script: 'express/app.js',
  })
  .on('restart', function(){
    console.log('restarted');
  })
})
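(This answer uses gulp 3's task-dependency array, which matches the question's versions. Under gulp 4, where that array form was removed, a rough equivalent would be the sketch below.)
// sketch assuming gulp 4: run the watcher and the server task side by side
gulp.task('serve', gulp.parallel('watch', function server(done) {
  nodemon({
    script: 'express/app.js',
  }).on('restart', function(){
    console.log('restarted');
  })
  done()
}))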
Alternatively, emit the restart event with nodemon:
const cfg = require('../config')
const gulp = require('gulp')
const nodemon = require('nodemon')
const gnodemon = require('gulp-nodemon')

gulp.task('nodemon', ['ts', 'json'], () => {
  gnodemon({
    script: cfg.paths.main,
    tasks: ['ts', 'json'],
    ext: 'js',
    watch: [cfg.paths.src],
    // don't pollute the production environment with the test setup
    env: {'NODE_ENV': process.env.NODE_ENV !== 'production'
      ? process.env.NODE_ENV || 'development' : 'development'}
  })
  .on('start', ['mocha'])
})

gulp.task('default', ['nodemon'], () => {
  gulp.watch(cfg.paths.src, (event) => nodemon.emit('restart'))
})
