I'm starting a new project. I currently have a structure like this, from the root folder:
/app (CRA frontend app)
/server (Node.js Express app)
Dockerfile
docker-compose.yml
My requirements are the following:
Development
Fire up Docker that creates necessary container(s)
Hot reloading for frontend React app (using CRA)
Node.js server that can serve my React app with SSR (automatically updated when editing)
Accessible via http://localhost:3000
Production
Potentially fire up Docker that creates necessary container(s)
Creates production ready version of React app
Creates production ready version of Express app
Accessible via port 80
Where I am right now is somewhere in between. I don't know how to set up Docker the right way to make this whole thing work, and I don't really know how to structure my React app vs. the Express app while developing. The production part seems easier once I've figured out how to structure the development part... plus Nginx as a proxy for the Express app?
I currently have a Docker setup that fires up a container where hot reloading works, etc., but I don't know how to set up the Express part so they work nicely together...?
Any help is much appreciated.
Thanks
Very broad question. It might be better to break it down into more direct questions. Anyway, I don't think running your dev setup in Docker is ideal. Instead, build your app normally with CRA, then deploy it in Docker.
In my own projects, I have a Docker container running a Node server which serves the React app using SSR.
Here is the Docker part. Note that your package.json needs a script named start:prod for this to work; that script starts your app in production mode.
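For reference, a minimal sketch of what those package.json scripts could look like; the server/bootstrap.js path is an assumption (the bootstrap file itself is shown at the end of this answer), so adjust it to your own layout:
"scripts": {
  "build": "react-scripts build",
  "start:prod": "node server/bootstrap.js"
}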
# --- Dockerfile
# Pulled from docker hub and has everything
# needed to run a node project
FROM node:alpine
ENV PORT 3000
# Navigate (cd) to the app folder in the docker container
WORKDIR /usr/src/app
# Copy package.json / package-lock.json into the working directory
# Executed on build: docker build .
COPY ./package*.json ./
RUN npm i
# copy entire project into docker container
COPY . .
# build front-end with react build scripts and store them in the build folder
RUN npm run build
EXPOSE 3000
CMD ["npm", "run", "start:prod"]
Here's the Express server entry point that starts the server.
// -- server.js
import express from "express";
import router from "./controller/index";
const app = express();
const port = process.env.PORT || 3000;
// Tell the app to use the imported router
app.use(router);
// start the app
app.listen(port, () => {
console.log(`express running on port ${port}`);
});
Here is the controller/index.js file you'll need to set up the routes.
// -- controller/index.js
import express from "express";
import path from "path";
import serverRenderer from '../middleware/renderer';
const router = express.Router();
// root (/) should always serve our server rendered page
router.use('^/$', serverRenderer());
// other static resources should just be served as they are
router.use(express.static(
path.resolve(__dirname, '..', '..', 'build'),
{ maxAge: '30d' },
));
export default router;
And finally the renderer middleware, which renders the app on the server.
// -- renderer.js
import React from "react";
import { renderToString } from "react-dom/server";
import App from "../../src/App";
import path from "path";
import fs from "fs";
export default () => (req, res) => {
// point to html file created by CRA's build tool
const filePath = path.resolve(__dirname, "..", "..", "build", "index.html");
fs.readFile(filePath, "utf8", (error, htmlData) => {
if (error) {
console.error("error", error);
return res.status(404).end();
}
// render the app as string
const html = renderToString(<App />);
// inject rendered app into final html and send
return res.send(
htmlData
.replace('<div id="root"></div>', `<div id="root">${html}</div>`)
);
})
}
You will need a bootstrap.js file to register Babel and handle the imports (styles, images) that Node can't load directly, so the server can run the JSX above.
// -- bootstrap.js
require('ignore-styles');
require('url-loader');
require('file-loader');
require('babel-register')({
ignore: [/(node_modules)/],
presets: ['es2015', 'react-app'],
plugins: [
'syntax-dynamic-import',
'dynamic-import-node'
]
});
require("./index");
You can find the details of it all here:
https://blog.mytoori.com/react-served-by-express-running-in-docker-container
Related
I am following along with this tutorial to learn how to containerize a Node.js / Express application I'm working on. Everything seems to be working fine, except that when I run docker ps after building the container, I don't see my running container.
Node.js / express code:
const express = require("express");
const app = express();
const port = process.env.PORT || 3000;
app.get('/', (req, res) => {
res.send("Hi there")
})
app.listen(port, () => console.log(`Listening on port ${port}`))
Dockerfile
FROM node:latest
WORKDIR /untitled
COPY package.json .
RUN npm install
EXPOSE 3000
COPY . ./
CMD node server.js
My working directory is "untitled" because that's the default name that WebStorm gives my project and I haven't changed it yet.
The image build seems to have worked fine, as did creating the container, but docker ps isn't showing me any containers.
I'm developing with Next.js + Electron + TypeScript.
I used the npx create-next-app --example with-electron-typescript command to generate the code.
With npm run dev (which runs npm run build-electron && electron .), a local server comes up on localhost:8000. But in the production build no server is started internally; the app runs by accessing the files directly.
However, some APIs do not work correctly when there is no domain in location.origin, so things work in dev but not in the build.
So, if it is possible, I would like to run the server on localhost in the build version as well as in the Dev version.
Is there anything I can do to make it work?
It's not shown in any of the examples, even though someone requested one:
https://github.com/vercel/next.js/issues/28225
It is possible using a custom server:
https://nextjs.org/docs/advanced-features/custom-server
You can follow these steps to create one:
Clone the Electron Next TypeScript example repo:
https://github.com/vercel/next.js/tree/canary/examples/with-electron-typescript
Update ./electron-src/index.ts with the following code:
import { app, BrowserWindow } from 'electron';
import isDev from 'electron-is-dev';
import { createServer } from 'http';
import next from 'next';
import { parse } from 'url';

app.on('ready', async () => {
  // Use server-side rendering for both dev and production builds
  const nextApp = next({
    dev: isDev,
    dir: app.getAppPath() + '/renderer'
  });
  const requestHandler = nextApp.getRequestHandler();
  // Build the renderer code and watch the files
  await nextApp.prepare();
  // Create a new native HTTP server (which supports hot code reloading)
  createServer((req: any, res: any) => {
    const parsedUrl = parse(req.url, true);
    requestHandler(req, res, parsedUrl);
  }).listen(3000, () => {
    console.log('> Ready on http://localhost:3000');
  });
  // Create the main window and point it at the local server
  // (window options such as the preload script are omitted here;
  // keep whatever the example repo already configures)
  const mainWindow = new BrowserWindow({ width: 800, height: 600 });
  mainWindow.loadURL('http://localhost:3000/');
});
Update ./package.json Electron build configuration to include the renderer src files:
"build": {
"asar": true,
"files": [
"main",
"renderer"
]
}
In ./package.json, move next from devDependencies to dependencies. This means it will be available in production builds.
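For example, the relevant fragment would end up looking something like this (the version range is just a placeholder; keep whatever version the example already pins):
"dependencies": {
  "next": "^11.0.0"
}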
Then use the asar tool to unpack the binary and see the files/folders inside:
npx asar extract ./dist/mac/ElectronTypescriptNext.app/Contents/Resources/app.asar ./dist/unpack
Run the unpacked version to debug:
./node_modules/.bin/electron ./dist/unpack
I have created an Express server version and a Next.js version to prove it is possible:
https://github.com/kmturley/electron-server/tree/feature/express
https://github.com/kmturley/electron-server/tree/feature/next
I am new to Node.js. I have a Node.js + Express app that is causing a Docker container to repeatedly stop with exit code 137 (out-of-memory error) every 5-10 minutes.
The app could not be simpler: it serves static HTML, CSS, JS, and images.
const express = require('express')
const app = express()
const port = 8080
app.use(express.static('public'))
app.get('/', (req, res) => {
res.sendFile('index.html', { root: __dirname + "/public" } );
})
app.listen(port, () => {
console.log(`web app listening at http://localhost:${port}`)
})
Directory structure
--root
server.js
--public
index.html
--css
--js
--img
The host is an ECS EC2 instance, a t2.small with 2 GB of memory. The container task allocates 1024 MiB of memory.
Dockerfile
# from the node 12 image
FROM node:12
MAINTAINER coco
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
COPY package*.json ./
RUN npm install
# Bundle app source - copy all files/folders in current directory
COPY . .
# Specify ports
EXPOSE 8080
# Run the app
CMD [ "node", "server.js" ]
What about this Node app could be causing a memory leak? Then again, it may be Express itself that is leaking; I'm using Express 4.17.1.
I am building an SPA in Angular 8. I have a multi-stage Docker image that runs ng build to build the distribution, and then a simple Express server is used to serve the application. (Note: the backend API is on an entirely separate Express server.)
Requirements
I need to set up a login page "outside" of the SPA. The login page must be displayed if the user is not authenticated, so that the SPA is not bootstrapped until authentication succeeds (by checking a bearer token in the Authorization header).
Questions
Do I need a separate Angular installation to load the login page separately from the rest of the app? Or should I just skip Angular for the login page and build a simple Express page with Pug that sends a POST to the API for authentication?
Note: I am seeking general advice on how to proceed and any examples would be very helpful as well.
Dockerfile
### Dev, QA, and Production Docker servers ###
### Stage 1: Build ###
# Base image
FROM node:12 as builder
# Set working directory
RUN mkdir -p /home/angular/app
WORKDIR /home/angular/app
# Add `/home/angular/app/node_modules/.bin` to $PATH
ENV PATH /home/angular/app/node_modules/.bin:$PATH
# Install and cache app dependencies
COPY angular/package.json /home/angular/app/package.json
RUN npm install -g @angular/cli@8 \
 && npm install
# Add app
COPY ./angular /home/angular/app
# Generate build
RUN ng build --output-path=dist
### Stage 2: Server ###
FROM node:12
USER node
# Create working directory
RUN mkdir /home/node/app
## From 'builder' stage copy over the artifacts in dist folder
COPY --from=builder --chown=node /home/angular/app/dist /home/node/app/dist
# Copy Express server code to container
COPY --chown=node ./express /home/node/app
WORKDIR /home/node/app
RUN npm install
# Expose ports
EXPOSE 4201
CMD ["npm", "start"]
Express server for Angular SPA
This server is run when the Dockerfile executes its command CMD ["npm", "start"]
const express = require('express');
const http = require('http');
const app = express();
// Set name of directory where angular distribution files are stored
const dist = 'dist';
// Set port
const port = process.env.PORT || 4201;
// Serve static assets
app.get('*.*', express.static(dist, { maxAge: '1y' }));
// Serve application paths
app.all('*', function (req, res) {
  res.status(200).sendFile('index.html', { root: dist });
});
// Create server to listen for connections
const server = http.createServer(app);
server.listen(port, () => console.log(`Node Express server listening on port ${port}`));
Angular supports multiple applications under the same project. You can create a separate login application using the following command:
ng generate application <you-login-app-name-here>
This way you can keep only the login-related code in <you-login-app-name-here> and the rest in your main app. You can build, test, or run this new app separately using the following commands:
ng build <you-login-app-name-here>
ng test <you-login-app-name-here>
ng serve <you-login-app-name-here>
Angular will generate the build output in the /dist/<app-name> folder, which can be mapped to an Express route to serve the files.
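To illustrate, here is a minimal sketch of how the Express server could serve both builds side by side. The folder names dist/main-app and dist/login-app and the /login prefix are assumptions based on Angular's default dist/<project-name> output, not something generated for you, and the login app would need to be built with a matching --base-href:
// -- server.js (sketch)
const express = require('express');
const path = require('path');
const app = express();
// Login app, e.g. built with: ng build <you-login-app-name-here> --base-href /login/
app.use('/login', express.static(path.join(__dirname, 'dist', 'login-app')));
app.get('/login/*', (req, res) => {
  res.sendFile(path.join(__dirname, 'dist', 'login-app', 'index.html'));
});
// Main SPA for everything else
app.use(express.static(path.join(__dirname, 'dist', 'main-app')));
app.get('*', (req, res) => {
  res.sendFile(path.join(__dirname, 'dist', 'main-app', 'index.html'));
});
app.listen(process.env.PORT || 4201);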
I have a Node.js application created with Express (express-generator) that uses Handlebars as the view engine. I created a couple of routes and they work fine. The application runs on port 3000.
Express routes:
...
app.use('/', index);
app.use('/landing', landing);
app.use('/home', home);
app.use('/api', api);
...
There is a separate admin panel application built with Angular.
Currently the Angular application runs on port 4200 and uses APIs from the Node.js application, which runs on port 3000.
Angular application routes
const routes : Routes = [
{ path: '', redirectTo: '/user', pathMatch: 'full' },
{
path: 'user',
component : UserComponent,
children : [
{ path:'', redirectTo: '/account', pathMatch: 'full' },
{ path: 'account', component: AccountComponent },
]
},
]
NodeJs application folder structure
api/
api.js
bin/
www
modules/
mongoose.js
node_modules/
public/
css/
fonts/
img/
js/
ngapp/ => Angular resources created with ng build
inline.bundle.js
main.bundle.js
polyfills.bundle.js
styles.bundle.js
vendor.bundle.js
routes/
home.js
index.js
landing.js
views/
common/
header.hbs
footer.hbs
layouts/
master.hbs
ngapp/
index.html => Angular index.html file
index.hbs
landing.hbs
home.hbs
app.js
package.json
What I'm trying to do:
Run both the Node.js and Angular applications on the same port, i.e. port 3000.
What I have done:
Ran ng build and placed the index.html file inside /views/ngapp/ of the Node.js folder structure.
Created a 'user' route in Node.js that serves the index.html file of the Angular application. (Maybe this is not a good way.)
app.get('/user', function (req, res, next) {
res.sendFile(path.join(__dirname + '/views/ngapp/index.html'));
});
Somehow it loads, but I encountered an error.
My question is how to integrate an Angular application (perhaps on a separate route, but on the same port) with a Node.js application that already has some routes defined and uses a view engine to render pages.
There are two possible solutions.
1. Use your Node application to serve the static frontend files. Then you can't really use ng serve (this is probably what you'd do when running live).
You should be able to tell Express to serve static content from an Angular build directory like this:
app.use(express.static('../angular/dist'));
Which would work if you had a file structure like this and were running serve.js with Node:
- node server
  - serve.js
- angular
  - dist/*
You can customize as needed by configuring the Angular build folder to be wherever you need it, or use Grunt/Gulp to move files around to the folders you prefer with a build task.
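Putting that together, here is a minimal sketch of serve.js for the layout above; the ../angular/dist path matches that structure, and mounting your existing API routes under a prefix such as /api is an assumption:
// -- serve.js (sketch)
const express = require('express');
const path = require('path');
const app = express();
// Mount your existing API / view routes first, e.g.:
// app.use('/api', apiRouter);
// Serve the Angular build output as static files
const ngDist = path.join(__dirname, '..', 'angular', 'dist');
app.use(express.static(ngDist));
// Any other route falls back to index.html so the Angular router can take over
app.get('*', (req, res) => {
  res.sendFile(path.join(ngDist, 'index.html'));
});
app.listen(3000, () => console.log('Listening on http://localhost:3000'));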
2. Run Node.js on a different port and use Angular's proxy config, so that from the Angular app's point of view the API is also served on port 4200 (this is probably best during development).
This is primarily a concern during development, I reckon, since you most likely won't (and shouldn't) be using ng serve live, so option 2 would be my recommendation.
To configure a proxy, you create a file in your angular application root directory called proxy.config.json with the following content:
{
"/api/*": {
"target": "http://localhost:3000",
"secure": false,
"changeOrigin": true
}
}
Then when you run ng serve, you run it with ng serve --proxy-config proxy.config.json instead.
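With the proxy in place, the Angular code can call the API with relative URLs instead of hard-coding port 3000. A small sketch (the service name, endpoint, and HttpClient setup are assumptions, not taken from your code):
// -- user.service.ts (sketch)
import { Injectable } from '@angular/core';
import { HttpClient } from '@angular/common/http';

@Injectable({ providedIn: 'root' })
export class UserService {
  constructor(private http: HttpClient) {}

  // During ng serve, this request to /api/users is proxied to http://localhost:3000/api/users
  getUsers() {
    return this.http.get('/api/users');
  }
}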