NodeJS Express Angular serving a login page entirely separate from the SPA

I am building a SPA in Angular 8. I have a multi-stage Docker image that runs ng build to build the distribution and then a simple express server is used to serve the application. (Note: The backend API is on an entirely separate express server.)
Requirements
I need to set up a login page "outside" of the SPA. The login page must be displayed if the user is not authenticated, so that the SPA is not bootstrapped until authentication succeeds (verified by checking a bearer token in the Authorization header).
Questions
Do I need a separate Angular installation to load the login page separately from the rest of the app? Or should I just skip Angular for the login page and build a simple Express page with Pug that sends a POST to the API for authentication?
Note: I am seeking general advice on how to proceed and any examples would be very helpful as well.
Dockerfile
### Dev, QA, and Production Docker servers ###
### Stage 1: Build ###
# Base image
FROM node:12 as builder
# Set working directory
RUN mkdir -p /home/angular/app
WORKDIR /home/angular/app
# Add `/home/angular/app/node_modules/.bin` to $PATH
ENV PATH /home/angular/app/node_modules/.bin:$PATH
# Install and cache app dependencies
COPY angular/package.json /home/angular/app/package.json
RUN npm install -g @angular/cli@8 \
&& npm install
# Add app
COPY ./angular /home/angular/app
# Generate build
RUN ng build --output-path=dist
### Stage 2: Server ###
FROM node:12
USER node
# Create working directory
RUN mkdir /home/node/app
## From 'builder' stage copy over the artifacts in dist folder
COPY --from=builder --chown=node /home/angular/app/dist /home/node/app/dist
# Copy Express server code to container
COPY --chown=node ./express /home/node/app
WORKDIR /home/node/app
RUN npm install
# Expose ports
EXPOSE 4201
CMD ["npm", "start"]
Express server for Angular SPA
This server is run when the Dockerfile executes its command CMD ["npm", "start"]
const express = require('express');
const http = require('http');
const app = express();
// Set name of directory where angular distribution files are stored
const dist = 'dist';
// Set port
const port = process.env.PORT || 4201;
// Serve static assets
app.get('*.*', express.static(dist, { maxAge: '1y' }));
// Serve application paths
app.all('*', function (req, res) {
res.status(200).sendFile(`/`, { root: dist });
});
// Create server to listen for connections
const server = http.createServer(app);
server.listen(port, () => console.log("Node Express server for " + app.name + " listening on port " + port));

Angular supports multiple applications under the same project. You can create a separate login application using the following command:
ng generate application <your-login-app-name-here>
This way you can keep only the login-related code in the new application and everything else in your main app. You can build, test, or serve this new app separately using the following commands:
ng build <your-login-app-name-here>
ng test <your-login-app-name-here>
ng serve <your-login-app-name-here>
Angular will generate the build output in the /dist/<your-login-app-name-here> folder, which can be mapped to an Express route to serve the files, as sketched below.
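For illustration, here is a minimal sketch (not from the original answer) of how the question's Express server could map both builds and only serve the main SPA when the Authorization header carries a bearer token; the dist paths and the authentication check are assumptions:
const express = require('express');
const app = express();
// Assumed output folders for the two Angular builds (hypothetical project names)
const mainDist = 'dist/main-app';
const loginDist = 'dist/login-app';
// Placeholder check: a real implementation would validate the token (e.g. a JWT) against the API
function isAuthenticated(req) {
  const header = req.headers.authorization || '';
  return header.startsWith('Bearer ');
}
// Serve static assets, trying the main build first and falling back to the login build
app.get('*.*', express.static(mainDist, { maxAge: '1y' }));
app.get('*.*', express.static(loginDist, { maxAge: '1y' }));
// Serve the SPA only when authenticated; otherwise serve the login app's index
app.all('*', function (req, res) {
  const root = isAuthenticated(req) ? mainDist : loginDist;
  res.status(200).sendFile(`/`, { root });
});
app.listen(process.env.PORT || 4201);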

Related

API URL changes after npm build React

My React App communicates with a backend service using an API endpoint as follows:
const API_URL = process.env.REACT_APP_API_URL;
const response = await fetch(`${API_URL}/data`);
The environment variable REACT_APP_API_URL is saved in a .env file which looks like this:
REACT_APP_API_URL="http://1.2.3.4:8000"
When I serve the app using npm start everything works fine and the developer console shows that the call was made to the right URL:
Request URL: http://1.2.3.4:8000/data
I received the data as expected as well. But when I run npm run build and then serve the app with serve -s build/ -l 3000 the app fails to fetch because the request goes to the following URL:
Request URL: http://localhost:3000/1.2.3.4:8000/
What am I missing for this to happen?
So the problem was not with my React app or the environment variables, but with the way I was serving the app.
The listen argument -l in the serve command was the culprit. It should be -p to specify the port to host on.
So the right command is
serve -s build/ -p 3000
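As a side note, create-react-app inlines REACT_APP_* variables into the bundle at build time, so a guard like this sketch (not part of the original answer; the module name is hypothetical) makes a missing or malformed value fail loudly instead of producing a relative request URL:
// src/api.js (hypothetical) -- REACT_APP_API_URL is substituted once, when `npm run build` runs
const API_URL = process.env.REACT_APP_API_URL;
if (!API_URL || !/^https?:\/\//.test(API_URL)) {
  // Without an absolute URL the request would resolve against the current origin
  // (e.g. http://localhost:3000/...), similar to the failure described above.
  throw new Error(`REACT_APP_API_URL is missing or malformed: "${API_URL}"`);
}
export const fetchData = () => fetch(`${API_URL}/data`).then((res) => res.json());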

ENV variables within Cloud Run server are not accessible

So,
I am using NUXT
I am deploying to google cloud run
I am using dotenv package with a .env file on development and it works fine.
I use process.env.VARIABLE_NAME within my dev server in Nuxt and it works great. I make sure that the .env file is in .gitignore so that it doesn't get uploaded.
However, I then deploy my application using Google Cloud Run... I make sure I go to the Environments tab and add exactly the same variables that are in the .env file.
However, the variables are coming back as "UNDEFINED".
I have tried all sorts of ways of fixing this, but the only way I can fix it is to upload my .env with the project, which I do not wish to do as Nuxt exposes this file in the client-side JS.
Anyone come across this issue and know how to sort it out?
DOCKERFILE:
# base node image
FROM node:10
WORKDIR /user/src/app
ENV PORT 8080
ENV HOST 0.0.0.0
COPY package*.json ./
RUN npm install
# Copy local nuxt code to the container
COPY . .
# Build production app
RUN npm run build
# Start the service
CMD npm start
Kind Regards,
Josh
Finally I found a solution.
I was using Nuxt v2.11.x.
From version 2.13 onwards, Nuxt comes with runtime configuration, and this is what you need.
in your nuxt.config.js:
export default {
  publicRuntimeConfig: {
    BASE_URL: 'some'
  },
  privateRuntimeConfig: {
    TOKEN: 'some'
  }
}
Then you can access them like this:
this.$config.BASE_URL || context.$config.TOKEN
More details are in the Nuxt runtime config documentation.
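The relevant part for Cloud Run is that these values are resolved on the server when it starts, so they can come from environment variables instead of being baked in at build time. A minimal sketch (the variable names are assumptions):
// nuxt.config.js -- runtime config is read when the server starts, not during `npm run build`
export default {
  publicRuntimeConfig: {
    // available on both server and client; hypothetical variable name
    BASE_URL: process.env.BASE_URL || 'http://localhost:8080'
  },
  privateRuntimeConfig: {
    // server-only, never shipped in the client bundle; hypothetical variable name
    TOKEN: process.env.TOKEN
  }
}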
Setting the environment variables does not have to be done in the Dockerfile. You can do it from the command line at deployment time.
For example here is the Dockerfile that I used.
FROM node:10
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
CMD ["npm","start"]
this is the app.js file
const express = require('express')
const app = express()
const port = 8080
app.get('/', (req, res) => {
  const envtest = process.env.ENV_TEST;
  res.json({ message: 'Hello world', envtest });
});
app.listen(port, () => console.log(`Example app listening on port ${port}`))
To deploy use a script like this:
gcloud run deploy [SERVICE] --image gcr.io/[PROJECT-ID]/[IMAGE] --update-env-vars ENV_TEST=TESTVARIABLE
And the output will be like the following:
{"message":"Hello world","envtest":"TESTVARIABLE"}
You can check more detail on the official documentation:
https://cloud.google.com/run/docs/configuring/environment-variables#command-line

Final Deployment of React Project

I am new to NodeJS and come from the Java world, but in the last 3 months I have done quite a lot of development.
I use ExpressJS and ReactJS in my first project. During development we use 2 HTTP servers: one for the ExpressJS back-end application and another for the ReactJS front-end application.
Is this the way we have to deploy to production, or can we combine it into 1 application and deploy it on 1 HTTP server listening on port 80?
regards
Deploy a production React app to Heroku
1. Create a React App
npx create-react-app heroku-deploy-test
cd heroku-deploy-test
2. Create an Express JS server to serve your production build
//server.js
const express = require('express');
const path = require('path');
const app = express();
app.use(express.static(__dirname));
app.use(express.static(path.join(__dirname, 'build')));
app.get('/*', function (req, res) {
res.sendFile(path.join(__dirname, 'build', 'index.html'));
});
const port = process.env.PORT || 8080;
app.listen(port);
In your package.json file, change the start script to the following:
"start": "node server.js"
3. Deploy to Heroku
If you don’t already have a Heroku account, create one here: https://signup.heroku.com/
In your command line, run the following: heroku login
You will need to type in your heroku credentials to the terminal. Once you have successfully entered your heroku credentials, run the following in your terminal to create a new deployed app:
heroku create heroku-deploy-test
(Replace heroku-deploy-test with your own app name.)
Then push your app build to heroku with the following git in your terminal:
git init
git add .
git commit -m "initial commit"
heroku git:remote -a heroku-deploy-test
git push heroku master
These commands initialize git, commit your code, connect your repo to the remote repository hosted by Heroku, and push it; Heroku installs your dependencies during the build.
Note: if you already initialized your git before running heroku create [app-name], then you don’t need to run heroku git:remote -a [app-name].
Run heroku open and your deployed app will open in your default browser. If you want a production build, I think you already know what to do -> create a production build of your React app, and create a proper .gitignore file so only the relevant files are deployed.
IMPORTANT: If you already had a .gitignore file, make sure that this line isn't in it /build :)
May I also suggest reading this blog! Have a good one!

CRA, Node.js, nginx in Docker?

I'm starting off a new project. I currently have a structure like this, from the root folder:
/app (CRA frontend app)
/server (Node.js Express app)
Dockerfile
docker-compose.yml
My requirements are the following:
Development
Fire up Docker that creates necessary container(s)
Hot reloading for frontend React app (using CRA)
Node.js server that can serve my React app with SSR (automatically updated when editing)
Accessible via http://localhost:3000
Production
Potentially fire up Docker that creates necessary container(s)
Creates production ready version of React app
Creates production ready version of Express app
Accessible via port 80
Where I am right now is somewhere in between. I don't know how to set up Docker the right way to make this whole thing work, and I don't really know how to structure my React app vs the Express app while developing. The production part seems easier once I know how to structure the development part... plus Nginx as a proxy for the Express app?
I'm currently building a Docker image that fires up a container where hot reloading is working etc., but I don't know how to set up the Express part so they work nicely together...?
Any help is much appreciated.
Thanks
Very broad question. Perhaps better to break it down into more direct questions. Anyway, I don't think running your dev setup in Docker is ideal. Instead, build your app normally with CRA, then deploy it in Docker.
In my own projects, I have a docker container running a node server which serves the react app using SSR.
Here is the Docker part. Note that your package.json should have a script named start:prod for this to work; that script starts your app in production (a minimal sketch of such a script is shown after the Dockerfile).
# --- Dockerfile
# Pulled from docker hub and has everything
# needed to run a node project
FROM node:alpine
ENV PORT 3000
# Navigate (cd) to the app folder in the docker container
WORKDIR /usr/src/app
# Copy all package.json / package-lock.json etc. to the root folder
# Executed on build: docker build .
COPY ./package*.json ./
RUN npm i
# copy entire project into docker container
COPY . .
# build front-end with react build scripts and store them in the build folder
RUN npm run build
EXPOSE 3000
CMD ["npm", "run", "start:prod"]
Here's the Express server that starts the app.
// -- server.js
import express from "express";
import router from "./controller/index";
const app = express();
const port = process.env.PORT || 3000; // use the PORT set in the Dockerfile
// Tell the app to use the routes above
app.use(router);
// start the app
app.listen(port, () => {
console.log(`express running on port ${port}`);
});
Here is the controller/index.js file you'll need to set up the routes.
// -- controller/index.js
import express from "express";
import path from "path";
import serverRenderer from '../middleware/renderer';
const router = express.Router();
// root (/) should always serve our server rendered page
router.use('^/$', serverRenderer());
// other static resources should just be served as they are
router.use(express.static(
path.resolve(__dirname, '..', '..', 'build'),
{ maxAge: '30d' },
));
export default router;
And finally the renderer which renders the app on the server.
// -- renderer.js
import React from "react";
import { renderToString } from "react-dom/server";
import App from "../../src/App";
const path = require("path");
const fs = require("fs");
export default () => (req, res) => {
  // point to html file created by CRA's build tool
  const filePath = path.resolve(__dirname, "..", "..", "build", "index.html");
  fs.readFile(filePath, "utf8", (error, htmlData) => {
    if (error) {
      console.error("error", error);
      return res.status(404).end(); // `res` is the response object in scope here
    }
    // render the app as string
    const html = renderToString(<App />);
    // inject rendered app into final html and send
    return res.send(
      htmlData.replace('<div id="root"></div>', `<div id="root">${html}</div>`)
    );
  });
};
You will need bootstrap.js to inject support for certain packages.
// -- bootstrap.js
require('ignore-styles');
require('url-loader');
require('file-loader');
require('babel-register')({
  ignore: [/(node_modules)/],
  presets: ['es2015', 'react-app'],
  plugins: [
    'syntax-dynamic-import',
    'dynamic-import-node'
  ]
});
require("./index");
You can find the details of it all here:
https://blog.mytoori.com/react-served-by-express-running-in-docker-container

Starting Angular application as a default of a NodeJS project

I have a NodeJS application and an Angular 6 frontend.
The project looks like:
-> Node Project
---> src
---> Client_App (Angular)
To run the application, I need to run these commands and start the server and Angular separately, like:
-> node start
-> cd src/Client_App
-> ng serve
I need to start the two applications with one single command, or to have the Angular dist output served at the start of my NodeJS app, which is using Jade right now.
I am still new to NodeJS and still don't know how to configure it.
Anybody can help? Thanks
Edited:
I have now tried to add the dist folder to my views folder and serve it from app.js:
app.get('/', function (req, res, next) {
  res.sendFile(path.join(__dirname + '/app_server/views/ngapp/index.html'));
});
But I am receiving errors saying that my .js and .css files are not found.
When you build your application with the CLI (ng build --prod), you get a dist folder: this folder contains all of your application, bundled into different files (feel free to look at them).
To be able to serve that, you will need two things:
this dist folder
an http server
You have the first one, but not the second one.
All you need is a very simple server. For instance, http-server on NPM can do that. By installing it as a dev dependency, you could create a command in your package.json file
"deploy-locally": "http-server ./dist"
And now run it with
npm run deploy-locally
Or even better,
"start": "http-server ./dist"
And run with
npm start
If you don't want to use an NPM package (or are forced to use plain NodeJS), simply create a basic HTTP server in a JS file and run it from your command line (sorry, can't help on that, not into NodeJS right now); a rough sketch follows.
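For completeness, here is a minimal sketch of such a server using only Node's built-in modules (the ./dist path, port, and file name are assumptions, and it is not hardened for production use):
// serve-dist.js (hypothetical) -- serves ./dist with zero dependencies
const http = require('http');
const fs = require('fs');
const path = require('path');
const dist = path.resolve(__dirname, 'dist');
const types = { '.html': 'text/html', '.js': 'text/javascript', '.css': 'text/css', '.ico': 'image/x-icon' };
http.createServer((req, res) => {
  const requested = path.join(dist, req.url.split('?')[0]);
  // fall back to index.html so Angular's client-side routes still work on a full page load
  const file = fs.existsSync(requested) && fs.statSync(requested).isFile()
    ? requested
    : path.join(dist, 'index.html');
  fs.readFile(file, (err, data) => {
    if (err) {
      res.writeHead(500);
      return res.end('Server error');
    }
    res.writeHead(200, { 'Content-Type': types[path.extname(file)] || 'application/octet-stream' });
    res.end(data);
  });
}).listen(8080, () => console.log('Serving ./dist on http://localhost:8080'));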
You can also create a new route and pass express.static into app.use, as below:
var express = require('express');
var path = require('path');
var app = express();
// serve the Angular build output
app.use(express.static(path.join(__dirname, 'dist')));
// fall back to index.html so Angular's routes work on a full page load
app.get('*', function (req, res) {
  res.sendFile(path.join(__dirname, 'dist', 'index.html'));
});
app.listen(process.env.PORT || 3000);
Make sure you have a production build of the Angular application by running this command:
ng build --prod --build-optimizer
You would need to install Express in this case. Express has great ways to handle all this.
