Flutter Web App Hosted in NodeJS shows only white screen - node.js

I have built a flutter web application (flutter stable#3.3.9) where I have set the url strategy to PathUrlStrategy().
Locally, this application builds and runs fine. I am hosting it in a NodeJS application as follows:
import express, {Express, Request, Response} from "express";
import dotenv from "dotenv";
import cookieParser from "cookie-parser";
import path from "path";

dotenv.config();

const app: Express = express();
const port = process.env.PORT;

app.use(express.json());
app.use(express.urlencoded({extended: false}));
app.use(cookieParser());
app.use(express.static(path.join(__dirname, "flutter")));

const getPageSessionHandler = (req: Request, res: Response) => {
  //implementation removed
};
const postPaymentHandler = (req: Request, res: Response) => {
  //implementation removed
};

app.get("/api/page/:id", async (req, res) => {
  getPageSessionHandler(req, res);
});
app.post("/api/payment", async (req, res) => {
  console.log("handling payment request");
  postPaymentHandler(req, res);
});

// SPA fallback: every other route serves the Flutter index.html
app.get("*", (_, res) => {
  res.sendFile(path.resolve(__dirname, "flutter/index.html"));
});

app.listen(port, () => {
  console.log(
    `⚡️[server (nodejs)]: Server is running at http://localhost:${port}`,
  );
});
This runs fine locally:
flutter build web --release --web-renderer=html
Then I move the build/web output into the proper folder in my nodejs server.
I can even locally containerize this application and run it from my docker desktop (windows 11) using the following dockerfile:
FROM debian:latest AS build-env
# Install flutter dependencies and nodejs dependencies
RUN apt-get update
RUN apt-get install -y curl git wget unzip gettext-base libgconf-2-4 gdb libstdc++6 libglu1-mesa fonts-droid-fallback lib32stdc++6 python3
RUN apt-get clean
RUN curl -fsSL https://deb.nodesource.com/setup_12.x | bash -
RUN apt-get -y install nodejs
WORKDIR /app
COPY . .
RUN npm install
RUN npm run build
RUN git clone https://github.com/flutter/flutter.git /usr/local/flutter
ENV PATH="/usr/local/flutter/bin:/usr/local/flutter/bin/cache/dart-sdk/bin:${PATH}"
WORKDIR /app/flutter_src/payment
# Build the app
RUN touch .env
RUN flutter clean && flutter pub get
RUN flutter build web --release --web-renderer=html
# Build the final image
WORKDIR /app
RUN cp -R flutter_src/payment/build/web dist/flutter
CMD ["/bin/sh", "-c", "exec node dist/index.ts"]
Again, this works fine in both my Windows node server environment and in my Docker Desktop container. When I deploy it via my CI/CD pipeline to AWS ECR, I am unable to load the application. I can still hit the associated API endpoints in the node's index.ts above, and my Kubernetes pod is healthy... but I am not able to load routes with a second slash, i.e.:
https:///welcome <--- loads fine
https:///user/fun_username <--- does not load
After a ton of debugging, I'm finding another oddity in how this behaves per environment. In the Network log for a request to my application's deep-linked route in Chrome (and also in Edge), when requesting /page/{page-id-here} the browser requests main.dart.js at the subroute.
What's even more perplexing is that if I request the same deep route in Firefox, not only does my application load and work as expected, the browser requests main.dart.js from the root of my node server, as it should, since the flutter directory is served statically there.
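For what it's worth, the Chrome behavior matches plain relative URL resolution: if the served index.html lacks an absolute <base href="/">, a relative script reference like main.dart.js resolves against the deep-linked document path instead of the server root. A small stdlib-only sketch (example.com is a placeholder host):

```javascript
// How relative vs. root-absolute script URLs resolve from a deep link.
// "example.com" stands in for the real host.
const deepLink = "https://example.com/user/fun_username";

// Relative reference: resolved against the current document path
const relative = new URL("main.dart.js", deepLink).href;
console.log(relative); // https://example.com/user/main.dart.js

// Root-absolute reference: resolved against the server root
const absolute = new URL("/main.dart.js", deepLink).href;
console.log(absolute); // https://example.com/main.dart.js
```

This is exactly the split seen in the Network logs: a request for the asset under the subroute versus a request at the root.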
I have tried all sorts of routing tricks in the node server, but that just feels wrong, especially since the behavior is specific to this environment. If I revert the path strategy back to hash-based client-only routing (/#/path/here), everything works fine, but that does not meet the requirements of this project.
Thank you all for your help. I'm weeks into struggling with this at this point.

Related

How to run electron on a localhost server in Build as well as in Dev

I'm developing with Next.js + Electron + Typescript.
I'm using the npx create-next-app --example with-electron-typescript command to generate code.
When I run npm run dev (which runs npm run build-electron && electron .), a local server comes up on localhost:8000; but in the built version, no server is started internally and the app runs by accessing the files directly.
However, some APIs do not work correctly when location.origin contains no domain, so it works in Dev but not in Build.
So, if it is possible, I would like to run the server on localhost in the build version as well as in the Dev version.
Is there anything I can do to make it work?
It's not shown in any of the examples, even though someone requested one:
https://github.com/vercel/next.js/issues/28225
It is possible using a custom server:
https://nextjs.org/docs/advanced-features/custom-server
You can follow these steps to create one:
Clone the Electron Next TypeScript example repo:
https://github.com/vercel/next.js/tree/canary/examples/with-electron-typescript
Update ./electron-src/index.ts with the following code:
import { app, BrowserWindow } from 'electron';
import isDev from 'electron-is-dev';
import { createServer } from 'http';
import next from 'next';
import { parse } from 'url';

app.on('ready', async () => {
  // Use server-side rendering for both dev and production builds
  const nextApp = next({
    dev: isDev,
    dir: app.getAppPath() + '/renderer'
  });
  const requestHandler = nextApp.getRequestHandler();
  // Build the renderer code and watch the files
  await nextApp.prepare();
  // Create a new native HTTP server (which supports hot code reloading)
  createServer((req: any, res: any) => {
    const parsedUrl = parse(req.url, true);
    requestHandler(req, res, parsedUrl);
  }).listen(3000, () => {
    console.log('> Ready on http://localhost:3000');
  });
  const mainWindow = new BrowserWindow();
  mainWindow.loadURL('http://localhost:3000/');
});
Update ./package.json Electron build configuration to include the renderer src files:
"build": {
  "asar": true,
  "files": [
    "main",
    "renderer"
  ]
}
In ./package.json move next from devDependencies to dependencies. This means it will be available to run in production builds
Then use helpful scripts to unpack the binary and see the files/folder inside:
npx asar extract ./dist/mac/ElectronTypescriptNext.app/Contents/Resources/app.asar ./dist/unpack
Run the unpacked version to debug:
./node_modules/.bin/electron ./dist/unpack
I have created an Express Server version and NextJS versions to prove it is possible:
https://github.com/kmturley/electron-server/tree/feature/express
https://github.com/kmturley/electron-server/tree/feature/next

ENV variables within cloud run server are not accessible

So,
I am using NUXT
I am deploying to google cloud run
I am using the dotenv package with a .env file in development, and it works fine.
I use process.env.VARIABLE_NAME in my dev server in Nuxt and it works great. I make sure .env is in .gitignore so that it doesn't get uploaded.
However, when I deploy my application with Google Cloud Run, I go to the Environment variables tab and add exactly the same variables that are in the .env file.
However, the variables come back as "UNDEFINED".
I have tried all sorts of ways to fix this, but the only one that works is uploading my .env with the project, which I do not want to do since NUXT exposes this file in the client-side JS.
Anyone come across this issue and know how to sort it out?
DOCKERFILE:
# base node image
FROM node:10
WORKDIR /user/src/app
ENV PORT 8080
ENV HOST 0.0.0.0
COPY package*.json ./
RUN npm install
# Copy local nuxt code to the container
COPY . .
# Build production app
RUN npm run build
# Start the service
CMD npm start
Kind Regards,
Josh
Finally I found a solution.
I was using Nuxt v2.11.x.
From version 2.13 onward, Nuxt ships with Runtime Config, and this is what you need.
in your nuxt.config.js:
export default {
  publicRuntimeConfig: {
    // available on both client and server, read at runtime
    BASE_URL: process.env.BASE_URL
  },
  privateRuntimeConfig: {
    // server-only, never shipped in the client bundle
    TOKEN: process.env.TOKEN
  }
}
then, you can access the values as this.$config.BASE_URL (in components) or context.$config.TOKEN (on the server side).
Inserting values into the environment variables does not have to be done in the Dockerfile; you can do it from the command line at deployment time.
For example here is the Dockerfile that I used.
FROM node:10
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
CMD ["npm","start"]
this is the app.js file
const express = require('express')
const app = express()
const port = 8080
app.get('/', (req, res) => {
  const envtest = process.env.ENV_TEST;
  res.json({ message: 'Hello world', envtest });
});
app.listen(port, () => console.log(`Example app listening on port ${port}`))
To deploy use a script like this:
gcloud run deploy [SERVICE] --image gcr.io/[PROJECT-ID]/[IMAGE] --update-env-vars ENV_TEST=TESTVARIABLE
And the output will be like the following:
{"message":"Hello world","envtest":"TESTVARIABLE"}
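The handler's behavior can be sketched without a server: process.env is read at request time, so a value supplied at deploy time shows up without rebuilding the image. Here the variable is set in-process to stand in for the --update-env-vars flag:

```javascript
// process.env is consulted when the handler runs, not when the image is
// built. Setting it here simulates `--update-env-vars ENV_TEST=TESTVARIABLE`.
process.env.ENV_TEST = 'TESTVARIABLE';

const body = JSON.stringify({ message: 'Hello world', envtest: process.env.ENV_TEST });
console.log(body); // {"message":"Hello world","envtest":"TESTVARIABLE"}
```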
You can check more detail on the official documentation:
https://cloud.google.com/run/docs/configuring/environment-variables#command-line

NodeJS Express Angular serving a login page entirely separate from the SPA

I am building a SPA in Angular 8. I have a multi-stage Docker image that runs ng build to build the distribution and then a simple express server is used to serve the application. (Note: The backend API is on an entirely separate express server.)
Requirements
I need to setup a login page "outside" of the SPA. The login page must be displayed if the user is not authenticated, that way the SPA is not bootstrapped until the authentication is successful (by checking a bearer token in the authorization header).
Questions
Do I need a separate Angular installation to load the login page separate from the rest of the app? Or, should I just skip Angular for the login page and build a simple express page with Pug that sends a POST to the API for authentication?
Note: I am seeking general advice on how to proceed and any examples would be very helpful as well.
Dockerfile
### Dev, QA, and Production Docker servers ###
### Stage 1: Build ###
# Base image
FROM node:12 as builder
# Set working directory
RUN mkdir -p /home/angular/app
WORKDIR /home/angular/app
# Add `/home/angular/app/node_modules/.bin` to $PATH
ENV PATH /home/angular/app/node_modules/.bin:$PATH
# Install and cache app dependencies
COPY angular/package.json /home/angular/app/package.json
RUN npm install -g @angular/cli@8 \
&& npm install
# Add app
COPY ./angular /home/angular/app
# Generate build
RUN ng build --output-path=dist
### Stage 2: Server ###
FROM node:12
USER node
# Create working directory
RUN mkdir /home/node/app
## From 'builder' stage copy over the artifacts in dist folder
COPY --from=builder --chown=node /home/angular/app/dist /home/node/app/dist
# Copy Express server code to container
COPY --chown=node ./express /home/node/app
WORKDIR /home/node/app
RUN npm install
# Expose ports
EXPOSE 4201
CMD ["npm", "start"]
Express server for Angular SPA
This server is run when the Dockerfile executes its command CMD ["npm", "start"]
const express = require('express');
const http = require('http');

const app = express();

// Directory where the Angular distribution files are stored
const dist = 'dist';

// Set port
const port = process.env.PORT || 4201;

// Serve static assets
app.get('*.*', express.static(dist, { maxAge: '1y' }));

// Serve application paths: every other route falls back to the SPA's index.html
app.all('*', function (req, res) {
  res.status(200).sendFile('index.html', { root: dist });
});

// Create server to listen for connections
const server = http.createServer(app);
server.listen(port, () => console.log(`Node Express server listening on port ${port}`));
Angular supports multiple applications under same project. You can create separate login application using following command:
ng generate application <you-login-app-name-here>
This way you can keep only the login-related code in the new login application and the rest in your main app. You can build, test, or run the new app separately using the following commands:
ng build <you-login-app-name-here>
ng test <you-login-app-name-here>
ng serve <you-login-app-name-here>
Angular will generate the build output in the dist/<you-login-app-name-here> folder, which can be mapped to an express route to serve the files.

CRA, Node.js, nginx in Docker?

I'm starting off a new project. I currently have a structure like this, from the root folder:
/app (CRA frontend app)
/server (Node.js Express app)
Dockerfile
docker-compose.yml
My requirements is the following:
Development
Fire up Docker that creates necessary container(s)
Hot reloading for frontend React app (using CRA)
Node.js server that can serve my React app with SSR (automatically updated when editing)
Accessible via http://localhost:3000
Production
Potentially fire up Docker that creates necessary container(s)
Creates production ready version of React app
Creates production ready version of Express app
Accessible via port 80
Where I am right now is somewhere in between. I don't know how to set up Docker the right way to make this whole thing work, and I don't really know how to structure my React app vs the Express app while developing. The Production part seems easier once I know how to structure the Development part... plus Nginx as a proxy for the Express app?
I'm currently building a Docker container that fires up with hot reloading working etc., but I don't know how to set up the Express part so they work nicely together.
Any help is much appreciated.
Thanks
Very broad question; perhaps better to break it down into more direct questions. Anyway, I don't think running your dev setup in Docker is ideal. Instead, build your app normally with CRA, then deploy it in Docker.
In my own projects, I have a docker container running a node server which serves the react app using SSR.
Here is the docker part. Note that your package.json should have a script named start:prod for this to work. That script then starts your app in production.
// --- Dockerfile
# Pulled from docker hub and has everything
# needed to run a node project
FROM node:alpine
ENV PORT 3000
# Navigate (cd) to the app folder in the docker container
WORKDIR /usr/src/app
# Copy all package.json / package-lock.json etc. to the root folder
# Executed on build: docker build .
COPY ./package*.json ./
RUN npm i
# copy entire project into docker container
COPY . .
# build front-end with react build scripts and store them in the build folder
RUN npm run build
EXPOSE 3000
CMD ["npm", "run", "start:prod"]
Here's the express server that starts the app.
// -- server.js
import express from "express";
import router from "./controller/index";

const app = express();
const port = 4000;

// Tell the app to use the routes above
app.use(router);

// start the app
app.listen(port, () => {
  console.log(`express running on port ${port}`);
});
Here is the controller/index.js file you'll need:
// -- controller/index.js
import express from "express";
import path from "path";
import serverRenderer from '../middleware/renderer';

const router = express.Router();

// root (/) should always serve our server rendered page
router.use('^/$', serverRenderer());

// other static resources should just be served as they are
router.use(express.static(
  path.resolve(__dirname, '..', '..', 'build'),
  { maxAge: '30d' },
));

export default router;
And finally the renderer which renders the app on the server.
// -- renderer.js
import React from "react";
import { renderToString } from "react-dom/server";
import App from "../../src/App";
const path = require("path");
const fs = require("fs");
export default () => (req, res) => {
  // point to html file created by CRA's build tool
  const filePath = path.resolve(__dirname, "..", "..", "build", "index.html");
  fs.readFile(filePath, "utf8", (error, htmlData) => {
    if (error) {
      console.error("error", error);
      return res.status(404).end();
    }
    // render the app as string
    const html = renderToString(<App />);
    // inject rendered app into final html and send
    return res.send(
      htmlData.replace('<div id="root"></div>', `<div id="root">${html}</div>`)
    );
  });
};
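The injection step at the end of the renderer can be seen in isolation; the markup strings below are placeholders, with html standing in for the renderToString(<App />) output:

```javascript
// Replace CRA's empty root div with the server-rendered markup.
const htmlData = '<html><body><div id="root"></div></body></html>';
const html = '<p>hello from SSR</p>'; // stands in for renderToString(<App />)

const out = htmlData.replace(
  '<div id="root"></div>',
  `<div id="root">${html}</div>`
);
console.log(out); // <html><body><div id="root"><p>hello from SSR</p></div></body></html>
```

Because the client-side bundle hydrates the same div, the replaced markup must match what React renders on first paint.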
You will need bootstrap.js to inject support for certain packages.
// -- bootstrap.js
require('ignore-styles');
require('url-loader');
require('file-loader');
require('babel-register')({
  ignore: [/(node_modules)/],
  presets: ['es2015', 'react-app'],
  plugins: [
    'syntax-dynamic-import',
    'dynamic-import-node'
  ]
});
require("./index");
You can find the details of it all here:
https://blog.mytoori.com/react-served-by-express-running-in-docker-container

edited files at host, but docker container is not refreshed

I am new to docker; I have successfully installed it on Ubuntu.
I am testing a node app in the folder /home/samir/Documents/docker-centos:
$ ls
Dockerfile Dockerfile~ index.js index.js~ package.json package.json~
I don't know what those duplicates ending with ~ are, as I didn't add them.
index.js:
var express = require('express');

// Constants
var PORT = 8080;

// App
var app = express();
app.get('/', function (req, res) {
  // even if I change the output here, I still get Hello world only
  res.send('Hello world, This was added\n');
});

app.listen(PORT);
console.log('Running on http://localhost:' + PORT);
Although I have started the container using a data volume that points to the same app dir like this:
docker run -p 49160:8080 -v /home/samir/Documents/docker-centos -d samir/centos-node-hello
but when I view the output like:
curl -i localhost:49160
I get Hello world even though I changed the file.
Am I missing something?
How should I run the container so I can edit files on the host, and why didn't this work?
EDIT
Dockerfile
FROM centos:centos6
# Enable Extra Packages for Enterprise Linux (EPEL) for CentOS
RUN yum install -y epel-release
# Install Node.js and npm
RUN yum install -y nodejs npm
# Install app dependencies
COPY package.json /src/package.json
RUN cd /src; npm install
# Bundle app source
COPY . /src
EXPOSE 8080
CMD ["node", "/src/index.js"]
You need to build the Docker image again using the docker build command, since COPY bakes the source into the image at build time.
If you're planning on using this setup not only for one-off testing of your app, but for development as well, you'd be better off mounting your application code as a bind mount into the container's /src directory (e.g. -v /home/samir/Documents/docker-centos:/src), so edits on the host are visible in the container without a rebuild.