How to run electron on a localhost server in Build as well as in Dev - node.js

I'm developing with Next.js + Electron + Typescript.
I'm using the npx create-next-app --example with-electron-typescript command to generate code.
When I run npm run dev (which runs npm run build-electron && electron .), a local server is up and running on localhost:8000. When I build, however, no server is started internally; the app runs by accessing the files directly.
The problem is that some APIs do not work correctly when location.origin contains no domain, so they work in dev but not in the build.
So, if possible, I would like to run a server on localhost in the build version just as in the dev version.
Is there anything I can do to make this work?

This isn't shown in any of the examples, even though someone requested one:
https://github.com/vercel/next.js/issues/28225
It is possible using a custom server:
https://nextjs.org/docs/advanced-features/custom-server
You can follow these steps to create one:
Clone the Electron Next TypeScript example repo:
https://github.com/vercel/next.js/tree/canary/examples/with-electron-typescript
Update ./electron-src/index.ts with the following code (the electron imports and window creation are added here so the snippet is complete):
import { app, BrowserWindow } from 'electron';
import isDev from 'electron-is-dev';
import { createServer } from 'http';
import next from 'next';
import { parse } from 'url';

app.on('ready', async () => {
  // Use server-side rendering for both dev and production builds
  const nextApp = next({
    dev: isDev,
    dir: app.getAppPath() + '/renderer'
  });
  const requestHandler = nextApp.getRequestHandler();

  // Build the renderer code and watch the files
  await nextApp.prepare();

  // Create a new native HTTP server (which supports hot code reloading)
  createServer((req: any, res: any) => {
    const parsedUrl = parse(req.url, true);
    requestHandler(req, res, parsedUrl);
  }).listen(3000, () => {
    console.log('> Ready on http://localhost:3000');
  });

  const mainWindow = new BrowserWindow({ width: 800, height: 600 });
  mainWindow.loadURL('http://localhost:3000/');
});
Update ./package.json Electron build configuration to include the renderer src files:
"build": {
"asar": true,
"files": [
"main",
"renderer"
]
}
In ./package.json, move next from devDependencies to dependencies. This makes it available at runtime in production builds.
Then use asar to unpack the binary and inspect the files/folders inside:
npx asar extract ./dist/mac/ElectronTypescriptNext.app/Contents/Resources/app.asar ./dist/unpack
Run the unpacked version to debug:
./node_modules/.bin/electron ./dist/unpack
I have created an Express server version and a Next.js version to prove it is possible:
https://github.com/kmturley/electron-server/tree/feature/express
https://github.com/kmturley/electron-server/tree/feature/next


Flutter Web App Hosted in NodeJS shows only white screen

I have built a flutter web application (flutter stable#3.3.9) where I have set the url strategy to PathUrlStrategy().
Of course, locally this application builds fine and runs fine. I am hosting this application in a NodeJS application as follows:
import express, {Express, Request, Response} from "express";
import dotenv from "dotenv";
import cookieParser from "cookie-parser";
import https from "https";
import {ClientRequest, IncomingMessage} from "http";
import path from "path";
const app: Express = express();
const port = process.env.PORT;
app.use(express.json());
app.use(express.urlencoded({extended: false}));
app.use(cookieParser());
app.use(express.static(path.join(__dirname, "flutter")));
app.get("/api/page/:id", async (req, res) => {
getPageSessionHandler(req, res);
});
app.post("/api/payment", async (req, res) => {
console.log("handling payment request");
postPaymentHandler(req, res);
});
app.get("*", (_, res) => {
res.sendFile(path.resolve(__dirname, "flutter/index.html"));
});
app.listen(port, () => {
console.log(
`⚡️[server (nodejs)]: Server is running at http://localhost:${port}`,
);
});
const postPaymentHandler = (req: Request, res: Response) => {
  // implementation removed
};
const getPageSessionHandler = (req: Request, res: Response) => {
  // implementation removed
};
Of course, this runs fine locally as follows:
flutter build web --release --web-renderer=html
Then move the build/web* output to the proper folder in my nodejs server.
I can even locally containerize this application and run it from my docker desktop (windows 11) using the following dockerfile:
FROM debian:latest AS build-env
# Install flutter dependencies and nodejs dependencies
RUN apt-get update
RUN apt-get install -y curl git wget unzip gettext-base libgconf-2-4 gdb libstdc++6 libglu1-mesa fonts-droid-fallback lib32stdc++6 python3
RUN apt-get clean
RUN curl -fsSL https://deb.nodesource.com/setup_12.x | bash -
RUN apt-get -y install nodejs
WORKDIR /app
COPY . .
RUN npm install
RUN npm run build
RUN git clone https://github.com/flutter/flutter.git /usr/local/flutter
ENV PATH="/usr/local/flutter/bin:/usr/local/flutter/bin/cache/dart-sdk/bin:${PATH}"
WORKDIR /app/flutter_src/payment
# Build the app
RUN touch .env
RUN flutter clean && flutter pub get
RUN flutter build web --release --web-renderer=html
# Build the final image
WORKDIR /app
RUN cp -R flutter_src/payment/build/web dist/flutter
CMD ["/bin/sh", "-c", "exec node dist/index.ts"]
Again, this works fine in both my windows node server environment, and also in my docker desktop container. When I deploy this via my CI/CD pipeline to AWS ECR, I am unable to load the application. I can hit the associated API endpoints in the node's index.ts file above. My kubernetes pod is healthy... but I am not able to load routes with a second slash... i.e.:
https:///welcome <--- loads fine
https:///user/fun_username <-- does not load.
After a ton of debugging, I'm finding another oddity in how this behaves per environment. See the network log from a request to my application's deep-linked route in Chrome (and also Edge):
when requesting /page/{page-id-here}, the browser requests main.dart.js at the subroute.
What's even more perplexing is that if I request the same deep route in Firefox, not only does my application load and work as expected, but the browser requests main.dart.js from the root of my node server (where it should be, since the flutter directory is served statically). See this screenshot from Firefox:
here, the main.dart.js file is requested from the root of the node server, as I'd expect.
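The Chrome behavior above is consistent with the script URL being resolved relative to the current route rather than the server root. A minimal sketch of that resolution (the domain and route are hypothetical):

```javascript
// How a browser resolves a script src against the page URL (WHATWG URL rules).
// With a relative src, the deep route leaks into the resolved path;
// an absolute src resolves against the origin root.
const relative = new URL('main.dart.js', 'https://example.com/user/fun_username');
const absolute = new URL('/main.dart.js', 'https://example.com/user/fun_username');

console.log(relative.href); // https://example.com/user/main.dart.js (the Chrome symptom)
console.log(absolute.href); // https://example.com/main.dart.js (what the server expects)
```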
I have tried all sorts of routing tricks in the node server, but that just feels wrong, especially since the problem seems specific to my environment. If I revert the path strategy to client-only routing (/#/path/here), everything works, but that does not meet the requirements of this project.
Thank you all for your help. I'm at the end of weeks of struggling with this.

How can I use a custom build of CKEditor 5 with React and Vite?

For the past several months, I've been building my app with Create React App.
However, Ionic now supports Vite and I am attempting to migrate my app from CRA to Vite.
Originally, I made a CKEditor 5 Custom Build and set it up in a React app like this:
import React from 'react';
// eslint-disable-next-line @typescript-eslint/ban-ts-comment
// @ts-ignore CKEditor does not supply TypeScript typings.
import { CKEditor } from '@ckeditor/ckeditor5-react';
// eslint-disable-next-line @typescript-eslint/ban-ts-comment
// @ts-ignore CKEditor does not supply TypeScript typings.
import Editor from 'ckeditor5-custom-build/build/ckeditor';
Before building my app, I build the custom CKEditor like this:
cd ckeditor5; npm run build
The CKEditor build command is webpack --mode production.
Now, after configuring Vite, when I run npm run build, I get the following error:
'default' is not exported by ckeditor5/build/ckeditor.js, imported by
src/components/contentTypeCard/CKEditorInput.tsx
The CKEditor issue queue has a thread on a lack of documentation on issues with Vite, but there's nothing in particular about how to resolve this issue.
What I tried
I tried building CKEditor in development mode (webpack --mode development) and examining the ckeditor.js file to try to export Editor, but the file has over 100,000 lines of code and I am totally lost.
In my case it's:
"react": "18.2.0",
"vite": "2.9.10",
Here is the solution I found:
package.json
"ckeditor5-custom-build": "file:libs/ckeditor5",
vite.config.ts
export default defineConfig(() => {
return {
plugins: [react()],
optimizeDeps: {
include: ['ckeditor5-custom-build'],
},
build: {
commonjsOptions: { exclude: ['ckeditor5-custom-build'], include: [] },
},
};
});
RichTextEditor.tsx
import { CKEditor, CKEditorProps } from '@ckeditor/ckeditor5-react';
import Editor from 'ckeditor5-custom-build';

// RichTextEditorProps and EditorContainer are defined elsewhere in the project.
export function RichTextEditor({
defaultValue,
...props
}: RichTextEditorProps) {
return (
<EditorContainer>
<CKEditor editor={Editor} data={defaultValue || ''} {...props} />
</EditorContainer>
);
}

NPM and NodeJS Compatibility: NodeJS works from PM prompt, but not script

I am attempting to get a Lighthouse script running in Node.js (which I am new to). I followed the initial instructions here: https://github.com/GoogleChrome/lighthouse/blob/master/docs/readme.md#using-programmatically. I was able to complete the prior steps in the package manager console (Visual Studio 2017):
npm install -g lighthouse
lighthouse https://airhorner.com/
//and
lighthouse https://airhorner.com/ --output=json --output-path=./report/test1.json
However, I do get an initial warning that NPM only supports Node.JS in versions 4 through 8 and recommends a newer version. The problem is I am running Node v12 and NPM v5 - both the latest.
When I create a script version like below (app.js)
const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');
const config = {
  extends: 'lighthouse:default',
  settings: {
    emulatedFormFactor: 'desktop',
    onlyCategories: ['performance'],
    output: 'json',
    outputPath: './report.json'
  }
};
function launchChromeAndRunLighthouse(url, opts = {}, config) {
  return chromeLauncher.launch().then(chrome => {
    opts.port = chrome.port;
    return lighthouse(url, opts, config).then(results => {
      return chrome.kill().then(() => results.lhr);
    });
  });
}
// Usage:
launchChromeAndRunLighthouse('https://airhorner.com/', {}, config).then(results => {
  // Use results!
});
And run the command
C:\src\project> node app.js
I get the error - Cannot find module 'lighthouse'
Don't install lighthouse globally; install it inside your working directory instead.
First run npm init, which creates a package.json file in the current working directory.
Then npm install --save lighthouse will download it and save it to node_modules, so you can use it locally inside your working directory.
It should look something like this:
app.js
package.json
node_modules/
then run node app.js

CRA, Node.js, nginx in Docker?

I'm starting off a new project. I currently have a structure like this, from the root folder:
/app (CRA frontend app)
/server (Node.js Express app)
Dockerfile
docker-compose.yml
My requirements are the following:
Development
Fire up Docker that creates necessary container(s)
Hot reloading for frontend React app (using CRA)
Node.js server that can serve my React app with SSR (automatically updated when editing)
Accessible via http://localhost:3000
Production
Potentially fire up Docker that creates necessary container(s)
Creates production ready version of React app
Creates production ready version of Express app
Accessible via port 80
Where I am right now is somewhere in between. I don't know how to set up Docker the right way to make this whole thing work, and I don't really know how to structure my React app vs. the Express app while developing. The production part seems easier once I know how to structure the development part... plus Nginx as a proxy for the Express app?
I'm currently building a Docker container that starts up with hot reloading working etc., but I don't know how to set up the Express part so they work nicely together.
Any help is much appreciated.
Thanks
Very broad question. Perhaps better to break it down into more direct questions. Anyway, I don't think running your dev setup in Docker is ideal. Instead build your app normally with CRA. Then deploy in Docker.
In my own projects, I have a docker container running a node server which serves the react app using SSR.
Here is the docker part. Note that your package.json should have a script named start:prod for this to work. That script then starts your app in production.
// --- Dockerfile
# Pulled from docker hub and has everything
# needed to run a node project
FROM node:alpine
ENV PORT 3000
# Navigate (cd) to the app folder in the docker container
WORKDIR /usr/src/app
# Copy all package.json / package-lock.json etc. to the root folder
# Executed on build: docker build .
COPY ./package*.json ./
RUN npm i
# copy entire project into docker container
COPY . .
# build front-end with react build scripts and store them in the build folder
RUN npm run build
EXPOSE 3000
CMD ["npm", "run", "start:prod"]
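The start:prod script mentioned above might look like this in package.json (the exact commands are assumptions based on a typical CRA + SSR setup; bootstrap.js is the entry that registers Babel and requires the server):

```json
{
  "scripts": {
    "build": "react-scripts build",
    "start:prod": "node bootstrap.js"
  }
}
```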
Here's the express server that will start the server.
// -- server.js
import express from "express";
import router from "./controller/index";
const app = express();
const port = 4000;
// Tell the app to use the routes above
app.use(router);
// start the app
app.listen(port, () => {
console.log(`express running on port ${port}`);
});
Here is the controller/index.js file you'll need to start up
// -- controller/index.js
import express from "express";
import path from "path";
import serverRenderer from '../middleware/renderer';
const router = express.Router();
// root (/) should always serve our server rendered page
router.use('^/$', serverRenderer());
// other static resources should just be served as they are
router.use(express.static(
path.resolve(__dirname, '..', '..', 'build'),
{ maxAge: '30d' },
));
export default router;
And finally the renderer which renders the app on the server.
// -- renderer.js
import React from "react";
import { renderToString } from "react-dom/server";
import App from "../../src/App";
const path = require("path");
const fs = require("fs");
export default () => (req, res) => {
// point to html file created by CRA's build tool
const filePath = path.resolve(__dirname, "..", "..", "build", "index.html");
fs.readFile(filePath, "utf8", (error, htmlData) => {
if (error) {
console.error("error", error);
return res.status(404).end();
}
// render the app as string
const html = renderToString(<App />);
// inject rendered app into final html and send
return res.send(
htmlData
.replace('<div id="root"></div>', `<div id="root">${html}</div>`)
);
})
}
You will need bootstrap.js to inject support for certain packages.
// -- bootstrap.js
require('ignore-styles');
require('url-loader');
require('file-loader');
require('babel-register')({
ignore: [/(node_modules)/],
presets: ['es2015', 'react-app'],
plugins: [
'syntax-dynamic-import',
'dynamic-import-node'
]
});
require("./index");
You can find the details of it all here:
https://blog.mytoori.com/react-served-by-express-running-in-docker-container

express server starting react client

Until now, I have been using create-react-app for my projects, with the express-server and the react client each in their own folders.
However, I am now trying to avoid create-react-app in order to really understand how everything works under the hood. I am reading a Hacker Noon article that explains how to set up React with TypeScript and webpack. In this article they also have the Express server at the root of the client, compiling everything itself:
const path = require('path'),
express = require('express'),
webpack = require('webpack'),
webpackConfig = require('./webpack.config.js'),
app = express(),
port = process.env.PORT || 3000;
app.listen(port, () => { console.log(`App is listening on port ${port}`) });
app.get('/', (req, res) => {
res.sendFile(path.resolve(__dirname, 'dist', 'index.html'));
});
let compiler = webpack(webpackConfig);
app.use(require('webpack-dev-middleware')(compiler, {
noInfo: true, publicPath: webpackConfig.output.publicPath, stats: { colors: true }
}));
app.use(require('webpack-hot-middleware')(compiler));
app.use(express.static(path.resolve(__dirname, 'dist')));
In the end, the start command looks like it:
"start": "npm run build && node server.js"
So I assume the client and the server start on the same port.
Why would you do such a thing? Are there any pros and cons?
It is true that this allows your development to happen on the same Express server, and webpack will continuously update your dist/index.html with whatever changes you make to your files. There's not much of a disadvantage, since this setup is just for development. In production you'll typically serve a single built bundle, and webpack-dev-middleware will not be running. For production purposes you may only need static assets, but even a server that mostly serves client files may still be needed if you want server-side rendering and/or code splitting.
The command "npm run build && node server.js" runs two shell commands: npm run build is the first step, and because of the &&, the next command, node server.js, only runs if the first one succeeds. The latter is a slightly odd invocation; I would probably run node ./ (and name the server index.js), or at least just write node server.
What I'd prefer to see in your package.json:
"start": "yarn build && node ./"
That would be possible if you mv server.js index.js (and npm i -g yarn).
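Concretely, the suggested package.json might look like this (the build command is an assumption; adjust it to whatever your build step actually runs):

```json
{
  "main": "index.js",
  "scripts": {
    "build": "webpack --mode production",
    "start": "yarn build && node ./"
  }
}
```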
Another thing to note, and look into is what the build step does.
Further Explanation:
The command runs the build step so check what your "build": key runs in your package.json.
If the build exits with any code above 0, that counts as an error and the && will not run the next command.
Presumably, the build process described in the package.json takes all the JavaScript and CSS files and wires them into the index.html file, which is then sent to the client whenever someone accesses the '/' path.
After that succeeds, it will start the server that you put the code to above.
res.sendFile(path.resolve(__dirname, 'dist', 'index.html'));
will happen if anybody comes across the '/' path.