I am running a simple Angular Universal SSR (server-side rendering) application. Everything works fine and the server renders the HTML, but there is one problem: static assets like fonts, images, and icons are not loaded by the server, only by the browser. What I want is for the server to render the HTML together with the static assets.
I tried the express.static() function but couldn't make it work. How can I get this working?
Got it working thanks to the suggestions here.
Implement an HTTP interceptor as described in this article. It adds an absolute URL to every request made with a relative path, so SSR with a running server works. During static pre-rendering, however, this.request will be empty; in that case you should redirect such requests to your own static server, e.g. http://localhost:3000.
Create a Node.js script for pre-rendering. It runs a static server on port 3000 (the same port used in the interceptor), and once the server is listening you can execute npm run prerender in a child process. Then listen for the error and exit events on the subprocess and close the server when they fire:
const { spawn } = require('child_process');
// In static server 'listen' callback
const sp = spawn('npm', ['run', 'prerender'], { stdio: 'inherit', timeout: 5 * 60 * 1000 })
sp.on('error', (err) => {
// Pre-rendering failed.
// TODO kill subprocess, close server, end current process with error code
});
sp.on('exit', (code) => {
// Pre-rendering is finished
// TODO Close server
if (code !== 0) {
// Pre-rendering failed.
// TODO End current process with error code
}
})
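For reference, here is a minimal self-contained sketch of such a pre-render script, assuming the built browser bundle lives in dist/browser (adjust the path and the prerender command to your project):
// prerender.js: serve the built app on port 3000, then run `npm run prerender`
const express = require('express');
const { spawn } = require('child_process');

const app = express();
app.use(express.static('dist/browser')); // assumed output path of the browser build

const server = app.listen(3000, () => {
  const sp = spawn('npm', ['run', 'prerender'], { stdio: 'inherit' });

  sp.on('error', (err) => {
    // Pre-rendering could not be started
    console.error(err);
    server.close(() => process.exit(1));
  });

  sp.on('exit', (code) => {
    // Pre-rendering finished; shut the static server down and propagate the exit code
    server.close(() => process.exit(code === 0 ? 0 : 1));
  });
});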
My Express app has a small Node server set up so it can serve the index.html file when the home route '/' is hit. This is a requirement of using App Services from Azure: there has to be a server.js file that tells the server how to serve up the client. I had a previous implementation of this working, but I wanted to change my file structure. Previously, the client React app lived in a client folder and server.js lived in a server folder along with all of the controllers and routes. I've since moved the server API to its own application, since other apps depend on it, and moved the client up one directory into the main directory.
Everything was working fine until the other day, when suddenly hitting the home route / stopped serving the index.html file. If you hit any other route it works; if you click a button linking back to the homepage, it works; but it won't serve the app from /, and I cannot for the life of me figure out why. On my development server there are no errors in the console, and I am definitely targeting the correct directory for the index file, but it's as if the server isn't reading the route at all.
if (process.env.NODE_ENV === 'production') {
console.log('running');
app.use(express.static(path.resolve(path.join(__dirname, 'build'))));
// no matter what route is hit, send the index.html file
app.get('*', (req, res) => {
res.sendFile(path.resolve(path.join(__dirname, 'build', 'index.html')));
});
} else {
app.get('/', (req, res) => {
res.send('API is running...');
});
}
So here I'm saying: if NODE_ENV is production, make the build folder static, and then serve index.html for whatever route is hit. (Note: I also tried app.get with other route formats such as /* or /; all have the same issue. In my previous iteration, when the client and server were deployed in the same location, /* is what I used.) The .env variables are set up correctly: when the server is run, it console logs running. But even if I put a console.log inside the app.get() callback, it's as if it is never hit unless I access the route from somewhere else first.
For example, if I place a console.log inside app.get that prints hit whenever the route is hit, hitting / directly does nothing, but if I go to /login it serves up the correct HTML on the client and logs hit in the terminal...
If your server files live outside the client React app, then the server is accessing files (the build output) that are not in its own directory. You can serve those static files with the following code:
const express = require("express");
const app = express(); // create express app
const path = require('path');
// serve the React build output that sits one level above the server folder
app.use(express.static(path.join(__dirname, "..", "build")));
// also serve "build" relative to the current working directory, as a fallback
app.use(express.static("build"));
app.listen(5000, () => {
  console.log("server started on port 5000");
});
Now, in the package.json of the client React app, rename the start script under scripts to start-client. Then add the following script:
"start":"npm run build && (cd server && npm start)",
Basically, this will build the react app and start the server.
The scripts section of the client package.json should then look something like this (the start-client value is whatever your original start command was):
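"scripts": {
  "start": "npm run build && (cd server && npm start)",
  "start-client": "<your original start command, e.g. react-scripts start>"
}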
Also, in the package.json of your server, add the following script under scripts:
"start": "node server.js"
Now running npm start handles both steps: it builds the client and then starts the server.
I am trying to create some setup and teardown logic for an expressjs server. Here's my entry code:
import fs from "fs";
import express from "express";
import { setRoutes } from "./routes";
let app = express();
const server = app.listen(8080, function () {
console.log(`🚧 Mock Server is now running on port : ${8080}`);
});
app = setRoutes(app);
function stop() {
fs.rmdirSync("./uploads", { recursive: true });
fs.mkdirSync("uploads");
console.log("\n🧹 Uploads folder purged");
server.on("close", function () {
console.log("⬇ Shutting down server");
process.exit();
});
server.close();
}
process.on("SIGINT", stop);
// Purge sw images on restart
process.once("SIGUSR2", function () {
fs.rmdirSync("./uploads/swimages", { recursive: true });
console.log("🧹 Software Images folder purged");
process.kill(process.pid, "SIGUSR2");
});
The npm script to start this up is "start": "FORCE_COLOR=3 nodemon index.js --exec babel-node".
The setup and restart logic works as expected. I get 🚧 Mock Server is now running on port : 8080 logged to the console on startup. When I save a file, nodemon restarts the server and the code in process.once is executed. When I want to shut it all down, I press ctrl + c in the terminal. The cleanup logic in the stop function runs. However, the process never fully exits: in the terminal I am still stuck in the process and have to hit ctrl + c again to fully exit.
As far as I know there are no open connections (other questions mentioned that if there is a keep-alive connection still open, the server will not close properly, but as far as I can tell, that is not the case). I have tried different variations of server.close(callback), server.on('close', callback), process.exit(), process.kill(process.pid), etc, but nothing seems to fully exit the process.
Note that if I simply run node index.js, I do not have this issue. The cleanup logic runs, and the process exits to completion without issue. It seems to be an issue when using nodemon only.
I don't want other developers to have to wait for cleanup logic to run and then hit ctrl + c again. What am I missing to run my cleanup logic and fully exit the process in the terminal?
There is an open connection for sure. Check this package that can tell you which one: https://www.npmjs.com/package/wtfnode
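For example, a minimal sketch (assuming wtfnode is installed as a dev dependency) that dumps the open handles right before the server is closed, so you can see what is keeping the process alive:
// at the very top of index.js
const wtf = require('wtfnode');

// then, inside stop(), just before server.close():
wtf.dump(); // prints any sockets, timers, or servers that are still open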
I use vue-cli in my dockerized project, where port mapping looks like this: "4180:8080", and the actual message after compiling my SPA looks like:
App running at:
- Local: http://localhost:8080/app/
It seems you are running Vue CLI inside a container.
Access the dev server via http://localhost:<your container's external mapped port>/app/
The app works fine and I can access it via http://localhost:4180/app/ as intended, but I haven't found a proper way to change the message above so it shows this link instead of "It seems you are running Vue CLI inside a container...". I could use webpack hooks to insert the link before the message, but I would rather find a way to change the message generated by the CLI itself. Is that possible somehow?
I came to this question as I was looking to do the same thing with bash, running inside a Docker container (possibly what you're already doing).
You could achieve this by invoking Vue CLI commands from a spawned child Node process within your Docker container (assuming your container is running Node). You can then modify the stdout and stderr output accordingly.
You can call a JavaScript file in one of two ways:
use a shell script (bash, for example) to call node and run this script
set the entrypoint of your Dockerfile to use this script (assuming you're running node by default)
// index.js
const { spawn } = require('child_process')
const replacePort = string => {
return string.replace(`<your container's external mapped port>`, 8000)
}
const vueCLI = (appLocation, args) => {
return new Promise((resolve, reject) => {
const vue = spawn('vue', args, {cwd: appLocation})
vue.stdout.on('data', (data) => {
console.log(replacePort(data.toString('utf8', 0, data.length)))
})
vue.stderr.on('data', (error) => {
console.log(replacePort(error.toString('utf8', 0, error.length)))
})
vue.on('close', (exitCode) => {
if (exitCode === 0) {
resolve()
} else {
reject(new Error('Vue CLI exited with a non-zero exit code'))
}
})
})
}
vueCLI('path/to/app', CLI_Options).then(() => console.log('Vue CLI finished')).catch(error => console.error(error))
This approach does have drawbacks, not limited to:
slower performance, since each command runs in an extra child process
potential danger of memory leaks, subject to implementation
risk of zombie processes, should the parent process die
For the reasons above and several others, this is a route that was found to be unsuitable in my specific case.
Instead of changing the message, it's better to change the port Vue is listening on.
npm run serve -- --port 4180
This automatically updates the message to show the new port, and once you update your Docker port forwarding for the new port, it will work again.
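If you would rather not pass the flag on every run, the same port can be set in vue.config.js at the project root (a minimal sketch):
// vue.config.js
module.exports = {
  devServer: {
    port: 4180 // match the externally mapped Docker port
  }
};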
I am trying to add BrowserSync to my React.js Node project. My problem is that my project manages the URL routing, listening port, and Mongoose connection through the server.js file, so obviously when I run a browser-sync task and check the localhost URL http://localhost:3000 I get a Cannot GET /.
Is there a way to force browser-sync to use my server.js file? Should I be using a secondary nodemon server or something (and if so, how would the cross-browser syncing work)? I am really lost and all the examples I have seen add more confusion. Help!!
gulp.task('browser-sync', function() {
browserSync({
server: {
baseDir: "./"
},
files: [
'static/**/*.*',
'!static/js/bundle.js'
],
});
});
We had a similar issue that we were able to fix by using proxy-middleware (https://www.npmjs.com/package/proxy-middleware). BrowserSync lets you add middleware so you can process each request. Here is a trimmed-down example of what we were doing:
var proxy = require('proxy-middleware');
var url = require('url');
// the base url where to forward the requests
var proxyOptions = url.parse('https://appserver:8080/api');
// Which route browserSync should forward to the gateway
proxyOptions.route = '/api'
// so an ajax request to browserSync http://localhost:3000/api/users would be
// sent via proxy to http://appserver:8080/api/users while letting any requests
// that don't have /api at the beginning of the path fall back to the default behavior.
browserSync({
// other browserSync options
// ....
server: {
middleware: [
// proxy /api requests to api gateway
proxy(proxyOptions)
]
}
});
The cool thing about this is that you can change where the proxy is pointed, so you can test against different environments. One thing to note is that all of our routes start with /api which makes this approach a lot easier. It would be a little more tricky to pick and choose which routes to proxy but hopefully the example above will give you a good starting point.
The other option would be to use CORS, but if you aren't dealing with that in production it may not be worth messing with for your dev environment.
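If you do go the CORS route, a minimal sketch of what that could look like on the Express side (assuming the cors npm package and a BrowserSync origin of http://localhost:3000) is:
// in the Express API server
const express = require('express');
const cors = require('cors');
const app = express();

// allow cross-origin requests from the BrowserSync dev server
app.use(cors({ origin: 'http://localhost:3000' }));

app.listen(8080);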
I am working off of Yeoman's gulp-webapp generator. I have modified my gulp serve task to use my Express server rather than the default connect server it ships with. My issue is with the Livereload functionality. I am trying to simply port connect-livereload to work with my Express server rather than having to install new dependencies. It's my understanding that most connect middleware should work fine with Express, so I am assuming connect-livereload is compatible with Express 4.
Here are the contents of the relevant tasks in my gulpfile:
gulp.task('express', function() {
var serveStatic = require('serve-static');
var app = require('./server/app');
app.use(require('connect-livereload')({port: 35729}))
.use(serveStatic('.tmp'));
app.listen(3000);
});
gulp.task('watch', ['express'], function () {
$.livereload.listen();
// watch for changes
gulp.watch([
'app/*.ejs',
'.tmp/styles/**/*.css',
'app/scripts/**/*.js',
'app/images/**/*'
]).on('change', $.livereload.changed);
gulp.watch('app/styles/**/*.css', ['styles']);
gulp.watch('bower.json', ['wiredep']);
});
gulp.task('styles', function () {
return gulp.src('app/styles/main.css')
.pipe($.autoprefixer({browsers: ['last 1 version']}))
.pipe(gulp.dest('.tmp/styles'));
});
gulp.task('serve', ['express', 'watch'], function () {
require('opn')('http://localhost:3000');
});
With this simple setup, when I run gulp serve in my cmd everything spins up fine and I can accept requests at http://localhost:3000.
Now if I go and change the body's background color from #fafafa to #f00 in main.css and hit save, my gulp output responds with main.css was reloaded.
However, my webpage does not update. The background color is still light-grey instead of red.
Is there perhaps a conflict between my express server config and the way gulp handles its files? Is my Express server forcing the use of app/styles/main.css rather than the use of .tmp/styles/main.css? Shouldn't the livereload script handle the injection of the new temporary file?
Thanks for any help.
EDIT:
I was able to move forward a bit by adding livereload.js to the script block of my index file, like so:
<script src="http://localhost:35729/livereload.js"></script>
I am now able to get live changes pushed to the client. Why was this file not getting injected before? How can I ensure this is handled programmatically as opposed to pasting it into my files?
I was able to get past this issue by removing app.use(require('connect-livereload')({port: 35729})) from my gulpfile, along with a couple of other lines, and instantiating it in my Express server's app.js file instead.
My gulpfile's express task now looks like this:
gulp.task('express', function() {
var app = require('./server/app');
app.listen(3000);
});
I added in the connect-livereload just above where I specify my static directory in Express:
if (app.get('env') === 'development') {
app.use(require('connect-livereload')());
}
app.use(express.static(path.join(__dirname, '../app')));
Once I started using this setup, I was getting the livereload.js script injected into my document, and client-side changes are now auto-refreshed just how I wanted.
Hope this helps someone!