I am at the early stages of setting up a Next.js application; so far I only have experience with React.
I set up Docker with a frontend app (Next.js) on localhost:3000 and a backend app (Node.js/Express) on localhost:5000. They both work.
Now I am trying to call an Express endpoint from the frontend. What I am doing is:
const registerUser = async event => {
  event.preventDefault()
  const res = await fetch(
    process.env.NEXT_PUBLIC_SERVER + '/user/signup',
    {
      body: JSON.stringify({
        username: event.target.name.value,
        email: event.target.email.value,
        password: event.target.password.value
      }),
      headers: {
        'Content-Type': 'application/json'
      },
      method: 'POST'
    }
  )
  const result = await res.json()
}
and I am getting an error saying
Access to fetch at 'http://localhost:5000/user/signup' from origin 'http://localhost:3000' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
just a note: the endpoint works as expected using Postman.
I did some research and found a few resources saying I should call an internal Next.js endpoint (pages/api) and, from there, call my API. Is this the best practice with Next.js? In React I just used to call the API directly.
Besides just how to solve this, I would like to know what the best practice is in this case. Thanks.
If you have separate servers for frontend and backend (for example, next.js and express) that cannot listen on the same port, there are two broad alternatives:
Either the browser loads the frontend from one server and makes API requests to the other server
next.js <-- browser --> express
This requires the backend app to set CORS headers, for example using the cors package and a statement like
app.use(cors({origin: "host of next.js", ...}));
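For example, a minimal sketch of the Express side, assuming the Next.js app is served from http://localhost:3000 and the /user/signup route from the question:
// a minimal sketch, assuming the Next.js app runs on http://localhost:3000
const express = require('express')
const cors = require('cors')

const app = express()
app.use(cors({ origin: 'http://localhost:3000' }))
app.use(express.json())

app.post('/user/signup', (req, res) => {
  // ... create the user here ...
  res.json({ ok: true })
})

app.listen(5000)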
Or the browser makes all requests to the port of next.js, and this forwards all API requests to the other server
browser --> next.js --> express
No CORS is necessary in this case, but API requests take an extra hop. So it is simplicity vs. performance (as so often).
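One way to set up that forwarding, assuming the Express server listens on port 5000, is a rewrite in next.config.js (supported since Next.js 9.5):
// next.config.js
module.exports = {
  async rewrites() {
    return [
      {
        // forward /api/* from the Next.js server to the Express backend
        source: '/api/:path*',
        destination: 'http://localhost:5000/:path*',
      },
    ]
  },
}
The frontend would then fetch '/api/user/signup' instead of using NEXT_PUBLIC_SERVER, and the browser only ever talks to port 3000.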
First of all, are you sure you need an Express BE? The power of Next.js lies in its serverless approach; most of the time, unless you have a very complex BE, you can do everything with serverless functions.
If you really need to have a separate Express server for your Next application, remember that you will lose some important Next features:
Before deciding to use a custom server, please keep in mind that it should only be used when the integrated router of Next.js can't meet your app requirements. A custom server will remove important performance optimizations, like serverless functions and Automatic Static Optimization.
Usually, to address CORS issues in the dev environment (the FE needs to run on a different port from the BE to keep Hot Reload), the best approach with React is the proxy approach: you can just add an entry to package.json in the React project,
"proxy": "http://localhost:5000" (if your server runs on PORT 5000)
Source: https://create-react-app.dev/docs/proxying-api-requests-in-development/
This way all the HTTP traffic is redirected to port 5000 and reaches your Express server, while you keep hot reload and your client files running on port 3000.
By the way, that's the case if you have a standard React FE and a custom Express BE. If you are using Next.js, even with a custom Express server, you need to create the server and connect it through Next:
// server.js
const { createServer } = require('http')
const { parse } = require('url')
const next = require('next')

const dev = process.env.NODE_ENV !== 'production'
const hostname = 'localhost'
const port = 3000
// when using middleware `hostname` and `port` must be provided below
const app = next({ dev, hostname, port })
const handle = app.getRequestHandler()

app.prepare().then(() => {
  createServer(async (req, res) => {
    try {
      // Be sure to pass `true` as the second argument to `url.parse`.
      // This tells it to parse the query portion of the URL.
      const parsedUrl = parse(req.url, true)
      const { pathname, query } = parsedUrl

      if (pathname === '/a') {
        await app.render(req, res, '/a', query)
      } else if (pathname === '/b') {
        await app.render(req, res, '/b', query)
      } else {
        await handle(req, res, parsedUrl)
      }
    } catch (err) {
      console.error('Error occurred handling', req.url, err)
      res.statusCode = 500
      res.end('internal server error')
    }
  }).listen(port, (err) => {
    if (err) throw err
    console.log(`> Ready on http://${hostname}:${port}`)
  })
})
source: https://nextjs.org/docs/advanced-features/custom-server
Again, I suggest you carefully evaluate whether you really need a custom Express server for your app, because most of the time you don't, and the development experience is much smoother in a serverless environment!
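For illustration, here is a minimal sketch of such a serverless function: a hypothetical pages/api/signup.js (the actual user-creation logic is left out):
// pages/api/signup.js -- hypothetical API route; the user-creation logic is up to you
export default async function handler(req, res) {
  if (req.method !== 'POST') {
    res.status(405).end()
    return
  }

  const { username, email, password } = req.body

  // ... create the user here (validation, database call, etc.) ...

  res.status(200).json({ username, email })
}
The frontend from the question could then call fetch('/api/signup', ...) and no CORS configuration would be needed.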
I have a single project repository and multiple databases for different clients.
These are the database names and access URLs I am using; the database to connect to is chosen based on the URL:
Client's Database
shreyas_db (http://shreyas.localhost:3001)
ajay_db (http://ajay.localhost:3001)
vijay_db (http://vijay.localhost:3001)
Please guide me on how to implement this structure in Node.js with Express.
Thanks
Here is the solution I figured out:
Add a router
app.use('/api', await APIRouter())
Connect to the different DBs in the router middleware.
Get the subdomain (if you are using Nginx, you may find that req.subdomains or req.host does not return what you expect; try req.headers.referer instead), and use res.locals to store the DB handle so you can get it from every API call.
export const APIRouter = async () => {
  const router = express.Router()
  const client = await connectDatabase()

  router.use(async (req, res, next) => {
    // const host = req.headers.referer?.replace("https://", "");
    // const subdomain = host ? host.substring(0, host.indexOf('.')) : 'main'
    const subdomain = req.subdomains.length ? req.subdomains[0] : 'main'
    const db = client.db(subdomain)
    res.locals.db = {
      clients: db.collection<Client>('clients'),
    }
    next()
  })

  return router
};
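A hypothetical endpoint added inside APIRouter, after the middleware above, could then read the per-tenant collection from res.locals (assuming the MongoDB driver's find().toArray()):
// hypothetical route; res.locals.db was set by the middleware above
router.get('/clients', async (req, res) => {
  const clients = await res.locals.db.clients.find({}).toArray()
  res.json(clients)
})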
I have got 2 domain names. For instance, example0.org and example1.org.
How do I set up Node.js with Express to manage both of them?
For instance, I want to serve the /publicexample0 folder as the root for example0.org and the /publicexample1 folder as the root for example1.org.
This works for just one domain:
var app = express.createServer();

app.get('/', function(req, res) {
  res.send('Hello World');
});

app.listen(3000);
I guess what you can do is take advantage of the HTTP Host header. It contains:
The domain name of the server (for virtual hosting), and the TCP port
number on which the server is listening. The port number may be
omitted if the port is the standard port for the service requested.
Mandatory since HTTP/1.1.
You can see its specification in the RFC 2616 - HTTP v1.1
And obviously you could read the header out of your request in Express and make a decision based on its value.
router.get('/hello', function(req, res) {
  var host = req.get('host');
  if (host === 'example0.org') {
    res.status(200).send('Welcome from example0.org');
  } else if (host === 'example1.org') {
    res.status(200).send('Welcome from example1.org');
  } else {
    res.status(200).send('Welcome stranger');
  }
});
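Applied to the original question, a rough sketch that picks the static root by host, assuming the /publicexample0 and /publicexample1 folders from the question exist at those paths:
var express = require('express');
var app = express();

// one static middleware per document root (folders from the question)
var serve0 = express.static('/publicexample0');
var serve1 = express.static('/publicexample1');

app.use(function(req, res, next) {
  if (req.get('host') === 'example1.org') {
    serve1(req, res, next);
  } else {
    serve0(req, res, next);
  }
});

app.listen(3000);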
I have a linux server with a single IP bound to it. I want to host multiple Node.js sites on this server on this IP, each (obviously) with a unique domain or subdomain. I want them all on port 80.
What are my options to do this?
An obvious solution seems to be to have all domains serviced by a Node.js web app that acts as a proxy and passes requests through to the other Node.js apps running on unique ports.
Choose one of:
Use some other server (like nginx) as a reverse proxy.
Use node-http-proxy as a reverse proxy.
Use the vhost middleware if each domain can be served from the same Connect/Express codebase and node.js instance.
Diet.js has a very nice and simple way to host multiple domains with the same server instance: you simply call server() for each of your domains.
A Simple Example
// Require diet
var server = require('diet');

// Main domain
var app = server()
app.listen('http://example.com/')
app.get('/', function($){
  $.end('hello world ')
})

// Sub domain
var sub = server()
sub.listen('http://subdomain.example.com/')
sub.get('/', function($){
  $.end('hello world at sub domain!')
})

// Other domain
var other = server()
other.listen('http://other.com/')
other.get('/', function($){
  $.end('hello world at other domain')
})
Separating Your Apps
If you would like to have different folders for your apps then you could have a folder structure like this:
/server
  /yourApp
    /node_modules
    index.js
  /yourOtherApp
    /node_modules
    index.js
  /node_modules
  index.js
In /server/index.js you would require each app by its folder:
require('./yourApp')
require('./yourOtherApp')
In /server/yourApp/index.js you would setup your first domain such as:
// Require diet
var server = require('diet')

// Create app
var app = server()
app.listen('http://example.com/')
app.get('/', function($){
  $.end('hello world ')
})
And in /server/yourOtherApp/index.js you would setup your second domain such as:
// Require diet
var server = require('diet')

// Create app
var app = server()
app.listen('http://other.com/')
app.get('/', function($){
  $.end('hello world at other.com ')
});
Read More:
Read more about Diet.js
Read more about Virtual Hosts in Diet.js
Read more about Server in Diet.js
Hm ... why do you think that Node.js should act as a proxy? I suggest running several Node apps listening on different ports, and then using nginx to forward each request to the right port. If you use a single Node.js app as the proxy, you also have a single point of failure: if that app crashes, all the sites go down.
Use nginx as a reverse proxy.
http://www.nginxtips.com/how-to-setup-nginx-as-proxy-for-nodejs/
Nginx brings a whole host of benefits to your applications in the form of caching, static file handling, SSL and load balancing.
I have an API I use on a site and below is my configuration. I also have it with SSL and GZIP; if someone needs it, just ask in the comments.
var http = require('http'),
    httpProxy = require('http-proxy');

var proxy_web = httpProxy.createProxyServer({
  target: {
    host: 'localhost',
    port: 8080
  }
});

var proxy_api = httpProxy.createProxyServer({
  target: {
    host: 'localhost',
    port: 8081
  }
});

// attach the error handlers once, outside the request handler
proxy_web.on('error', function(err, req, res) {
  if (err) console.log(err);
  res.writeHead(500);
  res.end('Oops, something went very wrong...');
});

proxy_api.on('error', function(err, req, res) {
  if (err) console.log(err);
  res.writeHead(500);
  res.end('Oops, something went very wrong...');
});

http.createServer(function(req, res) {
  // the Host header contains only the host name, never the protocol
  if (req.headers.host === 'www.domain.com') {
    proxy_web.web(req, res);
  } else if (req.headers.host === 'api.domain.com') {
    proxy_api.web(req, res);
  }
}).listen(80);
If you are using a Connect/Express server, you can look at the vhost middleware. It allows multiple domains (sub-domains) to be used for the server address.
You can follow the example given here, which looks exactly like what you need.
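For reference, a minimal sketch with the vhost middleware; the domains and handlers are only illustrative:
var express = require('express');
var vhost = require('vhost');

// one sub-app per domain (illustrative handlers)
var site0 = express();
site0.get('/', function(req, res) { res.send('Hello from example0.org'); });

var site1 = express();
site1.get('/', function(req, res) { res.send('Hello from example1.org'); });

// the main app dispatches by Host header
var main = express();
main.use(vhost('example0.org', site0));
main.use(vhost('example1.org', site1));
main.listen(80);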
Here's how to do it using vanilla Node.js:
const http = require('http')
const url = require('url')

const port = 5555
const sites = {
  exampleSite1: 544,
  exampleSite2: 543
}

const proxy = http.createServer((req, res) => {
  const { pathname: path } = url.parse(req.url)
  const { method, headers } = req
  const hostname = headers.host.split(':')[0].replace('www.', '')

  if (!sites.hasOwnProperty(hostname)) throw new Error(`invalid hostname ${hostname}`)

  const proxiedRequest = http.request({
    hostname,
    path,
    port: sites[hostname],
    method,
    headers
  })

  proxiedRequest.on('response', remoteRes => {
    res.writeHead(remoteRes.statusCode, remoteRes.headers)
    remoteRes.pipe(res)
  })

  proxiedRequest.on('error', () => {
    res.writeHead(500)
    res.end()
  })

  req.pipe(proxiedRequest)
})

proxy.listen(port, () => {
  console.log(`reverse proxy listening on port ${port}`)
})
Pretty simple, huh?
First install forever and bouncy.
Then write a startup script. In this script, add a rule to the iptables firewall utility to tell it to forward the traffic on port 80 to port 8000 (or anything else that you choose). In my example, 8000 is where I run bouncy
sudo iptables -t nat -A PREROUTING -p tcp --dport 80 -j REDIRECT --to-port 8000
Using forever, let's tell the script to run bouncy on port 8000
forever start --spinSleepTime 10000 /path/to/bouncy /path/to/bouncy/routes.json 8000
The routes.json would look something like:
{
  "subdomain1.domain.com": 5000,
  "subdomain2.domain.com": 5001,
  "subdomain3.domain.com": 5002
}
NodeJS application1, application2 and application3 run on port 5000, 5001 and 5002 respectively.
The script I use in my case can be found here and you might have to change a little to fit in your environment.
I also wrote about this in more details and you can find it here.
This is my simplest demo project, without any middleware or proxy.
It requires only a little code, and it's enough.
https://github.com/hitokun-s/node-express-multiapp-demo
With this structure, you can easily set up and maintain each app independently.
I hope this would be a help for you.
Literally, when you get the request and response objects, you can get the domain through request.headers.host (not the IP address, but the actual domain the client asked for).
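A tiny sketch with the plain http module, just to illustrate:
const http = require('http');

http.createServer((req, res) => {
  // req.headers.host holds the domain (and port, if non-standard) the client asked for
  res.end('You reached ' + req.headers.host);
}).listen(3000);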
Based on #Michaaatje and #papiro, a very easy way:
Say you have some typical pages such as...
var app = express()

app.use(sess)
app.use(passport.initialize())
app.use(passport.session())
app.use('/static', express.static('static'))

app.get('/', ensureLoggedIn("/loginpage"), function(req, res, next) {
  ...
})

app.get('/sales', ensureLoggedIn("/loginpage"), function(req, res, next) {
  ...
})

app.get('/about', ensureLoggedIn("/loginpage"), function(req, res, next) {
  ...
})

app.post('/order', ensureLoggedIn("/loginpage"), urlencodedParser, (req, res) => {
  ...
})
.. and so on.
Say the main domain is "abc.test.com"
But you have an "alternate" domain (perhaps for customers) which is "customers.test.com".
Simply add this ...
var app = express()

app.use(sess)
app.use(passport.initialize())
app.use(passport.session())
app.use('/static', express.static('static'))

app.use((req, res, next) => {
  req.isCustomer = false
  if (req.headers.host == "customers.test.com") {
    req.isCustomer = true
  }
  next();
})
and then it is this easy ...
app.get('/', ensureLoggedIn("/loginpage"), function(req, res, next) {
  if (req.isCustomer) {
    // .. special page or whatever ..
    return
  }
  ...
})

app.get('/sales', ensureLoggedIn("/loginpage"), function(req, res, next) {
  if (req.isCustomer) {
    res.redirect('/') // .. for example
    return
  }
  ...
})

app.get('/about', ensureLoggedIn("/loginpage"), function(req, res, next) {
  if (req.isCustomer) { ... }
  ...
})

app.post('/order', ensureLoggedIn("/loginpage"), urlencodedParser, (req, res) => {
  if (req.isCustomer) { ... }
  ...
})
Thanks to #Michaaatje and #papiro .
This guide from DigitalOcean is an excellent way. It uses the pm2 module, which daemonizes your apps (runs them as a service). There is no need for additional modules like forever, because pm2 will restart your app automatically if it crashes. It has many features that help you monitor the various applications running on your server. It's pretty awesome!
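As an illustration, a hypothetical ecosystem.config.js for running two sites under pm2; the app names, paths and ports below are made up:
// hypothetical ecosystem.config.js; names, paths and ports are examples only
module.exports = {
  apps: [
    { name: 'site-a', script: './site-a/index.js', env: { PORT: 5000 } },
    { name: 'site-b', script: './site-b/index.js', env: { PORT: 5001 } }
  ]
};
You would then start both apps with pm2 start ecosystem.config.js.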
This can be done super easily with redbird. Suppose you have two domain names, example.com and example1.com, and two websites served on ports 8000 and 8001.
You can set a reverse proxy to the two websites by the following scripts.
var proxy = require('redbird')({port: 80});
proxy.register("example.com", "http://localhost:8000");
proxy.register("example1.com", "http://localhost:8001");
You can use a process management tool such as pm2 to run the script so that it keeps serving after you close your shell.
I am sorta new to node.js and web programming in general, so excuse me if I ask strange questions :D
So here is the way I am setting up my express node.js project.
I have a top level app.js simply to redirect traffic to a few subdomains:
var app = module.exports = express.createServer(options);
app.use(express.vhost('blog.localhost', require('./apps/blog/blog.js')));
app.use(express.vhost('app1.localhost', require('./apps/app1/app1.js')));
app.use(express.vhost('login.localhost', require('./apps/login/login.js')));
In each of the sub-apps included via require(), I simply export a new Express server:
module.exports = express.createServer(options);
What is the most elegant way to set up a 404 page? When I was just using a single app, I simply used
app.use(function(req, res, next){
res.render('404', {
});
});
But if I use this method, I am going to have to create a 404.jade for every app, and I dislike useless duplication in code. Any idea how to share a single 404 handler across multiple vhosts?
You have 3 hosts that require 3 route files. My suggestion is to require this 404 route in each of them from an external file.
This way every host has the 404 route, but it lives in just one place.
Example:
404 route file - 404.js
module.exports = function (app, options) {
  app.get('/404', function(req, res, next) {
    res.render('404', { options: options });
  });
}
routes for host A
// define other routes here
...
// you pass the app as a param
// the last param is for local vars sent to the view
require('./routes/404.js')(app, { host: 'HostA' });