express.js - serve different pages based on ip - node.js

I'm developing a NodeJS server that runs on a Raspberry Pi. The Pi is attached to a screen and shows a website in kiosk mode. Users who see that screen can also connect to the server, but should be given a different page. I want both sites to run on the root (http://localhost/ and http://serverIP/), but serve different pages.
The server should detect whether the request comes from localhost or from any other device and serve the appropriate page.
Currently I can reach the same page from both localhost and a remote device, and I can also tell whether the request was made by localhost or by a remote device. But when I try to redirect the remote IPs, they get stuck in a redirect loop.
app.use('/', function(request, response, next) {
    let clientIP = getClientIP(request);
    console.log('client IP: ', clientIP);
    if (clientIP == '::1' || clientIP == '::ffff:127.0.0.1') {
        // if localhost request, go to the next middleware to serve public/index.html
        next();
    } else {
        // if remote device request, return the mobile page in public/mobilepage/mobile.html.
        // This gets stuck in a loop: when redirected, app.use() runs again for the
        // redirected request, sees it is not localhost, and redirects again.
        // What should I do here instead?
        response.redirect('/mobilepage');
    }
}, express.static('public'));
function getClientIP(request) {
    return request.headers['x-forwarded-for'] || request.connection.remoteAddress;
}
My project folder looks like this
ProjectFolder/
| - server.js //the node server
| - public/ //the folder with the website
| | - index.html //the localhost main website
| | - assets/ //folder with all the css and js for the localhost website
| | - mobilepage/ //folder with the page for the remote devices
| | | - mobile.html //the page for the remote devices
| | | - mobileStyle.css //the style for the remote devices page
| | | - mobileScript.js //the script for the remote devices page
| | - sites/ //folder with all other sites for the localhost website
|
| - ... //other files like node_modules etc. that are not important to the question
I hope I explained my situation well enough. If something is unclear please let me know, I would love to get some help on this.

You can install a package called express-ip:
const express = require('express');
const expressip = require('express-ip');
const app = express();

app.use(expressip().getIpInfoMiddleware);

app.get('/', function (req, res) {
    res.send(req.ipInfo);
});
Note: inside app.use, apply conditional logic based on the value of req.ipInfo and your objective.
Also, this won't work on localhost, since the middleware needs a public IP to do a lookup. So to test it, use ngrok to expose http://localhost:yourportvalue.
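A way around the redirect loop is to not redirect at all and instead send the right file for each client. This is only a sketch: the pageFor helper and the LOCAL_IPS set are illustrative names, and the Express wiring is left as comments because it assumes express-ip is installed and req.ipInfo is populated.

```javascript
// Illustrative: the loopback addresses the kiosk itself can show up as.
const LOCAL_IPS = new Set(['::1', '127.0.0.1', '::ffff:127.0.0.1']);

// Decide which page a client should get. Because we send a file instead of
// redirecting, the middleware never re-enters itself and no loop can occur.
function pageFor(clientIP) {
  return LOCAL_IPS.has(clientIP) ? 'index.html' : 'mobilepage/mobile.html';
}

// Possible wiring (assumes express and express-ip):
// const path = require('path');
// app.get('/', (req, res) => {
//   res.sendFile(path.join(__dirname, 'public', pageFor(req.ipInfo.ip)));
// });
// app.use(express.static('public')); // css/js assets still served normally
```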

Related

express app is not sending index.html file to client

So my Express app has a small Node server setup so it can serve up the index.html file when the home route '/' is hit. This is a requirement of using App Services from Azure: there has to be this server.js file to tell the server how to serve up the client. I had a previous implementation of this working, but I wanted to change my file structure. Previously I had the client React app in a folder client and the server.js in a folder server along with all of the controllers and routes. I've since moved the server API to its own application, as there are other apps that depend on it, and I moved the client up one directory into the main directory. Everything was working fine until the other day, when all of a sudden hitting the home route / would not serve up the index.html file. If you hit any other route it works; if you even hit a button linking back to the homepage, it works; but it won't serve up the app from /, and I cannot for the life of me figure out why. On my development server there are no errors in the console, and I'm most definitely targeting the correct directory and place for the index, but it's like the server isn't reading the route to serve it up.
if (process.env.NODE_ENV === 'production') {
    console.log('running');
    app.use(express.static(path.resolve(path.join(__dirname, 'build'))));
    // no matter what route is hit, send the index.html file
    app.get('*', (req, res) => {
        res.sendFile(path.resolve(path.join(__dirname, 'build', 'index.html')));
    });
} else {
    app.get('/', (req, res) => {
        res.send('API is running...');
    });
}
So here I'm saying: if NODE_ENV is production, make the build folder static, and then send index.html whatever route is hit. (Note: I also tried this app.get with other route formats such as /* or /; all have the same issue. However, in my previous iteration, when the client and server were deployed in the same location, /* is what I used.) The .env variables are set up correctly, as the server console logs running when it starts. But even if I put a console log inside of app.get(), it's like it's never hit unless I access the route from something else first.
For example, if I place a console log inside of app.get that prints hit whenever the route is hit, hitting / directly does nothing, but if I go to /login it will serve up the correct HTML on the client and log hit in the terminal...
If your server files live inside the client React app, then you are basically accessing files which are not inside the server's own folder. So, we can serve the static files using the following code:
const express = require("express");
const app = express(); // create express app
const path = require('path');

app.use(express.static(path.join(__dirname, "..", "build")));
app.use(express.static("build"));

app.listen(5000, () => {
    console.log("server started on port 5000");
});
Now, in the package.json of the client React app, rename the start script under scripts to start-client. Then add the following script:
"start":"npm run build && (cd server && npm start)",
Basically, this will build the react app and start the server.
Also, in the package.json of your server, add the following script:
"start":"node server.js"
Now, when you run npm start, the React app builds and the server starts.

How to deploy NextJS application to Linux Server (CentOS 7) - VPS

I've got a question regarding building applications. I'm using a simple VPS with Node.js support, and I do not know how to build my Next.js application for production.
I want to deploy my application as static files.
I thought that I should use next build && next export and then copy the out dir to the server, but during this process I faced an issue: when I change routes everything is okay, but if I refresh the page, the page is not found, because the server is looking for the file in its directories. So how can I deploy my Next.js application in production mode on a VPS with static files?
I tried one thing which is probably not working fine, or I did something wrong: I added a Node.js Express server with
const express = require('express');
const next = require('next');

const dev = process.env.NODE_ENV !== 'production';
const app = next({ dev });
const router = express.Router();
const handle = app.getRequestHandler();

app.prepare().then(() => {
    const server = express();

    server.get('*', (req, res) => {
        return handle(req, res);
    });

    server.listen(3000, (err) => {
        if (err) throw err;
        console.log('> Ready on http://localhost:3000');
    });
});
and I start the server with the forever library as NODE_ENV=production node server.js. It works fine, but it seems to be working in the wrong way: it behaves like a normal server in dev mode, which shouldn't be the case (I see the thunder icon in the bottom-right corner, and I see all the same files as in dev mode).
I want to deploy everything as static files.
Thank you for your help!
After you build and export, you need to serve those files somehow. The reason the Express server works is that you are starting an HTTP server to serve the files.
So you need to serve those files either with a static hosting provider (e.g. Vercel or Amazon S3), or by starting a server on your Linux machine using something like serve to serve them at a port, similar to your Express server serving them at localhost:3000, which is then exposed on your VPS.
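If you do want to serve the exported files yourself, the refresh-404 problem comes from the fact that next export writes each route as its own .html file. A rough sketch of mapping request paths to those files (the exportedFile helper is an illustrative name, not a Next.js API, and the out directory is the default export target):

```javascript
// Map a request path to the file `next export` wrote, so refreshing
// /about serves out/about.html instead of returning 404.
function exportedFile(urlPath) {
  if (urlPath === '/') return 'index.html';
  // '/about' -> 'about.html', '/blog/post/' -> 'blog/post.html'
  return urlPath.replace(/\/+$/, '').slice(1) + '.html';
}

// Possible wiring with express (assumed installed):
// const express = require('express');
// const path = require('path');
// const app = express();
// app.use(express.static('out')); // js/css/images and exact .html hits
// app.get('*', (req, res) =>
//   res.sendFile(path.join(__dirname, 'out', exportedFile(req.path)),
//     (err) => err && res.status(404).end()));
// app.listen(3000);
```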

How can I get create-react-app to use an IP to connect to my api server?

I'm using Facebook's create-react-app. When I start the web-client I see in console:
You can now view web-client in the browser.
Local: http://localhost:3000/
On Your Network: http://192.168.1.107:3000/
The problem is my web-client uses localhost to connect to the api-server, which means I can't use the IP address on different devices to debug issues.
env-variables.js:
export const ENV = process.env.NODE_ENV || 'development';
const ALL_ENV_VARS = {
    development: {
        GRAPHQL_ENDPOINT_URI: 'http://localhost:4000/graphql',
    },
    ....
I tried updating the above with:
GRAPHQL_ENDPOINT_URI: `http://${process.env.ip}:4000/graphql`,
That did not work; process.env.ip returns undefined. How can I get the above GRAPHQL_ENDPOINT_URI to use the IP address which create-react-app is somehow getting?
Try adding the following to your client-side package.json:
"proxy": "http://localhost:4000/",
You can then leave the http://localhost:4000 part off of any URLs pointing to the API server from the client side. You would just refer to those addresses as /graphql/<any additional URL data>.
I've performed the same with a Node/Express backend and a React frontend - I resolved the /api portion in my server.js with the following:
//Use our router configuration when we call /api
app.use('/api', router);
just replace /api with /graphql there.
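Put together, the env-variables.js entry from the question shrinks to a relative path. A sketch reusing the question's own names (the endpointFor helper is illustrative):

```javascript
// With "proxy": "http://localhost:4000/" in the client package.json,
// the endpoint no longer needs a host at all: the dev server forwards
// unknown requests to the API, so any device on the network works.
const ALL_ENV_VARS = {
  development: {
    GRAPHQL_ENDPOINT_URI: '/graphql', // was 'http://localhost:4000/graphql'
  },
};

// Illustrative helper: look up the endpoint for the current environment,
// falling back to development when the environment is unknown.
function endpointFor(env) {
  return (ALL_ENV_VARS[env] || ALL_ENV_VARS.development).GRAPHQL_ENDPOINT_URI;
}
```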
Take a look at this article for further explanation. Hope this helps!
https://medium.freecodecamp.org/how-to-make-create-react-app-work-with-a-node-backend-api-7c5c48acb1b0

Http Serving with node.js over the web

This is a very basic question. But I have looked and can't seem to find any tutorials that walk through this step. Everything either stops just before this step OR starts just after it.
I have launched an AWS server (Windows_Server-2016-English-Full-Containers-2016.10.18 (ami-d08edfc7)) and installed node in the default directory.
I have created a file with the following content:
var http = require('http');
const PORT = 8080;

function handleRequest(request, response) {
    response.end('It Works!! Path Hit: ' + request.url);
}

var server = http.createServer(handleRequest);
server.listen(PORT, function() {
    console.log("Server listening on: http://localhost:%s", PORT);
});
I then open CMD, navigate into the directory where node is installed, and run the program with:
node myServer.js
Next I open a browser and navigate to http://localhost:8080 and I am served some content. Terrific.
My question is how do I go about making a request of that newly installed server from another machine over the internet. My primitive guess was to simply navigate to the AWS machine's public IP, as displayed in the AWS console and include the port number.
So for example if my IP were 55.173.140.15 I would type in the address http://55.173.140.15:8080 and expect to see the page. That is not working. So what configuration step am I missing?

Sharing one port among multiple node.js HTTP processes

I have a root server running with several node.js projects on it. They are supposed to run separately in their own processes and directories. Consider this file structure:
/home
+-- /node
+-- /someProject | www.some-project.com
| +-- index.js
| +-- anotherFile.img
| +-- ...
+-- /anotherProject | www.another-project.com
| +-- /stuff
| +-- index.js
| +-- ...
+-- /myWebsite | www.my-website.com
| +-- /static
| +-- index.js
| +-- ...
+-- ... | ...
Each index.js should be started as an individual process with its cwd set to its parent folder (someProject, anotherProject, etc.).
Think of vHosts. Each project starts a webserver which listens on its own domain. And there's the problem: only one script can start, since they all try to bind to port 80. I dug into the Node.js API and looked for a possible solution: child_process.fork().
Sadly this doesn't work very well. When I try to send a server instance to the master process (to emit a request later on), or an object consisting of request and response from the master to the slave, I get errors. This is because Node.js internally tries to convert these advanced objects to a JSON string and then reconvert them to their original form. This makes all the objects lose their references and functionality.
Second approach, child.js:
var http = require("http");

var server = http.createServer(function(req, res) {
    // stuff...
});

server.listen(80);
process.send(server); // Nope
First approach, master.js:
var http = require("http"),
    cp = require("child_process");

var child = cp.fork("/home/node/someProject/index.js", [], { env: "/home/node/someProject" });

var router = http.createServer(function(req, res) {
    // domaincheck, etc...
    child.send({ request: req, response: res }); // Nope
});

router.listen(80);
So this is a dead end. But, hey! Node.js offers some kind of handles, which are sendable. Here's an example from the documentation:
master.js
var server = require('net').createServer();
var child = require('child_process').fork(__dirname + '/child.js');

// Open up the server object and send the handle.
server.listen(1337, function() {
    child.send({ server: true }, server._handle);
});
child.js
process.on('message', function(m, serverHandle) {
    if (serverHandle) {
        var server = require('net').createServer();
        server.listen(serverHandle);
    }
});
Here the child listens directly on the master's server, so there is no domain check in between. So this is a dead end too.
I also thought about Cluster, but this uses the same technology as the handle and therefore has the same limitations.
So... are there any good ideas?
What I currently do is rather hackish. I've made a package called distroy. It binds to port 80 and internally proxies all requests to Unix domain socket paths like /tmp/distroy/http/www.example.com, on which the separate apps listen. This also (kind of) works for HTTPS (see my question on SNI).
The remaining problem is that the original IP address is lost, as it is now always 127.0.0.1. I think I can circumvent this by monkey-patching net.Server so that I can transmit the IP address before opening the connection.
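The socket-path scheme described above can be sketched in a few lines; socketPathFor is an illustrative name, and the /tmp/distroy/http prefix is the one mentioned in the answer:

```javascript
// Map a Host header to the Unix domain socket the target app listens on,
// following the /tmp/distroy/http/<hostname> naming scheme.
function socketPathFor(hostHeader) {
  const host = (hostHeader || '').split(':')[0]; // drop an optional :port
  return '/tmp/distroy/http/' + host;
}

// The router side can then proxy with { socketPath: ... } instead of
// host/port, which node's http.request supports:
// const upstream = require('http').request(
//   { socketPath: socketPathFor(req.headers.host),
//     path: req.url, method: req.method, headers: req.headers },
//   (backendRes) => { /* pipe backendRes to res */ });
```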
If you are interested in a Node.js solution, check out bouncy, a websocket- and https-capable HTTP router/proxy/load balancer written in Node.js.
Define your routes.json like
{
    "beep.example.com" : 8000,
    "boop.example.com" : 8001
}
and then run bouncy using
bouncy routes.json 80
Personally, I'd just have them all listen on dedicated ports, or preferably sockets, and then stick everything behind either a dedicated router script or nginx. It's the simplest approach, IMO.
For Connect middleware there is the vhost extension. Maybe you could copy some of its concepts.
