How to serve static assets on an OpenShift NodeJS app - node.js

I am working on a NodeJS app hosted on OpenShift. Everything works fine right now, but I want to speed things up by serving static files (HTML, CSS, JS) from a web server rather than from Express. I've read somewhere that Node cartridges do not have an Apache server running, and thus there is no .htaccess file through which I can configure Apache to serve my files.
How can I serve the static files of my NodeJS app on OpenShift from a web server like Apache or Nginx?

This may suit your needs: a static server about as bare as it gets...
var finalhandler = require('finalhandler');
var http = require('http');
var serveStatic = require('serve-static');

// Serve up the public/ folder
var serve = serveStatic('public/', { index: ['index.html'] });

// Create the server
var server = http.createServer(function (req, res) {
  var done = finalhandler(req, res);
  serve(req, res, done);
});

// Listen
server.listen(process.env.PORT || 3000);
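One OpenShift-specific note: on the v2 Node cartridge, the bind address and port are exposed through OPENSHIFT_NODEJS_IP and OPENSHIFT_NODEJS_PORT rather than PORT, so (a sketch, assuming the standard v2 environment variable names) the listen call would look more like this:
// Bind to the cartridge-provided address/port on OpenShift;
// fall back to localhost:3000 for local development.
var port = process.env.OPENSHIFT_NODEJS_PORT || 3000;
var ip = process.env.OPENSHIFT_NODEJS_IP || '127.0.0.1';
server.listen(port, ip);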

It's been a while since the original question was posted, but maybe this will help other people facing the same issue. Have a look at this custom OpenShift cartridge here: https://github.com/gsterjov/openshift-nginx-cartridge
I haven't tested it personally, but I've built other custom cartridges and I know the OpenShift platform is quite flexible if you're proficient enough with the shell, so if the above cartridge doesn't suit your needs you can easily fork it and tweak it as you see fit.
Personally, I almost always serve static assets from Node.js. The built-in static server in Express.js has gotten much better lately, and there's also st if you need more control over caching/ETags.
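For example, here is a minimal sketch of the kind of control you get from the built-in middleware these days (the public folder is a placeholder; maxAge and etag are standard express.static options):
var express = require('express');
var app = express();

// Cache static assets for a day and let Express generate ETags.
app.use(express.static(__dirname + '/public', {
  maxAge: '1d',
  etag: true
}));

app.listen(process.env.PORT || 3000);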
Also, I recently came across this interesting CDN-like alternative to "classic" hosting for static assets: http://surge.sh. I imagine it would be fairly trivial to set up a gulp/grunt task that publishes your static assets to surge on deployment...

Related

Run backend directly in Angular ng serve

I'm currently developing using ng serve with a proxy configuration, where the proxy points to another Node.js instance on the same machine.
This backend is a simple express server, like this (simplified):
var express = require('express');
var app = express();
var customers = require('./customers.controller.js');

app.get('/api/customers', customers.getAll);

var server = app.listen(8081);
The frontend (ng serve) runs on port 4200 and proxies /api to http://localhost:8081/api.
As far as I can see, this is the recommended setup.
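For context, that setup is typically wired up with a proxy config file passed to ng serve; a minimal sketch as a JS config (assuming the backend above on port 8081), used via ng serve --proxy-config proxy.conf.js:
// proxy.conf.js — forwards /api calls from the dev server to the backend
const PROXY_CONFIG = {
  '/api': {
    target: 'http://localhost:8081',
    secure: false
  }
};

module.exports = PROXY_CONFIG;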
However, I would prefer to have the backend running directly inside the ng serve instance instead of behind a proxy.
And if possible, even take advantage of ng's automatic reload feature, so that I don't have to restart the server when I change something in the backend code.
As both are Node.js and ng seems to be configurable, I think this is possible, but I can't find a starting point for defining my own routes.
It's possible to do this.
You just need to serve your Angular app from the backend, using Node.js routing.
Basically, a built Angular app is just static files, and the entry point is index.html.
const path = require('path');

// Redirect all other requests to the Angular entry point
app.get('*', (req, res) => {
  res.sendFile(path.resolve('dist/index.html'));
});
But remember that you also need to handle routing for images, JS, CSS, and other static assets, as shown in the sketch below.
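A minimal sketch of the full picture, assuming the Angular build output lives in dist/: register express.static before the catch-all, so asset requests are answered first and only unmatched routes fall through to index.html:
var express = require('express');
var path = require('path');
var app = express();

// Serve built Angular assets (JS, CSS, images) directly...
app.use(express.static(path.join(__dirname, 'dist')));

// API routes would go here, before the catch-all.

// ...and fall back to index.html for everything else,
// so Angular's client-side router can take over.
app.get('*', function (req, res) {
  res.sendFile(path.resolve(__dirname, 'dist/index.html'));
});

app.listen(8081);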

Should I use express static dirname or use Node.js as a remote server?

My Node.js folder hierarchy looks like the next image:
[screenshot: my folder hierarchy]
app.js is the main Node.js file, routes contains the Node.js route files, and src contains the client's public HTML files.
This is the code in app.js:
var express = require('express');
var app = express();
var server = require('http').createServer(app);
global.io = require('socket.io').listen(server);
var compression = require('compression');
var helmet = require('helmet');
var session = require('express-session');
var bodyParser = require('body-parser');
app.use(bodyParser.json()); // support json encoded bodies
app.use(bodyParser.urlencoded({ extended: true })); // support encoded bodies
app.use(express.static(__dirname + '/src'));
app.use(helmet());
app.use(compression());
app.use('/rides', require('./routes/ridesServer'));
app.use('/user', require('./routes/userServer'));
app.use('/offers', require('./routes/offersServer'));
app.use('/notifications', require('./routes/notificationsServer'));
server.listen("8080", function() {
console.log("Connected to db and listening on port 8080");
});
This is another API, in the routes/userServer.js file:
router.post('/verifytoken', function (req, res, next) {
  // some functions here
});
And this is another HTTP request I am making from the client side, in the page ride.js:
$.ajax({
  method: "POST",
  headers: {
    "Content-Type": "application/json"
  },
  url: "user/verifytoken",
  data: JSON.stringify(something),
  success: function (response) {
    // some code
  },
  error: function (error) {
    // some code
  }
});
As you can see, the client files and the Node.js server files are on the same server, and Node.js serves those static files via this command:
app.use(express.static(__dirname + '/src'));
I think this should be avoided and that there is a better way!
If you are a Node.js expert and familiar with best practices, please tell me whether the following method of working is correct, and if it is not, please correct me:
I thought about putting the static files in a public_html directory,
and the Node.js files in a server directory underneath public_html.
Then run pm2 start app.js --watch (or node app.js) on the app.js located in the server directory, not in public_html.
As a result, the index.html file will be served just like any other static file, without any relation to the Node.js server, and Node.js will live in its own folder, not dealing with the client side at all.
In other words: separate Node.js and the static files, putting the Node.js files in a subdirectory rather than the main directory.
Then the HTTP request will look like this:
$.ajax({
  method: "POST",
  headers: {
    "Content-Type": "application/json"
  },
  url: "server/user/verifytoken",
  data: JSON.stringify(something),
  success: function (response) {
    // some code
  },
  error: function (error) {
    // some code
  }
});
Please note that I have added the server directory to the URL.
Furthermore, I can exchange the
url: "server/user/verifytoken",
for the IP of a remote app (like Ionic):
url: "123.123.123.123:443/server/user/verifytoken",
And then my HTTP requests will be served via HTTPS (because I am sending to port 443), I can create multiple apps on the same server, and I have no struggles with Express static folders.
What do you think?
Thanks!
First let me say I'm not an expert, but I do have 3 years of continuous development of Node.js based solutions.
In the past I have created solutions mixing client side code and server side code in the same project, and it has worked, at least for a while. But in the long run it is a bad idea, for many possible reasons. Some of them are:
Client side code and server side code may require different build processes to produce working code. For example, client side code may require transpiling from ES6 to the more widely compatible ES5 using something like gulp or webpack. This is normally not the case for server side code, because the runtime is more targeted.
Mixing client side code and an API server may prevent you from horizontally scaling one of them without the other.
This is like a monorepo, and having a monorepo without a CI process tailored to this scenario may produce very long development times.
What we currently do at my work is as follows:
Create a separate API server project. This way you can concentrate on developing a good API while working on this specific project. Leave cross-cutting concerns (like authentication) outside the API server.
Create a separate project for your client side code (an SPA, maybe). Set up your dev environment to proxy API requests to a running API server (which may be running locally).
Create a separate project for the deployment of the entire solution. This project puts together the serving of the client code, proxying requests to the API, and implementing cross-cutting concerns like authentication; see the sketch after this list.
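As an illustration of that third project, a minimal sketch of a front-end server that serves the built client and proxies API calls (the paths, ports, and API_URL variable are assumptions, not a prescribed setup):
var express = require('express');
var httpProxy = require('http-proxy');

var app = express();
var proxy = httpProxy.createProxyServer();

// Serve the built client code...
app.use(express.static(__dirname + '/client/dist'));

// ...and forward API calls to the separately deployed API server.
app.all('/api/*', function (req, res) {
  proxy.web(req, res, { target: process.env.API_URL || 'http://localhost:5000' });
});

app.listen(process.env.PORT || 3000);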
Having your code separated in this way makes it easy to develop each piece and lets the solution evolve quickly. But it may introduce some complexities:
This multi-project structure requires you to be able to trigger testing of the whole product each time one of the projects changes.
It surfaces the need for integration testing.
Some other considerations are:
The API server and the website server may run on the same machine but on different ports.
You may secure your API server using SSL (on Node, using the standard https module), but notice that in all cases you need another actor in front of the API server (a website proxying requests to the actual API server, or an API gateway that implements cross-cutting concerns like authentication, rate limiting, etc.). In the past I posed the same question you are asking regarding the appropriateness of using SSL in this scenario, and the answer is here. My answer is: it depends on the deployment conditions.
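If you do terminate SSL in Node itself, here is a minimal sketch using the standard https module (the certificate paths are placeholders):
var https = require('https');
var fs = require('fs');
var express = require('express');

var app = express();

// Load the certificate and private key (example paths).
var options = {
  key: fs.readFileSync('/path/to/privkey.pem'),
  cert: fs.readFileSync('/path/to/cert.pem')
};

https.createServer(options, app).listen(443);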

Running Keystone.js app over https on Heroku

I have a web app built on the Keystone.js CMS for Node.js that I will be deploying with a custom domain on Heroku. I want the whole app to run over HTTPS by default and not allow any HTTP connections. I've looked around quite a bit and can't seem to find a definitive answer as to the best way to go about this. Typically, e.g. for a Rails app, I would just buy a Heroku SSL add-on certificate for my custom domain(s), point my DNS at the Heroku-provisioned SSL endpoint, and configure the app to default all connections to HTTPS.
For a Node instance (and specifically a Keystone.js instance), I'm a little unclear. Can I just go through the same process as above: buy an SSL add-on and point my DNS at the Heroku SSL endpoint? Do I need to do anything in the base Node code to support this? And how do I enforce HTTPS and not allow HTTP?
I'm new to Node and Keystone, so any help would be greatly appreciated!
Use express-sslify.
I put it in my routes/index.js, since the function exported from there receives a reference to the Express application.
All you need to do is tell Express to use sslify, but you probably don't want it enabled during development.
Since July, Heroku defaults NODE_ENV to production, so you can do:
// Setup Route Bindings
exports = module.exports = function (app) {
  if (process.env.NODE_ENV === 'production') {
    var enforce = require('express-sslify');
    app.use(enforce.HTTPS({ trustProtoHeader: true }));
  }

  // Declare your views
};
That will send a 301 to anyone trying to access your app over plain HTTP.

How to host multiple node.js apps on the same subdomain with Heroku?

We have a base domain of http://api.mysite.com. This should serve as the front door for all our APIs. Let's say we have two different APIs accessed with the following url structure:
http://api.mysite.com/launchrockets
http://api.mysite.com/planttrees
These are totally different. With regard to running this on Heroku, it seems we have two options.
1) Put everything inside one application on Heroku. This feels wrong (very wrong) and could lead to a higher chance of changes in one API inadvertently breaking the other.
2) Have three different Heroku apps. The first acts as a proxy (http://mysite-api-proxy.herokuapp.com) that looks at the incoming request and forwards it to http://planttrees.herokuapp.com or http://launchrockets.herokuapp.com using a module like bouncy or http-proxy.
I'm leaning towards option 2, but I am concerned about managing the load on the proxy app. For web frameworks with a synchronous architecture this approach would be disastrous, but with node.js being asynchronous and able to use the cluster module, I think it may scale okay.
I've seen similar questions asked before, but most related to synchronous frameworks, where option 2 would definitely be a poor choice. This question is specific to Node and how it would perform.
Thoughts on the best way to architect this?
I implemented a simple demo project to achieve a multi-app structure.
https://github.com/hitokun-s/node-express-multiapp-demo
With this structure, you can easily set up and maintain each app independently.
I hope this helps.
Here is a blog post I wrote trying to answer this question. There are many options, but you have to decide what is right for your app and architecture:
http://www.tehnrd.com/host-multiple-node-js-apps-on-the-same-subdomain-with-heroku/
Similar to @TehNrd's approach, I'm using a proxy. However, this approach doesn't require multiple Heroku apps, just one:
On your web app:
var express = require('express')
  , url = require('url')
  , api_app = require('../api/server') // this is your other app's index.js or server.js
  , app = express()
  , httpProxy = require('http-proxy')
  , apiport = parseInt(process.env.PORT) + 100 || 5100 // NaN || 5100 === 5100 locally, PORT+100 on Heroku
  ;

// create the proxy once, outside the request handler
var api_proxy = httpProxy.createProxyServer();

// pass all api requests through the proxy
app.all('/api*', function (req, res) {
  api_proxy.web(req, res, {
    target: 'http://localhost:' + apiport
  });
});
On your API server:
var express = require('express');
var app = express();
var port = parseInt(process.env.PORT)+100 || 5100;
...
...
app.listen(port);

How to enable static files (and less support) when hosting a nodejs app within IIS using IISNode

Background:
A Node.js app using Express,
hosted on IIS using iisnode.
The Node.js app is in a virtual directory called /myVirtualDirectory.
Problem:
You want to serve static files, or CSS compiled with LESS, but the URL passed to Node.js is the full URL including the virtual directory, so it doesn't match what a standalone Node.js app would expect.
Solution:
var express = require('express');
var app = express();
var lessMiddleware = require('less-middleware');

app.use('/myVirtualDirectory', lessMiddleware({
  src: __dirname + '/public',
  compress: true
}));

app.use('/myVirtualDirectory', express.static(__dirname + '/public'));
Note that where we specify the middleware, we pass in the URL prefix it should respond to. As long as this matches the name of the virtual directory, your files will be served up as expected.
One of the benefits of hosting node.js apps in IIS using iisnode is that you can rely on the static file handler in IIS to serve your static files. The benefit is a substantial improvement in performance, since requests for static content are served by native code without ever invoking JavaScript.
To set up a node.js application hosted in IIS using iisnode to serve static files using IIS static file handler, use the URL rewriting module as described in http://tomasz.janczuk.org/2012/05/yaml-configuration-support-in-iisnode.html
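For reference, a sketch of the kind of web.config rewrite rules that post describes (the public folder and the app.js entry point are assumptions; adapt them to your layout):
<configuration>
  <system.webServer>
    <handlers>
      <!-- Let iisnode handle the Node entry point -->
      <add name="iisnode" path="app.js" verb="*" modules="iisnode" />
    </handlers>
    <rewrite>
      <rules>
        <!-- Serve static content with the native IIS handler -->
        <rule name="StaticContent">
          <action type="Rewrite" url="public{REQUEST_URI}" />
        </rule>
        <!-- Route everything that isn't a file on disk to Node -->
        <rule name="DynamicContent">
          <conditions>
            <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="True" />
          </conditions>
          <action type="Rewrite" url="app.js" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>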
To understand the performance benefits of using static file handler instead of node.js modules to serve static files, read http://tomasz.janczuk.org/2012/06/performance-of-hosting-nodejs.html.
