HAProxy configuration in OpenShift - node.js

I am new to HAProxy as well as OpenShift. Here is the setup I am trying to achieve: serve a blog through Ghost (a Node.js app), serve static website files through a PHP cartridge (I assume this is the best way to serve static HTML/JS on OpenShift), and run the actual application. I would like to route requests to a specific gear based on the URL.
I want to confirm if this is the correct way to set it up. Could you please give some pointers about the HAProxy configuration for this?

I think that rather than doing that in HAProxy, it would be worth either running a separate gear for your static assets, or using Amazon S3 or CloudFront for them.
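That said, if you do want HAProxy itself to route by URL, the usual approach is path-based ACLs. A minimal sketch, with hypothetical backend names and ports:

```
frontend public
    bind *:80
    # hypothetical path-based routing rules
    acl is_blog   path_beg /blog
    acl is_static path_beg /static
    use_backend ghost_gear  if is_blog
    use_backend static_gear if is_static
    default_backend app_gear

backend ghost_gear
    server ghost 127.0.0.1:8081

backend static_gear
    server static 127.0.0.1:8082

backend app_gear
    server app 127.0.0.1:8080
```

The `path_beg` ACL matches on the start of the request path; `use_backend` then picks the gear, falling back to the main app for everything else.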


Elastic Beanstalk Node Static Files are not Loaded

I am having trouble serving my static files on Elastic Beanstalk using NodeJS deployed on Amazon Linux 2. My local environment works, but my deployment is unable to serve the static files located in a top-level static folder called 'public'.
My configuration is as follows:
option_settings:
  aws:elasticbeanstalk:environment:proxy:staticfiles:
    /images: public/images
    /javascripts: public/javascripts
    /stylesheets: public/stylesheets
I am certain that the configuration is processed correctly because I can view the results of the static file configuration within the AWS UI. When I navigate to the home directory of my site (using the http:// protocol), the HTML page loads, but the CSS and JS under the public directory do not. The error I get is as follows:
GET https://<domain name>/stylesheets/layout.css net::ERR_CONNECTION_TIMED_OUT
Note that the https:// protocol is used. From my understanding, the reason my local environment works is that my application serves the static files with the correct protocol. Here are my questions:
Why are my static files being served with protocol https:// when I request my home directory using http://?
I don't want to serve my static files through the application to reduce the number of requests to my application, noted here: https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/command-options-general.html#command-options-general-environmentproxystaticfiles. Is there anything actually wrong with the configuration?
The issue was resolved. I am using Helmet JS for Content Security Policy (CSP), and it has a directive that converts insecure requests to secure ones: upgrade-insecure-requests. Make sure to remove it during development for a site that relies on http:// for content. Best practice is still to use https:// when possible.
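For reference, a sketch of the Helmet options that address this (assuming Helmet v4+, where setting a default CSP directive to null removes it; `cspOptions` is a made-up name):

```javascript
// Keep Helmet's default CSP directives, but drop upgrade-insecure-requests
// while developing over plain http:// (otherwise the browser rewrites
// http:// asset URLs to https:// and the requests time out).
const cspOptions = {
  useDefaults: true,
  directives: {
    // null removes this directive from the defaults
    upgradeInsecureRequests: null,
  },
};

// In the Express app:
// app.use(helmet.contentSecurityPolicy(cspOptions));
```

Remember to re-enable the directive (or simply drop this override) once the site is served over https://.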

Do I need to make any changes in my node.js code to deploy it over the web?

I've created a website using HTML, CSS, Sass, JS, Node, Express & EJS. Do I need to make any changes in the code to deploy it over the web?
P.S. I've set the port to the process.env value and added the CSS & JS files to a public folder. Are there any other changes I need to make?
I am not sure exactly what you are asking.
You basically need to make the app publicly available.
You could either use a hosting provider or, in theory, set up your own server using Apache, nginx, etc., and then forward traffic from your home router to your machine (laptop).
In theory, your server will run the same way it does locally.
You just need something to route the traffic in and out.

How to integrate Strapi API and Admin Panel with another node app (that consumes Strapi API)?

I'm trying to develop an app that uses the Strapi Admin Panel for API generation and, at the same time, serves as a website that consumes this API.
So, basically, I'm trying to build a website:
where the /api route serves as a Strapi API endpoint
where /admin route serves as a Strapi Admin Panel for API creation
where all the other routes are configured to serve my website, i.e.:
/ route is the landing page of my website
/contacts is the contacts page
etc.
Moreover, the static files of the website (HTML/CSS/etc.) should be served from the same server that consumes the generated API (server-side).
I hope I'm not expressing myself too vaguely.
Essentially, I need to integrate one node.js app (my website) with another node.js app (Strapi) in such a way that they work seamlessly together as one.
Does anybody know how to achieve that?
I've read some answers here on Stack Overflow and in Strapi GitHub issues, and some people said the way to achieve this is to run two separate apps on different ports, but that doesn't feel right to me (or maybe I just don't understand some basic stuff).
What I need is to make a single app that is, basically, a simple multi-page website, but enhanced with api generation tools from Strapi.
I have my Strapi app up and running and I thought maybe I should find some place in the app folder structure to put my website (i.e. all the static stuff to the public folder), but where to put the server-side stuff?
And I'll need to use a templating engine, so the question of "where to put the client-side files" arises again. The more I dig into the code, the more I get confused.
PS: I'm fine using Koa which is used as a server for Strapi.
PPS: Further, I'm planning to deploy the app to Heroku on a single Dyno (if it is important).
Okay, I just played with the routing prefix, and that is the solution I suggest.
You will have to build your website app and push the build into the ./public folder of the Strapi application.
Then, in your api/.../config/routes.json files, add a prefix option to the config key of each of your routes, for all your APIs:
{
  "routes": [
    {
      "method": "POST",
      "path": "/restaurants",
      "handler": "Restaurant.create",
      "config": {
        "policies": [],
        "prefix": "/api"
      }
    }
  ]
}
So in the end you will have your admin panel on /admin,
your API endpoints prefixed with /api,
and your website/assets on /.
I found myself in a similar situation. In my situation I wanted to deploy Strapi along with a static site (in my case built with Gatsby) just in one server instance, at least to try if possible.
There are some open questions left from the original post, so let me answer them based on my context:
First of all, yes, it's possible, but each app has to run on its own port; that way your server knows which one to serve based on the request. Trying to mix them onto a single port will badly break both apps.
In my situation, what I ended up doing was running Strapi on port 1337 and serving my static page on port 80. To achieve that, I used nginx to serve content and act as a proxy. I first tried to have them on the same domain but got a lot of conflicts, so I strongly suggest using subdomains. I ended up like this:
domain.com serving my static page
api.domain.com serving Strapi
You can achieve that having your configuration file in NGINX like:
server {
    server_name domain.com www.domain.com;
    root /var/www/domain.com/html;
}

server {
    server_name api.domain.com;

    location / {
        proxy_pass http://127.0.0.1:1337;
        proxy_http_version 1.1;
    }
}
As you can see, there is a lot of configuration involved, so I would not recommend trying something like this on Heroku, since you really don't have much control over the Dyno (ports, routing, etc.). If you want to go with Heroku, you should have a separate Dyno for each app: one for Strapi and a separate one for the other app.
I would strongly suggest something like DigitalOcean, or any other provider where you have total control over your server (in DigitalOcean such a server is called a Droplet). I was able to install an instance of Strapi and serve my static site built in Gatsby on the same Droplet.

Is there a proxy webserver that routes dynamically requests based on URLs?

I am looking for a way to dynamically route requests through a proxy webserver. I will explain exactly what I need and what I have found so far.
I would like to have some lightweight webserver (I am thinking of node.js or nginx) set up as a proxy webserver with a public IP. It would route requests to different local webservers based on URLs, not only on the hostname but on the full URL.
My idea is that this proxying webserver would use either a local in-memory cache, memcached, or Redis to look up key-value mappings from URL to local webserver.
I have found these projects:
https://github.com/nodejitsu/node-http-proxy
https://www.steve.org.uk/Software/node-reverse-proxy/
https://github.com/hipache/hipache
They all seem to do similar things, but not exactly what I am looking for, that is:
URL based proxying (absolute URLs routing to different local webservers)
use of memory based configuration storage / cache
dynamically change configuration using API without reloading proxy webserver
Is there any better-suited project, or is there a way to configure one of the three projects above to fit my requirements?
Thank you for your time and effort in advance.
I think this does exactly what you want: https://openresty.org/en/dynamic-routing-based-on-redis.html
It's basically nginx with precompiled modules. You can set up the same thing yourself with nginx + the Lua module + Redis (+ the necessary Lua rocks, of course). OpenResty just makes it easier.

Authentication across node.js and nginx

As most of my content is static, I was planning to have nginx handle serving the static files. But the static content is also private: different users have different content.
The application itself is written in node.js/express.js.
I was wondering how I should handle authentication/authorization. Is there anything for this, such as an nginx module?
Something like node.js putting a token into memcached, which nginx then looks up on each request?
Yes, there is such a feature; check out the following, more detailed articles:
http://wiki.nginx.org/XSendfile
http://kovyrin.net/2006/11/01/nginx-x-accel-redirect-php-rails/
All you have to do is have Node.js send the path of the file to nginx by setting the "X-Accel-Redirect" header to the location of that file.
