Serving static HTML from CouchDB - couchdb

I've got a CouchDB server up and running serving basic API requests. Overall it works well: users can GET/POST/PUT etc. against the host 'api.example.com'. The only issue is that if a user does a GET request for '/', they get the default {"couchdb":"Welcome","version":"1.0.2"} response.
Is there any way to serve a single static HTML page, or even an HTTP redirect, for the root? That way I could redirect users to the API documentation.
I'm vaguely familiar with Couchapp but it seems like overkill for such a simple task.
Thanks!

You probably want a vhost/rewrite rule. They are pretty easy. Basically you tell CouchDB that queries to "www.example.com" should go directly to the rewriter. The rewriter will serve anything that you specify in the design document.
Jan Lehnardt wrote up good instructions in the Couchbase blog post, Nice URLs.
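For example, here is a rough sketch for CouchDB 1.x (the database name mydb and the design doc name _design/site are placeholders, not from the original answer). Add a vhost entry in local.ini that sends the domain to a design document's rewriter, then give that design document a rewrite rule mapping the root path to an index.html attachment you upload to it:
[vhosts]
api.example.com = /mydb/_design/site/_rewrite
{
  "_id": "_design/site",
  "rewrites": [
    { "from": "/", "to": "index.html" },
    { "from": "/*", "to": "../../../*" }
  ]
}
The first rule serves the static page for '/', and the catch-all second rule is one way (roughly) to pass every other path back through to the server so the existing API URLs keep resolving.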

Related

Connect from WordPress webpage to a Node.js backend server

Being new to WordPress, I've been doing some research and yet I don't seem to be able to pinpoint a solution for my need.
In short, I would like to allow a WordPress page to access a Node.js backend; the goal is ultimately to get access to MongoDB via Node.js, retrieve some data, and return a dynamically generated webpage to the website.
I was checking the WordPress REST API, but all it seems to do is frontend handling of a WordPress website: creating and editing posts, etc.
Unless there's a better way of doing it, I was thinking I might just send a GET/POST request from the WP page (for example, with a form's action), use Express.js to listen for that request, do the whole workflow in Node.js, then maybe use some npm WordPress API (like this one) to create a WordPress client and add a page or post with the content extracted from the DB.
I would appreciate some guidance, if any, as to how could one connect from WordPress to a Node.js backend.
Thanks a bunch!
There are a lot of ways to do it.
If you only need Node for a particular page then you can use your web server (NGINX/Apache) to reverse proxy a particular path to the Node server.
If you had to, you could always use an HTML iframe as well, but for some reason I feel like that's bad advice.
The method you described would work too. I was considering using GET/POST requests with Express running on a different port for a project I'm working on that uses WordPress. I decided to go with the solution linked below.
This is probably the method you're looking for based on your description. Skip to solution three if you have to use WordPress.
Node JS Reverse Proxy (with Apache)
You can find how to do it with NGINX with a quick search.
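As a rough sketch of the NGINX variant (the /node/ path prefix and port 3000 are illustrative assumptions, not from the linked article), a reverse-proxy location block looks something like:
location /node/ {
    # strip the /node/ prefix and hand the request to the Express app
    proxy_pass http://127.0.0.1:3000/;
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
}
Requests to /node/whatever then reach the Express routes, while the rest of the site stays on WordPress.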

Is it possible to use the same node.js server for two/three different domains (aliases)?

Is it possible to use the same nodeJS server for two/three different domains (aliases)? (I don't want to redirect my users. I want them to see the exact URL they typed in the address bar. However, all three domains are exactly the same!)
I want my users to be logged in on all three domains at the same time, in order to avoid any confusion.
What is the simplest way to do this and avoid cross-domain issues?
Thanks!
If you mean that all domains will serve the same Node.js app, then yes, you can do that.
But if each domain should serve a different application, then you need a reverse proxy running on the server to handle and manage the sites/vhosts.
You can install nginx and use it as a reverse proxy server, or look at http-proxy, a proxying library for Node.js.
If you would like to manage the vhosts inside your app, you can use the vhost middleware for Node.js.
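As a rough sketch of the http-proxy route, for the case where each domain maps to a different backend process (the domains and ports below are made up for illustration):
// npm install http-proxy -- each site runs as its own node process on its own port
var http = require('http');
var httpProxy = require('http-proxy');
var proxy = httpProxy.createProxyServer({});
// map incoming Host headers to backend node processes
var backends = {
  'example.com': 'http://127.0.0.1:8001',
  'example.net': 'http://127.0.0.1:8002'
};
http.createServer(function (req, res) {
  var target = backends[req.headers.host];
  if (target) {
    proxy.web(req, res, { target: target });
  } else {
    res.statusCode = 404;
    res.end('Unknown host');
  }
}).listen(80);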
Choose one of:
Use some other server (like nginx) as a reverse proxy.
Use node-http-proxy as a reverse proxy.
Use the vhost middleware if each domain can be served from the same Connect/Express codebase and node.js instance.
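For the last option, a minimal sketch using Express with the vhost middleware (in current Express versions it is the separate vhost package on npm; the domains and responses are made up for illustration):
var express = require('express');
var vhost = require('vhost');   // npm install express vhost
// one Express app per domain, all served from the same node process and port
var siteA = express();
siteA.get('/', function (req, res) { res.send('hello from example.com'); });
var siteB = express();
siteB.get('/', function (req, res) { res.send('hello from example.org'); });
var app = express();
app.use(vhost('example.com', siteA));
app.use(vhost('example.org', siteB));
app.listen(80);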
This is a very broad question. Moreover, it is generally a pretty bad idea, SEO-wise, to have multiple independent domains that each serve the same content.
Logging in is generally done either through cookies or through extra parameters in the URL. Cookies are always domain-specific, for obvious security reasons. If you want to ensure folks are logged in to all the domains at once, you can create an internal, purpose-built domain to handle authentication (it never shows in the URL bar and is effectively only used for HTTP redirects); that domain stores the login state for all the rest, and the other domains pick the login state up from it through HTTP redirects.
In general, however, this sounds like too much trouble. Consider that some users may specifically want to use different domains for different accounts, so you'll effectively break their usage if you mandate that a single login be used for all of them. And, back to the original point, doing this is pretty bad for SEO, so just don't do it.

Node.js forward proxy to serve a web page

I am trying to set up a forward proxy to serve web pages in Node.js (using the request package).
I can get the web page to be served up; however, my problem is with the assets that the webpage tries to reference: they are (of course) all relatively pathed (e.g. images/google.png).
my code is thus:
...
// proxy the requested page through this server using the request package
app.get('/subdomain/proxy/:webpage', function(req, resp) {
  req.pipe(request('http://' + req.params.webpage)).pipe(resp);
});
...
and the response I get, given proxy.mywebsite.com/www.google.com, looks like this (Google inline-styles its CSS):
So, the question is:
How do I load in resources that are relatively pathed? Is my approach here regarding a forward proxy even correct?
My only solution so far is to scrape all relative paths and rewrite the HTML to use absolute references instead, which sounds horrific (and doesn't account for cases where external .js scripts also reference things relatively).
It must be possible as there are websites like 'hidemyass' which achieve the same thing.
This is all extremely new to me, but it seems like I'm asking for something quite simple and I'm quite surprised I've not been able to find a solution yet.

Host multiple site with node.js

I'm currently learning node.js and loving it. I'm noticing, however, that it seems it's really only fit for one site. So it's great for hosting mydomain.com, but what if I want to build an actual full web server with it? In other words, I would like to host mydomain.com, example.com, yourdomain.com and so on. What solutions (modules) are available for this? I was thinking of simply parsing the URL from the request object and reading from the appropriate directory. For example, if I get a request for example.com then read from the example_com directory, or if I get a request for mydomain.com read from the mydomain_com directory. The issue here is I don't know how this would affect performance and scalability.
I've looked into Multi-node but I don't fully follow the idea of processes yet (I'm a node beginner).
Any suggestions are welcome.
You can do this a few different ways. One way is to write it directly into your web application by checking which domain the request was made to and then routing within your application, but unless your application is very basic this can make it fairly bloated and messy. A good time to do something like this might be if you're writing a blogging platform where everything is pretty much the same across all your domains; the key difference might be how you query your data to display the right data.
In this case you'd probably use the request to see which blog is being accessed.
If you want to just host a few different domains on the same server, all using port 80 (like most websites do), you will want to proxy each request off to a different process. You can do this with nginx or even with node itself. It all comes down to what best fits your needs. bouncy is a quick way to get set up doing this, as it's a nodejs module and has some pretty impressive benchmarks. nginx (proxying with nginx) is probably the most widely used method though, as a lot of nodejs servers use nginx to serve static content anyway.
http://blog.noort.be/2011/03/07/node-js-on-nginx.html
https://github.com/substack/bouncy/
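For example, a rough sketch of the bouncy approach (domains and backend ports are made up; each site runs as its own node process on its own port):
var bouncy = require('bouncy');   // npm install bouncy
var server = bouncy(function (req, res, bounce) {
  // route on the Host header and hand the connection to the right backend
  if (req.headers.host === 'mydomain.com') {
    bounce(8001);   // node process serving mydomain.com
  } else if (req.headers.host === 'example.com') {
    bounce(8002);   // node process serving example.com
  } else {
    res.statusCode = 404;
    res.end('no such host');
  }
});
server.listen(80);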
You can use connect's vhost middleware (which is also available in express) to dispatch requests to separate request handlers based on the Host: header. This assumes that everything is being handled by the same node process on the same port; if you really need separate processes, then the suggestion about using nginx as a reverse proxy is probably the way to go.

Using IIS as secure reverse proxy in front of less secure HTTP server?

I have a CppCMS based application and I can't use IIS's FastCGI connector, as it is broken for my use case, so I want to try putting its internal HTTP server (designed for debug purposes) behind IIS.
It is quite a simple web server for an application: it handles basic HTTP/1.0 requests and does not care too much about security issues like DoS, file serving, and more.
So I'd like to know whether it is possible to use IIS in front of such an application so that it would:
Sanitize all requests - ensure that they are proper HTTP
Handle all DoS issues like timeouts
Serve the static files.
Is this something that can be configured and done at all?
I would suggest this is the wrong way of doing this. I would use a web server like Nginx to proxy the requests through to the backend server. It is very configurable, and you will find a lot of articles about doing it with Apache.
We just did something like this. You want the URL Rewriter module. You can use it to sanitize the URLs; however, it isn't going to sanitize the payload. Which is to say, you can make sure that the URLs that hit your box are very specific ones, e.g. not attempts to hit CGI, but you can't use it to make sure that the contents of an upload are safe.
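As a rough sketch of that setup (not from the original answer; it assumes the URL Rewrite module plus Application Request Routing are installed with ARR's proxy mode enabled, and the backend port 8080 is a placeholder), a reverse-proxy rule in web.config looks something like:
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- forward incoming requests to the internal HTTP server on localhost -->
        <rule name="ReverseProxyToBackend" stopProcessing="true">
          <match url="(.*)" />
          <action type="Rewrite" url="http://localhost:8080/{R:1}" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
IIS then fronts the requests, and static files can be served by IIS itself by excluding those paths from the rewrite; the payload-filtering caveat above still applies.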
ModSecurity is out for IIS now; it can handle a lot of the security-related issues.
