Does pouchdb-authentication use HTTPS for exchange and replication? - security

According to the documentation of pouchdb-authentication, all the operations are done over HTTP.
var db = new PouchDB('http://mysite:5984/mydb', {skipSetup: true});
db.login('batman', 'brucewayne').then(function (batman) {
  console.log("I'm Batman.");
  return db.logout();
});
Does it use HTTPS under the hood, or are the username and password really going over the wire in the clear, readable by anyone?

You can certainly set up your server with an SSL certificate and use PouchDB over HTTPS.
When the documentation says that PouchDB uses HTTP, it is referring to the protocol being used; this doesn't preclude the use of HTTPS.
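For example, a minimal sketch, assuming your CouchDB instance is reachable over TLS on port 6984 with a valid certificate (both the hostname and the port are placeholders), would simply swap the scheme in the connection URL:
// Same calls as above, but against an HTTPS endpoint so the credentials
// are encrypted in transit (host and port are placeholders).
var db = new PouchDB('https://mysite:6984/mydb', {skipSetup: true});
db.login('batman', 'brucewayne').then(function (batman) {
  console.log("Logged in over TLS.");
  return db.logout();
});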

Related

How can I use wss in Node.js, if the HTTPS server is a separate Node.js application?

I've created a multiplayer game with WebSockets in Node.js (using the ws lib), which works just fine. For debugging, I connected to the WebSocket server from my client webpage by just opening the HTML file via the file:// protocol.
I wanted to have the page hosted on my web server, which uses HTTPS. This web server also uses Node.js, but because the webpage is served via HTTPS, it cannot create a connection via ws and needs wss (security downgrading and so on).
My problem is that I've got two separate programs: the HTTPS web server and the WebSocket "game" server.
When I try connecting to the ws server I get:
Uncaught DOMException: The operation is insecure.
I only found instructions on how to set up wss by creating an HTTPS server, but I already have one.
Do I need to combine the two programs?
Could I maybe just serve the single page for the game over HTTP?
Is there some other technology which doesn't have these security restrictions? (I don't care about encryption for the WebSockets.)
I was able to make it work:
Instead of using the HTTPS server from the web server, I "upgraded" the HTTP server of the game server to HTTPS. I didn't want to create another HTTPS server because I thought it would cause errors, but I had already created an HTTP server indirectly anyway, with new WebSocketServer.Server({ port: PORT });.
To create the HTTPS server and use it for the WebSocketServer I used this code:
const fs = require("fs");
const WebSocketServer = require("ws");
// Read the certificate and the matching private key for the domain
let cert = fs.readFileSync(pathToCert, "utf8");
let key = fs.readFileSync(pathToPrivateKey, "utf8");
let options = { key: key, cert: cert };
// Create the HTTPS server and hand it to the WebSocket server
let server = require("https").createServer(options);
const wss = new WebSocketServer.Server({ server: server });
After that I could listen on any port with server.listen(PORT, callback).
I also wasn't sure how or if I could get the certificate I obtained with greenlock, but I found it under greenlock.d/live/[yourdomain]/,
with greenlock.d being the configDir specified in greenlock.init(options).
On the client I need to connect like this:
ws = new WebSocket("wss://mydomain:" + PORT);
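To round this out, here is a minimal sketch of how the pieces above could be wired together; the echo handler is only an illustrative assumption, not part of the original game logic:
// Start the HTTPS server that the WebSocket server is attached to
server.listen(PORT, function () {
  console.log("wss endpoint listening on port " + PORT);
});

// Handle incoming WebSocket connections (echo is just a placeholder)
wss.on("connection", function (socket) {
  socket.on("message", function (message) {
    socket.send(message);
  });
});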

Connecting to a Foxx app as a TCP bridge?

I'm currently setting up my Foxx app as a GraphQL API endpoint and I need to connect to it from the browser and from the Node backend. There is an arango.client npm package that I'm thinking of installing for my backend, but it seems that it only supports HTTP. Wouldn't it be better to create a TCP connection bridge once (in Node) and communicate over it with lower latency and less overhead? I know that ArangoDB supports TCP, but why is it not implemented in arango.client?
Currently ArangoDB only implements HTTP as its transport.
A source of irritation may be that the arangosh command-line parameter for the server connection looks like this:
--server.endpoint tcp://127.0.0.1:8529
But the protocol spoken there is HTTP.
One way to bypass the TCP stack could be to use Unix domain sockets for the HTTP communication. You can use raw routes to communicate with your Foxx service:
var db = require('arangojs')();
var myFoxxService = db.route('my-foxx-service');
myFoxxService.post('users', {
  username: 'admin',
  password: 'hunter2'
}).then(response => {
  // response.body is the result of
  // POST /_db/_system/my-foxx-service/users
  // with JSON request body '{"username": "admin", "password": "hunter2"}'
});
ArangoDB 3.0 will bring VelocyPack and, later on, a raw TCP protocol to sideline HTTP. Foxx support is also planned for this.
While arangojs is currently a pure JS implementation, we plan to offer a native backend under a similar API in the future. So if you want to benefit from that, you should go with arangojs now.
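For the GraphQL case from the question, the same raw-route mechanism could be used; the graphql route name and the query below are assumptions about how the Foxx service is set up, not something from the answer above:
// Hypothetical GraphQL query posted through the Foxx service's raw route
var myFoxxService = db.route('my-foxx-service');
myFoxxService.post('graphql', {
  query: '{ users { username } }'
}).then(response => {
  // response.body holds whatever result the Foxx service returns
  console.log(response.body);
});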

How can I know that an HTTPS endpoint receiving a TLS request from my Node.js is using a specified SSL certificate?

I have an endpoint (in any language, let's say Python) that exposes some service over HTTPS, using a certificate issued by a widely known and trusted CA that is probably included in virtually every browser in the world.
The easiest part is that I can issue TLS requests against this endpoint using Node.js with no further problems.
For security reasons, every time my Node.js code issues a TLS request against this HTTPS endpoint, I want to make sure that the certificate being used is the certificate that I trust, the one that was requested by my company.
What is the best way to accomplish that?
It sounds like the answer at How to get SSL certificate information using node.js? would be suitable for your needs.
You can use the following code to get your endpoint's certificate then check its fingerprint or hash against what you expect.
var https = require('https');

var options = {
  host: 'google.com',
  port: 443,
  method: 'GET'
};

var req = https.request(options, function(res) {
  console.log(res.connection.getPeerCertificate());
});
req.end();
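Building on that, a minimal pinning sketch: the EXPECTED_FINGERPRINT value is a placeholder you would fill in with the fingerprint reported for your company's certificate, and aborting the connection on a mismatch is my assumption about the desired behaviour:
var https = require('https');

// Fingerprint you expect, as reported by getPeerCertificate().fingerprint
var EXPECTED_FINGERPRINT = 'AA:BB:CC:...'; // placeholder

var req = https.request({ host: 'google.com', port: 443, method: 'GET' }, function (res) {
  var cert = res.socket.getPeerCertificate();
  if (cert.fingerprint !== EXPECTED_FINGERPRINT) {
    // Not the certificate we trust: drop the connection
    res.socket.destroy(new Error('Unexpected server certificate'));
  }
});
req.end();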

OpenShift Invalid certificate | SSL on NodeJS app

Here's what I'm working with:
NodeJS/Express app
OpenShift environment (Enterprise account)
Works over HTTP
Certificate trust error over HTTPS
Using default wildcard certificate provided by OpenShift
Begins working if I manually accept the exception the browsers are raising
Latest Express
Server.js looks something like:
var express = require("express"),
app = express(),
IP = process.env.OPENSHIFT_NODEJS_IP || "127.0.0.1",
PORT = process.env.OPENSHIFT_NODEJS_PORT || 8888; // its 8080 on openshift. i use 8888 only on my local environment for irrelevant reasons
// we sometimes need special endpoints that arent files
app.get("/something-special", function(req, res) {
res.send("something special");
});
// but typically it's static files
app.use(express.static(__dirname + "/public"));
// go!
app.listen(PORT, IP);
When I go to https://myserver/file.js (which lives in /public/file.js), I get an error saying the certificate is not trusted.
I don't understand certificates very well, and I barely know Node. This is a learning project, so I'm trying to work through all of the issues I come across without changing course.
I've tried everything I can think of, including:
app.enable('trust store'), as recommended in a different SO answer
simplifying my Node app and using req.secure to force HTTPS
You might try visiting your app using the https://appname-yourdomainname.rhcloud.com/ version of the URL. The underlying digital certificate is *.rhcloud.com and was issued by "Geotrust SSL CA", for what it's worth. If you do it this way you don't get certificate-related errors, because they applied a wildcard-based cert to the servers.
I'm not sure that the free version of the hosting allows private SSL certificates to be provided/bound... Yeah, you need Bronze or better to bind a private SSL certificate to your application. Bummer.
More than likely what is happening is that you are trying to use the *.rhcloud.com wildcard SSL certificate with your custom domain, and that won't work. OpenShift supplies you with an SSL certificate that matches your app-domain.rhcloud.com address. If you want to use SSL correctly with your custom domain, then you need to acquire (or purchase) a custom SSL certificate for your domain name. You can get one from lots of companies online, or you can get a free one here: https://www.startssl.com
Also, the SSL is terminated on the proxy, before it gets to your gear. Check out this developer center article for more information about how it all works: https://developers.openshift.com/en/managing-port-binding-routing.html
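Since TLS is terminated at the proxy, forcing HTTPS inside the Express app usually means checking the forwarded protocol header rather than the socket itself. A minimal sketch; relying on the x-forwarded-proto header is an assumption about what the OpenShift proxy sets, and this middleware is not part of the original answers:
// Redirect plain-HTTP requests to HTTPS when running behind the proxy
app.use(function (req, res, next) {
  if (req.headers["x-forwarded-proto"] === "http") {
    return res.redirect("https://" + req.headers.host + req.url);
  }
  next();
});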

SNI proxy TLS session reuse over NodeJS

I'm setting up an SNI proxy to be able to dynamically handle SSL connections for different domains. I'm using an HTTPS server with node-http-proxy (latest version) on Node.js v0.12, which supports the SNI callback.
var sniOptions = {
  SNICallback: function (servername, cb) {
    // dynamically fetch the SSL certificate/context for this hostname here
  }
};

https.createServer(sniOptions, function (req, res) {
  // node-http-proxy logic routes the request here
});
All good so far, but I haven't yet found a way to reuse the same TLS session. Every time a request is received a brand-new TLS session is created, and I want to avoid this.
The only two resources strictly related to the topic are this issue on GitHub and this PayPal blog post.
Any suggestion about it?
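One avenue that might be worth exploring is Node's generic TLS session-resumption hooks; the following is only a sketch of those hooks with an in-memory cache, not a confirmed fix for the node-http-proxy setup above:
// In-memory TLS session cache keyed by session ID (illustrative only)
var tlsSessionCache = {};

var server = https.createServer(sniOptions, function (req, res) {
  // node-http-proxy logic routes the request here
});

server.on('newSession', function (sessionId, sessionData, callback) {
  tlsSessionCache[sessionId.toString('hex')] = sessionData;
  callback();
});

server.on('resumeSession', function (sessionId, callback) {
  // Hand back the cached session data (or null) so the client can resume
  callback(null, tlsSessionCache[sessionId.toString('hex')] || null);
});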
