When I tried to use adapter: 'redis' it told me to install socket.io-redis version 0.14. I did that and entered all the info into the session.js file:
module.exports.session = {
  adapter: 'socket.io-redis',
  host: '10...',
  port: 6379,
  db: 'sails',
  pass: '...',
};
And now when I try to run the application I get the following error:
Error: Ready check failed: NOAUTH Authentication required.
I'm not sure why pass: .. isn't working. Is there anything else I should do?
Note: I am using a Google compute instance for redis hosting, I have a firewall rule for allowing access.
I did find a solution to my problem. I am not sure how useful it will be for you, since my situation was a little different. I am using Sails.js on a Bitnami Google Cloud compute instance and I am hosting Redis on a separate Bitnami instance, which is what we have in common. However, I am trying to connect to Redis for use with Kue, so I did not make any use of my config/session file. I still got the same error, though, and the solution was to remove the requirepass directive from the Redis instance and then use firewall rules to only allow my server to access the Redis instance.
I believe the root issue is that Redis has a second password prompt for any attempt to read/write to the data store, so passing the password from the server only logs you in but does not give you access to the data, hence the NOAUTH error. I believe requirepass is therefore mainly for client-side access and that server-side instances don't need it. This might be me being naive about how Redis works, but I do not know how else to enter the password at that prompt from a different server. For now, firewall rules are enough for me to keep unwanted traffic out.
If this is what you want to do/try, the way I did it on Google Cloud was to SSH into the Redis instance (either from your own command line or through the browser terminal that Google provides). Then edit the /opt/bitnami/redis/etc/redis.conf file with sudo privileges, find the line that says requirepass, and comment it out by placing a # in front of it. For this to take effect, you have to restart the server.
Bitnami says you can do this with sudo /opt/bitnami/ctlscript.sh restart redis. However, I was getting an AUTH error, so to get around that I had to force-kill the process with sudo pkill -9 -f redis-server and then start it again with sudo /opt/bitnami/ctlscript.sh restart redis. This reloads the config file, updates the instance, and allows your server to connect without requiring a second prompt password to be entered.
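For reference, once requirepass is disabled, pointing Kue at that Redis instance only needs the host and port. This is a minimal sketch assuming Kue's standard createQueue options; the IP is a placeholder for your Redis instance's internal address:

var kue = require('kue');

// Sketch: Kue connecting to the now-passwordless Redis instance.
// '10.0.0.2' is a hypothetical internal IP; access is restricted by
// firewall rules rather than by requirepass.
var queue = kue.createQueue({
  redis: {
    host: '10.0.0.2',
    port: 6379
  }
});

// Example job, just to confirm the connection works.
queue.create('email', { to: 'user@example.com' }).save();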
If you have any questions, please let me know and I will try to help as much as possible.
You have to specify auth_pass:
module.exports.session = {
  adapter: 'socket.io-redis',
  host: '10...',
  port: 6379,
  db: 'sails',
  pass: '...',
  // redis_url here is assumed to be a Redis connection URL parsed earlier, e.g. with require('url')
  auth_pass: redis_url.auth.split(":")[1]
};
UPDATE
From documentation:
password: null; If set, client will run redis auth command on connect.
Alias auth_pass (node_redis < 2.5 have to use auth_pass)
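So, as a minimal sketch with node_redis itself (the host and password are placeholders), supplying password / auth_pass authenticates on connect:

const redis = require('redis');

// Sketch: node_redis runs AUTH on connect when `password` is set.
// `auth_pass` is the older alias (required for node_redis < 2.5).
const client = redis.createClient({
  host: '10...',                   // placeholder host, as in the config above
  port: 6379,
  password: 'your-redis-password'  // alias: auth_pass
});

client.on('ready', function () { console.log('authenticated and ready'); });
client.on('error', function (err) { console.error(err); });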
Related
I'm trying to figure out how to connect my backend API (Next.js /api routes) to the database (PostgreSQL), both of which are hosted by Heroku. The connection goes through pg.Pool, which I set up with the following code:
const { Pool } = require('pg');

const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
  // ssl: {
  //   rejectUnauthorized: false,
  // }
});
but Heroku returns the following error:
sql_error_code = 28000 FATAL: no pg_hba.conf entry for host "122.180.247.11", user "u3idolso5k2v83", database "dc85788d13v9ej", SSL off
The error description is from:
https://help.heroku.com/DR0TTWWD/seeing-fatal-no-pg_hba-conf-entry-errors-in-postgres
EDIT: I meant to post this link: Is it ok to be setting rejectUnauthorized to false in production PostgreSQL connections?
The authentication failed because the connection didn't use SSL encryption: (SSL off). All Heroku Postgres production databases require using SSL connections to ensure that communications between applications and the database remain secure. If your client is not using SSL to connect to your database, you would see these errors even if you're using the right credentials to connect to it.
I find this strange, since Heroku already provides SSL by default to my server (which they also host), so it's unexpected for such an error to occur at all.
The sidestep solution I've come across online is to uncomment the ssl property in the connection config... which works, but I feel uneasy about it.
const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
  ssl: {
    rejectUnauthorized: false,
  }
});
As briefly mentioned here, it is not safe: https://security.stackexchange.com/questions/229282/is-it-safe-to-set-rejectunauthorized-to-false-when-using-herokus-postgres-datab
I don't understand why this error occurs at all, or how it can be fixed with proper security.
It's pretty standard for the SSL certificates of Postgres servers not to be valid. Even the official Postgres clients don't validate certificates. The library you are using defaults to validating certificates, but it is very much in the minority.
When setting this up for https://www.atdatabases.org/docs/pg-options I made it not validate certificates by default to match the standard behaviour for Postgres.
This lets you create a connection pool for heroku using simply:
import createConnectionPool from '@databases/pg';

const db = createConnectionPool(process.env.DATABASE_URL);
As described in your linked-to answer, you can upgrade to one of Heroku's paid products which does support this. Or you can stop using Heroku. Or you can put up with the incredibly low risk that someone will MITM you.
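If you do end up on a setup where a verifiable certificate is available, a rough sketch of full verification with node-postgres would be to pass the CA cert through the ssl option (the ca.pem path below is hypothetical):

const fs = require('fs');
const { Pool } = require('pg');

// Sketch: verify the server certificate against a CA cert you trust.
// './ca.pem' is a placeholder path for wherever the CA certificate is stored.
const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
  ssl: {
    rejectUnauthorized: true,
    ca: fs.readFileSync('./ca.pem', 'utf8')
  }
});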
I don't understand why this error occurs at all,
What about it do you not understand? The explanation you linked to seems pretty clear. If you cannot formulate your uncertainty any more clearly than you have so far, how can anyone help you understand?
Edit: After thinking about the issue, the real question is: what is an example of connecting to DigitalOcean's managed Redis with node-redis using TLS?
I'm able to connect just fine with the RedisInsight GUI client using username/password, but I cannot connect with Node.js. It's on the same computer, so there are no firewall issues.
var redis = require('redis');
var client = redis.createClient(process.env.REDIS_PORT, process.env.REDIS_URL, { no_ready_check: true });

client.auth('password', function (err) {
  if (err) {
    console.log(err);
    return;
  }
  console.log('auth');
});
One thing I'm confused about is where to enter the username. It's just 'default', but the documentation for node_redis doesn't provide a way to give a username during auth.
Error is: AbortError: Redis connection lost and command aborted. It might have been processed.
Here's my working lightly anonymized redisinsight connection screen.
How do I do the same in node-redis?
The AUTH command, as stated in the docs:
When ACLs are used, the single argument form of the command,
where only the password is specified, assumes that the implicit username is "default".
So even if you are using Redis 6, where additional users are supported, your authentication for default should work.
The error you're seeing is the result of a broken connection, i.e. you somehow lost the connection with the Redis server. node-redis is dealing with one of two scenarios (or both): the connection has timed out, or the reconnect attempts have exceeded the maximum number specified in the config. I would double-check your connection information and how your Redis server is configured.
I see you are using TLS; you may find this useful: Securing Node Redis
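As a rough sketch with node_redis v3, enabling TLS plus password auth looks something like the following; the hostname, port, and environment variable are placeholders for your DigitalOcean connection details:

const redis = require('redis');

// Sketch: node_redis v3 over TLS, authenticating as the implicit "default" user.
// Hostname and port are placeholders taken from the managed database's connection info.
const client = redis.createClient({
  host: 'your-db-host.db.ondigitalocean.com',
  port: 25061,
  password: process.env.REDIS_PASSWORD,
  tls: {
    servername: 'your-db-host.db.ondigitalocean.com' // turns on TLS and sets SNI
  }
});

client.on('ready', function () { console.log('connected over TLS'); });
client.on('error', function (err) { console.error(err); });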
If you want to authenticate the node-redis client as a different user when using Redis 6, you will have to use send_command, but first you need to remove the current AUTH command, as node-redis doesn't currently support the new AUTH <username> <password> form.
client['auth'] = null; // remove node-redis's built-in AUTH handling first
client.send_command('AUTH', ['<username>', '<password>'], redis.print);
I have been researching this for days and I haven't been able to find a way to do it.
I am building a React app, with Express running as the backend, that needs to access some data in a remote database that lives inside a VPN. At the moment the app lives on my localhost, so it's enough for me to connect my machine using an OpenVPN client and everything works a beauty. The problem will arise when the app goes live and I need it to have access to the VPN by (I'm guessing) having a VPN client running on the site/domain.
Has anyone done this before?
I have tried the node-openvpn package, which seems like it could do the job, but unfortunately I can't manage to make it work, as the connection doesn't seem to be configured properly.
This is the function I call to connect to the VPN; it systematically fails at the line
--> openvpnmanager.authorize(auth);
const openvpnmanager = require('node-openvpn');
...
const connectToVpn = () => {
  var opts = {
    host: 'wopr.remotedbserver.com',
    port: 1337,      // port of the openvpn management console
    timeout: 1500,   // timeout for connection - optional
    logpath: '/log.txt'
  };
  var auth = {
    user: 'userName',
    pass: 'passWord',
  };

  var openvpn = openvpnmanager.connect(opts);

  openvpn.on('connected', function() {
    console.log('connecting..');
    openvpnmanager.authorize(auth); // <-- Error: Unhandled "error" event. (Cannot connect)
  });

  openvpn.on('console-output', function(output) {
    console.log(output);
  });

  openvpn.on('state-change', function(state) { // emits console output of openvpn state as an array
    console.log(state);
  });
};
Am I misusing this function? Is there a better way?
Any help will be extremely appreciated.
Thank You!
The problem will arise when the app goes live and I need it to
have access to the VPN by (I'm guessing) having an OpenVPN client running
on the site/domain.
That's correct: you will need an OpenVPN client instance on the server where you run the backend.
The above library (node-openvpn) is simply a library to interact with the local OpenVPN client instance. It cannot create a connection on its own. It depends on the OpenVPN binary (which should be running).
The solution is simply to run the OpenVPN client on your server (apt-get install openvpn) and let the daemon run. Check out the references below.
A node-openvpn issue that points out that a running instance of the client is needed
OpenVPN CLI tutorial
It took me a bit to set up a replica set with SSL and authorization. However, I finally have it set up and working, and I can connect via the command line by providing the appropriate parameters. I'm trying to do the same thing with Mongoose, but even though I specified all the SSL options, I keep getting the following error in the MongoDB logs: AssertionException handling request, closing client connection: 17189 The server is configured to only allow SSL connections
My code is as follows:
var m = require('mongoose');
var fs = require('fs');

var key = fs.readFileSync('/home/node/mongodb/mongodb.pem');
var ca = [fs.readFileSync('/home/node/mongodb/ca.pem')];

var o = {
  server: {
    sslValidate: true,
    sslCA: ca,
    sslKey: key,
    sslCert: key
  },
  user: '****',
  pass: '****'
};

m.connect('mongodb://dbAddr/dbName', o);
I've tried setting sslValidate to false: same issue. I've tried without the CA, cert, and/or key in multiple combinations. When I connect via the command line it requires me to provide the CA and the key+cert PEM file, so I figured the Mongoose client would require these as well. I've tried both the server and replset keys with the exact same outcome. I've even specified authSource (the auth DB); even though it appears this is not part of the problem, it still yields the same results.
I'm really confused, especially since I have no problem doing this exact same thing via the mongo command.
My mongo shell command is as follows:
mongo --host db1 --ssl --sslPEMKeyFile /etc/mongodb/mongodb.pem --sslCAFile /etc/mongodb/ca.pem -u *** -p *** --authenticationDatabase dbName
Although not depicted in the MongoDB Node driver documentation, you must also provide the option {server: {ssl: true}} in order to connect via SSL. If you do not, the other options are simply ignored.
However, if you dig into the Mongoose issue tracker on GitHub you'll find this, which recommends exactly that.
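Applied to the options object from the question, that would look roughly like this (same paths and placeholders as above, with ssl: true added):

var m = require('mongoose');
var fs = require('fs');

var key = fs.readFileSync('/home/node/mongodb/mongodb.pem');
var ca = [fs.readFileSync('/home/node/mongodb/ca.pem')];

var o = {
  server: {
    ssl: true,          // without this, the SSL options below are silently ignored
    sslValidate: true,
    sslCA: ca,
    sslKey: key,
    sslCert: key
  },
  user: '****',
  pass: '****'
};

m.connect('mongodb://dbAddr/dbName', o);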
I am running Nginx, Node, and MongoDB, and it seems that I can't access the same database from a second app I am running. For example, I don't get anything back when I do:
collection.findOne({
  name: someName
}, function (err, results) {
  // Returns no errors or results. Just stops working.
});
I can access the database perfectly fine from my first app, but not the second one.
This is the code I use to connect to the database in both apps.
var Server = require('mongodb').Server,
    Db = require('mongodb').Db,
    db = new Db('database', new Server('localhost', 27017, { auto_reconnect: true }), { w: true });
Anyone know what the problem might be?
Edit: Does it have something to do with the subdomain or ports? Too many connections?
Edit 2 (more info):
I run mongodb with service mongodb start.
In my /etc/mongodb.conf I have bind_ip = 127.0.0.1 and dbpath=/var/lib/mongodb (rest is default)
In both my apps I run the same code to establish a connection to the database, but only the first one works (I know that because I am able to retrieve information from the database in my first app).
The apps are running on different ports. The first one is running on port 1337 and the second one runs on 3000.
You are using 'localhost' as the host name to connect to this server.
This means you will only be able to connect with that hostname from the same machine that mongod is running on.
Unless all your apps run on the same server as mongod, you will need to change your connection code to use the actual hostname of the mongod server.
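For example, a minimal sketch using an explicit hostname instead of 'localhost' (the hostname is a placeholder; mongod's bind_ip would also have to accept connections from the app server, not just 127.0.0.1):

var Server = require('mongodb').Server,
    Db = require('mongodb').Db;

// Sketch: connect using the mongod server's actual hostname.
// 'db.example.com' is a placeholder for your database server's address.
var db = new Db('database',
                new Server('db.example.com', 27017, { auto_reconnect: true }),
                { w: true });

db.open(function (err, db) {
  if (err) throw err;
  console.log('connected to the remote mongod');
});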