Memcached Could not connect to Remote Server | memcached.js - node.js

My local environment for the API:
node -v: v8.9.4
npm -v: 5.6.0
Package
memcached.js: "memcached": "^2.2.2"
We have a Node API that uses the memcached package to connect to a Memcached server with the configuration below.
MEMCACHED_CONFIG:
{
  MAX_VALUE: 1024,
  SERVER: "X.X.X.X",
  PORT: 11211,
  COMPLETE_PATH: "X.X.X.X:11211",
  CACHE_TIMEOUT: 3600,
  POOL_SIZE: 50,
  maxKeySize: 1024,
  timeout: 5000
}
X.X.X.X is the remote server IP where our Memcached server is running, and I am able to connect to that server from my system using telnet (c:\> telnet X.X.X.X 11211), which works.
cacheUtility.js
var MEMCACHED_CONFIG = require('./MEMCACHED_CONFIG');
var Memcached = require('memcached');
Memcached.config.maxValue = MEMCACHED_CONFIG.MAX_VALUE;
Memcached.config.poolSize = MEMCACHED_CONFIG.POOL_SIZE;
Memcached.config.maxKeySize = MEMCACHED_CONFIG.maxKeySize;
Memcached.config.timeout = MEMCACHED_CONFIG.timeout;
var memcached = new Memcached();
memcached.connect(MEMCACHED_CONFIG.COMPLETE_PATH, function (err, conn) {
  if (err) {
    CONFIG.CONSOLE_MESSAGE("Cache Connect Error " + conn.server);
  }
});
We are using the above code to connect to the Memcached server, and as you can see the remote server IP comes from MEMCACHED_CONFIG.
My issue is that it always tries to connect to 127.0.0.1 instead of the remote Memcached server, so to make it run I have to edit memcached.js inside the core package.
C:\BitBucketProjects\Licensor Server\node_modules\memcached\lib\memcached.js
function Client (args, options) {
  var servers = []
    , weights = {}
    , regular = 'localhost:11211'
    //, regular = 'X.X.X.X:11211'
    , key;
I don't want to make any changes to the core package.
Why is it not connecting to the given server?

When your memcached server is set up on a different machine than the server using it, always specify the server IP and options explicitly; otherwise it defaults to localhost. You can see that if you view the "server" property of the client (using the Node.js memcached client, version 2.2.2):
var Memcached = require('memcached');
var memcached = new Memcached();
console.log(memcached.server);
There seems to be an issue with the "memcached.connect" method, as it does not override the default localhost server. To make it work, you have to pass the IP of the memcached server to the constructor, as described in the documentation:
var Memcached = require('memcached');
var memcached = new Memcached('192.168.10.10:11211');
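Applied to the question's setup, a minimal sketch could look like the following (assuming MEMCACHED_CONFIG is the module shown in the question; the option names follow the memcached package's documented options):
var Memcached = require('memcached');
var MEMCACHED_CONFIG = require('./MEMCACHED_CONFIG');

// Pass the server address and the options object straight to the constructor;
// no need to mutate Memcached.config or call connect() manually.
var memcached = new Memcached(MEMCACHED_CONFIG.COMPLETE_PATH, {
  maxValue: MEMCACHED_CONFIG.MAX_VALUE,
  poolSize: MEMCACHED_CONFIG.POOL_SIZE,
  maxKeySize: MEMCACHED_CONFIG.maxKeySize,
  timeout: MEMCACHED_CONFIG.timeout
});

// Quick check against the remote server: set a key, then read it back.
memcached.set('healthcheck', 'ok', MEMCACHED_CONFIG.CACHE_TIMEOUT, function (err) {
  if (err) return console.error('Cache set error:', err);
  memcached.get('healthcheck', function (err, value) {
    console.log('healthcheck value:', value);
  });
});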
Now you should be able to connect to the server without an issue, provided port 11211 is open on the host. If it is not, you can run the following command on the memcached host to open it:
$ sudo ufw allow 11211
To verify that you can reach the memcached server, use the following command:
telnet 192.168.10.10 11211
If even that does not work, your memcached server might not be running, so start it either as a process or as a service:
Start as a process:
$ memcached -u memcached -d -m 30 -l 192.168.10.10 -p 11211
Start as a service:
$ sudo systemctl start memcached
OR
$ sudo service memcached start
Just for reference for those who might not know: to expose the memcached server on the network, you can either specify the IP and port on the command line as above, or set them in the memcached configuration file. To change the default configuration, look for "-l 127.0.0.1" in the following file and replace the loopback address with your host server's network IP:
$ sudo nano /etc/memcached.conf
Of course, the above commands will only work if memcached is installed on the server; if it is not, run the following to install it first:
$ sudo apt-get install memcached
I hope it helps.

Related

psql: error: could not connect to server: No such file or directory

I have installed PostgreSQL and am struggling to configure it. I tried reinstalling but still face the issue; I removed all the files and then installed the 9.6 version, but I am getting the problems below.
9.6 main 5432 down postgres /var/lib/postgresql/9.6/main /var/log/postgresql/postgresql-9.6-main.log
When I try to run psql using sudo -u postgres psql it gives the output below:
psql: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
I have tried suggestions from other questions as well but can't fix the problem.
Very likely your PostgreSQL client is configured with a different socket directory (/var/run/postgresql) than your server. Check the unix_socket_directories configuration parameter in postgresql.conf.
Chances are that the server is listening on the default directory /tmp. Try
psql -h /tmp ...
Of course it could also be that the server is listening on a different port, e.g. 5555 (configuration parameter port). Then run
psql -p 5555 ...
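If you are hitting the same socket-directory mismatch from a Node application rather than psql, node-postgres accepts a socket directory as its host. A minimal sketch, assuming the pg package and the /tmp socket directory from above (user and database are placeholders):
const { Client } = require('pg');

// Point the client at the server's actual unix socket directory (/tmp here).
const client = new Client({ host: '/tmp', port: 5432, user: 'postgres', database: 'postgres' });

client.connect()
  .then(() => client.query('SELECT version()'))
  .then(res => {
    console.log(res.rows[0]);
    return client.end();
  })
  .catch(err => console.error('Connection failed:', err));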

connection refused connecting to remote mongodb server

So we've accumulated enough applications in our network that use MongoDB to justify building a dedicated server specifically for MongoDB. Unfortunately, I'm pretty new to mongodb (coming from SQL/MySQL derivatives). I have followed several guides on installing and configuring mongodb for my environment. None are perfect, but I think I'm close... I have managed to get to the point where I can connect to the db server from the local server using the following command:
mongo -u user 127.0.0.1/admin
However, I'm NOT able to connect to the server from either the local OR a remote computer using its network address, i.e.:
mongo -u user 192.168.24.102/admin
I've tried both with authentication enabled and disabled, and I've tried setting the bindIP to 192.168.24.102 and 0.0.0.0 with no love. Thinking it was a Firewall issue, I disabled the firewall entirely... same. no love...
so what's the secret sauce? how do I connect to a MongoDB server remotely?
Some notes to know: This server is on a local network only. There will be some NAT shenanigans at some point directing public traffic to it from remote application servers, but only on specific ports (we will NOT be using 27017 when that happens), and it will sit behind a pretty robust firewall appliance, so I'm not as worried about securing the server as I am about securing MongoDB itself.
This answer assumes a setup where a Linux server is completely remote and has MongoDB already installed.
Steps:
1. Connect to your remote server over SSH.
ssh <userName>@<server-IP-address>
2. Start Mongo shell and add users to MongoDB.
Add the admin:
use admin
db.createUser(
  {
    user: "AdminSammy",
    pwd: "AdminSammy'sSecurePassword",
    roles: [
      "userAdminAnyDatabase",
      "dbAdminAnyDatabase",
      "readWriteAnyDatabase"
    ]
  }
)
Then add general user/users. Users are added to specific databases.
use some_db
db.createUser({
  user: 'userName',
  pwd: 'secretPassword',
  roles: [{ role: 'readWrite', db: 'some_db' }]
})
3. Edit your MongoDB config file, mongod.conf, which is found in the /etc directory.
sudo vim /etc/mongod.conf
Scroll down to the #security: section and add the following line. Make sure to un-comment the security: line.
security:
  authorization: 'enabled'
After authorization has been enabled, only clients that authenticate with a password can access the database. In this case, these are the users added in step 2 above.
Note: Visual Studio Code can also be used over SSH to edit the mongod.conf file.
4. Add the remote server's IP address to the mongod.conf file.
Look for the net section and add the IP address of the server that is hosting this MongoDB installation, for example 178.45.55.88:
# network interfaces
net:
  port: 27017
  bindIp: 127.0.0.1, 178.45.55.88
5. Open port 27017 on your server instance.
This allows access to your MongoDB server from anywhere in the world to anyone who knows your remote server IP address. This is one reason to have authenticated users. More robust ways of handling security are really important! Consult MongoDB manual for that.
Check firewall status using ufw.
sudo ufw status
If it's not active, activate it:
sudo ufw enable
Then,
sudo ufw allow 27017
Important: You also need to allow port 22 for your SSH communication with your remote server. Otherwise you will be locked out from your remote server. Assumption here is that SSH uses port 22 for communication, the default.
sudo ufw allow 22
6. Restart Mongo daemon (mongod)
sudo systemctl restart mongod
7. Connect to remote Mongo server using Mongo shell
You can now connect to the remote MongoDB server using the following command.
mongo -u <user-name> -p <user-password> <remote-server-IP-address>:<mongo-server-port>
You can also connect to the remote MongoDB server with authentication:
mongo -u <user-name> -p <user-password> <remote-server-IP-address>:<mongo-server-port> --authenticationDatabase <auth-db-name>
You can also connect to a specific remote MongoDB database with authentication:
mongo -u <user-name> -p <user-password> <remote-server-IP-address>:<mongo-server-port>/<db-name> --authenticationDatabase <auth-db-name>
At this point you can read and write within the some_db database from your local computer without SSH.
Important: Take the standard security measures for any database into consideration. Local security practices should guide what you do at each of the above steps.
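If you are connecting from a Node application instead of the mongo shell, the equivalent is a connection URI with the credentials from step 2 and the host from step 4. A sketch, assuming the official mongodb driver (3.x API); the host, user and database names are the placeholders used above:
const { MongoClient } = require('mongodb');

// Placeholders: user/password/database from step 2, host IP from step 4.
const url = 'mongodb://userName:secretPassword@178.45.55.88:27017/some_db?authSource=some_db';

MongoClient.connect(url, { useNewUrlParser: true, useUnifiedTopology: true }, function (err, client) {
  if (err) return console.error('Mongo connect error:', err);
  const db = client.db('some_db');
  // ... read and write to some_db here ...
  client.close();
});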

How to connect via mongoose and a ssh tunnel

I have set up my mongod.conf as follows so it only allows localhost connections.
storage:
  dbPath: /var/lib/mongodb
  journal:
    enabled: true
systemLog:
  destination: file
  logAppend: true
  path: /var/log/mongodb/mongod.log
net:
  port: 27017
  bindIp: 127.0.0.1
I then want my site to connect to MongoDB over SSH, so the remote port has to be forwarded to localhost.
However, how can I integrate this with mongoose's connect function?
mongoose.connect(configDB.url, function (err) {
  if (err) {
    console.log('Error connecting to mongodb: ' + err)
  }
});
I have found the following command but I am not sure if this is what I need:
ssh -L 4321:localhost:27017 -i ~/.ssh/ssh_key user@ip-address
This should forward local port 4321 to the remote host's 27017, right? So I think I need something like this in nodejs mongoose's connect function. I've tried to read up on this in the mongodb security tutorials, but I cannot relate their instructions to nodejs at all. Anyone who has experience with this?
You're nearly there. Set up the tunnel independent of node:
ssh -Nf -p [db_server_ssh_port] [mongo_user]@[mongo_domain] -L \
    [local_db_port]:localhost:[remote_db_port]
And then within node, connect to mongo using [local_db_port]:
mongoose.connect(
  "mongodb://localhost:[local_db_port]/[db_name]",
  { "pass": "[db_pwd]" }
)
All the traffic sent to [local_db_port] on the web server will be sent through the tunnel to port [remote_db_port] on [mongo_domain]. The following post gives more info. It's connecting to a MySQL database, but the principle is the same.
Connect to MySQL using SSH Tunneling in node-mysql
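Applied to the question's tunnel on local port 4321, the connect call could look like this (a sketch; my_db is a placeholder database name):
var mongoose = require('mongoose');

// With the tunnel from the question already running
// (ssh -L 4321:localhost:27017 -i ~/.ssh/ssh_key user@ip-address),
// point mongoose at the local end of the tunnel.
mongoose.connect('mongodb://localhost:4321/my_db', function (err) {
  if (err) {
    console.log('Error connecting to mongodb over the tunnel: ' + err);
  }
});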
Set up the tunnel independent of node:
ssh -L [your given port]:localhost:27017 [username of ssh]@[ip address of ssh machine] -f -N
After that, use your given port for the mongo database.
In nodejs you have to set up the mongoose connection like this:
'mongodb://localhost:[your given port number]/[database name]'
enjoy it

connect to mongodb on separate ec2 instance

I am running two different instances on AWS, one for a node application and the other for MongoDB. I am trying to connect to MongoDB on the other instance, but I am not able to and it fails with "504 Gateway timed out".
My db_conf.js used by the node application is below:
var express = require('express');
var mongodb = require('mongodb');
var url = "mongodb://<PUBLIC IP of mongoDB instance>:27017/local";
module.exports = url;
I have commented out "bind_ip" in mongodb.conf and restarted MongoDB.
Also, I have opened port 27017 for the node application server's public IP in the security groups of the mongoDB instance, for both inbound and outbound, but to no avail.
Please suggest a way to achieve this (if there is any). Thanks in advance :)
All looks correct here; you do correctly open port 27017 on the mongoDB instance. Also, you may want to try killing the original mongod server process and restarting it explicitly pointing to your config file, so that it knows to use those settings. Something like the following should accomplish this (assuming you're on a Linux machine and your config file is in the default location):
sudo kill <mongod PID>
and then
sudo mongod --fork --config /etc/mongod.conf
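As a quick check from the node instance itself, you can try the exported url directly with the driver the question already requires (a sketch, assuming the 3.x mongodb driver, where the callback receives a client object):
var mongodb = require('mongodb');
var url = require('./db_conf');

mongodb.MongoClient.connect(url, function (err, client) {
  if (err) return console.error('Could not reach the mongoDB instance:', err);
  console.log('Connected to ' + url);
  client.close();
});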

Sequelize connection over ssh

I am not able to connect to a remote MySQL server using Sequelize, but I am able to SSH into the machine and connect to MySQL.
How do I make Sequelize connect to the MySQL server over SSH rather than directly from node?
You set up an SSH tunnel to your server, which listens on (for example) port 33060:
ssh -NL 33060:localhost:3306 yourserver
In another window/terminal, you can use the MySQL client (assuming that it's installed locally) to connect to the remote MySQL server over the SSH tunnel:
mysql --port 33060 --host 127.0.0.1
If that works, it's just a matter of changing the port number in Sequelize:
var sequelize = new Sequelize('database', 'username', 'password', {
  host: "127.0.0.1",
  port: 33060
});
If it doesn't work, it might be that the remote MySQL server isn't configured to accept connections over TCP. If that's the case, you should configure your MySQL server to get it to accept TCP connections.
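To confirm the tunnelled connection from Node before wiring up any models, you can call authenticate() on the instance above (a sketch; it assumes the ssh -NL 33060:localhost:3306 tunnel is running):
sequelize
  .authenticate()
  .then(function () {
    console.log('Connected to MySQL through the SSH tunnel.');
  })
  .catch(function (err) {
    console.error('Unable to connect:', err);
  });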
