Kuzzle: How does the autoReconnect argument work in WebSocket? - node.js

I want Kuzzle to reconnect on a disconnected event, so I set the autoReconnect argument to true, as shown in my code:
const connectionOptions = {
  autoReconnect: true,
  // reconnectionDelay: 500,
  ssl: true,
  port: PORT
}
kuzzle = new Kuzzle(new WebSocket(ADDRESS, connectionOptions), {
  autoQueue: true,
  autoReplay: true,
  offlineMode: 'auto',
  queueTTL: 0,
  queueMaxSize: 500
})
I check Kuzzle's events with this:
kuzzle.on('reconnected', () => {
  console.log('kuzzle reconnected')
})
kuzzle.on('disconnected', () => {
  console.log('kuzzle disconnected')
})
But the only thing I see is the disconnected event, never the reconnected one. Why?
PS: I disable and enable my network interfaces to test my code.
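In the meantime, a manual fallback I could try is retrying the connection myself from the disconnected handler. This is only a sketch, assuming the SDK lets me call connect() again after a disconnection; it does not explain why autoReconnect itself is not firing:

kuzzle.on('disconnected', () => {
  console.log('kuzzle disconnected, retrying in 1s')
  // Manual fallback instead of relying on autoReconnect.
  setTimeout(() => {
    kuzzle.connect().catch(err => console.error('reconnect attempt failed', err))
  }, 1000)
})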

Related

Connection error while connecting to AWS DocumentDB through Lambda

I am getting the following error while connecting to AWS DocumentDB from Node.js through Lambda:
{"errorMessage":"ENOENT: no such file or directory, open
'rds-combined-ca-bundle.pem'","errorType":"Error","stackTrace":["Object.fs.openSync (fs.js:646:18)","Object.fs.readFileSync
(fs.js:551:33)","Object.
(/var/task/base/mongoose.base.js:8:13)","Module._compile
(module.js:652:30)","Object.Module._extensions..js
(module.js:663:10)","Module.load (module.js:565:32)","tryModuleLoad
(module.js:505:12)","Function.Module._load
(module.js:497:3)","Module.require (module.js:596:17)","require
(internal/module.js:11:18)","Object.
(/var/task/library/mongoLib/room.lib.js:1:84)","Module._compile
(module.js:652:30)","Object.Module._extensions..js
(module.js:663:10)","Module.load (module.js:565:32)","tryModuleLoad
(module.js:505:12)","Function.Module._load (module.js:497:3)"]}
Here is my Node.js file in the Lambda:
var ca = fs.readFileSync(path.join('./','rds-combined-ca-bundle.pem'));
var options = {
  keepAlive: true,
  poolSize: 30,
  socketTimeoutMS: 30000,
  autoReconnect: true,
  reconnectTries: Number.MAX_VALUE,
  reconnectInterval: 500,
  useCreateIndex: true,
  auth: { authdb: 'admin' },
  useFindAndModify: false,
  sslValidate: true,
  sslCA: ca,
  useNewUrlParser: true
}
var uri = 'mongodb://'+globalData.getConfigurationSettings("documentdb_username")+':'+globalData.getConfigurationSettings("documentdb_password")+'#'+globalData.getConfigurationSettings("documentdb_server")+':'+globalData.getConfigurationSettings("documentdb_port")+'/'+globalData.getConfigurationSettings("documentdb_db_name")+'?ssl=true&replicaSet=rs0&readPreference=secondaryPreferred';
mongoose.connect(uri, options)
  .then(() => console.log('Connection to DB successful'))
  .catch((err) => console.error(err, 'Error'));
It should be:
var ca = fs.readFileSync(path.join(__dirname, 'rds-combined-ca-bundle.pem'));
Or you can define:
import caBundle from "./rds-combined-ca-bundle.pem";
var options = {
............
sslCA:caBundle,
The error comes from ENOENT: no such file or directory, open 'rds-combined-ca-bundle.pem'.
It seems that the file doesn't exist there. Can you check the path? Did you embed the cert with the Lambda package?
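As a quick sanity check, you can verify the certificate was actually packaged next to your handler before reading it. This is just an illustrative sketch using plain Node.js APIs, with the file name taken from your code:

var fs = require('fs');
var path = require('path');

// Resolve the certificate relative to this file, not the process working directory.
var caPath = path.join(__dirname, 'rds-combined-ca-bundle.pem');

if (!fs.existsSync(caPath)) {
  // The cert was not deployed with the Lambda package; fail loudly and early.
  throw new Error('CA bundle not found at ' + caPath);
}
var ca = fs.readFileSync(caPath);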

Prevent ffmpeg from opening a console window

I have a Node/Express server which is used to serve streams from IP cameras to a website. Everything is working well. I run that web server with PM2 on a Windows server.
The problem: for each stream, a Windows console window opens with nothing logged in it. The console reopens when I try to close it.
Is there a way to prevent those consoles from opening?
Here is the related Node.js code:
const { NodeMediaServer } = require('node-media-server');
private _initiate_streams(): void {
  DatabaseProvider.instance.camerasDao.getCamerasList().pipe(
    take(1)
  ).subscribe(
    (databaseReadOperationResult: DatabaseReadOperationResult<ICamera[]>) => {
      if (databaseReadOperationResult.successful === true) {
        const cameras = databaseReadOperationResult.result;
        const tasks = [];
        cameras.forEach(camera => {
          tasks.push(
            {
              app: config.get('media_server.app_name'),
              mode: 'static',
              edge: camera.rtsp_url,
              name: camera.stream_name,
              rtsp_transport: 'tcp'
            }
          )
        });
        const configMediaServer = {
          logType: 3, // 3 - Log everything (debug)
          rtmp: {
            port: 1935,
            chunk_size: 60000,
            gop_cache: true,
            ping: 60,
            ping_timeout: 30
          },
          http: {
            port: config.get('media_server.port'),
            allow_origin: '*'
          },
          auth: {
            play: true,
            api: true,
            publish: true,
            secret: config.get('salt'),
            api_user: 'user',
            api_pass: 'password',
          },
          relay: {
            ffmpeg: 'C:\\FFmpeg\\bin\\ffmpeg.exe',
            tasks: tasks
          }
        };
        var nms = new NodeMediaServer(configMediaServer)
        nms.run();
      } else {
        // catch exception
      }
    }
  );
}
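I suspect the console windows come from the ffmpeg child processes spawned for each relay task. I don't know of a node-media-server option for this, but the underlying Node.js child_process API has a windowsHide flag that suppresses the console window on Windows. As an illustration of that flag only (not of node-media-server's configuration), spawning ffmpeg directly would look like this, where rtspUrl and rtmpUrl are placeholder variables:

const { spawn } = require('child_process');

// Illustrative only: spawn ffmpeg with windowsHide so Windows does not
// allocate a visible console window for the child process.
const ffmpeg = spawn(
  'C:\\FFmpeg\\bin\\ffmpeg.exe',
  ['-rtsp_transport', 'tcp', '-i', rtspUrl, '-c', 'copy', '-f', 'flv', rtmpUrl],
  { windowsHide: true }
);

ffmpeg.stderr.on('data', chunk => console.log(chunk.toString()));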

How to auto-reconnect if the Mongo connection fails in Node.js using Mongoose?

I have MongoDB up and running in a Docker container. I stop the container and Node returns a MongoError. I restart the container and Node continues to throw the same MongoError.
I would like it to reconnect when there is an issue.
const uri: string = this.config.db.uri;
const options = {
  useNewUrlParser: true,
  useCreateIndex: true,
  autoIndex: true,
  autoReconnect: true,
};
mongoose.connect(uri, options).then(
  () => {
    this.log.info("MongoDB Successfully Connected On: " + this.config.db.uri);
  },
  (err: any) => {
    this.log.error("MongoDB Error:", err);
    this.log.info("%s MongoDB connection error. Please make sure MongoDB is running.");
    throw err;
  },
);
How do I set up Mongoose to automatically reconnect when there is a connection failure to MongoDB?
I found my answer: instead of checking error events and reconnecting as others have suggested, there are some options you can set that will handle auto-reconnect.
Here is the set of Mongoose options I am now using:
const options = {
  useNewUrlParser: true,
  useCreateIndex: true,
  autoIndex: true,
  reconnectTries: Number.MAX_VALUE, // Never stop trying to reconnect
  reconnectInterval: 500,           // Reconnect every 500ms
  bufferMaxEntries: 0,
  connectTimeoutMS: 10000,          // Give up initial connection after 10 seconds
  socketTimeoutMS: 45000,           // Close sockets after 45 seconds of inactivity
}
You can test that it works by starting and stopping MongoDB in a container and checking your Node application.
For further information, refer to this part of the documentation: https://mongoosejs.com/docs/connections.html#options
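For completeness, here is a small sketch of how these options can be wired together with Mongoose's standard connection events, so you can log when the driver drops and re-establishes the connection. The URI is a placeholder and the options object is the one defined above:

const mongoose = require("mongoose");

const uri = "mongodb://localhost:27017/mydb"; // placeholder URI

mongoose.connect(uri, options)
  .then(() => console.log("MongoDB connected"))
  .catch((err) => console.error("Initial connection failed:", err));

// The driver emits these while it retries in the background.
mongoose.connection.on("disconnected", () => console.log("MongoDB disconnected"));
mongoose.connection.on("reconnected", () => console.log("MongoDB reconnected"));
mongoose.connection.on("error", (err) => console.error("MongoDB error:", err));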

Socket.get callback not triggered in socket.on function

I've been stuck on this issue for a while. The answer might be really basic, but I fail to understand what the problem is. As far as I understand, it executes the function but doesn't trigger the callback, and I don't know why.
My script aims to run both a TCP server, so that a device (a Raspberry Pi) can connect over a TCP socket, and a client that connects to a websocket on a Sails.js app.
I managed to have both of these running with the following code, but they only work separately (simultaneously, but separately). When I do a get outside the socket, everything works fine, but when I do it inside, the io.socket object just piles up the get requests in a requestQueue:
{ useCORSRouteToGetCookie: true,
url: 'http://localhost:1337',
multiplex: undefined,
transports: [ 'polling', 'websocket' ],
eventQueue: { 'sails:parseError': [ [Function] ] },
query:'__sails_io_sdk_version=0.11.0&__sails_io_sdk_platform=node&__sails_io_sdk_language=javascript',
_raw:
{ socket:
{ options: [Object],
connected: true,
open: true,
connecting: false,
reconnecting: false,
namespaces: [Object],
buffer: [],
doBuffer: false,
sessionid: '0xAlU_CarIOPQAGUGKQW',
closeTimeout: 60000,
heartbeatTimeout: 60000,
origTransports: [Object],
transports: [Object],
heartbeatTimeoutTimer: [Object],
transport: [Object],
connectTimeoutTimer: [Object],
'$events': {} },
name: '',
flags: {},
json: { namespace: [Circular], name: 'json' },
ackPackets: 0,
acks: {},
'$events':
{ 'sails:parseError': [Function],
connect: [Object],
disconnect: [Function],
reconnecting: [Function],
reconnect: [Function],
error: [Function: failedToConnect],
undefined: undefined } },
requestQueue:
[ { method: 'get', headers: {}, data: {}, url: '/', cb: [Function] },
{ method: 'get', headers: {}, data: {}, url: '/', cb: [Function] } ] }
The code is the following:
// library to connect to sailsjs websockets
var socketIOClient = require('socket.io-client');
var sailsIOClient = require('sails.io.js');
// library for the tcp server
var net = require('net');
// Instantiate the socket client (`io`)
// (for now, you must explicitly pass in the socket.io client when using this library from Node.js)
var io = sailsIOClient(socketIOClient);
// Set some options:
// (you have to specify the host and port of the Sails backend when using this library from Node.js)
io.sails.url = 'http://localhost:1337';
var server = net.createServer(function(tcpSocket) { // 'connection' listener
  // socket was successfully connected
  console.log('client connected');
  // notify on disconnection
  tcpSocket.on('end', function() {
    console.log('client disconnected');
  });
  // Handle incoming messages from clients.
  tcpSocket.on('data', function (data) {
    console.log(data.toString('utf8', 0, data.length));
    // if data is PING respond PONG
    if (data.toString('utf8', 0, 4) == 'PING') {
      console.log('I was pinged');
      tcpSocket.write('PONG\r\n');
    }
    console.log(io.socket); // debugging purpose
    // trigger a socket call on the sails app
    io.socket.get('/', function (body, JWR) {
      // display the result
      console.log('Sails responded with: ', body);
      console.log('with headers: ', JWR.headers);
      console.log('and with status code: ', JWR.statusCode);
    });
  });
});
server.listen(8124, function() { // 'listening' listener
  console.log('server bound');
});
It looks like your socket isn't autoconnecting. Try connecting manually:
// Instantiate the socket client (`io`)
// (for now, you must explicitly pass in the socket.io client when using this library from Node.js)
var io = sailsIOClient(socketIOClient);
// Set some options:
// (you have to specify the host and port of the Sails backend when using this library from Node.js)
io.sails.url = 'http://localhost:1337';
var socket = io.sails.connect();
socket.on('connect', function() {
  // ... connect TCP server and continue ...
});
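To make that concrete, here is a minimal sketch of the suggested approach: create the TCP server only once the Sails socket has connected, and use that connected socket inside the data handler. The URL, port, and PING handling are taken from the question; the rest is only a sketch of this suggestion, not a tested fix:

var socketIOClient = require('socket.io-client');
var sailsIOClient = require('sails.io.js');
var net = require('net');

var io = sailsIOClient(socketIOClient);
io.sails.url = 'http://localhost:1337';

var socket = io.sails.connect();
socket.on('connect', function() {
  // Only start accepting TCP clients once the Sails socket is ready.
  var server = net.createServer(function(tcpSocket) {
    tcpSocket.on('data', function(data) {
      if (data.toString('utf8', 0, 4) == 'PING') {
        tcpSocket.write('PONG\r\n');
      }
      // Forward the event to the Sails app over the already-connected socket.
      socket.get('/', function(body, JWR) {
        console.log('Sails responded with:', body, 'status:', JWR.statusCode);
      });
    });
  });
  server.listen(8124, function() {
    console.log('server bound');
  });
});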
I found a solution: I just got rid of sails.io.js and used plain socket.io, and it now works as intended. Feel free to explain, though, why it didn't work with sails.io.js.
// library to connect to sailsjs websockets
var socketIOClient = require('socket.io-client');
//var sailsIOClient = require('sails.io.js');
// library for the tcp server
var net = require('net');
var socket = socketIOClient.connect('http://localhost:1337', {
  'force new connection': true
});
var server = net.createServer(function(tcpSocket) { // 'connection' listener
  // socket was successfully connected
  console.log('client connected');
  // notify on disconnection
  tcpSocket.on('end', function() {
    console.log('client disconnected');
  });
  // Handle incoming messages from clients.
  tcpSocket.on('data', function (data) {
    console.log(data.toString('utf8', 0, data.length));
    // if data is PING respond PONG
    if (data.toString('utf8', 0, 4) == 'PING') {
      console.log('I was pinged');
      tcpSocket.write('PONG\r\n');
    }
    if (data.toString('utf8', 0, 4) == 'test') {
      socket.emit('test', { message: 'test' });
      //io.socket.disconnect();
    }
  });
});

socket.io client persistent retries to unreachable host

I'm trying to get a persistent connection from my socket.io-client (running on Node.js) to a remote websocket. I do not have control over the remote socket, and sometimes it can go down entirely. I would like to attempt to reconnect() whenever an error or disconnect occurs. In the following example, I'm testing the case where the remote host is refusing the connection. In this case, I would like to attempt to reconnect after 1 second. Instead, it calls reconnect a second time and then exits.
Here's the code:
var events = require('events'),
    util = require('util'),
    io = require('socket.io-client'),
    url = "ws://localhost:12345", // intentionally an unreachable URL
    socketOptions = {
      "transports" : [ "websocket" ],
      "try multiple transports" : false,
      "reconnect" : false,
      "connect timeout" : 5000
    };

// The goal is to have this socket attempt to connect forever
// I would like to do it without the built in reconnects, as these
// are somewhat unreliable (reconnect* events not always firing)
function Test() {
  var self = this;
  events.EventEmitter.call(self);
  var socket;

  function reconnect() {
    setTimeout(go, 1000);
  }

  function go() {
    console.log("connecting to", url, socketOptions);
    socket = io.connect(url, socketOptions);
    socket.on('connect', function() {
      console.log("connected! wat.");
    });
    socket.on('error', function(err) {
      console.log("socket.io-client 'error'", err);
      reconnect();
    });
    socket.on('connect_failed', function() {
      console.log("socket.io-client 'connect_failed'");
      reconnect();
    });
    socket.on('disconnect', function() {
      console.log("socket.io-client 'disconnect'");
      reconnect();
    });
  }

  go();
}
util.inherits(Test, events.EventEmitter);

var test = new Test();

process.on('exit', function() {
  console.log("this should never end");
});
When running it under node 0.11.0 I get the following:
$ node socketio_websocket.js
connecting to ws://localhost:12345 { transports: [ 'websocket' ],
'try multiple transports': false,
reconnect: false,
'connect timeout': 5000 }
socket.io-client 'error' Error: connect ECONNREFUSED
at errnoException (net.js:878:11)
at Object.afterConnect [as oncomplete] (net.js:869:19)
connecting to ws://localhost:12345 { transports: [ 'websocket' ],
'try multiple transports': false,
reconnect: false,
'connect timeout': 5000 }
this should never end
The ECONNREFUSED is an exception you don't handle.
Try with this:
process.on('uncaughtException', function(err) {
  if (err.code == 'ECONNREFUSED') {
    reconnect();
  }
});
Edit
Modify the options like this:
socketOptions = {
  "transports" : [ "websocket" ],
  "try multiple transports" : false,
  "reconnect" : false,
  'force new connection': true, // <-- Add this!
  "connect timeout" : 5000
};
and the reconnect function (see the comments for the explanation):
function reconnect() {
  socket.removeAllListeners();
  setTimeout(go, 1000);
}
Socket.io probably reuses the same connection instead of creating a new one; by forcing a new connection, the app works.
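Putting the pieces together, a minimal sketch of the retry loop with these changes might look like this. It keeps the legacy socket.io-client option names used above and only shows the parts that change, reusing the io, url, and socket variables from the original code:

var socketOptions = {
  "transports": ["websocket"],
  "try multiple transports": false,
  "reconnect": false,
  "force new connection": true, // force a fresh underlying connection on every attempt
  "connect timeout": 5000
};

function reconnect() {
  // Drop listeners from the failed socket so old handlers don't fire again,
  // then schedule a brand new connection attempt.
  socket.removeAllListeners();
  setTimeout(go, 1000);
}

function go() {
  socket = io.connect(url, socketOptions);
  socket.on('connect', function() { console.log('connected'); });
  socket.on('error', reconnect);
  socket.on('connect_failed', reconnect);
  socket.on('disconnect', reconnect);
}

process.on('uncaughtException', function(err) {
  // ECONNREFUSED surfaces as an uncaught exception in this old client version.
  if (err.code == 'ECONNREFUSED') {
    reconnect();
  }
});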
