I'm having a weird issue; I believe I am just overlooking something or do not understand Node's module.exports mechanism correctly.
I am trying to create a database service that holds my PouchDB connections, so that other services can simply require the database service. I am also replicating the remote database to a local database for performance reasons.
The database service looks something like this.
database.service.js
var PouchDB = require('pouchdb');
// config and logger are assumed to be set up elsewhere in the project

var __remote = new PouchDB(config.connectionstring.datastore);
var __local = new PouchDB("pouch-local-database");
var __replicationHandle = __startReplication();

function __startReplication() {
    return __local.sync(__remote, {
        live: true,
        retry: true
    }).on('change', function (change) {
        logger.info('Database Sync New Data');
    }).on('paused', function (info) {
        logger.info('Database Sync Paused');
    }).on('active', function (info) {
        logger.info('Database Sync Active');
    }).on('error', function (err) {
        logger.error('Database Sync Error', err);
    });
}
function __resync() {
    __replicationHandle.cancel();
    __local.destroy()
        .then(function (data) {
            __local = new PouchDB("pouch-local-database");
            __replicationHandle = __startReplication();
        })
        .catch(function (err) {
            logger.error('Database Destroy Error', err);
        });
}

module.exports = {
    data: __local,
    resync: __resync
};
The two issues I am getting are:
1. The replication works like a charm when I host it locally, and the app responsiveness is very good; but when I upload it to my Azure Web App (free tier), the replication does not work and it seems that I cannot use the local database. I don't think it is a permission problem, since my logger is writing to a file in the wwwroot directory.
2. When I execute the resync method, all other services depending on the database stop working, giving a "database destroyed" error. This does not make sense to me, since I am assigning a new PouchDB instance after destroying the old one.
I have considered replacing the module.exports with something like:
module.exports = {
    data: new PouchDB("pouch-local-database"),
    resync: __resync
};
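If the cause of issue 2 is that module.exports captures the value of __local at export time (so consumers keep a reference to the destroyed instance after resync reassigns the variable), another option I have considered is exporting a getter, so callers always reach the current instance. A rough, untested sketch:

module.exports = {
    // callers use db.data() instead of db.data, so they always get
    // whatever instance __resync most recently assigned to __local
    data: function () {
        return __local;
    },
    resync: __resync
};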
Thanks guys! I am overlooking something obvious!
I'm developing a NodeJS web app to receive real-time updates from the Firestore DB through the Admin SDK.
This is the init code for the Firestore object. It's executed just once, when the app is deployed (on AWS Elastic Beanstalk):
const admin = require('firebase-admin');
var serviceAccount = require('./../key.json');

admin.initializeApp({
    credential: admin.credential.cert(serviceAccount)
});

var db = admin.firestore();
FUNC.firestore = db;
Then I use this Firestore object in a WebSocket communication to send real-time updates to the browser. The idea is to use the server as a proxy between the browser and Firestore.
socket.on('open', function (client) {
    var query = FUNC.firestore.collection("notifications").doc(client.user.id.toString()).collection("global");
    query.onSnapshot(querySnapshot => {
        querySnapshot.docChanges().forEach(change => {
            client.send({ id: change.doc.id, body: change.doc.data(), type: change.type });
        });
    }, err => {
        console.log(`Encountered error: ${err}`);
    });
});

socket.on('close', function (client) {
    var unsub = FUNC.firestore.collection("notifications").doc(client.user.id.toString()).collection("global").onSnapshot(() => {
    });
    unsub();
});
It works well for a while, but after a few hours the client stops receiving onSnapshot() updates, and after a while the server logs the error: Encountered error: Error: 10 ABORTED: The operation was aborted.
What's wrong? Should I initialize Firestore on each connection? Is there some lifecycle mistake?
Thank you
EDIT (A very bad solution)
I've tried to create a single firebase-admin app instance for each logged-in user and changed my code this way:
const admin = require('firebase-admin');
var serviceAccount = require('./../key.json');

admin.initializeApp({
    credential: admin.credential.cert(serviceAccount)
});

FUNC.getFirestore = function (user) {
    try {
        user.firebase = admin.app(user.id.toString());
        return user.firebase.firestore();
    } catch (e) {
        // no app exists for this user yet; fall through and create one
    }
    var app = admin.initializeApp({
        credential: admin.credential.cert(serviceAccount)
    }, user.id.toString());
    user.firebase = app;
    return user.firebase.firestore();
};

FUNC.removeFirebase = function (user) {
    if (user.firebase) {
        user.firebase.delete();
    }
};
And then the socket listeners:
self.on('open', function (client) {
    var query = FUNC.getFirestore(client.user).collection("notifications").doc(client.user.id.toString()).collection("global");
    query.onSnapshot(querySnapshot => {
        querySnapshot.docChanges().reverse();
        querySnapshot.docChanges().forEach(change => {
            client.send({ id: change.doc.id, body: change.doc.data(), type: change.type });
        });
    }, err => {
        console.log(`Encountered error: ${err}`);
    });
});

self.on('close', function (client) {
    var unsub = FUNC.getFirestore(client.user).collection("notifications").doc(client.user.id.toString()).collection("global").onSnapshot(() => {
    });
    unsub();
    FUNC.removeFirebase(client.user);
});
So when a client disconnects for any reason, the server removes its Firebase app. It works, but I've noticed a huge memory leak on the server. I need some help.
UPDATED ANSWER
After much research I've come to understand that this kind of approach is wrong. Of course, the old answer below could work as a workaround, but it is not the real solution to the problem, because Firestore was not designed for a chain like: Firestore <--(Admin SDK)--> Server <--(WebSocket)--> Client.
To build the communication correctly, I studied and applied Firestore Security Rules (https://firebase.google.com/docs/firestore/security/get-started) together with custom token generation (https://firebase.google.com/docs/auth/admin/create-custom-tokens). So the correct flow is:
Client login request --> Server + Admin SDK generate a custom auth token and return it to the client
Then the real-time communication happens only between the client and Firestore itself: Client + custom auth token <--(Firebase JS SDK)--> Firestore DB
As you can see, the server is no longer involved in the real-time communication; the client receives updates directly from Firestore.
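A minimal sketch of that flow, assuming the Admin SDK app from above is already initialized (uid stands for whatever identifies the logged-in user):

// server side: mint a custom token for the authenticated user
admin.auth().createCustomToken(uid)
    .then(customToken => {
        // return customToken to the client, e.g. in the login response
    });

// client side (Firebase JS SDK): sign in with the custom token,
// then subscribe to Firestore directly
firebase.auth().signInWithCustomToken(customToken)
    .then(() => {
        firebase.firestore()
            .collection("notifications").doc(uid).collection("global")
            .onSnapshot(snapshot => { /* handle updates, guarded by Security Rules */ });
    });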
OLD ANSWER
Finally I can answer myself. First of all, the second solution I tried is a very bad one, because each new app created through the Admin SDK is stored in RAM; with 20/30 users the app reaches more than 1 GB of RAM, which is absolutely unacceptable.
So the first implementation was the better solution; I had simply got the register/unregister lifecycle of the onSnapshot listener wrong. Each onSnapshot() call returns a different unsubscribe function, even when called on the same reference. So, instead of closing the existing listener when the socket closed, I was opening another one. This is how it should be:
socket.on('open', function (client) {
    var query = FUNC.firestore.collection("notifications").doc(client.user.id.toString()).collection("global");
    client.user.firestoreUnsub = query.onSnapshot(querySnapshot => {
        querySnapshot.docChanges().forEach(change => {
            client.send({ id: change.doc.id, body: change.doc.data(), type: change.type });
        });
    }, err => {
        console.log(`Encountered error: ${err}`);
    });
});

socket.on('close', function (client) {
    client.user.firestoreUnsub();
});
After almost 48 hours, the listeners still work without problems and no memory leaks occur.
The project in question uses 11 PouchDBs. To ensure syncing, all 11 DBs were instantiated (with sync) when the Angular 5 application was loaded/bootstrapped. Since sync did not function (due to the limitations set by browsers), we moved towards socket-pouch as a solution. We disabled sync on all DBs and incorporated socket-pouch sync on only one DB. The socketPouchServer runs on localhost:5000 and the CouchDB is hosted on DigitalOcean.
On running the system, the following logs are observed in the browser. As you can see, an "aborting" error is being logged:
https://user-images.githubusercontent.com/26055473/38247973-8eef489a-3764-11e8-8411-6b5b20436d19.png
The code for this is:
import PouchDB from 'pouchdb';
import PouchDBFind from 'pouchdb-find';
import SocketPouchClient from 'socket-pouch/client';

PouchDB.plugin(PouchDBFind);
PouchDB.adapter('socket', SocketPouchClient);
PouchDB.debug.enable('pouchdb:socket:*');

this.dailyMovementDB = new PouchDB(`${username}_${environment.REQUIRED_DB_VERSION_NUMBER}_daily-movement`, { auto_compaction: true });
this.dailyMovementDBRemote = new PouchDB({
    adapter: 'socket',
    name: `${username}_${environment.REQUIRED_DB_VERSION_NUMBER}_daily-movement`,
    url: `${environment.REMOTE_COUCH_DB_BASE_URL}`
});
var syncHandler = this.dailyMovementDB.replicate.to(this.dailyMovementDBRemote)
    .on('change', function (change) {
        // yo, something changed!
        console.log('yo, something changed', change);
        instance.autosaveMessageService.syncingProcessEndedSubject.next(false);
    }).on('paused', function (info) {
        // replication was paused, usually because of a lost connection
        console.log('replication was paused, usually because of a lost connection', info);
        instance.autosaveMessageService.syncingProcessEndedSubject.next(true);
    }).on('active', function (info) {
        // replication was resumed
        console.log('replication was resumed', info);
        instance.autosaveMessageService.syncingProcessStartedSubject.next();
    }).on('denied', function (info) {
        // a document failed to replicate, e.g. due to permissions
        console.log('denied', info);
        instance.autosaveMessageService.syncingProcessEndedSubject.next(false);
    }).on('complete', function (info) {
        // handle complete
        console.log('handle complete', info);
        instance.autosaveMessageService.syncingProcessEndedSubject.next(false);
    }).on('error', function (err) {
        // totally unhandled error (shouldn't happen)
        console.log(err);
        // instance.createDailyMovementPouchDBs(username);
    });
And the following logs appear on localhost:5000 (the socketPouchServer):
https://user-images.githubusercontent.com/26055473/38248011-ac49b9ca-3764-11e8-8501-74576c8c1e1f.png
The following is the code for the socketPouchServer:
var socketPouchServer = require('socket-pouch/server');
const PouchDB = require('pouchdb');

socketPouchServer.listen(5000, {
    remoteUrl: 'http://remoteurl:5984'
}, () => {
    console.log('Hi');
});
Please advise on how to resolve this issue.
First of all, this is one of my first projects in Node.js so I'm very new to it.
The project I want to make is a SOAP interface (I know, SOAP... backwards compatibility, huh?) that connects to an Oracle database.
So I have a WSDL describing what these functions look like (validation for addresses and stuff) and I have a connection to the database.
Now, when using the soap npm module, you need to create a server and listen using a service that allows you to respond to requests. I have a separate file that contains my SOAP service, but this service should query the database to get its results.
How would I go about 'injecting' my database service into my SOAP service, so that whenever a SOAP call comes in it is dispatched to the correct method in my database service?
This is what my code looks like:
databaseconnection.js
var oracledb = require('oracledb');
var dbConfig = require('../../config/development');

var setup = exports.setup = (callback) => {
    oracledb.createPool(
        {
            user: dbConfig.user,
            password: dbConfig.password,
            connectString: dbConfig.connectString
        },
        function (err, pool) {
            if (err) { console.error(err.message); return; }
            pool.getConnection(
                function (err, connection) {
                    if (err) {
                        console.error(err.message);
                        return callback(null);
                    }
                    return callback(connection);
                }
            );
        }
    );
};
databaseservice.js
var DatabaseService = function (connection) {
    this.database = connection;
};

function doSomething(callback) {
    if (!this.database) { console.log('Database not available.'); return; }
    this.database.execute('SELECT * FROM HELP', function (err, result) {
        callback(result);
    });
}

module.exports = {
    DatabaseService: DatabaseService,
    doSomething: doSomething
};
soapservice.js
var myService = {
    CVService: {
        CVServicePort: {
            countryvalidation: function (args, cb, soapHeader) {
                console.log('Validating Country');
                cb({
                    name: args
                });
            }
        }
    }
};
server.js
app.use(bodyParser.raw({ type: function () { return true; }, limit: '5mb' }));

app.listen(8001, function () {
    databaseconnection.setup((callback) => {
        var temp = databaseservice.DatabaseService(callback);
        soapservice.Init(temp);
        var server = soap.listen(app, '/soapapi/*', soapservice.myService, xml);
        databaseservice.doSomething((result) => {
            console.log(result.rows.length, ' results.');
        });
    });
    console.log('Server started');
});
How would I go about wiring databaseservice.doSomething() into the countryvalidation SOAP method, instead of just returning name: args?
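Something like the following is what I have in mind for soapservice.js (an untested sketch; the Init function and the result mapping are just my guesses):

var database;

// called once from server.js with the constructed database service
exports.Init = function (databaseService) {
    database = databaseService;
};

exports.myService = {
    CVService: {
        CVServicePort: {
            countryvalidation: function (args, cb, soapHeader) {
                console.log('Validating Country');
                database.doSomething(function (result) {
                    // hypothetical mapping from the query result to the SOAP response
                    cb({
                        name: result.rows.length + ' rows'
                    });
                });
            }
        }
    }
};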
Also: I feel like the structure of my code is very, very messy. I tried finding good examples online on how to structure the code, but as for services and database connections and how to combine them, I didn't find much. Any comments on this structure are very welcome; I'm here to learn, after all.
Thank you
Dieter
The first thing I see that looks a little off is databaseconnection.js. It should be creating the pool, but that's it. Generally speaking, a connection should be obtained from the pool when a request comes in and released when you're done using it to service that request.
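A minimal sketch of that pattern with node-oracledb (reusing the config from the question; the query is a placeholder):

var oracledb = require('oracledb');
var dbConfig = require('../../config/development');

// create the pool once, at application startup
oracledb.createPool({
    user: dbConfig.user,
    password: dbConfig.password,
    connectString: dbConfig.connectString
}, function (err, pool) {
    if (err) { console.error(err.message); return; }

    // then, for each incoming request:
    pool.getConnection(function (err, connection) {
        if (err) { console.error(err.message); return; }
        connection.execute('SELECT * FROM HELP', function (err, result) {
            // always release the connection back to the pool when done
            connection.release(function (releaseErr) {
                if (releaseErr) { console.error(releaseErr.message); }
            });
            if (err) { console.error(err.message); return; }
            console.log(result.rows.length, 'rows');
        });
    });
});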
Have a look at this post: https://jsao.io/2015/02/real-time-data-with-node-js-socket-io-and-oracle-database/ There are some sample apps you could have a look at that might help. Between the two demos, the "employees-cqn-demo" app is better organized.
Keep in mind that the post is a little dated now; we've since made enhancements to the driver that make it easier to use. It's on my list to do a post on how to build a RESTful API with Node.js and Oracle Database, but I haven't had a chance to do it yet.
I am using Promised-Mongo to connect to MongoDB with Promises from NodeJS backend code. It worked fine until I enabled MongoDB's client access control. When I run this code, I get a "could not authenticate" message:
var pmongo = require('promised-mongo').compatible();
var db = pmongo('myusername:mypassword@localhost/mydb', ['candidates']);

db.candidates.save(req.body)
    .then(function () {
        // never reached here
    })
    .catch(function (e) {
        // it reached here, where e.message says "could not authenticate"
    });
Pure MongoDB code (i.e. no Promises...) works fine:
var mongodb = require('mongodb');
var uri = 'mongodb://myusername:mypassword@localhost/mydb';

mongodb.MongoClient.connect(uri, function (err, db) {
    if (err) {
        // never reached here
    }
    var candidates = db.collection('candidates');
    candidates.insert(req.body, function (err, result) {
        if (err) {
            // never reached here
        }
        res.send('{result: success}');
    });
});
Any idea?
Per several issues in the GitHub repository (see here and here), it looks like using this library with authentication is totally broken. Per the second link, most people seem to be wrapping the official library with a promise via something like promisify, Bluebird, or a thin custom wrapper.
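For illustration, a thin custom wrapper over the official driver might look something like this (a sketch reusing the connect and insert calls from the question):

var mongodb = require('mongodb');

// promisify MongoClient.connect by hand
function connect(uri) {
    return new Promise(function (resolve, reject) {
        mongodb.MongoClient.connect(uri, function (err, db) {
            if (err) { return reject(err); }
            resolve(db);
        });
    });
}

connect('mongodb://myusername:mypassword@localhost/mydb')
    .then(function (db) {
        return new Promise(function (resolve, reject) {
            db.collection('candidates').insert(req.body, function (err, result) {
                if (err) { return reject(err); }
                resolve(result);
            });
        });
    })
    .then(function (result) {
        res.send('{result: success}');
    })
    .catch(function (e) {
        // authentication errors from the driver surface here
        console.error(e.message);
    });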
I am trying to run the simplest hello world with Node.js and the mssql package.
https://www.npmjs.org/package/mssql
I create a new folder, with an empty JS file (app.js)
I copy and paste the sample from the mssql package page to the js file.
I only change the config object with my DB connection settings.
I run npm install mssql which is successful.
I run node app.js
What happens is that the code doesn't get into the callback after creating a connection. So in the code below:
var connection = new sql.Connection(config, function(err) {
    alert(1);
    ...
    //more code...
});
I never get to the alert, and there are no exceptions or errors either.
I am probably missing something... Can you please help me spot it?
Update: I should mention that the DB is on Azure...
Try this on your server side; it works fine on my end:
var sql = require("mssql");

var dbConfig = {
    user: 'sa',
    password: 'password1',
    server: 'serverName',
    database: 'DBName'
};

var connection = new sql.Connection(dbConfig, function (err) {
    console.log(err);
    var request = new sql.Request(connection);
    request.query("Select 'Hello World' as var1", function (err, recordset, returnValue) {
        if (!err) {
            console.log(recordset);
        } else {
            console.log(err);
        }
    });
});
OK, after digging a bit in the docs for Tedious, I found out that if the DB is on Azure you must include options: {encrypt: true} in your configuration object.
Now everything is working as expected.
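For reference, here is the shape of the config object from the answer above with that option added (the server name is a placeholder for an Azure SQL hostname):

var dbConfig = {
    user: 'sa',
    password: 'password1',
    server: 'yourserver.database.windows.net', // placeholder Azure SQL hostname
    database: 'DBName',
    options: {
        encrypt: true // required when the DB is hosted on Azure
    }
};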