How to sync GUN db when new peer connects - node.js

I have set up a simple Node.js app as a proof of concept, where I want peers on a local network to sync a database using GUN.
I am new to GUN, so I am not sure if I am doing this correctly, but here is my code:
var Gun = require('gun')
const address = require('network-address')
const hashToPort = require('hash-to-port')

// get username from arg, e.g. node index myname
const username = process.argv[2]

// create a GUN server on its own port
var server = require('http').createServer().listen(hashToPort(username))
var gun = Gun({web: server})

// listen for input from the console and write it to the db
process.stdin.on('data', (data) => {
  gun.get('hello').put({ word: data.toString(), user: username })
});

// output each update
gun.get('hello').on(function(data, key) {
  console.log(data.user + ' said: ' + data.word.toString())
})
The idea is that peers can drop out and reconnect and sync to the latest version of the database.
I run the app on 2 different local network machines and it works well. The database is syncing.
If I close one app, then update the database on the open app, and then restart the 2nd app, the 2nd app does not sync with the already open app.
Is there a way to sync with the updated db when a new peer connects?
I hope that all makes sense. Please let me know if this is the wrong way to go about it.

@CUGreen I'm glad that the local area multicast sync is working!
If I understand your question right, it is that you want OLD data to be synced?
gun.get('hello').put(data) and .on(cb) update the same object. So technically, you are syncing the whole database, you're just always getting the latest state. (Unless there is some other bug? Please let me know).
What you probably want to do is .set(data) instead of .put(data); this will add a NEW record to a table on hello, which you can then query in full (old records plus future live inserts) with gun.get('hello').map().on(cb).
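A minimal sketch of that change against the snippet in the question (untested; it assumes the same username variable):

// append each line of input as a NEW record under 'hello'
process.stdin.on('data', (data) => {
  gun.get('hello').set({ word: data.toString(), user: username })
});

// .map().on(cb) replays the existing records and fires again for every
// new insert, so a peer that reconnects also receives the older data
gun.get('hello').map().on(function(data, key) {
  console.log(data.user + ' said: ' + data.word)
})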
I do not know if this is relevant, but you may find https://gun.eco/docs/Graph-Guide a nice introduction to put/set, etc.
And of course, if you need any assistance, there is a super friendly and active community at the http://chat.gun.eco chat room!
If there is a bug, please report it at https://github.com/amark/gun/issues

Related

Loopback event stream track changes in a model on server side

I am stuck in a situation where I want to sync two MongoDB databases, one on my local machine and the second remote (on an mlab sandbox). For the sync I am doing the following:
first I dump the collection that has changes using mongodump;
next, using mongorestore, I restore that collection on the remote MongoDB.
But there are two problems I am facing.
First, how can I update the collection on the remote MongoDB? For example: on a change, should I replace the whole collection on the remote side, or is there a better way? What is the best way of doing this?
The second problem is how to detect changes in a collection, or in the whole database. I am using the LoopBack framework and the event-stream npm module to send changes to the client side, but I am unable to read the change stream on the server side.
My server/boot/realtime.js is:
var es = require('event-stream');
var sync = require('../sync');

module.exports = function(app) {
  var completeOrder = app.models.completeOrder;
  completeOrder.createChangeStream(function(err, changes) {
    sync('completeOrder', function(data) {
      console.log(data);
    }, function(err) {
      console.log(err);
    });
    changes.pipe(es.stringify()).pipe(process.stdout);
  });
}
On the second problem: LoopBack doesn't send events when the data changes in the underlying DB, only when the change goes through a model. MongoDB 3.6 provides change stream functionality, and you can use it to trigger the model change / perform your functionality.
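A minimal sketch of watching the collection with the native MongoDB driver (assuming MongoDB 3.6+ running as a replica set; the connection string and database name are placeholders):

const { MongoClient } = require('mongodb');

MongoClient.connect('mongodb://localhost:27017', function(err, client) {
  if (err) throw err;
  // watch() opens a change stream over all writes to the collection
  var changeStream = client.db('mydb').collection('completeOrder').watch();
  changeStream.on('change', function(change) {
    // forward the change to the sync routine or to connected clients
    console.log('Change detected:', change.operationType);
  });
});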

Waiting for PouchDB get to get a synced version from the server

I am using PouchDB to sync a database between a device and a server.
When installing my app on a new device I need to pull down the user's settings document from the server. Suppose the app has previously been run on another device and created the user settings, and I do the following on a new device:
var _DB = new PouchDB(localDBName);
_DB.sync(realRemoteDB, options);
_DB.get(userSettingsDocumentName);
The _DB.get says the document doesn't exist. If I wait long enough the sync works and the server docs are loaded locally and the .get works. How can I handle this other than putting in a long timeout?
PouchDB functions are mostly asynchronous. This means that when you fetch the document, the sync might not be complete yet.
Here's how you should write it with promises:
var _DB = new PouchDB(localDBName);
_DB.sync(realRemoteDB, options).then(function(info) {
  // sync is complete (for a one-shot, non-live sync the returned
  // promise resolves once replication finishes)
  return _DB.get(userSettingsDocumentName);
}).then(function(doc) {
  // here you have the document
}).catch(function(err) {
  // an error occurred
});

Best NodeJS Workflow for team development

I'm trying to implement Node.js and Socket.io for real-time communication between two kinds of devices (PCs & smartphones) in my company's product.
Basically what I want to achieve is sending a notification to all online users when somebody changes something in a file.
All the basic functionality for saving the updates is already there, so when everything is stored and calculated, I send a POST request to my Node server saying that something changed and it needs to notify the users.
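For context, that flow reads roughly like this (a sketch only; the endpoint and event names are illustrative, using Express with the classic Socket.io API):

const express = require('express');
const app = express();
const server = require('http').createServer(app);
const io = require('socket.io')(server);

// the main product POSTs here after saving; broadcast to all online users
app.post('/file-changed', express.json(), function(req, res) {
  io.emit('file-changed', { file: req.body.file });
  res.sendStatus(204);
});

server.listen(3000);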
The problem is changing the code of the Node.js scripts. As long as I work alone, I can just upload the new files via FTP and restart the pm2 service, but when my colleagues start working on this story with me, we will have problems merging our changes without overlapping each other.
Launching a local server is also not possible, because we need the connection between our current server and the Node machine, and since our server is online it cannot access our localhosts.
Is there a way for a team to work together on the same Node server without overlapping each other?
Deploy your changes using some option other than FTP. For example:
You can use webdav-fs in authenticated or non-authenticated mode:
// Using authentication:
var wfs = require("webdav-fs")(
  "http://example.com/webdav/",
  "username",
  "password"
);

wfs.readdir("/Work", function(err, contents) {
  if (!err) {
    console.log("Dir contents:", contents);
  } else {
    console.log("Error:", err.message);
  }
});
putFileContents(remotePath, data [, options])
Writes data (a Buffer or String) to a remote file at remotePath. options has a property called format, which can be "binary" (the default) or "text".
var fs = require("fs");
var imageData = fs.readFileSync("someImage.jpg");

// `client` is an instance of the webdav client (see the references below)
client
  .putFileContents("/folder/myImage.jpg", imageData, { format: "binary" })
  .catch(function(err) {
    console.error(err);
  });
And use the callbacks to notify your team, or to lock files so that two people don't overwrite each other, as sketched below.
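A rough sketch of that locking idea with the lockfile package from the references (the lock path and wait time are illustrative, and the lock file would need to live on storage all deployers share):

var lockFile = require('lockfile');

// try to take the lock, waiting up to 10s for a colleague to release it
lockFile.lock('deploy.lock', { wait: 10000 }, function(err) {
  if (err) {
    return console.error('Someone else is deploying:', err.message);
  }
  // ...upload the changed files via webdav-fs here...
  lockFile.unlock('deploy.lock', function(unlockErr) {
    if (unlockErr) console.error(unlockErr);
  });
});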
References
webdav-fs
webdav
lockfile
Choosing Secure Passwords

Firebase / Node.js, correct usage of on.Disconnect()

I have a Node.js server that monitors a queue on Firebase to send GCM push notifications. It works fine.
It also updates an "online" status on Firebase via .onDisconnect(), so one can easily see whether the Node.js server is online and running.
Problem: after some time it will show "disconnected" even when the listener is still connected and running fine.
const NODESERVERONLINE = "NodeSeverStatus";

var ref = new Firebase(FBURL + FBKEY_GCM_QUEUE);
ref.child(NODESERVERONLINE).set("Online");
ref.child(NODESERVERONLINE).onDisconnect().set("Offline!");

ref.on("child_added", function(snapshot, prevChild) {
  // skip the status entry itself; DO_GCM_PUSH is the app's own push routine
  if (snapshot.key() != NODESERVERONLINE) DO_GCM_PUSH(snapshot.val());
}, function(errorObject) {
  console.log("Error reading Firebase: " + errorObject.code);
});
Initially the listener is running and NodeSeverStatus shows Online.
However, "after some time" (several hours), the listener is still running fine and the queue is being processed, but NodeSeverStatus now shows Offline.
I could move the online/offline code inside the listener itself, but that would just be an ugly hack, and it would presumably still have the same issue if there were no new queue posts within the timeout period.
What is best practice here? Thank you.
A quick guess is that your network connection gets interrupted briefly.
If your network connection flaps, the server will detect the disconnect and set Offline!.
The client will automatically reconnect, but you never set Online again.
So you'll want to listen for .info/connected and set Online there.
var ref = new Firebase("https://yours.firebaseio.com");
ref.child("NODESERVERONLINE").onDisconnect().set("Offline!");

ref.child(".info/connected").on("value", function(snapshot) {
  if (snapshot.val() === true) {
    ref.child("NODESERVERONLINE").set("Online");
  }
});
See https://www.firebase.com/docs/web/guide/offline-capabilities.html

What's the proper way of using Postgres connections in Node?

I was wondering if anyone can help me understand the proper way of maintaining multiple connections to multiple Postgres servers via https://github.com/brianc/node-postgres.
Obviously, when running a Node server for a long duration, we want to make sure everything stays clean with no leaks, so I am wondering what the proper pattern is.
Please remember that my Node server will need to connect to 7-8 Postgres servers.
https://github.com/brianc/node-postgres supports the idea of pools. I am wondering: do I just connect to all servers on initial Node server setup, maintain the open connections, and have each function ask for a pool when it needs to talk to a server?
In other words, am I supposed to call pg.connect every time I make a server query? (minus the var pg and var connectionString which could be global)
Can't I just have a single connection be on and ready?
var assert = require('assert');
var pg = require('pg');
var connectionString = "pg://brian:1234@localhost/postgres";

pg.connect(connectionString, function(err, client, done) {
  client.query('SELECT name FROM users WHERE email = $1', ['brian@example.com'], function(err, result) {
    assert.equal('brianc', result.rows[0].name);
    done(); // return the client to the pool
  });
});
Code snippets are greatly appreciated.
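For context, the pooling pattern the question refers to looks roughly like this in newer node-postgres (a sketch only, not a definitive answer; connection strings and pool names are illustrative):

const { Pool } = require('pg');

// one pool per Postgres server, created once at startup and reused everywhere
const pools = {
  users: new Pool({ connectionString: 'postgres://user:pass@host1/users' }),
  orders: new Pool({ connectionString: 'postgres://user:pass@host2/orders' })
  // ...one entry per server (7-8 in this case)
};

// each query checks a client out of the pool and returns it automatically
pools.users
  .query('SELECT name FROM users WHERE email = $1', ['brian@example.com'])
  .then(function(result) { console.log(result.rows[0].name); })
  .catch(function(err) { console.error(err); });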
