I am stuck in a situation where I want to sync two MongoDB databases: one on my local machine and one remote (an mlab sandbox). For the sync I am doing the following:
First I dump the collection that has changes, using mongodump.
Then I restore that collection on the remote MongoDB using mongorestore.
But there are two problems I am facing.
The first is how to update a collection on the remote MongoDB.
For example: when a collection changes, should I replace the whole collection on the remote side, or is there another way? What is the best way of doing this?
The second problem is how to detect changes in a collection, or in the whole database. I am using the LoopBack framework and the event-stream npm module to send changes to the client side, but I am unable to read the change stream on the server side.
My server\boot\realtime.js is:
var es = require('event-stream');
var sync = require('../sync');

module.exports = function(app) {
  var completeOrder = app.models.completeOrder;
  completeOrder.createChangeStream(function(err, changes) {
    sync('completeOrder', function(data) {
      console.log(data);
    }, function(err) {
      console.log(err);
    });
    changes.pipe(es.stringify()).pipe(process.stdout);
  });
};
On the second problem: LoopBack does not send events when data changes in the underlying DB, only when the data changes through the model. MongoDB 3.6 provides change stream functionality, and you can use it to trigger a model change or perform your own logic.
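A minimal sketch of that approach (watchCollection is my own helper name, not a LoopBack API; the collection name comes from the question, and MongoDB change streams require version 3.6+ on a replica set):

```javascript
// Sketch: react to MongoDB change stream events.
// watchCollection accepts anything exposing watch(), so the wiring can be
// tried with a stub; in the real app pass db.collection('completeOrder').
function watchCollection(collection, onChange) {
  const changeStream = collection.watch();
  changeStream.on('change', (change) => {
    // change.operationType is e.g. 'insert', 'update', 'replace', 'delete'
    onChange(change);
  });
  return changeStream;
}
```

From the 'change' handler you could then push the document into your model or out to clients, instead of relying on LoopBack's model-level change stream.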
Related
I have set up a simple Node.js app as a proof of concept, where I want peers on a local network to sync a database using GUN.
I am new to GUN, so I am not sure if I am doing this correctly, but here is my code:
var Gun = require('gun')
const address = require('network-address')
const hashToPort = require('hash-to-port')

// get username from arg, e.g. node index myname
const username = process.argv[2]

// create GUN server on its own port
var server = require('http').createServer().listen(hashToPort(username))
var gun = Gun({web: server})

// listen for input from the console
process.stdin.on('data', (data) => {
  gun.get('hello').put({ word: data.toString(), user: username })
});

// output each update
gun.get('hello').on(function(data, key) {
  console.log(data.user + ' said: ' + data.word.toString())
})
The idea is that peers can drop out and reconnect and sync to the latest version of the database.
I run the app on two different machines on the local network and it works well; the database syncs.
However, if I close one app, then update the database in the app that is still open, and then restart the second app, the second app does not sync with the already-open one.
Is there a way to sync with the updated db when a new peer connects?
I hope that all makes sense. Please suggest if this is the wrong way to go about it.
@CUGreen I'm glad that the local area multicast sync is working!
If I understand your question right, it is that you want OLD data to be synced?
gun.get('hello').put(data) and .on(cb) update the same object. So technically you are syncing the whole database; you are just always getting the latest state. (Unless there is some other bug? Please let me know.)
What you probably want is .set(data) instead of .put(data): this adds a NEW record to a table on hello, which you can then query in full (old records plus future live inserts) with gun.get('hello').map().on(cb).
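A minimal sketch of that pattern (assumed usage: an in-memory Gun instance for illustration; in the question's app you would keep the {web: server} option and the console input handler):

```javascript
// Sketch of .set / .map, adapted from the answer above.
var Gun = require('gun')
var gun = Gun() // in-memory instance, just for illustration

var messages = gun.get('hello')

// .set adds a NEW record each time instead of overwriting one object
messages.set({ word: 'hi', user: 'alice' })
messages.set({ word: 'hello back', user: 'bob' })

// .map().on() replays every existing record and also fires for future
// inserts, which is what lets a reconnecting peer catch up on old data
messages.map().on(function (msg, id) {
  console.log(msg.user + ' said: ' + msg.word)
})
```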
I do not know if this is relevant, but you may find https://gun.eco/docs/Graph-Guide a nice introduction to put/set, etc.
And of course, if you need any assistance, there is a super friendly and active community at the http://chat.gun.eco chat room!
If there is a bug, please report it at https://github.com/amark/gun/issues
I am using PouchDB to sync a database between a device and a server.
When installing my App on a new device I need to pull down the user's settings document from the server. If I do the following on a new device when the App has previously been run on another device and created the user settings:
var _DB = new PouchDB(localDBName);
_DB.sync(realRemoteDB, options);
_DB.get(userSettingsDocumentName);
The _DB.get says the document doesn't exist. If I wait long enough, the sync completes, the server docs are loaded locally, and the .get works. How can I handle this other than putting in a long timeout?
PouchDB functions are mostly asynchronous. This means that when you fetch the document, the sync might not be complete yet.
Here's how you can write it, using the sync events together with promises:
var _DB = new PouchDB(localDBName);

_DB.sync(realRemoteDB, options).on('complete', function (info) {
  // sync is complete; the document now exists locally
  _DB.get(userSettingsDocumentName).then(function (doc) {
    // here you have the document
  }).catch(function (err) {
    // an error occurred while fetching the document
  });
}).on('error', function (err) {
  // the sync itself failed
});
I'm trying to set up a notification system for my web app. Currently, I want to add an extra field to my Mongo DB and have it update my front end in real time using socket.io.
I know Mongo previously had no functionality for listening for updates to its database, but has anything been released recently? Or is there another option for database triggers?
You can use Change Streams, which are supported in many languages.
Since you are using socket.io, here is the Node.js usage:
const collection = db.collection('inventory');
const changeStream = collection.watch();

changeStream.on('change', next => {
  // process the next change document
});
Here is the link: https://docs.mongodb.com/manual/changeStreams/
I read :
How do I manage MongoDB connections in a Node.js web application?
http://mongodb.github.io/node-mongodb-native/driver-articles/mongoclient.html
How can I set up MongoDB on a Node.js server using node-mongodb-native in an EC2 environment?
And I am really confused. How should I work with MongoDB from Node.js? I'm a rookie, and my question may look stupid.
var db = new db.MongoClient(new db.Server('localhost', 27017));
db.open(function(err, dataBase) {
  // all code here?
  dataBase.close();
});
Or, every time I need something from the db, should I call:
MongoClient.connect("mongodb://localhost:27017/myDB", function(err, dataBase) {
  // all code here
  dataBase.close();
});
What is the difference between open and connect? The manual says open "initializes" and connect "connects", but what exactly does that mean? I assume both do the same thing in different ways, so when should I use one instead of the other?
I also want to ask: is it normal that MongoClient needs 4 sockets? I am running two of my web servers at the same time; here's a picture:
http://i43.tinypic.com/29mlr14.png
EDIT:
I want to mention that this isn't really a problem (more of a doubt :D); my server works perfectly. I'm asking because I want to know whether I am using the MongoDB driver correctly.
Currently I use the first option: I initialize the Mongo driver at the beginning and put all the code inside its callback.
I'd recommend trying the MongoDB tutorial they offer. I was in the same boat, and it breaks things down nicely. In addition, there's this article on GitHub that explains the basics of DB connection.
In short, it does look like you're doing it right.
MongoClient.connect("mongodb://localhost:27017/myDB", function(err, dataBase) {
  // all code here
  var collection = dataBase.collection('users');
  var document1 = {'name': 'John Doe'};
  collection.insert(document1, {w: 1}, function(err, result) {
    console.log(err);
    // close only after the insert has finished
    dataBase.close();
  });
});
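On the "should I call connect every time" part of the question: a common pattern (a sketch of the idea, not an official driver API) is to connect once at startup and reuse the handle everywhere. The helper below takes the connect step as a parameter so the caching logic stands on its own; in a real app you would pass a function that calls MongoClient.connect with your URL:

```javascript
// Sketch: open the connection once, then hand the cached handle to all callers.
function makeGetDb(connect) {
  var cachedDb = null;
  return function getDb(callback) {
    if (cachedDb) return callback(null, cachedDb); // reuse the open connection
    connect(function (err, db) {
      if (err) return callback(err);
      cachedDb = db;
      callback(null, db);
    });
  };
}
```

Usage would look like var getDb = makeGetDb(function (cb) { MongoClient.connect('mongodb://localhost:27017/myDB', cb); }), and each request handler then calls getDb instead of reconnecting.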
You can still sign up for the free course M101JS: MongoDB for Node.js Developers, provided by the MongoDB guys.
Here is a short description:
This course will go over basic installation, JSON, schema design, querying, insertion of data, indexing and working with language drivers. In the course, you will build a blogging platform, backed by MongoDB. Our code examples will be in Node.js.
I had the same question. I couldn't find a proper answer in the Mongo documentation.
All the documentation says is to prefer creating a new db connection and then using open() (rather than using connect()):
http://docs.mongodb.org/manual/reference/method/connect/
I asked a question a few months ago to which Meteor seems to be the answer:
Which, if any, of the NoSQL databases can provide stream of *changes* to a query result set?
How does Meteor receive updates to the results of a MongoDB query?
Thanks,
Chris.
You want query.observe() for this. Say you have a Posts collection with a tags field, and you want to get notified when a post with the important tag is added.
http://docs.meteor.com/#observe
// collection of posts that includes an array of tags
var Posts = new Meteor.Collection('posts');

// DB cursor to find all posts with 'important' in the tags array
var cursor = Posts.find({tags: 'important'});

// watch the cursor for changes
var handle = cursor.observe({
  added: function (post) { ... },   // run when a post is added
  changed: function (post) { ... }, // run when a post is changed
  removed: function (post) { ... }  // run when a post is removed
});
You can run this code on the client if you want to do something in each browser when a post changes, or on the server if you want to, say, send an email to the team when an important post is added.
Note that added and removed refer to the query, not the document. If you have an existing post document and run
Posts.update(my_post_id, {$addToSet: {tags: 'important'}});
this will trigger the 'added' callback, since the post is getting added to the query result.
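The reverse also holds (a sketch using the same hypothetical my_post_id): pulling the tag back out takes the post out of the query's result set, so the removed callback fires.

```javascript
// Removing the tag drops the post from the cursor's result set,
// so the 'removed' callback runs even though the document still exists.
Posts.update(my_post_id, {$pull: {tags: 'important'}});
```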
Currently, Meteor really only works well with a single instance/process. In that case all queries go through this instance, and it can broadcast changes back to the other clients. In addition, it polls MongoDB every 10 seconds for changes made to the database by outside queries. There are plans for 1.0 to improve scalability and hopefully allow multiple instances to inform each other about changes.
DerbyJS, on the other hand, uses Redis PubSub.
From the docs:
On the server, a collection with that name is created on a backend Mongo server. When you call methods on that collection on the server, they translate directly into normal Mongo operations.

On the client, a Minimongo instance is created. Minimongo is essentially an in-memory, non-persistent implementation of Mongo in pure JavaScript. It serves as a local cache that stores just the subset of the database that this client is working with. Queries on the client (find) are served directly out of this cache, without talking to the server.

When you write to the database on the client (insert, update, remove), the command is executed immediately on the client, and, simultaneously, it's shipped up to the server and executed there too. The livedata package is responsible for this.
That explains client to server.
Server to client, from what I can gather, is handled by the livedata and mongo-livedata packages:
https://github.com/meteor/meteor/tree/master/packages/mongo-livedata
https://github.com/meteor/meteor/tree/master/packages/livedata
Hope that helps.