Is there an easy way to manage offline data with a web app and synchronize with a server when there is a connection? I have been looking at Meteor, CouchDB and the like, but I'm still not sure what the least painful way would be.
I could of course implement it myself with sockets or something similar, but if something is already built for this purpose, I don't see a reason to do it again.
I'm planning to work with Node as the server.
Thanks
You're talking about two things: 1) how to store/persist data if/when offline (a storage mechanism), and 2) how to synchronize with a server when online (a communication mechanism). The answer to 1 is some kind of local storage, and there are several ways of doing that (localStorage, WebSQL, filesystem APIs, etc.) depending on your platform. The answer to 2 really depends on how urgent your synchronization needs are, but in general you can use HTTP itself with periodic (long-) polling, WebSockets and similar.
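As a rough illustration of how those two pieces can fit together, here is a minimal browser-side sketch: it queues writes in localStorage while offline and flushes them over plain HTTP when the `online` event fires. The `/api/sync` endpoint and the payload shape are placeholders, not part of any particular library.

```javascript
// Minimal offline queue: persist locally, flush to the server when back online.
const QUEUE_KEY = 'pendingChanges';

function queueChange(change) {
  const queue = JSON.parse(localStorage.getItem(QUEUE_KEY) || '[]');
  queue.push(change);
  localStorage.setItem(QUEUE_KEY, JSON.stringify(queue));
  if (navigator.onLine) flushQueue();
}

async function flushQueue() {
  const queue = JSON.parse(localStorage.getItem(QUEUE_KEY) || '[]');
  while (queue.length > 0) {
    try {
      // '/api/sync' is a placeholder endpoint on your Node server.
      await fetch('/api/sync', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(queue[0]),
      });
    } catch (err) {
      return; // still offline (or the request failed); keep the item queued
    }
    queue.shift();
    localStorage.setItem(QUEUE_KEY, JSON.stringify(queue));
  }
}

// Flush whatever accumulated while offline as soon as connectivity returns.
window.addEventListener('online', flushQueue);
```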
On top of both storage and communication mechanisms there are numerous libraries that make the job simpler, like Meteor (communication) and CouchDB (storage), and many more besides. There are even libraries that take care of the actual synchronization mechanism (with possible conflict resolution as well), but this very much depends on your actual application.
Updated: This framework looks promising, but I haven't tested it myself:
http://blog.nateps.com/announcing-racer-experimental-realtime-model
You might want to look at cloud services as well. These are best if you are developing a new application as they push you more to a serverless model, and of course you have to be happy using a service.
Simperium is an interesting cloud service - the only one I can find today that does syncing (unlike Firebase and Spire.io, which are similar in other respects). For iOS it includes offline storage, while for JavaScript clients you'd need to cover the local storage yourself using HTML5 features. Backbone.js seems to have some support for this, and Simperium can integrate with Backbone, using a similar API style.
For non-cloud options, DerbyJS is an open source project that includes Racer, the data synchronization library mentioned in the earlier answer - both are under rapid development and not yet complete, but they look interesting if your timescales allow, and they don't require a cloud service. There is a comparison of DerbyJS to Meteor that is useful - although it's written by the DerbyJS developers it's not too biased.
I also looked at CouchDB, which has some interesting built-in replication features, but I didn't like its use of indexes that are updated lazily when a query needs them (or by a batch process), and I wasn't happy with exposing the server DB directly to clients to enable replication/sync. Generally I think it's best to decouple the client side local storage from the server side DB, and of course for a web app it would be hard to use CouchDB on the client.
Related
I have a local VPS in my country that hosts and provides my Node.js REST API.
However, soon I will need to open it up to other countries.
That means remote clients will be requesting my services.
Since they are far away, the connection will probably be slow.
How can I avoid this? Maybe I need more servers located in their countries too, but then how can the data be shared through one DB?
I am not looking for a full tutorial on how to do this (though that would be nice to have); I am looking for information about the methodology.
What do you recommend: keep buying servers in remote countries and share their data between them somehow, or choose a cloud service like Firebase? How do cloud services work in the first place?
Without going into too much detail on each item, here are some key points that I think you should focus your learning on to solve your problem.
For data storage, look into Firestore (not the JSON Realtime Database), as Firestore is globally scalable.
For your REST endpoints I would use Google Cloud Functions, but without knowing the nature of your application it's hard to say whether that's suitable. The key to reaching global scale is having cacheable endpoints; then you are leveraging Google's global CDN, which is much faster than hitting the origin server. Note: the Firebase Cloud Functions infrastructure WILL face cold-start issues, which may or may not be a problem for you.
Cache invalidation is a little lacking, so you can use longer max-age cache settings but combine them with cache busting and/or the stale-while-revalidate header directive to help with this.
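As a rough sketch of what those cache headers can look like on a Firebase HTTPS function (the `/items` route and the max-age values are placeholders, not a recommendation):

```javascript
const functions = require('firebase-functions');
const express = require('express');

const app = express();

// Hypothetical read-only endpoint; anything cacheable should set Cache-Control
// so the CDN can serve it without hitting the origin function every time.
app.get('/items', (req, res) => {
  // s-maxage controls the shared/CDN cache; stale-while-revalidate lets the CDN
  // serve a stale copy while it refreshes the entry in the background.
  res.set('Cache-Control', 'public, max-age=60, s-maxage=300, stale-while-revalidate=600');
  res.json({ items: [] }); // placeholder payload
});

// Expose the Express app as a single HTTPS Cloud Function.
exports.api = functions.https.onRequest(app);
```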
There is some great info here https://www.youtube.com/watch?v=dbV-293m1dQ that covers some of what I have mentioned in more detail.
At this point, I'm convinced that declarative bindings backed by a robust data query service is the secret sauce for writing scalable rich client applications for the web.
Obviously there are many options for declarative data binding (Knockout JS and Rivets for Backbone, to name just a few). However, when it comes to querying the server, caching data and tracking changes on the client, the only modular solution that looks halfway mature seems to be Breeze JS. And yet, while it claims not to dictate server technology, all the documentation examples show Breeze running with .NET.
What requirements, API-related or otherwise, must a server fulfill in order to serve as an endpoint for a Breeze application? Is implementing the OData protocol enough? Are there any examples out there to light the way? Or other libraries solving this problem that I've missed?
You can use Node.js as an OData server with JayData:
http://jaydata.org/blog/install-your-own-odata-server-with-nodejs-and-mongodb
It's free and open source.
Yes, OData is sufficient. However, we are still working on OData save support (querying is fine, of course).
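For a rough idea of what the client side looks like against a generic OData endpoint, here is a hedged sketch based on Breeze's documented OData adapter. The service URL and entity name are placeholders, and the exact setup may differ between Breeze versions:

```javascript
// Tell Breeze to use its OData data service adapter instead of the default.
breeze.config.initializeAdapterInstance('dataService', 'OData', true);

// Point an EntityManager at any OData-compliant endpoint (placeholder URL).
const manager = new breeze.EntityManager('http://localhost:3000/odata');

// Build a query; Breeze caches the results and tracks changes on the client.
const query = breeze.EntityQuery.from('Products')
  .where('Name', 'startsWith', 'A')
  .orderBy('Name');

manager.executeQuery(query)
  .then(data => console.log('fetched', data.results.length, 'products'))
  .catch(err => console.error(err));
```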
Sorry for the delay in getting out non-.NET samples. We are definitely committed to an open, pluggable back-end and will be releasing more samples in the next few weeks.
Also, please vote for these features (or submit your own) on our UserVoice feedback page. This helps us prioritize what to work on next. Thanks!
I have an app that I would like to create, but I am not sure how to go about it. I am using Node.js and would like to use CouchDB, but if something like MongoDB or Riak would be a better choice then I'm willing to hear ideas. I have a site, say
cool.com
and on there is a CouchDB instance, as well as a site to manage a store - say a shopping cart. The DB houses all the store's items and data. The app itself has an admin backend to manage that data and can change items. What I would like to be able to do is have the user be disconnected from the internet and still have the admin backend work. I realize that for this to work I need to use a client-side framework with my models/routes/controllers/whatever. But what I am not sure of is how to let the site function while offline. CouchDB, if installed locally, can sync the data from local to remote when back online, and if the admin user is on a computer I could have them install Couch, but that could be messy.
Also, what if the admin user is on a tablet or a phone? Would I need to have an actual mobile app and a desktop app to do this? Is there some way I can set this up so it is seamless to the end user? I would also like this to be offline for end users too, but the bigger audience is the admin.
Another use case: an in-store POS system where the power goes out. The POS system can be loaded from the web onto a tablet and they can still make card-based sales even if the wifi is out, because the app is available offline.
I'm just not sure how to do this. Let's assume I need a client framework that can handle the data as well as the backend - something like Ember or Angular. There are also all-in-one stacks like Meteor and Derby JS, but those aren't fully offline; they're built for the appearance of real time. Though Meteor does have minimongo, so it might be worth looking into.
I was hoping someone could help me figure out how to get this setup to work, preferably with Couch, but other NoSQL stores would work too if I can have a way to sync the data.
I'm not sure if it would work for you, but I have been thinking about such an application for quite a long time now and have been doing some research on what's possible. The best solution I could come up with is using a server with a CouchDB and writing the application client side. Then, for the data storage, use PouchDB and synchronize it regularly with your server-side CouchDB whenever the app is online. I know PouchDB is at an early stage and not production ready, but if you are willing to put some work into it I'd say it's doable.
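Here is roughly what that sync setup looks like with PouchDB's replication API (the remote URL is a placeholder for wherever your CouchDB lives):

```javascript
const PouchDB = require('pouchdb'); // in the browser, include the pouchdb script instead

// Local database lives in the browser (IndexedDB/WebSQL) or on disk under Node.
const local = new PouchDB('store');

// Placeholder URL for the server-side CouchDB the app syncs against.
const remote = new PouchDB('https://cool.com/couchdb/store');

// Two-way, continuous replication; 'retry' keeps trying after the app goes
// offline and comes back, which is exactly the disconnect/reconnect case above.
local.sync(remote, { live: true, retry: true })
  .on('change', info => console.log('replicated a change', info.direction))
  .on('error', err => console.error('sync error', err));

// Writes always go to the local DB, so the admin backend keeps working offline.
local.put({ _id: 'item:123', name: 'Widget', price: 9.99 })
  .catch(err => console.error(err));
```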
If you want clients that work seamlessly as they go offline and come online (like a POS with the power out), then I would recommend making the app primarily work off local storage with background publishing or synchronization to the cloud.
Local storage options could be anything from something light like SQLite, SQL Server Express or Firebird to NoSQL options like Mongo, CouchDB, etc.
But for the client or device, consider the ease of configuration and weight of the option. You also need to consider the type of clients - do you have many platforms varying from devices to PCs? You don't want something that has a heavy config and runtime footprint. That's fine on the service side.
On the service side, consider the nature of your data and whether it's better suited to transactional/relational systems (banking, etc.) or eventually consistent/non-transactional (NoSQL) documents. Don't forget hybrid as an option. Also consider the service platform - for example, Node goes well with MongoDB (JSON objects front to back).
The device and service storage options can be (and likely should be) different, separated by service interfaces (SOAP, REST/HTTP, sockets, etc.).
It's hard to have a one-size-fits-all solution, but often something lightweight like SQLite on the device or client makes for ease of installation/config, while something like SQL Server/MySQL or CouchDB/MongoDB on the service side gives you scalability.
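To make the "local storage with background publishing" idea concrete, here is a hedged Node-side sketch using an outbox table in SQLite. The `sqlite3` package, the table names and the endpoint are just one possible choice for illustration, not a prescription:

```javascript
const sqlite3 = require('sqlite3');

const db = new sqlite3.Database('local.db');

// Every local write also lands in an outbox row; a background job drains the
// outbox to the service whenever connectivity is available.
db.serialize(() => {
  db.run('CREATE TABLE IF NOT EXISTS sales (id INTEGER PRIMARY KEY, payload TEXT)');
  db.run('CREATE TABLE IF NOT EXISTS outbox (id INTEGER PRIMARY KEY, payload TEXT)');
});

function recordSale(sale) {
  const payload = JSON.stringify(sale);
  db.run('INSERT INTO sales (payload) VALUES (?)', payload);
  db.run('INSERT INTO outbox (payload) VALUES (?)', payload);
}

// Periodically try to publish queued rows to the service side (placeholder URL).
// Uses the global fetch available in Node 18+; swap in any HTTP client otherwise.
setInterval(() => {
  db.all('SELECT id, payload FROM outbox ORDER BY id LIMIT 20', async (err, rows) => {
    if (err || rows.length === 0) return;
    for (const row of rows) {
      try {
        await fetch('https://example.com/api/sales', {
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: row.payload,
        });
        db.run('DELETE FROM outbox WHERE id = ?', row.id);
      } catch (e) {
        break; // still offline; try again on the next tick
      }
    }
  });
}, 30000);
```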
Some links to read:
http://www.mongodb.org/display/DOCS/Comparing+Mongo+DB+and+Couch+DB
http://www.sqlite.org/
http://blogs.msdn.com/b/sqlexpress/archive/2011/07/12/introducing-localdb-a-better-sql-express.aspx
Your question is pretty wide open and there's no one-size-fits-all solution. Hopefully I provided some options to think about.
There's an interesting project out there called AppJS (http://appjs.com/), which packages Node.js and Chromium as a desktop environment. It's currently very fresh (very little documentation), but it appears to be straightforward enough (you'll be using the same tools as you would for your online application).
As for synchronising the offline and online environments. I doubt you can rely on CouchDB in the way that you envisage. CouchDB mobile support is not as comprehensive as some of the documentation suggests. So in this sense, it would be no different to using SQL/Mongo/Punchcards.
You might have more luck with designing a suitable serialisation scheme based on XML or JSON (or just plain text), and passing files between the online and offline installations.
Edit - Since writing this, Node Webkit - http://nwjs.io/ - is clearly the most obvious replacement for App.js. It has a very simple API, and some great features.
For a notification project, I would like to push event notifications out. These are things like logins, changes to a profile, etc., to be displayed to the appropriate client. I would like to discuss some ideas on putting it together and get some advice on the best approach.
I noticed here that changes made to a CouchDB can be detected with a _changes stream, picked up by Node, and a process kicks off. I would like to implement something like this (I'm using SQL Server, but an entry point at this level may not be the best solution).
Instead of following the CouchDB example (detecting database-based events; I think this just complicates things, since we're interested in client events), I was thinking that when an event occurs, such as a user login, a message is sent to the Node server with some event details (a RESTful request?). This message is then processed and broadcast to all connected clients, and the appropriate client displays the notification.
Proposed ecosystem:
.Net 4.0
IIS
IISNode
Socket.IO
Node.JS
SQL Server 2008
This will be built on top of an existing project using the .Net framework (IIS, etc.). Many of the clients' browsers do not support WebSockets, so using Socket.IO is a good option (fallback support). However, from what I can see, Socket.IO still only supports long polling through IISNode (which isn't really a problem).
An option would be to expose the Socket.IO/Node endpoint to all clients, so that client-based notifications can be sent through JS to the Node server, which broadcasts the message (following the basic chat-server client/server examples).
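As a rough sketch of that option (route and event names are placeholders, and this ignores authentication and targeting specific clients), the Node side could look something like this with Express and Socket.IO:

```javascript
const express = require('express');
const http = require('http');
const { Server } = require('socket.io');

const app = express();
app.use(express.json());

const server = http.createServer(app);
const io = new Server(server);

// The existing .NET app (or the browser) POSTs event details here...
app.post('/notify', (req, res) => {
  // ...and the Node server fans the message out to every connected client.
  // Socket.IO falls back to long polling for browsers without WebSockets.
  io.emit('notification', {
    type: req.body.type,       // e.g. 'login', 'profile-change'
    payload: req.body.payload,
  });
  res.sendStatus(204);
});

server.listen(3000, () => console.log('notification endpoint on :3000'));
```

On the client side, `io()` connects and `socket.on('notification', ...)` decides whether the event is relevant to the current user; any per-user targeting (rooms, auth) would layer on top of this.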
Alternately, an IIS endpoint could be used, but could only support long polling (through Socket.IO). This would offer some additional .Net back-end processing, but may be over-complicating the architecture.
Is there SQL Server-based event notification available for Node?
What would be the best approach?
If I didn't get the terminology ecosystem configuration right, please clarify.
Thanks.
I would recommend you check out SignalR first before considering adding iisnode/node.js to the mix of technologies of your pre-existing ASP.NET application.
Regarding WebSockets: regardless of whether you use ASP.NET or node.js (socket.io), you can only use HTTP long polling for low-latency notifications, as WebSockets are not supported by HTTP.SYS/IIS until Windows 8. iisnode does not currently support WebSockets (even on Windows 8), but such support could be added later.
I did some research lately regarding MSSQL access from node.js. There are a few OSS projects out there, some of them use native, platform-specific extensions, some attempt implementing TDS protocol purely in JavaScript. I am not aware of any that would enable you to access the SQL Notifications functionality. However, the MSSQL team itself is investing in a first class MSSQL driver for node.js, so this is something to keep an eye on going forward (https://github.com/tjanczuk/iisnode/issues/139).
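To illustrate what basic MSSQL access from node.js looks like, here is a hedged sketch using the `mssql` npm package (one of the JavaScript TDS-based options; choosing it here is an assumption for illustration, and the table and column names are made up):

```javascript
const sql = require('mssql'); // npm package that speaks the TDS protocol

const config = {
  user: 'app_user',          // placeholder credentials
  password: 'secret',
  server: 'localhost',
  database: 'app',
  options: { trustServerCertificate: true },
};

async function pollEvents() {
  await sql.connect(config);
  // Hypothetical table the .NET app writes login/profile events into;
  // simple polling like this is a baseline to compare SQL Notifications against.
  const result = await sql.query`
    SELECT TOP (50) Id, EventType, UserId
    FROM dbo.ClientEvents
    WHERE Processed = 0
    ORDER BY Id`;
  return result.recordset;
}

pollEvents()
  .then(rows => console.log('pending events:', rows.length))
  .catch(err => console.error(err));
```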
If you plan to use SQL Notifications to support low latency notifications, I would strongly recommend starting with performance benchmarks that simulate the desired level of traffic at the SQL server level. SQL Notifications were meant primarily as a mechanism to help maintain in memory cache consistent with the content of the database, so it may or may not meet the requirements of your notification scenario. At the very minimum these measurements would help you start with a better design.
I would highly recommend using Pusher. That is what we use and it makes it easy to implement as it is a hosted solution. So plugging it and making it work is really easy. It doesn't cost much unless you are going to push a crazy amount of messages through it on a massive scale.
I'm hoping someone can validate or correct my conclusions here.
I'm looking into writing a small side project. I want to create a desktop application for taking notes that will synchronise to a web-server so that multiple installations can be kept in step and data shared and also so that it can be accessed via a browser if necessary.
I've kind of been half-listening to the noises about CouchDB and I've heard mention of "offline functionality", of desktop-couchdb and of moves to utilise its ability to handle intermittent communications to enable distributed applications in the mobile market. This all led me to believe that it might be an interesting option to look at for providing my data storage and also handling my synchronisation needs, but after spending some time looking around for info on how to get started my conclusion is that I've got completely the wrong end of the stick and the reality is that:
There's no way of packaging up a CouchDB instance, distributing it as part of a desktop application and running it in the context of that application to provide local storage and synchronisation to a central database.
Am I correct here? If so is there any technology out there that does this sort of thing or am I left just rolling my own local storage and maybe still using CouchDB on the server?
Update (2012/05): check out the new TouchDB projects from Couchbase if you are targeting Mac OS X and/or iOS or Android. These actually use SQLite under the hood (at least for now) but can replicate to/from a "real" CouchDB server. Another client-side alternative that is finally starting to mature is PouchDB, which runs in IndexedDB-capable browser engines. Using these, or using them to inspire a similar port to another desktop platform, is now becoming a better-trodden path.
Original answer:
"There's no way of packaging up a CouchDB instance, distributing it as part of a desktop application and running it in the context of that application to provide local storage and synchronisation to a central database."
At this point in time, your statement is practically correct, although it is possible to include CouchDB in an app; for an example, see CouchDBX.app, which is a thin wrapper around a prebuilt bundle of CouchDB and all its dependencies.
The easiest way to build a CouchDB app is to assume that the user will already have a CouchDB server running. This is easier than it sounds, especially with Couchone's hosting or a prebuilt app like CouchDBX on OS X or DesktopCouch on Ubuntu. The latter is especially interesting because, if I understand correctly, it is included by default with Ubuntu these days and automatically spins up a CouchDB server per user when you query its port via D-Bus. Something similar could (and should) be done on OS X using launchd and Bonjour.
So, as you write, you would either design your app to store data in a local format and optionally sync with a CouchDB service you provide, or you'd have to build and bundle all of Erlang, SpiderMonkey and CouchDB together with your app, along with some scripts to make sure it is running when needed. This is possible, but obviously neither of these is ideal, and believe me, you're not the only one wanting a simpler solution for desktop-oriented apps!