Meteor for very fast data - Node.js

I just started with Meteor app development and have a use case which I am not sure is good for meteor.
We have a Java application that pushes data to Redis at a very fast rate (data updates in less than 50 milliseconds), and we are building a web application (on Node.js) which connects to this Redis instance and sends the data to the client. For now (with a native Node.js app), we are sending data only twice a second (as we do not require such fast updates).
My question is: how can I achieve the same with Meteor? As we know, Meteor has livequery, which tends to send data as soon as it changes, but this is not optimal for us. Is there a way to tune livequery to send data only after a certain interval, say?
Thanks

I think you are looking for ways to throttle Meteor's calls. This could be done with this library.
This issue has also been discussed here. Reading up on it, I think they still haven't implemented it in core. That would make sense, since there are no out-of-the-box throttling mechanisms in Node or io.js either.
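As an alternative to throttling livequery itself, you can skip it entirely and publish on a fixed interval using Meteor's low-level publish API. A minimal sketch, assuming ioredis and a single JSON value stored under a Redis key; the key name and the client-side collection name 'ticks' are made up for illustration:

    import { Meteor } from 'meteor/meteor';
    import Redis from 'ioredis';

    const redis = new Redis();

    Meteor.publish('fastData', function () {
      const sub = this;
      let initialized = false;

      const poll = () => {
        redis.get('fastData').then((raw) => { // assumed Redis key
          if (raw == null) return;
          const doc = JSON.parse(raw);
          if (!initialized) {
            sub.added('ticks', 'latest', doc);   // first push creates the doc
            initialized = true;
          } else {
            sub.changed('ticks', 'latest', doc); // later pushes update it
          }
        });
      };

      poll();
      const handle = Meteor.setInterval(poll, 500); // at most twice a second
      sub.ready();
      sub.onStop(() => Meteor.clearInterval(handle));
    });

The client subscribes to 'fastData' and reads the 'ticks' collection as usual; the update rate is then bounded by the interval, not by how fast Redis changes.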
Hope this was helpful.

Related

Real Time Tracking System

My client has 1000 taxis, and he wants to track every taxi's location and see it on his display. My question is how to track all the taxi information using the drivers' mobile devices. I am using MongoDB for the database.
I plan to solve this by developing an API, with each mobile device sending its location every 10 seconds. But the problem is that the server becomes very busy and cannot work properly handling the API traffic.
I saw that Firebase stores client information in real time. I need to know whether it's possible to get Firebase-like behavior using MongoDB.
I am using Node.js for backend development. If anyone knows a way to store real-time data, please help.
You cannot (generally speaking) track a taxi in real time; the problem is that the device's Internet connection may be poor (e.g. due to a weak signal), have really high latency at times, or even be down. Instead, design two independent applications:
One, which will store current GPS location inside a FIFO queue locally
Second, which will flush the queue to a remote server (see the sketch at the end of this answer)
This approach will ensure you will, eventually, receive all the positions without having to worry about dropped packets and other issues that may and will occur.
Instead of a TCP connection you can consider using UDP (or better, DTLS), which is faster but less reliable. If reliability is a must (I doubt it is for taxi tracking), then go for TCP (or better, TLS). How you send or receive the data is just a detail.
Also make sure you authenticate the device before you store any data, especially if the connection between devices is not secure.
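A minimal sketch of the queue-and-flush split described above, in Node-style JavaScript (the endpoint URL, batch size, and flush interval are assumptions; fetch is the global available in Node 18+):

    // App 1: enqueue GPS fixes locally as they arrive
    const queue = [];
    function onGpsFix(lat, lng) {
      queue.push({ lat, lng, ts: Date.now() });
    }

    // App 2: flush the queue to the server; on failure, keep the
    // entries so nothing is lost, and retry on the next tick
    async function flush() {
      if (queue.length === 0) return;
      const batch = queue.slice(0, 50); // assumed batch size
      try {
        const res = await fetch('https://example.com/api/positions', { // assumed endpoint
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify(batch),
        });
        if (res.ok) queue.splice(0, batch.length); // drop only what was acknowledged
      } catch (err) {
        // offline or timed out: leave the queue intact and retry later
      }
    }
    setInterval(flush, 10000); // every 10 seconds, as in the question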
You can use Firebase as a real time database. Have a look at this link: https://github.com/firebase/geofire
But in case you want to go with MongoDB, you can use MongoDB geospatial queries and use socket.io to enable real-time behavior.
For more details: https://docs.mongodb.com/manual/geospatial-queries/
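A minimal sketch of the MongoDB side, using the official Node.js driver (the database, collection, and field names are assumptions):

    const { MongoClient } = require('mongodb');

    async function main() {
      const client = new MongoClient('mongodb://localhost:27017');
      await client.connect();
      const taxis = client.db('tracking').collection('taxis'); // assumed names

      // A 2dsphere index enables $near / $geoWithin queries
      await taxis.createIndex({ location: '2dsphere' });

      // Upsert a taxi's latest position as a GeoJSON Point [lng, lat]
      await taxis.updateOne(
        { taxiId: 42 },
        { $set: { location: { type: 'Point', coordinates: [-73.99, 40.73] },
                  updatedAt: new Date() } },
        { upsert: true }
      );

      // Find taxis within 2 km of a point
      const nearby = await taxis.find({
        location: {
          $near: {
            $geometry: { type: 'Point', coordinates: [-73.98, 40.74] },
            $maxDistance: 2000, // metres
          },
        },
      }).toArray();
      console.log(nearby);
      await client.close();
    }
    main();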
You can use https://socket.io/ for the real-time tracking system.
It is a JavaScript library for real-time web applications.
You just need to configure MongoDB.
There are many blog posts that explain how to set up socket.io with MongoDB.
Some of them are...
http://blog.slatepeak.com/creating-a-real-time-chat-api-with-node-express-socket-io-and-mongodb/
https://blog.feathersjs.com/building-a-rest-and-real-time-api-with-express-feathers-and-mongodb-12071e5417e1
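As a minimal starting point, here is a sketch of a socket.io server that rebroadcasts driver positions to every connected dashboard (the event names are assumptions):

    const http = require('http');
    const { Server } = require('socket.io');

    const server = http.createServer();
    const io = new Server(server);

    io.on('connection', (socket) => {
      // A driver app emits its position; fan it out to all dashboards
      socket.on('location', (pos) => {
        io.emit('taxi:update', pos); // e.g. { taxiId, lat, lng }
      });
    });

    server.listen(3000);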
It sounds like you are planning to implement the tracking on the frontend. That is not a good approach and is not secure, because drivers could send fake requests in real time. You can use WebSockets to implement real-time tracking of the taxis. Please check this link; I think this approach is similar to your idea and I hope it works for you.
Thanks

Options for getting a CPU-intensive job off my web server?

I have been working on a Web App for visualizing live data. It is crucial that this data is kept up to date on the client side without such updates being invoked directly by the client (e.g. no button presses or refreshing the page). Currently, on page load, I grab the current data set from a database (DynamoDB) via Ajax, and subsequent updates are pushed to any listening clients every 5 minutes via a Websockets connection (using Socket.io).
I have overlooked the computational load of this update job. It has to mine some data, process it, update the database, and send the update out to all clients. As a result, the web server is left unresponsive for about 30 seconds with each update. Furthermore, my current architecture limits me from putting my server behind a load balancer, which is something I anticipate coming up in the future. For both these reasons, I really need to get this update job off my web server.
I am relatively inexperienced in web development, and I don't feel I am knowledgeable enough about these technologies to know the drawbacks of the solutions I have come up with. Currently, I am considering:
Break the update off into a separate process so it does not block the Node event loop. This would solve my issue in the short term, but if I ever want to load balance my application, I can't have the update running on multiple machines.
Drop Websockets entirely and just have the client query the database every 5 minutes, while a separate process (or separate server if I want load balancing) keeps the database up to date without interacting directly with the client. Will this kind of access pattern put too much load on my db?
Have a separate server run the update, and send the result via Websockets (or maybe some other protocol) to my load balanced application servers, which then push that update to all listening clients as usual. Is this even possible?
Perhaps there are other solutions. It seems like this would be a relatively common problem, so I was hoping I could find some guidance here. What are the potential issues with the solutions I have proposed, and are there other possible solutions that may suit my use case better?
It sounds like you want one process sitting somewhere that crunches the data and publishes it to a stream, which clients can then subscribe to as and when they like. Redis handles streams nicely: you could process your data and push it into a Redis stream, then create a small Node service that subscribes to the stream and pushes the formatted data out over a WebSocket or via polling.
In this scenario you can then scale up either the publishing process (the one crunching the numbers) if your data load goes up, or scale up your subscribed process (which serves the data over a websocket to browsers) if you get an influx of clients watching the data.
You can also easily distribute the hosting of these services across other machines, and even write them in different languages if you decide the number crunching needs something like threading.
You're then left with the issue of clients (web browsers) consuming this data with a load balancer in between. That can be a hard problem if you use WebSockets, and it comes with its own pros and cons. But importantly, you'll have separated your data crunching from your result publishing, which isolates the remaining issue to just the load balancing.
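A minimal sketch of that split using ioredis and Redis streams (the stream name 'updates' is an assumption):

    const Redis = require('ioredis');

    // Publisher: the number-crunching process appends results to the stream
    const pub = new Redis();
    async function publish(result) {
      await pub.xadd('updates', '*', 'payload', JSON.stringify(result));
    }

    // Subscriber: the websocket-facing service blocks on new entries
    const sub = new Redis();
    async function consume(onUpdate) {
      let lastId = '$'; // start with entries newer than "now"
      for (;;) {
        const res = await sub.xread('BLOCK', 0, 'STREAMS', 'updates', lastId);
        for (const [, entries] of res) {
          for (const [id, fields] of entries) {
            lastId = id;
            onUpdate(JSON.parse(fields[1])); // fields = ['payload', '<json>']
          }
        }
      }
    }

The two processes share nothing but the stream, so each side can be scaled, rewritten, or moved to another machine independently.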
I have done pretty much the same thing to monitor resources on some of our servers.
I have a C# service gathering the information on each server that we manage and sending it to a queue (AMQ).
From there, I have a STOMP client fetching data from AMQ and emitting it to a WebSocket.
My main microservice fetches the data to save it into a DB.
My visualisation web app is connected to the same WebSocket and fetches the data as it is sent, to display it.
The AMQ step isn't mandatory at all; it's just something I had to work with (historical reasons).
I don't know what type of data you are working with, so I don't know whether my solution applies to you.
Don't hesitate to ask if I'm not clear or you have any questions.
This is a big question and I'm not going to try and give you a definitive answer.
For option 2
It really depends on how expensive your queries are. You can make DynamoDB fast if you pay for enough throughput. That said, on the face of it, re-loading your whole dataset, which sounds like it's probably large, probably isn't good engineering.
For option 3
This option seems best to me if it's achievable, although admittedly it's hard to say with such a complex system; obviously you can't share your whole project.
Given you are already using AWS, you might want to look into AWS Lambda. If you can move the update process into a standalone job, you can host it on Lambda and take the load off the web server. Lambda is essentially infinitely scalable and you only pay for the compute you use.
This really depends on your being able to split the update task off into a separate service. It's likely you would need a fair bit of refactoring to isolate it as a service. If you can break little bits off at a time and make the move gradually, even better.
If you consider trying this, and you've not used Lambda before, I would definitely start small with some hello world examples. Then try a very simple service in your application, and build up to taking on the update service.
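To make that concrete, here is a hypothetical shape for the job once isolated, runnable as a scheduled Lambda (e.g. triggered by a CloudWatch Events rule every 5 minutes); the table name, item shape, and the inline stand-in for the real processing step are all made up:

    // Sketch of the update job as a scheduled Lambda handler (AWS SDK v2)
    const AWS = require('aws-sdk');
    const db = new AWS.DynamoDB.DocumentClient();

    exports.handler = async () => {
      // Stand-in for the real mining/processing step
      const result = { id: 'latest', value: Math.random(), updatedAt: Date.now() };

      // Write the processed result back to DynamoDB ('Results' is assumed)
      await db.put({ TableName: 'Results', Item: result }).promise();
      return { statusCode: 200 };
    };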
You might also consider looking into AWS Simple Queue Service (SQS) to handle the comms between clients and server.
Database tuning
If a lot of your update time is spent waiting for database actions to complete, rather than on server processing, you could consider tuning that side of things. Things to consider are:
Buying more throughput
Using batch operations, as these move load from your server to DynamoDB (see the sketch after this list)
Tuning keys, indexes and database access
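For the batch-operations point, a sketch using the AWS SDK v2 DocumentClient (the 'Results' table name and item shape are assumptions):

    const AWS = require('aws-sdk');
    const db = new AWS.DynamoDB.DocumentClient();

    async function saveResults(items) {
      // BatchWrite accepts at most 25 items per request, so chunk the input
      for (let i = 0; i < items.length; i += 25) {
        const chunk = items.slice(i, i + 25);
        await db.batchWrite({
          RequestItems: {
            Results: chunk.map((item) => ({ PutRequest: { Item: item } })),
          },
        }).promise();
      }
    }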

Realtime multiplayer game alongside REST API

I'm a Rails developer who has just migrated to Node, and I've decided to write an Angular application backed by a Postgres/Express.js REST API. I use the API primarily for CRUD operations so far, but I want to start a realtime game instance when two players visit a certain page (challenge each other). I'm thinking of using socket.io to accomplish the realtime functionality.
The game is similar to Pokémon on the Game Boy, in which two players take turns performing certain actions until one of them wins.
I have the following questions:
Should I have a separate server to handle the game using socket.io, or can I use the same one my API operates on?
Should I use a service like Pusher or can I create the architecture myself?
How would I go about making sure no data is lost, if say, a player disconnects during a game?
At which point (number of concurrent connections/request per second) would I run into performance issues? 100, 1000, 10000?
Thanks
If the realtime logic is closely related to the CRUD stuff (i.e. realtime events are a direct result of writes to the API), and you expect somewhat equal usage of both aspects of the system, then I'd put both on the same server.
I highly recommend using a realtime push service if possible (disclaimer: I work for Fanout.io). It'll be simpler and probably less expensive too.
The key to making sure data is not lost is to persist it on the server before sending. Don't depend on the realtime layer for persistence (biggest mistake you can make). When the client reconnects, it can request data it may have missed via the normal API. So, just get your CRUD stuff correct and then layer realtime eventing on top. You can create a very network resilient service this way.
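A minimal sketch of that persist-first pattern with Postgres and socket.io (the table, port, and event names are assumptions):

    const { Pool } = require('pg');
    const { Server } = require('socket.io');

    const pool = new Pool();     // connection settings from env vars
    const io = new Server(4000); // assumed port

    async function handleMove(gameId, move) {
      // 1. Persist first; the serial id doubles as a sequence number
      const { rows } = await pool.query(
        'INSERT INTO moves (game_id, payload) VALUES ($1, $2) RETURNING id',
        [gameId, move]
      );
      // 2. Only after the write succeeds, push to everyone in the game room
      io.to(`game:${gameId}`).emit('move', { seq: rows[0].id, move });
    }

    // On reconnect, the client asks the REST API for anything it missed:
    // GET /games/:id/moves?after=<last seq seen>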
You should be able to get to a few hundred concurrent connections without much thought. Going beyond will take architecture planning. Of course, if you delegate to a push service then you don't have to worry about this, at least for the realtime part.

Getting notifications on database changes: is it possible to watch entries in Riak?

I'm looking for an efficient way to subscribe to events in Riak from Node. I would like to be notified of changes to an entry in Riak.
For example, when one Node.js server updates an entry, another server that uses and watches that entry automatically receives the updated entry or a notification about the update.
If this is impossible, is there an efficient messaging system that can be used effectively across Node.js servers?
Riak implements what are called pre- and post-commit hooks. Post-commit hooks, which are triggered when a write successfully occurs (and are presumably what you want), can only be written in Erlang, and Riak needs to be configured to trigger your custom Erlang function as a property on the appropriate bucket.
Depending on your needs and the scale of your application, there are several options for how your Erlang hook can notify your Node.js server(s). It would be relatively easy to write an Erlang function that sends an HTTP request to your Node.js server, but that carries quite a lot of overhead, which may very well be inappropriate for your application. Better, but slightly more complicated, would be to use a pub/sub system like those offered by Redis or ZeroMQ (just to name a couple), which are battle-tested and proven to perform very well under heavy load. If you want to go with ZeroMQ, see this guide on how to implement very reliable pub/sub.
Both of these messaging tools, as well as many others, can notify your Node.js instance of updates to watched entries from either Riak or the Node.js instance that is actually modifying the data. The second option (Node.js to Node.js) might be simpler since it wouldn't require you to learn Erlang if you're not familiar with it (a sketch follows the lists below). Both of these tools have very well-tested node.js libraries:
Zeromq.node
redis-node
And if you were to use them to send out notifications from within Riak as post-commit hooks, here are the corresponding erlang drivers:
Erlzmq2
Eredis
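For the Node.js-to-Node.js route, a minimal sketch with the node redis client (the channel name is an assumption):

    const redis = require('redis');

    // Publisher: the server that just wrote the entry to Riak
    async function publishUpdate(key, value) {
      const pub = redis.createClient();
      await pub.connect();
      await pub.publish('riak:updates', JSON.stringify({ key, value }));
    }

    // Subscriber: any server watching for changes (needs its own connection,
    // since a connection in subscriber mode can't issue other commands)
    async function watchUpdates(onChange) {
      const sub = redis.createClient();
      await sub.connect();
      await sub.subscribe('riak:updates', (message) => {
        onChange(JSON.parse(message));
      });
    }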

Real-time newsfeed for followers - which tools and languages?

I want to implement a newsfeed for followers, just like Twitter, in realtime. But I'm stuck on which tools would be best for my purposes. The solution (the complete solution) should be production-ready. I've tried node.js + socket.io + rabbitmq (node-amqp, rabbitJS), but Node frequently crashes... Another solution is Tornado + sockjs-tornado, but I'm unsure about it (and don't know Python well). Before diving into code, I just need to know which tools are best for my purposes and can be 'really' realtime. By 'really' I mean truly fast request-responses. I've tried RabbitMQ + PHP API + Ajax, but it's not 'really' realtime: it uses Ajax instead of, for example, WebSockets.
The data for the newsfeed will be things like 'John updated his profile', 'Doe uploaded a new file', and similar.
Thanks!
What sort of crashes are they? Node has been fairly stable for some time now, at least for me.
Node is a fine solution, but you introduce some amount of complexity with communication between Node and PHP (which you seem to want to solve with RabbitMQ). "Really" realtime can easily be accomplished with socket.io, and since it has multiple fallbacks, it is ideal for reaching a wide audience. Long polling and friends do have their overhead, but it isn't too much if the events aren't very frequent.
If it's easier for you to integrate the newsfeed into your PHP program and you are willing to accept the minor overhead I say go for it. Otherwise, I would invest in Node. The platform is still young, but it's matured well, IMO.
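A minimal sketch of the socket.io side of such a newsfeed, using rooms so each client only receives events for the users it follows (the event and room names are assumptions):

    const { Server } = require('socket.io');
    const io = new Server(3000); // assumed port

    io.on('connection', (socket) => {
      // The client announces which users it follows and joins their rooms
      socket.on('follow', (userIds) => {
        userIds.forEach((id) => socket.join(`user:${id}`));
      });
    });

    // Called by the backend whenever a user does something noteworthy
    function publishEvent(userId, text) {
      io.to(`user:${userId}`).emit('feed', { userId, text });
    }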
I just built a realtime web application where users can drag items and chat on the same page, and I use Tornado + jQuery long-polling + Redis as an MQ system. It works well, but I am considering trying socket.io to handle realtime requests.
Tornado is kickass in terms of performance and really easy to work with. The only problem is that you need to make sure you have adapters for databases and other blocking-by-nature sources you need to communicate with. It supports WebSockets of course, so you can stream data to users.
If I had to implement this today, the stack I'd use is:
Redis for the data and pub/sub on channels.
Tornado as the API server.
WebSocket for the communication layer wherever possible.
