I created a realtime application. It stores a lot of data locally in Node, so if the Node process crashes, all of that data is lost.
How can I persist this data and rejoin all the clients to the same session or room after a restart? Please let me know.
The libraries are:
Socket.io (version 4.2)
Node (version 16.14)
Well, your question seems simple, but it is actually kind of complicated to answer with so little information at hand...
There are a few aspects to consider, such as project architecture, data update times, data availability, data reliability, etc.
But to keep it short, and basing my answer on the sole premise that you need something fast to store your data outside of Node.js...
I'd recommend using Redis or Kafka. Both have their pros and cons, and both are meant for different needs, but either keeps your data outside the Node.js process: Redis as an in-memory data store, Kafka as a persistent distributed log.
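For example, here's a minimal sketch of persisting Socket.io room membership in Redis so clients can be re-attached to their rooms after a restart. It assumes the client sends a userId in the handshake (that part is invented) and uses the node-redis v4 client:

```
const { Server } = require('socket.io');
const { createClient } = require('redis');

const io = new Server(3000);
const redis = createClient(); // defaults to localhost:6379

(async () => {
  await redis.connect();

  io.on('connection', async (socket) => {
    // Assumption: the client identifies itself in the handshake,
    // e.g. io(url, { auth: { userId: '...' } }) on the client side.
    const userId = socket.handshake.auth.userId;

    // Rejoin every room this user was in before the crash/restart.
    const rooms = await redis.sMembers(`user:${userId}:rooms`);
    rooms.forEach((room) => socket.join(room));

    socket.on('join', async (room) => {
      socket.join(room);
      // Persist membership outside the Node process.
      await redis.sAdd(`user:${userId}:rooms`, room);
    });
  });
})();
```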
Hope my answer helped you, and set you in the right direction.
I have two separate cloud-based APIs that I am working on integrating. The two systems don't talk to each other directly, so I am creating something in the middle to get them to communicate. I have had trouble finding examples or documentation on how exactly to do this - does anyone know of any resources that could help me out?
My plan going in was to use a MERN Stack, running on a local server to do GET and POST requests to both APIs, use some mapping and logic to transpose the data into the correct format and send it to the other software. I do not have a client per se (other than myself) on my end, so I really will be skipping the React part of MERN, at least that is what I'm thinking. I'll be using Mongo to keep track of both sets of data for redundancy. I also considered using a LAMP Stack but felt that MERN would be faster in handling the data, and Mongo is more flexible in handling different data formats. If there is another process or technology that could help me that I'm not thinking of, I would be grateful to hear about it.
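For concreteness, here's a rough sketch of the kind of middle layer I'm imagining; the endpoint URLs and field names below are placeholders, not the real APIs:

```
const axios = require('axios');
const { MongoClient } = require('mongodb');

async function syncOnce() {
  // 1. GET the records from the source API.
  const { data: records } = await axios.get('https://api.source.example/v1/items');

  // 2. Transpose each record into the target API's format.
  const transposed = records.map((r) => ({
    externalId: r.id,
    title: r.name,
    updatedAt: r.modified_date,
  }));
  if (transposed.length === 0) return;

  // 3. Keep a copy in Mongo for redundancy.
  const mongo = await MongoClient.connect('mongodb://localhost:27017');
  await mongo.db('integration').collection('items').insertMany(transposed);
  await mongo.close();

  // 4. POST the transformed records to the destination API.
  await axios.post('https://api.target.example/v1/items/bulk', transposed);
}

syncOnce().catch(console.error);
```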
Has anyone encountered something like this before? Thank you.
As with most architecture questions, there's no completely right or wrong answer here. You could certainly design a well-built system for this purpose with either stack, even more so since you mention that your front-end framework is not an important consideration. Instead, ask yourself questions like these:
Which stack do you have more experience with, and is this an appropriate time to learn a new set of technologies, or is it important to do the best work you're capable of right now (how important is time, cost, or quality in this case)?
Another generalization I'll stick my neck out for is a data-first approach: what sort of data are you dealing with from each cloud integration, and what kind of data do you need to support and/or create in order to make your system work? Mongo, being a NoSQL persistence layer, will let you change your data model and handle more varied data more quickly and easily than a SQL solution will. This is a double-edged sword, however: the lack of validation and of a strongly constrained (typed) data model will make your application harder to work with and debug as it grows. In short - how big might this application grow?
If you have a handy and familiar way to manage the three different data models you're dealing with (cloud service 1, cloud service 2, and your app) via MySQL, then that's a compelling reason to use it. However, if your style is to start dumping data into your database and you're comfortable with a more iterative approach (which may require more, albeit shorter, rounds of refactoring), then Mongo with MERN may be the preferable choice.
Finally, will others ever be working on this application? If so, which language would you rather they work in - PHP or JavaScript?
I have an idea for a social network website, and as I'm currently learning web development I thought it would be a great project to practice on. I've already worked out the business logic and the front-end design with React. Now it's time for the backend, and I'm struggling.
I want to create a React+Nodejs event-driven app. It seems logical to use Kafka right away. I looked through various Kafka architecture examples and have several questions:
Is it possible to create an app that uses Kafka for its data, accessed through API calls from Node.js and passed to React (and vice versa), with a relational database used only for long-term storage?
Or is it better to use Kafka to handle all events but keep the data in some NoSQL database like Cassandra or HBase? Then it seems that Node.js would have to make API calls to them and send the data to React.
Am I completely missing the point and talking nonsense?
Any answer to this question is going to be quite opinion based, so I'm just going to try to stick to the facts.
It is totally possible to create such an application. Moreover, Kafka is basically a distributed log, so you can use it as an event store and build your state from that.
That mainly depends on your architecture, and there are too many gaps here to answer this with any certainty - what kind of data are you saving? What kind of questions will you need answered? What does your domain model look like? You could use Kafka as a store, or as a persistent messaging service.
I don't think you're off the mark, but perhaps you're reaching for the big guns when in reality you don't need them yet. Kafka is great for pushing a very large volume of events through. If you're building something new, you don't have that volume yet. Perhaps start with something simpler that doesn't require so much operational complexity.
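That said, if you do go the Kafka route, the Node.js side is not much code. Here's a minimal sketch with the kafkajs client that treats a topic as an event store and rebuilds state by replaying it; the topic, group, and event names are invented:

```
const { Kafka } = require('kafkajs');

const kafka = new Kafka({ clientId: 'social-app', brokers: ['localhost:9092'] });
const producer = kafka.producer();
const consumer = kafka.consumer({ groupId: 'state-builder' });

// In-memory state rebuilt from the event log.
const followers = {};

(async () => {
  await producer.connect();
  await consumer.connect();
  await consumer.subscribe({ topic: 'user-events', fromBeginning: true });

  // Replay every event to rebuild state, then keep applying new ones.
  await consumer.run({
    eachMessage: async ({ message }) => {
      const event = JSON.parse(message.value.toString());
      if (event.type === 'FOLLOWED') {
        (followers[event.target] ??= []).push(event.actor);
      }
    },
  });

  // Emitting a new event is a single produce call.
  await producer.send({
    topic: 'user-events',
    messages: [
      { key: 'alice', value: JSON.stringify({ type: 'FOLLOWED', actor: 'alice', target: 'bob' }) },
    ],
  });
})();
```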
I have a Node.js project that uses Couchbase as its database.
I'm just wondering whether I should store temporary data in:
1. Redis, or
2. Couchbase directly.
As far as I know there is a socket delay with Couchbase, so I think storing temporary data in Redis while keeping permanent data in Couchbase is better.
Does anyone have experience with this?
Your comments are welcome.
I'm a big Redis fan, but in this situation I would use Couchbase only.
Couchbase is rather efficient, and comparable to the performance of memcached when the working set of your data fits in memory. Most of the time, an extra caching layer on top of Couchbase is not useful.
That said, if you really need a caching layer, or simply some storage for temporary data, you can simply create a memcached bucket hosted in the Couchbase cluster. So you would have an "eventually persistent" bucket for your persistent data, and a memcached bucket for the temporary data.
The bucket types are described here:
http://docs.couchbase.com/couchbase-manual-2.5/cb-admin/#data-storage
In that context, adding Redis as an extra storage layer does not really make sense.
Couchbase has a managed cache built into it, even for Couchbase buckets. So it already has a caching layer and adding another one on top just sounds superfluous.
I am not sure what you mean by a socket delay in Couchbase. Can you explain more about that? That is not something I have ever seen before, and it sticks out as suspect to me. I would try to troubleshoot it and figure out what it is before looking to add Redis to the mix and having yet another layer to manage and code against. Without knowing more about the socket delay, it is difficult to make further recommendations.
It's an old question, but I'll have my take at it as well, if nothing else then for the people coming across it via Google, just as I did.
I agree with the accepted answer in that Couchbase keeps the most recently used documents in RAM. In that respect, it does the same as Redis. The advantage of Couchbase is of course that the data can reliably spill over the RAM limit, and over the server disk limit, automatically, by adding more nodes.
However, I have a project where I am considering using Redis alongside Couchbase. It is intended basically as a caching server, but for "calculated" items such as HTML snippets and the like. Couchbase is a fantastic document store, but building lists and other structures doesn't come that easily, especially not without a lot of views. So I'm thinking of using Redis as a temporary data store for the ad-hoc data manipulation that's needed, and Couchbase as the main data store.
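As a rough sketch of that cache-aside idea (node-redis v4 client; renderSnippet() is a hypothetical function that does the expensive Couchbase work):

```
const { createClient } = require('redis');

const redis = createClient();
const ready = redis.connect(); // connect once at startup

// renderSnippet(key) is a hypothetical async function that builds the
// expensive HTML snippet from documents stored in Couchbase.
async function getSnippet(key, renderSnippet) {
  await ready;
  const cached = await redis.get(`snippet:${key}`);
  if (cached !== null) return cached; // cache hit, served from Redis

  const html = await renderSnippet(key); // cache miss: do the expensive work
  await redis.set(`snippet:${key}`, html, { EX: 300 }); // keep for 5 minutes
  return html;
}
```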
So here's my deal.
I'm using Node with the Express framework. The website I'm working on grabs scraped data and stores it for each user of the website. That data can then be displayed on the user's page whenever they want to access it. So the data will be scraped, put into a database or other storage (whatever I decide works best), and then pulled back out for the user.
I'm trying to figure out what the best database setup would be. There will potentially be large amounts of data per user, especially over long periods of time. I've read some things about using Redis to cache data like user login info and other basics, and then using MongoDB for the big data. But I don't know - I'm new to database stuff, so I'm open to some new teachings and ideas from the masters.
What would you guys suggest I do? I want it to be fast and able to handle multiple queries at the same time, but really, I have no idea what I'm talking about, so please help me.
What would you guys suggest I do?
This really depends on the nature of your data, how you model your domain, and how you want to persist it. I would first try to figure out the basic model and, based on that, choose the most suitable database system. Don't jump to quick conclusions about caching with Redis when you don't even know whether you'll need it in the first place.
The suggestion might also depend on how much time you want to spend on the database layer of your application. Some database systems provide more functionality than others, depending on their concepts. If you are a beginner, choose a single mainstream solution that is well documented and has an established community, like MongoDB or MySQL, that will cover all your needs from the beginning, so that you won't end up managing a multitude of systems.
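For example, if your scraped data turns out to be document-shaped, a single MongoDB collection keyed by user may be all the setup you need to start with. A sketch (collection and field names are invented):

```
const { MongoClient } = require('mongodb');

(async () => {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const scrapes = client.db('app').collection('scrapes');

  // Store one document per scraped item, keyed by the user it belongs to.
  await scrapes.insertOne({
    userId: 'user123',
    source: 'https://example.com/page',
    payload: { title: 'Example', price: 9.99 },
    scrapedAt: new Date(),
  });

  // Pull a user's data back out, newest first.
  const items = await scrapes
    .find({ userId: 'user123' })
    .sort({ scrapedAt: -1 })
    .toArray();
  console.log(items);

  await client.close();
})();
```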
We have an iOS application with a chat feature. Currently it works with long polling, and now we are modifying it to work with sockets. Researching the socket side, it seems one of the best options is Node.js with socket.io, so we are using Redis pub/sub to manage message delivery and storage.
After a little more research on Redis, the recommended usage suggests the stored data should fit in memory, but we have a rather large database: we would like to store the whole chat history. So we plan to use Redis as a cache database that stores online users' chat history (maybe not all of it), and to write the actual conversation from Redis to MongoDB/SimpleDB once the user goes offline (or instantly to both).
So, as a summary, we are about to decide on Node.js with Redis pub/sub to deliver messages, Redis as a cache database, and MongoDB to store the whole conversation.
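For concreteness, the delivery path we have in mind looks roughly like this (channel and event names are made up):

```
const { Server } = require('socket.io');
const { createClient } = require('redis');

const io = new Server(3000);
const pub = createClient();
const sub = pub.duplicate();

(async () => {
  await pub.connect();
  await sub.connect();

  // Fan messages out to the sockets connected to *this* Node instance.
  await sub.subscribe('chat', (raw) => {
    const msg = JSON.parse(raw);
    io.to(msg.room).emit('message', msg);
  });

  io.on('connection', (socket) => {
    socket.on('join', (room) => socket.join(room));
    socket.on('message', async (msg) => {
      // Publish so every instance (including this one) delivers it.
      await pub.publish('chat', JSON.stringify(msg));
      // TODO: also write to the Redis cache / MongoDB history here.
    });
  });
})();
```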
What do you think about the design? Is this acceptable? Or, if there is a better way you can suggest, can you please explain a little more?
Thanks in advance.
For a chat system, you're thinking big. If you think you're going to reach a million users, then go for it. Also consider availability - how will your system deal with the failure of a machine?