Redis Node JS - storing multiple objects of the same class - node.js

I want to store all objects of a class in the Redis cache and be able to retrieve them. As I understand it, hashmaps are used for storing objects, but each one requires its own key. So I can't save them all under a single key, e.g. "items", and retrieve them by that key. The only way I can do it is something like this:
items.forEach(item => {
  redis.hmset(`item${item.id}`, item);
});
But this feels wrong, and I need another loop when I want to get this data back. Is there a better solution?
There is also the problem of associated objects: I can't find anywhere how they are stored and used in Redis.

As I understand it, you want to save different keys with the same prefix.
You can use MSET to store them, and MGET with your keys as params to retrieve the data.
In case you still want to use HMSET, use a pipeline in the loop, so there is only one call to Redis for the whole batch.
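A sketch of the MSET/MGET approach, assuming each item is JSON-serializable and keyed under a hypothetical `item:<id>` prefix (the Redis calls themselves are shown commented out, since they need a live server and a connected client):

```javascript
// MSET takes a flat argument list: key1, value1, key2, value2, ...
// so all items can be written in a single round trip.
function buildMsetArgs(items) {
  const args = [];
  for (const item of items) {
    args.push(`item:${item.id}`, JSON.stringify(item));
  }
  return args;
}

// MGET returns raw strings, or null for keys that don't exist.
function parseMgetReplies(replies) {
  return replies.map((r) => (r === null ? null : JSON.parse(r)));
}

// With a connected client (e.g. ioredis) this would be roughly:
//   await redis.mset(...buildMsetArgs(items));
//   const objs = parseMgetReplies(await redis.mget(keys));
```

Note that MSET stores plain string values, so the objects must be serialized; the hash-per-item approach keeps fields addressable but needs a pipeline to batch the round trips.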

Related

How we can store JSON object in Redis

I want to store JSON data in Redis. How can I do that, and what is the best option for storing JSON in Redis? My object will look like:
{"name":"b123.home.group.title", "value":"Hellow World", "locale":"en-us", "uid":"b456"}
I want to update the object based on value and locale. I also want to get this object with any condition, and I want TTL support (so that I can remove the object if it is no longer required).
So what is the best way to store this data in Redis without memory issues, while supporting all of these operations quickly?
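One common approach (a sketch, not from the original posts; the `doc:` key prefix is a hypothetical choice) is to serialize the object to a JSON string, store it under its uid, and attach the TTL with the EX option of SET:

```javascript
// Plain serialize/deserialize helpers; Redis stores the value as a string.
function serialize(obj) {
  return JSON.stringify(obj);
}

function deserialize(raw) {
  // GET returns null for missing or expired keys.
  return raw === null ? null : JSON.parse(raw);
}

const doc = {
  name: "b123.home.group.title",
  value: "Hellow World",
  locale: "en-us",
  uid: "b456",
};

// With a connected node-redis v4 client this would be roughly:
//   await client.set(`doc:${doc.uid}`, serialize(doc), { EX: 3600 });
//   const loaded = deserialize(await client.get(`doc:${doc.uid}`));
```

Querying by value or locale would still need a secondary index (or a module such as RedisJSON); plain string keys only support lookup by key.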

Redis Caching - Is it a bad practice to store duplicate data

Is it a bad practice to store duplicate data in Redis cache?
I am experimenting with a GraphQL caching solution, but I have a few tables which I query by a combination of keys, never by their primary key, and that appears to be a bit of an issue for me.
Let's consider these tables:
Products - id, ...
Images - id, productId, size
I need to be able to get the images ( multiple ) by productId or a single row by a combination of productId and size.
What I currently store is something in the form of
{
images:productId:1:size:sm: {...},
images:productId:1:size:xs: {...},
images:productId:1: ['images:productId:1:size:sm', 'images:productId:1:size:xs']
}
The third object contains references to all of the available images in cache for the product, so I end up performing two queries to retrieve the data.
If I want one, I can directly go ahead and get it. If I want all of them, I first have to hit the third key, and then use the keys within it to get the actual objects.
Is this a bad idea? Should I bother with it, or just go with the simpler form
{
images:productId:1:size:sm: {...},
images:productId:1:size:xs: {...},
images:productId:1: [ {...}, {...} ] // Copies of the two objects from above
}
To provide some context, some of these objects might become a bit large over time, because they might contain long text / html from Rich text editors.
I read that hashes compress data better, so I organized the objects so that they are placed in a single hash; that way invalidation becomes easier too (I don't care about invalidating some of them, since they will always be invalidated at once).
It is a multi-tenant system, where I would be using a tenant id to scope the data to specific users.
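The two-step lookup described in the question can be sketched as follows (a hypothetical helper, not from the original post; a plain object stands in for Redis here so the logic runs without a server — with Redis this would be a GET for the reference list followed by an MGET for the referenced keys):

```javascript
// Resolve all images for a product via the reference key.
function getProductImages(cache, productId) {
  const refs = cache[`images:productId:${productId}`]; // list of keys
  if (!refs) return [];
  return refs.map((key) => cache[key]);               // resolve each reference
}

const cache = {
  "images:productId:1:size:sm": { size: "sm", url: "a.jpg" },
  "images:productId:1:size:xs": { size: "xs", url: "b.jpg" },
  "images:productId:1": [
    "images:productId:1:size:sm",
    "images:productId:1:size:xs",
  ],
};
```

The trade-off is exactly the one the question raises: references cost an extra round trip on reads but avoid storing (and invalidating) duplicate copies of potentially large objects.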

How to Store Nested Objects in redis [duplicate]

I keep running into cases where I have more complicated information to store than what can fit into any of Redis' simple data structures. I still want to use Redis, but I was wondering if there are any standard alternatives people use when ideally they'd like to use a nested structure?
You have basically two strategies:
you can serialize your complex objects and store them as strings. We suggest json or msgpack for the serialization format. This is easy enough to manipulate from most client-side languages. If server-side access is needed, then a server-side Lua script can easily encode/decode such objects since Redis is compiled with msgpack and json support for Lua.
you can split your objects across different keys. Instead of storing a complex data structure under user:id, you can store several keys such as user:id, user:id:address_list, user:id:document_lists, etc. If you need atomicity, pipelined MULTI/EXEC blocks can be used to guarantee data consistency and aggregate the round trips.
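The two strategies above can be sketched like this (a hypothetical user shape, not from the original answer; the helpers only build the key/value layout, and the actual writes would go through a client inside a MULTI/EXEC block):

```javascript
const user = {
  id: 42,
  name: "Alice",
  address_list: [{ city: "Oslo" }, { city: "Bergen" }],
};

// Strategy 1: one key holding the whole object as a serialized string.
function toSingleKey(user) {
  return { [`user:${user.id}`]: JSON.stringify(user) };
}

// Strategy 2: split keys; the nested list would typically become a
// Redis list of serialized entries under its own key.
function toSplitKeys(user) {
  return {
    [`user:${user.id}`]: JSON.stringify({ id: user.id, name: user.name }),
    [`user:${user.id}:address_list`]: user.address_list.map((a) =>
      JSON.stringify(a)
    ),
  };
}
```

Strategy 1 is simpler to read and write whole; strategy 2 lets you update one part (e.g. push a new address) without rewriting the entire object.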
See a simple example in this answer:
Will the LPUSH command work on a record that was initialized from JSON?
Finally, Redis is not a document oriented database. If you really have a lot of complex documents, perhaps you could be better served by solutions such as MongoDB, ArangoDB, CouchDB, Couchbase, etc ...
When you need to modify the object, it's very inefficient to serialize the complex object to a string and save that string to Redis: you have to fetch the string back to the client side, deserialize it into an object, modify it, serialize it to a string again, and save it back to Redis. Too much work...
Now, it's 2019, and there are some new Redis modules that enable Redis to support nested data structures, e.g. RedisJSON and redis-protobuf.
Disclaimer: I'm the author of redis-protobuf, so I'll give some examples with this module. redis-protobuf is also faster and more memory efficient than RedisJSON, since it uses a binary format rather than a text format to serialize/deserialize the data.
First of all, you need to define your nested data structure in Protobuf format and save it to a local file:
syntax = "proto3";

message SubMsg {
    string s = 1;
    int32 i = 2;
}

message Msg {
    int32 i = 1;
    SubMsg sub = 2;
    repeated int32 arr = 3;
}
Then load the module with the following configuration in redis.conf:
loadmodule /path/to/libredis-protobuf.so --dir proto-directory
After that, you can read and write the nested data structure:
PB.SET key Msg '{"i" : 1, "sub" : {"s" : "string", "i" : 2}, "arr" : [1, 2, 3]}'
PB.SET key Msg.i 10
PB.GET key Msg.i
PB.SET key Msg.sub.s redis-protobuf
PB.GET key Msg.sub.s
PB.SET key Msg.arr[0] 2
PB.GET key Msg.arr[0]
Please check the doc for details. If you have any problem with redis-protobuf, feel free to let me know.

Get value of multiple keys for sorted set using stackexchange.redis

I'm using StackExchange.Redis for my Azure Redis processes. I'm storing my data as sorted sets (ZSETs): each key maps to (value, score) pairs.
What I need is to query multiple keys in a single request. I'm using "SortedSetRangeByRankWithScores", but it only allows one key per request.
If that's not possible, I'll need to change my whole data structure to simple string values (JSON); then I may be able to query multiple keys.
Is there a suggested way of doing that?

Get key from Redis and expire it at the same time

I have a queue which is backed by Redis, with multiple node connections to that Redis server, and I need to make sure that it never serves the same key twice, so no task will ever run more than once.
I'm using node-redis for this task:
client.set("some_key", data);
client.get("some_key", function (err, data) {
//..
});
How can I make sure, when getting that key, that no other node process will be able to get it too? If I expire it only after getting the value, that won't be enough when two processes try to get the same value at the same time.
I don't think there's any other way than to wrap it in a MULTI:
MULTI
GET some_key
DEL some_key
EXEC
So using node-redis, something like
client.multi()
  .get("some_key")
  .del("some_key")
  .exec(function (err, replies) {
    // replies[0] holds the value returned by GET (or null)
  });
You can ensure atomicity of operations in Redis with MULTI/EXEC blocks and/or Lua scripting. In your case, you can do the GET followed by a DEL immediately afterward using either of the above approaches to ensure a single read.
I'd use a list with pop operations instead of a plain key. In particular, you add new items to, say, the right (via RPUSH) and pop them off the left (via LPOP).
If you are currently storing a bunch of data in the key, such as a hash, use a unique identifier as the key of the hash and add that ID to the list instead. That way you get get-and-remove capability in a simple fashion, without needing transactions or multiple commands, while still being able to store job data.
When the job succeeds, delete the data key; if it fails, you can re-queue it.
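The RPUSH/LPOP pattern above can be sketched like this (hypothetical job shape; a plain array and object stand in for Redis so the logic runs without a server — the comments note the corresponding Redis commands, and LPOP's atomicity is what prevents two workers from taking the same job):

```javascript
// Job data lives under its own key; the queue holds only IDs.
function enqueue(queue, store, job) {
  store[`job:${job.id}`] = job; // HSET job:<id> ... in Redis
  queue.push(job.id);           // RPUSH queue <id>
}

// Take the next job; LPOP is atomic, so only one worker gets each ID.
function dequeue(queue, store) {
  const id = queue.shift();     // LPOP queue
  if (id === undefined) return null;
  return store[`job:${id}`];
}

// On success, remove the job data; on failure, re-enqueue the ID.
function complete(store, job) {
  delete store[`job:${job.id}`]; // DEL job:<id>
}
```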
