Store and retrieve dictionary data with hiredis (a Redis client in C)

I'm trying to use hiredis (a Redis client written in C) to send dictionary data to, and retrieve it from, a Redis server.
I can send and receive simple data using the hiredis API. I can also send and retrieve dictionary data using redis-py (a Redis client written in Python). However, I can't find an API for dictionary data types in hiredis.
My data is like:
data = {2: {"load":2.5, "temp":30 } }
Can anyone help? Thank you.

Problem fixed: I wrapped the dictionary up into a string.
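The fix amounts to serializing the nested dictionary to a string (e.g. JSON) before SET and parsing it after GET. A sketch of the serialization step, shown in JavaScript for brevity rather than C; with hiredis you would send the resulting string with `redisCommand(ctx, "SET %s %s", key, json)`, building the JSON with a C library such as cJSON (my suggestion, not from the question):

```javascript
// Wrap the nested dictionary into a string before storing it in Redis.
// (Only the serialization step is shown; the SET/GET round trip is omitted.)
const data = { 2: { load: 2.5, temp: 30 } };

// Serialize: this string is what gets sent with SET <key> <string>.
const wrapped = JSON.stringify(data);

// Deserialize after GET to recover the original structure.
const restored = JSON.parse(wrapped);

console.log(wrapped);          // {"2":{"load":2.5,"temp":30}}
console.log(restored[2].temp); // 30
```

Note that JSON object keys are always strings, so the numeric key 2 comes back as "2"; numeric indexing still works in JavaScript because property access coerces the index to a string.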

Related

Setting a limit for stored keys in the node.js in-memory store?

I'm currently writing a simple application that receives a key and a value and stores them in an in-memory key-value store (I'm using memdown for Node.js).
I would like to set a limit on the number of keys stored (only the last 5 are relevant) but can't seem to find a code example for that. Would you be able to help?
Thank you!
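As far as I know memdown has no built-in key cap, so one option is to enforce the limit in a small wrapper of your own. A sketch using a plain Map instead of memdown so it runs standalone (the class and method names are invented for the example):

```javascript
// Keep only the most recent `limit` keys; inserting beyond the limit
// evicts the oldest one. A Map preserves insertion order, which makes
// "oldest" easy to find.
class CappedStore {
  constructor(limit = 5) {
    this.limit = limit;
    this.map = new Map();
  }
  put(key, value) {
    // Re-inserting an existing key moves it to the "newest" end.
    this.map.delete(key);
    this.map.set(key, value);
    while (this.map.size > this.limit) {
      const oldest = this.map.keys().next().value;
      this.map.delete(oldest);
    }
  }
  get(key) {
    return this.map.get(key);
  }
}

const store = new CappedStore(5);
for (let i = 1; i <= 7; i++) store.put('k' + i, i);
console.log([...store.map.keys()]); // [ 'k3', 'k4', 'k5', 'k6', 'k7' ]
```

With memdown you would apply the same eviction logic in the code path that calls put, since the store itself won't do it for you.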

How do Node.js worker threads work under the hood?

I have a Node.js application that operates on user data in the form of JSON. The size of the JSON data is variable and depends on the user; usually it is around 30KB.
Every time any value of a JSON parameter changes, I recalculate the JSON object, stringify it, and encrypt the string using RSA encryption.
Calculating the JSON involves the following steps:
From the database, get the data required to form the JSON. This involves querying at least 4 tables in a nested for loop.
Use Object.assign() to combine the data into the larger JSON object inside the same doubly nested loop.
Once the final object is formed, stringify it and encrypt it using the Node.js crypto module.
All of this maxes out my CPU when the data is huge, which means a large number of iterations and a lot of data to encrypt.
We use a Postgres database, so I have to recalculate the entire JSON object even if only a single parameter value changed.
I was wondering if a Node.js worker pool could be a solution to this. But I would like to know how worker threads handle tasks under the hood. Suggestions for alternative solutions to this problem are also welcome.

How to store nested objects in Redis [duplicate]

I keep running into cases where I have more complicated information to store than what can fit into any of Redis' simple data structures. I still want to use Redis, but I was wondering if there are any standard alternatives people use when ideally they'd like to use a nested structure?
You have basically two strategies:
you can serialize your complex objects and store them as strings. We suggest JSON or msgpack for the serialization format; either is easy enough to manipulate from most client-side languages. If server-side access is needed, a server-side Lua script can easily encode/decode such objects, since Redis is compiled with msgpack and JSON support for Lua.
you can split your objects across different keys. Instead of storing a complex data structure under a single user:id key, you can store several keys such as user:id, user:id:address_list, user:id:document_lists, etc. If you need atomicity, pipelined MULTI/EXEC blocks can be used to guarantee data consistency and aggregate the round trips.
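The second strategy can be sketched as a flattening step. The helper below is my own invention; the key naming follows the user:id:field pattern above, and with real Redis each resulting pair would be written inside a MULTI/EXEC block (SET for scalars, RPUSH for lists):

```javascript
// Flatten a nested object into (key, value) pairs suitable for storing
// as separate Redis keys, e.g. user:42:name, user:42:address_list, ...
function splitIntoKeys(prefix, obj) {
  const pairs = [];
  for (const [field, value] of Object.entries(obj)) {
    if (value !== null && typeof value === 'object' && !Array.isArray(value)) {
      // Nested object: recurse, extending the key prefix.
      pairs.push(...splitIntoKeys(`${prefix}:${field}`, value));
    } else {
      // Scalars and lists each become one key.
      pairs.push([`${prefix}:${field}`, value]);
    }
  }
  return pairs;
}

const user = { name: 'alice', address_list: ['home', 'work'], stats: { load: 2.5 } };
console.log(splitIntoKeys('user:42', user));
// -> three pairs: user:42:name, user:42:address_list, user:42:stats:load
```

Reads then fetch only the keys they need instead of deserializing the whole object, which is the main payoff of this layout.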
See a simple example in this answer:
Will the LPUSH command work on a record that was initialized from JSON?
Finally, Redis is not a document oriented database. If you really have a lot of complex documents, perhaps you could be better served by solutions such as MongoDB, ArangoDB, CouchDB, Couchbase, etc ...
When you need to modify the object, it's very inefficient to serialize the complex object to a string and save the string to Redis: you have to fetch the string back to the client side, deserialize it into an object, modify it, serialize it to a string again, and save it back to Redis. That's a lot of work...
Now it's 2019, and there are some new Redis modules that enable Redis to support nested data structures, e.g. RedisJSON and redis-protobuf.
Disclaimer: I'm the author of redis-protobuf, so I'll give some examples using this module. Also, redis-protobuf is faster and more memory efficient than RedisJSON, since it uses a binary format rather than a text format to serialize/deserialize the data.
First of all, you need to define your nested data structure in Protobuf format and save it to a local file:
syntax = "proto3";

message SubMsg {
    string s = 1;
    int32 i = 2;
}

message Msg {
    int32 i = 1;
    SubMsg sub = 2;
    repeated int32 arr = 3;
}
Then load the module with the following configuration in redis.conf:
loadmodule /path/to/libredis-protobuf.so --dir proto-directory
After that, you can read and write the nested data structure:
PB.SET key Msg '{"i" : 1, "sub" : {"s" : "string", "i" : 2}, "arr" : [1, 2, 3]}'
PB.SET key Msg.i 10
PB.GET key Msg.i
PB.SET key Msg.sub.s redis-protobuf
PB.GET key Msg.sub.s
PB.SET key Msg.arr[0] 2
PB.GET key Msg.arr[0]
Please check the doc for details. If you have any problems with redis-protobuf, feel free to let me know.

How to automatically check for a new value in redis using nodejs?

So I have two scripts: one that will send data to Redis and another that will get the data from Redis.
The script that gets the data from Redis is written in Node.js, and the script that sends the data to Redis is written in Ruby.
I would like the Node.js script to get the data from Redis as soon as the Ruby script sends it.
I was thinking about continuous monitoring from Node.js to check whether there is an update in Redis, so that as soon as there is an update, Node.js grabs the data. If you have a better way to do this, I'll consider it as well.
I would like a clue about how I should write the Node.js script to continuously monitor Redis and grab any new data.
Thanks
One way of doing it is to use Redis' lists as queues.
Assuming that the Ruby script stores the new "data" under a key called data:1, have it also RPUSH the key's name to a list called new_data for example. This list is essentially a queue of all new data.
Now, have your Node.js script do a blocking left pop (BLPOP) on the new_data list. Whenever new data arrives, the script will unblock and you'll be able to process the new data. Once finished, return to the blocking pop.
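The handshake can be sketched in-process (no Redis server needed to run this; MiniQueue is my own stand-in whose rpush/blpop mirror the RPUSH and BLPOP commands, with blpop taking a callback instead of truly blocking — with real Redis you would issue BLPOP through a Node client such as node-redis or ioredis):

```javascript
// In-memory stand-in for a Redis list used as a queue. rpush() mirrors
// RPUSH; blpop(cb) mirrors a blocking BLPOP: if an item is available the
// callback fires at once, otherwise it fires as soon as one arrives.
class MiniQueue {
  constructor() {
    this.items = [];
    this.waiters = [];
  }
  rpush(item) {
    const waiter = this.waiters.shift();
    if (waiter) waiter(item);   // a consumer was blocked: hand it over
    else this.items.push(item); // nobody waiting: queue it
  }
  blpop(callback) {
    if (this.items.length > 0) callback(this.items.shift());
    else this.waiters.push(callback);
  }
}

const queue = new MiniQueue();

// Node side: "block" until the Ruby side announces a new key.
queue.blpop((keyName) => console.log('new data under key:', keyName));

// Ruby side (simulated): after storing data:1, announce it on the queue.
queue.rpush('data:1');
```

The key property is the same as with the real commands: the consumer does no polling at all, and the producer's push is what wakes it up.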

Updating and retrieving keys in Redis

I am using Redis key-value pairs to store the data. The data against a particular key can change at any point in time, so after every retrieval request I asynchronously update the data stored against the requested key, so that the next request can be served with updated data.
I have done quite a bit of testing but still I am wondering if there could be any case where this approach might have some negative consequences?
PS: The data is consolidated from multiple servers.
Thanks in advance for any help/suggestions.
If you already know the value to be stored, you can use GETSET (or a transaction if it is not a simple string type).
If the new value is some function of the current value, i.e. f(value), you should do it in a Lua script.
Otherwise some other client might read the old value before you update it.
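The race behind that advice can be shown without a server: the Map below is an in-process stand-in for a Redis key, and atomicUpdate plays the role of the Lua script (the names are mine, not a Redis API):

```javascript
// In-memory stand-in for a single Redis string key.
const store = new Map([['counter', 0]]);

// Two clients interleave a non-atomic GET / modify / SET:
// both read before either writes, so one update is lost.
const a = store.get('counter'); // client A: GET -> 0
const b = store.get('counter'); // client B: GET -> 0
store.set('counter', a + 1);    // client A: SET 1
store.set('counter', b + 1);    // client B: SET 1 (A's increment is lost)
console.log(store.get('counter')); // 1, not 2

// f(value) applied atomically, as a server-side Lua script would do it:
// the read and the write happen as one uninterruptible step.
function atomicUpdate(key, f) {
  store.set(key, f(store.get(key)));
}
atomicUpdate('counter', (v) => v + 1);
atomicUpdate('counter', (v) => v + 1);
console.log(store.get('counter')); // 3
```

Redis runs each Lua script to completion with no other commands interleaved, which is exactly the guarantee the atomic version relies on.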
