I have the following code:
var db = require("redis");
var dbclient1 = db.createClient();
dbclient1.zadd("myprivateset", 3, {"guid":"abab-baba", "data-persistent":"xxxx", "size":"20"})
dbclient1.zadd("myprivateset", 2, {"guid":"abab-baba3", "data-persistent":"xxxx", "size":"20"})
dbclient1.zrangebyscore("myprivateset", 1, 4)
dbclient1.hgetall("myprivateset", function(err, rep){
console.log(rep);
});
I wish to store my objects (in JSON format) in a sorted set, ordered by the score (3 and 2 in our case).
For some reason, when I print the result (rep), I get undefined.
What am I doing wrong?
Issue 1 -- sorted set members
Try stringifying the JSON objects you are using as the members of your sorted set. For example,
dbclient1.zadd("myprivateset", 3, {"guid":"abab-baba", "data-persistent":"xxxx", "size":"20"})
needs to be:
dbclient1.zadd("myprivateset", 3, JSON.stringify({"guid":"abab-baba", "data-persistent":"xxxx", "size":"20"}))
Without stringifying, every zadd will use the member [object Object], overwriting it each time. That is, you'll only ever have one item in your sorted set, and it will be unidentifiable (other than as [object Object]).
Issue 2 -- fetching data
Also, hgetall is not the redis command to use for retrieving data from a redis sorted set. You'll want to focus on sorted-set-specific commands. A list of redis commands is available here: http://redis.io/commands
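For example, a minimal sketch of reading the range back with a callback (assuming the members were stored with JSON.stringify as in Issue 1): zrangebyscore returns the members as strings, which you can parse back into objects.

dbclient1.zrangebyscore("myprivateset", 1, 4, function (err, members) {
  if (err) throw err;
  // each member is the JSON string stored earlier
  var objects = members.map(function (m) { return JSON.parse(m); });
  console.log(objects);
});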
My two cents, building on comments by #leonid-beschastny and #cpentra1. I recommend using redis.multi(). It allows several calls to be made in a batch and, as you can see in the example, as soon as the three elements are added to the sorted set we can perform a zrangebyscore in the same multi batch and get the expected results. Instructions can be created dynamically. The replies array passed to the multi.exec() callback contains the result of each of the multi operations, in order.
var db = require("redis");
var dbclient1 = db.createClient();
var multi = dbclient1.multi();
// We use JSON.stringify() as suggested by #cpentra1
multi.zadd("myprivateset", 3, JSON.stringify({"guid":"abab-baba", "data-persistent":"xxxx", "size":"20"}));
multi.zadd("myprivateset", 2, JSON.stringify({"guid":"abab-baba3", "data-persistent":"xxxx", "size":"20"}));
multi.zadd("myprivateset", 2, JSON.stringify({"guid":"abab-dafa3", "data-persistent":"yyyy", "size":"21"}));
multi.zrangebyscore("myprivateset", 1, 4);
multi.zcard("myprivateset"); // The total number of elements in the set
multi.exec(function(err, replies) {
console.log(replies)
// Will output something like:
// [ 1,
// 1,
// 1,
// [ '{"guid":"abab-baba3","data-persistent":"xxxx","size":"20"}',
// '{"guid":"abab-dafa3","data-persistent":"yyyy","size":"21"}',
// '{"guid":"abab-baba","data-persistent":"xxxx","size":"20"}' ],
// 3 ]
});
Note: if you run the same example twice, instead of 1s in the first three elements of the replies array you'll get 0s, since zadd returns the number of members newly added and those members already exist in the set.
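To get the stored objects back rather than JSON strings, you can parse the zrangebyscore reply (index 3 in the replies array above) inside the exec callback; a small sketch:

// inside the exec callback above:
var objects = replies[3].map(function (m) { return JSON.parse(m); });
console.log(objects); // [ { guid: 'abab-baba3', ... }, { guid: 'abab-dafa3', ... }, ... ]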
It seems that with Redis 6.2 (node-redis v4) the format has changed to an object with score and value attributes, or an array of those, like this:
const { createClient } = require("redis");

async function sortedSet() {
  let client;
  try {
    client = createClient();
    client.on("error", (err) => console.log("Redis Client Error", err));
    await client.connect();
    console.log("connected");
    await client.zAdd("user:0:followers", [
      { score: "1", value: "John" },
      { score: "2", value: "Other John" },
    ]);
    console.log("sorted set added");
  } finally {
    await client.quit();
  }
}

sortedSet();
In case you are wondering, I figured this out by reading the source code for node-redis ZADD
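For completeness, reading the members back with the same v4 client could look like this (a minimal sketch inside the same async function; zRange and zRangeWithScores are the camel-cased command methods):

// fetch all members of the sorted set, lowest score first
const members = await client.zRange("user:0:followers", 0, -1);
console.log(members); // [ 'John', 'Other John' ]

// or include the scores
const withScores = await client.zRangeWithScores("user:0:followers", 0, -1);
console.log(withScores); // e.g. [ { value: 'John', score: 1 }, ... ]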
I'm working on a project in Node.js using MongoDB as my database. I'm trying to get rid of elements within my array that have dates before today. The problem I'm having is that at most 5 elements are being deleted; I want all elements that meet this criterion to be deleted. Also, when I don't have user.possible.pull(items._id) and const result = await user.save(), all elements that meet this criterion are shown in my deletePossible array. However, when I do have them, at most 5 are shown as well.
In my database, my User document looks like:
_id: '',
name: '',
possible: [
  { date: "Tues Jan 10 2023", _id: "63c0b169b6fa12ac49874a13" },
  { date: "Wed Jan 11 2023", _id: "63c0b172b6fa12ac49874a32" },
  ...
]
My code:
const user = await User.findById(args.userId)
const deletePossible = [];
for (var items of user.possible) {
  if (+new Date(items.date) < +new Date().setHours) {
    deletePossible.push(items._id)
    user.possible.pull(items._id)
    const result = await user.save()
  }
}
console.log(deletePossible)
I've tried a number of things such as:
for (var item of deletePossible) {
user.possible.pull(item)
const result = await user.save()
}
following deletePossible.push(items._id), and
const userInfo = await User.updateOne( { _id: args.userId}, {possible:{$pull:[...deletePossible] }} )
which removes all of the elements from possible regardless of whether they're contained within deletePossible, and then adds a random _id. Nothing I have tried seems to work. Does anyone have any idea why this is happening and how to get this to work properly? I would really appreciate any help or advice. Thank you!
You can simply filter user.possible and save the updated User:
const user = await User.findById(args.userId);
if (!user) return;
// Change the condition based on your needs
user.possible = user.possible.filter(p => new Date(p.date) >= new Date());
await user.save();
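If you'd rather keep the updateOne approach you attempted, note that $pull belongs inside the update document and takes a filter object rather than a spread array; a hedged sketch using the ids you collected in deletePossible:

// remove every subdocument whose _id is in deletePossible
await User.updateOne(
  { _id: args.userId },
  { $pull: { possible: { _id: { $in: deletePossible } } } }
);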
The core of the issue doesn't appear to be related to Mongo or Mongoose; it's really just a standard algorithmic logic problem.
Consider the following code, which iterates over an array, logs each element, and removes the third element when it arrives at it:
const array = [0, 1, 2, 3, 4];
for (const element of array) {
console.log(element);
if (element === 2) {
array.splice(2, 1); // remove the element at index 2 from the array
}
}
This code outputs:
0
1
2
4
Notice anything interesting? 3 has been skipped.
This happens because deleting an element from an array causes everything in front of it to move up a position. So if you're looking at 2 and you delete it, then 3 moves into 2's place, and 4 moves into 3's place. So then if you look at the next position, you're now looking at 4, not 3. The code never sees the 3.
This is why you should never change an array while iterating over it. A lot of languages won't even allow it (if you're using iterators); they'll throw some sort of "underlying collection was modified during iteration" error. You can make it work if you know what you're doing (often just by iterating over the array backwards, as sketched below), but there are usually better solutions anyway, like using Array.prototype.filter().
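For reference, a backwards loop avoids the shifting problem, because a removal only moves elements you have already visited; a minimal sketch with the array from above:

const array = [0, 1, 2, 3, 4];
for (let i = array.length - 1; i >= 0; i--) {
  if (array[i] === 2) {
    array.splice(i, 1); // safe: only elements already visited shift position
  }
}
console.log(array); // [ 0, 1, 3, 4 ]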
One easy solution is to iterate over a copy of the array, so that when you do the delete, the array you're iterating over (the copy) isn't changed. That would look like this:
for (const item of [...user.possible]) {
if (/* some condition */) {
user.possible.pull(item._id);
}
}
Another problem with your code: +new Date().setHours will always evaluate to NaN since setHours is a function and converting a function to a number always results in NaN. I suspect this is just a typo you introduced while struggling with the original issue.
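If the intent was to compare against the start of today, the method has to actually be called; a small sketch of the corrected condition (the midnight cutoff is my assumption about what you meant):

// setHours(0, 0, 0, 0) sets the time to midnight and returns the timestamp as a number
if (+new Date(items.date) < new Date().setHours(0, 0, 0, 0)) {
  deletePossible.push(items._id);
}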
The suggestion to use filter() is even better.
I am new to DynamoDB.
I want to increment the sort key:
if the id = 0, the next id = 1, and so on.
When a user (partition key) adds an item with an id (sort key), the next item added should have its id incremented by 1.
The code uses PutItem with DynamoDB.
Is it possible to do that?
I do not want to use a UUID (unique key).
Most situations don't need an auto-incrementing attribute and DynamoDB doesn't provide this feature out of the box. This is considered to be an anti-pattern in distributed systems.
But, see How to autoincrement in DynamoDB if you really need to.
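For reference, the usual workaround is an atomic counter item updated with an UpdateExpression; a minimal sketch with the AWS SDK v3 document client (the Counters table and its key schema are assumptions, not part of your table design):

const { DynamoDBClient } = require("@aws-sdk/client-dynamodb");
const { DynamoDBDocumentClient, UpdateCommand } = require("@aws-sdk/lib-dynamodb");

const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}));

async function nextId(counterName) {
  // atomically increment the counter and read back the new value in one request
  const res = await ddb.send(new UpdateCommand({
    TableName: "Counters",               // assumed counter table
    Key: { name: counterName },
    UpdateExpression: "ADD #v :one",
    ExpressionAttributeNames: { "#v": "value" },
    ExpressionAttributeValues: { ":one": 1 },
    ReturnValues: "UPDATED_NEW",
  }));
  return res.Attributes.value;           // use this as the sort key for PutItem
}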
I understand that you may need this number because, for example, it is a legal obligation to have sequential invoice numbers.
One way would be to create a table to store your number sequences.
Add fields like:
{
name: "invoices",
prefix: "INV",
numberOfDigits: 5,
leasedValue: 1,
appliedValue: 1,
lastUpdatedTime: '2022-08-05'
},
{
name: "deliveryNotes",
prefix: "DN",
numberOfDigits: 5,
leasedValue: 1,
appliedValue: 1,
lastUpdatedTime: '2022-08-05'
}
You need 2 values (a lease and an applied value), to make sure you never skip a beat, even when things go wrong.
That check-lease-apply-release/rollback logic looks as follows:
async function useSequence(name: string, cb: (uniqueNumber: string) => Promise<void>) {
  // 1. GET THE SEQUENCE FROM THE DATABASE
  const sequence = await getSequence(name);
  validateSequence(sequence);

  // 2. INCREASE THE LEASED VALUE
  const oldValue = sequence.appliedValue;
  const leasedValue = oldValue + 1;
  sequence.leasedValue = leasedValue;
  await saveSequence(sequence);

  try {
    // 3. CREATE AND SAVE YOUR DOCUMENT
    await cb(format(leasedValue));

    // 4. INCREASE THE APPLIED VALUE
    sequence.appliedValue++;
    await saveSequence(sequence);
  } catch (err) {
    // 4B. ROLL BACK WHEN THINGS ARE BROKEN
    console.error(err);
    try {
      const sequence = await getSequence(name);
      sequence.leasedValue--;
      validateSequence(sequence);
      await saveSequence(sequence);
    } catch (err2) {
      console.error(err2);
    }
    throw err;
  }
}
function validateSequence(sequence) {
// A CLEAN STATE, MEANS THAT THE NUMBERS ARE IN SYNC
if (sequence.leasedValue !== sequence.appliedValue) {
throw new Error("sequence is broken.");
}
}
Then, whenever you need a unique number you can use the above function to work in a protected scope, where the number will be rolled back when something goes wrong.
const details = ...;
await useSequence("invoice", async (uniqueNumber) => {
const invoiceData = {...details, id: uniqueNumber};
const invoice = await this.createInvoice(invoiceData);
await this.saveInvoice(invoice);
})
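The getSequence and saveSequence helpers are left to your storage layer; here is a minimal sketch using the AWS SDK v3 document client (the Sequences table name and its key are assumptions):

const { DynamoDBClient } = require("@aws-sdk/client-dynamodb");
const { DynamoDBDocumentClient, GetCommand, PutCommand } = require("@aws-sdk/lib-dynamodb");

const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}));

async function getSequence(name) {
  // the sequence record is keyed by its name, e.g. "invoices"
  const res = await ddb.send(new GetCommand({ TableName: "Sequences", Key: { name } }));
  return res.Item;
}

async function saveSequence(sequence) {
  await ddb.send(new PutCommand({ TableName: "Sequences", Item: sequence }));
}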
Can it scale? Can it run on multiple instances? No, it can't. It never will, because in most countries it's just not legal to do so. You're not allowed to send out invoice 6 before invoice 5, or to cancel invoice 5 after you've sent invoice 6.
The only exception is if you have multiple sequences; e.g. in some cases you're allowed to have a sequence per customer, or a sequence per payment system, ... Hence, you want them in your database.
I have a Node.js API in which the user sends the required fields as an array, to be fetched from the MongoDB database. I need to find the data for those fields using a find query. I've written a forEach statement to loop through the array, and I get the array elements, but when I try to use the array elements in the query it doesn't give the required results. Could anyone please help me resolve the issue by looking at the code below?
templateLevelGraphData: async function(tid, payload){
  let err, templateData, respData = [], test, currentValue;
  [err, templateData] = await to(Template.findById(tid));
  var templateId = templateData.templateId;
  payload.variables.forEach(async data => {
    console.log(data); // data has the array elements like variables=["humidity"]
    [err, currentValue] = await to(mongoose.connection.db.collection(templateId).find({},{data:1}).sort({"entryDayTime":-1}).limit(1).toArray());
    console.log(currentValue);
  });
  return "success";
}
The expected output is,
[ { humidity: 36 } ]
But I'm getting only _id like,
[ { _id: 5dce3a2df89ab63ee4d95495 } ]
I think data is not being applied in the query. But when I print data to the console it gives the correct result, displaying the array element, e.g. humidity. What do I need to do to make it work?
When you pass {data: 1}, the literal key data is used where the field name is expected, rather than the value of your data variable.
You have to create an object whose keys are the elements of the array, each set to 1.
const projection = data.reduce((a,b) => (a[b]=1, a), {});
[...] .find({}, projection) [...]
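If only a single field name is in play (as in your forEach, where data holds one element of payload.variables), a computed property key works too; a small sketch:

// e.g. with data === "humidity" this is { humidity: 1, _id: 0 }
const projection = { [data]: 1, _id: 0 };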
Actually I got the solution.
for(let i=0;i<payload.variables.length;i++){
var test = '{"'+ payload.variables[i] +'":1,"_id":0}';
var query = JSON.parse(test);
[err, currentValue] = await to(mongoose.connection.db.collection(templateId).find({"deviceId":deviceId},query).sort({"entryDayTime":-1}).limit(1).toArray());
console.log(currentValue); //It's giving the solution
}
I have a question: is it possible to do a LIKE operator search in Redis, similar to a relational database (MySQL/Oracle)?
I have a complex JSON document:
{"_id" : ObjectId("581c8b8854fdcd1ff8c944e0"),
"Objectcode" : "xxxxx",
"Objecttype" : "xxxx",
"docid" : "581c8b8554fdcd1ff8c93d10",
"description" : "Tags based search .... ",
"metaTags" : [
"tag1",
"tag2",
"tag3",
"tag5",
"tag6",
"tag7",
"tag8",
"tag9",
"tag10"
],
"__v" : 0
}
and I want to search on the metaTags array. How can I do it?
Thanks
You can use the MATCH option of the SCAN commands to search data.
Eg: scan 0 MATCH *11*
Refer: http://redis.io/commands/scan
You can use the Redis *SCAN commands http://redis.io/commands/scan, depending on your type of data, to filter by a pattern:
SCAN iterates the set of keys in the currently selected Redis database.
SSCAN iterates elements of Sets types.
HSCAN iterates fields of Hash types and their associated values.
ZSCAN iterates elements of Sorted Set types and their associated scores.
Never use KEYS in app code, because it may ruin performance.
The two major nodejs redis client libraries node_redis and ioredis support it, with some syntax sugar:
const Redis = require('ioredis');
const assert = require('assert');

let keys = [];
const redis = new Redis(); // ioredis
redis.mset('foo1', 1, 'foo2', 1, 'foo3', 1, 'foo4', 1, 'foo10', 1, () => {
  const stream = redis.scanStream();
  stream.on('data', data => {
    keys = keys.concat(data);
  });
  stream.on('end', () => {
    assert.deepEqual(keys.sort(), ['foo1', 'foo10', 'foo2', 'foo3', 'foo4']);
  });
});
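If the tags themselves live in a Redis set (say, one set of metaTags per document), the same streaming approach works per key with sscanStream; a sketch where the key name is an assumption:

// match set members (tags) against a glob pattern
const tagStream = redis.sscanStream('doc:581c8b8554fdcd1ff8c93d10:metaTags', { match: 'tag1*' });
tagStream.on('data', members => {
  console.log(members); // e.g. [ 'tag1', 'tag10' ]
});
tagStream.on('end', () => console.log('done'));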
I am working with a redis database in node.js using node_redis. Here is a quick example of a structure similar to what I am using.
hmset('user:1234',
  'user_id', 1234,
  'user_name', 'billy',
  'user_age', 16);
//add user to group 1, storing their id with their age as the score
zadd(['group:1:users_by_age', 16, 'user:1234']);
hmset('user:1235',
  'user_id', 1235,
  'user_name', 'jake',
  'user_age', 21);
//add user to group 1, storing their id with their age as the score
zadd(['group:1:users_by_age', 21, 'user:1235']);
Now let's say I wanted to get all the user data for users over the age of 18 in group:1.
I know I can get the user keys by calling
postClient.zrangebyscore(
[ 'group:1:users_by_age', '18', '+inf'],
function( err, results ){
console.log(results);
}
);
Where I get lost is how do I fetch all the user objects at once?
To take it one step further, is it possible to both zrangebyscore and get all the userobjects in one call?
To take it one step further, is it possible to both zrangebyscore and get all the userobjects in one call?
I don't believe you can. The SORT command has GET functionality built in which allows you to do such a thing in one call, but there's no way to pipe the results of a ZRANGEBYSCORE into SORT (barring storing it in a temporary key and then using SORT on that key).
There's also no natural way to retrieve multiple hashes in one call.
With your current implementation, considering these limitations, you might get all the users with a multi, like:
postClient.zrangebyscore([...], function (err, results) {
var multi = postClient.multi();
for (var i=0; i<results.length; i++){
multi.hgetall(results[i]);
}
multi.exec(function(err, users){
console.log(users);
});
});
You could do this with a Lua script though, retrieving the list of keys and iterating over it, calling hmget or hgetall on each key.
I do something like this in the following example, using hmget with specific fields.
Obvious disclaimer: I am not a Lua programmer. Brief explanation: the script takes a start and end range for user age, then any number of hash fields, which it uses for hmget on each user key, appending everything to an array that is wrapped back up into user objects in the JavaScript.
var script = '\
local keys = redis.call("ZRANGEBYSCORE", KEYS[1], ARGV[1], ARGV[2]); \
if not keys then return {} end; \
local users, attrs = {}, {} \
for i=3,#ARGV do \
table.insert(attrs, ARGV[i]) \
end \
for i,key in pairs(keys) do \
local vals = redis.call("HMGET", key, unpack(attrs)); \
if vals then \
for j,val in pairs(vals) do \
table.insert(users, val) \
end \
end \
end \
return users \
';
// The rest of this you'd probably want to wrap up in an async function,
// e.g `getUsersByRange`
// specify the user attributes you want
var attrs = ["user_id", "user_name", "user_age"];
// specify the range
var range = [18, "+INF"];
// get the attributes length, used to build the user objects
var attrlen = attrs.length;
// wrap up the params
var params = [script, 1, "group:1:users_by_age"].concat(range).concat(attrs);
// retrieve the user attributes in the form of [a1, a2, a3, a1, a2, a3, ... ],
// then collate them into an array of user objects like hgetall would return.
postClient.eval(params, function (err, res) {
  var users = [], i, j, k;
  for (i = 0, j = 0; i < res.length; i += attrlen, j++) {
    users[j] = {};
    for (k = 0; k < attrlen; k++) {
      users[j][attrs[k]] = res[i + k];
    }
  }
  // this should be a list of users
  console.log(users);
});
Note that it would be trivial to process the users back into objects inside the lua script, but when being converted back to redis data structures, lua tables become redis multi bulk replies (arrays), and the structure would be lost. Because of that it's necessary to convert the multi bulk reply into user objects back in javascript.