What I was hoping to do was store an array of objects using RedisJSON very simply and then query that array.
I have something similar to this:
const data = [
  {
    _id: '63e7d1d85ad7e2f69df8ed6e',
    artist: {
      genre: 'rock',
    },
  },
  {
    _id: '63e7d1d85ad7e2f69df8ed6f',
    artist: {
      genre: 'metal',
    },
  },
  {
    _id: '63e7d1d85ad7e2f69df8ed6g',
    artist: {
      genre: 'rock',
    },
  },
]
then I can easily store and retrieve this:
await redisClient.json.set(cacheKey, '$', data)
await redisClient.json.get(cacheKey)
Works great. But now I also want to query this data, so I've tried creating an index as below:
await redisClient.ft.create(
  `idx:gigs`,
  {
    '$.[0].artist.genre': {
      type: SchemaFieldTypes.TEXT,
      AS: 'genre',
    },
  },
  {
    ON: 'JSON',
    PREFIX: 'GIGS',
  }
)
When I search this index, I expect it to return the 2 documents matching the search filter, but instead it always returns the entire array:
const searchResult = await redisClient.ft.search(`idx:gigs`, '@genre:(rock)')
produces:
{
total: 1,
documents: [
{ id: 'cacheKey', value: [Array] }
]
}
I can't quite work out at which level I'm getting this wrong, but any help would be greatly appreciated.
Is it possible to store an array of objects and then search the nested objects for nested values with RedisJSON?
The Search capability in Redis Stack treats each key containing a JSON document as a separate search index entry. I think what you are doing is storing your whole array of documents in a single Redis key, which means any match will return the document at that key, which contains all of your data.
I would suggest that you store each object in your data array as its own key in Redis. Make sure that these will be indexed by using the GIGS prefix in the key name, for example GIGS:63e7d1d85ad7e2f69df8ed6e and GIGS:63e7d1d85ad7e2f69df8ed6f.
You'd want to change your index definition to account for each document being an object too so it would look something like this:
await redisClient.ft.create(
  `idx:gigs`,
  {
    '$.artist.genre': {
      type: SchemaFieldTypes.TEXT,
      AS: 'genre',
    },
  },
  {
    ON: 'JSON',
    PREFIX: 'GIGS:',
  }
)
Note that I also updated your PREFIX to be GIGS: rather than GIGS. This isn't strictly necessary, but it does stop your index from accidentally picking up other keys in Redis whose names begin with GIGS<whatever other characters>.
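With the keys and index set up that way, storing and querying would look something like this (a minimal sketch using the same node-redis calls as in your question; key names are illustrative):
// Store each gig under its own GIGS:-prefixed key...
for (const gig of data) {
  await redisClient.json.set(`GIGS:${gig._id}`, '$', gig)
}
// ...then search the index; each matching gig comes back as its own document.
const searchResult = await redisClient.ft.search('idx:gigs', '@genre:(rock)')
// searchResult.total === 2
// searchResult.documents -> one entry per matching GIGS:* key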
I have a collection in my MongoDB:
{ userId: 1234, name: 'Mike' }
{ userId: 1235, name: 'John' }
...
I want to get a result of the form
dict[userId] = document
in other words, I want a result that is a dictionary where the userId is the key and the rest of the document is the value.
How can I do that?
You can use $arrayToObject to do that; you just need to format the data into an array of {k, v} pairs first.
It is not clear if you want one dictionary for all documents, or each document in a dictionary format. I guess you want the first option, but I'm showing both:
One dictionary with all data* requires a $group (which also formats the data):
db.collection.aggregate([
  {
    $group: {
      _id: null,
      data: {$push: {k: {$toString: "$userId"}, v: "$$ROOT"}}
    }
  },
  {
    $project: {data: {$arrayToObject: "$data"}}
  },
  {
    $replaceRoot: {newRoot: "$data"}
  }
])
See how it works on the playground example - one dict
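For the two sample documents above, this pipeline produces a single document shaped roughly like this (each value is the full original document, since $$ROOT is pushed as v):
{
  "1234": { "userId": 1234, "name": "Mike" },
  "1235": { "userId": 1235, "name": "John" }
}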
*Notice that in this option, all the data is put into one document, and a single document has a size limit.
Dictionary format: If you want to get all documents as different results, but with a dictionary format, just replace the first step of the aggregation with this:
{
  $project: {
    data: [{k: {$toString: "$userId"}, v: "$$ROOT"}],
    _id: 0
  }
},
See how it works on the playground example - dict per document
My document schema is as follows:
const CollectionSchema = mongoose.Schema({
  ImageCategory: { type: String, required: true },
  imgDetails: [
    {
      _id: false,
      imageUrl: { type: String },
      imageName: { type: String },
      imageMimeType: { type: String },
    }
  ],
  Date: {
    type: String,
    default: `${year}-${month}-${day}`,
  },
}, { timestamps: true })
So, for example, one document in the database has multiple images under a single image category. What I am trying to do is delete an object from the imgDetails array.
Let me explain my question more precisely: imgDetails is an array.
Explanation: I want to loop over imgDetails, find the element where imageUrl === req.body.imageUrl, and if it matches, delete that whole object and then update the document.
Please guide me on how to write such a query. Regards
Demo - https://mongoplayground.net/p/qpl7lXbKAZE
Use $pull
The $pull operator removes from an existing array all instances of a value or values that match a specified condition.
db.collection.update(
  {},
  { $pull: { "imgDetails": { imageUrl: "xyz" } } }
)
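In Mongoose, assuming the model compiled from CollectionSchema is called Collection and the target document is identified by an id from the request (both names are illustrative), the same $pull would look something like this:
// Remove every imgDetails entry whose imageUrl matches, then persist the change.
await Collection.updateOne(
  { _id: req.params.id },
  { $pull: { imgDetails: { imageUrl: req.body.imageUrl } } }
)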
I'm having trouble with Node & Knex.js.
I'm trying to build a mini blog with posts, and I'm adding functionality to attach multiple tags to a post.
I have a Posts model with the following properties:
id SERIAL PRIMARY KEY NOT NULL,
name TEXT,
Second, I have a Tags model that is used for storing tags:
id SERIAL PRIMARY KEY NOT NULL,
name TEXT
And I have a many-to-many table, post_tags, that references posts & tags:
id SERIAL PRIMARY KEY NOT NULL,
post_id INTEGER NOT NULL REFERENCES posts ON DELETE CASCADE,
tag_id INTEGER NOT NULL REFERENCES tags ON DELETE CASCADE
I have managed to insert tags and create a post with tags,
but when I want to fetch the post data with the tags attached to that post, I run into trouble.
Here is the problem:
const data = await knex.select('posts.name as postName', 'tags.name as tagName')
  .from('posts')
  .leftJoin('post_tags', 'posts.id', 'post_tags.post_id')
  .leftJoin('tags', 'tags.id', 'post_tags.tag_id')
  .where('posts.id', id)
That query returns this result:
[
{
postName: 'Post 1',
tagName: 'Youtube',
},
{
postName: 'Post 1',
tagName: 'Funny',
}
]
But I want the result to be formatted & returned like this:
{
postName: 'Post 1',
tagName: ['Youtube', 'Funny'],
}
Is that even possible with a query, or do I have to format the data manually?
One way of doing this is to use some kind of aggregate function. If you're using PostgreSQL:
const data = await knex.select('posts.name as postName', knex.raw('ARRAY_AGG (tags.name) tags'))
.from('posts')
.innerJoin('post_tags', 'posts.id', 'post_tags.post_id')
.innerJoin('tags', 'tags.id', 'post_tags.tag_id')
.where('posts.id', id)
.groupBy("postName")
.orderBy("postName")
.first();
->
{ postName: 'post1', tags: [ 'tag1', 'tag2', 'tag3' ] }
For MySQL:
const data = await knex.select('posts.name as postName', knex.raw('GROUP_CONCAT (tags.name) as tags'))
.from('posts')
.innerJoin('post_tags', 'posts.id', 'post_tags.post_id')
.innerJoin('tags', 'tags.id', 'post_tags.tag_id')
.where('posts.id', id)
.groupBy("postName")
.orderBy("postName")
.first()
.then(res => Object.assign(res, { tags: res.tags.split(',')}))
There are no arrays in MySQL, and GROUP_CONCAT will just concat all tags into a string, so we need to split them manually.
->
RowDataPacket { postName: 'post1', tags: [ 'tag1', 'tag2', 'tag3' ] }
The result is correct as that is how SQL works - it returns rows of data. SQL has no concept of returning anything other than a table (think CSV data or Excel spreadsheet).
There are some interesting things you can do in SQL to convert the tags into a single concatenated string, but that is not really what you want. Either way, you will need to add a post-processing step.
With your current query you can simply do something like this:
function formatter (result) {
  let set = {};
  result.forEach(row => {
    if (set[row.postName] === undefined) {
      set[row.postName] = row;
      set[row.postName].tagName = [set[row.postName].tagName];
    } else {
      set[row.postName].tagName.push(row.tagName);
    }
  });
  return Object.values(set);
}
// ...
query.then(formatter);
This shouldn't be slow as you're only looping through the results once.
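For the two rows returned by the query in the question, that gives roughly:
const grouped = formatter([
  { postName: 'Post 1', tagName: 'Youtube' },
  { postName: 'Post 1', tagName: 'Funny' },
]);
// -> [ { postName: 'Post 1', tagName: [ 'Youtube', 'Funny' ] } ]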
This is my data saved in Elasticsearch:
{
  index: productName,
  type: 'users',
  body: {
    name: 'xyz',
    subject: {
      12: {
        id: 12,
        name: 'Maths',
        count: 3
      },
      13: {
        id: 13,
        name: 'Physics',
        count: 7
      }
    }
  }
}
Is there a way to search and get the total number of users whose count in Maths is greater than 0, where 12 is held in a variable, say subject_id?
I tried searching the docs but couldn't find an example to use.
I am new to Elasticsearch; any help would be appreciated, thanks.
Create an object first, like this:
var queryObj = {
  "query": {
    "range": {
    }
  }
};
queryObj.query.range['subject.' + subject_id + '.count'] = {
  "gte": 1
};
Then pass this object as the body of the Elasticsearch search call, like this:
elasticClient.search({
  index: indexName,
  type: type,
  body: queryObj
}).then(promiseFunc)
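The total count is then available on the response's hits. Its exact shape depends on the Elasticsearch version (a plain number on older clusters, an object with a value field from 7.x on), so a defensive read might look something like this:
elasticClient.search({
  index: indexName,
  type: type,
  body: queryObj
}).then(result => {
  // hits.total is a number on older clusters and { value, relation } on newer ones.
  const total = typeof result.hits.total === 'number'
    ? result.hits.total
    : result.hits.total.value
  console.log(`Users with count > 0 in subject ${subject_id}: ${total}`)
})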
I have an object array in a reducer that looks like this:
[
{id:1, name:Mark, email:mark#email.com},
{id:2, name:Paul, email:paul#gmail.com},
{id:3,name:sally, email:sally#email.com}
]
Below is my reducer. So far, I can add a new object to the currentPeople reducer via the following:
const INITIAL_STATE = { currentPeople: [] };

export default function(state = INITIAL_STATE, action) {
  switch (action.type) {
    case ADD_PERSON:
      return { ...state, currentPeople: [...state.currentPeople, action.payload] };
  }
  return state;
}
But here is where I'm stuck. Can I UPDATE a person via the reducer using lodash?
If I sent an action payload that looked like this:
{id: 1, name: 'Eric', email: 'eric@email.com'}
Would I be able to replace the object with the id of 1 with the new fields?
Yes you can absolutely update an object in an array like you want to. And you don't need to change your data structure if you don't want to. You could add a case like this to your reducer:
case UPDATE_PERSON:
return {
...state,
currentPeople: state.currentPeople.map(person => {
if (person.id === action.payload.id) {
return action.payload;
}
return person;
}),
};
This can be shortened as well, using an implicit return and a ternary:
case UPDATE_PERSON:
return {
...state,
currentPeople: state.currentPeople.map(person => (person.id === action.payload.id) ? action.payload : person),
};
Mihir's idea about mapping your data to an object with normalizr is certainly a possibility and technically it'd be faster to update the user with the reference instead of doing the loop (after initial mapping was done). But if you want to keep your data structure, this approach will work.
Also, mapping like this is just one of many ways to update the object, and it requires browser support for Array.prototype.map(). You could use lodash's indexOf() to find the index of the user you want (this is nice because it stops the loop when it succeeds instead of continuing over everything as .map() would); once you have the index, you can overwrite the object directly at that index. Make sure you don't mutate the Redux state, though: you'll need to be working on a clone if you want to assign like this: clonedArray[foundIndex] = action.payload;.
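A minimal sketch of that approach, assuming lodash is imported as _ and using findIndex() (which takes the predicate directly, so it's a close stand-in for the find/indexOf combination) plus a cloned array so the existing state isn't mutated:
case UPDATE_PERSON: {
  // Find the position of the person to replace; stops at the first match.
  const index = _.findIndex(state.currentPeople, { id: action.payload.id });
  if (index === -1) return state;
  // Clone the array, then overwrite the object at that index.
  const currentPeople = [...state.currentPeople];
  currentPeople[index] = action.payload;
  return { ...state, currentPeople };
}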
This is a good candidate for data normalization. You can effectively replace your data with the new one, if you normalize the data before storing it in your state tree.
This example is straight from Normalizr.
[{
id: 1,
title: 'Some Article',
author: {
id: 1,
name: 'Dan'
}
}, {
id: 2,
title: 'Other Article',
author: {
id: 1,
name: 'Dan'
}
}]
Can be normalized this way-
{
result: [1, 2],
entities: {
articles: {
1: {
id: 1,
title: 'Some Article',
author: 1
},
2: {
id: 2,
title: 'Other Article',
author: 1
}
},
users: {
1: {
id: 1,
name: 'Dan'
}
}
}
}
What's the advantage of normalization?
You get to extract the exact part of your state tree that you want.
For instance, you have an array of objects containing information about the articles. If you want to select a particular object from that array, you'll have to iterate through the entire array; the worst case is that the desired object is not present in the array at all. To overcome this, we normalize the data.
To normalize the data, store the unique identifier of each object in a separate array. Let's call that array result.
result: [1, 2, 3 ..]
And transform the array of objects into an object keyed by id (see the second snippet). Call that object entities.
Ultimately, to access the object with id 1, simply do this: entities.articles["1"].
If you want to replace the old data with new data, you can do this-
entities.articles["1"] = newObj;
Use the native splice method of the array:
/* Find the item's index using lodash */
var index = _.indexOf(currentPeople, _.find(currentPeople, {id: 1}));
/* Replace the item at that index using splice */
currentPeople.splice(index, 1, {id: 1, name: 'Mark', email: 'mark@email.com'});