Querying a Riak DB by a Subkey - node.js

I'm using riak-js and node.js
If I have a document as follows:
{
  'skey1': 'val1',
  'skey2': 'val2',
  ......,
  'skeyn': 'valn'
}
How can I return a document that has skey2 = 'val2'? How would this be done in node.js / riak-js?

If your storage backend is riak_kv_eleveldb_backend, you can use secondary indexing.
db.save('airlines', 'KLM', {country: 'NL', established: 1919}, {index: {country: 'NL', established: 1919}});
and query like
db.query('airlines', {country: 'NL'});
(The above snippets are shamelessly stolen from riak-js.org.)
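Applied to the document in the question, the same pattern would look roughly like this (a sketch; the bucket and key names are illustrative, and the subkey has to be written as a secondary index when the object is saved):
// A sketch mirroring the riak-js snippets above; 'mybucket' and 'doc1' are made up.
db.save('mybucket', 'doc1',
  { skey1: 'val1', skey2: 'val2' },
  { index: { skey2: 'val2' } });

db.query('mybucket', { skey2: 'val2' }, function(err, keys) {
  // keys lists the matching object keys; fetch each with db.get as needed
});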

Related

Storing and querying JSON arrays in RedisJSON with Node.js

What I was hoping to do was store an array of objects using RedisJSON very simply and then query that array.
I have something similar to this:
const data = [
  {
    _id: '63e7d1d85ad7e2f69df8ed6e',
    artist: {
      genre: 'rock',
    },
  },
  {
    _id: '63e7d1d85ad7e2f69df8ed6f',
    artist: {
      genre: 'metal',
    },
  },
  {
    _id: '63e7d1d85ad7e2f69df8ed6g',
    artist: {
      genre: 'rock',
    },
  },
]
then I can easily store and retrieve this:
await redisClient.json.set(cacheKey, '$', data)
await redisClient.json.get(cacheKey)
This works great, but now I also want to query this data. I've tried creating an index as below:
await redisClient.ft.create(
  `idx:gigs`,
  {
    '$.[0].artist.genre': {
      type: SchemaFieldTypes.TEXT,
      AS: 'genre',
    },
  },
  {
    ON: 'JSON',
    PREFIX: 'GIGS',
  }
)
When I search this index, I expect it to return the 2 documents matching the search filter, but instead it always returns the entire array:
const searchResult = await redisClient.ft.search(`idx:gigs`, '#genre:(rock)')
produces:
{
  total: 1,
  documents: [
    { id: 'cacheKey', value: [Array] }
  ]
}
I can't quite work out at which level I'm getting this wrong, but any help would be greatly appreciated.
Is it possible to store an array of objects and then search the nested objects for nested values with RedisJSON?
The search capability in Redis Stack treats each key containing a JSON document as a separate search index entry. I think what you are doing is storing your whole array of documents in a single Redis key, which means any match will return the document at that key, which contains all of your data.
I would suggest that you store each object in your data array as its own key in Redis. Make sure that these will be indexed by using the GIGS prefix in the key name, for example GIGS:63e7d1d85ad7e2f69df8ed6e and GIGS:63e7d1d85ad7e2f69df8ed6f.
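For example (a minimal sketch assuming the same data array and redisClient as in the question; the exact key names are illustrative), each object could be written under its own key:
// A minimal sketch: one RedisJSON document per gig, keyed by its _id.
for (const gig of data) {
  await redisClient.json.set(`GIGS:${gig._id}`, '$', gig)
}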
You'd want to change your index definition to account for each document being an object too so it would look something like this:
await redisClient.ft.create(
  `idx:gigs`,
  {
    '$.artist.genre': {
      type: SchemaFieldTypes.TEXT,
      AS: 'genre',
    },
  },
  {
    ON: 'JSON',
    PREFIX: 'GIGS:',
  }
)
Note I also updated your PREFIX to be GIGS: rather than GIGS. This isn't strictly necessary, but it does stop your index from accidentally picking up other keys in Redis whose names begin with GIGS<whatever other characters>.
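With each gig stored under its own GIGS: key and the index above in place, a search should then return only the matching documents (a sketch; @genre:rock is the usual query form for a TEXT field):
const searchResult = await redisClient.ft.search('idx:gigs', '@genre:rock')
// searchResult.documents should now contain one entry per matching GIGS:* key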

Insert or Update into Mongo using mongoose and modify inner elements while updating

Assume there's a collection with documents where field_1 is unique
[
  {
    field_1: 'abc',
    field_2: 0,
    field_3: []
  }
]
I want to add another document, but field_1 may be the same 'abc'. In that case I want to increment field_2 and append an element to field_3 while updating. If field_1 is different, a new document should be created.
What is the best way to approach such queries? My first thought was to search, and then insert if no documents are found, or is there a better way? The problem with this approach is that if I'm inserting multiple documents at once, I can't use the 'search, and if no doc is found, insert' approach effectively.
Mongoose now supports this natively with findOneAndUpdate (calls MongoDB findAndModify).
The upsert: true option creates the document if it doesn't exist; it defaults to false.
MyModel.findOneAndUpdate(
  {foo: 'bar'}, // find a document with that filter
  modelDoc, // document to insert when nothing was found
  {upsert: true, new: true, runValidators: true}, // options
  function (err, doc) { // callback
    if (err) {
      // handle error
    } else {
      // handle document
    }
  }
);
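Applied to the fields in the question, the update can combine $inc and $push so an existing document is modified and a new one is created otherwise (a sketch; MyModel and newElement are illustrative, and note that on a fresh insert $inc leaves field_2 at 1 rather than 0):
// A sketch using the question's field names; MyModel and newElement are illustrative.
MyModel.findOneAndUpdate(
  { field_1: 'abc' },                 // the unique field, also written on insert
  {
    $inc: { field_2: 1 },             // increments the counter (creates it as 1 on insert)
    $push: { field_3: newElement }    // appends to the array (creates it on insert)
  },
  { upsert: true, new: true },
  function (err, doc) {
    // doc is the updated or newly inserted document
  }
);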
If the uniqueness of field_1 is enforced by a unique index, you can use a kind of optimistic locking.
First you try to update:
db.collection.update(
  {
    field_1: 'abc'
  },
  {
    $inc: {field_2: 1},
    $push: {field_3: 'abc'},
  }
);
and check the result of the operation: if 1 document was updated, no more actions are required. Otherwise, it's the first document with field_1 == 'abc', so you try to insert it:
db.collection.insert(
  {
    field_1: 'abc',
    field_2: 0,
    field_3: []
  }
);
and catch the error. If there is no error, no more actions are required. Otherwise there was a concurrent insert, so you need to repeat the update query once more.
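Put together, the update-then-insert pattern looks roughly like this (a sketch using the node MongoDB driver; the function name and retry handling are illustrative):
// A rough sketch of the pattern described above; retries the update after a
// duplicate-key error, which indicates a concurrent insert won the race.
async function upsertAbc(collection, newElement) {
  const res = await collection.updateOne(
    { field_1: 'abc' },
    { $inc: { field_2: 1 }, $push: { field_3: newElement } }
  );
  if (res.matchedCount === 1) return;            // existing document was updated
  try {
    await collection.insertOne({ field_1: 'abc', field_2: 0, field_3: [] });
  } catch (err) {
    if (err.code === 11000) {                    // duplicate key: concurrent insert
      return upsertAbc(collection, newElement);  // repeat the update once more
    }
    throw err;
  }
}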

How to create LIKE operator search in Redis cache using nodejs?

I have a question: is it possible to create a LIKE operator search in Redis, similar to a relational database (MySQL/Oracle)?
I have a complex JSON document:
{"_id" : ObjectId("581c8b8854fdcd1ff8c944e0"),
"Objectcode" : "xxxxx",
"Objecttype" : "xxxx",
"docid" : "581c8b8554fdcd1ff8c93d10",
"description" : "Tags based search .... ",
"metaTags" : [
"tag1",
"tag2",
"tag3",
"tag5",
"tag6",
"tag7",
"tag8",
"tag9",
"tag10"
],
"__v" : 0
}
I want to search on the metaTags array. How can I do it?
Thanks
You can use the SCAN command's MATCH option to search data.
E.g.: SCAN 0 MATCH *11*
Refer: http://redis.io/commands/scan
You can use the Redis *SCAN family of commands (http://redis.io/commands/scan) to filter by a pattern, depending on your type of data:
SCAN iterates the set of keys in the currently selected Redis database.
SSCAN iterates elements of Sets types.
HSCAN iterates fields of Hash types and their associated values.
ZSCAN iterates elements of Sorted Set types and their associated scores.
Never use KEYS in app code, because it may ruin performance.
The two major Node.js Redis client libraries, node_redis and ioredis, support it with some syntactic sugar:
const Redis = require('ioredis');
const assert = require('assert');

const redis = new Redis(); // ioredis
let keys = [];             // must be `let`, since it gets reassigned below
redis.mset('foo1', 1, 'foo2', 1, 'foo3', 1, 'foo4', 1, 'foo10', 1, () => {
  const stream = redis.scanStream(); // emits batches of matching keys
  stream.on('data', data => {
    keys = keys.concat(data);
  });
  stream.on('end', () => {
    assert.deepEqual(keys.sort(), ['foo1', 'foo10', 'foo2', 'foo3', 'foo4']);
  });
});
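For the metaTags in the question, the same idea applies if the tags are stored in a Redis set: sscanStream iterates the set's members and MATCH filters them (a sketch; the key name and pattern are illustrative):
// A sketch assuming the tags are stored in a Redis set under an illustrative key.
const tagStream = redis.sscanStream('doc:581c8b85:metaTags', { match: 'tag1*' });
tagStream.on('data', tags => {
  console.log('matching tags:', tags); // e.g. ['tag1', 'tag10']
});
tagStream.on('end', () => console.log('done'));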

Very slow update performance

I am parsing a CSV file. For each row I want to check whether a corresponding entry exists in the database; if it does, I want to update it, and if it doesn't, I want to insert a new entry.
It is very slow - only around 30 entries per second.
Am I doing something incorrectly?
Using node, mongodb, monk
function loadShopsCSV(ShopsName) {
  var filename = 'test.csv';
  csv
    .fromPath(filename)
    .on("data", function(data) {
      var entry = {
        PeriodEST: Date.parse(data[0]),
        TextDate: textDateM,
        ShopId: parseInt(data[1]),
        ShopName: data[2],
        State: data[3],
        AreaUS: parseInt(data[4]),
        AreaUSX: AreaUSArray[stateArray.indexOf(data[3])],
        ProductClass: data[5],
        Type: data[6],
        SumNetVolume: parseInt(data[7]),
        Weekday: weekdayNum,
        WeightedAvgPrice: parseFloat(data[8]),
      };
      db.get(ShopsDBname).update(
        {
          "PeriodEST": entry.PeriodEST,
          "ShopName": entry.ShopName,
          "State": entry.State,
          "AreaUS": entry.AreaUS,
          "ProductClass": entry.ProductClass,
          "Type": entry.Type
        },
        { $set: entry },
        function(err, result) {
          if (err) console.error(err);
        }
      );
    })
    .on("end", function() {
      console.log('finished loading: ' + ShopsName);
    });
}
First, I would suggest localizing the problem:
Replace .on("data", function(data) {...}) with a dummy .on("data", function() { return; }) and confirm the speed of the CSV parsing alone.
Turn on the mongo profiler with db.setProfilingLevel(1) and check the slow log for any query slower than 100 ms.
If neither shows a problem, the bottleneck is in one of the Node.js libraries you are using to prepare and send the query.
Assuming the problem is slow MongoDB queries, you can use explain on the update query for details. It may be that the query does not use any index and runs a collection scan for every update.
Finally, it is recommended to use bulk operations, which were designed for exactly your use case; see the sketch below.
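For example (a rough sketch, not the question's exact code: it assumes the parsed rows are collected into an array first and that the underlying driver collection is reachable, e.g. via monk's .col), the per-row updates can be sent as one batch of upserts:
// A rough sketch: batch the per-row updates into one unordered bulkWrite with
// upsert, assuming access to the underlying driver collection (e.g. monk's .col).
var ops = entries.map(function(entry) {   // entries: the parsed CSV rows
  return {
    updateOne: {
      filter: {
        PeriodEST: entry.PeriodEST,
        ShopName: entry.ShopName,
        State: entry.State,
        AreaUS: entry.AreaUS,
        ProductClass: entry.ProductClass,
        Type: entry.Type
      },
      update: { $set: entry },
      upsert: true
    }
  };
});
db.get(ShopsDBname).col.bulkWrite(ops, { ordered: false }, function(err, result) {
  if (err) console.error(err);
});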
Have you tried updating with no write concern? MongoDB blocks until the whole update is successful and the database sends back an acknowledgement. Are you on a cluster or something? (You might want to write to the primary node if so.)
After your {$set : entry}, pass the option:
{writeConcern: {w: 0}}
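In the update call from the question, that option would sit roughly here (a sketch; depending on the driver version the option may need to be the plain {w: 0} form, and with w: 0 the callback no longer reports per-write errors):
// A sketch: the write-concern option goes in the options argument of the update call.
db.get(ShopsDBname).update(
  { "PeriodEST": entry.PeriodEST, "ShopName": entry.ShopName /* ...other filter fields */ },
  { $set: entry },
  { writeConcern: { w: 0 } },
  function(err, result) {}
);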

Issue in inserting multiple documents (Bulk Insert) into a MongoDB collection using monk in Node.js

I am trying to insert multiple documents into a collection in a MongoDB database, but I am only getting one object ID back in the response (multiple documents are getting created in the DB).
Code:
exports.createRelation = function(relationDoc, db, callback) {
  var relations = db.get("relations");
  relations
    .insert( relationDoc )
    .error( function ( err ) {
      callback(err, null);
    })
    .success( function ( doc ) {
      callback(null, doc);
    });
};
In this case, relationDoc would be an array.
Input:
newRelDocs_Array: [ {
    path: [ 53d0b191c5ac61d403b0090d ],
    tob: 1405343247
  },
  {
    path: [ 53d0b191c5ac61d403b0090d ],
    tob: 1405343247
  } ]
Response:
createRelation(): Success
createRelation_db: {
  _id: 546a1d6f65c05d1c37660c4c,
  tob: 1405343247
}
FYI, I am using Node.js and connecting to MongoDB with monk.
In your question you say relationDoc is an array, which I presume looks something like this:
relationDoc = [{
  id: 1,
  relation: 2
}, {
  id: 2,
  relation: 1
}];
In monk, the insert function does not support MongoDB 2.4's built-in Bulk Insert. See http://docs.mongodb.org/manual/reference/method/Bulk.insert/
With your code the way it is, you are creating a single document with 2 fields: the auto-generated _id and a second field that holds your whole array.
To do what you want to do, you will need to modify your code like this:
for (var i = 0; i < relationDoc.length; i++) {
  relations.insert(relationDoc[i]);
  // ...add promise code for success and error stuff
}
Probably not as efficient as the Bulk Insert provided in the native driver, but should get you to where you need to go with monk.
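For instance (a sketch assuming monk's insert() returns a then-able promise, as recent versions do), the per-document inserts can be collected so the callback fires once, after all of them finish:
// A sketch: one insert promise per document, resolved together with Promise.all.
var inserts = relationDoc.map(function (doc) {
  return relations.insert(doc);
});
Promise.all(inserts)
  .then(function (docs) { callback(null, docs); })
  .catch(function (err) { callback(err, null); });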
The Monk collection wrapper provides access to the underlying collection object via the .col field. You can use it to access methods not implemented by Monk, such as bulk insert and aggregate:
var relations = db.get("relations");
var docs = [ {
  path: [ "53d0b191c5ac61d403b0090d" ],
  tob: 1405343247
}, {
  path: [ "53d0b191c5ac61d403b0090d" ],
  tob: 1405343247
} ];
relations.col.insert(docs, callback);
Note that with .col you will not be able to use the promises functionality provided by monk, however.
