I am storing data in Couchbase from a Node.js app using the following code (couchnode client):
// Expiry information for Couchbase documents, fetched from config.json
// Expiry less than 30*24*60*60 (30 days):
//   the value is interpreted as the number of seconds from the point of storage or update.
// Expiry greater than 30*24*60*60:
//   the value is interpreted as the number of seconds from the epoch (January 1st, 1970).
// Expiry of 0:
//   this disables expiry for the item.
CouchBaseDB.set(keytosave, 60, doctosave, function (err, meta) {
  if (err) { console.log(err); } else { console.log('saved'); }
});
Unfortunately, the above code is not working (it saves 60 itself instead of the doctosave object), and nowhere is it explained how to set the expiry other than in Chapter 4. Java Method Summary - 4.4. Expiry Values.
Has anyone come across the same issue and found a workaround/solution, or any documentation for it? An explanation would be a great help.
Thanks in advance.
The set function looks like this:
function set(key, doc, meta, callback) { ... }
If you want to add an expiry for the stored key, just create a meta = {} object and add an expiry field to it: meta.expiry = 1000.
Here is a link to the sources.
So to store your doc you need:
var meta = { expiry: 60 };
CouchBaseDB.set(keytosave, doctosave, meta, function (err, meta) {
  if (err) { console.log(err); } else { console.log('saved'); }
});
Note: also, if that key is retrieved from Couchbase via CouchBaseDB.get(), the meta can be extracted from the get callback.
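As an illustration, here is a minimal sketch of that pattern, assuming the same old couchnode API used above, where get() hands the document and its meta back to the callback (the exact callback arguments depend on the client version, so treat this as an assumption):
// Hypothetical sketch: read a document, reuse its meta when writing it back.
// Assumes CouchBaseDB is the connected couchnode client from the question.
CouchBaseDB.get(keytosave, function (err, doc, meta) {
  if (err) { return console.log(err); }

  doc.updatedAt = Date.now(); // modify the document as needed
  meta.expiry = 60;           // keep (or refresh) a 60-second expiry

  CouchBaseDB.set(keytosave, doc, meta, function (err) {
    if (err) { console.log(err); } else { console.log('saved with expiry'); }
  });
});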
Can someone recommend the best solution for getting the latest record belonging to a partition key when using the azure-storage library in Node? Since there is no .orderBy() option, what is the best approach?
In C# I would probably do something like:
var latestRecord = results.OrderByDescending(r => r.Timestamp).FirstOrDefault();
What would the equivalent be when using this node library for Table Storage?
We need to implement a custom compare function to sort the results.
tableService.queryEntities(tableName, query, null, function(error, result, response) {
  if (!error) {
    var latestRecord = result.entries.sort((a, b) => {
      return new Date(b.Timestamp._) - new Date(a.Timestamp._);
    })[0];
  }
});
Results are decorated with the Edm type, so we need b.Timestamp._:
Timestamp: { '$': 'Edm.DateTime', _: 2018-10-26T07:32:58.490Z }
If this format is somehow unpleasant, we can get entities without metadata from the response. We need to set payloadFormat.
tableService.queryEntities(tableName, query, null, { payloadFormat: azure.TableUtilities.PayloadFormat.NO_METADATA }, function(error, result, response) {
  if (!error) {
    var latestRecord = response.body.value.sort((a, b) => {
      return new Date(b.Timestamp) - new Date(a.Timestamp);
    })[0];
  }
});
I've been trying to find sample usage for some of the static methods of a persistedModel in LoopBack.
https://apidocs.strongloop.com/loopback/#persistedmodel-prototype-updateattribute
It just says:
persistedModel.updateAttributes(data, callback)
But how do I choose which record I want to update? This is not working for me:
var order = Order.setId('whateverrecordId');
order.updateAttributes({ name: 'new name' }, callback);
Loving LoopBack.. but their docs suck.. :(
You can use those in an event listener like 'after save'.
Example:
Model.observe('after save', function(ctx, next) {
  ctx.instance.updateAttribute('fieldname', 'new value');
  next();
});
1- What you did was right, but I do not advise this method: it's used for instance methods, and generally to update fields like a date for the whole collection you have, so you don't need an id for it.
But you can try to make an array containing the data to update, including the ids, and then make a comparison to fill in the data for the ids that you have (in #DoSomething below).
order.find().then(function(orders) {
  orders.forEach(function(element) {
    order.setId(element.id);
    // #DoSomething: build the data to update for this id
    order.updateAttributes({ new: data }, function(err, instance) {
      console.log(instance);
    });
  });
});
2- You can use updateAll to update one or many attributes.
PersistedModel.updateAll([where], data, callback)
var updates = [{ id: 1, name: name1 }, ...];
updates.forEach(function(element) {
  order.updateAll({ id: element.id }, { name: element.name }, function(err, count) {
    if (err) {
      console.error(err);
    }
    console.log(count); // number of records updated
  });
});
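For the original question's case of a single known id, a minimal sketch of the same updateAll approach without the loop, assuming the Order model from the question (the shape of the callback's second argument varies by LoopBack version, so treat that part as an assumption):
// Hypothetical sketch: update one record directly by its id.
Order.updateAll({ id: 'whateverrecordId' }, { name: 'new name' }, function(err, info) {
  if (err) { return console.error(err); }
  console.log(info); // update summary (e.g. info.count) in recent LoopBack versions
});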
I was writing code for a cryptocurrency rates display backend, and I have 2 main problems.
First:
last: {
  rate: (idx, cb) => {
    models.master.find({}).where('id', idx).limit(10).exec((err, data) => {
      if (data && data.length > 0) cb(data);
      else cb(err);
    });
  }
}
Have a look at the code: it always returns only one JSON data object, when instead it should return multiple.
Second:
How can I sort by the most recently added data? I already added a time key in order to do so, but I lack the knowledge to use it (both ascending and descending):
"time":"5-7-2017_12:05:43:PM"
Test MongoDB object:
[{"_id":"595c88bf4d451b206454434b","time":"5-7-2017_12:05:43:PM","id":"bitcoin","name":"Bitcoin","symbol":"BTC","rank":"1","price_usd":"2561.25","price_btc":"1.0","market_cap_usd":"42078648188.0","available_supply":"16428950.0","total_supply":"16428950.0","percent_change_1h":"0.2","percent_change_24h":"-2.42","percent_change_7d":"2.13","last_updated":"1499236459","price_inr":"165829.411875","market_cap_inr":"2724403116224","__v":0}]
I'm working on a Node.js project and trying to understand how MongoDB works. I'm obtaining data hourly via a cron job. I'd like the data to be unique, so I'm using update instead of insert; that works fine. I'd also like the data to expire after three days, but it's not clear to me how to do that.
In pseudo code:
Set up vars, URLs, and a couple of global variables (lineNr = 1, end_index = #), including databaseUrl.
MongoClient.connect(databaseUrl, function(err, db) {
  assert.equal(null, err, "Database Connection Troubles: " + err);

  // **** (update) ****
  db.collection('XYZ_Collection').createIndex({ "createdAt": 1 },
    { expireAfterSeconds: 120 }, function() {});

  s = fs.createReadStream(text_file_directory + 'master_index.txt')
    .pipe(es.split())
    .pipe(es.mapSync(function(line) {
      s.pause(); // pause the readstream
      lineNr += 1;
      getContentFunction(line, s);
      if (lineNr > end_index) {
        s.end();
      }
    })
    .on('error', function() {
      console.log('Error while reading file.');
    })
    .on('end', function() {
      console.log('All done!');
    })
  );
function getContentFunction(line, stream) {
  // (get content, format it, store it as flat JSON CleanedUpContent)
  var go = InsertContentToDB(db, CleanedUpContent, function() {
    stream.resume();
  });
}
function InsertContentToDB(db, data, callback) {
  // (expiration TTL code, if placed here, generates errors too...)
  db.collection('XYZ_collection').update({
    'ABC': data.abc,
    'DEF': data.def
  }, {
    "createdAt": new Date(),
    'ABC': data.abc,
    'DEF': data.def,
    'Content': data.blah_blah
  }, {
    upsert: true
  },
  function(err, results) {
    assert.equal(null, err, "MongoDB Troubles: " + err);
    callback();
  });
}
So the db.collection('XYZ_collection').update() call with two fields in the query acts like a compound key to ensure the data is unique, and upsert: true allows for insertion or updates as appropriate. My data varies greatly: some content is unique, other content is an update of a prior submission. I think I have this unique insert-or-update function working correctly. Info from... and here
What I'd really like to add is an automatic expiration for the documents within the collection. I see lots of content about this, but I'm at a loss as to how to implement it.
If I try
db.collection('XYZ_collection')
  .ensureIndex({ "createdAt": 1 },
    { expireAfterSeconds: 259200 }); // three days
Error
/opt/rh/nodejs010/root/usr/lib/node_modules/mongodb/lib/mongodb/mongo_client.js:390
throw err
^
Error: Cannot use a writeConcern without a provided callback
at Db.ensureIndex (/opt/rh/nodejs010/root/usr/lib/node_modules/mongodb/lib/mongodb/db.js:1237:11)
at Collection.ensureIndex (/opt/rh/nodejs010/root/usr/lib/node_modules/mongodb/lib/mongodb/collection.js:1037:11)
at tempPrice (/var/lib/openshift/56d567467628e1717b000023/app-root/runtime/repo/get_options_prices.js:57:37)
at /opt/rh/nodejs010/root/usr/lib/node_modules/mongodb/lib/mongodb/mongo_client.js:387:15
at process._tickCallback (node.js:442:13)
If I try to use createIndex I get this error...
`TypeError: Cannot call method 'createIndex' of undefined`
Note the database is totally empty (via db.XYZ_collection.drop()), so yeah, I'm new to the Mongo stuff. Does anybody understand what I need to do? One note: I'm very confused by something I read regarding not being able to create a TTL index if the indexed field is already in use by another index. I think I'm okay, but it's not clear to me.
There are some restrictions on choosing a TTL index: you can't create a TTL index if the indexed field is already used in another index; the index can't have multiple fields; and the indexed field should be a Date BSON type.
As always, many thanks for your help.
Update: I've added the createIndex code above. With an empty callback it runs without error, but the TTL system still fails to remove entries at all, sigh.
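For reference, here is a minimal sketch of how a TTL index and an upsert are often combined with the Node.js MongoDB driver. It is an assumption-laden illustration, not a diagnosis of the error above: a 2.x-style driver API, the collection and field names from the question, and the use of $setOnInsert are all assumptions.
// Hypothetical sketch: TTL index + upsert (databaseUrl and data come from the question's code).
var MongoClient = require('mongodb').MongoClient;
var assert = require('assert');

MongoClient.connect(databaseUrl, function(err, db) {
  assert.equal(null, err);
  // Collection names are case-sensitive; the index and the updates must target the same one.
  var collection = db.collection('XYZ_collection');

  // Create the TTL index with a callback (the driver error above complains when one is missing).
  collection.createIndex({ createdAt: 1 }, { expireAfterSeconds: 259200 }, function(err) {
    assert.equal(null, err);

    // $setOnInsert keeps the original createdAt, so the 3-day clock starts at
    // first insertion instead of resetting on every update.
    collection.update(
      { ABC: data.abc, DEF: data.def },
      { $set: { Content: data.blah_blah }, $setOnInsert: { createdAt: new Date() } },
      { upsert: true },
      function(err, results) {
        assert.equal(null, err);
        // The TTL monitor runs roughly once a minute, so expired documents
        // may linger briefly before they are removed.
      }
    );
  });
});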
I'm using Node.js with the cradle module to interact with the CouchDB server. The question is about understanding the reduce process so I can improve the view query.
For example, I want to get a user's data from their ID with a view like this:
map: function (doc) { emit(null, doc); }
And in Node.js (with cradle):
db.view('users/getUserByID', function (err, resp) {
  var found = false;
  resp.forEach(function (key, row, id) {
    if (id == userID) {
      found = true;
      userData = row;
    }
  });
  if (found) {
    // good, works
  }
});
As you can see, this is really bad for a large number of documents (users in the database), so I need to improve this view with a reduce, but I don't know how because I don't understand how reduce works. Thank you.
First of all, you're doing views wrong. Views are indexes in the first place, and you shouldn't use them for full-scan operations - that's ineffective and wrong. Use the power of the B-tree index with the key, startkey and endkey query parameters, and emit the field you would like to search on as the key value.
Second, your example could easily be transformed to:
db.get(userID, function(err, body) {
  if (!err) {
    // found!
  }
});
In your loop you're checking each row's document id against your userID value, so there is no need for that loop - you can request the document by its ID directly.
Third, if your userID value doesn't match the document's ID, your view should be:
function (doc) { emit(doc.userID, null); }
and your code will look like:
db.view('users/getUserByID', { key: userID }, function (err, resp) {
  if (!err) {
    // found!
  }
});
Simple. Effective. Fast. If you need the matched doc, use the include_docs: true query parameter to fetch it.
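For completeness, a minimal sketch of where that view would live and what the include_docs variant looks like with cradle (the design document name 'users' and the userID field are assumptions carried over from above):
// Hypothetical sketch: save the design document once, then query it by key.
db.save('_design/users', {
  views: {
    getUserByID: {
      map: 'function (doc) { emit(doc.userID, null); }'
    }
  }
});

// include_docs: true makes CouchDB attach the matching document to each row.
db.view('users/getUserByID', { key: userID, include_docs: true }, function (err, resp) {
  if (!err) {
    // each returned row now carries the full user document
  }
});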