I am looking to update each document inserted into a CouchDB database with the current timestamp. I have looked at the document update handler feature, which can update individual documents, but from the documentation it appears that an explicit POST/PUT request needs to be fired at the handler function along with the doc ID, something like this:
http://127.0.0.1:5984/my_database/_design/my_designdoc/_update/in-place-query/mydocId.
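For reference, such a handler lives in a design document and might look something like this (a rough sketch; the timestamp field name and response text are my own assumptions):

{
  "_id": "_design/my_designdoc",
  "updates": {
    "in-place-query": "function (doc, req) { if (!doc) { return [null, 'missing']; } doc.timestamp = new Date().toISOString(); return [doc, 'stamped']; }"
  }
}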
What I am looking for is a solution that will automatically create a new field with the current timestamp just before a new record is inserted. (I am using the LightCouch API on the client side to persist documents.)
Any help is appreciated.
Related
I add some data to Firestore that contains #serverstamp, but after adding the data I need to read that #serverstamp back for further processing.
If you're using server-side generated timestamps, you'll need to query the created document to read the actual timestamp value. That said, if you need the timestamp anyway, it may be more practical to generate it client-side so the extra document read isn't needed.
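With the Firebase Admin SDK for Node, that round trip looks roughly like this (a sketch; the collection and field names are assumptions):

const admin = require('firebase-admin');
admin.initializeApp();
const db = admin.firestore();

async function addAndReadStamp(data) {
  // write with a server-generated timestamp placeholder
  const ref = await db.collection('items').add({
    ...data,
    createdAt: admin.firestore.FieldValue.serverTimestamp()
  });
  // the concrete value only exists once the server resolves it, so read it back
  const snap = await ref.get();
  return snap.data().createdAt; // a Firestore Timestamp
}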
I am working on a Node.js app, and I've been searching for a way around using the Model.save() function, because I want to save many documents at the same time and doing it one by one would be a waste of network and processing.
I found a way to bulk insert. However, my model has two properties that together make a document unique, an ID and a HASH (I am getting this info from an API, so I believe I need both fields to identify a document uniquely), and I want an already existing object to be updated instead of inserted.
Is there any way to do that? I was reading about making concurrent calls to save the objects using Q, but I still think this would generate unwanted load on the Mongo server, wouldn't it? Does Mongo or Mongoose have a method to bulk insert or update, the way they do for plain inserts?
Thanks in advance
I think you are looking for the Bulk.find(<query>).upsert().update(<update>) function.
You can use it this way:
var bulk = db.yourCollection.initializeUnorderedBulkOp();
for (<your for statement>) {
    // the update document must use update operators such as $set
    bulk.find({ ID: <your id>, HASH: <your hash> }).upsert().update({ $set: { <your update fields> } });
}
bulk.execute(<your callback>);
For each document, it will look for a document matching the { ID: <your id>, HASH: <your hash> } criteria. Then:
If it finds one, it will update that document using { $set: { <your update fields> } }
Otherwise, it will create a new document
As you need, it will not make a round trip to the MongoDB server on each iteration of the for loop. Instead, a single call is made on the bulk.execute() line.
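Since the question mentions Node.js and Mongoose, the same pattern is also available there via Model.bulkWrite (a minimal sketch; the schema and field names are assumptions):

const mongoose = require('mongoose');

const itemSchema = new mongoose.Schema({ ID: Number, HASH: String, payload: String });
const Item = mongoose.model('Item', itemSchema);

// one round trip for the whole batch; upserts on the (ID, HASH) pair
async function upsertMany(items) {
  return Item.bulkWrite(items.map(item => ({
    updateOne: {
      filter: { ID: item.ID, HASH: item.HASH },
      update: { $set: item },
      upsert: true
    }
  })));
}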
In the past, with my PHP/Rails + MySQL apps, I've used the unique ID of a table record to keep track of a record in an HTML file.
So I'd keep track of how to delete a record shown like this (15 being the ID of the record):
<a href="/delete_record?id=15">Delete this record</a>
So now I'm using MongoDB. I've tried the same method, but the ObjectID _id attribute seems to be a loooong byte string that I can't use conveniently.
What's the most sensible way of binding a link in the view to a record (for deletion, or other purposes or whatever)?
If the answer is to create a new id that's unique for each document in the collection, then what's the best way to generate those unique id's?
Thank you.
You could use a counter instead of the ObjectID.
But this could create a problem when inserting a new document after you have deleted a previous one.
See this blog post for more detailed info on Sequential unique identifiers with Node.js and MongoDB.
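The counter pattern from that post boils down to something like this (a shell sketch; the counters collection name is an assumption):

// atomically increment and fetch the next sequence number
var next = db.counters.findAndModify({
  query: { _id: 'recordId' },
  update: { $inc: { seq: 1 } },
  new: true,
  upsert: true
}).seq;
// use `next` as the short, human-friendly id for the new record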
Or you could use the timestamp part of the ObjectID:
objectId.getTimestamp().toString()
See the node objectid docs
How to get Post with Comments Count in single query with CouchDB?
I can use map-reduce to build a standalone view [{key: post_id, value: comments_count}], but then I have to hit the DB twice: one query to get the post, another to get comments_count.
There's also another way (Rails does this): count comments manually on the application server and save the result in a comment_count attribute of the post. But then we need to update the whole post document every time a comment is added or deleted.
It seems to me that CouchDB is not tuned for this: unlike an RDBMS, where we can update only the comment_count attribute, in CouchDB we are forced to update the whole post document.
Maybe there's another way to do it?
Thanks.
The view's returned JSON includes the document count as 'total_rows', so you don't need to compute anything yourself; just emit all the documents you want counted.
{"total_rows":3,"offset":0,"rows":[
{"id":...,"key":...,value:doc1},
{"id":...,"key":...,value:doc2},
{"id":...,"key":...,value:doc3}]
}
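The map function behind such a view could be as simple as this (a sketch; doc.type and doc.post_id are assumptions about the document schema):

// emit one row per comment, keyed by the post it belongs to
function (doc) {
  if (doc.type === 'comment') {
    emit(doc.post_id, null);
  }
}

Querying the view with ?key=<post_id> then returns one row per comment on that post.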
I need guidance on how to update a field in CouchDB. I tried curl via the console and it works fine, but I don't understand how to do it programmatically, i.e. how to update a particular field, say 'name'. Here is the snippet for updating a document in CouchDB, which works fine and returns the updated revision id:
import org.apache.http.client.methods.HttpPut;
import org.apache.http.entity.StringEntity;

// PUT the full JSON document to <host>/<db>/<docId>
HttpPut httpPutRequest = new HttpPut(hostUrl + "/" + docId);
StringEntity body = new StringEntity(jsonDoc.toString());
httpPutRequest.setEntity(body);
httpPutRequest.setHeader("Accept", "application/json");
httpPutRequest.setHeader("Content-type", "application/json");
Partial updates are not supported by CouchDB. In other words, to update a field in the document, you must update the field in your local JSON document and push that document to CouchDB as a whole.
You can accomplish this by still issuing an HTTP PUT, ensuring the appropriate _rev is included in your document.
More details are available in the wiki.
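A minimal sketch of that read-modify-write cycle over plain HTTP (shown here with Node's built-in fetch; the database name and field are assumptions):

const base = 'http://127.0.0.1:5984/my_database';

async function updateName(docId, newName) {
  // GET the current document; it carries the current _rev
  const doc = await (await fetch(`${base}/${docId}`)).json();
  doc.name = newName; // change just the one field locally
  // PUT the whole document back; the _rev in the body avoids a conflict error
  const res = await fetch(`${base}/${docId}`, {
    method: 'PUT',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(doc)
  });
  return (await res.json()).rev; // the new revision id
}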
It is possible to support partial updates by writing your own update function.
To be clear, it is not truly a partial update: CouchDB still writes the whole target document as a new revision. But the update is done directly on the database side, so the client can send just the fields to change instead of retrieving and re-sending the whole document.
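Such an update function might look like this (a sketch; it assumes the client POSTs a JSON body containing only the fields to change):

function (doc, req) {
  if (!doc) { return [null, 'missing']; }
  var patch = JSON.parse(req.body); // e.g. {"name": "new name"}
  for (var key in patch) {
    doc[key] = patch[key]; // overwrite only the fields the client sent
  }
  return [doc, 'updated']; // CouchDB stores this as a new revision
}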