To begin with, I am new to CouchDB and new to databases in general. I have a CouchDB instance set up in a Docker container. I have another Docker container on the same box that has some Node.js code talking to this CouchDB instance.
Now, I am doing basic stuff like adding an entry and getting an entry from the db. To get an entry, this is what I do:
curl -X GET http://IP:5984/mydb/158
I get an output as follows:
{"_id":"158", "_rev":"1-47c19c00bee6417f36c7db2bc8607468", "name":{"given":["Itisha"], "family":["Iyengar"]}, "dob":"1981-06-01", "phone":{"value":"tel:312-116-1123"}, "address":{"line":["147leverettmailcenter"], "city":"Naperville", "state":"IL", "postalCode":"02770"}, "SID":""}
I pass the data to another function that processes it further. However, I only want the actual data and don't want fields like _id and _rev. How do I do that?
I read somewhere that I can log into the couchdb instance by doing http://localhost:5984/ from the machine where it is installed. Here I can edit the get script to make it return just the data and ignore the _id and _rev fields. However, I am running it from a docker container on Ubuntu. I do not have access to a UI through which I can make such changes. Is there an alternate way to do this?
If not, is there a way to parse the output and filter out the _id and _rev fields? As of now, I am doing this in a crude way with String.slice(), keeping the data after the _id and _rev fields through to the end. But I don't think this is a good way to do it, and it's definitely not a good idea for actual production code.
Please suggest.
Thanks in advance.
There are multiple ways to achieve this, depending on your needs:
method 1
Use _find by making a POST request to /db/_find, with a selector that matches the documents you want and a fields list that leaves out _id and _rev:
curl -X POST -H 'Content-Type: application/json' -d '{"selector": {"_id": "158"}, "fields": ["name", "dob", "phone", "address", "SID"]}' http://IP:5984/mydb/_find
The -d parameter supplies the request body; note that _find requires a selector and a Content-Type: application/json header.
You may need to escape the quotes differently if you're running Windows.
method 2
Use a view function
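A minimal sketch of such a map function follows. The design document and view names are my own placeholders, not something from the question. In CouchDB, emit(key, value) is supplied by the view engine; it is stubbed below so the logic can run standalone:

```javascript
// Sketch of a CouchDB view map function that emits each document
// without its _id/_rev metadata. The names _design/app and "nometa"
// used in the usage note are assumptions.
// emit(key, value) is normally provided by CouchDB; stubbed here.
const rows = [];
function emit(key, value) { rows.push({ key: key, value: value }); }

function map(doc) {
  const data = {};
  for (const key in doc) {
    if (key !== '_id' && key !== '_rev') {
      data[key] = doc[key];
    }
  }
  emit(doc._id, data);
}

// try it on a sample document
map({ _id: '158', _rev: '1-abc', dob: '1981-06-01', SID: '' });
console.log(rows[0].value); // -> { dob: '1981-06-01', SID: '' }
```

Stored in a design document (e.g. _design/app, view "nometa"), the view would then be queried with GET /mydb/_design/app/_view/nometa.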
method 3
Process the results with a simple node program
const http = require("http");

http.get({
  host: 'IP',
  port: 5984,
  path: '/mydb/158'
}, function (response) {
  var body = '';
  response.on('data', function (d) {
    body += d;
  });
  response.on('end', function () {
    var parsed = JSON.parse(body);
    var result = {};
    for (var key in parsed) {
      if (key !== "_id" && key !== "_rev") {
        result[key] = parsed[key];
      }
    }
    console.log(result);
  });
});
The above code issues a GET request to your CouchDB server, parses the JSON response, and copies the results into a new object, skipping the _id and _rev keys.
method 4
Process the output as a string. As you correctly pointed out, this is not a good solution. It's ugly, but it doesn't mean it can't be done. You could even pipe the output through sed/awk/perl and process the string there.
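If you do want to stay on the command line, a small Node one-liner is less fragile than sed/awk string surgery; this sketch strips the metadata keys with a rest pattern (shown against a sample document; substitute your curl call for the echo):

```shell
# Strip _id/_rev from a CouchDB document on the command line.
# Replace the echo with: curl -s http://IP:5984/mydb/158
echo '{"_id":"158","_rev":"1-abc","dob":"1981-06-01"}' \
  | node -e '
    let body = "";
    process.stdin.on("data", d => body += d);
    process.stdin.on("end", () => {
      const { _id, _rev, ...rest } = JSON.parse(body);
      console.log(JSON.stringify(rest));
    });'
# prints: {"dob":"1981-06-01"}
```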
Related
I am trying to update multiple records through a single PUT request using the Angular HTTP service, which in turn consumes a Node.js Express API that handles the PUT request. So far, the examples I have seen on the internet all update a single record per PUT request. Instead, I want to pass an array of objects into the PUT request from the Angular HTTP service, and the Node.js API should be able to read that collection. Until now I have been passing a single object as part of the request and reading its properties via "req.body.propertyname". Can it read the whole array which I want to pass?
Let's say this is my code on the Angular side to update a single book through a PUT request:
updateBook(updatedBook: Book): Observable<any> {
  return this.http.put(`/api/books/${updatedBook.bookID}`, updatedBook, {
    headers: new HttpHeaders({
      'Content-Type': 'application/json'
    })
  });
}
On the Node.js side, it is able to read the book object passed from the client (Angular) side like below:
.put(function(req, res) {
  var data = getBookData();
  var matchingBooks = data.filter(function(item) {
    return item.bookID == req.params.id;
  });
  if (matchingBooks.length === 0) {
    res.sendStatus(404);
  } else {
    var bookToUpdate = matchingBooks[0];
    bookToUpdate.title = req.body.title;
    bookToUpdate.author = req.body.author;
    bookToUpdate.publicationYear = req.body.publicationYear;
    saveBookData(data);
    res.sendStatus(204);
  }
});
My question is whether I can pass the collection of books at once so that all of them get updated with a single request:
updateBook(updatedBooks: Book[]): Observable<any> {
  return this.http.put('/api/books', updatedBooks, {
    headers: new HttpHeaders({
      'Content-Type': 'application/json'
    })
  });
}
If yes, how could Node.js read this array passed from the client? Will req.body contain the passed array?
Yes. Since the data travels in the body of the PUT request, you can send the records to be updated as an array of objects. If you are using MongoDB as your database, you can update the data using the array update operators.
My suggestion:
The thing is, there is not much to be done on the front end. If you need to update only a single record, pass that single record as an object inside the array. If you want to update multiple records, pass every record as an object inside the array to the backend.
On the backend side, you can receive the data via req.body.books (the name of the property you pass from the frontend; if you send a bare array, it will be req.body itself). If you are using MongoDB, you can refer to the link on how to store data in an array.
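As a sketch of the backend half, the per-record merge can be kept as a pure function, which also makes it easy to test outside of Express. The helper name applyUpdates is hypothetical; the field names mirror the book objects from the question:

```javascript
// Merge an array of updated book objects into the stored list,
// matching on bookID. applyUpdates is a hypothetical helper name;
// the fields mirror the book objects from the question.
function applyUpdates(books, updates) {
  const byId = new Map(updates.map(u => [u.bookID, u]));
  return books.map(book => {
    const patch = byId.get(book.bookID);
    return patch
      ? { ...book,
          title: patch.title,
          author: patch.author,
          publicationYear: patch.publicationYear }
      : book;
  });
}

// With express.json() middleware in place, req.body is the parsed array,
// so a handler could do: saveBookData(applyUpdates(getBookData(), req.body));
const books = [
  { bookID: 1, title: 'Old', author: 'A', publicationYear: 2000 },
  { bookID: 2, title: 'Keep', author: 'B', publicationYear: 2010 }
];
console.log(applyUpdates(books, [
  { bookID: 1, title: 'New', author: 'A2', publicationYear: 2021 }
]));
```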
I'm also new to Node.js, but I tried updating multiple values through Postman by hitting '/api/books'. The code below worked for me and updated all the values at once in my database with a single PUT call, using MongoDB and Node.js:
app.put('/api/books', (req, res) => {
  var updatedData = '';
  var responseSent = false;
  for (let i = 0; i < req.body.length; i++) {
    // fetch the document by id, then modify and save it
    DataBaseName.findById(req.body[i]._id)
      .then(val => {
        val.url = req.body[i].title;
        val.position = req.body[i].author;
        val.auto_scroll = req.body[i].publicationYear;
        updatedData = val + updatedData;
        val.save((err, updatedObject) => {
          console.log('inside save.....', updatedData);
          if (err) {
            return res.status(500).send(err);
          }
          // only respond once, since the saves finish asynchronously
          if (!responseSent) {
            responseSent = true;
            return res.status(201).send(updatedData);
          }
        });
      });
  }
});
I'm trying to implement a Password reset. So I'm taking the phone number of the user, getting the document from the database using the phone number to find it, and I'm taking the new password and trying to update the corresponding document using a PUT request in my Cloudant database.
app.post('/pass_rst', function(req, response) {
  var log = '';
  // log is just for me to see what's happening
  var phone = req.body.phone;
  log += phone + '\n';
  db.find({selector: {'phone': phone}}, function(err, result) {
    if (err) {
      throw err;
    }
    if (result.docs.length == 0) {
      response.send('User doesnt exist');
    } else {
      var existing_data = result.docs[0];
      log += JSON.stringify(existing_data) + '\n';
      var upd_pswd = req.body.new_password;
      log += upd_pswd + '\n';
      var new_data = existing_data;
      new_data.password = upd_pswd;
      log += JSON.stringify(new_data) + '\n';
      var id = result.docs[0]._id;
      log += id + '\n';
      // make PUT request to db
      var options = {
        host: dbCredentials.host,
        port: dbCredentials.port,
        path: '/' + dbCredentials.dbName + '/' + id,
        //url: dbCredentials.url + '/' + dbCredentials.dbName + '/' + id,
        method: 'PUT',
        json: new_data,
        headers: {
          'Content-Type': 'application/json',
          'accept': '*/*'
        }
      };
      log += JSON.stringify(options) + '\n';
      var httpreq = http.request(options);
      //log += JSON.stringify(httpreq);
      httpreq.on('error', function(e) {
        response.send('Error' + e.message);
      });
      response.send(log + '\n\n\nUpdated');
    }
  });
});
dbCredentials is defined above as follows:
dbCredentials.host = vcapServices.cloudantNoSQLDB[0].credentials.host;
dbCredentials.port = vcapServices.cloudantNoSQLDB[0].credentials.port;
dbCredentials.user = vcapServices.cloudantNoSQLDB[0].credentials.username;
dbCredentials.password = vcapServices.cloudantNoSQLDB[0].credentials.password;
dbCredentials.url = vcapServices.cloudantNoSQLDB[0].credentials.url;
I've tried tinkering around with it, but in the best-case scenario I don't get an error and I see "Updated", yet nothing actually changes in the database. Sometimes I get an error saying: 502 Bad Gateway: Registered endpoint failed to handle the request.
If you see what's going wrong, please let me know. Thank you.
This is the documentation on how to update documents in Cloudant:
UPDATE
Updating a document
PUT /$DATABASE/$DOCUMENT_ID HTTP/1.1

{
  "_id": "apple",
  "_rev": "1-2902191555",
  "item": "Malus domestica",
  "prices": {
    "Fresh Mart": 1.59,
    "Price Max": 5.99,
    "Apples Express": 0.79,
    "Gentlefop's Shackmart": 0.49
  }
}
To update (or create) a document, make a PUT request with the updated
JSON content and the latest _rev value (not needed for creating new
documents) to https://$USERNAME.cloudant.com/$DATABASE/$DOCUMENT_ID.
If you fail to provide the latest _rev, Cloudant responds with a 409
error. This error prevents you overwriting data changed by other
processes. If the write quorum cannot be met, a 202 response is
returned.
Example response:

{
  "ok": true,
  "id": "apple",
  "rev": "2-9176459034"
}
The response contains the ID and the new revision of the document or
an error message in case the update failed.
I am using Bluemix, Node.js, and Cloudant. The best way that worked for me to do the update is to use the nano package for db interaction from Node.js.
You can refer to the post here:
The summary is: by making use of the nano API, you can easily update the record. You need to make sure to use the _id and the right _rev number while you use nano. This in turn uses the PUT method underneath.
Updating and Deleting documents using NodeJS on Cloudant DB
When you are including nano, make sure to update package.json to add the nano dependency. Let me know if you have further questions on the update/delete.
When using the cloudant node.js module there is no separate update function.
You need to use the db.insert function also for the update with the right doc revision, so you need to read the latest revision before the insert.
https://github.com/apache/couchdb-nano#document-functions
"and also used to update an existing document, by including the _rev token in the document being saved:"
// read existing document from db
db.get(key, function(error, existing) {
  if (!error) {
    // use revision of existing doc for new doc to update
    obj._rev = existing._rev;
    // call db insert
    db.insert(obj, key, cb);
  }
});
Pretty straightforward and without any sort of configuration:
In the project directory I entered the command $ meteor mongo to access the mongo shell.
From there, I switched to the meteor db using use meteor, then entered some basic data to test:
j = { name: "mongo" }
k = { x: 3 }
db.testData.insert(j)
db.testData.insert(k)
I checked and got results by entering: db.testData.find()
Here's my meteor code provided that mongodb access is only required on the client:
if (Meteor.isClient) {
  Template.hello.greeting = function () {
    return "Welcome to test.";
  };
  Template.hello.events({
    'click input' : function () {
      // template data, if any, is available in 'this'
      if (typeof console !== 'undefined')
        console.log("You pressed the button");
    }
  });
  Documents = new Meteor.Collection('testData');
  var document = Documents.find();
  console.log(document);
  var documentCbResults = Documents.find(function(err, items) {
    console.log(err);
    console.log(items);
  });
}
Upon checking in the browser, the logs say undefined. I was unsuccessful in retrieving data from mongodb and showing it in the client console.
What am I missing?
For this answer I'm going to assume this is a newly created project with autopublish still on.
As Christian pointed out, you need to define Documents on both the client and the server. You can easily accomplish this by just putting the collection definition at the top of the file or in another file which isn't in either of the server or client directories.
An example which prints the first two test documents could look like this:
Documents = new Meteor.Collection('testData');

if (Meteor.isClient) {
  Template.hello.greeting = function () {
    return "Welcome to apui.";
  };
  Template.hello.events({
    'click input' : function () {
      var documents = Documents.find().fetch();
      console.log(documents[0]);
      console.log(documents[1]);
    }
  });
}
Note the following:
The find function returns a cursor. This is often all you want when writing template code. However, in this case we need direct access to the documents to print them so I used fetch on the cursor. See the documentation for more details.
When you first start the client, the server will read the contents of the defined collections and sync all documents (if you have autopublish on) to the client's local minimongo database. I placed the find inside of the click event to hide that sync time. In your code, the find would have executed the instant the client started and the data probably would not have arrived in time.
Your method of inserting initial items into the database works (you don't need the use meteor by the way), however mongo will default to using an ObjectId instead of a string as the _id. There are subtle ways that this can be annoying in a meteor project, so my recommendation is to let meteor insert your data if at all possible. Here is some code that will ensure the testData collection has some documents:
if (Meteor.isServer) {
  Meteor.startup(function() {
    if (Documents.find().count() === 0) {
      console.log('inserting test data');
      Documents.insert({name: "mongo"});
      Documents.insert({x: 3});
    }
  });
}
Note this will only execute if the collection has no documents in it. If you ever want to clear out the collection you can do so via the mongo console. Alternatively you can drop the whole database with:
$ meteor reset
It's not enough to only define collections on the client side. Your mongo db lives on the server and your client needs to get its data from somewhere. It doesn't get it directly from mongodb (I think), but gets it via syncing with the collections on the server.
Just define the Documents collection in the joint scope of client and server. You may also need to wait for the subscription to Documents to complete before you can expect content. So safer is:
Meteor.subscribe('testData', function() {
  var document = Documents.find();
  console.log(document);
});
I'm trying to serve out images that I have stored in a Mongo document. I'm using express, express-resource and mongoose.
The data, which is a JPG, is stored in a Buffer field in my schema. Seems like it's getting there correctly as I can read the data using the cli.
Then I run a find, grab the field and attempt sending it. See code:
res.contentType('jpg');
res.send(img);
I don't think it's a storage issue because I'm performing the same action here:
var img = fs.readFileSync(
__dirname + '/../../img/small.jpg'
);
res.contentType('jpg');
res.send(img);
In the browser, the image shows up as a broken icon.
I'm wondering if it's an issue with express-resource because I have the format set to json, however I am indeed overriding the content type before sending the data.
scratches head
I managed to solve this myself. Seems like I was using the right method to send the data from express, but wasn't storing it properly (tricky!).
For future reference to anyone handling image downloads and managing them in Buffers, here is some sample code using the request package:
request(
  {
    uri: uri,
    encoding: 'binary'
  },
  function (err, response, body) {
    if (!err && response.statusCode == 200) {
      // body is a binary-encoded string; re-encode it as base64
      var imgData = Buffer.from(body, 'binary').toString('base64');
      callback(null, Buffer.from(imgData, 'base64'));
    }
  }
);
Within Mongo you need to set up a document property with type Buffer to store it successfully. It seems this issue was due to how I was saving it into Mongo.
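The 'binary' encoding matters here: with encoding: 'binary', request decodes the response body as a latin1 ("binary") string, and that encoding round-trips arbitrary bytes. A small sketch of why the conversion in the answer preserves the image data:

```javascript
// Demonstrates that the binary-string -> Buffer round trip used above
// preserves raw bytes (here, the first bytes of a JPG header).
const original = Buffer.from([0xff, 0xd8, 0xff, 0xe0]);

// what `request` hands you when encoding: 'binary' is set
const asBinaryString = original.toString('binary');

// the conversion used in the answer
const roundTripped = Buffer.from(asBinaryString, 'binary');

console.log(original.equals(roundTripped)); // true
```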
Hopefully that saves someone time in the future. =)
I'm currently trying to make a register form using mongoDB and nodeJS - I've created new database and collection - I want to store: username, password, email and insert_time in my database.
I've added unique indexes to username/email and checked if it works - and I can not add a duplicated entry using mongo's console or rockmongo (php mongodb manager) - so it works fine.
However, when the piece of code that is supposed to register a new account executes an insert with data that is already in the database, it returns an object containing all the data that was supposed to be added, along with a new, unique id. The point is, it should return an error saying that entries cannot be duplicated and the insert failed; instead it returns the data back and gives it a new id. The data that already resides in the database remains untouched, and even its ID stays the same; it is not overwritten with the new one returned by the script's insert.
So, the question is... what am I doing wrong? Or maybe everything is fine and database's insert should return data even if it's failed?...
I even tried defining the indexes before executing the inserts.
I tried inserting the data using mongoDB's default functions and mongoJS functions - the result is the same in both cases.
The code I'm trying to execute (for mongoJS):
var dbconn = require("mongojs").connect('127.0.0.1:27017/db', ['users']);
var register = function(everyone, params, callback)
{
// TODO: validation
dbconn.users.ensureIndex({username:1},{unique:true});
dbconn.users.ensureIndex({email:1},{unique:true});
dbconn.users.save(
{
username: params.username,
password: params.password,
email: params.email,
insert_time: Date.now()
},
function(error, saved)
{
if(error || !saved)
{
callback(false);
}
else
{
console.log(error);
console.log(saved);
callback(true);
}
});
}
For both cases - inserting new data and inserting duplicated data that doesn't modify database in any way - ERROR is null and SAVED is just a copy of data that is supposed to be inserted. Is there any way to check if insert was made or not - or do I have to check whether the data already exists in database or not manually before/after making an insert?
Mongo works that way. You should tell it that you want errors back by using the safe option when you issue the save command (by default it uses the "fire and forget" method).
https://docs.mongodb.com/manual/reference/command/getLastError/
This looks to be basically the same problem as "MongoDB unique index does not work", but with the JavaScript API rather than Java. That is, saving without either specifying the "safe" flag or explicitly checking the last error value is unsafe: the ID is generated client side and the command dispatched, but it might still fail (e.g. due to a unique index violation). You need to explicitly get the last error status.
http://www.mongodb.org/display/DOCS/dbshell+Reference#dbshellReference-ErrorChecking suggests db.getLastError() using the command shell, and I assume the node API is identical where they can possibly make it so.
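Once errors are reported back (safe mode / getLastError), a unique-index violation surfaces as MongoDB error code 11000, which the callback can check for explicitly. A small sketch of that check:

```javascript
// MongoDB reports a unique-index violation with error code 11000.
// Sketch of a callback-side check; the error object shape matches
// what the driver passes to the save/insert callback in safe mode.
function isDuplicateKeyError(err) {
  return !!err && err.code === 11000;
}

console.log(isDuplicateKeyError({ code: 11000, errmsg: 'E11000 duplicate key error' })); // true
console.log(isDuplicateKeyError(null)); // false
```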
Same problem solved here. I had just forgotten that the inserts are asynchronous, and calling process.exit() anywhere in Node stops the pending inserts.
var lineReader = require('readline').createInterface({
input: require('fs').createReadStream('parse.txt')
});
lineReader.on('line', function (line) {
  console.log(line);
  db.insert(line, {w: 1}, function (e, d) { if (e) throw e; });
});
// note: inserts may still be in flight when 'close' fires,
// so don't call process.exit() here or pending writes are lost
lineReader.on('close', function() {
  // ----->>>>>> process.exit(); <<<<<----------
});
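A sketch of how to wait for the in-flight inserts before exiting: count the pending callbacks and only finish once the input is closed and the counter reaches zero. The insert function below simulates an async db.insert so the pattern runs without a database:

```javascript
// Count in-flight inserts; finish only when the input is closed AND
// every insert has called back. insert() stands in for
// db.insert(line, {w:1}, cb) so the pattern runs standalone.
let pending = 0;
let closed = false;

function maybeDone() {
  if (closed && pending === 0) {
    console.log('all inserts finished');
    // now it is safe to call process.exit() or close the db connection
  }
}

function insert(line, cb) {
  setImmediate(() => cb(null, line)); // simulated async insert
}

['line 1', 'line 2', 'line 3'].forEach(line => { // stands in for 'line' events
  pending++;
  insert(line, err => {
    if (err) throw err;
    pending--;
    maybeDone();
  });
});

closed = true; // corresponds to the 'close' event
maybeDone();
```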