Retrieving all Documents from couchdb using Node.js - node.js

I am writing a simple test app to experiment with the functionality of node.js and CouchDB. So far I am loving it, but I ran into a snag. I have looked far and wide but can't seem to find an answer. My test server (a simple address book) does 2 things:
If the user goes to localhost:8000/{id} then my app returns the name and address of the user with that id.
If the user goes to localhost:8000/ then my app needs to return a list of names that are hyperlinks and take them to the page localhost:8000/{id}.
I was able to get the first requirement working. I cannot seem to find how to retrieve a list of all names from my CouchDB; that is what I need help with. Here is my code:
var http = require('http');
var cradle = require('cradle');
var conn = new(cradle.Connection)();
var db = conn.database('users');

function getUserByID(id) {
    var rv = "";
    db.get(id, function(err, doc) {
        rv = doc.name;
        rv += " lives at " + doc.Address;
    });
    return rv;
}

function GetAllUsers() {
    var rv = "";
    return rv;
}

var server = http.createServer(function(req, res) {
    res.writeHead(200, {'Content-Type': 'text/plain'});
    var rv = "";
    var id = req.url.substr(1);
    if (id != "")
        rv = getUserByID(id);
    else
        rv = GetAllUsers();
    res.end(rv);
});

server.listen(8000);
console.log("server is running");
As you can see, I need to fill in the GetAllUsers() function. Any help would be appreciated. Thanks in advance.

I would expect you to be doing something like (using nano, which is a library I authored):
var db = require('nano')('http://localhost:5984/my_db')
  , per_page = 10
  , params = {include_docs: true, limit: per_page, descending: true}
  ;

db.list(params, function(error, body, headers) {
    console.log(body);
});
I'm not quite sure what you are trying to accomplish with http over there, but feel free to head to my blog if you are looking for some more examples. I just wrote a blog post for people getting started with node and couch.
As said above, there will come a time when you will need to create your own view. Check out the CouchDB API Wiki, then scan through the book, check what design documents are, and then if you like you can go and check the test code I have for view generation and querying.
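For the original question's GetAllUsers(), a minimal sketch wiring db.list into the HTTP server might look like the following. This is an assumption about how you could put the pieces together, not a drop-in fix: the database name users and the name field are taken from the question, and the response has to be written inside the callback because the listing is asynchronous.
var http = require('http');
var db = require('nano')('http://localhost:5984/users');

var server = http.createServer(function (req, res) {
    // fetch every document; include_docs returns the full document in each row
    db.list({ include_docs: true }, function (error, body) {
        if (error) {
            res.writeHead(500, {'Content-Type': 'text/plain'});
            return res.end('could not read the users database');
        }
        // build one hyperlink per user pointing at /{id}
        var links = body.rows.map(function (row) {
            return '<a href="/' + row.id + '">' + row.doc.name + '</a>';
        });
        res.writeHead(200, {'Content-Type': 'text/html'});
        res.end(links.join('<br/>'));
    });
});

server.listen(8000);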

You can create a CouchDB view which will list the users. Here are several resources on CouchDB views which you should read in order to get a bigger picture on this topic:
Introduction to CouchDB Views
Finding Your Data with Views
View Cookbook for SQL Jockeys
HTTP View API
So let's say you have documents structured like this:
{
    "_id": generated by CouchDB,
    "_rev": generated by CouchDB,
    "type": "user",
    "name": "Johny Bravo",
    "isHyperlink": true
}
Then you can create a CouchDB view (the map part) which would look like this:
// view map function definition
function(doc) {
    // first check if the doc has type and isHyperlink fields
    if (doc.type && doc.isHyperlink) {
        // now check if the type is user and isHyperlink is true (this could also be included in the statement above)
        if ((doc.type === "user") && (doc.isHyperlink === true)) {
            // if the statements above are true, emit the name as the key and the document as the value (you can change what is emitted to whatever you want, this is just an example)
            emit(doc.name, doc);
        }
    }
}
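To be queryable, the map function has to be stored in a design document. A minimal sketch of saving one with cradle might look like this (the design document name user and the view name all are example names, not something from the original question):
db.save('_design/user', {
    views: {
        all: {
            map: 'function (doc) { if (doc.type === "user" && doc.isHyperlink === true) { emit(doc.name, doc); } }'
        }
    }
});
With that in place, the view location used below would be 'user/all'.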
When a view is created you can query it from your node.js application:
// query a view
db.view('location of your view', function (err, res) {
    // loop through each row returned by the view
    res.forEach(function (row) {
        // print its name and isHyperlink flag to the console
        console.log(row.name + " - " + row.isHyperlink);
    });
});
This is just an example. First I would recommend going through the resources above and learning the basics of CouchDB views and their capabilities.

Related

Delete Documents from CosmosDB based on condition through Query Explorer

What's the query or some other quick way to delete all the documents matching the where condition in a collection?
I want something like DELETE * FROM c WHERE c.DocumentType = 'EULA' but, apparently, it doesn't work.
Note: I'm not looking for any C# implementation for this.
This is a bit old, but I just had the same requirement and found a concrete example of what @Gaurav Mantri wrote about.
The stored procedure script is here:
https://social.msdn.microsoft.com/Forums/azure/en-US/ec9aa862-0516-47af-badd-dad8a4789dd8/delete-multiple-docdb-documents-within-the-azure-portal?forum=AzureDocumentDB
Go to the Azure portal, grab the script from above and make a new stored procedure in the database->collection you need to delete from.
Then right at the bottom of the stored procedure pane, underneath the script textarea is a place to put in the parameter. In my case I just want to delete all so I used:
SELECT c._self FROM c
I guess yours would be:
SELECT c._self FROM c WHERE c.DocumentType = 'EULA'
Then hit 'Save and Execute'. Voila, some documents get deleted. After I got it working in the Azure Portal I switched over to Azure DocumentDB Studio and got a better view of what was happening, i.e. I could see I was throttled to deleting 18 at a time (returned in the results). For some reason I couldn't see this in the Azure Portal.
Anyway, pretty handy even if limited to a certain number of deletes per execution. Executing the stored procedure is also throttled, so you can't just mash the keyboard. I think I would just delete and recreate the collection unless I had a manageable number of documents to delete (thinking <500).
Props to Mimi Gentz @Microsoft for sharing the script in the link above.
HTH
I want something like DELETE * FROM c WHERE c.DocumentType = 'EULA'
but, apparently, it doesn't work.
Deleting documents this way is not supported. You would need to first select the documents using a SELECT query and then delete them separately. If you want, you can write the code for fetching & deleting in a stored procedure and then execute that stored procedure.
I wrote a script to list all the documents and delete all the documents; it can be modified to delete only selected documents as well.
var docdb = require("documentdb");
var async = require("async");

var config = {
    host: "https://xxxx.documents.azure.com:443/",
    auth: {
        masterKey: "xxxx"
    }
};

var client = new docdb.DocumentClient(config.host, config.auth);
var messagesLink = docdb.UriFactory.createDocumentCollectionUri("xxxx", "xxxx");

var listAll = function(callback) {
    var spec = {
        query: "SELECT * FROM c",
        parameters: []
    };
    client.queryDocuments(messagesLink, spec).toArray((err, results) => {
        callback(err, results);
    });
};

var deleteAll = function() {
    listAll((err, results) => {
        if (err) {
            console.log(err);
        } else {
            async.forEach(results, (message, next) => {
                client.deleteDocument(message._self, err => {
                    if (err) {
                        console.log(err);
                        next(err);
                    } else {
                        next();
                    }
                });
            });
        }
    });
};

var task = process.argv[2];

switch (task) {
    case "listAll":
        listAll((err, results) => {
            if (err) {
                console.error(err);
            } else {
                console.log(results);
            }
        });
        break;
    case "deleteAll":
        deleteAll();
        break;
    default:
        console.log("Commands:");
        console.log("listAll deleteAll");
        break;
}
And if you want to do it in C#/Dotnet Core, this project may help: https://github.com/lokijota/CosmosDbDeleteDocumentsByQuery. It's a simple Visual Studio project where you specify a SELECT query, and all the matches will be a) backed up to file; b) deleted, based on a set of flags.
Create a stored procedure in the collection and execute it, passing a SELECT query with the delete condition. The major reason to use this stored procedure is the continuation token, which reduces RUs to a huge extent and therefore costs less.
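A minimal sketch of such a stored procedure might look like the following. This is a simplified version of the widely shared bulk-delete pattern rather than the exact script linked above; it expects the query parameter to return documents with their _self link (for example SELECT c._self FROM c WHERE c.DocumentType = 'EULA'):
function bulkDeleteSproc(query) {
    var collection = getContext().getCollection();
    var collectionLink = collection.getSelfLink();
    var response = getContext().getResponse();
    var deleted = 0;

    queryAndDelete();

    // query one page of matching documents, then delete them one by one
    function queryAndDelete() {
        var accepted = collection.queryDocuments(collectionLink, query, {}, function (err, documents) {
            if (err) throw err;
            if (documents.length > 0) {
                deleteDocument(documents, 0);
            } else {
                // nothing (more) to delete; report how many documents were removed
                response.setBody({ deleted: deleted });
            }
        });
        // if the request was not accepted we are near the RU limit; stop here
        if (!accepted) response.setBody({ deleted: deleted });
    }

    function deleteDocument(documents, index) {
        if (index >= documents.length) {
            // this page is done; look for more matches
            return queryAndDelete();
        }
        var accepted = collection.deleteDocument(documents[index]._self, {}, function (err) {
            if (err) throw err;
            deleted++;
            deleteDocument(documents, index + 1);
        });
        if (!accepted) response.setBody({ deleted: deleted });
    }
}
Because it stops and reports the count whenever a request is not accepted, you may have to execute it repeatedly until it returns zero.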
Here is a Python script which can be used to delete data from a partitioned Cosmos collection. It deletes documents id by id, based on the result-set data.
Identify the data that needs to be deleted before the step below.
res_list = "select id from id_del"
res_id = [{id:x["id"]}
for x in sqlContext.sql(res_list).rdd.collect()]
config = {
"Endpoint" : "Use EndPoint"
"Masterkey" : "UseKey",
"WritingBatchSize" : "5000",
'DOCUMENTDB_DATABASE': 'Database',
'DOCUMENTDB_COLLECTION': 'collection-core'
};
for row in res_id:
# Initialize the Python DocumentDB client
client = document_client.DocumentClient(config['Endpoint'], {'masterKey': config['Masterkey']})
# use a SQL based query to get documents
## Looping thru partition to delete
query = { 'query': "SELECT c.id FROM c where c.id = "+ "'" +row[id]+"'" }
print(query)
options = {}
options['enableCrossPartitionQuery'] = True
options['maxItemCount'] = 1000
result_iterable = client.QueryDocuments('dbs/Database/colls/collection-core', query, options)
results = list(result_iterable)
print('DOCS TO BE DELETED : ' + str(len(results)))
if len(results) > 0 :
for i in range(0,len(results)):
# print(results[i]['id'])
docID = results[i]['id']
print("docID :" + docID)
options = {}
options['enableCrossPartitionQuery'] = True
options['maxItemCount'] = 1000
options['partitionKey'] = docID
client.DeleteDocument('dbs/Database/colls/collection-core/docs/'+docID,options=options)
print ('deleted Partition:' + docID)

UI Router query parameters and multiple states

I've been trying to get my head around query parameters with UI Router and think I have finally figured out why I'm having such a hard time with it. I'm building a small search app that has 2 states:
$stateProvider
    .state('home', {
        url: '/'
    })
    .state('search', {
        url: '/search'
    });
The home state has its own template (view) and so does the search state. They share the same controller. Basically all home state does is provide a search box to enter your search query or choose from autocomplete.
Once you have submitted your search query it goes to the search state via
$state.go('search');
I've tried many ways to get the user's search term(s) into the URL on the search page, using UI Router's query parameter syntax (url: '/search?q') combined with $stateParams in the controller, set as
vm.searchTerms = $stateParams.q || '';
I had switched it to $state.params.q but that didn't fix it.
I have successfully been able to get the query parameters in the URL; however, when I do, it breaks the search functionality. The autocomplete and query parameters work and display, but the search function stops working.
However, I think I finally understand why it's not working the way I'd like it to. I believe it has to do with the fact that I'm using 2 states, not one parent state with a nested child state, and that the templates are not nested, so $scope doesn't inherit. I'm getting close to this working... it transitions from the home state to the search state displaying query parameters in the search state's URL; it's simply that search breaks, while autocomplete and query parameters are working.
What I'm trying to achieve is to have the user enter search terms from the home state and then have results display in the search state along with query parameters in the url. Is there anything I need to do with home state or search state that I'm not doing?
OR
Is there anything in my search() in my controller that could be the problem?
//search()
vm.search = function() {
    //$state.go('search', {q: vm.searchTerms});
    $state.go('search');
    console.log(vm.searchTerms);
    console.log('success - search');
    vm.currentPage = 1;
    vm.results.documents = [];
    vm.isSearching = true;
    return coreService.search(vm.searchTerms, vm.currentPage)
        .then(function(es_return) {
            console.log('success - return');
            var totalItems = es_return.hits.total;
            var totalTime = es_return.took;
            var numPages = Math.ceil(es_return.hits.total / vm.itemsPerPage);
            vm.results.pagination = [];
            for (var i = 0; i <= 10; i++) {
                console.log('success - for');
                vm.results.totalItems = totalItems;
                vm.results.queryTime = totalTime;
                vm.results.pagination = coreService.formatResults(es_return.hits.hits);
                vm.results.documents = vm.results.pagination.slice(vm.currentPage, vm.itemsPerPage);
                console.log('success - documents');
            }
            vm.noResults = true;
        }),
        function(error) {
            console.log('ERROR: ', error.message);
            vm.isSearching = false;
        },
        vm.captureQuery();
    console.log('success - captureQuery');
};
You can add a parameter to your URL in two ways. Use a colon:
'url': '/search/:query'
or curly braces:
'url': '/search/{query}'
Then you can use the go method of $state with the parameter to transition:
$state.go('search', {'query': 'foobar'});
You can access the parameter's value from your controller by using the params member of your $state object:
console.log($state.params.query);
or directly from the $stateParams object:
console.log($stateParams.query);
Reference: https://github.com/angular-ui/ui-router/wiki/URL-Routing#url-parameters
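For the original question, where the search term should travel as a query parameter rather than a path segment, a minimal sketch might look like the following. The state, controller property, and coreService names are taken from the question; how the pieces are wired together is an assumption:
// state definition: q is declared as a query parameter
$stateProvider
    .state('search', {
        url: '/search?q',
        templateUrl: 'search.html',
        controller: 'SearchController',
        controllerAs: 'vm'
    });

// in the shared controller: pass the term along when transitioning...
vm.search = function () {
    $state.go('search', { q: vm.searchTerms });
};

// ...and read it back when the search state activates, before running the search
vm.searchTerms = $stateParams.q || '';
if (vm.searchTerms) {
    coreService.search(vm.searchTerms, vm.currentPage).then(function (es_return) {
        // populate vm.results here
    });
}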

Netsuite Userevent Script

I have a user event script to change a field on the Contract record from the PO record. The script is running fine, but whenever I edit a contract record and try to submit it, it throws the error "Another user has updated this record since you began editing it. Please close the record and open it again to make your changes".
May I know the reason behind this?
/*---------------------------------------------------------------------------------------------------------------
Description : Whenever the PO vendor is changed (due to split vendor), that should replace the same in the Contract record automatically.
Script type : User Event Script
Script id   : customscript452
Version     : 1.0
Applied to  : Contract
----------------------------------------------------------------------------------------------------------------*/
function srchfield()
{
    var stRecordid = nlapiGetRecordId(); // returns the contract id
    if (stRecordid == undefined || stRecordid == null || stRecordid == ' ')
    {
    }
    else
    {
        var stRecordtype = nlapiGetRecordType(); // returns the contract record type = jobs
        var stRecord = nlapiLoadRecord(nlapiGetRecordType(), stRecordid);
        nlapiLogExecution('debug', 'Load Object', stRecord);
        var stContractID = stRecord.getFieldValue('entityid'); // value of the contract id field whose fieldid is = entityid
        nlapiLogExecution('debug', 'stContractID', stContractID);
        var stCompanyName = stRecord.getFieldValue('companyname'); // value of the company name field whose fieldid is = companyname
        nlapiLogExecution('debug', 'stCompanyName', stCompanyName);
        var stConcatenate = stContractID + " : " + stCompanyName; // concatenate the two fields to get the value which needs to be found on the PO

        var arrFilters = new Array(); // filters for the purchase order search
        arrFilters.push(new nlobjSearchFilter('type', null, 'anyof', ['PurchOrd']));
        arrFilters.push(new nlobjSearchFilter('mainline', null, 'is', 'T')); // exclude line-level results
        arrFilters.push(new nlobjSearchFilter('custbodycontract', null, 'is', stRecordid)); // filter on this contract

        var arrColumns = new Array();
        arrColumns.push(new nlobjSearchColumn('entity')); // search column

        var arrSearchresults = nlapiSearchRecord('purchaseorder', null, arrFilters, arrColumns); // run the purchase order search
        if (arrSearchresults == undefined || arrSearchresults == null || arrSearchresults == ' ')
        {
        }
        else
        {
            var length = arrSearchresults.length;
        }
        if (length == undefined || length == null || length == ' ')
        {
        }
        else
        {
            for (var i = 0; arrSearchresults != null && i < arrSearchresults.length; i++)
            {
                var objResult = arrSearchresults[i];
                var stRecId = objResult.getId();
                var stRecType = objResult.getRecordType();
                var stCntrctName = objResult.getValue('entity'); // value of the PO vendor field = entity
            }
        }
        //var record = nlapiLoadRecord(nlapiGetRecordType(), stRecordid, stCntrctName);
        if (stCntrctName == 'custentityranking_vendor_name')
        {
        }
        else
        {
            var stChangeName = stRecord.setFieldValue('custentityranking_vendor_name', stCntrctName); // set the value in the main vendor field = custentityranking_vendor_name
            nlapiSubmitRecord(stRecord, null, null); // submit the record
        }
    }
}
The User Event script executes as the Contract record is being saved to the database. At the same time, you are loading a second copy of the record from the database and trying to submit the copy as well. This is causing the error you're seeing.
You fix this by just using nlapiSetFieldValue to set the appropriate field on the Contract.
I might also recommend getting more familiar with JavaScript by going through the JavaScript Guide over at MDN. In particular, take a look at the Boolean description so that you know how JavaScript evaluates Boolean expressions. This will help you greatly reduce the amount of code you've written here, as many of your conditionals are unnecessary.
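A minimal sketch of that approach, assuming the script is deployed as a beforeSubmit user event on the Contract record and keeping a trimmed-down version of the original search, might look like:
function beforeSubmit(type) {
    var stRecordid = nlapiGetRecordId();

    // find the related purchase order and its vendor, as in the original search
    var arrFilters = [
        new nlobjSearchFilter('mainline', null, 'is', 'T'),
        new nlobjSearchFilter('custbodycontract', null, 'is', stRecordid)
    ];
    var arrColumns = [new nlobjSearchColumn('entity')];
    var arrSearchresults = nlapiSearchRecord('purchaseorder', null, arrFilters, arrColumns);

    if (arrSearchresults && arrSearchresults.length > 0) {
        var stCntrctName = arrSearchresults[0].getValue('entity');
        // set the field on the record that is about to be saved; no load or submit needed
        nlapiSetFieldValue('custentityranking_vendor_name', stCntrctName);
    }
}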
Which user event type do you have? What happens depends on the type of user event and the API you are using. Looking at your code, you are trying to load a contract record that has already been updated in the database. So you might consider the points below to address your issue. Hope it helps.
If it is a before submit, you don't need to load the record where the script is deployed.
Just use nlapiGet* and nlapiSet* to get and set values. You also don't need to use nlapiSubmitRecord to reflect the change. With before submit, the script executes before the record is saved to the database, so your changes will still be reflected.
Then if it is an after submit, it will be executed after the record has been saved to the database, so you might use one of the following APIs depending on your needs. This is actually the best practice.
nlapiGetNewRecord - only use this if the script only needs to retrieve info from the header and sublists, and nothing needs to be set.
nlapiLookupField - use this if the script only needs to get value(s) from the header and nothing from the lines.
nlapiSubmitField - the script doesn't need to load and submit the record if the changes are only on the header. Just use this API.
nlapiLoadRecord and nlapiSubmitRecord - use the former if the script will make changes at the line level, then use the latter API to commit them to the database.
For a user event script, the code you showed is not good from a performance standpoint.
Here is a sample you can merge in:
var stRecordid = nlapiGetRecordId(); // returns the contract id
// Every record has an internal id associated with it. No need to add a condition explicitly to check if it's null.
var stRecordtype = nlapiGetRecordType();

var fields = ['entityid', 'companyname'];
var columns = nlapiLookupField(stRecordtype, stRecordid, fields);
var stContractID = columns.entityid;
var stCompanyName = columns.companyname;
nlapiLogExecution('debug', 'stContractID/stCompanyName', stContractID + '/' + stCompanyName);
var stConcatenate = stContractID + " : " + stCompanyName; // concatenate the two fields to get the value which needs to be found on the PO

// your search code goes here
// (you can improve that code as well by using nlapiLookupField)

nlapiSubmitField(stRecordtype, stRecordid, 'custentityranking_vendor_name', 'name to be updated');

Azure Mobile server update script w/ complex field type

I've got a complex data type "AzureTemplate" containing a list of children "AzureField". I've implemented my read and insert on the server side according to this article. Works great.
Needing an update as well, I copy/pasted the insert into the update so it does the same thing, but using update instead. So my update looks like this:
function update(item, user, request) {
    // remove complex child object, make a copy first
    var fields = item.fields;
    if (fields) {
        delete item.fields;
    }
    request.execute({
        success: function () {
            var templateId = item.id; // "foreign key"
            var fieldsTable = tables.getTable('AzureFields');
            if (fields) {
                // update the child fields
                var updateNextField = function (index) {
                    if (index >= fields.length) {
                        // done updating fields, respond to client
                        request.respond();
                    } else {
                        var field = fields[index];
                        field.templateId = templateId;
                        // *** THE ID LOGGED HERE LOOKS FINE ***
                        console.log("updating field w/ id ", field.id);
                        fieldsTable.update(field, {
                            success: function () {
                                updateNextField(index + 1);
                            }
                        });
                    }
                };
                // kick off the loop saving each field
                updateNextField(0);
            } else {
                // no fields, no need to do anything else
                request.respond();
            }
        }
    });
}
The log that prints the ID of the child "field" shows a valid field id (I save them on the client side when reading them). But I get an error that says:
Error in script '/table/AzureTemplate.update.js'. Error: Invalid id value specified. AzureTemplate/update Tue Jan 27 2015, 10:11:31 AM
I put a console.log() at the top of the AzureField.update, but that never shows up, so it's not getting in there. Also, when I update a single child "Field" directly from the client it works fine. So the AzureField.update is working. Any ideas?
The problem was this line:
var fieldsTable = tables.getTable('AzureFields');
... my table name is AzureField, not AzureFields. With the table name corrected, the above code works; hopefully it helps someone.
I have misnamed a table before and got a meaningful error about the table not existing. I'm not sure why the error in this case is totally unrelated.

How to tell if a record exists in Mongo collection (C#)

Given a collection of items { url: 'http://blah' }. How can I tell if a record exists where the url is "http://stackoverflow.com"?
P.S. I am communicating with the C# driver.
For any of the previous suggestions to be efficient you should be sure that there is an index on the url element. Otherwise a full collection scan will be required.
If you only expect the answer to be 0 or 1, count is probably the most efficient approach. If you think the count will be very large and all you really care about is whether there is one or more, FindOne is the most efficient approach.
It probably doesn't matter that FindOne returns the whole document unless the document is actually rather large. In that case you could tell the server to only return one field (the _id seems the most likely candidate):
var query = Query.EQ("url", "http://stackoverflow.com");
var fields = Fields.Include("_id");
var res = collection.Find(query).SetFields(fields).SetLimit(1).FirstOrDefault();
if (res == null) {
    // no match found
}
You simply need to check the count of items returned by the query:
int count = collection.FindAs<Item>(Query.EQ("url", "http://stackoverflow.com")).Count();
if (count > 0)
{
    // do some stuff
}
IMongoQuery query = Query.EQ("url", "http://stackoverflow.com");
var res = collection.FindOne(query);
if (res == null) // doesn't exist
{
}
The existence of a key in MongoDB can be checked by using Exists with the second parameter set to true or false:
var builder = Builders<BsonDocument>.Filter; // assuming a collection of BsonDocument
var filter = builder.Exists("style", false);
var RetrievedData = collection.Find(filter).ToList();
Reference link
