I have gone through all the blogs available for the Ektorp library and also the source code of the library. I found the get(Class<T> c, String id) method, but it returns only the single document matching the given id. I want to read all the changed documents present in the database. How can I achieve this? Thanks in advance, any help would be appreciated.
You are probably looking for the _all_docs endpoint.
ViewQuery q = new ViewQuery().allDocs().includeDocs(true);
List<Sofa> bulkLoaded = db.queryView(q, Sofa.class);
You can find more detailed information in the API documentation.
Hello all, I have found the answer to my question above. Here are the steps for getting all the documents, or rather the result of the _changes API.
Integrate the Ektorp library into your project.
Use the following code to get all the documents:
HttpClient httpClient = new StdHttpClient.Builder()
.url("http://localhost:5984/")
.build();
CouchDbInstance dbInstance = new StdCouchDbInstance(httpClient);
CouchDbConnector db = new StdCouchDbConnector("my_database", dbInstance);
ChangesCommand.Builder builder = new ChangesCommand.Builder();
ChangesCommand changesCommand = builder.build();
List<DocumentChange> documentChangeList = db.changes(changesCommand);
for (int i = 0; i < documentChangeList.size(); i++) {
    System.out.println(documentChangeList.get(i).getId());
}
I'm trying to update a custom entity field during registration in an SCA app.
The published NetSuite docs indicate I should be able to call:
var webStore = session.getSiteSettings(['displayname', 'id']);
customer = session.getCustomer();
customer.updateProfile({
internalid: internalid,
customfields: {
custentity_registered_site: webStore.id
}
});
But this throws the ever-helpful UNEXPECTED_ERROR.
Has anyone had this working for custom fields? I am doing this just after registration so that may be the issue though I can get a valid customer internalid. Any luck with alternate syntax of some sort?
Eventually I was able to get NS support to give me their internal stack trace.
The issue has to do with some internals of the Rhino JavaScript engine that NS uses.
Basically, the value returned from the NS API was a string-like object, NS was not following the 2014 recommendation on its use, and the call was failing in the underlying Java code. So the fix was simple but frustrating:
var webStore = session.getSiteSettings(['displayname', 'id']);
customer = session.getCustomer();
customer.updateProfile({
internalid: internalid,
customfields: {
custentity_registered_site: webStore.id.toString()
}
});
Hope this helps someone.
I recently realized that DocumentDB supports standalone update operations via ReplaceDocumentAsync.
I've replaced the Upsert operation below with the Replace operation.
var result = _client
.UpsertDocumentAsync(_collectionUri, docObject)
.Result;
So this is now:
var result = _client
.ReplaceDocumentAsync(_collectionUri, docObject)
.Result;
However, now I get the exception:
Microsoft.Azure.Documents.BadRequestException : ResourceType Document is unexpected.
ActivityId: b1b2fd71-3029-4d0d-bd5d-87d8d0a2fc95
No idea why; upsert and replace are in the same vein, and the object is the same one that worked for upsert, so I would expect it to work without problems.
All help appreciated.
Thanks
Update: I have tried to implement this using the SelfLink approach, and it works for Replace, but the selflink does not work with Upsert. The behavior is quite confusing. I don't like that I have to build a self link in code using string concatenation.
I'm afraid that building the selflink with string concatenation is your only option here because ReplaceDocument(...) requires a link to the document. You show a link to the collection in your example. It won't suck the id out and find the document as you might wish.
The NPM module, documentdb-utils, has library functions for building these links but it's just using string concatenation. I have seen an equivalent library for .NET but I can't remember where. Maybe it was in an Azure example or even in the SDK now.
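For what it's worth, here is a minimal sketch of what that concatenated link looks like. It is a name-based link (rather than the rid-based _self link), which ReplaceDocumentAsync also accepts; databaseId, collectionId, and docObject are placeholders:

// Name-based document link built by string concatenation.
string documentLink = "dbs/" + databaseId + "/colls/" + collectionId + "/docs/" + docObject.Id;
var result = _client.ReplaceDocumentAsync(documentLink, docObject).Result;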
You can build a document link for a replace using the UriFactory helper class:
var result = _client
.ReplaceDocumentAsync(UriFactory.CreateDocumentUri(databaseId, collectionId, docObject.Id), docObject)
.Result;
Unfortunately it's not very intuitive, as Larry has already pointed out, but a replace expects a document to already be there, while an upsert is what it says on the tin. Two different use-cases, I would say.
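For illustration, a minimal sketch of the two calls side by side (databaseId, collectionId, and docObject are placeholders; docObject.Id is assumed to hold the document id):

// Upsert targets the collection: it creates the document if it is missing, otherwise replaces it.
var upserted = _client
    .UpsertDocumentAsync(UriFactory.CreateDocumentCollectionUri(databaseId, collectionId), docObject)
    .Result;

// Replace targets an existing document, so it needs the document link, not the collection link.
var replaced = _client
    .ReplaceDocumentAsync(UriFactory.CreateDocumentUri(databaseId, collectionId, docObject.Id), docObject)
    .Result;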
In order to update a document, you need to provide the Collection Uri. If you provide the Document Uri it returns the following:
ResourceType Document is unexpected.
Maybe the _collectionUri is actually a Document Uri; if so, the assignment should look like this:
_collectionUri = UriFactory.CreateDocumentCollectionUri(DatabaseName, CollectionName);
I am new to Cloudant but have found it useful for a first stage of IoT data. But I need to subscribe to changes based on an id field that is separate from the _id and is unique to the sensor that is sending the data. The examples that I’ve seen so far haven’t helped with this problem. What I’m doing now is sending a separate json doc for each post, so it should return new docs with this sensor id. The json docs sometimes come in by the second but it can be hours as well.
I'm using C# in a .NET web app. The code below creates a call to the Cloudant database and returns the data that I want, based on an index that was created for the field SensorID:
json =
{
  "selector": {
    "SensorID": "h7365cf3-17bc-4422-b436-f7bcf12b2e2a"
  },
  "fields": [
    "Data"
  ]
}
url = my Cloudant database URL + "/_find".
This returns, for every doc whose SensorID field matches the value in the JSON query, just the JSON object nested in the Data field.
using (WebClient client = new WebClient())
{
byte[] postBytes = System.Text.Encoding.UTF8.GetBytes(json.ToString());
client.UseDefaultCredentials = true;
client.Credentials = new NetworkCredential(username, password);
client.Headers[HttpRequestHeader.ContentType] = "application/json";
var response = client.UploadData(url, "POST", postBytes);
JObject iJson = JObject.Parse(client.Encoding.GetString(response));
return parseIncoming(iJson);
}
When the call is a GET to my Cloudant URL + "/_db_updates", it returns information regarding changes to the whole database. This can be set up as a continuous feed.
I was hoping that this meant that I could subscribe to changes in documents to get new data as it comes in, like Redis Pub/Sub. I'm starting to think that this might not be the case, but if anybody can show me how to do it I would be grateful.
As @adasilva70 said, you need to use the _changes feed.
You can filter changes with an appropriate filter function (so that only changes regarding the documents you're interested in show up).
You can get all updates since a given sequence point (everything since the last data you got) and/or you can use long polling or continuous mode for instant notifications.
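For illustration, here is a rough, untested sketch of a single poll of the _changes feed from C#, reusing the WebClient approach from the question. It relies on the _selector filter that Cloudant (and CouchDB 2.x) expose on the changes feed; dbUrl (account URL plus database name), username, password, and lastSeq are placeholders:

string lastSeq = "0"; // or the last sequence id you have already processed

using (WebClient client = new WebClient())
{
    client.Credentials = new NetworkCredential(username, password);
    client.Headers[HttpRequestHeader.ContentType] = "application/json";

    // Ask the changes feed to report only documents matching the selector, and include the doc bodies.
    string filter = "{ \"selector\": { \"SensorID\": \"h7365cf3-17bc-4422-b436-f7bcf12b2e2a\" } }";
    string changesUrl = dbUrl + "/_changes?filter=_selector&include_docs=true&since=" + lastSeq;

    byte[] response = client.UploadData(changesUrl, "POST", Encoding.UTF8.GetBytes(filter));
    JObject changes = JObject.Parse(Encoding.UTF8.GetString(response));

    lastSeq = (string)changes["last_seq"]; // remember where we got to for the next poll
    foreach (var row in changes["results"])
        Console.WriteLine(row["doc"]?["Data"]); // the Data field of each changed document
}

For instant notifications you would add feed=longpoll (or feed=continuous) to the query string and read the response as a stream instead of a single document.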
Is there any solution for creating records in other databases from CRM 2011 records? When a record such as "cost" is created in CRM 2011, we want a corresponding record to be created in our Oracle DB. Can this be done through a plugin, or should a service be created for this?
Could you please provide references or solutions for this.
Any help would be greatly appreciated.
We had a similar request from a customer a while ago. They claimed that CRM's database wasn't to be trusted and wanted to securely store a copy of the records created in - guess what - SQL Server too. (Yes, we do understand the irony. They didn't.)
The way we resolved it was to create a plugin. However, bear in mind that simply reacting to the Create message won't really do. You need to set up a listener for three of the CRUD operations (retrieval doesn't affect the external database, so it's rather the C_UD operations, then).
Here's the skeleton of the main Execute method.
public void Execute(IServiceProvider serviceProvider)
{
Context = GetContextFromProvider(serviceProvider);
Service = GetServiceFromProvider(serviceProvider);
switch (Context.MessageName)
{
case "Create": ExecuteCreate(); break;
case "Update": ExecuteUpdate(); break;
case "Delete": ExecuteDelete(); break;
}
}
After this dispatcher, you can implement the actual calls to the other database. There are three gotchas I'd like to give you a heads-up on (a small sketch follows the list).
Remember to provide a suitable value to the outer DB when CRM doesn't offer you one.
Register the plugin as asynchronous since you'll be talking to an external resource.
Consider the problem of entity references: whether to store the referenced records recursively as well.
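To illustrate the first and third points, here is a rough, hypothetical sketch of what the body of ExecuteCreate might look like; the attribute names new_amount and new_regionid are made up for the example, and the actual write to the external database is left out:

private void ExecuteCreate()
{
    var target = (Entity)Context.InputParameters["Target"];

    // Gotcha 1: CRM only sends the attributes that were actually set, so supply a
    // suitable value for the outer DB when one is missing.
    decimal amount = target.Contains("new_amount")
        ? target.GetAttributeValue<Money>("new_amount").Value
        : 0m;

    // Gotcha 3: lookups arrive as EntityReferences; decide whether to store just the
    // GUID/name or to copy the referenced record into the outer DB as well.
    var region = target.GetAttributeValue<EntityReference>("new_regionid");
    string regionName = region != null ? region.Name : "(none)";

    // ...write target.Id, amount and regionName to the external database here...
}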
Walk-through for plugin construction
Link to CRM SDK if you haven't got that
Information on registering the plugin
And besides that, I've got a walk-through (including code and structure) on the subject in my blog. The URL to it, you'll have to figure out yourself - I'm not going to self-promote but it's got to do with my name and WP. Google is your friend. :)
You could use a plugin to create a record in another system, although you would need to think about syncing and ensure you don't get duplicates, but it certainly can be done.
Tutorial on plugins can be found here.
You need to write a plugin that runs on Create and uses the information on the created Cost entity to create a record in your Oracle DB.
As an example:
public void Execute(IServiceProvider serviceProvider)
{
var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
//get the created entity from CRM
var theCreatedEntity = context.InputParameters["Target"] as Entity;
//build up a stored procedure call
using (OracleConnection objConn = new OracleConnection("connection string"))
{
objConn.Open();
var cmd = new OracleCommand();
cmd.Connection = objConn;
cmd.CommandText = "stored procedure name";
cmd.CommandType = CommandType.StoredProcedure;
cmd.Parameters.Add("param1", OracleType.Number).Value = theCreatedEntity.GetAttributeValue<int>("Attribute1");
cmd.Parameters.Add("param2", OracleType.Number).Value = theCreatedEntity.GetAttributeValue<int>("Attribute2");
//etc
cmd.ExecuteNonQuery();
}
}
That should give you enough to get going.
I would like to be able to create project issues automatically. The aim is to create a new issue based on a received email.
I looked at ProjectWSSInfoDataSet, which is supposed to have a reference to the issue list (according to "PSI Methods and DataSets for Project Workspaces" at http://msdn.microsoft.com/en-us/library/aa495198(office.12).aspx). Indeed, the ProjectWSSInfoDataSet XML schema contains a PROJECT_ISSUES_URL field, but if it is just the URL then it is not very useful to me.
Has anyone done something similar? (Or possibly with project risks or deliverables.)
I think you have to do it with the SharePoint web services: find the list in the specified web and update it.
I recommend the SharePoint 2010 Client Object Model for this:
//Use SP2010 Client Object Model to update the list
ClientContext SPContext = new ClientContext(wssUrl);
//Get list by name
string listname = "issues";
var query = SPContext.LoadQuery(SPContext.Web.Lists.Where(l => l.Title == listname));
SPContext.ExecuteQuery();
List myIssueList = query.FirstOrDefault();
//Add an item and set its Title field (AddItem needs an Update() before ExecuteQuery takes effect)
ListItemCreationInformation nItemInfo = new ListItemCreationInformation();
ListItem nItem = myIssueList.AddItem(nItemInfo);
nItem["Title"] = "Blubb..";
nItem.Update();
SPContext.ExecuteQuery();
If you want to get the workspace URL via the project Id, you can do this with the WssInterop web service of Project Server:
//Use WssInterop Webservice to get the Workspace URL
WssInteropSoapClient wssinteropSvc = new WssInteropSoapClient();
Guid prjGuid = new Guid("30937680-39FA-4685-A087-90C73376B2BE");
ProjectWSSInfoDataSet wssData = wssinteropSvc.ReadWssData(prjGuid);
string wssUrl = wssData.ProjWssInfo[0].PROJECT_WORKSPACE_URL;
I don't know if the code will compile, but it should work like this.
Regards, Florian