I was wondering if there was a way to add an "Order By" clause when retrieving data from Acumatica through the Web Service API?
IN202500Content IN202500 = oScreen.IN202500GetSchema();
oScreen.IN202500Clear();
Command[] oCmd = new Command[] {IN202500.StockItemSummary.ServiceCommands.EveryInventoryID,
IN202500.StockItemSummary.InventoryID,
IN202500.StockItemSummary.Description,
IN202500.StockItemSummary.ItemStatus,
IN202500.GeneralSettingsItemDefaults.ItemClass,
IN202500.GeneralSettingsItemDefaults.LotSerialClass,
IN202500.PriceCostInfoPriceManagement.DefaultPrice,
};
Filter[] oFilter = new Filter[] {new Filter
{
Field = new Field {ObjectName = IN202500.StockItemSummary.InventoryID.ObjectName,
FieldName = "LastModifiedDateTime"},
Condition = FilterCondition.GreaterOrEqual,
Value = SyncDate
}
};
String[][] sReturn = oScreen.IN202500Export(oCmd, oFilter, iMaxRecords, true, false);
I would like to sort the results, for example by DefaultPrice, so that I can retrieve the top 200 most expensive items in my list (using iMaxRecords = 200 in this case).
I haven't seen any parameter that allows me to do the sorting yet.
I ran into this when I developed a round-robin assignment system, and the short answer is that using the Acumatica API you can't sort the results; you have to do it outside of the API (this info came from a friend closely tied to the Acumatica product).
I came up with two options:
Query your DB directly. There are always reasons not to do this, but it is much faster than pulling the result from the API: you bypass the BQL Acumatica uses and write a SQL statement that does EXACTLY what you want, producing a result that is easier to work with than the jagged array Acumatica sends.
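A minimal sketch of that direct-SQL option is below, assuming you can open a SqlConnection to the Acumatica database; the table and column names used here (InventoryItem, InventoryCD, Descr, BasePrice) are placeholders that vary by Acumatica version, so verify them against your actual schema.
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
// Hypothetical sketch only: the table/column names are assumptions, not the
// documented schema for every Acumatica version.
static string[][] GetTop200MostExpensiveItems(string connectionString)
{
    var rows = new List<string[]>();
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        @"SELECT TOP 200 InventoryCD, Descr, BasePrice
          FROM InventoryItem
          ORDER BY BasePrice DESC", conn))
    {
        conn.Open();
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                rows.Add(new[]
                {
                    Convert.ToString(reader["InventoryCD"]),
                    Convert.ToString(reader["Descr"]),
                    Convert.ToString(reader["BasePrice"])
                });
            }
        }
    }
    return rows.ToArray();
}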
You can use some LINQ to build a second string[][] that is sorted by price and then trim it to the top 200 (you would need all results from Acumatica first).
// This is rough but should get you there. DefaultPrice is the last field
// command above, so it is assumed to come back as the last column of each row.
string[][] MaxPriceList = sReturn
    .OrderByDescending(innerArray =>
    {
        decimal price;
        // Rows that are null, empty, or non-numeric sort to the end
        if (innerArray != null && innerArray.Length > 0 &&
            decimal.TryParse(innerArray[innerArray.Length - 1], out price))
        {
            return price;
        }
        return decimal.MinValue;
    })
    .Take(200) // keep only the 200 most expensive items
    .ToArray();
I am implementing Azure Search in my application to provide an autosuggestion feature like Google, Bing, and Amazon. I implemented it based on the sample GitHub code at the URL below. Everything works, but each term of a sentence takes more than 1.5 seconds to return results.
https://github.com/Azure-Samples/search-dotnet-getting-started/tree/master/DotNetHowToAutocomplete
Currently I am using two indexes for searching, created in the Basic tier. Below is the code:
public ActionResult Suggest(bool highlights, bool fuzzy, string term)
{
InitSearch();
// Call suggest API and return results
SuggestParameters sp = new SuggestParameters()
{
UseFuzzyMatching = fuzzy,
Top = 5,
Filter="name eq 'testid'",
OrderBy=new List<string>() { "Date desc"}
};
if (highlights)
{
sp.HighlightPreTag = "<b>";
sp.HighlightPostTag = "</b>";
}
DocumentSuggestResult suggestResult = _indexClient1.Documents.Suggest(term, "index1",sp);
if (suggestResult.Results.Count<5)
{
SuggestParameters sp2 = new SuggestParameters()
{
UseFuzzyMatching = fuzzy,
Top = 5- suggestResult.Results.Count,
Filter = "Product eq 'PAAS'",
OrderBy = new List<string>() { "Count desc" }
};
if (highlights)
{
sp2.HighlightPreTag = "<b>";
sp2.HighlightPostTag = "</b>";
}
DocumentSuggestResult suggestResult2= _indexClient2.Documents.Suggest(term, "index2", sp2);
suggestResult.Results = suggestResult.Results.Union(suggestResult2.Results).Distinct().ToList();
// final = suggestResult.Results.GroupBy(s => s.Text, StringComparer.CurrentCultureIgnoreCase).ToList();
}
// Convert the suggest query results to a list that can be displayed in the client.
List<string> suggestions = suggestResult.Results.Select(x => x.Text).Distinct().ToList();
return new JsonResult
{
JsonRequestBehavior = JsonRequestBehavior.AllowGet,
Data = suggestions
};
}
To test it: when I type any word, it takes too long to populate results, around 1.5 to 1.8 seconds, so it does not feel as responsive as a typical web app search box.
I am checking the timing using Chrome's developer tools (see the attached screenshot).
Please suggest.
I answered a similar question on another post: Why is Azure Search taking 1400 miliiseconds to return query results for simple query
The main thing is, you shouldn't be using the Chrome timer to measure the performance of Azure Search. Use the "elapsed-time" field of the HTTP response you receive (take an average over multiple calls), since it accurately tells you how much time was spent getting your results from Azure Search. The Chrome timer can be affected by your network/machine configuration. If that doesn't help, you can follow the other tips I suggested in the post linked above.
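For illustration, here is a rough sketch of calling the suggest REST endpoint directly and reading that header; the service name, index name, suggester name, API version, and key below are placeholders, not values taken from the question.
using System;
using System.Net.Http;
using System.Threading.Tasks;
// Sketch only: replace <your-service>, index1, sg and the api-key with your own values.
static async Task MeasureSuggestAsync(string term)
{
    using (var client = new HttpClient())
    {
        client.DefaultRequestHeaders.Add("api-key", "<your-query-key>");
        var url = "https://<your-service>.search.windows.net/indexes/index1/docs/suggest"
                + "?api-version=2019-05-06"
                + "&suggesterName=sg"
                + "&$top=5"
                + "&search=" + Uri.EscapeDataString(term);
        var response = await client.GetAsync(url);
        // "elapsed-time" reports the time spent inside the search service itself,
        // independent of network latency and browser rendering.
        if (response.Headers.TryGetValues("elapsed-time", out var values))
        {
            Console.WriteLine("elapsed-time: {0} ms", string.Join(",", values));
        }
    }
}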
I'm trying to search the existing Customers and return the CustomerID if it exists. This is the code I'm using which works:
var CustomerToFind = new Customer
{
MainContact = new Contact
{
Email = new StringSearch { Value = emailIn }
}
};
var sw = new Stopwatch();
sw.Start();
//see if any results
var result = (Customer)soapClient.Get(CustomerToFind);
sw.Stop();
Debug.WriteLine(sw.ElapsedMilliseconds);
However, I'm finding it extremely slow, to the point of being unusable. For example, on the DEMO dataset, on my i7-6700K @ 4GHz with 24GB RAM and an SSD, running SQL Server 2016 Developer Edition locally, a simple email search takes 3-4 seconds. On my production dataset with 10k Customer records, it takes over 60 seconds and times out.
Is this typical of contract-based SOAP? Screen-based SOAP seems much faster, almost instant. If I perform a SQL SELECT on the database tables in SQL Server Management Studio, I can also return the result instantly.
Is there a better, quick way to query whether a Customer with email address = "test@test.com" exists and return the CustomerID?
Try using GetList instead of Get. It's better suited for "search for something" scenarios.
When using GetList there are two more optimizations, depending on which endpoint you're using. In the Default/5.30.001 endpoint there's a second parameter to GetList, which you should set to false. In the Default/6.00.001 endpoint there's no second parameter, but there is an additional property on the entity itself, called ReturnBehavior. Either set it to OnlySpecified and then add *Return values for the fields you need, like this:
var CustomerToFind = new Customer
{
ReturnBehavior = ReturnBehavior.OnlySpecified,
CustomerID = new StringReturn(),
MainContact = new Contact
{
Email = new StringSearch { Value = emailIn }
}
};
or set it to OnlySystem and then use the ID of the returned entity to request the full entity.
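For illustration, a rough sketch of what the resulting GetList call could look like against a Default/6.00.001-style endpoint; the client and field types are assumed to match the generated contract-based service reference shown above, and System.Linq is needed for the query over the returned array.
// Sketch only: assumes the standard generated contract-based SOAP client.
var customerToFind = new Customer
{
    ReturnBehavior = ReturnBehavior.OnlySpecified,
    CustomerID = new StringReturn(),
    MainContact = new Contact
    {
        Email = new StringSearch { Value = emailIn }
    }
};
// GetList returns every matching entity; with ReturnBehavior narrowed down,
// only CustomerID comes back, which keeps the response small.
var matches = soapClient.GetList(customerToFind);
var customerId = matches.OfType<Customer>()
                        .Select(c => c.CustomerID?.Value)
                        .FirstOrDefault();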
I ran into a situation in production where
CloudContext.TableData.Where( A => A.PartitionKey == "MYKEY").ToList();
where TableData is
public DataServiceQuery<T> TableData { get { return CreateQuery<T>( _TableName ); } }
does not return the whole partition (I have less than 1000 records there).
In my case it returns 367 records while in VS2010 Server Explorer or in Azure Storage Explorer I get 414 records (condition is the same).
Did anyone experience the same problem?
Also, if I change the query and add RowKey to the condition, I get the required record with no problem.
This comes down to how the Table service works. The official documentation here lists other conditions that affect the number of records returned. If you want to retrieve the whole partition, you have to inspect the query result for a continuation token and use that token to execute the same query over and over again until all the results have been returned.
You can use an approach similar to the following:
private IEnumerable<MyEntityType> GetAllEntities()
{
    TableContinuationToken token = null;
    do
    {
        // Fetch the next page of up to 100 entities
        var result = this._tables.GetSegmentedEntities(100, token);
        foreach (var ufs in result.Results)
        {
            yield return new MyEntityType(ufs.RowKey, ufs.WhateverOtherPropertyINeed);
        }
        // Keep querying until the service stops returning a continuation token;
        // a segment can legitimately be empty while a token is still present.
        token = result.ContinuationToken;
    } while (token != null);
}
Where GetSegmentedEntities(100, token) is defined as:
public TableQuerySegment<MyEntityType> GetSegmentedEntities(int pageSize, TableContinuationToken token)
{
var partKey = "My_Desired_Partition_key_passed_via_Const_or_method_Param";
TableQuery<MyEntityType> query = new TableQuery<MyEntityType>()
.Where(TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, partKey));
query.TakeCount = pageSize;
return this.azureTableReference.ExecuteQuerySegmented<MyEntityType>(query, token);
}
You can use and modify this code for your case.
This is a known and documented behavior. The Table service API will either return 1000 entities or as many entities as possible within 5 seconds. If the query takes longer than 5 seconds to execute, it will return a continuation token.
With the addition of RowKey you are making the query more specific, and hence faster, so as a result you are getting all the entities.
See TimeOuts and Pagination on MSDN for details.
If you are getting partial result sets, one of the following factors is at play:
i) There are more than 1000 records matching the filter.
ii) The query took more than 5 seconds.
iii) The query crosses a partition boundary.
As you have fewer than 1000 records, the first factor won't be an issue, and as you are retrieving based on PartitionKey equality, the third one also won't cause any problem. You are facing this problem because of the second factor.
To handle this you need to work with the continuation token. You can refer to this link for more info.
I'm trying to retrieve a list of entities from CRM, but I'd like to get each one with its related entities. So far, I have the following code:
FilterExpression filterExpression = new FilterExpression();
ConditionExpression condition = new ConditionExpression(Constants.ModifiedOnAttribute, ConditionOperator.GreaterEqual, lastSync);
filterExpression.AddCondition(condition);
QueryExpression query = new QueryExpression()
{
EntityName = entityName,
ColumnSet = new ColumnSet(attributesMetadata.Select(att => att.Name).ToArray<string>()),
Criteria = filterExpression,
Distinct = false,
NoLock = true
};
RetrieveMultipleRequest multipleRequest = new RetrieveMultipleRequest();
multipleRequest.Query = query;
RetrieveMultipleResponse response = (RetrieveMultipleResponse)proxy.Execute(multipleRequest);
In the response variable I can see the EntityCollection, but inside it the related entities always come back empty.
I'd like to know whether it is possible to retrieve a set of entities of a given type, together with their related entities, using RetrieveMultipleRequest, rather than going one by one with RetrieveRequest.
One approach to retrieving related entities' data is adding LinkEntities to your query. The example below should give you an idea of how to do this:
LinkEntity linkEntity = new LinkEntity("email", "new_emails", "activityid", "new_relatedemail", JoinOperator.Inner);
linkEntity.Columns.AddColumn("versionnumber");
linkEntity.Columns.AddColumn("new_emailsid");
linkEntity.EntityAlias = "related";
query = new QueryExpression("email");
query.ColumnSet.AddColumn("activityid");
query.ColumnSet.AddColumn("versionnumber");
query.Criteria.AddCondition("modifiedon", ConditionOperator.NotNull);
query.LinkEntities.Add(linkEntity);
And then you can access attributes from related entities using EntityAlias you specified above:
foreach (Entity entity in entities.Entities)
{
if ((long)(entity["related.versionnumber"] as AliasedValue).Value > 0)
{
stop = false;
}
}
The RetrieveMultipleRequest is for returning multiple instances of a particular type of entity. I have spent a year using the CRM SDK from C# and I have found no way of populating those related entity collections in a single query. This basically leaves you with two options:
Use the AliasedValue as SergeyS recommends. When querying 1:Many relationships, be aware that you could be returning multiple results for the same parent entity. This is what I use most of the time.
Perform a second query for each relationship you want access to. You'll probably get better performance if you can use an IN statement in your second query, based on the results of the first, rather than performing a separate query for each result of the first.
Below is some pseudo code to show the difference.
var contacts = GetContacts();
// One request to get the cars for all of the contacts
var cars = GetCarsWhereContactIdIn(contacts.Select(c => c.ContactId));
foreach (var c in contacts)
{
    c.new_Cars.AddRange(cars.Where(car => car.new_ContactId == c.ContactId));
}

// Versus

var contacts = GetContacts();
foreach (var c in contacts)
{
    // One request for each contact
    c.new_Cars.AddRange(GetCarsForContact(c.ContactId));
}
I am having trouble with a MOSS FulltextSqlQuery when attempting to filter People results on the Skills Managed Property using the CONTAINS predicate. Let me demonstrate:
A query with no filters returns the expected result:
SELECT AccountName, Skills
from scope()
where freetext(defaultproperties,'+Bob')
And ("scope" = 'People')
Result
Total Rows: 1
ACCOUNTNAME: MYDOMAIN\Bob
SKILLS: Numchucks | ASP.Net | Application Architecture
But when I append a CONTAINS predicate, I no longer get the expected result:
SELECT AccountName, Skills
from scope()
where freetext(defaultproperties,'+Bob')
And ("scope" = 'People')
And (CONTAINS(Skills, 'Numchucks'))
Result
Total Rows: 0
I do realize I can accomplish this using the SOME ARRAY predicate, but I would like to know why this is not working with the CONTAINS predicate for the Skills property. I have been successful using the CONTAINS predicate with a custom crawled property that is indicated as 'Multi-valued'. The Skills property (though it seems to be multi-valued) is not indicated as such on the Crawled Properties page in the SSP admin site:
http:///ssp/admin/_layouts/schema.aspx?ConsoleView=crawledPropertiesView&category=People
Anyone have any ideas?
So with the help of Mark Cameron (Microsoft SharePoint Developer Support), I figured out that certain managed properties have to be enabled for full text search using the ManagedProperty object model API by setting the FullTextQueriable property to true. Below is the method that solved this issue for me. It could be included in a Console app or as a Farm or Web Application scoped Feature Receiver.
using System;
using System.Linq;
using Microsoft.Office.Server;
using Microsoft.Office.Server.Search.Administration;

// Log below refers to the application's own logging abstraction; substitute your own.
private void EnsureFullTextQueriableManagedProperties(ServerContext serverContext)
{
var schema = new Schema(SearchContext.GetContext(serverContext));
var managedProperties = new[] { "SKILLS", "INTERESTS" };
foreach (ManagedProperty managedProperty in schema.AllManagedProperties)
{
if (!managedProperties.Contains(managedProperty.Name.ToUpper()))
continue;
if (managedProperty.FullTextQueriable)
continue;
try
{
managedProperty.FullTextQueriable = true;
managedProperty.Update();
Log.Info(m => m("Successfully set managed property {0} to be FullTextQueriable", managedProperty.Name));
}
catch (Exception e)
{
Log.Error(m => m("Error updating managed property {0}", managedProperty.Name), e);
}
}
}
SELECT AccountName, Skills
from scope()
where freetext(defaultproperties,'+Bob')
And ("scope" = 'People')
And (CONTAINS(Skills, 'Numchucks*'))
Use the * wildcard at the end.
You also have a few more options to try:
The following list identifies additional query elements that are supported only with SQL search syntax using the FullTextSqlQuery class:
FREETEXT()
CONTAINS()
LIKE
Source