CRM 2011 Performing FetchXML group by on custom option set value

I want to perform a FetchXML request that sums a value while grouping on a field that is a custom option set, but I don't get the expected results.
All that is returned is the summed-up values, not the custom option set value each sum relates to, so I have no idea what the returned values represent.
This is the FetchXML request, which appears to be correct:
<fetch distinct='false' mapping='logical' aggregate='true'>
  <entity name='opportunity'>
    <attribute name='estimatedvalue' alias='opportunity_sum' aggregate='sum' />
    <attribute name='koo_opportunitytype' alias='koo_opportunitytype' groupby='true' />
  </entity>
</fetch>
Each record returned has only one attribute: the opportunity_sum value.
If I group by, say, customer id, the values are summed correctly and each sum comes back with a reference to the related customer, which is what I would expect.
Is it not possible to group by a custom option set value? Grouping works fine with standard system option sets such as status code.

I have verified that your FetchXML works just fine as long as the data is clean. If all of the koo_opportunitytype values for your opportunities are null, no attribute is returned for them. I'm assuming you're only getting one entity back? Likewise, if any estimated value in a group is null, the sum is not returned either. This means you'll probably want to add a filter to exclude null values from your sum.
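For example, a minimal sketch of the same fetch with the null values filtered out (the not-null condition operator is standard FetchXML):

```xml
<fetch distinct='false' mapping='logical' aggregate='true'>
  <entity name='opportunity'>
    <attribute name='estimatedvalue' alias='opportunity_sum' aggregate='sum' />
    <attribute name='koo_opportunitytype' alias='koo_opportunitytype' groupby='true' />
    <filter type='and'>
      <condition attribute='koo_opportunitytype' operator='not-null' />
      <condition attribute='estimatedvalue' operator='not-null' />
    </filter>
  </entity>
</fetch>
```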

Solr count mismatch between facet and filter query

I am running Solr 7.7.2 and I am trying to apply a facet on a particular field
"display-classification_en_string_mv" (type="string" indexed="true" stored="false" multiValued="true")
The problem is that when I try to apply facets on this field, with
facet=true&facet.field={!ex=fkdisplay-classification}display-classification_en_string_mv&facet.mincount=1&facet.limit=10&facet.sort=count,
the facet count I get for one value of this field, "maxi dress", is 100.
Now when I try to add a filterquery (fq) like this
fq={!tag=fkdisplay-classification}display-classification_en_string_mv:"Maxi Dress"
the actual count increases to 101.
One thing to note is I am using a collapse query to group documents having same value in a field of type="string" indexed="true" stored="true".
This count mismatch happens only when the collapse query is applied, and without the collapsing in place, the count remains same in both cases.
Please let me know if I am missing something or any error in implementation which might lead to this.
Apparently, the collapse query selects one document in each group as the leader and uses it for counting facets, and in one of the groups the selected leader didn't have the field being faceted on.
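For reference, the decoded tag/exclusion pair alongside a collapse filter looks like this (the collapse field name group_key_s is illustrative; the ex local param makes the facet count ignore the tagged fq but not the collapse fq, which is why the leader selection can shift the counts):

```
fq={!collapse field=group_key_s}
fq={!tag=fkdisplay-classification}display-classification_en_string_mv:"Maxi Dress"
facet=true
facet.field={!ex=fkdisplay-classification}display-classification_en_string_mv
facet.mincount=1
```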

Microsoft Graph API $filter=name eq 'foo' query not working on GET workbook/tables/{id}/columns. No error and no filtering

I'm looking at a table (Table1) inside an Excel book saved on my OneDrive for Business account. I then want to get the maximum value in the CREATEDDATE column from this table.
I want to avoid pulling down the whole table with the API, so I'm trying to filter the results of my query to only the CREATEDDATE column. However, the column results from the table are not being filtered to the one column and I'm not getting an error to help troubleshoot why. All I get is an HTTP 200 response and the full unfiltered table results.
Is it possible to filter the columns retrieved from the API by the column name? The documentation made me think so.
I've confirmed that /columns?$select=name works correctly and returns just the name field, so I know that it recognizes this as an entity. $filter and $orderby do nothing when referencing any of the entities from the response (name, id, index, values). I know that I can limit columns by position, but I'd rather explicitly reference the column by name in case the order changes.
I'm using this query:
/v1.0/me/drive/items/{ID}/workbook/tables/Table1/columns?$filter=name eq 'CREATEDDATE'
You don't need $filter here; just pull the column by name directly. The prototypes from the Get TableColumn documentation are:
GET /workbook/tables/{id|name}/columns/{id|name}
GET /workbook/worksheets/{id|name}/tables/{id|name}/columns/{id|name}
So in your case, you should be able to simply call:
/v1.0/me/drive/items/{ID}/workbook/tables/Table1/columns/CREATEDDATE
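As a quick sketch, the by-name endpoint can be assembled like this in Python (the build_column_url helper is illustrative, not part of any Graph SDK; {ID} stays a placeholder for the drive item id):

```python
GRAPH_ROOT = "https://graph.microsoft.com/v1.0"

def build_column_url(item_id: str, table: str, column: str) -> str:
    """Address one table column by name instead of filtering the whole collection."""
    return f"{GRAPH_ROOT}/me/drive/items/{item_id}/workbook/tables/{table}/columns/{column}"

url = build_column_url("{ID}", "Table1", "CREATEDDATE")
# Issue a GET against this URL with an "Authorization: Bearer <token>" header;
# the JSON response's "values" array then holds just the CREATEDDATE column.
```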

SSAS, dimension numeric value filtering

I am using the multidimensional model in SSAS with a seemingly simple requirement.
I have a Product dimension table with a Price attribute. Using an Excel pivot table, I want to filter on this Price attribute, for example "greater than $1000". However, the pivot table filter treats the attribute as a string, so I cannot perform any numerical comparison operations, only match exact formatted strings such as "$1,000.00".
My problem is similar to this thread, and I wonder if there is a solution/work around that I missed?
Best regards,
CT
As suggested in the thread that you link, you could create a measure for the price, and then filter that. The definition of this calculated measure would be something like
[Product].[Product].Properties("Price", TYPED)
assuming the dimension as well as the attribute are named "Product", and the attribute has the price defined as a property named "Price".
(You define a property in BIDS as a relationship from the Product attribute to the Price attribute.)
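A minimal MDX sketch of such a calculated measure, assuming the dimension, attribute, and property names above (the measure name is illustrative):

```
CREATE MEMBER CURRENTCUBE.[Measures].[Product Price] AS
  [Product].[Product].CurrentMember.Properties("Price", TYPED);
```

In the Excel pivot table you could then apply a value filter such as "greater than 1000" against this measure instead of the string-typed attribute.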

OpenJPA not taking full column name in query

I am using OpenJPA and doing the mapping using orm.xml. When a column name is longer than a certain number of characters, the SQL that OpenJPA generates from my JPQL truncates the column name to that length.
For example, with a mapping like this:
<entity name="OneOffSystemTemplateVariableValue" class="financing.tools.docgen.models.OneOffSystemTemplateVariableValue">
  <table name="modification_variable_values"/>
  <attribute-override name="id">
    <column name="modification_variable_values_id"/>
  </attribute-override>
  .....
  </attributes>
</entity>
When I run a simple query on OneOffSystemTemplateVariableValue, the generated SQL is:
SELECT t0.MODIFICATION_VARIABLE_VALUES_I, t0.create_timestamp, t0.create_id, t0.last_update_timestamp, t0.last_update_id, t0.modification_object_id, t0.variable_name, t0.variable_value FROM Administrator.modification_variable_values t0 WHERE t0.one_off_template_id = ?
Note that the MODIFICATION_VARIABLE_VALUES_ID column comes out truncated to MODIFICATION_VARIABLE_VALUES_I; if I shorten the column name to MOD_VARIABLE_VALUES_ID, the query works fine.
So the generated SQL is not using the full column name.
I have not set maxColumnNameLength anywhere in my application.
How can I configure the maximum column name length so that I don't run into this problem?
If you are using MySQL, you would set the following property in your persistence.xml file.
openjpa.jdbc.DBDictionary=mysql(maxColumnNameLength=256)
For other databases refer to the OpenJPA user manual. I'd also be interested in knowing which database you're using so we might be able to fix this max column issue.
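For instance, a minimal sketch of where that property goes in persistence.xml (the persistence-unit name is illustrative):

```xml
<persistence-unit name="docgen">
  <properties>
    <property name="openjpa.jdbc.DBDictionary"
              value="mysql(maxColumnNameLength=256)"/>
  </properties>
</persistence-unit>
```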

SharePoint - Auto-increment dates in new records?

I have a list that's going to be updated with relatively static data weekly, and I wanted to create a workflow to do this automatically. The only field I'm having trouble with is Start Date.
I want the new Start Date to be exactly one week after the previous week's (row's) Start Date, but I can't figure out how to capture this. I can't seem to find an easy way to get the value of the previous row.
Now, theoretically, I could just have the workflow run once a week on a given day and use [Today] as the value for the field; however, a requirement is that the list can be populated a few weeks in advance if needed.
Thanks in advance for any help you can provide!
You could solve this problem by creating a custom field type that queries its parent list for the most recent item and sets itself to the desired date. MSDN has a number of decent references on how to create custom field types.
I recently did something akin to this: I created a "Unique Number" field type that will ensure that no two rows contain the same numerical value in the same column.
Why not just query the list ordered by date descending? The first row returned then holds the previous week's date.
The CAML Query would look something like this:
<Query>
  <OrderBy>
    <FieldRef Name='Modified' Ascending='False' />
  </OrderBy>
</Query>
I use U2U CAML Query Builder for syntax help...
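Once the previous row's Start Date is in hand, the new value is just that date plus seven days. A minimal Python sketch of that step (the helper name is illustrative; SharePoint date fields come back in ISO 8601 form):

```python
from datetime import datetime, timedelta

def next_start_date(previous_iso: str) -> str:
    """Given the previous row's Start Date (ISO 8601), return the date one week later."""
    previous = datetime.fromisoformat(previous_iso)
    return (previous + timedelta(weeks=1)).isoformat()

# e.g. next_start_date("2011-05-02T00:00:00") -> "2011-05-09T00:00:00"
```

Because the increment is computed from the previous row rather than from [Today], this also works when the list is populated a few weeks in advance.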
