Order Groups by number of items - tabulator

I use the Tabulator JS library and group my table. How can I order the groups by their number of rows? E.g. the group containing the most items should appear at the top.

I do this for sorting dates as the user adds new date-based events: I sort the array after a new event has been added and then use the replaceData function to update the table. It seems to work pretty well performance-wise.
You would have to work out your own routine logic for sorting based on the number of group members; a sketch follows after the example below.
The replaceData function lets you silently replace all data in the table without updating scroll position, sort or filtering, and without triggering the ajax loading popup. This is great if you have a table you want to periodically update with new/updated information without alerting the user to a change.
mayData.events = sortEventsByDate(mayData.events);
eventTable.replaceData(mayData.events);

function sortEventsByDate(events) {
    var sorted = events.sort(function (a, b) {
        return new Date(a.startDate).getTime() - new Date(b.startDate).getTime();
    });
    return sorted;
}
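For ordering groups by size, the same pattern works: by default Tabulator displays groups in the order their rows first appear in the data, so you can pre-sort the array so rows from the largest group come first and then call replaceData. A rough sketch, reusing eventTable and mayData.events from above and assuming a hypothetical "category" field as the groupBy field:

function sortByGroupSize(rows, groupField) {
    // Count how many rows fall into each group.
    var counts = {};
    rows.forEach(function (row) {
        var key = row[groupField];
        counts[key] = (counts[key] || 0) + 1;
    });

    // Largest groups first; tie-break on the group key so equal-sized
    // groups stay contiguous.
    return rows.slice().sort(function (a, b) {
        var diff = counts[b[groupField]] - counts[a[groupField]];
        return diff !== 0 ? diff : String(a[groupField]).localeCompare(String(b[groupField]));
    });
}

eventTable.replaceData(sortByGroupSize(mayData.events, "category"));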

Using Power Apps to add a new value to a multiple-value choice column in SharePoint without overwriting existing values

I have a SharePoint list that contains a choice column with the 'multiple selection' option turned on. Each item in this list contains data related to preferences for a given user, and the choice column will store the IDs for each of the user's 'favorited' reports.
I would like to write a Patch formula in Power Apps that writes a new value to this column, but retains the existing values. Here is an extract from my current formula, triggered when a user selects the 'Add To Favorites' button, where 'Favorites' is the choice column that already contains values:
Patch(
    'Platform User Preferences',
    LookUp(
        'Platform User Preferences',
        UserEmail = User().Email
    ),
    {Favorites: [ThisItem.ID]}
)
As it stands, this formula overwrites the existing values in the choice column with the new single value instead of adding it alongside them.
One approach I have attempted (based on reading similar use cases online) is to create a collection from the Favorites column, add the new value to that collection, then patch the entire collection back to SP. However, I have had similar problems doing this as I do not fully understand the model of a collection that is based on a multi-value choice column. For example, the following also appears to completely wipe the data in the collection, rather than add to it:
ClearCollect(favslist,Filter('Platform User Preferences',UserEmail = User().Email).Favorites);
Collect(favslist, {Value: ThisItem.ID});
Any help with solving this problem would be most appreciated!
You'll need to build another collection that contains each of the existing favorite selections individually. Right now your 'favslist' collection contains one item that holds all of the existing favorite selections, and then you're adding your new item next to it, so the collection isn't shaped the way the choice column expects.
Try updating your existing code before you patch by using a ForAll to collect the existing items:
Clear(favslist);
ClearCollect(existingfavslist, Filter('Platform User Preferences', UserEmail = User().Email).Favorites);
ForAll(existingfavslist, Collect(favslist, {Value: ThisRecord.Value}));
Collect(favslist, {Value: ThisItem.ID});
Then just patch your collection 'favslist' back to the list, for example:
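A minimal sketch of that final Patch, assuming the same list, column, and collection names as above:

Patch(
    'Platform User Preferences',
    LookUp('Platform User Preferences', UserEmail = User().Email),
    {Favorites: favslist}
)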

What is the best way to delete all rows within an array key?

I have an entity Contact which has an array entity Specialties. Just a standard one-to-many relationship: one contact, many specialties. (The Specialties entity has a few columns, which might not be relevant.) I have a screen on the PCF to add a list of Specialties to a contact. I want to add a custom Remove All button on the Contact screen which will delete all values in the array for the specific contact. A contact can have a large number of specialties (~10,000).
What is best way to delete all the elements in the array?
Currently, I have the below function on the action property of the button and it is clocking and timing out.
for (spec in contact.Specialties) {
    contact.removeFromSpecialties(spec) // OOTB remove-from-array method
}
Any other better way to remove ~10000 records from the array entity?
From your question above, I assume that the "Remove All" button on the PCF screen should delete all the specialty records associated with that particular contact, not just some of them (like one or a few records).
I also assume the entity type of the "Speciality" entity is "Editable" or "Retireable".
With those assumptions in mind, you can create a new function and call it from the "Remove All" button's action property.
Below is code that will hard-delete all of those records from the database table, just as if you ran a DELETE query against it:
function removeAllSpecialitiesForContact(contactFromPCF : Contact) {
    var specialityQuery = Query.make(entity.Speciality).compare(Speciality#Contact, Equals, contactFromPCF.ID)
    var deleteBuilder = new com.guidewire.pl.system.database.query.DeleteBuilder.DeleteBuilder(specialityQuery)
    deleteBuilder.execute()
}
This approach should not time out the way the PCF action does now, since it deletes in a single database operation instead of removing array elements one by one.
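The button's action property would then just call the function, e.g. (assuming "contact" is the Contact variable available on your PCF page):

removeAllSpecialitiesForContact(contact)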

SharePoint view limit

I have created an application and display data with JSOM. My problem is the view limit. I have set the row limit, and it works fine if I get all items where ID is greater than zero, which brings back all the items in the list. I also added a new (indexed) column, and the query fails when the second filter gives me more than 5,000 items. I really have no idea what the difference is. ID is automatically indexed; the second column was created by me. I can only guess the index was not created because I had already exceeded 5,000 items, but as I heard, in SharePoint Online this limit is higher.
The list item retrieval limit in SharePoint Online is 5,000 and cannot be changed. Technically it can be changed the same way you would increase the limit for an on-premises SharePoint instance, but only Microsoft has access to those settings in SharePoint Online, and they're not going to change it.
This UserVoice request has been pending for 6 years now, though it does have a "Working on it" status, so maybe in the next couple of years.
In the meantime, you will need to implement pagination like this:
ClientContext clientContext = new ClientContext(<YourSiteURL>);
List list = clientContext.Web.Lists.GetByTitle(<YourListTitle>);
ListItemCollectionPosition itemPosition = null;
do
{
    CamlQuery camlQuery = new CamlQuery();
    camlQuery.ListItemCollectionPosition = itemPosition;
    camlQuery.ViewXml = @"<View><ViewFields><FieldRef Name='Title'/></ViewFields><RowLimit>5000</RowLimit></View>";
    ListItemCollection listItems = list.GetItems(camlQuery);
    clientContext.Load(listItems);
    clientContext.ExecuteQuery();
    itemPosition = listItems.ListItemCollectionPosition;

    // Now process the items
    foreach (ListItem listItem in listItems)
    {
        // Do something
    }
} while (itemPosition != null);
If your app is unable to retrieve more than 5,000 items, you need to use pagination to get all the items in a large list in chunks of 5,000 until you have all of them. The above code snippet does just that.
If you're trying to modify a view directly on the SharePoint list, views won't allow more than 5,000 items to be retrieved either. It becomes doubly messy when you consider that it isn't the final result set that counts, but the total POSSIBLE number of rows, since the calculation is done on the SQL Server side. In your example you're probably trying to filter by the second column you created. Since the list has more than 5,000 items, the view fails because the column you created is not an indexed auto-number field like the ID field; to return your view, SQL Server would have to select ALL the rows in the list in order to sort and filter by your column. Consider this:
The internal database structure for SharePoint stores all rows from all lists in ONE table (AllUserData).
By default, SQL Server automatically escalates row locks to a table lock when the number of row locks on a given table exceeds 5,000. This is a performance consideration from the dinosaur age, when RAM was at a premium in servers.
In any large-list scenario, if SQL Server escalated the row locks to a table lock, it would essentially block every other list in SharePoint from being updated. For this reason, SharePoint blocks any query that would return more than 5,000 items from a list.
You could try adding a filter to your view based on the ID field, e.g. ID <= 5000 for a view called "0-5000", and then ID > 5000 && ID <= 10000 for another view called "5001-10000".
It's not the greatest solution, but it's a workable one. ;-)

Calculated SharePoint Column

I have a simple calculated column in my SharePoint 2010 list. It takes the list item ID and adds 100 to it.
When my users create items in the list, the calculated column does not get updated unless I go in, edit the column (doing nothing), and save it. In fact, it gives all items a value of 101 unless I manually edit the column.
Is this typical or is there a work around for this issue?
Thank you!
It is not possible to create a calculated column based on the ID value, because the ID of an item is only assigned after the item has been added to the list.
You should use a workflow instead.
The problem with using a workflow to do this (as per the accepted answer) is that a workflow can take an appreciable time to execute. The ID cannot be set until AFTER the new item is saved, and there is always the danger that simultaneous users will create ID clashes that you also have to handle. If the workflow takes several minutes to run (as on a busy system), you can also hit the problem of someone else editing the item before the workflow has finished, which may cause the workflow to fail and leave the item without any ID.
As an alternative, you might consider using JavaScript in the NewItem.aspx page to look up and increment a counter stored in a separate list; a rough sketch follows below. Note that you have to update the counter as you read it, so that other users don't accidentally get the same ID when creating entries at the same time. This means you must not mind the counter incrementing even if a user subsequently cancels the new item without saving.
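A rough JSOM sketch of that counter pattern. The "Counters" list, its single item with ID 1, the "NextId" number field, and the "myIdField" input are all made up for illustration; adapt them to your own setup:

var ctx = SP.ClientContext.get_current();
var counterList = ctx.get_web().get_lists().getByTitle('Counters');
var counterItem = counterList.getItemById(1);

ctx.load(counterItem);
ctx.executeQueryAsync(function () {
    var nextId = counterItem.get_item('NextId');

    // Increment the counter immediately, so concurrent users are
    // unlikely to be handed the same value.
    counterItem.set_item('NextId', nextId + 1);
    counterItem.update();

    ctx.executeQueryAsync(function () {
        // Use nextId to populate the ID field on the new item form.
        document.getElementById('myIdField').value = nextId;
    }, function (sender, args) {
        alert('Failed to update counter: ' + args.get_message());
    });
}, function (sender, args) {
    alert('Failed to read counter: ' + args.get_message());
});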
As you noticed when opening and saving an item, the calculated column is updated on every item change.
Would it work to have a workflow read the Title and write back (the same) Title? That edit should make the [ID] reference in the calculated column resolve, with no need for an extra LookupID column.

Filtering a repeating table linked to a secondary datasource

I have an InfoPath form based on a SharePoint list (worktracker).
In that form there is a repeating section which holds data from a secondary data source, which is the worktracker list again.
I would like to filter that table using the value in a field on the form; this lets me check for duplicate items on the list using a calculated checking reference.
I have done this by using conditional formatting to hide the non-matching items, but this is killing my form, as IE throws a tantrum because it takes too long.
Does anyone know another way to do this? I am stuck using IE8 - not my choice!
UPDATE:
Since posting the above, I have tried using a REST connection, which doesn't work as my list is too big. I have also tried using an XML connection to a filtered view, and that didn't work either!
Cheers,
In the form, select the value field. Create a rule that sets the secondary data source's query field of the same name to that value. Then query the secondary data source. It will only return the items where the value matches.
