SharePoint CSOM - Loading HasUniqueRoleAssignments fails every time

I am quite confused about where I am going wrong. I have done this many times before, but I am not sure why it is NOT working this time. Here is the code:
dynamic fileOrFolder;
if (model.IsFolder)
fileOrFolder = _clientContext.Web.GetFolderByServerRelativeUrl(serverRelativeUrl);
else
fileOrFolder = _clientContext.Web.GetFileByServerRelativeUrl(serverRelativeUrl);
I have tried all of the following, but nothing worked:
_clientContext.Load(fileOrFolder, item => item.Include(file => file.ListItemAllFields));
dynamic blhasUniquePermission = fileOrFolder.ListItemAllFields.HasUniqueRoleAssignments;
OR
_clientContext.Load(fileOrFolder.ListItemAllFields.HasUniqueRoleAssignments);
OR
_clientContext.Load(fileOrFolder.ListItemAllFields,
items => items.Include(
item => item.Id,
item => item.DisplayName,
item => item.HasUniqueRoleAssignments));
OR
_clientContext.Load(fileOrFolder.ListItemAllFields, "Include(HasUniqueRoleAssignments)");
_clientContext.ExecuteQuery();
Every time, it immediately throws an error, either on the Load line itself or on ExecuteQuery. We definitely know the property exists on ListItemAllFields, so why is this happening?

It looks like you used the dynamic keyword to hack C# into letting you create a fileOrFolder variable that stores an instance of one of two unrelated types.
Not only is this unusual, but by doing so you've also crippled IntelliSense and the compiler.
Take this line:
_clientContext.Load(fileOrFolder, item => item.Include(file => file.ListItemAllFields));
It probably throws at runtime because there is no Include method on either the Microsoft.SharePoint.Client.File or the Microsoft.SharePoint.Client.Folder type. If you hadn't used dynamic, you'd have gotten a clear compiler error instead.
This one:
_clientContext.Load(fileOrFolder.ListItemAllFields.HasUniqueRoleAssignments);
doesn't work because _clientContext.Load takes an instance of Microsoft.SharePoint.Client.ClientObject, and HasUniqueRoleAssignments is a plain bool. But again, the compiler doesn't know what the argument is, because it comes from a dynamic object. Instead of a red squiggly in the editor, you get a runtime error.
That should work:
ListItem itemAndOnlyItem = null;
if (model.IsFolder)
{
    var folder = _clientContext.Web.GetFolderByServerRelativeUrl(serverRelativeUrl);
    itemAndOnlyItem = folder.ListItemAllFields;
}
else
{
    var file = _clientContext.Web.GetFileByServerRelativeUrl(serverRelativeUrl);
    itemAndOnlyItem = file.ListItemAllFields;
}

_clientContext.Load(itemAndOnlyItem,
    item => item.HasUniqueRoleAssignments);
_clientContext.ExecuteQuery();

var result = itemAndOnlyItem.HasUniqueRoleAssignments;

Related

Accessing an element of an array in TSX

I have a TSX file, with a state including:
tickets: Ticket[],
I now want to change one specific element inside the array and reset the state. My idea:
onClick = (ticket: Ticket, i: number) => {
var newTitle = window.prompt('hello')
ticket.title = newTitle ? newTitle : ticket.title
var tickets = [this.state.tickets]
tickets[i] = ticket
// set state
}
Besides the usual "Object could be undefined" errors, I'm mainly getting stuck at:
Type 'Ticket' is missing the following properties from type 'Ticket[]': length, pop, push, concat, and 28 more. TS2740
It's as if TypeScript still considers tickets[i] to be of type Ticket[]. (I've done other checks, and that seems to be the problem.)
Do you know why this is the case? And how can I still achieve my goal?
Thank you
There's a lot wrong here, including multiple mutations of state.
Array of Arrays
The particular error that you've posted:
Type 'Ticket' is missing the following properties from type 'Ticket[]': length, pop, push, concat, and 28 more.
Is caused by this line:
var tickets = [this.state.tickets]
You are taking the array of tickets from state and putting it into another array. The variable tickets is an array with one element, where that element is the array from your state. In TypeScript terms, it is [Ticket[]], i.e. Ticket[][]. So each element of that array is expected to be Ticket[] rather than Ticket, and when you try to set an element to a Ticket you get an error saying it should be Ticket[].
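A minimal sketch of the difference, using a simplified stand-in for the question's Ticket type (the real Ticket presumably has more properties):

```typescript
// Simplified stand-in for the question's Ticket type
type Ticket = { title: string };

const stateTickets: Ticket[] = [{ title: "a" }, { title: "b" }];

// Wrapping in brackets nests the array: nested has type Ticket[][]
const nested = [stateTickets];

// Spreading copies the elements: copy has type Ticket[]
const copy = [...stateTickets];
copy[1] = { title: "new title" }; // assigning a Ticket is now valid
```

With the spread copy, `tickets[i] = ticket` type-checks, because each element is a Ticket rather than a Ticket[].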
State Mutations
As a rule of thumb, don't mutate anything in React if you aren't certain that it's safe. Just setting ticket.title is an illegal mutation of state which will prevent your app from re-rendering properly. The Ticket object that is passed to onClick is (presumably) the same object as the one in your state so you cannot mutate it.
Instead, we use array.map (which creates a copy of the array) to either return the same Ticket object or a copied one if we are changing it. We don't actually need the ticket as an argument. If the tickets have some unique property like an id then you could also pass just the ticket and not i.
onClick = (i: number) => {
    const newTitle = window.prompt("hello");
    if (newTitle) {
        this.setState((prevState) => ({
            tickets: prevState.tickets.map((ticket, index) =>
                index === i ? { ...ticket, title: newTitle } : ticket
            )
        }));
    }
};

RazorEngine throw a NotSupportedException when compiling template in Azure Function

I need to compile a Razor view in an Azure Function to send an email, but something goes wrong. I get a NotSupportedException: The given path's format is not supported.
Here is my code:
private IRazorEngineService _engine;
public MyCtor(bool isTestEnvironment)
{
TemplateServiceConfiguration configuration = new TemplateServiceConfiguration();
configuration.Debug = isTestEnvironment;
this._engine = RazorEngineService.Create(configuration);
}
public string GetHtmlEmailBody(string templateFileName, object emailData, string layoutFileName)
{
//Get data type of email data
Type emailDataType = emailData.GetType();
string layoutFullFileName = Path.Combine(this._layoutPath, layoutFileName);
string layoutContentString = File.ReadAllText(layoutFullFileName);
var layout = new LoadedTemplateSource(layoutContentString, layoutFullFileName);
this._engine.AddTemplate("layoutName", layout);
string templateFullFileName = Path.Combine(this._templatePath, templateFileName);
string templateContentString = File.ReadAllText(templateFullFileName);
var template = new LoadedTemplateSource(templateContentString, templateFullFileName);
this._engine.AddTemplate("templateName", template);
this._engine.Compile("templateName"); //<-- Here I get the exception
string htmlEmailBody = this._engine.Run("templateName", emailDataType, emailData);
return htmlEmailBody;
}
Paths are similar to D:\\...\\Emails\\Templates. I am testing locally and it does not work. I have googled, and it seems that Azure Functions have some limitations around caching and file system management, but it is not clear how I can solve the problem.
I think I have the same problem this person has written about here.
Any idea how I can solve it? Is there something wrong in what I am doing?
I am using RazorEngine 3.10.0.
Thank you
I found the problem by downloading the code and reverse engineering it.
The problem was inside the UseCurrentAssembliesReferenceResolver class, in the GetReferences method. Here is the code that throws the exception:
return CompilerServicesUtility
.GetLoadedAssemblies()
.Where(a => !a.IsDynamic && File.Exists(a.Location) && !a.Location.Contains(CompilerServiceBase.DynamicTemplateNamespace))
.GroupBy(a => a.GetName().Name).Select(grp => grp.First(y => y.GetName().Version == grp.Max(x => x.GetName().Version))) // only select distinct assemblies based on FullName to avoid loading duplicate assemblies
.Select(a => CompilerReference.From(a))
.Concat(includeAssemblies ?? Enumerable.Empty<CompilerReference>());
The exact statements that throw the exception are File.Exists(a.Location) && !a.Location.Contains(CompilerServiceBase.DynamicTemplateNamespace). The problem is that in Azure Functions some assemblies are protected, so no information can be retrieved about them (I clearly need to study Azure Functions more).
For the moment, I solved it by writing a custom ReferenceResolver. I copied exactly the same code from UseCurrentAssembliesReferenceResolver and changed just the Where condition.
So
.Where(a => !a.IsDynamic && File.Exists(a.Location) && !a.Location.Contains(CompilerServiceBase.DynamicTemplateNamespace))
became
.Where(a => !a.IsDynamic && !a.FullName.Contains("Version=0.0.0.0") && File.Exists(a.Location) && !a.Location.Contains("CompiledRazorTemplates.Dynamic"))
I am almost sure this is not the best way to solve the problem, but it works, and after two days of being blocked I need to move on. I hope this can help someone.

Different variable name case convention in one application

This is a really trivial problem. I am just curious on how to deal with this in a "professional" manner.
I am trying to stick to each language's variable naming convention. For Node.js I am using camelCase. For the database, I am using PostgreSQL with underscore_casing.
Now the problem arises when I query data from PostgreSQL. I'll get a user object in the following format:
{user_id: 1, account_type : "Admin"}
I can pass this object directly to the server-side render, but then I have to use underscore casing to access account_type. Of course, I could manually create a new user JSON object with properties userId and accountType, but that is unnecessary work.
Is it possible to follow the naming convention of each language and avoid mixed variable name casing in the same files? What is a good way to stay organized?
There are two good ways to approach this issue. The simplest one: do no conversion and use the exact database names. The second one: camel-case the columns automatically.
Either way, you should always follow the underscore notation for all PostgreSQL declarations, as it will give you the option to activate camel-casing in your app at a later time, if it becomes necessary. Never use camel-case inside the database, or you will end up in a lot of pain later.
If you want the best of both worlds, follow the underscore notation for all PostgreSQL declarations, and convert to camel-case as you read data.
Below is an example of how to do it properly with pg-promise, copied from event receive example:
// Example below shows the fastest way to camelize column names:
const options = {
    receive(e) {
        camelizeColumns(e.data);
    }
};

function camelizeColumns(data) {
    const template = data[0];
    for (const prop in template) {
        const camel = pgp.utils.camelize(prop);
        if (!(camel in template)) {
            for (let i = 0; i < data.length; i++) {
                const d = data[i];
                d[camel] = d[prop];
                delete d[prop];
            }
        }
    }
}
Also see the following article: Pg-promise and case sensitivity in column names.
UPDATE
The code above has been updated for use of pg-promise v11 or later.
I've struggled with this too, and I've concluded that there's really no way to avoid this kind of ugliness unless you rewrite the objects that come from the database. Fortunately, that's not too difficult in JavaScript:
const fromDBtoJS = (obj) => {
    // declare a variable to hold the result
    const result = {};
    // iterate over the keys on the object
    Object.keys(obj).forEach((key) => {
        // adjust the key
        const newKey = key.replace(/_[a-z]/g, (x) => x[1].toUpperCase());
        // add the value from the old object with the new key
        result[newKey] = obj[key];
    });
    // return the result
    return result;
};
Here's a JSFiddle. The "replace" code above was found here.
If you wanted to use classes for models in your application, you could incorporate this code into the constructor or database load method so it's all handled more-or-less automatically.
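For instance, running the conversion on the sample row from the question (the helper is repeated here so the snippet is self-contained):

```javascript
// Same key-conversion helper as above, repeated so this snippet runs standalone
const fromDBtoJS = (obj) => {
  const result = {};
  Object.keys(obj).forEach((key) => {
    // turns user_id into userId, account_type into accountType, etc.
    const newKey = key.replace(/_[a-z]/g, (x) => x[1].toUpperCase());
    result[newKey] = obj[key];
  });
  return result;
};

const row = { user_id: 1, account_type: "Admin" };
const user = fromDBtoJS(row);
// user is { userId: 1, accountType: "Admin" }
```

Note this is a shallow rewrite of the top-level keys only; nested objects would need a recursive version.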

How can I eager fetch content of custom types in a ContentManager query?

I'm running into some N+1 performance issues when iterating over a collection of ContentItems of a custom type that I created solely through migrations.
ContentDefinitionManager.AlterPartDefinition("MyType", part => part
.WithField("MyField", field => field
...
)
);
ContentDefinitionManager.AlterTypeDefinition("MyType", type => type
.WithPart("MyType")
);
Every time I access a field of this part, a new query is performed. I can use QueryHints to avoid this for the predefined parts:
var myItems = _orchardServices.ContentManager.Query().ForType("MyType")
    .WithQueryHints(new QueryHints().ExpandParts<LocalizationPart>()
    ...
    );
but can I do this for the ContentPart of my custom type too? This does not seem to work:
var myItems = _orchardServices.ContentManager.Query().ForType("MyType")
    .WithQueryHints(new QueryHints().ExpandParts<ContentPart>()
    ...
    );
How can I tell Orchard to just get everything in one go? I'd prefer to be able to do this without writing my own HQL or directly addressing the repositories.
Example:
var myItems = _orchardServices.ContentManager.Query().ForType("MyType");
@foreach(var item in myItems.Take(100)) {
    foreach(var term in item.Content.MyItem.MyTaxonomyField.Terms) {
        // Executes 100 queries
        <div>@term.Name</div>
    }
}
TaxonomyField doesn't store IDs, and using the TaxonomyService inside the loop wouldn't improve performance. Right now, to work around this, I fetch all TermContentItems.Where(x => myItems.Select(i => i.Id).Contains(TermPartRecord.Id)) from the repository outside of the loop, as well as a list of all the terms of the taxonomy that the field is using. Then inside the loop:
var allTermsInThisField = termContentItems.Where(tci => tci.TermsPartRecord.Id == c.Id)
.Select(tci => terms.Where(t => t.Id == tci.TermRecord.Id).Single()).ToList()
I'm not a very experienced programmer, but this was the only way I could see to do this without digging into HQL, and it seems overly complicated for my purposes. Can Orchard do this in fewer steps?

Field index for queries not updating when value set programmatically

My module creates a custom content item through the controller:
private ContentItem createContentItem()
{
    // Add the field
    _contentDefinitionManager.AlterPartDefinition(
        "TestType",
        cfg => cfg
            .WithField(
                "NewField",
                f => f
                    .OfType(typeof(BooleanField).Name)
                    .WithDisplayName("New Field"))
    );

    // Not sure if this is needed
    _contentDefinitionManager.AlterTypeDefinition(
        "TestType",
        cfg => cfg
            .WithPart("TestType")
    );

    // Create a new TestType item
    var newItem = _contentManager.New("TestType");
    _contentManager.Create(newItem, VersionOptions.Published);

    // Set the added boolean field to true
    BooleanField newField = ((dynamic)newItem).TestType.NewField as BooleanField;
    newField.Value = true;

    // Set title (as date created, for convenience)
    var time = DateTime.Now.ToString("MM-dd-yyyy h:mm:ss tt", CultureInfo.InvariantCulture).Replace(':', '.');
    newItem.As<TitlePart>().Title = time;

    return newItem;
}
The end result is a new TestType item with a field that's set to true. Viewing the content item in the dashboard, as well as examining the ContentItemVersionRecord in the database, confirms that the value was set correctly.
However, queries don't seem to work properly on fields set in this manner. I found the IntegerFieldIndexRecord record, which I assume projections use to fill query result pages. In it, the value of TestField remains 0 (false) instead of 1 (true).
Going to the content item's edit page and simply clicking 'Save' updates the IntegerFieldIndexRecord correctly, meaning the value is then picked up by the query. How can the record be updated for field values set programmatically?
Relevant section of migration:
SchemaBuilder.CreateTable(typeof(TestTypePartRecord).Name, table => table
    .ContentPartRecord()
);

ContentDefinitionManager.AlterTypeDefinition(
    "TestType",
    cfg => cfg
        .DisplayedAs("Test Type")
        .WithPart(typeof(TitlePart).Name)
        .WithPart(typeof(ContainablePart).Name)
        .WithPart(typeof(CommonPart).Name)
        .WithPart(typeof(IdentityPart).Name)
);
Edit: The fix for this is to manually change the projection index record whenever changing a field value, using this call:
_fieldIndexService.Set(testResultItem.As<FieldIndexPart>(),
    "TestType", // Resolves as TestTypePart, which holds the field
    "newField",
    "",         // Not sure why the value name should be empty, but whatever
    true,       // The value to be set goes here
    typeof(bool));
In some cases a simple contentManager.Publish() won't do.
I've had a similar problem some time ago and actually implemented a simple helper service to tackle this problem; here's an excerpt:
public T GetStringFieldValues<T>(ContentPart contentPart, string fieldName)
{
    var fieldIndexPart = contentPart.ContentItem.As<FieldIndexPart>();
    var partName = contentPart.PartDefinition.Name;
    return this.fieldIndexService.Get<T>(fieldIndexPart, partName, fieldName, string.Empty);
}

private void SetStringFieldValue(ContentPart contentPart, string fieldName, IEnumerable<int> ids)
{
    var fieldIndexPart = contentPart.ContentItem.As<FieldIndexPart>();
    var partName = contentPart.PartDefinition.Name;
    var encodedValues = "{" + string.Join("},{", ids) + "}";
    this.fieldIndexService.Set(fieldIndexPart, partName, fieldName, string.Empty, encodedValues, typeof(string));
}
I've actually built this for use with MediaLibrary- and ContentPicker fields (they encode their value as string internally), so it might not be suitable for the boolean field in your example.
But it can't be that hard to implement, just look at the existing drivers and handlers for those fields.
There are two ways to fix this:
1) Ensure the newly created item gets published by calling ContentManager.Publish(), as Orchard.Projections.Handlers.FieldIndexPartHandler listens to the publish event to update the FieldIndexPartRecord.
2) Use IFieldIndexService to update the FieldIndexPartRecord manually; see the implementation of Orchard.Projections.Handlers.FieldIndexPartHandler to get an idea of how to do this.
Hope this helps.
Edit:
Because the item was created with Create(..., VersionOptions.Published), ContentManager.Publish() won't do anything, as the item is already considered published.
You can do the following to force the publish logic to run:
bool itemPublished = newItem.VersionRecord.Published;

// Unpublish the item first when it is already published: ContentManager.Publish()
// internally checks the published flag first and, when set, aborts silently
// -> this behaviour prevents the publish listeners from being called
if (itemPublished)
    _contentManager.Unpublish(newItem);

// The following call will result in calls to IContentHandler.Publishing() / IContentHandler.Published()
_contentManager.Publish(newItem);
Or just create the item as a draft and publish it once everything is set up correctly.
