Accessing an element of an array in TSX - node.js

I have a TSX file, with a state including:
tickets: Ticket[],
I now want to change one specific element inside the array and reset the state. My idea:
onClick = (ticket: Ticket, i: number) => {
  var newTitle = window.prompt('hello')
  ticket.title = newTitle ? newTitle : ticket.title
  var tickets = [this.state.tickets]
  tickets[i] = ticket
  // set state
}
Besides the usual "Object could be undefined" errors, I'm mainly getting stuck at:
Type 'Ticket' is missing the following properties from type 'Ticket[]': length, pop, push, concat, and 28 more. TS2740
It's as if TypeScript still considers tickets[i] to be of type Ticket[]. (I've done other checks and that seems to be the problem.)
Do you know why this is the case? And how can I still achieve my goal?
Thank you

There's a lot that's wrong here, including multiple mutations of state.
Array of Arrays
The particular error that you've posted:
Type 'Ticket' is missing the following properties from type 'Ticket[]': length, pop, push, concat, and 28 more.
Is caused by this line:
var tickets = [this.state.tickets]
You are taking the array of tickets from state and putting it inside another array. The variable tickets is an array with one element, where that element is the array from your state. In TypeScript terms, it is [Ticket[]], i.e. Ticket[][]. So each element of that array is expected to be Ticket[] rather than Ticket. When you try to set an element to a Ticket, you get an error saying that it should be Ticket[].
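A minimal sketch of the difference; the spread syntax is presumably what was intended here:

// Wraps the state array inside a new array: type is [Ticket[]], i.e. Ticket[][]
var wrapped = [this.state.tickets]
// Copies the elements into a new array: type is Ticket[]
var copied = [...this.state.tickets]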
State Mutations
As a rule of thumb, don't mutate anything in React if you aren't certain that it's safe. Just setting ticket.title is an illegal mutation of state which will prevent your app from re-rendering properly. The Ticket object that is passed to onClick is (presumably) the same object as the one in your state so you cannot mutate it.
Instead, we use array.map (which creates a copy of the array) to return either the same Ticket object or a copied one if we are changing it. We don't actually need the ticket as an argument. If the tickets have some unique property like an id, you could also pass just that instead of i (a variant is shown after the main example below).
onClick = (i: number) => {
  const newTitle = window.prompt("hello");
  if (newTitle) {
    this.setState((prevState) => ({
      tickets: prevState.tickets.map((ticket, index) =>
        index === i ? { ...ticket, title: newTitle } : ticket
      )
    }));
  }
};
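If each Ticket has some unique property such as an id (an assumption; the question doesn't show the Ticket type), the same pattern works without the index:

onClick = (id: string) => {
  const newTitle = window.prompt("hello");
  if (newTitle) {
    this.setState((prevState) => ({
      tickets: prevState.tickets.map((ticket) =>
        // `ticket.id` is assumed to exist on Ticket
        ticket.id === id ? { ...ticket, title: newTitle } : ticket
      )
    }));
  }
};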

How do I prevent a re-render of a large list?

I have a 60x30 grid for a game editor and as cells are updated, a new array is created to hold the state.
The problem is that when I update that grid array, this changes the property and it causes render() to recreate the grid. This seems almost obvious but then what do my options become?
If this is overly specific, imagine just a huge list of items and you have an immutable array in which one of the items properties must change.
render() {
  return html`
    ${this.data?.cells.map((row) => {
      return row.map((cell) => {
        return html`<editor-cell .data="${cell}"></editor-cell>`;
      });
    })}
  `;
}
Coincidentally, I had the same problem in Angular with a for loop, only it had trackBy, which used the index or item.id to prevent the recreation of a list of items. I just accepted the unicorns for that, but here it is the same issue.
Question:
What am I missing about immutable states here? I totally understand why this is happening: it's a new array, and so lit-element just renders what it deems a new array. I want that, but once the grid has been rendered, I don't understand the separation between rendering and data updates. I'm either missing a key lifecycle understanding, or my approach to state is just totally whack.
If you update the complete array, that changes the memory reference, and then lit-html has to re-render the whole array because it doesn't know which item changed.
The lit-html documentation has a section about Repeating templates that explains this very well.
In your case you should use the repeat directive, which performs efficient updates of lists based on user-supplied keys:
// Assumes the directive is imported, e.g.: import { repeat } from 'lit/directives/repeat.js';
// (in older lit-html versions: import { repeat } from 'lit-html/directives/repeat';)
render() {
  return html`
    ${repeat(this.data?.cells, row => row.id,
      row => html`${repeat(row, cell => cell.id,
        cell => html`<editor-cell .data="${cell}"></editor-cell>`
      )}`
    )}
  `;
}
Notice the importance of the second argument: it's the guaranteed unique key for each item.

Different variable name case convention in one application

This is a really trivial problem. I am just curious on how to deal with this in a "professional" manner.
I am trying to stick to a variable naming convention. For NodeJS I am using camelCase. For the database, I am using PostgreSQL with underscore_casing.
Now the problem arises when I query data from PostgreSQL. I'll get a user object with the following format:
{user_id: 1, account_type : "Admin"}
I can pass this object directly to the server-side render, and will have to use underscore casing to access account_type. Of course, I can manually create a new user JSON object with properties userId and accountType, but that is unnecessary work.
Is it possible to follow the variable naming convention for both languages and avoid having mixed variable name casing in some files? What is a good way to stay organized?
There are two good ways to approach this issue. The simplest one: do no conversion and use the exact database names. The second one: camel-case the columns automatically.
Either way, you should always follow the underscore notation for all PostgreSQL declarations, as it will give you the option to activate camel-casing in your app at a later time, if it becomes necessary. Never use camel-case inside the database, or you will end up in a lot of pain later.
If you want the best of both worlds, follow the underscore notation for all PostgreSQL declarations, and convert to camel-case as you read data.
Below is an example of how to do it properly with pg-promise, copied from event receive example:
// Example below shows the fastest way to camelize column names:
const options = {
  receive(e) {
    camelizeColumns(e.data);
  }
};
function camelizeColumns(data) {
  const template = data[0];
  for (const prop in template) {
    const camel = pgp.utils.camelize(prop);
    if (!(camel in template)) {
      for (let i = 0; i < data.length; i++) {
        const d = data[i];
        d[camel] = d[prop];
        delete d[prop];
      }
    }
  }
}
Also see the following article: Pg-promise and case sensitivity in column names.
UPDATE
The code above has been updated for use of pg-promise v11 or later.
I've struggled with this too, and I've concluded that there's really no way to avoid this kind of ugliness unless you rewrite the objects that come from the database. Fortunately, that's not too difficult in JavaScript:
const fromDBtoJS = (obj) => {
  // declare a variable to hold the result
  const result = {};
  // iterate over the keys on the object
  Object.keys(obj).forEach((key) => {
    // adjust the key
    const newKey = key.replace(/_[a-z]/g, (x) => x[1].toUpperCase());
    // add the value from the old object with the new key
    result[newKey] = obj[key];
  });
  // return the result
  return result;
};
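For example, the user object from the question comes out with camel-cased keys:

// { user_id: 1, account_type: "Admin" } becomes { userId: 1, accountType: "Admin" }
console.log(fromDBtoJS({ user_id: 1, account_type: "Admin" }));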
Here's a JSFiddle. The "replace" code above was found here.
If you wanted to use classes for models in your application, you could incorporate this code into the constructor or database load method so it's all handled more-or-less automatically.

SharePoint CSOM - Load HasUniqueRoleAssignment fails all time

I am getting much confused as to where I am going wrong. I have done this many times before, but I'm not sure why it's NOT working this time. Here is the code:
dynamic fileOrFolder;
if (model.IsFolder)
    fileOrFolder = _clientContext.Web.GetFolderByServerRelativeUrl(serverRelativeUrl);
else
    fileOrFolder = _clientContext.Web.GetFileByServerRelativeUrl(serverRelativeUrl);
I have tried ALL of the below, but nothing worked:
_clientContext.Load(fileOrFolder, item => item.Include(file => file.ListItemAllFields));
dynamic blhasUniquePermission = fileOrFolder.ListItemAllFields.HasUniqueRoleAssignments;
OR
_clientContext.Load(fileOrFolder.ListItemAllFields.HasUniqueRoleAssignments);
OR
_clientContext.Load(fileOrFolder.ListItemAllFields,
    items => items.Include(
        item => item.Id,
        item => item.DisplayName,
        item => item.HasUniqueRoleAssignments));
OR
_clientContext.Load(fileOrFolder.ListItemAllFields, "Include(HasUniqueRoleAssignments)");
_clientContext.ExecuteQuery();
Every time it immediately throws an error, either on the Load line itself or on ExecuteQuery. We definitely know the property is there in the ListItemAllFields collection, so why is it doing this?
It looks like you used the dynamic keyword in order to hack C# into letting you create the fileOrFolder variable and store an instance of one of two unrelated types inside it.
Not only is it weird, but by doing so you've also crippled IntelliSense and the compiler.
Take this line:
_clientContext.Load(fileOrFolder, item => item.Include(file => file.ListItemAllFields));
It probably throws at runtime, because there is no Include method on either the Microsoft.SharePoint.Client.File or Microsoft.SharePoint.Client.Folder type. If you didn't use dynamic, you'd get a clear compiler error instead.
This one:
_clientContext.Load(fileOrFolder.ListItemAllFields.HasUniqueRoleAssignments);
doesn't work, because _clientContext.Load takes an instance of Microsoft.SharePoint.Client.ClientObject. But again, the compiler doesn't know what the argument is, because it comes from a dynamic object. Instead of a red squiggly in the editor, you get a runtime error.
That should work:
ListItem itemAndOnlyItem = null;
if (model.IsFolder)
{
    var folder = _clientContext.Web.GetFolderByServerRelativeUrl(serverRelativeUrl);
    itemAndOnlyItem = folder.ListItemAllFields;
}
else
{
    var file = _clientContext.Web.GetFileByServerRelativeUrl(serverRelativeUrl);
    itemAndOnlyItem = file.ListItemAllFields;
}
_clientContext.Load(itemAndOnlyItem,
    item => item.HasUniqueRoleAssignments);
_clientContext.ExecuteQuery();
var result = itemAndOnlyItem.HasUniqueRoleAssignments;

Field index for queries not updating when value set programmatically

My module creates a custom content item through the controller:
private ContentItem createContentItem()
{
    // Add the field
    _contentDefinitionManager.AlterPartDefinition(
        "TestType",
        cfg => cfg
            .WithField(
                "NewField",
                f => f
                    .OfType(typeof(BooleanField).Name)
                    .WithDisplayName("New Field"))
    );
    // Not sure if this is needed
    _contentDefinitionManager.AlterTypeDefinition(
        "TestType",
        cfg => cfg
            .WithPart("TestType")
    );
    // Create new TestType item
    var newItem = _contentManager.New("TestType");
    _contentManager.Create(newItem, VersionOptions.Published);
    // Set the added boolean field to true
    BooleanField newField = ((dynamic)newItem).TestType.NewField as BooleanField;
    newField.Value = true;
    // Set title (as date created, for convenience)
    var time = DateTime.Now.ToString("MM-dd-yyyy h:mm:ss tt", CultureInfo.InvariantCulture).Replace(':', '.');
    newItem.As<TitlePart>().Title = time;
    return newItem;
}
The end result of this is a new TestType item with a field that's set to true. Viewing the content item in the dashboard as well as examining ContentItemVersionRecord in the database confirms that the value was set correctly.
However, queries don't seem to work properly on fields that are set in this manner. I found the record IntegerFieldIndexRecord, which is what I assume projections use to fill query result pages. On this, the value of NewField remains at 0 (false) instead of 1 (true).
Going to the content item edit page and simply clicking 'save' updates IntegerFieldIndexRecord correctly, meaning that the value is now picked up by the query. How can the record be updated for field values set programmatically?
Relevant section of migration:
SchemaBuilder.CreateTable(typeof(TestTypePartRecord).Name, table => table
    .ContentPartRecord()
);
ContentDefinitionManager.AlterTypeDefinition(
    "TestType",
    cfg => cfg
        .DisplayedAs("Test Type")
        .WithPart(typeof(TitlePart).Name)
        .WithPart(typeof(ContainablePart).Name)
        .WithPart(typeof(CommonPart).Name)
        .WithPart(typeof(IdentityPart).Name)
);
Edit: The fix for this is to manually change the projection index record whenever changing a field value, using this call:
_fieldIndexService.Set(testResultItem.As<FieldIndexPart>(),
    "TestType", // Resolves as TestTypePart, which holds the field
    "newField",
    "", // Not sure why value name should be empty, but whatever
    true, // The value to be set goes here
    typeof(bool));
In some cases a simple contentManager.Publish() won't do.
I've had a similar problem some time ago and actually implemented a simple helper service to tackle this problem; here's an excerpt:
public T GetStringFieldValues<T>(ContentPart contentPart, string fieldName)
{
    var fieldIndexPart = contentPart.ContentItem.As<FieldIndexPart>();
    var partName = contentPart.PartDefinition.Name;
    return this.fieldIndexService.Get<T>(fieldIndexPart, partName, fieldName, string.Empty);
}

private void SetStringFieldValue(ContentPart contentPart, string fieldName, IEnumerable<int> ids)
{
    var fieldIndexPart = contentPart.ContentItem.As<FieldIndexPart>();
    var partName = contentPart.PartDefinition.Name;
    var encodedValues = "{" + string.Join("},{", ids) + "}";
    this.fieldIndexService.Set(fieldIndexPart, partName, fieldName, string.Empty, encodedValues, typeof(string));
}
I've actually built this for use with MediaLibrary- and ContentPicker fields (they encode their value as string internally), so it might not be suitable for the boolean field in your example.
But it can't be that hard to implement; just look at the existing drivers and handlers for those fields.
There are 2 ways to fix this:
1) Ensure the newly created item gets published by calling ContentManager.Publish(), as Orchard.Projections.Handlers.FieldIndexPartHandler listens to the publish event to update the FieldIndexPartRecord
2) Use IFieldIndexService to update the FieldIndexPartRecord manually; see the implementation of Orchard.Projections.Handlers.FieldIndexPartHandler to get an idea of how to do this
Hope this helps.
Edit:
Due to calling Create(...Published), ContentManager.Publish() won't do anything, as the item is already considered published.
You can do the following to force the publish logic to run:
bool itemPublished = newItem.VersionRecord.Published;
// Unpublish the item first when it is already published, as ContentManager.Publish()
// internally first checks for the published flag and, when set, aborts silently
// -> this behaviour prevents calling publish listeners
if (itemPublished)
    _contentManager.Unpublish(newItem);
// The following call will result in calls to IContentHandler.Publishing() / IContentHandler.Published()
_contentManager.Publish(newItem);
Or just create the item as a draft and publish it when everything is set up correctly.

MongoDB update object and remove properties?

I have been searching for hours, but I cannot find anything about this.
Situation:
Backend, consisting of NodeJS + Express + Mongoose (+ MongoDB of course).
Frontend retrieves object from the Backend.
Frontend makes some changes (adds/updates/removes some attributes).
Now I use mongoose: PersonModel.findByIdAndUpdate(id, updatedPersonObject);
Result: added properties are added. Updated properties are updated. Removed properties... are still there!
Now I've been searching for an elegant way to solve this, but the best I could come up with is something like:
var properties = Object.keys(PersonModel.schema.paths);
for (var i = 0, len = properties.length; i < len; i++) {
  // explicitly remove values that are not in the update
  var property = properties[i];
  if (typeof(updatedPersonObject[property]) === 'undefined') {
    // Mongoose does not like it if I remove the _id property
    if (property !== '_id') {
      oldPersonDocument[property] = undefined;
    }
  }
}
oldPersonDocument.save(function() {
  PersonModel.findByIdAndUpdate(id, updatedPersonObject);
});
(I did not even include trivial code to fetch the old document).
I have to write this for every object I want to update. I find it hard to believe that this is the best way to handle this. Any suggestions, anyone?
Edit:
Another workaround I found: to unset a value in MongoDB you have to set it to undefined.
If I set this value in the frontend, it is lost in the REST call. So I set it to null in the frontend, and then in the backend I convert all null values to undefined.
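A minimal sketch of that null-to-undefined conversion (the helper name is illustrative, not from the question):

// Replace every null value with undefined; assigning undefined to a document
// path and then saving unsets that path in Mongoose
const nullsToUndefined = (obj) => {
  Object.keys(obj).forEach((key) => {
    if (obj[key] === null) {
      obj[key] = undefined;
    }
  });
  return obj;
};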
Still ugly though. There must be a better way.
You could use replaceOne() if you want to know how many documents matched your filter condition and how many were changed (I believe it only changes one document, so this may not be useful to know). Docs: https://mongoosejs.com/docs/api/model.html#model_Model.replaceOne
Or you could use findOneAndReplace() if you want to see the document. I don't know if it is the old doc or the new doc that is passed to the callback; the docs say "Finds a matching document, replaces it with the provided doc, and passes the returned doc to the callback," but you could test that on your own. Docs: https://mongoosejs.com/docs/api.html#model_Model.findOneAndReplace
So, instead of:
PersonModel.findByIdAndUpdate(id, updatedPersonObject);
you could do:
PersonModel.replaceOne({ _id: id }, updatedPersonObject);
As long as you have all the properties you want on the object you will use to replace the old doc, you should be good to go.
I'm also really struggling with this, but I don't think your solution is too bad. Our setup is frontend -> update function in backend -> sanitize the user's input -> save in db. For the sanitization part, we use a helper function that integrates your approach.
private static patchModel(dbDocToUpdate: IModel, dataFromUser: Record<string, any>): IModel {
  const sanitized: Record<string, any> = {};
  const properties = Object.keys(PersonModel.schema.paths);
  for (const key of properties) {
    if (key in dbDocToUpdate) {
      sanitized[key] = dataFromUser[key];
    }
  }
  Object.assign(dbDocToUpdate, sanitized);
  return dbDocToUpdate;
}
That works smoothly and sets the values to undefined. Hence, they get removed from the document in the db.
The only problem that remains for us is that we wanted to allow partial updates. With that solution that's not possible and you always have to send everything to the backend.
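For illustration, with hypothetical field names (not from the question): if the schema has title and description and the frontend sends only { title: 'new' }, the helper copies description over as undefined, so it is unset on save (calling the helper directly here for simplicity):

const doc = await PersonModel.findById(id); // e.g. { title: 'old', description: 'text' }
patchModel(doc, { title: 'new' });          // sanitized becomes { title: 'new', description: undefined }
await doc.save();                           // description is removed from the stored document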
EDIT
Another workaround we found is setting the property to an empty string in the frontend. Mongo then also removes the property in the database.
