How does ejs extract values from an object - node.js

I am using Sequelize and fetching all the data from a table using findAll, which basically returns an array of objects. What seems confusing is that the data I want to output is nested inside another object. (Sounds confusing? Let me clarify.)
So, let's say I have this short code.
If I run this code, it will give me undefined because father lies inside parent, for which I have to use user.parent.father, right?
Okay, now when fetching data from the table in my code,
I console.log my first row, and I get this.
The values I need lie inside dataValues.
In my ejs file I am using a simple for-of loop.
Now my question is: why am I not getting undefined for product.title, product.imageUrl and so on? It should have to reach that data via product.dataValues.title, because it lies in another object named dataValues.
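For illustration, a minimal sketch of the plain-object behaviour described above (names follow the question's example; the actual code and console output from the question are not reproduced here):

const user: { father?: string; parent: { father: string } } = {
  parent: { father: 'John' },
};

console.log(user.father);        // undefined - "father" is not a direct property of user
console.log(user.parent.father); // "John" - the value has to be reached through "parent"

// By the same logic one would expect to need product.dataValues.title,
// yet product.title works in the ejs loop - which is what the question asks about.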

Technically, when a value is initialized by Sequelize, your object's prototype is set to Model (the class is too long to copy-paste it here).
When you create your model, Sequelize calls init on it (line 424), which in turn calls refreshAttributes.
This one calls Object.defineProperty to define both the getter and the setter for each property you have defined in the metadata (line 1238).
The getter and setter are set to get and set functions respectively (lines 1095 to 1103).
This actually means that
instance.field
is just a property wrapped over
instance.get('field')
This corresponds with their docs, which say:
Instance instances operate with the concept of a dataValues property, which stores the actual values represented by the instance. By default, the values from dataValues can also be accessed directly from the Instance, that is:
instance.field
is the same as
instance.get('field')
is the same as
instance.getDataValue('field')
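
For illustration, a simplified sketch (not the actual Sequelize source) of how such per-attribute getters can delegate to get('field') and dataValues, roughly what refreshAttributes sets up:

// Simplified model: each attribute becomes a property that forwards to get()/set(),
// which read and write dataValues.
class FakeModel {
  dataValues: Record<string, any>;

  constructor(dataValues: Record<string, any>) {
    this.dataValues = dataValues;
    for (const key of Object.keys(dataValues)) {
      Object.defineProperty(this, key, {
        get: () => this.get(key),             // instance.field
        set: (value) => this.set(key, value), // instance.field = value
      });
    }
  }

  get(key: string) { return this.dataValues[key]; }              // instance.get('field')
  set(key: string, value: any) { this.dataValues[key] = value; } // instance.set('field', value)
}

const product: any = new FakeModel({ title: 'Book', imageUrl: '/img/book.png' });
console.log(product.title);        // 'Book' - goes through the getter
console.log(product.get('title')); // same value, read straight from dataValues

This is why product.title inside the ejs loop yields the same value as product.dataValues.title.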

Related

Why does TypeOrm advise creating an empty object via `new EntityClass()` and only THEN setting field values?

I'm using the NestJS (not Next) Node.js framework.
When I'm creating new objects I used to use new ObjectClass({...fieldsValues});
It's great, especially when you use transform pipes from class-transformer.
Besides, this approach is used for entity creation:
https://docs.nestjs.com/techniques/database#separating-entity-definition
But as far as I can see in different guides on TypeORM usage,
here: https://typeorm.io/#/ ,
and here: https://orkhan.gitbook.io/typeorm/docs/entities ,
they show first creating an empty object and only then setting its fields with values:
const object = new EntityObject();
object.field = 'value';
Why? Does it make sense?
Does Node.js create a redundant hidden class for the properties passed via an object into the entity class constructor? If so, we could just pass comma-separated arguments instead.
I believe it's just because that's how the docs are written. Looking at the code for BaseEntity, it does not look like having a constructor that assigns the fields would be a problem.
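For illustration, a hedged sketch of what such a constructor might look like on a TypeORM entity (the Photo entity and its fields are made-up for the example, not taken from the question):

import { BaseEntity, Column, Entity, PrimaryGeneratedColumn } from 'typeorm';

@Entity()
export class Photo extends BaseEntity {
  @PrimaryGeneratedColumn()
  id!: number;

  @Column()
  title!: string;

  // Accepting a partial object here allows new Photo({ title: '...' })
  // instead of creating an empty object and assigning each field afterwards.
  // The parameter stays optional because TypeORM instantiates entities without
  // constructor arguments when it loads rows from the database.
  constructor(fields?: Partial<Photo>) {
    super();
    Object.assign(this, fields);
  }
}

const photo = new Photo({ title: 'Me and Bears' });

Either way the entity ends up in the same state, which matches the answer above: the field-by-field form is simply what the docs happen to show.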

How to define a function to get a field element of a Marshmallow Schema while still serialising through a nested Schema

So I have a nested many Schema (e.g. Users) inside another Schema (e.g. Computer). My input object to be serialised by the Schema is complex and does not allow for assignment, and modifying it to allow for assignment is impractical.
The input object (e.g. ComputerObject) itself does not contain a value called "Users", but nested in a few other objects is a function that can get the users (e.g. ComputerObject.OS.Accounts.getUsers()), and I want the output of that function to be used as the value for that field in the schema.
Two possible solutions exist that I know of: I could either define the field as a fields.Method (calling the function there), or I could use a @post_dump function to call the function and add it to the final output JSON, as it can provide both the initial object and the output JSON.
The issue with both of these is that the result is then not serialised through the nested Users Schema, which contains more nested Schemas and so on; it would just set that field equal to the return value of getUsers, which I don't want.
I have tried to define it in a @pre_dump so that it can then be serialised in the dump (note: this schema is used only for dumping and not for loading), but as that takes in the initial object I cannot assign to it.
Basically, I have a thing I am trying to do, and a bunch of hacky workarounds that could make it work, but not without breaking other things or missing out on the validation altogether, and no actual solution it seems. Does anybody know how to do this properly?
For further info, the object being input is a complex Django Model, which might give me some avenues I'm not aware of; my Django experience is somewhat lacking.
So I figured this out myself eventually:
Instead of managing the data-getting in the main schema, you can define the method in the sub-schema using @pre_dump with pass_many=True, so the following code works correctly:
from marshmallow import Schema, fields, pre_dump

class User(Schema):
    id = fields.UUID()

    @pre_dump(pass_many=True)
    def get_data(self, data, **kwargs):
        # data here is the "OS" object; pull the actual user list out of it
        data = data.Accounts.getUsers()
        return data

class Computer(Schema):
    # The field will need to be called "OS" in order to correctly look in the "OS" attribute for further data
    OS = fields.Nested(User, many=True, data_key="users")

NSCoder - Only Called On Object instantiation

I have set up a class that conforms to NSCoding, and works as expected when I first create the object. It is essentially the same set up as this
The problem I am having is that changes to the object's properties are not kept.
For example,
Foo is created and has a property called name.
Foo.name = @"bar"
I can encode / decode the object and it retains the name bar.
If I try and change
Foo.name = @"newName"
The encode method is not called again, so foo.name remains as @"bar".
(I have a log statement within the encode method.)
Also,
I am using a Core Data object that has a transformable property which points to the foo object.
Thanks
To "save" the object, you have to call the encode method, e.g. to write it to disk or send it to an output stream.
However, since you are using Core Data to persist the object, you have to call
[managedObjectContext save:&error];
to persist the object after changing it.
That being said, I do not think it makes a lot of sense to have a transformable property that points to a custom class that holds a string property. Instead, you should think of a more appropriate data structure, so that you only need transformable properties for non-standard data types that cannot be persisted using the standard data types already built into Core Data.

Referencing an instance of a given class in sequence diagrams

I have to model a system where an object of the class Person will invoke the static method getBook(...) : Book on the class Book which will return an instance of a particular book.
How do you reference the book instance obtained by the operation?
As of now, I can think of two approaches, neither of which I have ever seen/used, which is why I am looking for the correct approach.
The first approach is to invoke methods directly on the book instance obtained, e.g. if the reference returned by getBook(...) : Book is named matchingBook, I would use matchingBook.doSomething(...), much like having a local variable.
The second way, which I find more in line with how sequence diagrams are meant to be used, is to let the book instance returned by the operation appear with its own lifeline, e.g. next to the Book class, and reference it with an arrow labeled doSomething(...).
However, with the second approach, it is not that obvious that this object is in fact the one returned by the operation.
The second approach is the correct one. To show that you are pointing to the returned object (matchingBook), you can add the variable name to the title of the lifeline, like this:
The second approach is the correct one. Anytime you call operations on an object returned by a first operation, you can't do better than a name match between the result of the first call and the lifeline.
Anyway, I don't really understand what you expect of the first way: where would you put matchingBook.doSomething(...)? On an arrow pointing to which lifeline?

Storing object in Esent persistent dictionary gives: Not supported for SetColumn Parameter error

I am trying to save an object which implements an interface, say IInterface.
private PersistentDictionary<string, IInterface> Object = new PersistentDictionary<string, IInterface>(Environment.CurrentDirectory + @"\Object");
Since many classes implement the same interface (all of which need to be cached), for a generic approach I want to store an object of type IInterface in the dictionary.
That way, anywhere in the code I can pull out that object, cast it as IInterface, and use that object's internal implementation of its methods, etc.
But, as soon as the Esent cache is initialized it throws this error:
Not supported for SetColumn
Parameter name: TColumn
Actual value was IInterface.
I have tried to use XmlSerializer to do the same, but it is unable to deserialize an interface type. Also, the [Serializable] attribute cannot be used on top of an interface, so I am stuck.
I have also tried marking all the implementations (classes) of the interface as [Serializable] as a last-ditch attempt, but to no avail.
Does anyone know a way out? Thanks in advance!
The only reason that only structs are supported (as well as some basic immutable classes such as string) is that the PersistentDictionary is meant to be a drop-in replacement for Dictionary, SortedDictionary and other similar classes.
Suppose I have the following code:
class MyClass
{
    public int val;
}
// ...
var dict = new Dictionary<int, MyClass>();
var x = new MyClass();
x.val = 1;
dict.Add(0, x);
x.val = 2; // the object is modified after it has been added to the dictionary
var y = dict[0];
Console.WriteLine(y.val);
The output in this case would be 2. But if I'd used the PersistentDictionary instead of the regular one, the output would be 1. The class was created with value 1, and then changed after it was added to the dictionary. Since a class is a reference type, when we retrieve the item from the dictionary, we will also have the changed data.
Since the PersistentDictionary writes the data to disk, it cannot really handle reference types this way. Serializing it, and writing it to disk is essentially the same as treating the object as a value type (an entire copy is made).
Because it's intended to be used instead of the standard dictionaries, and the fact that it cannot handle reference types with complete transparency, the developers instead opted to support only structs, because structs are value types already.
However, if you're aware of this limitation and promise to be careful not to fall into this trap, you can allow it to serialize classes quite easily. Just download the source code and compile your own version of the EsentCollections library. The only change you need to make to it is to change this line:
if (!(type.IsValueType && type.IsSerializable))
to this:
if (!type.IsSerializable)
This will allow classes to be written to the PersistentDictionary as well, provided that the class is Serializable and its members are Serializable too. A huge benefit is that this also lets you store arrays in there. All you have to keep in mind is that it's not a real dictionary, so when you write an object to it, it stores a copy of the object. Updating any of your object's members after adding it to the PersistentDictionary will therefore not automatically update the copy in the dictionary; you'd need to remember to update it manually.
PersistentDictionary can only store value-structs and a very limited subset of classes (string, Uri, IPAddress). Take a look at ColumnConverter.cs, at private static bool IsSerializable(Type type) for the full restrictions. You'd be hitting the typeinfo.IsValueType() restriction.
By the way, you can also try posting questions about PersistentDictionary at http://managedesent.codeplex.com/discussions .
-martin
