Is there a way to tell MongoClient that there are some properties I don't want to store? For example, I have some properties that contain cyclic references, which fail to serialize and cause a few problems. I'd rather not have to set them to null every time before I save, then reinstate those values once the insert has finished.
One way to do this is with a little help from the omit method of the underscore (or lodash) library. That will cleanly create a copy of your object without the problematic properties.
var objectToInsert = _.omit(fullObject, 'badField1', 'badField2');
collection.insert(objectToInsert, callback);
Another way to go is to use Mongoose, which lets you define schemas for your collections so that only the fields in the schema get included when you save.
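A rough sketch of the Mongoose approach (the Thing model and its fields are made up for illustration, reusing fullObject and callback from the snippet above; by default Mongoose's strict mode drops paths that aren't declared in the schema when saving):

const mongoose = require('mongoose');

// Only fields declared in the schema end up in the stored document
const thingSchema = new mongoose.Schema({
  name: String,
  count: Number
});
const Thing = mongoose.model('Thing', thingSchema);

// badField1 is not declared in the schema, so it is silently dropped on save
new Thing({ name: 'a', count: 1, badField1: fullObject.badField1 }).save(callback);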
I'm using the NestJS (not Next) Node.js framework.
When creating new objects I used to write new ObjectClass({...fieldsValues});
It's great, especially when you use transform pipes from class-transformer.
Besides, this approach is used for entity creation:
https://docs.nestjs.com/techniques/database#separating-entity-definition
But as far as I can see in various guides on TypeORM usage,
here: https://typeorm.io/#/
and here: https://orkhan.gitbook.io/typeorm/docs/entities,
they show creating an empty object first and only then setting its fields:
const object = new EntityObject();
object.field = 'value';
Why? Does it make sense?
Does Node.js create a redundant hidden class for the properties passed via an object into the entity class constructor? If so, we could just pass comma-separated arguments instead.
I believe it's just because that's how the docs are written. Looking at the code for BaseEntity, it does not look like adding a constructor that assigns the fields would be a problem.
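For example, a minimal sketch of an entity whose constructor accepts the field values (the entity and its fields are made up for illustration; column decorators are omitted for brevity):

const { BaseEntity } = require('typeorm');

class User extends BaseEntity {
  // Accept all field values up front instead of assigning them one by one.
  // The argument must stay optional because TypeORM also instantiates
  // entities itself (without arguments) when loading rows from the database.
  constructor(fields) {
    super();
    Object.assign(this, fields || {});
  }
}

const user = new User({ name: 'Alice', age: 30 });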
So I have a nested many-schema (e.g. Users) inside another schema (e.g. Computer). My input object, which is to be serialised by the schema, is complex and does not allow for assignment, and modifying it to allow for assignment is impractical.
The input object (e.g. ComputerObject) itself does not contain a value called "Users", but nested a few objects down is a function that can get the users (e.g. ComputerObject.OS.Accounts.getUsers()), and I want the output of that function to be used as the value for that field in the schema.
Two possible solutions I know of: I could either define the field as fields.Method(<call the function here>), or I could use a @post_dump function to call the function and add its result to the final output JSON, since that can provide both the initial object and the output JSON.
The issue with both of these is that the result then isn't serialised through the nested Users schema, which contains more nested schemas and so on; the field would just be set to the raw return value of getUsers, which I don't want.
I have tried to set this up in a @pre_dump hook so that it can then be serialised in the dump (note: this schema is used only for dumping and not for loading), but as that takes in the initial object I cannot assign to it.
Basically, I have a thing I am trying to do and a bunch of hacky workarounds that could make it work, but not without breaking other things or missing out on the validation altogether, and no actual solution it seems. Does anybody know how to do this properly?
For further info, the object that is being input is a complex Django model, which might give me some avenues I'm not aware of; my Django experience is somewhat lacking.
So I figured this out myself eventually:
Instead of managing the data-getting in the main schema, you can define it in the sub-schema using a @pre_dump hook with pass_many=True, so the following code works correctly:
from marshmallow import Schema, fields, pre_dump

class User(Schema):
    id = fields.UUID()

    @pre_dump(pass_many=True)
    def get_data(self, data, **kwargs):
        # "data" here is the Computer's OS attribute, so pull the users out of it
        data = data.Accounts.getUsers()
        return data

class Computer(Schema):
    # The field needs to be called "OS" in order to correctly look in the "OS"
    # attribute for further data; data_key renames it to "users" in the output
    OS = fields.Nested(User, many=True, data_key="users")
I am using Sequelize and fetch all the data from a table using findAll, which basically returns an array of objects. What seems confusing is that the data I am showing as output is nested inside those objects. (Sounds confusing? Let me clarify.)
So, let's say I have this short piece of code:
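(The actual snippet isn't included here; a hypothetical stand-in that matches the description below could look like this:)

const user = {
  name: 'John',
  parent: {
    father: 'Jim'
  }
};

console.log(user.father);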
Here, if I run this code, it will give me undefined because father lies inside parent, for which I have to use user.parent.father, right?
Okay, now on to fetching data from the table in my code.
I console.log my first row, and this is what I get.
Now, the values I need lie in dataValues.
In my EJS file I am using a simple for-of loop.
Now my question is: why am I not getting undefined for product.title, product.imageUrl, and so on? I would expect to have to access that data via product.dataValues.title, because it lies in another object named dataValues.
Technically, when an instance is initialized by Sequelize, your object's prototype is set to Model (the class is too long to copy-paste here).
When you create your model, Sequelize calls init on it (line 424), which in turn calls refreshAttributes.
This one calls Object.defineProperty to define both the getter and the setter for each property you have defined in the metadata (line 1238).
The getter and setter are set to get and set functions respectively (lines 1095 to 1103).
This actually means that
instance.field
is just a property wrapped over
instance.get('field')
This corresponds with their docs, which say:
Instance instances operate with the concept of a dataValues property, which stores the actual values represented by the instance. By default, the values from dataValues can also be accessed directly from the Instance, that is:
instance.field
is the same as
instance.get('field')
is the same as
instance.getDataValue('field')
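A stripped-down sketch of that mechanism (illustrative only, not Sequelize's actual code, which defines the accessors on the model prototype):

class FakeModel {
  constructor(values) {
    this.dataValues = values;
    // Define a getter/setter pair for each attribute, backed by dataValues
    for (const key of Object.keys(values)) {
      Object.defineProperty(this, key, {
        get() { return this.get(key); },
        set(value) { this.set(key, value); }
      });
    }
  }
  get(key) { return this.dataValues[key]; }
  set(key, value) { this.dataValues[key] = value; }
}

const product = new FakeModel({ title: 'Book', imageUrl: '/img/book.png' });
console.log(product.title); // 'Book' - the getter reads straight out of dataValues

This is why product.title and product.dataValues.title give you the same value.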
I have a reporting-based website using ServiceStack and OrmLite, with a SQL Server back-end. Because some of the reports take a long time to run, I'd like to give the queries that run reports a longer CommandTimeout, either globally or selectively (via a Service-derived class or some attribute). I'd like to keep using the existing code that queries data for the reports (Db.Select, Db.SqlList calls), so that I don't have to write boilerplate code in each reporting class and method.
I've tried using the AlwaysUseCommand, but that requires declaring a single command that has an open connection. I don't see how I can do that without opening the connection in my AppHost.Configure method and never closing it. It'd be nicer if it were a function pointer, but alas.
I've also looked at overriding the Db property of Service, but that just returns an open IDbConnection without letting me override the command that gets created.
It seems like my last option is to create a custom DbConnection class, like the MiniProfiler's ProfiledDbConnection, but I'd have to implement a dozen abstract methods. It seems like overkill in order to set one property on a DbCommand.
So, in ServiceStack.OrmLite, how do I globally modify the CommandTimeout of DbCommands that are created?
You can change the CommandTimeout globally with:
OrmLiteConfig.CommandTimeout = NewTimeoutInSeconds;
Scoped Timeout
You can also specify a Timeout for a particular db connection with:
using (var db = dbFactory.OpenDbConnection())
{
    db.SetCommandTimeout(NewTimeoutInSeconds);
}
I'm using db4o with Groovy (actually Griffon). I'm saving dozens of objects into a db4o ObjectSet and see that the .yarv file size is about 11 MB. I've checked its contents and found that it stores the metaClass, with all its nested fields, for every object. That's a waste of space.
I'm looking for a way to avoid storing the metaClass and thereby reduce the size of the resulting .yarv file, since I'm going to use db4o to store millions of entities.
Should I try the callConstructors(true) db4o configuration? Do you think it would help?
Any help would be highly appreciated.
As an alternative, you can just store Groovy-bean instances. Those are compiled down to regular Java-ish classes with no Groovy-specific code attached to them.
Just like this:
class Customer {
    // properties
    Integer id
    String name
    Address address
}

class Address {
    String street
}

def customer = new Customer(id: 1, name: "Gromit", address: new Address(street: "Fun"))
I don't know Groovy, but based on your description every Groovy object carries metadata, and you want to skip storing those objects.
If that is the case, installing a "null translator" (the TNull class) will cause the "translated" objects not to be stored.
PS: The call-constructors configuration has no effect on what gets stored in the DB; it only affects how objects are instantiated when reading from the DB.
Hope this helps