Inheriting from both ABC and django.db.models.Model raises metaclass exception - python-3.x

I am trying to implement a Django data model class, which is also an interface class, using Python 3. My reason for doing so is that I'm writing a base class for my colleague, who needs to implement three methods in every class he derives from mine. I am trying to give him a simplified way to use the functionality of a system I've designed, but he must override a few methods to supply the system with enough information to execute the code in his inherited classes.
I know this is wrong, because it's throwing exceptions, but I'd like to have a class like the following example:
from django.db import models
from abc import ABC, abstractmethod

class AlgorithmTemplate(ABC, models.Model):
    name = models.CharField(max_length=32)

    @abstractmethod
    def data_subscriptions(self):
        """
        This method returns a list of topics this class will subscribe to using websockets.

        NOTE: This method MUST be overridden!

        :rtype: list
        """
I understand I could avoid inheriting from the ABC class, but I'd like to use it for reasons I won't bore you with here.
The Problem
After including a class like the one above in my project and running python manage.py makemigrations, I get the error: TypeError: metaclass conflict: the metaclass of a derived class must be a (non-strict) subclass of the metaclasses of all its bases. I have searched Stack Overflow, but have only found solutions like the following one:
class M_A(type): pass
class M_B(type): pass
class A(metaclass=M_A): pass
class B(metaclass=M_B): pass
class M_C(M_A, M_B): pass
class C(A, B, metaclass=M_C): pass
I've read the following posts:
Using ABC, PolymorphicModel, django-models gives metaclass conflict
Resolving metaclass conflicts
And I've tried many variations of those solutions, but I still get the dreaded metaclass exception. Help me Obi-Wan Kenobi, you're my only hope. :-)

I had the same need and found this. I've altered the code for clarity and completeness. Basically you need an extra class which you can use for all your model interfaces.
import abc
from django.db import models

class AbstractModelMeta(abc.ABCMeta, type(models.Model)):
    pass

class AbstractModel(models.Model, metaclass=AbstractModelMeta):
    # You may have common fields here.

    class Meta:
        abstract = True

    @abc.abstractmethod
    def must_implement(self):
        pass

class MyModel(AbstractModel):
    code = models.CharField("code", max_length=10, unique=True)

    class Meta:
        app_label = 'my_app'

test = MyModel(code='test')
> TypeError: Can't instantiate abstract class MyModel with abstract methods must_implement
Now you have the best of both worlds.
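Applied to the AlgorithmTemplate from the question, the pattern might look like this (my adaptation of the code above, untested):

import abc
from django.db import models

class AbstractModelMeta(abc.ABCMeta, type(models.Model)):
    pass

class AlgorithmTemplate(models.Model, metaclass=AbstractModelMeta):
    name = models.CharField(max_length=32)

    class Meta:
        abstract = True

    @abc.abstractmethod
    def data_subscriptions(self):
        """Return a list of topics this class will subscribe to using websockets."""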

I found a solution that worked for me, so I thought I would post it here in case it helps someone else. I decided not to inherit from the ABC class, and instead just raise an exception in the "abstract" methods (the ones the derived class must implement). I found helpful information in the Django docs describing the use of a Django data model as an abstract base class and also multi-table inheritance.
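A rough sketch of that approach (my own illustration, not tested code):

from django.db import models

class AlgorithmTemplate(models.Model):
    name = models.CharField(max_length=32)

    class Meta:
        abstract = True  # Django abstract base class: no table is created

    def data_subscriptions(self):
        # "Abstract" by convention only; derived classes must override this.
        raise NotImplementedError('Subclasses must implement data_subscriptions()')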
Django Data Model as an Abstract Base Class
Quoted from the docs:
Abstract base classes are useful when you want to put some common information into a number of other models. You write your base class and put abstract=True in the Meta class. This model will then not be used to create any database table. Instead, when it is used as a base class for other models, its fields will be added to those of the child class.
An example:
from django.db import models

class CommonInfo(models.Model):
    name = models.CharField(max_length=100)
    age = models.PositiveIntegerField()

    class Meta:
        abstract = True

class Student(CommonInfo):
    home_group = models.CharField(max_length=5)
The Student model will have three fields: name, age and home_group. The CommonInfo model cannot be used as a normal Django model, since it is an abstract base class. It does not generate a database table or have a manager, and cannot be instantiated or saved directly.
Fields inherited from abstract base classes can be overridden with another field or value, or be removed with None.
Multi-table Inheritance with a Django Data Model
My understanding of "multi-table inheritance" is that you can define a data model and then also use it as a base class for a second data model. The second data model inherits all the fields from the first model, plus its own fields.
Quoted from the docs:
The second type of model inheritance supported by Django is when each model in the hierarchy is a model all by itself. Each model corresponds to its own database table and can be queried and created individually. The inheritance relationship introduces links between the child model and each of its parents (via an automatically-created OneToOneField). For example:
from django.db import models

class Place(models.Model):
    name = models.CharField(max_length=50)
    address = models.CharField(max_length=80)

class Restaurant(Place):
    serves_hot_dogs = models.BooleanField(default=False)
    serves_pizza = models.BooleanField(default=False)
All of the fields of Place will also be available in Restaurant, although the data will reside in a different database table. So these are both possible:
>>> Place.objects.filter(name="Bob's Cafe")
>>> Restaurant.objects.filter(name="Bob's Cafe")

Related

metaclass conflict when making an abstract base class with a metaclass

I'm writing a library which permits expansion by registering custom render elements.
I had the idea of using a metaclass on AbstractElement to automatically register all non-abstract classes subsequently derived from it. This is what I came up with:
import abc
import inspect
class AbstractElementMeta(type):
    def __new__(mcs, name, bases, attrs):
        cls = super(AbstractElementMeta, mcs).__new__(mcs, name, bases, attrs)
        if not inspect.isabstract(cls):
            register_element(cls)  # Defined elsewhere in the module
        return cls

class AbstractElement(abc.ABC, metaclass=AbstractElementMeta):
    pass
But this results in the following error:
TypeError: metaclass conflict: the metaclass of a derived class must be a
(non-strict) subclass of the metaclasses of all its bases
I'm still sort of wrapping my head around the internal class structure and mechanisms of Python. Is what I want to do possible, and if so, how can I go about it? I want to avoid making my users use the metaclass themselves (I'd just make a decorator if I had to).
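(For what it's worth, the combined-metaclass pattern from the Django answer above applies here as well. A minimal sketch, untested: deriving the custom metaclass from abc.ABCMeta makes it a subclass of the metaclass of abc.ABC, so the conflict disappears.)

import abc
import inspect

class AbstractElementMeta(abc.ABCMeta):
    def __new__(mcs, name, bases, attrs):
        cls = super().__new__(mcs, name, bases, attrs)
        if not inspect.isabstract(cls):
            register_element(cls)  # Defined elsewhere in the module
        return cls

class AbstractElement(abc.ABC, metaclass=AbstractElementMeta):
    pass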

Python Pydantic double base model

I'm using FastAPI with Pydantic, and I'm trying to get my API to accept camel case parameters. For this, I'm using the following:
from pydantic import BaseModel
from humps import camelize

class CamelModel(BaseModel):
    class Config:
        alias_generator = camelize
        allow_population_by_field_name = True

class MyClass(CamelModel):
    my_field1: int
    my_field2: int
    my_field3: int
So far it works great, but MyClass is a base class for other classes, for example:
class MyNewClass(MyClass):
    my_field4: float
How can I get the MyNewClass to also use the camel case base class? I've tried something like
from typing import Union

class MyNewClass(Union[MyClass, CamelModel]):
    my_field4: float
But I'm getting this error
TypeError: Cannot subclass <class 'typing._SpecialForm'>
Is there any way to accomplish this?
Thanks!
What you are trying to achieve is called multiple inheritance, but since you are inheriting from a class which already inherits from CamelModel, there's no need to inherit from it again.
The appropriate code should be
class MyNewClass(MyClass):
This is the Python syntax for inheritance; the same syntax covers multiple inheritance when you list several base classes. See an extensive example here: https://www.python-course.eu/python3_multiple_inheritance_example.php#An-Example-of-Multiple-Inheritance
The code you are using (class MyNewClass(Union[MyClass, CamelModel]):) is used for declaring data types, which is kind of on the right track, but not in the right place. Typing is almost only (as far as I've seen) used for parameters of functions.
NOTE
I did not test the piece of code above, but I'm pretty sure it works. Let me know if there are any problems
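For what it's worth, here's a quick way to sanity-check that the camel case aliases are inherited (my own sketch, also untested, assuming Pydantic v1 and the field names from the question):

obj = MyNewClass(myField1=1, myField2=2, myField3=3, myField4=4.0)
print(obj.dict(by_alias=True))
# Expected: {'myField1': 1, 'myField2': 2, 'myField3': 3, 'myField4': 4.0}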

Best way to register all subclasses

I am currently developing a piece of software where I have class instances that are generated from dictionaries. These dictionaries are structured as follows:
layer_dict = {
    "layer_type": "Conv2D",
    "name": "conv1",
    "kernel_size": 3,
    ...
}
Then, the following code is run:
def create_layer(layer_dict):
    LayerType = getattr(layers, layer_dict['layer_type'])
    del layer_dict['layer_type']
    return LayerType(**layer_dict)
Now, I want to support the creation of new layer types (by subclassing the BaseLayer class). I've thought of a few ways to do this and thought I'd ask which way is best and why, as I don't have much experience developing software (finishing an MSc in comp bio).
Method 1: Metaclasses
The first method I thought of was to have a metaclass that registers every subclass of BaseLayer in a dict and do a simple lookup of this dict instead of using getattr.
class MetaLayer(type):
    layers = {}

    def __init__(cls, name, bases, dct):
        if name in MetaLayer.layers:
            raise ValueError('Cannot have more than one layer with the same name')
        MetaLayer.layers[name] = cls
Benefit: The metaclass can make sure that no two classes have the same name. The user doesn't need to think about anything but subclassing when creating new layers.
Downside: Metaclasses are difficult to understand and often frowned upon
Method 2: Traversing the __subclasses__ tree
The second method I thought of was to use the __subclasses__ method of BaseLayer to get a list of all subclasses, then create a dict with Layer.__name__ as keys and Layer as values. See example code below:
def get_subclasses(cls):
    """Returns all classes that inherit from `cls`"""
    subclasses = {
        sub.__name__: sub for sub in cls.__subclasses__()
    }
    subsubclasses = (
        get_subclasses(sub) for sub in subclasses.values()
    )
    subsubclasses = {
        name: sub for subs in subsubclasses for name, sub in subs.items()
    }
    return {**subclasses, **subsubclasses}
Benefit: Easy to explain how this works.
Downside: We might end up with two layers having the same name.
Method 3: Using a class decorator
The final method is my favourite as it doesn't hide any implementation details in a metaclass, and still manages to prevent multiple classes with the same name.
Here the layers module has a global variable named layers and a decorator named register_layer, which simply adds the decorated classes to the layers dict. See code below.
layers = {}

def register_layer(cls):
    if cls.__name__ in layers:
        raise ValueError('Cannot have two layers with the same name')
    layers[cls.__name__] = cls
    return cls
Benefit: No metaclasses and no way of having two layers with the same name.
Downside: Requires a global variable, which is often frowned upon.
So, my question is, which method is preferable? And more importantly, why?
Actually, that is the kind of thing metaclasses are designed for. As you can see from the options you stated above, it is the simplest and most straightforward design.
They are sometimes "frowned upon" because of a few things: (1) people don't understand them and don't care to understand; (2) people misuse them when they are actually not needed; (3) they are hard to combine - so if any of your classes is to be used with a mixin that has a different metaclass (say abc.ABC), you also have to produce a combining metaclass.
Method 4: __init_subclass__
Now, that said, as of Python 3.6 there is a new feature that can cover your use case without the need for metaclasses: the __init_subclass__ class method:
it is called as a classmethod on the base class when subclasses of it are created.
All you need is to write a proper __init_subclass__ method on your BaseLayer class, and you get all the benefits you'd have from the metaclass implementation with none of the downsides.
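For illustration, a minimal sketch of that (my own example, reusing BaseLayer and the Conv2D layer type from the question):

class BaseLayer:
    layers = {}

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        if cls.__name__ in BaseLayer.layers:
            raise ValueError('Cannot have two layers with the same name')
        BaseLayer.layers[cls.__name__] = cls

class Conv2D(BaseLayer):
    def __init__(self, name, kernel_size):
        self.name = name
        self.kernel_size = kernel_size

create_layer can then look up classes in BaseLayer.layers instead of calling getattr on the module.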
Like you, I like the class decorator approach as it is more readable.
You can avoid using a global variable by making the class decorator itself a class, and making layers a class variable instead. You can also avoid possible name collisions by joining the target class's name with its module name:
class register_layer:
    layers = {}

    def __new__(cls, target):
        cls.layers['.'.join((target.__module__, target.__name__))] = target
        return target
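Usage is the same as with the function-based decorator (my example; the registry key depends on the module where the class is defined):

@register_layer
class Conv2D(BaseLayer):
    ...

# register_layer.layers now maps e.g. 'layers.Conv2D' to the Conv2D class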

Getting all registered subclasses of an ABCMeta

I have a directory structure similar to the following:
.
├── main.py
├── model.py
└── models
├── __init__.py
├── model_a.py
└── model_b.py
model.py contains an Abstract Base Class:
from abc import ABCMeta, abstractmethod

class Base(metaclass=ABCMeta):
    @abstractmethod
    def run(self):
        pass
In the models folder are two implementations of this base class, model_a.py and model_b.py, which register themselves with the Base class. model_a.py looks like this:
from model import Base

class ModelA(Base):
    def run(self):
        return "a"

ModelA.register(Base)
assert issubclass(ModelA, Base)
and model_b.py is similar.
Now, what I am trying to do in main.py is to create a dictionary of all the subclasses of Base so that I can select one (via the GUI of my program) and run it:
from model import Base

subclasses = Base.__subclasses__()
dct = {cls.__name__: cls for cls in subclasses}
klass = dct['ModelA']
klass.run()
But I can't get it to work. I get RuntimeError: Refusing to create an inheritance cycle when I try to execute one of the derived classes, and the dictionary in main.py is empty.
I realise this is rather late, but in case it's helpful to anyone else who stumbles upon this...
You've got a few problems here:
Your classes are the wrong way round in that register call; it would only make sense in this context to call Base.register(ModelA) (not the other way round) in order to register ModelA as a "virtual subclass" of Base.
Calling ModelA.register(Base) is trying to register Base as a virtual subclass of ModelA, but ModelA is already an actual subclass of Base, which is why you're getting an inheritance cycle. You can't have classes X and Y inheriting from each other.
However, as ModelA is explicitly a subclass of Base, you don't need to call register at all. You want either:
class ModelA(Base):
    ...
with no register call (here ModelA is an actual subclass of Base), or:
class ModelA:
    ...

Base.register(ModelA)
(here ModelA is a standalone class, outside Base's inheritance hierarchy, but it is registered as a virtual subclass). Either/or - not both.
In either case, issubclass(ModelA, Base) would be True.
__subclasses__() doesn't pick up virtual subclasses, only actual ones - so if you want to use that, you should forget about register() and just make ModelA a real subclass of Base (the first option above).
(This is, to my mind, a wart with the whole ABC/register mechanism: issubclass() might be True but __subclasses__() doesn't pick it up - nasty.)
If you don't import the model containing ModelA at some point in your execution, it's never set up so ModelA won't show up in Base.__subclassess__() anyway. This is probably why the dictionary in main.py is empty.
A fix would be to add a line to main.py saying import models, and have models/__init__.py import model_a and model_b. Then when main runs, it imports models, which in turn imports model_a and model_b, executing the definitions of ModelA and ModelB and adding them to Base's class hierarchy.
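For example (a sketch of the import chain just described):

# models/__init__.py
from . import model_a
from . import model_b

# main.py
import models  # triggers the imports above, registering ModelA and ModelB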
On your final line, you don't instantiate an instance of whatever class klass is pointing at; the line should be:
klass().run()

Saving Object State with Pickle (objects containing objects)

I'm trying to figure out how to serialize an object with Pickle to a save file. My example is an object called World and this object has a list (named objects) of potentially hundreds of instantiated objects of different class types.
The problem is that Pickle won't let me serialize the items within the World.objects list because they aren't instantiated as attributes of World.
When I attempt to serialize with:
with open('gsave.pkl', 'wb') as output:
    pickle.dump(world.objects, output, pickle.DEFAULT_PROTOCOL)
I get the following error:
_pickle.PicklingError: Can't pickle <class 'world.LargeHealthPotion'>:
attribute lookup LargeHealthPotion on world failed
So, my question is: what is an alternative way of storing the world.objects list items so that they are attributes of world rather than list items that don't get saved?
UPDATE
I think my issue isn't where the objects are stored, but rather that classes like LargeHealthPotion (and many others) are dynamically created within the World class by operations such as this:
def __constructor__(self, n, cl, d, c, h, l):
    # initialize super of class type
    super(self.__class__, self).__init__(name=n, classtype=cl, description=d, cost=c,
                                         hp=h, level=l)

# create the object class dynamically, utilizing __constructor__ for __init__ method
item = type(item_name,
            (eval("{}.{}".format(name, row[1].value)),),
            {'__init__': __constructor__})

# add new object to the global _objects object to be used throughout the world
self._objects[item_name] = item(obj_name, obj_classtype, obj_description, obj_cost,
                                obj_hp, obj_level)
When this finishes, I will have a new object like <world.LargeHealthPotion object at 0x103690ac8>. I do this dynamically because I don't want to explicitly create hundreds of different classes for each type of object in my world. Instead, I create the class dynamically while iterating over the item names (with their stats) that I want to create.
This introduces a problem though, because when pickling, it can't find the static reference to the class in order to deconstruct, or reconstruct the object...so it fails.
What else can I do? (Besides creating literal class references for each, and every, type of object I'm going to instantiate into my world.)
Pickle does not pickle classes; it instead relies on references to classes, which doesn't work if the class was dynamically generated. (This answer has the appropriate excerpt and bolding from the documentation.)
So when your object is from a class called world.LargeHealthPotion, pickle checks that that name actually resolves to the class it will be able to use when unpickling. If it doesn't, you won't be able to reinitialize the object, since pickle doesn't know how to reference the class. There are a few ways of getting around this:
Define __reduce__ to reconstruct object
I'm not sure how to demo this method for you; I'd need much more information about your setup to suggest how to implement it, but I can describe it:
First you'd make a function or classmethod that could recreate one object based on its arguments (probably taking the class name, instance variables, etc.). Then define __reduce__ on the object base class so that it returns that function along with the arguments needed to pass to it when unpickling.
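The general shape is something like this (every name here is hypothetical, since the details depend on your setup):

def _recreate(class_name, state):
    # Hypothetical module-level factory: look up or rebuild the dynamic
    # class by name, then restore the instance's attributes.
    obj = make_game_object(class_name)  # stand-in for your own factory
    obj.__dict__.update(state)
    return obj

class GameObjectBase:
    def __reduce__(self):
        # Pickle stores a picklable callable plus the arguments needed to
        # rebuild this object; _recreate is called at unpickling time.
        return (_recreate, (type(self).__name__, self.__dict__))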
Put the dynamic classes in the global scope
This is the quick and dirty solution. Assuming the class names do not conflict with other things defined in the world module, you could theoretically insert the classes into the global scope by doing globals()[item_name] = item_type, but I do not recommend this as a long-term solution since it is very bad practice.
Don't use dynamic classes
This is definitely the way to go in my opinion. Instead of using the type constructor, just define your own class, named something like ObjectType, that:
Is not a subclass of type, so the instances would be pickle-able.
When an instance of it is called, constructs a new game object that has a reference to the object type.
So assuming you have a class called GameObject that takes cls=<ObjectType object>, you could set up the ObjectType class something like this:
class ObjectType:
    def __init__(self, name, description):
        self.item_name = name
        self.base_item_description = description
        # other qualities common to all objects of this type

    def __call__(self, cost, level, hp):
        # other qualities that are specific to each item
        return GameObject(cls=self, cost=cost, level=level, hp=hp)
Here I am using the __call__ magic method so it uses the same notation as classes, cls(params), to create instances. The cls=self indicates to the (abstracted) GameObject constructor that the class (type) of the GameObject is based on the ObjectType instance self. It doesn't have to be a keyword argument, but I'm not sure how else to make coherent example code without knowing more about your program.
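So creating and using an item type might look like this (my illustration, reusing names from the question):

# One ObjectType instance per kind of item, instead of one dynamic class each
large_health_potion = ObjectType('LargeHealthPotion', 'Restores a large amount of HP')

# Calling the instance builds a GameObject, which pickles normally because
# GameObject is an ordinary, statically defined class
potion = large_health_potion(cost=100, level=3, hp=50)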
