metaclass conflict when making an abstract base class with a metaclass - python-3.x

I'm writing a library which permits expansion by registering custom render elements.
I had the idea of using a metaclass on AbstractElement to automatically register all non-abstract classes subsequently derived from it. This is what I came up with:
import abc
import inspect

class AbstractElementMeta(type):
    def __new__(mcs, name, bases, attrs):
        cls = super(AbstractElementMeta, mcs).__new__(mcs, name, bases, attrs)
        if not inspect.isabstract(cls):
            register_element(cls)  # Defined elsewhere in the module
        return cls

class AbstractElement(abc.ABC, metaclass=AbstractElementMeta):
    pass
But this results in the following error:
TypeError: metaclass conflict: the metaclass of a derived class must be a (non-strict) subclass of the metaclasses of all its bases
Still sort of wrapping my head around the internal class structure and mechanisms of python. Is what I want to do possible, and if so, how can I go about it? I want to avoid making my users use the metaclass themselves (I'd just make a decorator if I had to).
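For reference: the conflict happens because abc.ABC already carries the metaclass abc.ABCMeta, while AbstractElementMeta derives only from type, so Python cannot choose a single most-derived metaclass for AbstractElement. A minimal sketch of the usual fix is to derive the registering metaclass from abc.ABCMeta itself (the same pattern used in the Django answer further down; _registry, render and Div are illustrative stand-ins):

import abc
import inspect

_registry = []  # stand-in for the module's register_element machinery

def register_element(cls):
    _registry.append(cls)

class AbstractElementMeta(abc.ABCMeta):
    def __new__(mcs, name, bases, attrs):
        cls = super().__new__(mcs, name, bases, attrs)
        if not inspect.isabstract(cls):
            register_element(cls)
        return cls

class AbstractElement(abc.ABC, metaclass=AbstractElementMeta):
    @abc.abstractmethod
    def render(self):
        ...

class Div(AbstractElement):  # concrete, so it gets registered automatically
    def render(self):
        return "<div/>"

print(_registry)  # [<class '__main__.Div'>]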

Related

How to make a singleton that inherits a normal class, with predefined values, and comparable by `is` without the need of rounded brackets?

My attempt was to create the default instance from inside a metaclass, but to no avail. At least the reported class is the singleton in the example below.
EDIT: Clarifying requirements here: a singleton comparable by using the is keyword, without having to instantiate/call it. Unfortunately, this well known question-answer here doesn't seem to address that.
class MyNormalClass:
    def __init__(self, values):
        self.values = values

class MySingleton(MyNormalClass, type):
    def __new__(mcs, *args, **kwargs):
        return MyNormalClass(["default"])

print(MySingleton)
# <class '__main__.MySingleton'>
print(MySingleton.values)
# AttributeError: type object 'MySingleton' has no attribute 'values'
Metaclasses for singletons are overkill. (Search my answers for that, and there should be about 10 occurrences of this phrase).
In your example code in the question, the code inside the class and the metaclass methods is not even being run, not once. There is no black magic in Python - the program just runs, and there are a few special methods marked with __xx__ that will be called intrinsically by the language runtime. In this case, the metaclass __new__ will be called whenever you create a new class using it as the metaclass, which your code does not show. And it would also create a new instance of your "singleton" class each time it was used.
In this case, if all you need is a single instance, just create that instance and make it public, instead of its class. You can even use the instance itself to shadow the class in the module namespace, so no one can instantiate it again by accident. If you want values to be immutable, well, you can't ensure that with pure Python code in any way, but you can make casual assignment to values with = fail, so that people will know they should not be changing it:
class MySingleton:
    __slots__ = ("values",)
    def __init__(self, values):
        self.values = values
        def lock(self, name, value):
            raise TypeError("Singleton can't change value")
        self.__class__.__setattr__ = lock

MySingleton = MySingleton(["values"])
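A quick check of the resulting behaviour once the snippet above has run:

print(MySingleton.values)  # ['values']
MySingleton.values = []    # TypeError: Singleton can't change value

(The list inside values can still be mutated, of course - as said above, true immutability can't be enforced in pure Python.)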

Best way to implement abstract classes in Python

What is the best way to implement abstract classes in Python?
This is the main approach I have seen:
from abc import ABC, abstractmethod

class A(ABC):
    @abstractmethod
    def foo(self):
        pass
However, it does not prevent you from calling the abstract method when you extend that class.
In Java you get an error if you try to do something similar, but not in Python:
class B(A):
    def foo(self):
        super().foo()

B().foo()  # does not raise an error
In order to replicate Java's behaviour, you could adopt this approach:

class A(ABC):
    @abstractmethod
    def foo(self):
        raise NotImplementedError
However, in practice I have rarely seen this latter solution, even though it is apparently the more correct one. Is there a specific reason to prefer the first approach over the second one?
If you really want an error to be raised when one of the subclasses calls the superclass's abstract method, then yes, you should raise it manually (and prefer raising an instance, raise NotImplementedError(), even though raising the class directly also works).
However, the existing behavior is actually convenient: if your abstractmethod contains just a pass, then you can have any number of sub-classes inheriting your base class, and as long as at least one implements the abstractmethod, it will work. Even if all of them call the super() equivalent method, without checking anything else.
If an error - NotImplementedError or any other - were raised in a complex hierarchy making use of mixins and such, you'd need to check, each time you called super, whether the error was raised, just to skip it. For the record, checking with a conditional whether super() would hit the class where the method is abstract is possible, this way:

if not getattr(super().foo, "__isabstractmethod__", False):
    super().foo(...)
Since what you want when you reach the base of the hierarchy for a method is for it to do nothing, it is far simpler if nothing simply happens!
I mean, check this:
import abc
from abc import abstractmethod

class A(abc.ABC):
    @abstractmethod
    def validate(self, **kwargs):
        pass

class B(A):
    def validate(self, *, first_arg_for_B, second_arg_for_B=None, **kwargs):
        super().validate(**kwargs)
        # perform validation:
        ...

class C(A):
    def validate(self, *, first_arg_for_C, **kwargs):
        super().validate(**kwargs)
        # perform validation:
        ...

class Final(B, C):
    ...
Neither B.validate nor C.validate need to worry about any other class in the hierarchy, just do their thing and pass on.
If A.validate would raise, both methods would have to do super().validate(...) inside a try: ...;except ...:pass statement, or inside a weird if block, for the gain of...nothing.
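To make the point concrete, a single call on the combined class (with the signatures sketched above) runs every validate in the MRO and ends quietly at A:

Final().validate(first_arg_for_B=1, first_arg_for_C=2)  # B and C each consume their arguments; A's empty body ends the chain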
update - I just found this note in the official documentation:
Note: Unlike Java abstract methods, these abstract methods may have an implementation. This implementation can be called via the super() mechanism from the class that overrides it. This could be useful as an end-point for a super-call in a framework that uses cooperative multiple-inheritance.
https://docs.python.org/3/library/abc.html#abc.abstractmethod
I will even return you a personal question, if you can reply in the comments: I understand this is much less relevant in Java, where one can't have multiple inheritance, so even in a big hierarchy the first subclass to implement the abstract method would usually be well known. But otherwise, in a Java project where one could pick one of various base concrete classes and proceed with others in an arbitrary order, since the abstractmethod raises, how is that resolved?

Inheriting from both ABC and django.db.models.Model raises metaclass exception

I am trying to implement a Django data model class, which is also an interface class, using Python 3. My reason for doing so is, I'm writing a base class for my colleague, and need him to implement three methods in all of the classes he derives from mine. I am trying to give him a simplified way to use the functionality of a system I've designed. But, he must override a few methods to supply the system with enough information to execute the code in his inherited classes.
I know this is wrong, because it's throwing exceptions, but I'd like to have a class like the following example:
from django.db import models
from abc import ABC, abstractmethod
class AlgorithmTemplate(ABC, models.Model):
    name = models.CharField(max_length=32)

    @abstractmethod
    def data_subscriptions(self):
        """
        This method returns a list of topics this class will subscribe to using websockets.
        NOTE: This method MUST be overridden!
        :rtype: list
        """
I understand I could avoid inheriting from the ABC class, but I'd like to use it for reasons I won't bore you with here.
The Problem
After including a class like the one above in my project and running python manage.py makemigrations, I get the error: TypeError: metaclass conflict: the metaclass of a derived class must be a (non-strict) subclass of the metaclasses of all its bases. I have searched Stack Overflow, but have only found solutions like the following one:
class M_A(type): pass
class M_B(type): pass
class A(metaclass=M_A): pass
class B(metaclass=M_B): pass
class M_C(M_A, M_B): pass
class C(A, B, metaclass=M_C): pass
I've read the following posts:
Using ABC, PolymorphicModel, django-models gives metaclass conflict
Resolving metaclass conflicts
And I've tried many variations of those solutions, but I still get the dreaded metaclass exception. Help me Obi-Wan Kenobi, you're my only hope. :-)
I had the same need and found this. I've altered the code for clarity and completeness. Basically you need an extra class which you can use for all your model interfaces.
import abc
from django.db import models

class AbstractModelMeta(abc.ABCMeta, type(models.Model)):
    pass

class AbstractModel(models.Model, metaclass=AbstractModelMeta):
    # You may have common fields here.

    class Meta:
        abstract = True

    @abc.abstractmethod
    def must_implement(self):
        pass

class MyModel(AbstractModel):
    code = models.CharField("code", max_length=10, unique=True)

    class Meta:
        app_label = 'my_app'

test = MyModel(code='test')
> TypeError: Can't instantiate abstract class MyModel with abstract methods must_implement
Now you have the best of both worlds.
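For completeness, a subclass that does implement the abstract method instantiates without complaint (a sketch under the same assumptions; GoodModel is a hypothetical name):

class GoodModel(AbstractModel):
    code = models.CharField("code", max_length=10, unique=True)

    class Meta:
        app_label = 'my_app'

    def must_implement(self):
        return "implemented"

test = GoodModel(code='test')  # no TypeError: the abstract method is implemented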
I found a solution that worked for me, so thought I would post it here in case it helps someone else. I decided to not inherit from the ABC class, and instead just raise an exception in the "abstract" methods (the ones the derived class must implement). I did find helpful information in the Django docs, describing using Django data models as an Abstract base class and also Multi-table inheritance.
Django Data Model as an Abstract Base Class
Quoted from the docs:
Abstract base classes are useful when you want to put some common information into a number of other models. You write your base class and put abstract=True in the Meta class. This model will then not be used to create any database table. Instead, when it is used as a base class for other models, its fields will be added to those of the child class.
An example:
from django.db import models

class CommonInfo(models.Model):
    name = models.CharField(max_length=100)
    age = models.PositiveIntegerField()

    class Meta:
        abstract = True

class Student(CommonInfo):
    home_group = models.CharField(max_length=5)
The Student model will have three fields: name, age and home_group. The CommonInfo model cannot be used as a normal Django model, since it is an abstract base class. It does not generate a database table or have a manager, and cannot be instantiated or saved directly. Fields inherited from abstract base classes can be overridden with another field or value, or be removed with None.
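For instance, the overriding and removal mentioned in the last sentence would look like this against the CommonInfo base above (Alumnus is a hypothetical model):

class Alumnus(CommonInfo):
    name = models.TextField()  # overrides the inherited CharField
    age = None                 # removes the inherited field entirely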
Multi-table Inheritance with a Django Data Model
My understanding of "multi-table inheritance" is, you can define a data model and then also use it as a base class for a second data model. The second data model will inherit all the fields from the 1st model, plus its own fields.
Quoted from the docs:
The second type of model inheritance supported by Django is when each
model in the hierarchy is a model all by itself. Each model
corresponds to its own database table and can be queried and created
individually. The inheritance relationship introduces links between
the child model and each of its parents (via an automatically-created
OneToOneField). For example:
from django.db import models

class Place(models.Model):
    name = models.CharField(max_length=50)
    address = models.CharField(max_length=80)

class Restaurant(Place):
    serves_hot_dogs = models.BooleanField(default=False)
    serves_pizza = models.BooleanField(default=False)
All of the fields of Place will also be available in Restaurant, although the data will reside in a different database table. So these are both possible:
>>> Place.objects.filter(name="Bob's Cafe")
>>> Restaurant.objects.filter(name="Bob's Cafe")

Saving Object State with Pickle (objects containing objects)

I'm trying to figure out how to serialize an object with Pickle to a save file. My example is an object called World and this object has a list (named objects) of potentially hundreds of instantiated objects of different class types.
The problem is that Pickle won't let me serialize the items within the World.objects list because they aren't instantiated as attributes of World.
When I attempt to serialize with:
with open('gsave.pkl', 'wb') as output:
    pickle.dump(world.objects, output, pickle.DEFAULT_PROTOCOL)
I get the following error:
_pickle.PicklingError: Can't pickle <class 'world.LargeHealthPotion'>: attribute lookup LargeHealthPotion on world failed
So, my question is: what is an alternative way of storing the world.objects list items so that they are attributes of world rather than list items that don't get saved?
UPDATE
I think my issue isn't where the objects are stored; but rather that the class LargeHealthPotion (and many others) are dynamically created within the World class by operations such as this:
def __constructor__(self, n, cl, d, c, h, l):
    # initialize super of class type
    super(self.__class__, self).__init__(name=n, classtype=cl, description=d, cost=c,
                                         hp=h, level=l)

# create the object class dynamically, utilizing __constructor__ for __init__ method
item = type(item_name,
            (eval("{}.{}".format(name, row[1].value)),),
            {'__init__': __constructor__})

# add new object to the global _objects object to be used throughout the world
self._objects[item_name] = item(obj_name, obj_classtype, obj_description, obj_cost,
                                obj_hp, obj_level)
When this finishes, I will have a new object like <world.LargeHealthPotion object at 0x103690ac8>. I do this dynamically because I don't want to have to explicitly create hundreds of different classes for each type of object in my world. Instead, I create each class dynamically while iterating over the item names (with their stats) that I want to create.
This introduces a problem though, because when pickling, it can't find the static reference to the class in order to deconstruct, or reconstruct the object...so it fails.
What else can I do? (Besides creating literal class references for each, and every, type of object I'm going to instantiate into my world.)
Pickle does not pickle classes; it instead relies on references to classes, which doesn't work if the class was dynamically generated. (This answer has the appropriate excerpt and bolding from the documentation.)
So pickle assumes that if your object is from the class called world.LargeHealthPotion, then it checks that that name actually resolves to the class it will be able to use when unpickling; if it doesn't, then you won't be able to reinitialize the object, since pickle doesn't know how to reference the class. There are a few ways of getting around this:
Define __reduce__ to reconstruct object
I'm not sure how to demo this method to you - I'd need much more information about your setup to suggest how to implement it - but I can describe it:
First you'd make a function or classmethod that could recreate one object based on its arguments (probably taking the class name, instance variables, etc.). Then define __reduce__ on the object base class so that it returns that function along with the arguments to pass to it when unpickling.
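A minimal sketch of that idea - _item_classes, make_game_object and GameObjectBase are hypothetical stand-ins for the question's setup:

import pickle

# hypothetical registry mapping item names to their dynamically created classes
_item_classes = {}

def make_game_object(class_name, state):
    # recreate an instance of a dynamically created class while unpickling
    cls = _item_classes[class_name]
    obj = cls.__new__(cls)
    obj.__dict__.update(state)
    return obj

class GameObjectBase:
    def __reduce__(self):
        # tell pickle: call make_game_object(class_name, state) on load,
        # instead of looking the class up as a module attribute
        return (make_game_object, (type(self).__name__, self.__dict__))

# a dynamically created class, as in the question
LargeHealthPotion = type("LargeHealthPotion", (GameObjectBase,), {})
_item_classes["LargeHealthPotion"] = LargeHealthPotion

potion = LargeHealthPotion()
potion.hp = 50
restored = pickle.loads(pickle.dumps(potion))
print(type(restored).__name__, restored.hp)  # LargeHealthPotion 50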
Put the dynamic classes in the global scope
This is the quick and dirty solution. Assuming the class names do not conflict with other things defined in the world module, you could theoretically insert the classes into the global scope by doing globals()[item_name] = item_type, but I do not recommend this as a long-term solution, since it is very bad practice.
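As a sketch of that quick-and-dirty route (BaseItem is a hypothetical static base class):

import pickle

class BaseItem:
    pass

item_name = "LargeHealthPotion"
item_type = type(item_name, (BaseItem,), {})
globals()[item_name] = item_type  # pickle can now resolve the class by its module-level name

print(pickle.loads(pickle.dumps(item_type())))  # <__main__.LargeHealthPotion object at 0x...>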
Don't use dynamic classes
This is definitely the way to go in my opinion, instead of using the type constructor, just define your own class named something like ObjectType that:
Is not a subclass of type so the instances would be pickle-able.
When an instance of it is called, it constructs a new game-object that holds a reference to the object type.
So assuming you have a class called GameObject that takes cls=<ObjectType object>, you could set up the ObjectType class something like this:
class ObjectType:
    def __init__(self, name, description):
        self.item_name = name
        self.base_item_description = description
        # other qualities common to all objects of this type

    def __call__(self, cost, level, hp):
        # other qualities that are specific to each item
        return GameObject(cls=self, cost=cost, level=level, hp=hp)
Here I am using the __call__ magic method so that it uses the same notation as classes, cls(params), to create instances; cls=self indicates to the (abstracted) GameObject constructor that the class (type) of the GameObject is based on the ObjectType instance self. It doesn't have to be a keyword argument, but I'm not sure how else to make a coherent example without knowing more about your program.
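Usage would then look something like this (with GameObject assumed, as above, to be a plain static class, so its instances pickle normally; the names are illustrative):

potion_type = ObjectType("LargeHealthPotion", "Restores a large amount of HP")
potion = potion_type(cost=100, level=2, hp=50)  # a GameObject carrying cls=potion_type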

How to define and access private and protected variables and functions in python3?

I am confused about accessing private variables in python3, because it creates a new variable if you access it the wrong way. I made a private variable _myVar (single underscore - right?) in a class and I want to access it in a child class.
class A_parent:
    _myVar = 0

class B_child(A_parent):
    _myVar = _myVar + 1  # Right or wrong?
What is the convention for private and protected modifiers in python3?
A single underscore is just a convention to discourage callers/users of the variable: your example will work (except that you probably want to define an instance variable, not a class variable).
To make a variable private you have to use a double underscore, BUT:

class A_parent:
    __myVar = 0

class B_child(A_parent):
    __myVar = __myVar + 1  # __myVar of A and of B are different: error, because the right-hand term is not defined yet.
You are talking about "kind of protected" variables. In that case, it is advised to use a single underscore, like you did.
Example for an instance variable, much more useful:
class A_parent:
    def __init__(self):
        self._myVar = 0

class B_child(A_parent):
    def __init__(self):
        # call the parent constructor, else _myVar isn't defined
        A_parent.__init__(self)
        # syntax using super:
        # super(B_child, self).__init__()
        self._myVar += 1  # Okay
If you REALLY want to access the private parent attribute, you can do this:
class A_parent:
    def __init__(self):
        self.__myVar = 12

class B_child(A_parent):
    def __init__(self):
        # call the parent constructor
        A_parent.__init__(self)
        print(self._A_parent__myVar)  # will print 12 when the class is instantiated
but of course it's ill-advised. Use it only when you can't refactor the parent class. I never had to do that.
Note that if you omit the call to the parent constructor, self._A_parent__myVar won't be defined. (Python is a dynamic language: you define members on the fly. In C++ or Java, not invoking the constructor would still define the parent's member variables, but not here.)
