Delegation pattern in Python using annotation - python-3.x

I have a class that should mimic the behaviour of another class, but with some extra flavour. Kotlin, for example, has the delegation pattern (https://www.baeldung.com/kotlin/delegation-pattern) built into the language for exactly this. But in Python, when I try the code below:
from dataclasses import dataclass
from typing import Generic, T


@dataclass
class Wrapper(T, Generic[T]):
    __value__: T

    def __getattr__(self, item):
        return getattr(self.__value__, item)

    # also need to delegate all magic methods
    def __len__(self):
        return self.__value__.__len__()

    def try_some_funny_things(self):
        setattr(__builtins__, "True", False)


funny_string = Wrapper[str]()
funny_string.  # I want type hints as for the str class here
I get the following error:
TypeError: metaclass conflict: the metaclass of a derived class must be a (non-strict) subclass of the metaclasses of all its bases
My end goals:
Make PyCharm/Pylint (or another type checker) show field/method hints for the wrapped type
Avoid delegating every magic method to the value field by hand
Any suggestions for how I can do something like this in Python 3?
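One possible direction, sketched below under the assumption that dropping the inheritance from T is acceptable: the metaclass conflict comes from subclassing the type variable itself, so the wrapper can instead be a plain Generic[T] that forwards attribute access at runtime and exposes the wrapped object through a typed accessor for static hints. The Delegate and unwrap names are made up for illustration:
from dataclasses import dataclass
from typing import Generic, TypeVar

T = TypeVar("T")


@dataclass
class Delegate(Generic[T]):
    # the wrapped object that all attribute lookups are forwarded to
    _value: T

    def __getattr__(self, item):
        # called only for attributes not found on Delegate itself
        return getattr(self._value, item)

    def unwrap(self) -> T:
        # expose the wrapped object with its real static type
        return self._value


funny_string = Delegate("hello")
print(funny_string.upper())             # works at runtime via __getattr__
print(funny_string.unwrap().upper())    # type checkers see full str hints here
This does not forward magic methods automatically (Python looks those up on the type, not the instance), so they still have to be declared on the wrapper, as in the original code.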

Related

Conflict between mix-ins for abstract dataclasses

1. A problem with dataclass mix-ins, solved
To make abstract dataclasses that type-check under mypy, I've been breaking them into two classes, one that contains the abstract methods and one that contains the data members, as explained in this answer. The abstract class inherits from the dataclass. This runs into a problem, though, when another abstract-class-and-dataclass pair inherits from the first one: the "ancestor" dataclass's fields get wiped out by the "descendant". For example:
from dataclasses import dataclass
from abc import ABC, abstractmethod


@dataclass
class ADataclassMixin:
    a_field: int = 1


class A(ADataclassMixin, ABC):
    @abstractmethod
    def method(self):
        pass


@dataclass
# class BDataclassMixin(A):  # works but fails mypy 0.931 type-check
class BDataclassMixin:  # fails
    b_field: int = 2
    pass


class B(BDataclassMixin, A):
    def method(self):
        return self


o = B(a_field=5)
The last line fails, yielding this error message:
TypeError: BDataclassMixin.__init__() got an unexpected keyword argument 'a_field'
B's method-resolution order (B.__mro__) is (B, BDataclassMixin, A, ADataclassMixin, ABC, object), as expected. But a_field is not found.
A solution, shown in the commented-out line above, is to put the ancestor class explicitly in the descendant dataclass's declaration: class BDataclassMixin(A) instead of class BDataclassMixin. This fails type-checking, though, because a dataclass can only be a concrete class.
2. A problem with that solution, unsolved
The above solution breaks down if we add a third class, inheriting from B:
@dataclass
# class CDataclassMixin:  # fails
class CDataclassMixin(A):  # fails
# class CDataclassMixin(B, A):  # works but fails type-check
    c_field: int = 3
    pass


class C(CDataclassMixin, B):
    def method(self):
        return "C's result"
    pass


o = C(b_field=5)
Now, C has a_field and c_field but has lost b_field.
I have found that if I declare CDataclassMixin explicitly to inherit from B and A (in that order), b_field will be in the resulting class along with a_field and c_field. However, explicitly stating the inheritance hierarchy in every mix-in defeats the purpose of mix-ins, which is to be able to code them independently of all the other mix-ins and to mix them easily, any way you like.
What is the correct way to make abstract dataclass mix-ins, so that classes that inherit from them include all the dataclass fields?
The correct solution is to abandon the DataclassMixin classes and simply make the abstract classes into dataclasses, like this:
@dataclass  # type: ignore[misc]
class A(ABC):
    a_field: int = 1

    @abstractmethod
    def method(self):
        pass


@dataclass  # type: ignore[misc]
class B(A):
    b_field: int = 2


@dataclass
class C(B):
    c_field: int = 3

    def method(self):
        return self
The reason for the failures is that, as explained in the documentation on dataclasses, the complete set of fields in a dataclass is determined when the dataclass is compiled, not when it is inherited from. The internal code that generates the dataclass's __init__ function can only examine the MRO of the dataclass as it is declared on its own, not when mixed in to another class.
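A quick way to see this, assuming the BDataclassMixin and B definitions from section 1, is to inspect the field table each class actually carries:
import dataclasses

# BDataclassMixin's field table was frozen when it was decorated,
# before it was ever combined with ADataclassMixin:
print([f.name for f in dataclasses.fields(BDataclassMixin)])  # ['b_field']

# B is not itself a dataclass; it inherits BDataclassMixin's table,
# so a_field is invisible to the generated __init__:
print([f.name for f in dataclasses.fields(B)])  # ['b_field']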
It's necessary to add # type: ignore[misc] to each abstract dataclass's @dataclass line, not because the solution is wrong but because mypy is wrong. It is mypy, not Python, that requires dataclasses to be concrete. As explained by ilevkivskyi in mypy issue 5374, the problem is that mypy wants a dataclass to be a Type object and for every Type object to be capable of being instantiated. This is a known problem and awaits a resolution.
The behavior in the question and in the solution is exactly how dataclasses should behave. And, happily, abstract dataclasses that inherit this way (the ordinary way) can be mixed into other classes as freely as any other mix-ins.
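As a quick sanity check (assuming the A/B/C definitions just above), all inherited fields survive and can be passed to the concrete subclass:
o = C(a_field=5, b_field=6, c_field=7)
print((o.a_field, o.b_field, o.c_field))  # (5, 6, 7)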
Putting the mixin as the last base class works without error:
@dataclass
class ADataclassMixin:
    a_field: int = 1


class A(ABC, ADataclassMixin):
    @abstractmethod
    def method(self):
        pass


@dataclass
class BDataclassMixin:
    b_field: int = 2


class B(A, BDataclassMixin):
    def method(self):
        return self


o = B(a_field=5)
print((o.a_field, o.b_field))  # (5, 2)

Python pro way to make an abstract class allowing each child class to define its own attributes, Python3

I have to model several cases, each realised by its own class. I want to make sure that each class has two methods, get_input() and run(). So my idea is to write a CaseBase class where these two methods are decorated with @abstractmethod; any child class then has to implement them, which is exactly my goal.
However, due to the nature of my work, each case covers a distinct subject, and it is not easy to define a fixed group of attributes. The attributes should be defined in the __init__ method of a class. That means I don't know exactly which attributes to write in the CaseBase class. All I know is that all child cases must have some common attributes, like self._common_1 and self._common_2.
Therefore, my idea is to also decorate the __init__ method of CaseBase with @abstractmethod. See my code below.
from abc import ABC, abstractmethod
from typing import Dict, List


class CaseBase(ABC):
    @abstractmethod
    def __init__(self):
        self._common_1: Dict[str, float] = {}
        self._common_2: List[float] = []
        ...

    @abstractmethod
    def get_input(self, input_data: dict):
        ...

    @abstractmethod
    def run(self):
        ...


class CaseA(CaseBase):
    def __init__(self):
        self._common_1: Dict[str, float] = {}
        self._common_2: List[float] = []
        self._a1: int = 0
        self._a2: str = ''

    def get_input(self, input_data: dict):
        self._common_1 = input_data['common_1']
        self._common_2 = input_data['common_2']
        self._a1 = input_data['a1']
        self._a2 = input_data['a2']

    def run(self):
        print(self._common_1)
        print(self._common_2)
        print(self._a1)
        print(self._a2)


def main():
    case_a = CaseA()
    case_a.get_input(input_data={'common_1': {'c1': 1.1}, 'common_2': [1.1, 2.2], 'a1': 2, 'a2': 'good'})
    case_a.run()


if __name__ == '__main__':
    main()
My question: is my way good Python style?
I have followed many Python tutorials about how to make an abstract class and child classes. They all give examples where a fixed group of attributes is defined in the __init__ method of the base class. I have also seen approaches that call super().__init__() in the child class to change the attributes defined in the base class or to add new ones. But I am not sure whether that is better (more pro) than my way.
Thanks.
You mostly used the abc module in Python 3.10 correctly, but it doesn't make sense to decorate the constructor with @abstractmethod; it's unnecessary. Each class, derived or not, can and will have its own constructor. You can call super().__init__(args) within the child class to call the constructor of its immediate parent if you don't want to duplicate its code but do want to do further initialization in the child class constructor.
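A minimal sketch of that suggestion, reusing the CaseBase/CaseA layout from the question: the base constructor is an ordinary (non-abstract) __init__ that sets the common attributes, and each child calls super().__init__() before adding its own:
from abc import ABC, abstractmethod
from typing import Dict, List


class CaseBase(ABC):
    def __init__(self):
        # common attributes shared by every case
        self._common_1: Dict[str, float] = {}
        self._common_2: List[float] = []

    @abstractmethod
    def get_input(self, input_data: dict):
        ...

    @abstractmethod
    def run(self):
        ...


class CaseA(CaseBase):
    def __init__(self):
        super().__init__()   # initialise the common attributes once, in one place
        self._a1: int = 0    # case-specific attributes
        self._a2: str = ''
CaseBase still cannot be instantiated directly because get_input() and run() remain abstract, so nothing is lost by making __init__ concrete.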

MyPy not considering dataclass attribute mechanics

I am developing a Python 3.8 project that uses typing and dataclasses, and the automatic tests include mypy. This brings me to a strange behavior that I do not really understand...
In short: mypy does not seem to understand the dataclass attribute mechanics that, to my understanding, turn these annotations into instance attributes.
Here's a minimal example, with a package and two modules:
__init__.py: empty
app_events.py:
class AppEvent:
    pass
main.py:
import dataclasses
import typing

from . import app_events


class B:
    """Class with *app_events* as instance attribute."""

    def __init__(self):
        self.app_events: typing.List[app_events.AppEvent] = []

    def bar(self) -> app_events.AppEvent:
        # no mypy complaint here: the import is correctly distinguished
        # from the attribute
        ...


class C:
    """Class with *app_events* as class attribute."""

    app_events: typing.List[app_events.AppEvent]

    def chew(self) -> app_events.AppEvent:
        # mypy considers app_events to be the class attribute
        ...


@dataclasses.dataclass
class D:
    app_events: typing.List[app_events.AppEvent] = \
        dataclasses.field(default_factory=list)

    def doo(self) -> app_events.AppEvent:
        # same here: mypy considers app_events to be the class attribute
        ...
And the type-check results:
PyCharm complains, for methods C.chew and D.doo: Unresolved attribute reference 'AppEvent' for class 'list'
mypy complains, also for methods C.chew and D.doo: error: Name 'app_events.AppEvent' is not defined
There is no issue for B.bar as written, though if the app_events attribute is declared as a class attribute (instead of being defined in __init__), then mypy raises the same complaint.
Any idea how to understand/solve/circumvent this elegantly?
I'd really like not to rename my module and attributes, but if you have nice names in mind, please do not hesitate to propose :-)
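One way to read both complaints: inside C and D, the class-level annotation app_events: ... rebinds the name app_events in the class body, so in the later method annotations app_events.AppEvent is resolved against the attribute's type (a list) rather than against the module. A sketch of one possible workaround, assuming an import alias is acceptable (the alias name events is made up here; the module and attribute names themselves stay as they are):
import dataclasses
import typing

from . import app_events as events  # hypothetical alias so the attribute no longer shadows the module


@dataclasses.dataclass
class D:
    app_events: typing.List[events.AppEvent] = \
        dataclasses.field(default_factory=list)

    def doo(self) -> events.AppEvent:
        # the annotation now unambiguously refers to the module alias
        ...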

__post_init__() for multiple inherited dataclasses

Trying to evaluate if dataclasses are suitable for an upcoming project, but right now I'm stuck with this code:
from dataclasses import dataclass


@dataclass
class MixinA:
    attrA: int

    def __post_init__(self):
        print('MixinA post_init')
        self.attrA = [self.attrA]


@dataclass
class MixinB:
    attrB: str

    def __post_init__(self):
        print('MixinB post_init')
        self.attrB = [self.attrB]


@dataclass
class MixinC:
    attrC: bool

    def __post_init__(self):
        print('MixinC post_init')
        self.attrC = [self.attrC]


@dataclass
class Inherited(MixinC, MixinB, MixinA):
    pass


obj = Inherited(4, 'Hello', False)
print(obj.attrA, obj.attrB, obj.attrC)
print(obj.__class__.mro())
It is a surprise to me that only the __post_init__() of the first base class is called, when I expected all three to be invoked:
MixinC post_init
4 Hello [False]
[<class '__main__.Inherited'>, <class '__main__.MixinC'>, <class '__main__.MixinB'>, <class '__main__.MixinA'>, <class 'object'>]
Besides, changing the inheritance doesn't do me any good. The following inheritance generates the exact same output as above:
class MixinA:
class MixinB(MixinA):
class MixinC(MixinB):
class Inherited(MixinC):
Did I write the test code in a wrong way, or is the current behavior an oversight or intentional?
The core issue for me is, I want to transform each attribute before generating the final dataclass instances. The actual inheritance is of larger scale, and doing it within each and every class would be very redundant.
If __post_init__() is a no-go, is there any alternative approach (such as InitVar or custom __init__())?
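What is happening is ordinary method resolution rather than a dataclass quirk: the single generated __init__ calls self.__post_init__() once, the MRO resolves that to MixinC's hook, and that hook never invokes the others. A common workaround is to make the hooks cooperative: each mixin calls super().__post_init__(), and a shared base class ends the chain. A minimal sketch of that idea (the PostInitBase name is made up here):
from dataclasses import dataclass


@dataclass
class PostInitBase:
    """Terminates the cooperative __post_init__ chain."""
    def __post_init__(self):
        pass


@dataclass
class MixinA(PostInitBase):
    attrA: int

    def __post_init__(self):
        super().__post_init__()
        self.attrA = [self.attrA]


@dataclass
class MixinB(PostInitBase):
    attrB: str

    def __post_init__(self):
        super().__post_init__()
        self.attrB = [self.attrB]


@dataclass
class MixinC(PostInitBase):
    attrC: bool

    def __post_init__(self):
        super().__post_init__()
        self.attrC = [self.attrC]


@dataclass
class Inherited(MixinC, MixinB, MixinA):
    pass


obj = Inherited(4, 'Hello', False)
print(obj.attrA, obj.attrB, obj.attrC)  # [4] ['Hello'] [False]
An alternative, if adding a base class is undesirable, is to guard each call with hasattr(super(), '__post_init__') instead.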

How do you annotate the type of an abstract class with mypy?

I'm writing a library where I need a method that takes a (potentially) abstract type, and returns an instance of a concrete subtype of that type:
# script.py
from typing import Type, TypeVar
from abc import ABC, abstractmethod


class AbstractClass(ABC):
    @abstractmethod
    def abstract_method(self):
        pass


T = TypeVar('T', bound=AbstractClass)


def f(c: Type[T]) -> T:
    # find concrete implementation of c based on
    # environment configuration
    ...


f(AbstractClass)  # doesn't type check
Running mypy script.py yields:
error: Only concrete class can be given where "Type[AbstractClass]" is expected
I don't understand this error message and am having a hard time finding any documentation for it. Is there any way to annotate the function so that mypy will type check this?
As a side note, PyCharm's type checker, which is what I use the most, type checks f with no errors.
It does appear that mypy is a bit biased against using an abstract base class this way, though as you demonstrate there are valid use cases.
You can work around this by making your factory function a class method on your abstract class. If stylistically you'd like to have a top-level function as a factory, then you can create an alias to the class method.
from typing import TYPE_CHECKING
from abc import ABC, abstractmethod


class AbstractClass(ABC):
    @abstractmethod
    def abstract_method(self):
        raise NotImplementedError

    @classmethod
    def make_concrete(cls) -> 'AbstractClass':
        """
        find concrete implementation based on environment configuration
        """
        return A()


class A(AbstractClass):
    def abstract_method(self):
        print("a")


# make alias
f = AbstractClass.make_concrete

x = f()
if TYPE_CHECKING:
    reveal_type(x)  # AbstractClass
Alternately, if you're willing to give up the runtime checking provided by abc.ABC, you can get something even closer to your original design:
from typing import TYPE_CHECKING
from abc import abstractmethod


class AbstractClass:  # do NOT inherit from abc.ABC
    @abstractmethod
    def abstract_method(self):
        raise NotImplementedError


class A(AbstractClass):
    def abstract_method(self):
        print("a")


class Bad(AbstractClass):
    pass


def f() -> AbstractClass:
    """
    find concrete implementation based on environment configuration
    """
    pass


b = Bad()  # mypy displays an error here: Cannot instantiate abstract class 'Bad' with abstract attribute 'abstract_method'

x = f()
if TYPE_CHECKING:
    reveal_type(x)  # AbstractClass
This works because mypy checks methods marked with @abstractmethod even if the class does not inherit from abc.ABC. But be warned that if you execute the program using Python, you will no longer get an error about instantiating the Bad class without implementing its abstract methods.
