Below is the complete source code of the class _ProtocolMeta, used as the metaclass for typing.Protocol in Python 3.9:
class _ProtocolMeta(ABCMeta):
    # This metaclass is really unfortunate and exists only because of
    # the lack of __instancehook__.
    def __instancecheck__(cls, instance):
        # We need this method for situations where attributes are
        # assigned in __init__.
        if ((not getattr(cls, '_is_protocol', False) or
                _is_callable_members_only(cls)) and
                issubclass(instance.__class__, cls)):
            return True
        if cls._is_protocol:
            if all(hasattr(instance, attr) and
                    # All *methods* can be blocked by setting them to None.
                    (not callable(getattr(cls, attr, None)) or
                     getattr(instance, attr) is not None)
                    for attr in _get_protocol_attrs(cls)):
                return True
        return super().__instancecheck__(instance)
As indicated by the first comment, the only purpose of the class is to provide an __instancecheck__ hook for isinstance(instance, cls) with cls being a subclass of typing.Protocol. Furthermore, in Protocol.__init_subclass__, a class subclassing Protocol gets the attribute _is_protocol set to True if any of its bases is Protocol itself - in other words, if the class is a protocol in the sense of PEP 544.
I understand that checking whether an object is an instance of a Protocol requires special treatment, since PEP 544 states:
A concrete type X is a subtype of protocol P if and only if X
implements all protocol members of P with compatible types. In other
words, subtyping with respect to a protocol is always structural.
But I cannot recognize this special treatment in the implementation given by the source code above. Moreover, I cannot figure out any reasonable rationale for the implementation above. Can somebody explain it, please?
In particular, I don't understand the condition getattr(instance, attr) is not None on line 1110:
Why is it necessary to check that a protocol attribute (already guaranteed to exist on the instance by hasattr(instance, attr)) is not None on the instance when that attribute is callable in the protocol? If the value is to be checked at all, I would expect a check for being callable, not merely for not being None.
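For concreteness, here is a small experiment of my own (not taken from typing.py) that seems to exercise exactly that condition:

from typing import Protocol, runtime_checkable

@runtime_checkable
class HasClose(Protocol):
    def close(self) -> None: ...

class Resource:
    def __init__(self):
        # The attribute exists on the instance, but the "method" is blocked with None.
        self.close = None

class Resource2:
    def __init__(self):
        self.close = lambda: None

print(isinstance(Resource(), HasClose))   # False - hasattr passes, only the `is not None` check fails
print(isinstance(Resource2(), HasClose))  # True - attribute exists and is not None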
I have been looking for a way to declare uninitialized instance variables in my class. I found that we can actually do that using type hinting, without assigning anything to them, which does not seem to create the attribute in any way. For example:
class T:
    def __init__(self):
        self.a: str

    def just_print(self):
        print(self.a)

    def assign(self):
        self.a = "test"
Now let's say I run this code:
t = T()
t.just_print()
It will raise an AttributeError saying 'T' object has no attribute 'a'. But when I run the following code, it prints test:
t = T()
t.assign()
t.just_print()
My question is: what happens behind the scenes when I just write a: str? It doesn't get added to the class's attributes, but it doesn't cause any problem either. So... is it just ignored? This is Python 3.8, by the way.
You're referring to type annotations, as defined by PEP 526:
my_var: int
Please note that type annotations differ from type hints, as defined by PEP 484:
def my_func(foo: str):
    ...
Type annotations have actual runtime effects. For example, the documentation states:
In addition, at the module or class level, if the item being annotated is a simple name, then it and the annotation will be stored in the __annotations__ attribute of that module or class [...]
So, by slightly modifying your example, we get this:
>>> class T:
...     a: str
...
>>> T.__annotations__
{'a': <class 'str'>}
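By contrast, the bare annotation inside __init__ from your original example does not create anything at runtime. A quick check (my own illustration, in the same REPL style) confirms this:

>>> class T:
...     def __init__(self):
...         self.a: str
...
>>> t = T()
>>> vars(t)
{}
>>> hasattr(t, 'a')
False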
I have got one question: why do I need to call super().__init__() in metaclasses? Since a metaclass is a factory of classes, I would think we don't need to call the parent's initialization in order to create objects of the class Shop. Or do we initialize the class itself by using super().__init__()? (My IDE says that I should call it, but even without super().__init__() nothing bad happens; my class works without errors.)
Can you explain why?
Thanks in advance!
from collections import OrderedDict

class Descriptor:
    _counter = 0

    def __init__(self):
        self.attr_name = f'Descriptor attr#{Descriptor._counter}'
        Descriptor._counter += 1

    def __get__(self, instance, owner):
        return self if instance is None else instance.__dict__[self.attr_name]

    def __set__(self, instance, value):
        if value > 0:
            instance.__dict__[self.attr_name] = value
        else:
            msg = 'Value must be > 0!'
            raise AttributeError(msg)

class Shop():
    weight = Descriptor()
    price = Descriptor()

    def __init__(self, name, price, weight):
        self.name = name
        self.price = price
        self.weight = weight

    def __repr__(self):
        return f'{self.name}: price - {self.price} weight - {self.weight}'

    def buy(self):
        return self.price * self.weight

class Meta(type):
    def __init__(cls, name, bases, attr_dict):
        super().__init__(name, bases, attr_dict)  # <- this is that func. call
        for key, value in attr_dict.items():
            if isinstance(value, Descriptor):  # Here I rename the descriptor object's attribute name.
                value.attr_name = key

    @classmethod
    def __prepare__(metacls, name, bases):
        return OrderedDict()
You don't "need" to - and if your code use no other custom metaclasses, not calling the metaclass'__init__.super() will work just the same.
But if one needs to combine your metaclass with another, through inheritance, without the super() call, it won't work "out of the box": the super() call is the way to ensure all methods in the inheritance chain are called.
And if at first it looks like that a metaclass is extremely rare, and combining metaclasses would likely never take place: a few libraries or frameworks have their own metaclasses, including Python's "abc"s (abstract base classes), PyQT, ORM frameworks, and so on. If any metaclass under your control is well behaved with proper super() calls on the __new__, __init__ and __call__ methods, (if you override those), what you need to do to combine both superclasses and have a working metaclass can be done in a single line:
CompatibleMeta = type("CompatibleMeta", (meta, type(OtherClassBase)), {})
This way, for example, if you want to use the mechanisms in your metaclass in a class that also uses the ABCMeta functionality in Python, you just do it. The __init__ method in your Meta will call the other metaclass's __init__. Otherwise that code would not run, some subtle, unexpected thing would not be initialized in your classes, and it could be a very hard-to-find bug.
On a side note: there is no need to declare __prepare__ in a metaclass if all it does is create an OrderedDict, on Python 3.6 or newer: since that version, the dictionaries used as the "locals()" while executing class bodies are ordered by default. Also, if another metaclass you are combining with also has a __prepare__, there is no way to make that work automatically by using super() - you have to check the code and verify which of the two __prepare__ methods should be used, or create a new mapping type with features that serve both metaclasses.
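For example, here is a minimal sketch (my own illustration, assuming the Meta and Descriptor classes from the question are in scope) of that one-liner combining Meta with ABCMeta:

from abc import ABCMeta, abstractmethod

# Combine the question's Meta with ABCMeta into one compatible metaclass:
CompatibleMeta = type("CompatibleMeta", (Meta, ABCMeta), {})

class AbstractShop(metaclass=CompatibleMeta):
    price = Descriptor()  # Meta.__init__ still renames the descriptor's storage key

    @abstractmethod
    def buy(self):
        ...

class RealShop(AbstractShop):
    def buy(self):
        return 42

shop = RealShop()   # works: the abstract method is implemented
# AbstractShop()    # raises TypeError: can't instantiate an abstract class

Here Meta's super().__init__ call ensures the rest of the chain (type.__init__ in this case) still runs; the ABC machinery itself lives in ABCMeta.__new__, which Meta does not override, so both behaviours coexist.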
Python's PEP 544 introduces typing.Protocol for structural subtyping, a.k.a. "static duck typing".
In this PEP's section on Merging and extending protocols, it is stated that
The general philosophy is that protocols are mostly like regular ABCs,
but a static type checker will handle them specially.
Thus, one would expect to inherit from a subclass of typing.Protocol in much the same way that one expects to inherit from a subclass of abc.ABC:
from abc import ABC
from typing import Protocol

class AbstractBase(ABC):
    def method(self):
        print("AbstractBase.method called")

class Concrete1(AbstractBase):
    ...

c1 = Concrete1()
c1.method()  # prints "AbstractBase.method called"

class ProtocolBase(Protocol):
    def method(self):
        print("ProtocolBase.method called")

class Concrete2(ProtocolBase):
    ...

c2 = Concrete2()
c2.method()  # prints "ProtocolBase.method called"
As expected, the concrete subclasses Concrete1 and Concrete2 inherit method from their respective superclasses. This behavior is documented in the Explicitly declaring implementation section of the PEP:
To explicitly declare that a certain class implements a given
protocol, it can be used as a regular base class. In this case a class
could use default implementations of protocol members.
...
Note that there is little difference between explicit and implicit
subtypes, the main benefit of explicit subclassing is to get some
protocol methods "for free".
However, when the protocol class implements the __init__ method, __init__ is not inherited by explicit subclasses of the protocol class. This is in contrast to subclasses of an ABC, which do inherit the __init__ method:
from abc import ABC
from typing import Protocol

class AbstractBase(ABC):
    def __init__(self):
        print("AbstractBase.__init__ called")

class Concrete1(AbstractBase):
    ...

c1 = Concrete1()  # prints "AbstractBase.__init__ called"

class ProtocolBase(Protocol):
    def __init__(self):
        print("ProtocolBase.__init__ called")

class Concrete2(ProtocolBase):
    ...

c2 = Concrete2()  # NOTHING GETS PRINTED
We see that Concrete1 inherits __init__ from AbstractBase, but Concrete2 does not inherit __init__ from ProtocolBase. This is in contrast to the previous example, where Concrete1 and Concrete2 both inherit method from their respective superclasses.
My questions are:
What is the rationale behind not having __init__ inherited by explicit subtypes of a protocol class? Is there some type-theoretic reason for protocol classes not being able to supply an __init__ method "for free"?
Is there any documentation concerning this discrepancy? Or is it a bug?
You can't instantiate a protocol class directly. This is currently implemented by replacing a protocol's __init__ with a method whose sole function is to enforce this restriction:
def _no_init(self, *args, **kwargs):
    if type(self)._is_protocol:
        raise TypeError('Protocols cannot be instantiated')

...

class Protocol(Generic, metaclass=_ProtocolMeta):
    ...
    def __init_subclass__(cls, *args, **kwargs):
        ...
        cls.__init__ = _no_init
Your __init__ doesn't execute because it isn't there any more.
This is pretty weird and messes with even more stuff than it looks like at first glance - for example, it interacts poorly with multiple inheritance, interrupting super().__init__ chains.
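Here is a minimal sketch of such an interrupted chain (the class names are my own, under the Python 3.9 behaviour described above):

from typing import Protocol

class Mixin:
    def __init__(self):
        print("Mixin.__init__ called")
        super().__init__()

class ProtocolBase(Protocol):
    def __init__(self):  # replaced by _no_init at class creation time
        print("ProtocolBase.__init__ called")
        super().__init__()

class Concrete(ProtocolBase, Mixin):
    ...

Concrete()  # prints nothing: _no_init returns without calling super().__init__(),
            # so Mixin.__init__ further down the MRO is never reached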
I am using unittest.mock.sentinel to provide dumb values to my test functions and then assert calls.
I'd like to be able to specify the type of the sentinel so that it passes type checking in the methods.
MWE:
import collections
from unittest.mock import sentinel
def fun(x):
    if not isinstance(x, collections.Iterable):
        raise TypeError('x should be iterable')

def test_fun_pass_if_x_is_instance_iterable():
    # this does not work and raises because sentinel is not iterable
    assert fun(sentinel.x) is None
EDIT
I have tried to do sentinel.x = collections.Iterable() but got the error:
TypeError: Can't instantiate abstract class Iterable with abstract methods __iter__
So far I can do sentinel.x = tuple() or sentinel.x = list(), for instance, but these are special cases of an iterable.
I think the problem here is that collections.Iterable is an abstract base class (ABC) and cannot be instantiated directly. That's what the error message says: the method __iter__ is abstract, without a body. You have to use a derived class or write one of your own.
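For example, a minimal stand-in (the name IterableSentinel is my own; any class defining __iter__ would do, and fun is assumed to be the function from your MWE):

import collections

class IterableSentinel:
    """Dumb stand-in value that is iterable but carries no data."""
    def __iter__(self):
        return iter(())

def test_fun_pass_if_x_is_instance_iterable():
    x = IterableSentinel()
    assert isinstance(x, collections.Iterable)  # True: __iter__ is defined
    assert fun(x) is None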
In Java, we can prevent instantiation of a class by making it an abstract class. I thought Python would behave the same way. But to my surprise, I found that I can create an object of an abstract class:
from abc import ABCMeta

class Foo(metaclass=ABCMeta):
    pass

Foo()
Why does Python allow this, and how can I prevent it?
Python is for "consenting adults" - you could mark a class as abstract by naming convention within a project if you want (or by module membership). However, it is feasible to make a hard "uninstantiable" abstract class - although, as the commenters on the question point out, that would not in itself improve the security or good practices of a project.
So, to keep the rest of the machinery for ABCs' abstract classes, you can inherit from ABCMeta and use it to wrap the __new__ method so the class won't instantiate - or, if you don't need the ABC machinery, do the same but inherit from type instead.
In other words, the code below wraps the __new__ method of classes created with it as a metaclass. When that method is run, it checks whether the class being instantiated is the class that was created with this metaclass itself; if it is, it raises a TypeError instead.
from abc import ABCMeta

class ReallyAbstract(ABCMeta):
    def __new__(metacls, name, bases, namespace):
        outter_cls = super().__new__(metacls, name, bases, namespace)
        for base in outter_cls.__mro__:
            if getattr(getattr(base, "__new__", None), "_abstract", False):
                # A base class is already marked as abstract. No need to do anything else.
                return outter_cls
        original_new = getattr(outter_cls, "__new__")
        if getattr(original_new, "_abstract", False):
            # If we get a method that has already been wrapped,
            # just return the class unchanged.
            # TODO: handle further classes in the hierarchy redefining __new__
            return outter_cls

        def __new__(cls, *args, **kw):
            if cls is outter_cls:
                raise TypeError
            return original_new(cls, *args, **kw)

        __new__._abstract = True
        outter_cls.__new__ = __new__
        return outter_cls
And on the console:
In [7]: class A(metaclass=ReallyAbstract):
   ...:     pass
   ...:
In [7]: A()
TypeError Traceback (most recent call last)
<ipython-input-7-...> in <module>()
----> 1 A()
....
Just for the sake of completeness - classes using ABCMeta in Python are not instantiable if they contain at least one abstractmethod. Just like other O.O. features that are enforced in more static languages, the idea is to have this by convention. But yes, I agree that since they went to the trouble of creating an abstract-class mechanism at all, it should probably behave with fewer surprises, and that would mean it should not be instantiable by default.
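A minimal illustration of that rule:

from abc import ABCMeta, abstractmethod

class Foo(metaclass=ABCMeta):
    @abstractmethod
    def bar(self):
        ...

Foo()  # raises TypeError: Can't instantiate abstract class Foo ...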