PyCharm unable to identify inherited class' attributes - python-3.x

There's a base data class as follows:
from typing import Any, Dict, List

class BaseClass:
    def __init__(self, attribute_1: Any):
        self.attribute_1 = attribute_1
There's an inherited data class using the above class as base class:
class DataClass(BaseClass):
    def __init__(self, attribute_1: Any, attribute_2: Dict[str, str], attribute_3: List[str]):
        super().__init__(attribute_1)
        self.attribute_2 = attribute_2
        self.attribute_3 = attribute_3
There's a BaseActionClass which expects an instance of BaseClass to work with, as follows:
class BaseActionClass:
    def __init__(self, attribute_a1: BaseClass, attribute_a2: Dict[str, str]):
        self.attribute_a1 = attribute_a1
        self.attribute_a2 = attribute_a2

    def do_action_one(self):
        pass

    def do_action_two(self):
        pass
There's an ActionClass which uses this BaseActionClass to perform some actions:
class ActionClass(BaseActionClass):
    def __init__(self, attribute_a1: DataClass, attribute_a2: Dict[str, str]):
        super().__init__(attribute_a1, attribute_a2)

    def do_action_one(self):
        do_statement_1
        x = self.attribute_a1.attribute_1
        y = self.attribute_a1.attribute_2

    def do_action_two(self):
        do_something
In ActionClass.do_action_one, when writing y = self.attribute_a1.attribute_2, PyCharm shows a typing error: Unresolved attribute reference 'attribute_2' for class 'BaseClass'. How can I resolve this typing error shown by the IDE, and why does it happen, since DataClass already inherits from BaseClass?

The problem is that ActionClass.attribute_a1 is still typed as BaseClass, because that is how the base class declared it. This is absolutely fine: in ActionClass you do not enforce attribute_a1 to be a DataClass, you only restrict the __init__ parameter to it. If it were another method (not __init__ but, say, set_attribute_a1; let's forget about properties for now), narrowing the parameter type this way would also violate the Liskov substitution principle (LSP).
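For illustration, here is a minimal sketch of that set_attribute_a1 scenario (a hypothetical setter, reusing the BaseClass and DataClass definitions from the question; not code from the original post):

class BaseActionClass:
    def set_attribute_a1(self, value: BaseClass) -> None:
        self.attribute_a1 = value

class ActionClass(BaseActionClass):
    # A type checker flags this override as an LSP violation: the parameter type is
    # narrowed from BaseClass to DataClass, so ActionClass no longer accepts every
    # argument its base class promised to accept.
    def set_attribute_a1(self, value: DataClass) -> None:
        self.attribute_a1 = value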
I can suggest two ways to go:
Generic
I'll assume the BaseClass and DataClass definitions are as yours. Then the following will work:
from typing import Generic, TypeVar

_T = TypeVar('_T', bound=BaseClass)

class BaseActionClass(Generic[_T]):
    def __init__(self, attribute_a1: _T, attribute_a2: dict[str, str]) -> None:
        self.attribute_a1: _T = attribute_a1
        self.attribute_a2 = attribute_a2

    def do_action_one(self) -> None:
        pass

class ActionClass(BaseActionClass[DataClass]):
    # Note you don't even need to override __init__ now, it follows from the generic defn
    def do_action_one(self) -> None:
        self.attribute_a1.attribute_1
        self.attribute_a1.attribute_2
Direct
class BaseActionClass:
    def __init__(self, attribute_a1: BaseClass, attribute_a2: dict[str, str]):
        self.attribute_a1 = attribute_a1
        self.attribute_a2 = attribute_a2

    def do_action_one(self):
        pass

    def do_action_two(self):
        pass

class ActionClass(BaseActionClass):
    attribute_a1: DataClass

    def __init__(self, attribute_a1: DataClass, attribute_a2: dict[str, str]):
        super().__init__(attribute_a1, attribute_a2)

    def do_action_one(self):
        self.attribute_a1.attribute_1
        self.attribute_a1.attribute_2
The former solution is preferred, because it reveals your intention from the start and is more semantically correct. It means roughly the following: BaseActionClass has an attribute_a1 of type _T, which can be substituted by any BaseClass subclass (including BaseClass itself). When you subclass BaseActionClass[DataClass], you fix the _T substitution to DataClass. You can still do BaseActionClass(BaseClass(), {}) and _T will be BaseClass, but ActionClass(BaseClass(), {}) is now rejected.
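As a rough sketch of those last two statements (assuming the generic definitions above; the argument values are arbitrary):

# Accepted: _T is inferred as BaseClass and DataClass respectively.
base_action = BaseActionClass(BaseClass(attribute_1=1), {})
action = ActionClass(DataClass(1, {'k': 'v'}, ['x']), {})

# Flagged by the type checker (it still runs, since annotations are not enforced
# at runtime): ActionClass fixes _T to DataClass, so a plain BaseClass instance
# is no longer an acceptable argument.
bad_action = ActionClass(BaseClass(attribute_1=1), {})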
The latter solution is much less elegant. I'd advise using it only if you don't have access to modify BaseActionClass (for example, it is a third-party module and you can't or don't want to create a PR for it).

Related

Python type hinting None | Object with decorator

Is it possible to add/overwrite a type hint in the case of the following example?
The example is just to give an idea of what I mean; by no means is it something I would use in this form.
from dataclasses import dataclass

def wrapper(f):
    def deco(instance):
        if not instance.user:
            instance.user = data(name="test")
        return f(instance)
    return deco

@dataclass
class data:
    name: str

class test_class:
    def __init__(self):
        self.user: None | data = None

    @wrapper
    def test(self):
        print(self.user.name)

x = test_class()
x.test()
The issue is that the type hinting does not understand that the decorated method's user attribute is not None, and so it shows a linting error that "name" is not a known member of "None".
Of course this code could be altered so that instead of using a decorator it would just do something like this:
def test(self):
    if not self.user:
        ...
    print(self.user.name)
But that is not the point. I just want to know if it is possible to let the type hinter know that the attribute is not None. I could also just suppress the warning but that is not what I am looking for.
I would use the good ol' assert and be done with it:
...

    @wrapper
    def test(self):
        assert isinstance(self.user, data)
        print(self.user.name)
I realize this is a crude way as opposed to some annotation magic you might have expected for the decorator, but in my opinion this is the most practical approach.
There are countless other situations that can be constructed, where the type of some instance attribute may be altered externally. In those cases the use of such a simple assertion is not only for the benefit of the static type checker, but can also save you from shooting yourself in the foot, if you decide to alter that external behavior.
Alternative - Getter
Another possibility is to make the user attribute private and add a function (or property) to get it, which ensures that it is not None. Here is a working example:
from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
from typing import TypeVar

T = TypeVar("T")

@dataclass
class Data:
    name: str

def wrapper(f: Callable[[TestClass], T]) -> Callable[[TestClass], T]:
    def deco(self: TestClass) -> T:
        try:
            _ = self.user
        except RuntimeError:
            self.user = Data(name="test")
        return f(self)
    return deco

class TestClass:
    def __init__(self) -> None:
        self._user: None | Data = None

    @property
    def user(self) -> Data:
        if self._user is None:
            raise RuntimeError
        return self._user

    @user.setter
    def user(self, data: Data) -> None:
        self._user = data

    @wrapper
    def test(self) -> None:
        print(self.user.name)

if __name__ == '__main__':
    x = TestClass()
    x.test()
Depending on the use case, this might actually be preferred, because otherwise, with user being a public attribute, all outside code that wants to use TestClass faces the same problem of never being sure whether user is None, and is forced to do the same checks again and again.
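For example, with the TestClass above, calling code can rely on the property's contract instead of re-checking for None:

x = TestClass()
try:
    x.user          # not set yet: the getter raises instead of silently handing back None
except RuntimeError:
    x.user = Data(name="external")
print(x.user.name)  # prints: external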
Sadly there isn't really a satisfactory answer to your question. The problem is that no type checker executes any code, which means that any dynamic type generation doesn't work. For that reason, if you want to tell the type checker that self.user is not None, you need to create a class where user is not Optional.
I don't think it's a good idea, but here is how you could achieve what you want. Note though that this way you need to keep the two classes in sync, and some type checkers have trouble with decorators...
from typing import ParamSpec, TypeVar, Concatenate, Callable, cast
from dataclasses import dataclass

T = TypeVar("T")    # generic return value
P = ParamSpec("P")  # all other params after self

def wrapper(  # this wrapper works on any functions in 'test_class'
    f: Callable[Concatenate["test_class", P], T]
) -> Callable[Concatenate["__non_optional_user_test_class", P], T]:
    def deco(instance: "test_class", *args: P.args, **kwargs: P.kwargs):
        if not instance.user:
            instance.user = data(name="test")
        return f(cast("__non_optional_user_test_class", instance), *args, **kwargs)
    return deco

@dataclass
class data:
    name: str

class __non_optional_user_test_class:
    user: data

class test_class:
    def __init__(self):
        self.user: None | data = None

    @wrapper
    def test(self):
        print(self.user.name)

x = test_class()
x.test()
You sadly cannot generate the __non_optional_user_test_class dynamically in such a way that type checkers understand it...
And you would need to write a new wrapper for every class where you want to apply this @wrapper.

Python subclass that takes superclass as argument on instantiation?

I am trying to create a wrapper class in Python with the following behaviour:
It should take as an argument an existing class from which it should inherit all methods and attributes
The wrapper class methods should be able to use Python super() to access methods of the superclass (the one passed as an argument)
Because of my second requirement I think the solution here will not suffice (and in any case I am having separate issues deepcopying some of the methods of the superclass I am trying to inherit from).
I tried this but it's not correct...
class A:
    def shout(self):
        print("I AM A!")

class B:
    def shout(self):
        print("My name is B!")

class wrapper:
    def __init__(self, super_class):
        ## Some inheritance thing here ##
        # I initially tried this but no success...
        super(super_class).__init__()  # or similar?

    def shout(self):
        print('This is a wrapper')
        super().shout()
And this is the behaviour I require...
my_wrapper = wrapper(A)
my_wrapper.shout()
# Expected output:
# > This is a wrapper
# > I AM A
my_wrapper = wrapper(B)
my_wrapper.shout()
# Expected output:
# > This is a wrapper
# > My name is B!
Is inheritance the correct approach here, and if so, am I sniffing in the right direction? Any help is appreciated, thanks :)
Edit for context:
I intend to build multiple wrappers so that all of my ML models have the same API. Generally, models from the same package (sklearn for example) have the same API and should be able to be wrapped by the same wrapper. In doing this I wish to modify/add functionality to the existing methods in these models whilst keeping the same method name.
If wrapper has to be a class then a composition solution would fit much better here.
Keep in mind that I turned the shout methods into static methods, because in your example you pass the class to the wrapper, not an instance.
class A:
    @staticmethod
    def shout():
        print("I AM A!")

class B:
    @staticmethod
    def shout():
        print("My name is B!")

class wrapper:
    def __init__(self, super_class):
        self._super_class = super_class

    def __getattr__(self, item):
        try:
            return self.__dict__[item].__func__
        except KeyError:
            return self._super_class.__dict__[item].__func__

    def a_wrapper_method(self):
        print('a wrapper attribute can still be used')

my_wrapper = wrapper(A)
my_wrapper.shout()
my_wrapper = wrapper(B)
my_wrapper.shout()
my_wrapper.a_wrapper_method()
Outputs
I AM A!
My name is B!
a wrapper attribute can still be used
So I went for a function in the end. My final solution:
class A:
    def shout(self):
        print("I AM A!")

class B:
    def shout(self):
        print("My name is B!")

def wrap_letter_class(to_wrap):
    global letterWrapper

    class letterWrapper(to_wrap):
        def __init__(self):
            super().__init__()

        def shout(self):
            print('This is a wrapper')
            super().shout()

        def __getstate__(self):
            # Add the wrapper to global scope before pickling
            global letterWrapper
            letterWrapper = self.__class__
            return self.__dict__

    return letterWrapper()
Which produces the desired behaviour...
In [2]: wrapped = wrap_letter_class(A)
In [3]: wrapped.shout()
This is a wrapper
I AM A!
In [4]: wrapped = wrap_letter_class(B)
In [5]: wrapped.shout()
This is a wrapper
My name is B!
Something not mentioned in my initial question was that I intended to pickle my custom class; this is not possible if the class is not defined in the global scope, hence the __getstate__ and global additions.
Thanks!

Python Is it ok that an attribute only exists in child/concrete classes [duplicate]

What's the best practice to define an abstract instance attribute, but not as a property?
I would like to write something like:
from abc import ABCMeta, abstractmethod

class AbstractFoo(metaclass=ABCMeta):
    @property
    @abstractmethod
    def bar(self):
        pass

class Foo(AbstractFoo):
    def __init__(self):
        self.bar = 3
Instead of:
class Foo(AbstractFoo):
    def __init__(self):
        self._bar = 3

    @property
    def bar(self):
        return self._bar

    @bar.setter
    def bar(self, bar):
        self._bar = bar

    @bar.deleter
    def bar(self):
        del self._bar
Properties are handy, but for a simple attribute requiring no computation they are overkill. This is especially important for abstract classes which will be subclassed and implemented by the user (I don't want to force someone to use @property when they could just have written self.foo = foo in __init__).
The Abstract attributes in Python question proposes using @property and @abstractmethod as its only answer: it doesn't answer my question.
The ActiveState recipe for an abstract class attribute via AbstractAttribute may be the right way, but I am not sure. It also only works with class attributes and not instance attributes.
Possibly a slightly better solution compared to the accepted answer:
from better_abc import ABCMeta, abstract_attribute  # see below

class AbstractFoo(metaclass=ABCMeta):
    @abstract_attribute
    def bar(self):
        pass

class Foo(AbstractFoo):
    def __init__(self):
        self.bar = 3

class BadFoo(AbstractFoo):
    def __init__(self):
        pass
It will behave like this:
Foo()     # ok
BadFoo()  # will raise: NotImplementedError: Can't instantiate abstract class BadFoo
          #             with abstract attributes: bar
This answer uses the same approach as the accepted answer, but integrates well with the built-in ABC machinery and does not require the boilerplate of check_bar() helpers.
Here is the better_abc.py content:
from abc import ABCMeta as NativeABCMeta

class DummyAttribute:
    pass

def abstract_attribute(obj=None):
    if obj is None:
        obj = DummyAttribute()
    obj.__is_abstract_attribute__ = True
    return obj

class ABCMeta(NativeABCMeta):
    def __call__(cls, *args, **kwargs):
        instance = NativeABCMeta.__call__(cls, *args, **kwargs)
        abstract_attributes = {
            name
            for name in dir(instance)
            if getattr(getattr(instance, name), '__is_abstract_attribute__', False)
        }
        if abstract_attributes:
            raise NotImplementedError(
                "Can't instantiate abstract class {} with"
                " abstract attributes: {}".format(
                    cls.__name__,
                    ', '.join(abstract_attributes)
                )
            )
        return instance
The nice thing is that you can do:
class AbstractFoo(metaclass=ABCMeta):
    bar = abstract_attribute()
and it will work the same as above.
Also one can use:
class ABC(metaclass=ABCMeta):
    pass
to define a custom ABC helper. PS: I consider this code to be CC0.
This could be improved by using an AST parser to raise earlier (at class declaration) by scanning the __init__ code, but it seems to be overkill for now (unless someone is willing to implement it).
2021: typing support
You can use:
from typing import cast, Any, Callable, TypeVar

R = TypeVar('R')

def abstract_attribute(obj: Callable[[Any], R] = None) -> R:
    _obj = cast(Any, obj)
    if obj is None:
        _obj = DummyAttribute()
    _obj.__is_abstract_attribute__ = True
    return cast(R, _obj)
which will let mypy highlight some typing issues
class AbstractFooTyped(metaclass=ABCMeta):
    @abstract_attribute
    def bar(self) -> int:
        pass

class FooTyped(AbstractFooTyped):
    def __init__(self):
        # skipping assignment (which is required!) to demonstrate
        # that it works independent of when the assignment is made
        pass

f_typed = FooTyped()
_ = f_typed.bar + 'test'  # Mypy: Unsupported operand types for + ("int" and "str")

FooTyped.bar = 'test'     # Mypy: Incompatible types in assignment (expression has type "str", variable has type "int")
FooTyped.bar + 'test'     # Mypy: Unsupported operand types for + ("int" and "str")
and for the shorthand notation, as suggested by @SMiller in the comments:
class AbstractFooTypedShorthand(metaclass=ABCMeta):
    bar: int = abstract_attribute()

AbstractFooTypedShorthand.bar += 'test'  # Mypy: Unsupported operand types for + ("int" and "str")
Just because you define it as an abstractproperty on the abstract base class doesn't mean you have to make a property on the subclass.
e.g. you can:
In [1]: from abc import ABCMeta, abstractproperty

In [2]: class X(metaclass=ABCMeta):
   ...:     @abstractproperty
   ...:     def required(self):
   ...:         raise NotImplementedError
   ...:

In [3]: class Y(X):
   ...:     required = True
   ...:

In [4]: Y()
Out[4]: <__main__.Y at 0x10ae0d390>
If you want to initialise the value in __init__ you can do this:
In [5]: class Z(X):
   ...:     required = None
   ...:     def __init__(self, value):
   ...:         self.required = value
   ...:

In [6]: Z(value=3)
Out[6]: <__main__.Z at 0x10ae15a20>
Since Python 3.3 abstractproperty is deprecated. So Python 3 users should use the following instead:
from abc import ABCMeta, abstractmethod

class X(metaclass=ABCMeta):
    @property
    @abstractmethod
    def required(self):
        raise NotImplementedError
If you really want to enforce that a subclass define a given attribute, you can use metaclasses:
class AbstractFooMeta(type):
    def __call__(cls, *args, **kwargs):
        """Called when you call Foo(*args, **kwargs)."""
        obj = type.__call__(cls, *args, **kwargs)
        obj.check_bar()
        return obj

class AbstractFoo(metaclass=AbstractFooMeta):
    bar = None

    def check_bar(self):
        if self.bar is None:
            raise NotImplementedError('Subclasses must define bar')

class GoodFoo(AbstractFoo):
    def __init__(self):
        self.bar = 3

class BadFoo(AbstractFoo):
    def __init__(self):
        pass
Basically the metaclass redefines __call__ to make sure check_bar is called on the instance after __init__.
GoodFoo()  # ok
BadFoo()   # raises NotImplementedError
As Anentropic said, you don't have to implement an abstractproperty as another property.
However, one thing all answers seem to neglect is Python's member slots (the __slots__ class attribute). Users of your ABCs who are required to implement abstract properties could simply define them within __slots__ if all that's needed is a data attribute.
So with something like,
import abc

class AbstractFoo(abc.ABC):
    __slots__ = ()

    bar = abc.abstractproperty()
Users can define sub-classes simply like,
class Foo(AbstractFoo):
    __slots__ = 'bar',  # the only requirement

    # define Foo as desired
    def __init__(self):
        self.bar = ...
Here, Foo.bar behaves like a regular instance attribute, which it is, just implemented differently. This is simple, efficient, and avoids the @property boilerplate that you described.
This works whether or not the ABCs define __slots__ in their class bodies. However, going with __slots__ all the way not only saves memory and provides faster attribute accesses but also gives a meaningful descriptor instead of having intermediates (e.g. bar = None or similar) in sub-classes.1
A few answers suggest doing the "abstract" attribute check after instantiation (i.e. at the meta-class __call__() method) but I find that not only wasteful but also potentially inefficient as the initialization step could be a time-consuming one.
In short, what's required of sub-classes of ABCs is to override the relevant descriptor (be it a property or a method), and it doesn't matter how; documenting to your users that it's possible to use __slots__ as the implementation for abstract properties seems to me the more adequate approach.
1 In any case, at the very least, ABCs should always define an empty __slots__ class attribute because otherwise sub-classes are forced to have __dict__ (dynamic attribute access) and __weakref__ (weak reference support) when instantiated. See the abc or collections.abc modules for examples of this being the case within the standard library.
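A tiny sketch of that footnote (hypothetical classes, just to show the effect of a base class without __slots__):

class NoSlotsBase:
    pass

class EmptySlotsBase:
    __slots__ = ()

class GetsDict(NoSlotsBase):
    __slots__ = ('bar',)

class SlotsOnly(EmptySlotsBase):
    __slots__ = ('bar',)

print(hasattr(GetsDict(), '__dict__'))   # True: the slot-less base forces a per-instance __dict__
print(hasattr(SlotsOnly(), '__dict__'))  # False: empty __slots__ in the base keeps instances slot-only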
The problem isn't what, but when:
from abc import ABCMeta, abstractmethod

class AbstractFoo(metaclass=ABCMeta):
    @abstractmethod
    def bar():
        pass

class Foo(AbstractFoo):
    bar = object()

isinstance(Foo(), AbstractFoo)
#>>> True
It doesn't matter that bar isn't a method! The problem is that __subclasshook__, the method of doing the check, is a classmethod, so only cares whether the class, not the instance, has the attribute.
I suggest you just don't force this, as it's a hard problem. The alternative is forcing them to predefine the attribute, but that just leaves around dummy attributes that silence errors.
I've searched around for this for a while but didn't see anything I like. As you probably know, if you do:
class AbstractFoo(object):
    @property
    def bar(self):
        raise NotImplementedError(
            "Subclasses of AbstractFoo must set an instance attribute "
            "self._bar in its __init__ method")

class Foo(AbstractFoo):
    def __init__(self):
        self.bar = "bar"

f = Foo()
You get an AttributeError: can't set attribute which is annoying.
To get around this you can do:
class AbstractFoo(object):
    @property
    def bar(self):
        try:
            return self._bar
        except AttributeError:
            raise NotImplementedError(
                "Subclasses of AbstractFoo must set an instance attribute "
                "self._bar in its __init__ method")

class OkFoo(AbstractFoo):
    def __init__(self):
        self._bar = 3

class BadFoo(AbstractFoo):
    pass

a = OkFoo()
b = BadFoo()
print(a.bar)
print(b.bar)  # raises a NotImplementedError
This avoids the AttributeError: can't set attribute, but if you just leave off the abstract property altogether:
class AbstractFoo(object):
    pass

class Foo(AbstractFoo):
    pass

f = Foo()
f.bar
You get an AttributeError: 'Foo' object has no attribute 'bar', which is arguably almost as good as the NotImplementedError. So really my solution is just trading one error message for another, and you have to use self._bar rather than self.bar in __init__.
Following https://docs.python.org/2/library/abc.html you could do something like this in Python 2.7:
from abc import ABCMeta, abstractproperty

class Test(object):
    __metaclass__ = ABCMeta

    @abstractproperty
    def test(self): yield None

    def get_test(self):
        return self.test

class TestChild(Test):
    test = None

    def __init__(self, var):
        self.test = var

a = TestChild('test')
print(a.get_test())

Python return typing dynamically based on parameter

I have a method that returns dynamic type based on the class I pass in:
def foo(cls):
    return cls()
How can I setup typing for this function?
After reading the article https://blog.yuo.be/2016/05/08/python-3-5-getting-to-grips-with-type-hints/, I found the solution myself:
from typing import TypeVar, Type

class A:
    def a(self):
        return 'a'

class B(A):
    def b(self):
        return 'b'

T = TypeVar('T')

def foo(a: Type[T]) -> T:
    return a()
This template suits my question above, but my actual need is a little different and required more work. Below I include my problem and solution:
Problem: I want to use the with keyword like this:
with open_page(PageX) as page:
    page.method_x()  # method_x is from PageX
Solution
from typing import TypeVar, Type, Generic

T = TypeVar('T')

def open_page(cls: Type[T]):
    class __F__(Generic[T]):
        def __init__(self, cls: Type[T]):
            self._cls = cls

        def __enter__(self) -> T:
            return self._cls()

        def __exit__(self, exc_type, exc_val, exc_tb):
            pass

    return __F__(cls)
So when I use this with PyCharm, the IDE is able to suggest method_x when I pass PageX into with open_page(PageX) as page:
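For example (PageX here is a hypothetical page class, not part of the original code):

class PageX:
    def method_x(self) -> str:
        return 'x'

with open_page(PageX) as page:
    # `page` is inferred as PageX (T is bound to PageX), so the IDE can suggest method_x
    print(page.method_x())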

dynamic class inheritance using super

I'm trying to dynamically create a class using type() and assign an __init__ constructor which calls super().__init__(...); however, when super() gets called I receive the following error:
TypeError: super(type, obj): obj must be an instance or subtype of type
Here is my code:
class Item():
    def __init__(self, name, description, cost, **kwargs):
        self.name = name
        self.description = description
        self.cost = cost
        self.kwargs = kwargs

class ItemBase(Item):
    def __init__(self, name, description, cost):
        super().__init__(name, description, cost)

def __constructor__(self, n, d, c):
    super().__init__(name=n, description=d, cost=c)

item = type('Item1', (ItemBase,), {'__init__': __constructor__})
item_instance = item('MyName', 'MyDescription', 'MyCost')
Why is super() inside the __constructor__ method not understanding the object parameter; and how do I fix it?
Solution 1: Using cls = type('ClassName', ...)
Note that the solution of sadmicrowave creates an infinite loop if the dynamically created class gets inherited, since self.__class__ will then correspond to the child class.
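For illustration, here is a minimal sketch of that pitfall (not the fix):

class A:
    def __init__(self):
        print('A')

def __constructor__(self):
    # self.__class__ is the runtime class, i.e. C below, so super(C, self).__init__
    # resolves to B.__init__, which is this very function again: infinite recursion.
    super(self.__class__, self).__init__()

B = type('B', (A,), {'__init__': __constructor__})

class C(B):
    pass

C()  # RecursionError: maximum recursion depth exceeded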
An alternative approach which does not have this issue is to assign __init__ after creating the class, so that the class can be referenced explicitly through a closure. Example:
# Base class
class A():
    def __init__(self):
        print('A')

# Dynamically created class
B = type('B', (A,), {})

def __init__(self):
    print('B')
    super(B, self).__init__()

B.__init__ = __init__

# Child class
class C(B):
    def __init__(self):
        print('C')
        super().__init__()

C()  # print C, B, A
Solution 2: Using MyClass.__name__ = 'ClassName'
An alternative way to dynamically create class is to define a class inside the function, then reassign the __name__ and __qualname__ attributes:
class A:
    def __init__(self):
        print(A.__name__)

def make_class(name, base):
    class Child(base):
        def __init__(self):
            print(Child.__name__)
            super().__init__()

    Child.__name__ = name
    Child.__qualname__ = name
    return Child

B = make_class('B', A)

class C(B):
    def __init__(self):
        print(C.__name__)
        super().__init__()

C()  # Display C B A
Here is how I solved the issue. I use type() to dynamically create the class with variable references, as such:
def __constructor__(self, n, d, c, h):
    # initialize super of class type
    super(self.__class__, self).__init__(name=n, description=d, cost=c, hp=h)

# create the object class dynamically, utilizing __constructor__ for the __init__ method
item = type(item_name, (eval("{}.{}".format(name, row[1].value)),), {'__init__': __constructor__})

# add new object to the global _objects object to be used throughout the world
self._objects[item_name] = item(row[0].value, row[2].value, row[3].value, row[4].value)
There may be a better way to accomplish this, but I needed a fix and this is what I came up with... use it if you can.
