class A(object):
    __init__ = None

def A__init__(self, value):
    self.value = value

A.__init__ = A__init__
I am new to Python, so I want to ask about the differences between the constructor above and the one below.
class A(object):
    def __init__(self, value):
        self.value = value
As mentioned in the comments, in terms of usage there is no clear difference. Notice, however, that the naming does change:
>>> print(A.__init__.__name__)
Outputs __init__ in the normal case, but A__init__ in the case where you bind a function as the __init__ method for A.
Beyond this, if A ever enters a class hierarchy, you also cannot rely on zero-argument super() to figure out the MRO for you (the function was defined outside the class body, so it has no __class__ cell), and you will have to use something along the lines of super(type(self), self) for it to work. Monkey-patching __init__ is just a little odd, eh.
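To make the naming difference concrete, here is a small runnable sketch; it uses a second class B for the monkey-patched variant so both can be compared side by side (the names are illustrative):

```python
# Two ways to attach an __init__: normally, and by binding a module-level
# function after the fact. Only the function's __name__ differs.
class A:
    def __init__(self, value):
        self.value = value

def A__init__(self, value):
    self.value = value

class B:
    pass

B.__init__ = A__init__  # monkey-patch the constructor onto B

print(A.__init__.__name__)  # __init__
print(B.__init__.__name__)  # A__init__
print(B(5).value)           # 5 - both behave the same at call time
```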
I have two classes, class A and class B, which is a child of class A. In the __init__ method of class A I used argparse to define class A's attributes. In class B's __init__, I use super().__init__() to get class A's attributes. The problem is that class B needs another attribute, and I would like to parse it with the argparse module as well.
Any ideas on how to do that?
class A:
    def __init__(self):
        parser = argparse.ArgumentParser()
        parser.add_argument('a1')
        parser.add_argument('a2')
        args = parser.parse_args()
        self.a1, self.a2 = args.a1, args.a2
class B(A):
    def __init__(self):
        self.a1, self.a2 = super().__init__()
        # from here on the code does not work; I would like to know how to do that.
        parser = argparse.ArgumentParser()
        parser.add_argument('b')
        args = parser.parse_args()
        self.b = args.b
The only way I found is to give class A's parser an optional argument, which would be added only when class B is being executed. That solution makes no logical sense to me, though.
Because class B inherits from A, it already has attributes a1 and a2. It's enough to call the parent constructor as follows:
super().__init__()
As for the argument 'b', I think the best approach is to instantiate the parser object as an attribute of class A and then use it in class B if necessary:
class A:
    def __init__(self):
        self.parser = argparse.ArgumentParser()
        self.parser.add_argument('a1')
        self.parser.add_argument('a2')
        self.parser.add_argument('b')
        args = self.parser.parse_args()
        self.a1, self.a2 = args.a1, args.a2
although, as suggested in the comments, it is better not to do command-line argument parsing during class instantiation. It should be done somewhere at the top level of your project (e.g. in main()).
You could subclass ArgumentParser to achieve your goal. Then you just need to override the parse_args method so that it assigns all of the attributes of the parsed namespace back to the parser itself.
For example:
# main.py
from argparse import ArgumentParser

class A(ArgumentParser):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.add_argument('a')
        self.add_argument('a2')

    def parse_args(self, args=None, namespace=None):
        nspace = super().parse_args(args=args, namespace=namespace)
        for attr, val in nspace.__dict__.items():
            setattr(self, attr, val)  # copy parsed values onto the parser itself
        return nspace

class B(A):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.add_argument('b')

parser = B()
parser.parse_args()
print(parser.a, parser.a2, parser.b)
Then try:
python main.py HELLO WORLD TODAY
OUTPUT:
HELLO WORLD TODAY
So, first off, don't do this. Classes that don't exist solely for the purpose of parsing arguments (a job better done by a simple function, or just handled inline in the script entry point) should not be pulling information from sys.argv (global state that is "owned" by the script entry point, not random utility classes).
Using positional command-line arguments makes this nigh impossible to do in even a bad way. Leaving argument parsing in __init__ and using parse_known_args to parse recognized args while ignoring the rest is broken, because B needs its extra argument to be third positionally, and you'd have to link in extra knowledge of A to know that. That leaves only truly horrific approaches (such as having A's constructor accept an ArgumentParser that it initializes and uses, which B then extends and reuses just to get the extra value).
If the classes must be involved in making the parser, you should still separate the parser from the instance construction, e.g. putting non-instance utility methods in the classes to build a parser, using it outside the class to parse, then passing in the result to construct an instance. Something like the following is weird, but not necessarily awful:
class A:
    def __init__(self, a1, a2):
        self.a1 = a1
        self.a2 = a2

    @staticmethod  # Don't need to know the type here, so just static
    def make_parser():
        parser = argparse.ArgumentParser()
        parser.add_argument('a1')
        parser.add_argument('a2')
        return parser

class B(A):
    def __init__(self, b, **kwargs):  # Accept and pass along arbitrary keyword args
        super().__init__(**kwargs)    # to remove need to reproduce A's parameters
        self.b = b

    @classmethod  # Use classmethod to make no-arg super work
    def make_parser(cls):
        parser = super().make_parser()
        parser.add_argument('b')
        return parser
Then in the main script, you can do something like:
def main():
    class_to_use = B if should_be_B else A
    parser = class_to_use.make_parser()
    args = parser.parse_args()
    inst = class_to_use(**vars(args))  # Extract parsed arguments as dict and pass as keyword args
    # Do stuff with something that's an A or B

if __name__ == '__main__':
    main()
I don't really recommend this in general (argument parsing should be entirely separate from random classes in general), but in the rare cases it needs to be linked like this, you still shouldn't put it in __init__ (which makes it impossible to reuse said classes with manually provided arguments without modifying sys.argv, which is the ugliest hack imaginable).
Is it possible to add/overwrite a type hint in case of the following example?
The example is just to get an idea of what I mean, by no means is this something that I would use in this way.
from dataclasses import dataclass

def wrapper(f):
    def deco(instance):
        if not instance.user:
            instance.user = data(name="test")
        return f(instance)
    return deco

@dataclass
class data:
    name: str

class test_class:
    def __init__(self):
        self.user: None | data = None

    @wrapper
    def test(self):
        print(self.user.name)

x = test_class()
x.test()
The issue is that the type checker does not understand that, inside the decorated method, the user attribute is not None, and thus reports a linting error that name is not a known member of None.
Of course this code could be altered so that instead of using a decorator it would just do something like this:
def test(self):
    if not self.user:
        ...
    print(self.user.name)
But that is not the point. I just want to know if it is possible to let the type hinter know that the attribute is not None. I could also just suppress the warning but that is not what I am looking for.
I would use the good ol' assert and be done with it:
...

@wrapper
def test(self):
    assert isinstance(self.user, data)
    print(self.user.name)
I realize this is a crude way as opposed to some annotation magic you might have expected for the decorator, but in my opinion this is the most practical approach.
There are countless other situations that can be constructed, where the type of some instance attribute may be altered externally. In those cases the use of such a simple assertion is not only for the benefit of the static type checker, but can also save you from shooting yourself in the foot, if you decide to alter that external behavior.
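A related lightweight option at a single use site is typing.cast, which is a runtime no-op and purely a hint to the checker (a sketch reusing the question's names):

```python
from dataclasses import dataclass
from typing import cast

def wrapper(f):
    def deco(instance):
        if not instance.user:
            instance.user = data(name="test")
        return f(instance)
    return deco

@dataclass
class data:
    name: str

class test_class:
    def __init__(self):
        self.user: "None | data" = None

    @wrapper
    def test(self):
        user = cast(data, self.user)  # no-op at runtime; narrows for the checker
        print(user.name)

x = test_class()
x.test()
```

Note that, unlike the assert, cast performs no runtime check, so it trades safety for brevity.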
Alternative - Getter
Another possibility is to make the user attribute private and add a function (or property) to get it, which ensures that it is not None. Here is a working example:
from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass
from typing import TypeVar

T = TypeVar("T")

@dataclass
class Data:
    name: str

def wrapper(f: Callable[[TestClass], T]) -> Callable[[TestClass], T]:
    def deco(self: TestClass) -> T:
        try:
            _ = self.user
        except RuntimeError:
            self.user = Data(name="test")
        return f(self)
    return deco

class TestClass:
    def __init__(self) -> None:
        self._user: None | Data = None

    @property
    def user(self) -> Data:
        if self._user is None:
            raise RuntimeError
        return self._user

    @user.setter
    def user(self, data: Data) -> None:
        self._user = data

    @wrapper
    def test(self) -> None:
        print(self.user.name)

if __name__ == '__main__':
    x = TestClass()
    x.test()
Depending on the use case, this might actually be preferred: since user is a public attribute, all outside code using TestClass would otherwise face the same problem of never being sure whether user is None, and would be forced to repeat the same checks again and again.
Sadly, there isn't really a satisfactory answer to your question. The problem is that type checkers don't execute any code, which means dynamic type generation doesn't work. For that reason, if you want to tell the type checker that self.user is not None, you need to create a class where user is not Optional.
I don't think it's a good idea, but here is how you could achieve what you want. Note, though, that this way you need to keep the two classes in sync, and some type checkers have trouble with decorators...
from typing import ParamSpec, TypeVar, Concatenate, Callable, cast
from dataclasses import dataclass

T = TypeVar("T")   # generic return value
P = ParamSpec("P") # all other params after self

def wrapper(  # this wrapper works on any function in 'test_class'
    f: Callable[Concatenate["test_class", P], T]
) -> Callable[Concatenate["__non_optional_user_test_class", P], T]:
    def deco(instance: "test_class", *args: P.args, **kwargs: P.kwargs):
        if not instance.user:
            instance.user = data(name="test")
        return f(cast("__non_optional_user_test_class", instance), *args, **kwargs)
    return deco

@dataclass
class data:
    name: str

class __non_optional_user_test_class:
    user: data

class test_class:
    def __init__(self):
        self.user: None | data = None

    @wrapper
    def test(self):
        print(self.user.name)

x = test_class()
x.test()
You sadly cannot generate the __non_optional_user_test_class dynamically in such a way that type-checkers understand them...
And you would need to write a new wrapper for all classes where you want to apply this #wrapper.
I'm looking for a shorthand to add common property decorators to classes.
class Animal:
    def __init__(self):
        self._attributes = {}

class Dog(Animal):
    @property
    def color(self):
        return self._attributes.get('color', None)

    @color.setter
    def color(self, value):
        if value is not None:
            self._attributes['color'] = value
        else:
            self._attributes.pop('color', None)

class Cat(Animal):
    @property
    def color(self):
        return self._attributes.get('color', None)

    @color.setter
    def color(self, value):
        if value is not None:
            self._attributes['color'] = value
        else:
            self._attributes.pop('color', None)

class InvisibleMan(Animal):
    pass
I'm looking for the easiest way to "package" the color property so I can assign it to Dog and Cat, but not InvisibleMan. Something like this (although in actuality there will be ~8 such properties and ~15 such classes):
class Dog(Animal):
    def __init__(self):
        super().__init__()
        includeColorProperty(self)
Have you considered descriptors, instead of a decorator?
In a nutshell, descriptors give you fine-grained control over attribute storage. (In fact, the property decorator builds a descriptor under the hood!) Here are some Python docs that may be helpful.
Anyway, sticking with your pattern, a descriptor that manipulates _attributes would look something like this:
class Color:
    def __get__(self, obj, objtype=None):
        return obj._attributes.get('color')

    def __set__(self, obj, value):
        if value is None:
            obj._attributes.pop('color', None)
        else:
            obj._attributes['color'] = value
where obj is a reference to the Dog instance, et al.
(Note the __get__ and __set__ methods match your getter and setter, respectively.)
Then, plug the descriptor into your classes like this:
class Animal:
    def __init__(self):
        self._attributes = {}

class Dog(Animal):
    color = Color()

class Cat(Animal):
    color = Color()

class InvisibleMan(Animal):
    pass
You can see in this example the behaviors you're looking for are preserved: instances maintain their own _attributes, and InvisibleMan has no color:
>>> d1, d2 = Dog(), Dog()
>>> d1.color = 'blue'
>>> d1.color, d2.color
('blue', None)
>>>
>>>
>>> x = InvisibleMan()
>>> x.color
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'InvisibleMan' object has no attribute 'color'
Personally, I also find this a bit easier to read when many properties are involved, as you mentioned is true in your case. Want to know what properties are available for a given type? They're listed out right at the top, no surprises.
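Since you mentioned roughly eight such properties, it may be worth noting that the descriptor can be made generic with __set_name__, so one class covers every attribute (a sketch extending the pattern above; the Attribute name is my own):

```python
class Attribute:
    """Store a value under the descriptor's own attribute name in _attributes."""
    def __set_name__(self, owner, name):
        self.name = name  # called automatically at class creation time

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return obj._attributes.get(self.name)

    def __set__(self, obj, value):
        if value is None:
            obj._attributes.pop(self.name, None)
        else:
            obj._attributes[self.name] = value

class Animal:
    def __init__(self):
        self._attributes = {}

class Dog(Animal):
    color = Attribute()
    size = Attribute()
```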
You have a couple of options.
Firstly, multiple inheritance:
# this is the best way to do things if lots of stuff is invisible
class HasColor:
    # getter and setter go here
    ...

class Dog(Animal, HasColor):
    ...

OR

# This is probably the best way to do things, if not many things are invisible
class Invisible:
    @property
    def color(self):
        raise AttributeError("very meaningful message")

class InvisibleMan(Invisible, Animal):  # THE ORDER HERE MATTERS!!
    ...
Option 2 would be to override the getter and setter in invisible man:
class Dog(Animal):
    ...

class InvisibleMan(Animal):
    @property
    def color(self):
        raise AttributeError("very meaningful message")
Bonus option:
If you want to turn invisibility on and off on an instance then you want to do something else. I'm not sure if you want this but:
class Animal:
    cloaking_on = False

    @property
    def color(self):
        if self.cloaking_on:
            raise AttributeError("very meaningful message")
        # etc.
Then you can have a way to set cloaking on and off and make all Cats invisible by default.
I'm having a problem with multiple inheritance that I can't seem to figure out. Here is a very abstracted minimal example that reproduces my error (my code is much more complex than this).
class Thing(object):
    def __init__(self, x=None):
        self.x = x

class Mixin(object):
    def __init__(self):
        self.numbers = [1, 2, 3]

    def children(self):
        return [super().__init__(x=num) for num in self.numbers]

class CompositeThing(Mixin, Thing):
    def __init__(self):
        super().__init__()

    def test(self):
        for child in self.children():
            print(child.x)

obj = CompositeThing()
obj.test()
Per this, I expect the children() method to return a list of Things built up from self.numbers. Instead, I get TypeError: super(type, obj): obj must be an instance or subtype of type. Incidentally, the same thing happens if I don't call the constructor and allow children to return super() 3 times (i.e., the uninstantiated superclass). Any ideas why this might be happening?
Thanks in advance!
In line 9 of your code, it looks like you are trying to call __init__ of object. I am assuming you meant to have Mixin inherit from Thing.
class Thing(object):
    def __init__(self, x=None):
        self.x = x

class Mixin(Thing):
    def __init__(self):
        self.numbers = [1, 2, 3]

    def children(self):
        return [super().__init__(x=num) for num in self.numbers]  # Now calls Thing.__init__ instead of object.__init__

class CompositeThing(Mixin, Thing):
    def __init__(self):
        super().__init__()

    def test(self):
        for child in self.children():
            print(child.x)

obj = CompositeThing()
obj.test()
Actually, I figured it out. There were two problems: (1) super() doesn't work as expected inside comprehensions because comprehensions in Py3 have their own scope - this was causing the TypeError I was experiencing. (2) What I was really trying to do was create a new instance of the parent, rather than calling a method from the parent. I have posted a new question for just the latter problem for clarity.
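For completeness, the intended behavior (building new parent instances rather than re-initializing self) can be sketched by naming the parent class directly, which also sidesteps the comprehension-scope issue:

```python
class Thing:
    def __init__(self, x=None):
        self.x = x

class Mixin:
    def __init__(self):
        self.numbers = [1, 2, 3]

    def children(self):
        # Create fresh Thing instances; no super() needed inside the comprehension
        return [Thing(x=num) for num in self.numbers]

class CompositeThing(Mixin, Thing):
    def test(self):
        for child in self.children():
            print(child.x)

obj = CompositeThing()
obj.test()  # prints 1, 2, 3
```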
I have a base class that has a lot of direct sub classes. There are multiple independent features that are shared by multiple of the sub classes. This is a good use case for Python's cooperative inheritance. However, the features should wrap behavior from the outside, so they need to be earlier in the method resolution order.
class WrappedSub(FeatureA, FeatureB, FeatureC, RealSub):
    def __init__(self, *args, **kwargs):
        FeatureA.__init__(self, foo=42)
        FeatureB.__init__(self, bar=13)
        FeatureC.__init__(self, foobar=546)
        RealSub.__init__(self, *args, **kwargs)

class RealSub(Base):
    # Lots of code ...
    ...
It would be nice to decorate the child classes instead.
@Mixin(FeatureA, 42)
@Mixin(FeatureB, 13)
@Mixin(FeatureC, 546)
class RealSub(Base):
    # Lots of code ...
    ...
Precisely, I need a @Mixin decorator where the first block below is equivalent to the second.
@Mixin(Feature, *feature_args, **feature_kwargs)
class RealSub:
    # Lots of code ...
    ...

class RealSub:
    # Lots of code ...
    ...

class WrappedSub(Feature, RealSub):
    def __init__(self, *sub_args, **sub_kwargs):
        Feature.__init__(self, *feature_args, **feature_kwargs)
        RealSub.__init__(self, *sub_args, **sub_kwargs)

RealSub = WrappedSub
How is this possible in Python 3?
You can probably use Python's cooperative multiple-inheritance system to write your mixin classes, rather than trying to implement them as class decorators. This is how I've generally understood the term "mixin" to be used in Python OOP.
class Base:
    def method(self, param):
        value = param + 18
        return value

class FeatureOne:  # this could inherit from Base
    def method(self, param):
        if param == 42:
            return 13
        else:
            return super().method(param)  # call next class in inheritance chain

class Child(FeatureOne, Base):
    def method(self, param):
        value = super().method(param)
        value *= 2
        return value
This isn't quite the same as what you wanted, since it calls the FeatureOne class's method implementation between the Base and Child classes' versions, rather than before Child does its thing. You could instead add an new Grandchild class that inherits from the Features you care about first, and Child last, if you can't adjust the methods to work in this order (the Grandchild class's body could be empty).
If you really want to use decorators to flip the order around, I think you could probably make it work, with the decorator building a "grandchild" class for you (though it doesn't know anything about the normal inheritance hierarchy). Here's a rough attempt at a mixin decorator that works almost like you want:
def mixin(*mixin_classes, **mixin_kwargs):  # decorator factory function
    def decorator(cls):  # decorator function
        class wrapper(*mixin_classes, cls):
            def __init__(self, *args, **kwargs):
                wrapped_kwargs = mixin_kwargs.copy()  # use the passed kwargs to update the
                wrapped_kwargs.update(kwargs)         # mixin args, so caller can override
                super().__init__(*args, **wrapped_kwargs)
        # maybe modify wrapper's __name__, __qualname__, __doc__, etc. to match cls here?
        return wrapper
    return decorator
The mixin classes should call super().__init__(*args, **kwargs) from their own __init__ method (if they have one), but they can accept (and not pass on) keyword-only arguments of their own that they want to be passed by the mixin decorator:
class FeatureOne:
    def __init__(self, *args, foo, **kwargs):  # note that foo is a keyword-only argument
        self.foo = foo
        super().__init__(*args, **kwargs)

    def method(self, param):
        if param == self.foo:
            return 13
        else:
            return super().method(param)
@mixin(FeatureOne, foo=42)
class Child(Base):
    def method(self, param):
        return super().method(param) * 2
The decorator should work either with all the mixin classes passed to one decorator call (e.g. @mixin(FeatureA, FeatureB, FeatureC, foo=42, bar=13, foobar=546)), or with several nested decorator calls. The MRO of the final class will be the same either way.
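Putting the pieces together, here is a self-contained run (reproducing the decorator and a simplified Base/FeatureOne from above) that shows the resulting MRO and dispatch order:

```python
def mixin(*mixin_classes, **mixin_kwargs):
    def decorator(cls):
        class wrapper(*mixin_classes, cls):
            def __init__(self, *args, **kwargs):
                wrapped_kwargs = mixin_kwargs.copy()
                wrapped_kwargs.update(kwargs)  # caller-supplied kwargs win
                super().__init__(*args, **wrapped_kwargs)
        return wrapper
    return decorator

class Base:
    def method(self, param):
        return param + 18

class FeatureOne:
    def __init__(self, *args, foo, **kwargs):
        self.foo = foo
        super().__init__(*args, **kwargs)

    def method(self, param):
        return 13 if param == self.foo else super().method(param)

@mixin(FeatureOne, foo=42)
class Child(Base):
    def method(self, param):
        return super().method(param) * 2

c = Child()
print([cls.__name__ for cls in type(c).__mro__])
# ['wrapper', 'FeatureOne', 'Child', 'Base', 'object']
print(c.method(42))  # 13 - FeatureOne intercepts before Child doubles
print(c.method(1))   # 38 - (1 + 18) * 2
```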