How to observe Instance member variables using traitlets library? - python-3.x

I am trying to use the traitlets library provided by IPython in my code. Suppose a trait is an Instance of a particular class: how do I observe a change in the value of a member of that class?
Example:
class A:
    def __init__(self, val):
        self.value = val

class myApp(HasTraits):
    myA = Instance(A, kw={'val': 2})
I want an observe method to be called when the 'value' member variable of the myA object is changed. Something like below:
@observe('myA.value')
def onValueChange(self, change):
    return
Is this possible with the traitlets implementation?

In order to observe changes to the value of an instance trait, the class for the instance trait should subclass HasTraits.
traitlets.observe(*names, **kwargs)
A decorator which can be used to observe Traits on a class.
from traitlets import HasTraits, observe, Instance, Int

class A(HasTraits):
    value = Int()

    def __init__(self, val):
        self.value = val

    @observe('value')
    def func(self, change):
        print(change)

class App(HasTraits):
    myA = Instance(klass=A, args=(2,))

app = App()
app.myA.value = 1
{'name': 'value', 'old': 0, 'new': 2, 'owner': <__main__.A object at 0x10b0698d0>, 'type': 'change'}
{'name': 'value', 'old': 2, 'new': 1, 'owner': <__main__.A object at 0x10b0698d0>, 'type': 'change'}
Edit
To keep the change handler in the composed class, you can dynamically set an observer.
Note that the observed attribute must be a trait.
In case you don't have access to modify class A to subclass HasTraits, you may be able to compose a class that subclasses HasTraits on the fly using some type magic.
from traitlets import HasTraits, observe, Instance, Int

class A(HasTraits):
    value = Int()

    def __init__(self, val):
        self.value = val

class App(HasTraits):
    myA = Instance(klass=A, args=(2,))

    def __init__(self):
        super().__init__()
        self.set_notifiers()

    def set_notifiers(self):
        HasTraits.observe(self.myA, self.func, 'value')

    def func(self, change):
        print(change)

app = App()
app.myA.value = 1
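Below is a minimal sketch of the "type magic" idea mentioned above, assuming class A itself cannot be edited and only its value attribute needs to be observable. The name ObservableA is made up for the example, and type(HasTraits) (the traitlets metaclass) is called instead of plain type() so the trait machinery is still set up:
from traitlets import HasTraits, Instance, Int

# Build an observable stand-in for A dynamically. type(HasTraits) is the
# traitlets metaclass, so the 'value' trait is set up properly.
ObservableA = type(HasTraits)('ObservableA', (HasTraits,), {'value': Int()})

class App(HasTraits):
    myA = Instance(klass=ObservableA, kw={'value': 2})

    def __init__(self):
        super().__init__()
        # register the handler on the composed instance
        self.myA.observe(self.on_value_change, names='value')

    def on_value_change(self, change):
        print(change)

app = App()
app.myA.value = 1  # prints the change dict
This keeps the change handler inside the composed App class; the dynamically built class stands in for A rather than wrapping it.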

Related

inheritance extending a from_json function in super but it makes an instance of the parent class

I have 2 classes:
class A:
    name = 'test'

    def __init__(self):
        pass

    @staticmethod
    def from_json(json: dict) -> object:
        obj = A()
        obj.name = json["name"]
        return obj

class B(A):
    description = "desc"

    def __init__(self):
        super().__init__()  # I was originally doing: A.__init__(self), but online said to use super.

    @staticmethod
    def from_json(json: dict) -> object:
        obj = A.from_json(json)  # As seen above, A.from_json returns an instance of A.
        obj.description = json["description"]
        return obj
I know there isn't really any casting, but I want the returned object to be of type B, so it gains all the other new properties/methods.
How do I have B.from_json return type B? I was thinking there was a way to create something like:
b = B()
and then, through some Python magic, pass all properties from A into b and then return b, but I wasn't sure if that is the right solution.
Here is a functional test of the flaw:
x = A.from_json({'name': 'foo'})
z = B.from_json({ 'name': 'thor', 'description': 'god of thunder'})
type(x) == A # <class '__main__.A'>
type(z) == B # <class '__main__.A'>
You should use classmethod here, not staticmethod. Then you can remove all the hardcoded class references:
class A:
    name = 'test'

    def __init__(self):
        pass

    @classmethod
    def from_json(cls, json: dict) -> object:
        obj = cls()
        obj.name = json["name"]
        return obj

class B(A):
    description = "desc"

    def __init__(self):
        super().__init__()

    @classmethod
    def from_json(cls, json: dict) -> object:
        obj = super().from_json(json)
        obj.description = json["description"]
        return obj
print(type(B.from_json({'name': 'name', 'description': 'description'})))
Outputs
<class '__main__.B'>
And your tests:
x = A.from_json({'name': 'foo'})
z = B.from_json({ 'name': 'thor', 'description': 'god of thunder'})
print(type(x) == A)
print(type(z) == B)
Outputs
True
True
Using classmethod is actually the recommended way in the official Python docs to create alternative "constructors" (which is what from_json essentially is). Otherwise, you don't have any access to the correct class (as you found out).
This works because (quoted from the docs):
If a class method is called for a derived class, the derived class
object is passed as the implied first argument.
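As a small illustration of that quoted behaviour (the names Base, Child, and create are made up for the example), the cls received by a classmethod is whichever class it was called on, so an inherited alternative constructor builds the right type:
class Base:
    @classmethod
    def create(cls):
        print(cls)      # the class the method was called on, not necessarily Base
        return cls()

class Child(Base):
    pass

Base.create()   # <class '__main__.Base'>
Child.create()  # <class '__main__.Child'>, even though create() is defined on Base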

How to extend base class to child class in python and printing child class value?

When I try to extend the base class to a child class, it doesn't work properly.
It shows this error:
b1 = B("adnan", 25, "male")
TypeError: object() takes no parameters
Here is my code:
class A:
    def letter(self, name, age):
        self.name = name
        self.age = age

class B(A):
    def farhan(self, gender):
        self.gender = gender

b1 = B("adnan", 25, "male")
print(b1.name, b1.age, b1.gender)
None of your classes have an __init__ method, which is used to initialise the class. When you do: B("adnan",25,"male"), it's translated to a call to B.__init__:
B.__init__(<object of type B>, "adnan", 25, "male")
The default implementation of __init__, supplied by the object base class, takes no parameters, which is exactly what the error is saying. A inherits from object (issubclass(A, object) == True), so its default __init__ method is the same as that of object.
You can try this:
class A:
def __init__(self, name, age):
self.name=name
self.age=age
class B(A):
def __init__(self, name, age, gender):
super().__init__(name, age) # initialise parent
self.gender = gender
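With that change, the call from the question works as intended:
b1 = B("adnan", 25, "male")
print(b1.name, b1.age, b1.gender)  # adnan 25 male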
When you write something like
b1 = B("adnan", 25, "male")
You are creating a new instance of the class B. When you do that, you're calling the __init__ method of that class. A possible solution would be along the lines of:
class B(A):
    def __init__(self, name, age, gender):
        self.name = name
        self.age = age
        self.gender = gender
You need to brush up your Python OOP skills! A nice source is https://realpython.com/python3-object-oriented-programming/

Python Is it ok that an attribute only exists in child/concrete classes [duplicate]

What's the best practice to define an abstract instance attribute, but not as a property?
I would like to write something like:
class AbstractFoo(metaclass=ABCMeta):

    @property
    @abstractmethod
    def bar(self):
        pass

class Foo(AbstractFoo):
    def __init__(self):
        self.bar = 3
Instead of:
class Foo(AbstractFoo):
    def __init__(self):
        self._bar = 3

    @property
    def bar(self):
        return self._bar

    @bar.setter
    def setbar(self, bar):
        self._bar = bar

    @bar.deleter
    def delbar(self):
        del self._bar
Properties are handy, but for a simple attribute requiring no computation they are overkill. This is especially important for abstract classes which will be subclassed and implemented by the user (I don't want to force someone to use @property when they could just have written self.foo = foo in __init__).
The Abstract attributes in Python question proposes, as its only answer, to use @property and @abstractmethod: it doesn't answer my question.
The ActiveState recipe for an abstract class attribute via AbstractAttribute may be the right way, but I am not sure. It also only works with class attributes and not instance attributes.
A possibly slightly better solution compared to the accepted answer:
from better_abc import ABCMeta, abstract_attribute  # see below

class AbstractFoo(metaclass=ABCMeta):

    @abstract_attribute
    def bar(self):
        pass

class Foo(AbstractFoo):
    def __init__(self):
        self.bar = 3

class BadFoo(AbstractFoo):
    def __init__(self):
        pass
It will behave like this:
Foo()     # ok
BadFoo()  # will raise: NotImplementedError: Can't instantiate abstract class BadFoo
          # with abstract attributes: bar
This answer uses the same approach as the accepted answer, but integrates well with the built-in ABC and does not require the boilerplate of check_bar() helpers.
Here is the better_abc.py content:
from abc import ABCMeta as NativeABCMeta

class DummyAttribute:
    pass

def abstract_attribute(obj=None):
    if obj is None:
        obj = DummyAttribute()
    obj.__is_abstract_attribute__ = True
    return obj

class ABCMeta(NativeABCMeta):

    def __call__(cls, *args, **kwargs):
        instance = NativeABCMeta.__call__(cls, *args, **kwargs)
        abstract_attributes = {
            name
            for name in dir(instance)
            if getattr(getattr(instance, name), '__is_abstract_attribute__', False)
        }
        if abstract_attributes:
            raise NotImplementedError(
                "Can't instantiate abstract class {} with"
                " abstract attributes: {}".format(
                    cls.__name__,
                    ', '.join(abstract_attributes)
                )
            )
        return instance
The nice thing is that you can do:
class AbstractFoo(metaclass=ABCMeta):
    bar = abstract_attribute()
and it will work same as above.
Also one can use:
class ABC(metaclass=ABCMeta):
    pass
to define custom ABC helper. PS. I consider this code to be CC0.
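For example, assuming the ABC helper and abstract_attribute defined above, subclasses no longer need to spell out the metaclass:
class AbstractFoo(ABC):        # no metaclass=ABCMeta needed here
    bar = abstract_attribute()

class Foo(AbstractFoo):
    def __init__(self):
        self.bar = 3

Foo()  # ok; leaving out the assignment would raise NotImplementedError on instantiation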
This could be improved by using an AST parser to raise earlier (at class declaration) by scanning the __init__ code, but it seems to be overkill for now (unless someone is willing to implement it).
2021: typing support
You can use:
from typing import cast, Any, Callable, TypeVar
R = TypeVar('R')
def abstract_attribute(obj: Callable[[Any], R] = None) -> R:
_obj = cast(Any, obj)
if obj is None:
_obj = DummyAttribute()
_obj.__is_abstract_attribute__ = True
return cast(R, _obj)
which will let mypy highlight some typing issues
class AbstractFooTyped(metaclass=ABCMeta):

    @abstract_attribute
    def bar(self) -> int:
        pass

class FooTyped(AbstractFooTyped):
    def __init__(self):
        # skipping assignment (which is required!) to demonstrate
        # that it works independent of when the assignment is made
        pass

f_typed = FooTyped()
_ = f_typed.bar + 'test'  # Mypy: Unsupported operand types for + ("int" and "str")

FooTyped.bar = 'test'     # Mypy: Incompatible types in assignment (expression has type "str", variable has type "int")
FooTyped.bar + 'test'     # Mypy: Unsupported operand types for + ("int" and "str")
and for the shorthand notation, as suggested by @SMiller in the comments:
class AbstractFooTypedShorthand(metaclass=ABCMeta):
    bar: int = abstract_attribute()

AbstractFooTypedShorthand.bar += 'test'  # Mypy: Unsupported operand types for + ("int" and "str")
Just because you define it as an abstractproperty on the abstract base class doesn't mean you have to make a property on the subclass.
e.g. you can:
In [1]: from abc import ABCMeta, abstractproperty

In [2]: class X(metaclass=ABCMeta):
   ...:     @abstractproperty
   ...:     def required(self):
   ...:         raise NotImplementedError
   ...:

In [3]: class Y(X):
   ...:     required = True
   ...:

In [4]: Y()
Out[4]: <__main__.Y at 0x10ae0d390>
If you want to initialise the value in __init__ you can do this:
In [5]: class Z(X):
   ...:     required = None
   ...:     def __init__(self, value):
   ...:         self.required = value
   ...:

In [6]: Z(value=3)
Out[6]: <__main__.Z at 0x10ae15a20>
Since Python 3.3 abstractproperty is deprecated. So Python 3 users should use the following instead:
from abc import ABCMeta, abstractmethod

class X(metaclass=ABCMeta):

    @property
    @abstractmethod
    def required(self):
        raise NotImplementedError
If you really want to enforce that a subclass define a given attribute, you can use metaclasses:
class AbstractFooMeta(type):

    def __call__(cls, *args, **kwargs):
        """Called when you call Foo(*args, **kwargs)."""
        obj = type.__call__(cls, *args, **kwargs)
        obj.check_bar()
        return obj

class AbstractFoo(object):
    __metaclass__ = AbstractFooMeta
    bar = None

    def check_bar(self):
        if self.bar is None:
            raise NotImplementedError('Subclasses must define bar')

class GoodFoo(AbstractFoo):
    def __init__(self):
        self.bar = 3

class BadFoo(AbstractFoo):
    def __init__(self):
        pass
Basically, the metaclass redefines __call__ to make sure check_bar is called after __init__ on an instance.
GoodFoo()  # ok
BadFoo()   # raises NotImplementedError
As Anentropic said, you don't have to implement an abstractproperty as another property.
However, one thing all answers seem to neglect is Python's member slots (the __slots__ class attribute). Users of your ABCs required to implement abstract properties could simply define them within __slots__ if all that's needed is a data attribute.
So with something like,
class AbstractFoo(abc.ABC):
    __slots__ = ()

    bar = abc.abstractproperty()
Users can define sub-classes simply like,
class Foo(AbstractFoo):
    __slots__ = 'bar',  # the only requirement

    # define Foo as desired
    def __init__(self):
        self.bar = ...
Here, Foo.bar behaves like a regular instance attribute, which it is, just implemented differently. This is simple, efficient, and avoids the @property boilerplate that you described.
This works whether or not the ABC defines __slots__ in its class body. However, going with __slots__ all the way not only saves memory and provides faster attribute accesses but also gives a meaningful descriptor instead of having intermediates (e.g. bar = None or similar) in sub-classes.1
A few answers suggest doing the "abstract" attribute check after instantiation (i.e. at the meta-class __call__() method) but I find that not only wasteful but also potentially inefficient as the initialization step could be a time-consuming one.
In short, what's required for sub-classes of ABCs is to override the relevant descriptor (be it a property or a method), it doesn't matter how, and documenting to your users that it's possible to use __slots__ as implementation for abstract properties seems to me as the more adequate approach.
1 In any case, at the very least, ABCs should always define an empty __slots__ class attribute because otherwise sub-classes are forced to have __dict__ (dynamic attribute access) and __weakref__ (weak reference support) when instantiated. See the abc or collections.abc modules for examples of this being the case within the standard library.
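A small sketch of that footnote's point, using only the standard abc module (the class names here are made up): if the ABC omits the empty __slots__, its subclasses end up with a per-instance __dict__ even when they define __slots__ themselves.
from abc import ABCMeta

class SlottedABC(metaclass=ABCMeta):
    __slots__ = ()            # empty slots: subclasses can stay dict-free

class UnslottedABC(metaclass=ABCMeta):
    pass                      # no __slots__: every instance gets a __dict__

class WithSlots(SlottedABC):
    __slots__ = ('bar',)

class LeakyWithSlots(UnslottedABC):
    __slots__ = ('bar',)

print(hasattr(WithSlots(), '__dict__'))       # False
print(hasattr(LeakyWithSlots(), '__dict__'))  # True, despite defining __slots__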
The problem isn't what, but when:
from abc import ABCMeta, abstractmethod

class AbstractFoo(metaclass=ABCMeta):

    @abstractmethod
    def bar():
        pass

class Foo(AbstractFoo):
    bar = object()
isinstance(Foo(), AbstractFoo)
#>>> True
It doesn't matter that bar isn't a method! The problem is that __subclasshook__, the method of doing the check, is a classmethod, so only cares whether the class, not the instance, has the attribute.
I suggest you just don't force this, as it's a hard problem. The alternative is forcing them to predefine the attribute, but that just leaves around dummy attributes that just silence errors.
I've searched around for this for a while but didn't see anything I like. As you probably know, if you do:
class AbstractFoo(object):

    @property
    def bar(self):
        raise NotImplementedError(
            "Subclasses of AbstractFoo must set an instance attribute "
            "self._bar in its __init__ method")

class Foo(AbstractFoo):
    def __init__(self):
        self.bar = "bar"

f = Foo()
You get an AttributeError: can't set attribute which is annoying.
To get around this you can do:
class AbstractFoo(object):

    @property
    def bar(self):
        try:
            return self._bar
        except AttributeError:
            raise NotImplementedError(
                "Subclasses of AbstractFoo must set an instance attribute "
                "self._bar in its __init__ method")

class OkFoo(AbstractFoo):
    def __init__(self):
        self._bar = 3

class BadFoo(AbstractFoo):
    pass

a = OkFoo()
b = BadFoo()
print a.bar
print b.bar # raises a NotImplementedError
This avoids the AttributeError: can't set attribute, but if you just leave off the abstract property altogether:
class AbstractFoo(object):
    pass

class Foo(AbstractFoo):
    pass

f = Foo()
f.bar
You get an AttributeError: 'Foo' object has no attribute 'bar', which is arguably almost as good as the NotImplementedError. So really my solution is just trading one error message for another, and you have to use self._bar rather than self.bar in __init__.
Following https://docs.python.org/2/library/abc.html you could do something like this in Python 2.7:
from abc import ABCMeta, abstractproperty

class Test(object):
    __metaclass__ = ABCMeta

    @abstractproperty
    def test(self): yield None

    def get_test(self):
        return self.test

class TestChild(Test):
    test = None

    def __init__(self, var):
        self.test = var

a = TestChild('test')
print(a.get_test())

Set variable as type of class

I am trying to figure out how I can pass a variable as the declaration type (object) for a class in Python 3.
Example:
# class definition
class TestClass(Document):
    test = IntField()

me = MongoEngine(app)
testInstance = TestClass(me.Document)  # How do I pass the Document variable?
I tried passing an instance of the MongoEngine variable to TestClass, but this isn't working properly.
I think you need to structure your class slightly differently. Don't put Document in the class definition as if TestClass were a subclass of Document. Instead, declare the class as standard (object), and define an __init__ where you can pass a variable which can be used by the instance of the class after initialisation:
class TestClass(object):
    def __init__(self, my_document):
        self.document = my_document
        # at this point the self.document variable
        # is the same as the variable passed
        # when initiating the instance of the class

    def show_document(self):
        # do something with your document
        print(self.document)

me = MongoEngine(app)

# this will call __init__() passing the variable
test_instance = TestClass(me.Document)

# now do something with the class instance
test_instance.show_document()
[EDIT based on comment]
OP's comment:
Looking at the type(test_instance), Its not the same as a
MongoEngine.Document. I am hoping to create a class of type 'Document'
and pass in an instance of that type?
You can create classes which take a parent class in the class definition. As I do not know MongoEngine, I will make an example with list.
A class defined as follows will behave just like a list, but if you do a type() it will come back as MyList:
class MyList(list):
    def __init__(self, *args, **kwargs):
        super(MyList, self).__init__(*args, **kwargs)

    def my_extra_function(self):
        print('hello world')
You can easily see this when using this class. First, look at it as a list:
my_instance = MyList([1, 2, 3])
print(my_instance)
print(my_instance[::-1])
This will behave as if it were a list.
But when you do a type(), it will not return the same as list:
print(type(list))
print(type(list()))
print(type(MyList()))
print(type(my_instance))
output:
<class 'type'>
<class 'list'>
<class '__main__.MyList'>
<class '__main__.MyList'>
So even when you create a class with MongoEngine.Document as the parent, type() will still show your own defined class.
class MyClass(MongoEngine.Document):
    def __init__(self, *args, **kwargs):
        super(MyClass, self).__init__(*args, **kwargs)
my_instance = MyClass('something')
If you do a type(my_instance) it will return your custom class, and not the parent object type.
Not sure how MongoEngine works, and if you can actually do something like this, so YMMV.
You can change the name type() returns by setting self.__class__ in the __init__() of my example class, like this:
class MyList(list):
    def __init__(self, *args, **kwargs):
        super(MyList, self).__init__(*args, **kwargs)
        self.__class__ = type('list', (list,), {})

    def my_extra_function(self):
        print('hello world', self)
my_instance = MyList([1, 2, 3])
print(type(list))
print(type(list()))
print(type(MyList()))
print(type(my_instance))
output:
<class 'type'>
<class 'list'>
<class '__main__.list'>
<class '__main__.list'>
If this trick works for MongoEngine.Document I do not know.

dynamic class inheritance using super

I'm trying to dynamically create a class using type() and assign an __init__ constructor which calls super().__init__(...); however, when super() gets called I receive the following error:
TypeError: super(type, obj): obj must be an instance or subtype of type
Here is my code:
class Item():
    def __init__(self, name, description, cost, **kwargs):
        self.name = name
        self.description = description
        self.cost = cost
        self.kwargs = kwargs

class ItemBase(Item):
    def __init__(self, name, description, cost):
        super().__init__(name, description, cost)

def __constructor__(self, n, d, c):
    super().__init__(name=n, description=d, cost=c)

item = type('Item1', (ItemBase,), {'__init__': __constructor__})
item_instance = item('MyName', 'MyDescription', 'MyCost')
Why doesn't super() inside the __constructor__ method understand the object parameter, and how do I fix it?
Solution 1: Using cls = type('ClassName', ...)
Note that the solution of sadmicrowave creates an infinite loop if the dynamically created class gets inherited, as self.__class__ will correspond to the child class.
An alternative way which does not have this issue is to assign __init__ after creating the class, so that the class can be linked explicitly through a closure. Example:
# Base class
class A():
    def __init__(self):
        print('A')

# Dynamically created class
B = type('B', (A,), {})

def __init__(self):
    print('B')
    super(B, self).__init__()

B.__init__ = __init__

# Child class
class C(B):
    def __init__(self):
        print('C')
        super().__init__()

C()  # print C, B, A
Solution 2: Using MyClass.__name__ = 'ClassName'
An alternative way to dynamically create a class is to define the class inside a function, then reassign the __name__ and __qualname__ attributes:
class A:
    def __init__(self):
        print(A.__name__)

def make_class(name, base):
    class Child(base):
        def __init__(self):
            print(Child.__name__)
            super().__init__()

    Child.__name__ = name
    Child.__qualname__ = name
    return Child

B = make_class('B', A)

class C(B):
    def __init__(self):
        print(C.__name__)
        super().__init__()

C()  # Display C B A
Here is how I solved the issue. I use type() to dynamically create the class, with variable references, as follows:
def __constructor__(self, n, d, c, h):
    # initialize super of class type
    super(self.__class__, self).__init__(name=n, description=d, cost=c, hp=h)

# create the object class dynamically, utilizing __constructor__ for the __init__ method
item = type(item_name, (eval("{}.{}".format(name, row[1].value)),), {'__init__': __constructor__})

# add the new object to the global _objects object to be used throughout the world
self._objects[item_name] = item(row[0].value, row[2].value, row[3].value, row[4].value)
There may be a better way to accomplish this, but I needed a fix and this is what I came up with... use it if you can.
