How can I make a class that takes a func parameter on initialization, where func is any callable? An instance of this class should then itself be callable: calling it should invoke the callable passed at init, forwarding any arguments.
def test_callable_obj(self):
    callable_obj = homework_lecture1.CallableInstances(lambda x: x + 8)
    self.assertEqual(callable_obj(10), 18)
class CallableInstances:
You need to create a class whose __init__ method accepts a function, and whose __call__ simply calls that function.
def func(*args, **kwargs):
    print('called func with args', args, 'and kwargs', kwargs)

class Foo:
    def __init__(self, f):
        self._f = f

    def __call__(self, *args, **kwargs):
        return self._f(*args, **kwargs)

foo = Foo(func)
foo(1, 2, c=3)
Outputs
called func with args (1, 2) and kwargs {'c': 3}
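Applied to the original question, a minimal `CallableInstances` that passes the test shown above could look like this (a sketch; only the class name and the test come from the question):

```python
class CallableInstances:
    """Wraps any callable so that instances of this class are themselves callable."""

    def __init__(self, func):
        self._func = func

    def __call__(self, *args, **kwargs):
        # forward all positional and keyword arguments to the wrapped callable
        return self._func(*args, **kwargs)

callable_obj = CallableInstances(lambda x: x + 8)
print(callable_obj(10))  # → 18
```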
I'm a beginner. I've defined the following classes. They all look the same to me and I'm confused about what the difference is.
My purpose is to define a base class. Why use object? I think B is what I want.
Do I have to use the super() function and return its result? What do cls, *args, and **kwargs stand for?
python code :
class A(object):
    def __new__(cls, *args, **kwargs):
        print("A.__new__called")
        return super(A, cls).__new__(cls, *args, **kwargs)

class B:
    def __new__(cls, *args, **kwargs):
        print("B.__new__called")
        return super(B, cls).__new__(cls, *args, **kwargs)

class C():
    def __new__(cls, *args, **kwargs):
        print("C.__new__called")
        return super(C, cls).__new__(cls, *args, **kwargs)

class D():
    def __new__(cls, *args, **kwargs):
        print("D.__new__called")
        return super(D, cls).__new__(cls)

class E():
    def __new__(cls):
        print("E.__new__called")
        return super(E, cls).__new__(cls)

class F():
    print("F.__new__called")
a = A()
b = B()
c = C()
d = D()
e = E()
f = F()
Result:
F.__new__called
A.__new__called
B.__new__called
C.__new__called
D.__new__called
E.__new__called
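One detail worth noting about classes A through C above (a self-contained sketch of standard Python 3 behavior): forwarding extra arguments to object.__new__ only works as long as no arguments are actually passed. When __init__ is not overridden and extra arguments do arrive, object.__new__ raises a TypeError, which is why D and E drop the arguments:

```python
class A:
    def __new__(cls, *args, **kwargs):
        print("A.__new__called")
        # passing *args along is harmless only while args is empty
        return super().__new__(cls, *args, **kwargs)

a = A()  # fine: no extra arguments reach object.__new__

try:
    A(1)  # object.__new__ rejects extra args because A.__init__ is not overridden
except TypeError as e:
    print("TypeError:", e)
```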
A few comments:
object is inherited by default in Python 3.x, so the explicit inheritance that was required in Python 2.x is unnecessary.
super() also doesn't need any arguments in Python 3.x.
class A:
    def __init__(self):
        print("Here in init")

    def __new__(cls, *args, **kargs):
        print("Here in new")
        return super().__new__(cls)
__new__ is implicitly a static method that receives the class as its first argument, and it is called before the object is instantiated. For example, if you create the object above, the print statements appear in this order: first what is in __new__, then what is in __init__. So __new__ is used if you want to do something before an object is instantiated. super() delegates to the inherited base class (object by default) to actually create the instance, and __new__ must return that instance, otherwise __init__ is never called. *args and **kwargs are just the positional and keyword arguments passed when you create the object.
obj = A()
It will print following:
Here in new
Here in init
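To see why the return in __new__ matters, here is a short sketch: if __new__ does not return an instance of the class, __init__ is silently skipped and the constructor call evaluates to None.

```python
class NoReturn:
    def __new__(cls):
        print("Here in new")
        super().__new__(cls)  # instance created, but NOT returned

    def __init__(self):
        print("Here in init")  # never runs: __new__ returned None

obj = NoReturn()  # prints only "Here in new"
print(obj)        # → None
```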
The below is an attempt to implement Singleton in python 3 but it doesn't appear to work. When I instantiate, the _instance is always None and both instances (a and b) have different addresses in memory - why?
class Singleton(object):
    _instance = None

    def __call__(self, *args, **kwargs):
        if self._instance is None:
            self._instance = super().__call__(*args, **kwargs)
        return self._instance

    def __init__(self, *args, **kwargs):
        print(self._instance, self)
a = Singleton()
b = Singleton()
The output is:
None <__main__.Singleton object at 0x7f382956c190>
None <__main__.Singleton object at 0x7f382956c410>
The __call__ method is not what you think it is. It is meant to make instances of classes callable like functions:
class A:
    def __call__(self):
        print("called")

a = A()  # prints nothing
a()      # prints "called"
What you are looking for is the __new__ method:
Called to create a new instance of class cls.
You can write the singleton like this (very similar to what you wrote):
class Singleton(object):
    _instance = None

    def __new__(cls, *args, **kwargs):
        if cls._instance is None:
            cls._instance = super(Singleton, cls).__new__(cls)
        return cls._instance

    def __init__(self, *args, **kwargs):
        print(self._instance, self)

a = Singleton()
b = Singleton()
The output is now:
<__main__.Singleton object at 0x7f149bf3cc88> <__main__.Singleton object at 0x7f149bf3cc88>
<__main__.Singleton object at 0x7f149bf3cc88> <__main__.Singleton object at 0x7f149bf3cc88>
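One caveat worth knowing with this pattern (a self-contained sketch, certain Python 3 behavior): even with the __new__-based singleton, __init__ still runs on every Singleton() call, because Python always calls __init__ on whatever __new__ returns when it is an instance of the class:

```python
class Singleton:
    _instance = None

    def __new__(cls, *args, **kwargs):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

    def __init__(self, value=None):
        # runs again on every construction, even for the cached instance
        self.value = value

a = Singleton(1)
b = Singleton(2)
print(a is b)    # → True: one shared instance
print(a.value)   # → 2: the second __init__ overwrote the first
```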
I am making a simple decorator that outputs the returned value in uppercase. This is the code I tried:
class UpperDecorator:
    def __init__(self, func, msg):
        self.func = func
        self.msg = msg

    def __call__(self):
        res = self.func(self.msg)
        return res.upper()

@UpperDecorator
def message_app(msg):
    return msg

res = message_app('Hi')
print(res)
upon running the code I get this error:
TypeError: __init__() missing 1 required positional argument: 'msg'
then I modified the constructor a bit (def __init__(self, func, msg=None):) and get this error:
TypeError: __call__() takes 1 positional argument but 2 were given
please help me solve it. Thank you
The parameters of the decorated function are passed to the __call__ method, not to the constructor __init__:
class UpperDecorator:
    def __init__(self, func):
        self.func = func

    def __call__(self, *args, **kwargs):
        res = self.func(*args, **kwargs)
        return res.upper()

@UpperDecorator
def message_app(msg):
    return msg

res = message_app('Hi')
print(res)
Prints:
HI
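If the intent really was to pass an extra argument to the decorator itself (as the original __init__(self, func, msg) suggests), the usual pattern is a decorator factory: the class takes the decorator's own argument first, and receives the function in __call__. A sketch; the prefix parameter is made up for illustration:

```python
class UpperDecorator:
    """Decorator factory: takes its own argument first, then the function."""

    def __init__(self, prefix):
        self.prefix = prefix  # hypothetical decorator argument

    def __call__(self, func):
        def wrapper(*args, **kwargs):
            # prepend the prefix, then uppercase the result
            return (self.prefix + func(*args, **kwargs)).upper()
        return wrapper

@UpperDecorator('say: ')
def message_app(msg):
    return msg

print(message_app('Hi'))  # → SAY: HI
```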
I have a superclass that has a retrieve() method, and its subclasses each implement their own retrieve() method. I'd like every retrieve() method to be decorated so it caches the return value when it receives the same args, without having to decorate the method in every subclass.
Decorators don't seem to be inherited. I could probably call the superclass's method, which would in turn set the cache, but currently my superclass raises NotImplementedError, which I like.
import json
import operator
from cachetools import cachedmethod, TTLCache

def simple_decorator(func):
    def wrapper(*args, **kwargs):
        # check cache
        print("simple decorator")
        func(*args, **kwargs)
        # set cache
    return wrapper

class AbstractInput(object):
    def __init__(self, cacheparams={'maxsize': 10, 'ttl': 300}):
        self.cache = TTLCache(**cacheparams)
        super().__init__()

    @simple_decorator
    def retrieve(self, params):
        print("AbstractInput retrieve")
        raise NotImplementedError("AbstractInput inheritors must implement retrieve() method")

class JsonInput(AbstractInput):
    def retrieve(self, params):
        print("JsonInput retrieve")
        return json.dumps(params)

class SillyJsonInput(JsonInput):
    def retrieve(self, params):
        print("SillyJsonInput retrieve")
        params["silly"] = True
        return json.dumps(params)
Actual results:
>>> ai.retrieve(params)
ai.retrieve(params)
simple decorator
AbstractInput retrieve
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<string>", line 8, in wrapper
File "<string>", line 22, in retrieve
NotImplementedError: AbstractInput inheritors must implement retrieve() method
>>> ji.retrieve(params)
ji.retrieve(params)
JsonInput retrieve
'{"happy": "go lucky", "angry": "as a wasp"}'
Desired results:
>>> ai.retrieve(params)
ai.retrieve(params)
simple decorator
AbstractInput retrieve
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<string>", line 8, in wrapper
File "<string>", line 22, in retrieve
NotImplementedError: AbstractInput inheritors must implement retrieve() method
>>> ji.retrieve(params)
simple decorator
ji.retrieve(params)
JsonInput retrieve
'{"happy": "go lucky", "angry": "as a wasp"}'
Yes, using a metaclass to force a decorator onto a specific method, as you did in your own answer, is correct. With a few changes, it can be made so that the method to be decorated is not fixed - for example, an attribute set on the decorated function can be used as a "mark" that such a decorator should be forced upon overriding methods.
Besides that, since Python 3.6 there is a new class-level mechanism - the special method __init_subclass__, which has the specific objective of reducing the need for metaclasses. Metaclasses can be complicated, and if your class hierarchy needs to combine more than one metaclass, you may be in for some headache.
The __init_subclass__ method is placed on the base class, and it is called once each time a child class is created. The wrapping logic can be put there.
Basically, you can just modify your decorator to set the mark I mentioned above, and add this class to your inheritance hierarchy - it can be used as a mixin class in multiple inheritance, so it can be reused for various class trees, if needed:
def simple_decorator(func):
    def wrapper(*args, **kwargs):
        print("check cache")
        rt = func(*args, **kwargs)
        print("set cache")
        return rt
    wrapper.inherit_decorator = simple_decorator
    return wrapper

class InheritDecoratorsMixin:
    def __init_subclass__(cls, *args, **kwargs):
        super().__init_subclass__(*args, **kwargs)
        decorator_registry = getattr(cls, "_decorator_registry", {}).copy()
        cls._decorator_registry = decorator_registry
        # check for decorated objects in the mixin itself - optional:
        for name, obj in __class__.__dict__.items():
            if getattr(obj, "inherit_decorator", False) and not name in decorator_registry:
                decorator_registry[name] = obj.inherit_decorator
        # annotate newly decorated methods in the current subclass:
        for name, obj in cls.__dict__.items():
            if getattr(obj, "inherit_decorator", False) and not name in decorator_registry:
                decorator_registry[name] = obj.inherit_decorator
        # finally, decorate all methods annotated in the registry:
        for name, decorator in decorator_registry.items():
            if name in cls.__dict__ and getattr(getattr(cls, name), "inherit_decorator", None) != decorator:
                setattr(cls, name, decorator(cls.__dict__[name]))
So, that is it - each new subclass gets its own _decorator_registry attribute, which records the names of the decorated methods in all ancestors along with which decorator to apply.
If the decorator should run a single time per method call, and not be repeated when the overriding method calls super() to reach its ancestors (not the case when you are decorating for cache, since the super-methods won't be called), that gets trickier - but it can be done.
However, it is tricky to do - since the decorator instances in the superclasses are distinct from the decorator on the subclass, one way to tell them "the decorator code for this method has already run in this call chain" is to use an instance-level marker - which should be a thread-local variable if the code is to support parallelism.
All this checking results in quite some complicated boilerplate to put into what could be a simple decorator - so we can create a "decorator" for the "decorators" that we want to run a single time. In other words, decorators decorated with childmost below will run only on the "childmost" class, but not on the corresponding methods in the superclasses when they call super().
import threading

def childmost(decorator_func):
    def inheritable_decorator_that_runs_once(func):
        decorated_func = decorator_func(func)
        name = func.__name__

        def wrapper(self, *args, **kw):
            if not hasattr(self, f"_running_{name}"):
                setattr(self, f"_running_{name}", threading.local())
            running_registry = getattr(self, f"_running_{name}")
            try:
                if not getattr(running_registry, "running", False):
                    running_registry.running = True
                    rt = decorated_func(self, *args, **kw)
                else:
                    rt = func(self, *args, **kw)
            finally:
                running_registry.running = False
            return rt

        wrapper.inherit_decorator = inheritable_decorator_that_runs_once
        return wrapper
    return inheritable_decorator_that_runs_once
Example using the first listing:
class A: pass

class B(A, InheritDecoratorsMixin):
    @simple_decorator
    def method(self):
        print(__class__, "method called")

class C(B):
    def method(self):
        print(__class__, "method called")
        super().method()
And after pasting listing 1 and these A-B-C classes into the interpreter, the result is this:
In [9]: C().method()
check cache
<class '__main__.C'> method called
check cache
<class '__main__.B'> method called
set cache
set cache
(the "A" class here is entirely optional and can be left out)
Example using the second listing:
# Decorating the same decorator above:
@childmost
def simple_decorator2(func):
    def wrapper(*args, **kwargs):
        print("check cache")
        rt = func(*args, **kwargs)
        print("set cache")
        return rt
    return wrapper

class D: pass

class E(D, InheritDecoratorsMixin):
    @simple_decorator2
    def method(self):
        print(__class__, "method called")

class F(E):
    def method(self):
        print(__class__, "method called")
        super().method()
And the result:
In [19]: F().method()
check cache
<class '__main__.F'> method called
<class '__main__.E'> method called
set cache
OK, it seems that I can "decorate" a method in a superclass and have the subclasses also inherit that decoration, even if the method is overridden in the subclass, using metaclasses. In this case, I'm decorating all "retrieve" methods in AbstractInput and its subclasses with simple_decorator, using a metaclass named CacheRetrieval.
import json
from cachetools import TTLCache

def simple_decorator(func):
    def wrapper(*args, **kwargs):
        print("check cache")
        rt = func(*args, **kwargs)
        print("set cache")
        return rt
    return wrapper

class CacheRetrieval(type):
    def __new__(cls, name, bases, attr):
        # wrap the retrieve() method of every class created with this
        # metaclass, so the decoration is applied to overriding methods too
        attr["retrieve"] = simple_decorator(attr["retrieve"])
        return super(CacheRetrieval, cls).__new__(cls, name, bases, attr)

class AbstractInput(object, metaclass=CacheRetrieval):
    def __init__(self, cacheparams={'maxsize': 10, 'ttl': 300}):
        self.cache = TTLCache(**cacheparams)
        super().__init__()

    def retrieve(self, params):
        print("AbstractInput retrieve")
        raise NotImplementedError("DataInput must implement retrieve() method")

class JsonInput(AbstractInput):
    def retrieve(self, params):
        print("JsonInput retrieve")
        return json.dumps(params)

class SillyJsonInput(JsonInput):
    def retrieve(self, params):
        print("SillyJsonInput retrieve")
        params["silly"] = True
        return json.dumps(params)
I was helped by this page:
https://stackabuse.com/python-metaclasses-and-metaprogramming/
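For completeness, here is a self-contained run of the metaclass approach above (the TTLCache setup is dropped in this sketch to keep it dependency-free; only the wrapping mechanics are shown):

```python
import json

def simple_decorator(func):
    def wrapper(*args, **kwargs):
        print("check cache")
        rt = func(*args, **kwargs)
        print("set cache")
        return rt
    return wrapper

class CacheRetrieval(type):
    def __new__(cls, name, bases, attr):
        # wrap retrieve() in every class created with this metaclass
        attr["retrieve"] = simple_decorator(attr["retrieve"])
        return super().__new__(cls, name, bases, attr)

class AbstractInput(metaclass=CacheRetrieval):
    def retrieve(self, params):
        raise NotImplementedError("subclasses must implement retrieve()")

class JsonInput(AbstractInput):
    def retrieve(self, params):
        return json.dumps(params)

ji = JsonInput()
print(ji.retrieve({"happy": "go lucky"}))
# prints "check cache", then "set cache", then the JSON string,
# even though JsonInput.retrieve was never decorated by hand
```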
I implemented a Delegate class in Python 3, which wraps a function object in an object instance. It's possible to register multiple function objects on one delegate (in .NET terminology, a MulticastDelegate). Assuming all registered functions accept the same parameters, it's possible to invoke the delegate and call all functions at once.
Delegate implementation:
class Delegate:
    def __init__(self, *funcs):
        self.__invocationList__ = []
        for func in funcs:
            self.__invocationList__.append(func)

    def __iadd__(self, func):
        self.__invocationList__.append(func)
        return self

    def __isub__(self, func):
        self.__invocationList__.remove(func)
        return self

    def __call__(self, *args, **kwargs):
        if len(self.__invocationList__) == 1:
            return self.__invocationList__[0](*args, **kwargs)
        else:
            res = {}
            for func in self.__invocationList__:
                res[func] = func(*args, **kwargs)
            return res

    @property
    def isMulticast(self):
        return len(self.__invocationList__) > 1
Usage examples:
def test1():
    return 5

def test2(a, b):
    return a + b

def test3(a, b):
    return a * b + 15

delegate = Delegate(test1)
result = delegate()
print("test1: {0}".format(result))

delegate = Delegate(test2)
result = delegate(3, 8)
print("test2: {0}".format(result))

delegate += test3
results = delegate(2, 9)
print("test2: {0}".format(results[test2]))
print("test3: {0}".format(results[test3]))
I would like to implement an iterator or generator on this class, so it's possible to use the delegate in for loops. What could that look like?

# loop over every result from the delegate, called with parameters 4 and 18
for result in delegate(4, 18):
    print("function={0} result={1}".format(*result))

The iterator's __next__() method should return a tuple consisting of the function object and its return value.
What I tried so far:
class Delegate:
    # ...
    # see code from above

    def __iter__(self):
        print("Delegate.__iter__():")

        class iter:
            def __init__(self2, *args, **kwargs):
                print(str(args))
                self2.__args = args
                self2.__kwargs = kwargs
                self2.__index = 0

            def __iter__(self2):
                return self2

            def __next__(self2):
                if self2.__index == len(self.__invocationList__):
                    raise StopIteration
                func = self.__invocationList__[self2.__index]
                self2.__index += 1
                return func(*self2.__args, **self2.__kwargs)

        return iter()
Because the constructor method is already used by the Delegate creation itself, I implemented the iterator as a nested class. But unfortunately, I cannot pass the call parameters *args and **kwargs through to the iterator.
So my questions:
Is it possible and wise to implement an iterator / generator pattern for delegates?
What should I change to get it working?
I tried to implement the iterator pattern first. If that works, I would like to upgrade it to a generator - if possible :)
I'm not familiar with this, but I gave it a shot. It's not well tested, but it should help you on the way to solving your task. Here is the code:
class Delegate:
    class IterDelegate:
        def __init__(this, invocationList, *args, **kwargs):
            this.__args = args
            this.__kwargs = kwargs
            this._invocationList = invocationList

        def __iter__(this):
            this.__index = 0
            return this

        def __next__(this):
            if this.__index < len(this._invocationList):
                func = this._invocationList[this.__index]
                this.__index += 1
                return (func.__name__, func(*this.__args, **this.__kwargs))
            raise StopIteration

    def __init__(self, func):
        # note: the original `type(func) == 'list'` compared a type against a
        # string and was always False; isinstance is the correct check
        if isinstance(func, list):
            self._invocationList = func
        else:
            self._invocationList = [func]

    def __call__(self, *args, **kwargs):
        return self.IterDelegate(self._invocationList, *args, **kwargs)

    def __iadd__(self, func):
        self._invocationList.append(func)
        return self

    def __isub__(self, func):
        self._invocationList.remove(func)
        return self

def test2(a, b):
    return a + b

def test1(*args):
    return 6

delegate = Delegate(test2)
delegate += test1
results = delegate(2, 3)
for r in results:
    print("function={0} result={1}".format(*r))
This will give the results
function=test2 result=5
function=test1 result=6
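Since the asker mentioned wanting to upgrade to a generator: __call__ can simply return a generator expression, which removes the need for a hand-written iterator class entirely. A sketch under the same Delegate interface (the add/mul test functions are made up for the demo):

```python
class Delegate:
    def __init__(self, *funcs):
        self._invocationList = list(funcs)

    def __iadd__(self, func):
        self._invocationList.append(func)
        return self

    def __call__(self, *args, **kwargs):
        # generator: lazily yields (name, result) pairs, one per registered callable
        return ((func.__name__, func(*args, **kwargs))
                for func in self._invocationList)

def add(a, b):
    return a + b

def mul(a, b):
    return a * b

delegate = Delegate(add)
delegate += mul
for name, result in delegate(2, 3):
    print("function={0} result={1}".format(name, result))
# function=add result=5
# function=mul result=6
```

Each call to the delegate builds a fresh generator, so the delegate can be iterated as many times as needed.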