I have two decorators, one involves assigning a staticmethod and the other one doesn't:
def mark_statistics(func):
    new_func = staticmethod(func)
    new_func.is_stats_mark = True
    return new_func

def log_args(func):
    def wrapper(*args, **kwargs):
        print(f"""args are: {args}
kwargs are: {kwargs}""")
        return func(*args, **kwargs)
    wrapper.is_logging_mark = True
    return wrapper
class Utility:
    @mark_statistics
    @log_args
    def make_to_static(*args, **kwargs):
        # ... some processing ...
        result = "result from <make_to_static>"
        return result

    @log_args
    def normal_meth(self):
        result = "result from <normal_meth>"
        return result
u = Utility()
u.make_to_static.is_logging_mark
u.make_to_static.is_stats_mark # failed
Running this code produces the following error:
AttributeError: 'function' object has no attribute 'is_stats_mark'
As you can see, the first mark, is_logging_mark, can be accessed successfully, but that is not the case for the second mark, is_stats_mark. For some reason, that mark didn't get attached to the decorated function.
I've been scratching my head trying to figure out why. Could someone please give me a hint? Thanks.
My Python version:
python3 --version
Python 3.9.2
Issue 1:
What does the isinstance function mean?
class Singleton1(object):
    __instance = None

    def __init__(self):
        if not hasattr(Singleton1, '__instance'):
            print("__init__ method called, but no instance created")
        else:
            print("instance already created:", self.__instance)

    @classmethod
    def get_instance(cls):
        if not cls.__instance:
            cls.__instance = Singleton1()
        return cls.__instance
Initialize it:
x = Singleton1()
__init__ method called, but no instance created
Have a check with the isinstance function:
isinstance(x,Singleton1)
True
If x is not an instance, why does isinstance(x, Singleton1) say it is an instance of Singleton1?
Issue 2:
Why can't the __init__ method be called anyway?
Now replace all __instance (double underscores) with _instance (single underscore) in the class Singleton1, and replace all Singleton1 with Singleton2:
class Singleton2(object):
    _instance = None

    def __init__(self):
        if not hasattr(Singleton2, '_instance'):
            print("__init__ method called, but no instance created")
        else:
            print("instance already created:", self._instance)

    @classmethod
    def get_instance(cls):
        if not cls._instance:
            cls._instance = Singleton2()
        return cls._instance
Initialize it:
y = Singleton2()
instance already created: None
Why can't the __init__ method be called anyway in this case?
@snakecharmerb, on issue 1: why do people say this is lazy instantiation? If isinstance(x, Singleton1) is true, there is no need to call Singleton1.get_instance(), because the instance is already created during instantiation.
The hasattr check does not do what you think it does. Using Singleton2*, hasattr(Singleton2, '_instance') is always True, because the class has an attribute named _instance. You want to check the value of that attribute, so use getattr instead; then the expected output will be printed.
The isinstance checks succeed because Singleton2() will return a new instance each time - there is nothing to prevent this. You can add a __new__ method to create _instance and return it every time Singleton2() is called. Note that this will mean that _instance will always exist by the time __init__ is called.
class Singleton2:
    _instance = None

    def __new__(cls):
        if cls._instance is not None:
            return cls._instance
        instance = super().__new__(cls)
        cls._instance = instance
        return instance
* The hasattr check in Singleton1 is complicated by the name-mangling performed on __instance. In general, avoid using double-underscored variable names, except for avoiding name clashes in class hierarchies.
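For illustration, here is a minimal sketch of the getattr-based check suggested above, applied to the question's Singleton2 (only __init__ changes; get_instance is kept as in the question):
import functools  # not required; shown only to keep the sketch self-contained

class Singleton2:
    _instance = None

    def __init__(self):
        # getattr returns the current value of _instance, so this branch is
        # taken until _instance has actually been set to an instance.
        if getattr(Singleton2, '_instance') is None:
            print("__init__ method called, but no instance created")
        else:
            print("instance already created:", self._instance)

    @classmethod
    def get_instance(cls):
        if not cls._instance:
            cls._instance = Singleton2()
        return cls._instance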
I'm trying to create a ContextDecorator. Here's my code:
from contextlib import ContextDecorator
import traceback

class CmTag(ContextDecorator):
    def __init__(self, cm_tag_func):
        self.cm_tag_func = cm_tag_func

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, tb):
        if exc_type is not None:
            traceback.print_exception(exc_type, exc_value, tb)
        else:
            name = self.cm_tag_func.__name__
            print(name)

    def __call__(self, **kwargs):
        name = self.cm_tag_func.__name__
        print(name)
        print(kwargs)
        self.cm_tag_func(**kwargs)

@CmTag
def testing(**kwargs):
    pass

with testing(foo='bar') as t:
    print('a test')
I expect the output to be:
testing
{'foo':'bar'}
a test
testing
That is, it first prints the name of the function. Then it prints out kwargs as a dictionary.
Then it prints out whatever is there inside the context manager, which is 'a test', in this case. Finally upon exit, it prints out the name of the function again.
Instead, it says:
testing
{'foo': 'bar'}
Traceback (most recent call last):
File "/workspace/sierra/src/sierra/test.py", line 32, in <module>
with testing(foo='bar') as t:
AttributeError: __enter__
I saw other solutions saying __enter__ was not defined. But I've done it here.
How do I rectify this? Thanks.
In the line
with testing(foo='bar') as t:
    print('a test')
since testing is an instance of CmTag, calling testing(foo='bar') invokes __call__; however, you return None from that method instead of returning self.
Adding return self at the end of that method would fix it.
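A minimal sketch of the corrected method (everything else in CmTag stays exactly as in the question):

    def __call__(self, **kwargs):
        name = self.cm_tag_func.__name__
        print(name)
        print(kwargs)
        self.cm_tag_func(**kwargs)
        # return the context manager so the with statement can call __enter__/__exit__
        return self

With that change, the with block runs and the output matches what the question expects.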
I have a superclass that has a retrieve() method, and its subclasses each implement their own retrieve() method. I'd like every retrieve() method to be decorated so that it caches the return value when it receives the same args, without having to decorate the method in every subclass.
Decorators don't seem to be inherited. I could probably call the superclass's method, which would in turn set the cache, but currently my superclass raises a NotImplementedError, which I like.
import json
import operator
from cachetools import cachedmethod, TTLCache

def simple_decorator(func):
    def wrapper(*args, **kwargs):
        # check cache
        print("simple decorator")
        func(*args, **kwargs)
        # set cache
    return wrapper

class AbstractInput(object):
    def __init__(self, cacheparams={'maxsize': 10, 'ttl': 300}):
        self.cache = TTLCache(**cacheparams)
        super().__init__()

    @simple_decorator
    def retrieve(self, params):
        print("AbstractInput retrieve")
        raise NotImplementedError("AbstractInput inheritors must implement retrieve() method")

class JsonInput(AbstractInput):
    def retrieve(self, params):
        print("JsonInput retrieve")
        return json.dumps(params)

class SillyJsonInput(JsonInput):
    def retrieve(self, params):
        print("SillyJsonInput retrieve")
        params["silly"] = True
        return json.dumps(params)
Actual results:
>>> ai.retrieve(params)
ai.retrieve(params)
simple decorator
AbstractInput retrieve
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<string>", line 8, in wrapper
File "<string>", line 22, in retrieve
NotImplementedError: AbstractInput inheritors must implement retrieve() method
>>> ji.retrieve(params)
ji.retrieve(params)
JsonInput retrieve
'{"happy": "go lucky", "angry": "as a wasp"}'
Desired results:
>>> ai.retrieve(params)
ai.retrieve(params)
simple decorator
AbstractInput retrieve
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<string>", line 8, in wrapper
File "<string>", line 22, in retrieve
NotImplementedError: AbstractInput inheritors must implement retrieve() method
>>> ji.retrieve(params)
simple decorator
ji.retrieve(params)
JsonInput retrieve
'{"happy": "go lucky", "angry": "as a wasp"}'
Yes, the use of a metaclass to force a decorator onto a specific method, as you put in your own answer, is correct. With a few changes, it can be made so that the method to be decorated is not fixed - for example, an attribute set on the decorated function can be used as a "mark" that such a decorator should be forced upon overriding methods.
Besides that, since Python 3.6 there is a new class-level mechanism - the special method __init_subclass__ - whose specific purpose is to reduce the need for metaclasses. Metaclasses can be complicated, and if your class hierarchy needs to combine more than one metaclass, you may be in for some headache.
The __init_subclass__ method is placed on the base class, and it is called once each time a child class is created. The wrapping logic can be put there.
Basically, you can just modify your decorator to add the mark I mentioned above, and include this class in your inheritance hierarchy - it can be used as a mixin class in multiple inheritance, so it can be reused for various class trees, if needed:
def simple_decorator(func):
    def wrapper(*args, **kwargs):
        print("check cache")
        rt = func(*args, **kwargs)
        print("set cache")
        return rt
    wrapper.inherit_decorator = simple_decorator
    return wrapper
class InheritDecoratorsMixin:
    def __init_subclass__(cls, *args, **kwargs):
        super().__init_subclass__(*args, **kwargs)
        decorator_registry = getattr(cls, "_decorator_registry", {}).copy()
        cls._decorator_registry = decorator_registry
        # Check for decorated objects in the mixin itself - optional:
        for name, obj in __class__.__dict__.items():
            if getattr(obj, "inherit_decorator", False) and name not in decorator_registry:
                decorator_registry[name] = obj.inherit_decorator
        # Annotate newly decorated methods in the current subclass:
        for name, obj in cls.__dict__.items():
            if getattr(obj, "inherit_decorator", False) and name not in decorator_registry:
                decorator_registry[name] = obj.inherit_decorator
        # Finally, decorate all methods annotated in the registry:
        for name, decorator in decorator_registry.items():
            if name in cls.__dict__ and getattr(getattr(cls, name), "inherit_decorator", None) != decorator:
                setattr(cls, name, decorator(cls.__dict__[name]))
So, that is it - each new subclass will have its own _decorator_registry attribute, which records the names of the decorated methods in all ancestors along with the decorator to apply.
If the decorator should run only a single time for the method, and not be repeated when the overriding method performs the super() call to its ancestors (not the case when you are decorating for caching, since the super-methods won't be called), that gets trickier - but it can be done.
It is tricky because the decorator instances in the superclasses are different instances from the decorator on the subclass. One way to tell them that the decorator code for this method has already run in this call chain is to use an instance-level marker, which should be a thread-local variable if the code is to support parallelism.
All this checking would add quite a bit of boilerplate to what could be a simple decorator, so we can create a "decorator" for the "decorators" that we want to run a single time. In other words, decorators decorated with childmost below will run only on the "childmost" class, but not on the corresponding methods in the superclasses when they call super().
import threading

def childmost(decorator_func):
    def inheritable_decorator_that_runs_once(func):
        decorated_func = decorator_func(func)
        name = func.__name__
        def wrapper(self, *args, **kw):
            if not hasattr(self, f"_running_{name}"):
                setattr(self, f"_running_{name}", threading.local())
            running_registry = getattr(self, f"_running_{name}")
            try:
                if not getattr(running_registry, "running", False):
                    running_registry.running = True
                    rt = decorated_func(self, *args, **kw)
                else:
                    rt = func(self, *args, **kw)
            finally:
                running_registry.running = False
            return rt
        wrapper.inherit_decorator = inheritable_decorator_that_runs_once
        return wrapper
    return inheritable_decorator_that_runs_once
Example using the first listing:
class A: pass

class B(A, InheritDecoratorsMixin):
    @simple_decorator
    def method(self):
        print(__class__, "method called")

class C(B):
    def method(self):
        print(__class__, "method called")
        super().method()
And after pasting listing 1 and these A-B-C classes into the interpreter, the result is this:
In [9]: C().method()
check cache
<class '__main__.C'> method called
check cache
<class '__main__.B'> method called
set cache
set cache
(the "A" class here is entirely optional and can be left out)
Example using the second listing:
# Decorating the same decorator above:
@childmost
def simple_decorator2(func):
    def wrapper(*args, **kwargs):
        print("check cache")
        rt = func(*args, **kwargs)
        print("set cache")
        return rt
    return wrapper
class D: pass

class E(D, InheritDecoratorsMixin):
    @simple_decorator2
    def method(self):
        print(__class__, "method called")

class F(E):
    def method(self):
        print(__class__, "method called")
        super().method()
And the result:
In [19]: F().method()
check cache
<class '__main__.F'> method called
<class '__main__.E'> method called
set cache
OK, it seems that I can "decorate" a method in a superclass and have the subclasses also inherit that decoration, even if the method is overridden in the subclass, by using a metaclass. In this case, I'm decorating all retrieve() methods in AbstractInput and its subclasses with simple_decorator, using a metaclass named CacheRetrieval.
import json
from cachetools import TTLCache

def simple_decorator(func):
    def wrapper(*args, **kwargs):
        print("check cache")
        rt = func(*args, **kwargs)
        print("set cache")
        return rt
    return wrapper

class CacheRetrieval(type):
    def __new__(cls, name, bases, attr):
        # Wrap the retrieve() method of every class created with this
        # metaclass, so subclasses inherit the decoration automatically
        attr["retrieve"] = simple_decorator(attr["retrieve"])
        return super(CacheRetrieval, cls).__new__(cls, name, bases, attr)

class AbstractInput(object, metaclass=CacheRetrieval):
    def __init__(self, cacheparams={'maxsize': 10, 'ttl': 300}):
        self.cache = TTLCache(**cacheparams)
        super().__init__()

    def retrieve(self, params):
        print("AbstractInput retrieve")
        raise NotImplementedError("DataInput must implement retrieve() method")

class JsonInput(AbstractInput):
    def retrieve(self, params):
        print("JsonInput retrieve")
        return json.dumps(params)

class SillyJsonInput(JsonInput):
    def retrieve(self, params):
        print("SillyJsonInput retrieve")
        params["silly"] = True
        return json.dumps(params)
I was helped by this page:
https://stackabuse.com/python-metaclasses-and-metaprogramming/
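For a quick check, here is what an interactive session with this metaclass version might look like (the params dict is just an example value):
>>> ji = JsonInput()
>>> ji.retrieve({"happy": "go lucky"})
check cache
JsonInput retrieve
set cache
'{"happy": "go lucky"}'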
I'm trying to redefine the __getattr__ method on a function.
I've tried this code:
import types

def foo():
    print("foo")

def addMethod(obj, func):
    setattr(obj, func.__name__, types.MethodType(func, obj))

def __getattr__(obj, name):
    print(name)

addMethod(foo, __getattr__)

foo.bar
but I get this error:
Traceback (most recent call last):
File blah blah, line 14, in <module>
foo.bar
AttributeError: 'function' object has no attribute 'bar'
I've inspected the foo function and it really has the method bound to it, but it seems that if you set it dynamically, __getattr__ won't get called.
If I do the same thing to a class, setting __getattr__ using my addMethod function, the instance won't call __getattr__ either, so the problem must be the dynamic assignment!
BUT if I put the __getattr__ in the definition of the class, it will work, obviously.
The question is: how can I attach __getattr__ to the function so that it works? I can't put it there from the beginning, because it's a function! I don't know how to do this.
Thanks!
Well, you don't. If you want attributes, make a class. If you want instances to be callable, define __call__ for it.
class foo:
    def __call__(self):
        print("foo")

    def __getattr__(self, name):
        print(name)

f = foo()
f()    # foo
f.bar  # bar
Your problem is that Python will only look up __xxx__ magic methods on the class of an object, not the object itself. So even though you are setting __getattr__ on the instance of foo, it will never be automatically called.
That is also the problem you are having with class foo:
class foo():
    pass

def addMethod(obj, func):
    setattr(obj, func.__name__, func)

def __getattr__(obj, name):
    print(name)

addMethod(foo, __getattr__)

foo.bar
Here Python is looking up bar on the class object foo, which means the __getattr__ method you added to foo is not getting called; instead, Python consults foo's metaclass, which is type.
If you change that last line to
foo().bar
so that the attribute is looked up on an instance of foo, it will work.
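A minimal sketch of that fix, reusing the addMethod helper from above:

class foo():
    pass

def addMethod(obj, func):
    setattr(obj, func.__name__, func)

def __getattr__(obj, name):
    print(name)

addMethod(foo, __getattr__)

# The lookup now happens on an instance, so type-level __getattr__ runs and prints 'bar'
foo().bar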