My Python version:
python3 --version
Python 3.9.2
Issue 1:
What does the isinstance function mean?
class Singleton1(object):
    __instance = None

    def __init__(self):
        if not hasattr(Singleton1, '__instance'):
            print("__init__ method called, but no instance created")
        else:
            print("instance already created:", self.__instance)

    @classmethod
    def get_instance(cls):
        if not cls.__instance:
            cls.__instance = Singleton1()
        return cls.__instance
Initialize it:
x = Singleton1()
__init__ method called, but no instance created
Check with the isinstance function:
isinstance(x,Singleton1)
True
If x is not an instance, why does isinstance(x, Singleton1) say it is an instance of Singleton1?
Issue 2:
Why can't the __init__ method be called anyway?
Now replace all __instance (double underscores) with _instance (single underscore) in the class Singleton1 and replace all Singleton1 with Singleton2:
class Singleton2(object):
    _instance = None

    def __init__(self):
        if not hasattr(Singleton2, '_instance'):
            print("__init__ method called, but no instance created")
        else:
            print("instance already created:", self._instance)

    @classmethod
    def get_instance(cls):
        if not cls._instance:
            cls._instance = Singleton2()
        return cls._instance
Initialize it:
y = Singleton2()
instance already created: None
Why can't the __init__ method be called in this case?
@snakecharmerb, on issue 1: why do some say it is lazy instantiation? If isinstance(x, Singleton1) is true, there is no need to call Singleton1.get_instance(), because the instance is already created during instantiation.
The hasattr check does not do what you think it does. Using Singleton2*, hasattr(Singleton2, '_instance') is always True, because the class has an attribute named _instance. You want to check the attribute's value, so use getattr instead; then the expected output will be printed.
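A minimal sketch of that value-based check, reusing the question's names for illustration (this is not the original poster's code):

```python
class Singleton2:
    _instance = None

    def __init__(self):
        # hasattr(Singleton2, '_instance') would always be True here,
        # because the class attribute exists; test its value instead:
        if getattr(Singleton2, '_instance') is None:
            print("__init__ method called, but no instance created")
        else:
            print("instance already created:", self._instance)

y = Singleton2()  # prints: __init__ method called, but no instance created
```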
The isinstance checks succeed because Singleton2() will return a new instance each time - there is nothing to prevent this. You can add a __new__ method to create _instance and return it every time Singleton2() is called. Note that this will mean that _instance will always exist by the time __init__ is called.
class Singleton2:
    _instance = None

    def __new__(cls):
        if cls._instance is not None:
            return cls._instance
        instance = super().__new__(cls)
        cls._instance = instance
        return instance
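A quick check of the __new__ approach above (the class body is repeated so the snippet runs standalone):

```python
class Singleton2:
    _instance = None

    def __new__(cls):
        if cls._instance is not None:
            return cls._instance
        instance = super().__new__(cls)
        cls._instance = instance
        return instance

a = Singleton2()
b = Singleton2()
print(a is b)  # True: both names refer to the one cached instance
```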
* The hasattr check in Singleton1 is complicated by the name-mangling performed on __instance. In general, avoid using double-underscored variable names, except for avoiding name clashes in class hierarchies.
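A minimal illustration of the name mangling the footnote mentions (a stripped-down class, not the question's full code). The attribute is stored under the mangled name, while the literal string passed to hasattr is never mangled:

```python
class Singleton1:
    __instance = None  # stored under the mangled name _Singleton1__instance

# The literal string '__instance' is not mangled, so this lookup misses:
print(hasattr(Singleton1, '__instance'))             # False
print(hasattr(Singleton1, '_Singleton1__instance'))  # True
```

This is why the hasattr branch in the question's Singleton1.__init__ always takes the "no instance created" path.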
Is there any way to refer to an instance of a class from its metaclass every time an instance is created? I suppose I should use the dunder __call__ method inside the metaclass for that purpose.
I have the following code:
class meta(type):
    def __call__(cls):
        super().__call__()
        # <--- want to get an object of class A here every time an instance of A is created

class A(metaclass=meta):
    def __init__(self, c):
        self.c = 2

    def test(self):
        print('test called')

a1 = A()
a2 = A()
a3 = A()
Also, why, when I implement the __call__ method inside the metaclass, do all created instances of my class become NoneType, even though when overriding __call__ I used super().__call__()?
For example, a4.test() raises AttributeError: 'NoneType' object has no attribute 'test'
The newly created instance is returned by super().__call__() - you have to keep this value in a variable, use it for whatever you want, and return it.
Otherwise, if the metaclass __call__ has no return statement, all instances are immediately de-referenced and destroyed, and the code trying to create instances just gets None:
class meta(type):
    def __call__(cls):
        obj = super().__call__()
        # use obj as you see fit
        ...
        return obj
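Putting it together, a runnable sketch of the fix (the print call stands in for whatever you want to do with each new instance):

```python
class meta(type):
    def __call__(cls, *args, **kwargs):
        obj = super().__call__(*args, **kwargs)
        # obj is the freshly created instance, available on every creation
        print("created:", type(obj).__name__)
        return obj  # without this return, callers would receive None

class A(metaclass=meta):
    def test(self):
        print('test called')

a1 = A()
a1.test()  # works: a1 is an A instance, not None
```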
I have a superclass that has a retrieve() method, and its subclasses each implement their own retrieve() method. I'd like every retrieve() method to be decorated so it caches the return value when it receives the same args, without having to decorate the method in every subclass.
Decorators don't seem to be inherited. I could probably call the superclass's method, which would in turn set the cache, but currently my superclass raises a NotImplementedError exception, which I like.
import json
import operator
from cachetools import cachedmethod, TTLCache

def simple_decorator(func):
    def wrapper(*args, **kwargs):
        # check cache
        print("simple decorator")
        func(*args, **kwargs)
        # set cache
    return wrapper

class AbstractInput(object):
    def __init__(self, cacheparams={'maxsize': 10, 'ttl': 300}):
        self.cache = TTLCache(**cacheparams)
        super().__init__()

    @simple_decorator
    def retrieve(self, params):
        print("AbstractInput retrieve")
        raise NotImplementedError("AbstractInput inheritors must implement retrieve() method")

class JsonInput(AbstractInput):
    def retrieve(self, params):
        print("JsonInput retrieve")
        return json.dumps(params)

class SillyJsonInput(JsonInput):
    def retrieve(self, params):
        print("SillyJsonInput retrieve")
        params["silly"] = True
        return json.dumps(params)
Actual results:
>>> ai.retrieve(params)
ai.retrieve(params)
simple decorator
AbstractInput retrieve
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<string>", line 8, in wrapper
File "<string>", line 22, in retrieve
NotImplementedError: AbstractInput inheritors must implement retrieve() method
>>> ji.retrieve(params)
ji.retrieve(params)
JsonInput retrieve
'{"happy": "go lucky", "angry": "as a wasp"}'
Desired results:
>>> ai.retrieve(params)
ai.retrieve(params)
simple decorator
AbstractInput retrieve
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<string>", line 8, in wrapper
File "<string>", line 22, in retrieve
NotImplementedError: AbstractInput inheritors must implement retrieve() method
>>> ji.retrieve(params)
simple decorator
ji.retrieve(params)
JsonInput retrieve
'{"happy": "go lucky", "angry": "as a wasp"}'
Yes, the use of a metaclass to force a decorator on a specific method, as you put in your own answer, is correct. With a few changes, it can be made so that the method to be decorated is not fixed - for example, an attribute set on the decorated function can be used as a "mark" that such a decorator should be forced upon overriding methods.
Besides that, since Python 3.6, there is a new class level mechanism - the special method __init_subclass__, which has the specific objective of diminishing the need for metaclasses. Metaclasses can be complicated, and if your class hierarchy needs to combine more than one metaclass, you may be in for some headache.
The __init_subclass__ method is placed on the base class, and it is called once each time a child class is created. The wrapping logic can be put there.
Basically, you can just modify your decorator to set the mark mentioned above, and add this class to your inheritance hierarchy - it can be included as a mixin class in multiple inheritance, so it can be reused for various class trees, if needed:
def simple_decorator(func):
    def wrapper(*args, **kwargs):
        print("check cache")
        rt = func(*args, **kwargs)
        print("set cache")
        return rt
    wrapper.inherit_decorator = simple_decorator
    return wrapper

class InheritDecoratorsMixin:
    def __init_subclass__(cls, *args, **kwargs):
        super().__init_subclass__(*args, **kwargs)
        decorator_registry = getattr(cls, "_decorator_registry", {}).copy()
        cls._decorator_registry = decorator_registry
        # Check for decorated objects in the mixin itself - optional:
        for name, obj in __class__.__dict__.items():
            if getattr(obj, "inherit_decorator", False) and name not in decorator_registry:
                decorator_registry[name] = obj.inherit_decorator
        # annotate newly decorated methods in the current subclass:
        for name, obj in cls.__dict__.items():
            if getattr(obj, "inherit_decorator", False) and name not in decorator_registry:
                decorator_registry[name] = obj.inherit_decorator
        # finally, decorate all methods annotated in the registry:
        for name, decorator in decorator_registry.items():
            if name in cls.__dict__ and getattr(getattr(cls, name), "inherit_decorator", None) != decorator:
                setattr(cls, name, decorator(cls.__dict__[name]))
So, that is it - each new subclass will have its own _decorator_registry attribute, which records the names of the decorated methods in all ancestors, along with which decorator to apply.
If the decorator should be applied a single time to the method, and not repeated when the overriding method performs the super() call to its ancestors (not the case when you are decorating for cache, since the super-methods won't be called), that gets trickier - but it can be done.
However, it is tricky to do: the decorator instances in the superclasses are different instances from the decorator on the subclass, so one way to tell them "the decorator code for this method has already run in this call chain" is to use an instance-level marker - which should be a thread-local variable if the code is to support parallelism.
All this checking would add quite a lot of complicated boilerplate to what could be a simple decorator - so we can create a "decorator" for the "decorators" that we want to run a single time. In other words, decorators decorated with childmost below will run only on the "childmost" class, but not on the corresponding methods in the superclasses when they call super().
import threading

def childmost(decorator_func):
    def inheritable_decorator_that_runs_once(func):
        decorated_func = decorator_func(func)
        name = func.__name__
        def wrapper(self, *args, **kw):
            if not hasattr(self, f"_running_{name}"):
                setattr(self, f"_running_{name}", threading.local())
            running_registry = getattr(self, f"_running_{name}")
            try:
                if not getattr(running_registry, "running", False):
                    running_registry.running = True
                    rt = decorated_func(self, *args, **kw)
                else:
                    rt = func(self, *args, **kw)
            finally:
                running_registry.running = False
            return rt
        wrapper.inherit_decorator = inheritable_decorator_that_runs_once
        return wrapper
    return inheritable_decorator_that_runs_once
Example using the first listing:
class A: pass

class B(A, InheritDecoratorsMixin):
    @simple_decorator
    def method(self):
        print(__class__, "method called")

class C(B):
    def method(self):
        print(__class__, "method called")
        super().method()
And after pasting listing 1 and these A-B-C classes in the interpreter, the result is this:
In [9]: C().method()
check cache
<class '__main__.C'> method called
check cache
<class '__main__.B'> method called
set cache
set cache
(the "A" class here is entirely optional and can be left out)
Example using the second listing:
# Decorating the same decorator above:
@childmost
def simple_decorator2(func):
    def wrapper(*args, **kwargs):
        print("check cache")
        rt = func(*args, **kwargs)
        print("set cache")
        return rt
    return wrapper
class D: pass

class E(D, InheritDecoratorsMixin):
    @simple_decorator2
    def method(self):
        print(__class__, "method called")

class F(E):
    def method(self):
        print(__class__, "method called")
        super().method()
And the result:
In [19]: F().method()
check cache
<class '__main__.F'> method called
<class '__main__.E'> method called
set cache
OK, it seems that I can "decorate" a method in a superclass and have the subclasses also inherit that decoration, even if the method is overridden in the subclass, using metaclasses. In this case, I'm decorating all "retrieve" methods in AbstractInput and its subclasses with simple_decorator, using a metaclass named CacheRetrieval.
def simple_decorator(func):
    def wrapper(*args, **kwargs):
        print("check cache")
        rt = func(*args, **kwargs)
        print("set cache")
        return rt
    return wrapper

class CacheRetrieval(type):
    def __new__(cls, name, bases, attr):
        # Wrap the retrieve() method defined in each class body
        # with simple_decorator before the class object is created
        attr["retrieve"] = simple_decorator(attr["retrieve"])
        return super(CacheRetrieval, cls).__new__(cls, name, bases, attr)

class AbstractInput(object, metaclass=CacheRetrieval):
    def __init__(self, cacheparams={'maxsize': 10, 'ttl': 300}):
        self.cache = TTLCache(**cacheparams)
        super().__init__()

    def retrieve(self, params):
        print("AbstractInput retrieve")
        raise NotImplementedError("DataInput must implement retrieve() method")

class JsonInput(AbstractInput):
    def retrieve(self, params):
        print("JsonInput retrieve")
        return json.dumps(params)

class SillyJsonInput(JsonInput):
    def retrieve(self, params):
        print("SillyJsonInput retrieve")
        params["silly"] = True
        return json.dumps(params)
I was helped by this page:
https://stackabuse.com/python-metaclasses-and-metaprogramming/
I need an abstract class to be able to handle a missing class method. I found out how to do this for instance methods with __getattr__, but it's not working with class methods. Is it even possible?
I've got something like this:
class Container:
    _definitions = {'class_a': 'example.class_a.ClassA',
                    'class_b': 'example.class_b.ClassB'}

    @classmethod
    def get(cls, def_id):
        # import dynamically
        parts = cls._definitions[def_id].split(".")
        module_name = ".".join(parts[:-1])
        class_name = parts[-1]
        __import__(module_name)
        return class_name

class AbstractClass:
    @classmethod
    def __getattr__(cls, name, *args):
        def missing_method():
            result = re.search("^(?P<class_id>[a-z0-9._-]*)$", name)
            if result:
                return Container.get(result.group('class_id'))
            raise RuntimeError("class method '{}' missing from class".format(name))
        return missing_method

class ClassA(AbstractClass):
    @classmethod
    def method(cls):
        classb = cls.class_a()
        classb.method()

class ClassB(AbstractClass):
    @classmethod
    def method(cls):
        print('Hello world')
Each class in my Container must extend AbstractClass to be able to magically call any class in it, like I do in ClassA.method().
It works if I do not use class methods, but my purpose is that no class in my container can be instantiated, because instances would be useless for my needs. It's a kind of Singleton pattern.
Is it more understandable?
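One way to get there (a sketch with hypothetical names, not from the question): attribute lookup on a class itself consults the metaclass, so a __getattr__ defined on a metaclass can supply missing class-level methods without instantiating anything:

```python
class FallbackMeta(type):
    # __getattr__ here fires when an attribute is missing on the CLASS,
    # because the class is an instance of this metaclass.
    def __getattr__(cls, name):
        def missing_method():
            return "handled missing '{}'".format(name)
        return missing_method

class AbstractThing(metaclass=FallbackMeta):
    pass

print(AbstractThing.class_a())  # the metaclass supplies the missing method
```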
I'm trying to redefine the __getattr__ method on a function.
I've tried this code:
import types

def foo():
    print("foo")

def addMethod(obj, func):
    setattr(obj, func.__name__, types.MethodType(func, obj))

def __getattr__(obj, name):
    print(name)

addMethod(foo, __getattr__)
foo.bar
but I get this error:
Traceback (most recent call last):
File blah blah, line 14, in <module>
foo.bar
AttributeError: 'function' object has no attribute 'bar'
I've inspected the foo function and it really has the method bound to it, but it seems that if you set it dynamically, __getattr__ won't get called.
If I do the same thing to a class, setting __getattr__ with my addMethod function, the instance won't call __getattr__ either, so the problem must be the dynamic assignment!
BUT if I put __getattr__ in the definition of the class, it works, obviously.
The question is: how can I attach __getattr__ to the function to make it work? I can't put it there from the beginning, because it's a function! I don't know how to do this.
Thanks!
Well, you don't. If you want attributes, make a class. If you want instances to be callable, define __call__ on it.
class foo:
    def __call__(self):
        print("foo")

    def __getattr__(self, name):
        print(name)

f = foo()
f()    # foo
f.bar  # bar
Your problem is that Python will only look up __xxx__ magic methods on the class of an object, not the object itself. So even though you are setting __getattr__ on the instance of foo, it will never be automatically called.
That is also the problem you are having with class foo:
class foo():
    pass

def addMethod(obj, func):
    setattr(obj, func.__name__, func)

def __getattr__(obj, name):
    print(name)

addMethod(foo, __getattr__)
foo.bar
Here Python is looking up bar on the class object foo, which means the __getattr__ method you added to foo is not getting called; instead, Python looks at foo's metaclass, which is type.
If you change that last line to

foo().bar

so that the lookup happens on an instance of foo, it will work.
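A minimal sketch of that class-versus-instance lookup rule (hypothetical names): assigning __getattr__ on an instance is ignored, while assigning it on the class takes effect.

```python
class Foo:
    pass

f = Foo()
# Set on the instance: special-method lookup skips the instance dict.
f.__getattr__ = lambda name: name
try:
    f.bar
except AttributeError:
    print("instance-level __getattr__ was ignored")

# Set on the class: lookup on the type finds it.
Foo.__getattr__ = lambda self, name: name
print(f.bar)  # prints: bar
```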