Why does my singleton implementation not work in Python?

The code below is an attempt to implement a singleton in Python 3, but it doesn't appear to work. When I instantiate, _instance is always None and the two instances (a and b) have different addresses in memory. Why?
class Singleton(object):
    _instance = None

    def __call__(self, *args, **kwargs):
        if self._instance is None:
            self._instance = super().__call__(*args, **kwargs)
        return self._instance

    def __init__(self, *args, **kwargs):
        print(self._instance, self)

a = Singleton()
b = Singleton()
The output is:
None <__main__.Singleton object at 0x7f382956c190>
None <__main__.Singleton object at 0x7f382956c410>

The __call__ method is not what you think it is. It is meant to make instances of classes callable like functions:
class A:
    def __call__(self):
        print("called")

a = A()  # prints nothing
a()      # prints "called"
What you are looking for is the __new__ method:
Called to create a new instance of class cls.
You can write the singleton like this (very similar to what you wrote):
class Singleton(object):
    _instance = None

    def __new__(cls, *args, **kwargs):
        if cls._instance is None:
            cls._instance = super(Singleton, cls).__new__(cls)
        return cls._instance

    def __init__(self, *args, **kwargs):
        print(self._instance, self)

a = Singleton()
b = Singleton()
The output is now:
<__main__.Singleton object at 0x7f149bf3cc88> <__main__.Singleton object at 0x7f149bf3cc88>
<__main__.Singleton object at 0x7f149bf3cc88> <__main__.Singleton object at 0x7f149bf3cc88>
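Note that with the __new__ approach, __init__ still runs on every Singleton() call even though the same object is returned, which is why the print fires twice above. If that matters, a common alternative is to put __call__ on a metaclass, since Class() invokes the metaclass's __call__ with the class as its first argument. A minimal sketch (the name SingletonMeta is mine, not from the answer):

class SingletonMeta(type):
    _instances = {}

    def __call__(cls, *args, **kwargs):
        # Singleton() invokes this method with cls=Singleton, so both
        # __new__ and __init__ are skipped entirely once an instance exists.
        if cls not in SingletonMeta._instances:
            SingletonMeta._instances[cls] = super().__call__(*args, **kwargs)
        return SingletonMeta._instances[cls]

class Singleton(metaclass=SingletonMeta):
    pass

assert Singleton() is Singleton()  # both calls return the same object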

Related

Some basic problems with the construction of Python 3 base classes

I'm a beginner. I've defined the following classes. They all look the same and I'm a little confused: what is the difference between them?
My purpose is to define a base class. Why use object? I think B is what I want.
Do I have to use super() and return its result? What do cls, *args and **kwargs stand for?
Python code:
class A(object):
    def __new__(cls, *args, **kwargs):
        print("A.__new__called")
        return super(A, cls).__new__(cls, *args, **kwargs)

class B:
    def __new__(cls, *args, **kwargs):
        print("B.__new__called")
        return super(B, cls).__new__(cls, *args, **kwargs)

class C():
    def __new__(cls, *args, **kwargs):
        print("C.__new__called")
        return super(C, cls).__new__(cls, *args, **kwargs)

class D():
    def __new__(cls, *args, **kwargs):
        print("D.__new__called")
        return super(D, cls).__new__(cls)

class E():
    def __new__(cls):
        print("E.__new__called")
        return super(E, cls).__new__(cls)

class F():
    print("F.__new__called")

a = A()
b = B()
c = C()
d = D()
e = E()
f = F()
Result:
F.__new__called
A.__new__called
B.__new__called
C.__new__called
D.__new__called
E.__new__called
A few comments:
object is inherited by default from Python 3.x onwards, so the explicit inheritance that Python 2.x required is not needed; A, B and C are just three ways of writing the same class.
super() also doesn't need any arguments in Python 3.x.
F doesn't define __new__ at all: its print runs once when the class body is executed at definition time, which is why "F.__new__called" appears first and f = F() prints nothing.
class A:
    def __init__(self):
        print("Here in init")

    def __new__(cls, *args, **kargs):
        print("Here in new")
        return super().__new__(cls)  # object.__new__ takes no extra arguments
__new__ is always a class method and is called before an object is instantiated. For example, if you create the object above, the print statements appear in this order: first the one in __new__, then the one in __init__. So __new__ is used if you want to do something before an object is instantiated (note that __init__ is only called when __new__ returns the new instance, hence the return above). super() calls the inherited base class (object by default) to create the object. *args and **kargs are just the positional and keyword arguments passed when you create the object.
obj = A()
It will print the following:
Here in new
Here in init
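A side note, not from the original answer: in Python 3, object.__new__ rejects extra arguments once __new__ is overridden, which is why D and E above drop *args/**kwargs before delegating. Forwarding them, as A, B and C do, only works while the class is instantiated without arguments, as in a = A(). A minimal sketch of the failure mode (class G is hypothetical):

class G:
    def __new__(cls, *args, **kwargs):
        # Forwarding extra arguments to object.__new__ fails in Python 3
        # when __new__ is overridden.
        return super().__new__(cls, *args, **kwargs)

    def __init__(self, x):
        self.x = x

try:
    G(1)
except TypeError as exc:
    print(exc)  # e.g. "object.__new__() takes exactly one argument (the type to instantiate)"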

I need to make a class with special parameters

How can I make a class that takes a func parameter on initialization, where func is any callable?
An object of this class should itself be callable: calling it should invoke the callable passed at init with the given parameters.
def test_callable_obj(self):
    callable_obj = homework_lecture1.CallableInstances(lambda x: x + 8)
    self.assertEqual(callable_obj(10), 18)

class CallableInstances:
You need to create a class whose __init__ method accepts a function, and whose __call__ simply calls that function.
def func(*args, **kwargs):
    print('called func with args', args, ' and kwargs', kwargs)

class Foo:
    def __init__(self, f):
        self._f = f

    def __call__(self, *args, **kwargs):
        return self._f(*args, **kwargs)

foo = Foo(func)
foo(1, 2, c=3)
Output:
called func with args (1, 2) and kwargs {'c': 3}
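Applied to the question's test case, CallableInstances is the same pattern under the test's own name (the module name homework_lecture1 comes from the test and is assumed here):

class CallableInstances:
    def __init__(self, func):
        self._func = func

    def __call__(self, *args, **kwargs):
        return self._func(*args, **kwargs)

assert CallableInstances(lambda x: x + 8)(10) == 18  # matches the test's expectation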

Why doesn't the __get__ method of a metaclass get called?

I have this class, Op:
class Pipeable(type):
    def __get__(self, instance, owner):
        def pipe_within(*args, **kwargs):
            return self(*args, op=instance, **kwargs)
        print('piping...')
        return pipe_within

class Op(metaclass=Pipeable):
    def __init__(self, op=None):
        if op is not None:
            print('piped!')
            self.op = op
        self.__dict__[type(self).__name__] = type(self)
I expect the Op class itself to work as a descriptor, because its metaclass has a __get__ method, but the code
op = Op().Op()
doesn't invoke Op.__get__. Why?
It is hard to tell what you really want there, but a metaclass that adds a property to itself for every new class may work better for whatever you want.
As far as I can understand your code, older classes won't be populated with references to the newer ones as you create new instances (which, in turn, get the references to the others).
On second thought, dynamically creating properties inside __new__ seems hacky - you can instead implement the metaclass's __getattr__ and __dir__ methods for much less convoluted code.
The simple version works for classes, but not for their instances, because instances do not trigger the __getattr__ on the metaclass:
class Pipeable(type):
    _classes = {}

    def __new__(metacls, name, bases, namespace, **kwds):
        cls = type.__new__(metacls, name, bases, namespace)
        metacls._classes[name] = cls
        return cls

    def __getattr__(cls, attr):
        classes = cls.__class__._classes
        if attr not in classes:
            raise AttributeError

        def pipe_within(*args, **kwargs):
            return cls(*args, op=classes[attr], **kwargs)

        print('piping...')
        return pipe_within

    def __dir__(cls):
        regular = super().__dir__()
        return sorted(regular + list(cls.__class__._classes.keys()))

class Op(metaclass=Pipeable):
    def __init__(self, op=None):
        if op is not None:
            print('piped!')
            self.op = op

Op.Op()
(Note as well that over time I picked up this parameter-naming convention for metaclasses: since most of their methods take the class created with them in place of what would be "self" in ordinary classes, I find this naming easier to follow. It is not mandatory, and not necessarily "correct", though.)
But we can then make it work for instances by creating __dir__ and __getattr__ directly on the created classes as well. The catch is that the class being created (or one of its superclasses) may already have a __getattr__ or a custom __dir__, and those have to be wrapped. And then we don't want to re-wrap our own __dir__ and __getattr__, so some extra care is needed:
class Pipeable(type):
    _classes = {}

    def __new__(metacls, name, bases, namespace, **kwds):
        cls = type.__new__(metacls, name, bases, namespace)
        metacls._classes[name] = cls

        original__getattr__ = getattr(cls, "__getattr__", None)
        if hasattr(original__getattr__, "_metapipping"):
            # Do not wrap our own (metaclass) implementation of __getattr__
            original__getattr__ = None
        original__dir__ = getattr(cls, "__dir__")  # Exists in "object", so it is always found.

        # These two functions have to be nested so they can get the
        # values of the original "__getattr__" and "__dir__" from
        # the closure. These values could be set on the created class, alternatively.
        def __getattr__(self, attr):
            if original__getattr__:
                # If it is desired that normal attribute lookup have
                # less precedence than these injected operators,
                # move this "if" block down.
                try:
                    value = original__getattr__(self, attr)
                except AttributeError:
                    pass
                else:
                    return value
            classes = self.__class__.__class__._classes
            if attr not in classes:
                raise AttributeError

            def pipe_within(*args, **kwargs):
                return cls(*args, op=classes[attr], **kwargs)

            print('piping...')
            return pipe_within
        __getattr__._pipping = True

        def __dir__(self):
            regular = original__dir__(self)
            return sorted(regular + list(self.__class__.__class__._classes.keys()))
        __dir__._pipping = True

        if not original__getattr__ or not hasattr(original__getattr__, "_pipping"):
            cls.__getattr__ = __getattr__
        if not hasattr(original__dir__, "_pipping"):
            cls.__dir__ = __dir__
        return cls

    def __getattr__(cls, attr):
        classes = cls.__class__._classes
        if attr not in classes:
            raise AttributeError

        def pipe_within(*args, **kwargs):
            return cls(*args, op=classes[attr], **kwargs)

        print('piping...')
        return pipe_within
    __getattr__._metapipping = True

    def __dir__(cls):
        regular = super().__dir__()
        return sorted(regular + list(cls.__class__._classes.keys()))

class Op(metaclass=Pipeable):
    def __init__(self, op=None):
        if op is not None:
            print('piped!')

Op().Op()
So this ended up being lengthy, but it "does the right thing" by ensuring that all classes and instances in the hierarchy can see each other, regardless of creation order.
Also, what makes up most of the complexity is correctly wrapping other possible customizations of __getattr__ and __dir__ in the class hierarchy. If you don't have any customization of those, this can be an order of magnitude simpler:
class Pipeable(type):
    _classes = {}

    def __new__(metacls, name, bases, namespace, **kwds):
        cls = type.__new__(metacls, name, bases, namespace)
        metacls._classes[name] = cls

        def __getattr__(self, attr):
            classes = self.__class__.__class__._classes
            if attr not in classes:
                raise AttributeError

            def pipe_within(*args, **kwargs):
                return cls(*args, op=classes[attr], **kwargs)

            print('piping...')
            return pipe_within

        def __dir__(self):
            regular = object.__dir__(self)  # no wrapped original here, so delegate to object
            return sorted(regular + list(self.__class__.__class__._classes.keys()))

        cls.__getattr__ = __getattr__
        cls.__dir__ = __dir__
        return cls

    def __getattr__(cls, attr):
        classes = cls.__class__._classes
        if attr not in classes:
            raise AttributeError

        def pipe_within(*args, **kwargs):
            return cls(*args, op=classes[attr], **kwargs)

        print('piping...')
        return pipe_within

    def __dir__(cls):
        regular = super().__dir__()
        return sorted(regular + list(cls.__class__._classes.keys()))
To be invoked, a descriptor must be a class attribute, not an instance attribute.
This code does what was desired:
class Pipeable(type):
    _instances = {}

    def __new__(cls, name, bases, namespace, **kwds):
        namespace.update(cls._instances)
        instance = type.__new__(cls, name, bases, namespace)
        cls._instances[name] = instance
        for inst in cls._instances.values():  # set the new class on every class created so far
            setattr(inst, name, instance)
        return instance

    def __get__(self, instance, owner):
        def pipe_within(*args, **kwargs):
            return self(*args, op=instance, **kwargs)
        print('piping...')
        return pipe_within

class Op(metaclass=Pipeable):
    def __init__(self, op=None):
        if op is not None:
            print('piped!')
            self.op = op

Op().Op()
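For reference, assuming the corrected code above, the last line prints:

piping...
piped!

The lookup Op().Op now finds Op as a class attribute whose type (the metaclass) defines __get__, so the descriptor protocol fires, prints 'piping...' and returns pipe_within; calling that constructs Op(op=...) and prints 'piped!'.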

Apply decorator to all methods of subclasses for timeit

I have a method decorator that looks like this:
def debug_run(fn):
    from functools import wraps

    @wraps(fn)
    def wrapper(self, *args, **kw):
        # log some stuff
        # timeit fn
        res = fn(self, *args, **kw)
        return res
    return wrapper
Until now I applied it by hand to each method that I wanted to debug. Now I'm trying to apply it to all the methods of a class using a class decorator.
Rather than doing
class A():
    @debug_run
    def f(self):
        pass
I do
@decallmethods(debug_run)
class A():
    def f(self):
        pass

def decallmethods(decorator):
    def dectheclass(cls):
        for name, m in inspect.getmembers(cls, inspect.ismethod):
            if name in getattr(cls, 'METHODS_TO_INSPECT', []):
                setattr(cls, name, decorator(m))
        return cls
    return dectheclass
Applying the decorator to the base class does not work as expected: nothing is logged to the console. Now I wonder whether this approach is the right one, or whether I should use something else (apply the debug decorator to selected methods, from the base class down to all subclasses).
[EDIT]
I finally found why no logs were printed: in Python 3 a method looked up on a class is just a plain function, so inspect.ismethod no longer matches it (see "Why is there a difference between inspect.ismethod and inspect.isfunction from python 2 -> 3?").
Here is a complete example reflecting my code:
import inspect
import time
import logging as logger
from functools import wraps

logger.basicConfig(format='LOGGER - %(asctime)s %(message)s', level=logger.DEBUG)

def debug_run(fn):
    @wraps(fn)
    def wrapper(self, *args, **kw):
        logger.debug(
            "call method %s of instance %s with %r and %s "
            % (fn.__name__, self, args, kw))
        time1 = time.time()
        res = fn(self, *args, **kw)
        time2 = time.time()
        logger.debug(
            "%s function %0.3f ms" % (fn, (time2 - time1) * 1000.0))
        return res
    return wrapper

def decallmethods(decorator):
    def dectheclass(cls):
        for name, m in inspect.getmembers(
                cls, predicate=lambda x: inspect.isfunction(x) or inspect.ismethod(x)):
            methods_to_inspect = getattr(cls, 'METHODS_TO_INSPECT', [])
            if name in methods_to_inspect:
                setattr(cls, name, decorator(m))
        return cls
    return dectheclass

class B(object):
    METHODS_TO_INSPECT = ["bfoo1", "bfoo2", "foo"]

    def __str__(self):
        return "%s:%s" % (repr(self), id(self))

    def bfoo1(self):
        pass

    def bfoo2(self):
        pass

    def foo(self):
        pass

    def run(self):
        print("print - Base run doing nothing")

class C(object):
    pass

@decallmethods(debug_run)
class A(B, C):
    METHODS_TO_INSPECT = ["bfoo1", "bfoo2", "foo", "run"]

    def foo(self):
        print("print - A foo")

    def run(self):
        self.bfoo1()
        self.bfoo2()
        self.foo()

a = A()
b = B()
a.run()
b.run()
In this case, applying decallmethods to B does not affect A, so I have to apply it to both A and B, and thus to all subclasses of B.
Is there a mechanism that permits applying decallmethods to the methods of all subclasses automatically?
Look at this: How can I decorate all functions of a class without typing it over and over for each method added? Python
delnan has a good answer there; just add this rule to his answer:

if name in getattr(cls, 'METHODS_TO_INSPECT', []):
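One possible mechanism for the "all subclasses" part, sketched here as an assumption rather than taken from the linked answer: Python 3.6+'s __init_subclass__ hook runs for every subclass as it is defined, so a base class (the name Traceable is hypothetical, and debug_run is trimmed down from the question) can decorate the listed methods of each subclass automatically:

import inspect
from functools import wraps

def debug_run(fn):
    # Trimmed-down version of the question's decorator.
    @wraps(fn)
    def wrapper(self, *args, **kw):
        print("call method %s with %r and %r" % (fn.__name__, args, kw))
        return fn(self, *args, **kw)
    return wrapper

class Traceable:
    METHODS_TO_INSPECT = []

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # Decorate only functions defined on this subclass itself, so an
        # inherited method is not wrapped twice.
        for name, member in list(cls.__dict__.items()):
            if inspect.isfunction(member) and name in cls.METHODS_TO_INSPECT:
                setattr(cls, name, debug_run(member))

class B(Traceable):
    METHODS_TO_INSPECT = ["foo"]

    def foo(self):
        pass

class A(B):
    METHODS_TO_INSPECT = ["foo", "run"]

    def foo(self):
        print("print - A foo")

    def run(self):
        self.foo()

A().run()  # logs run, then foo, then prints "print - A foo"
B().foo()  # logs foo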

How to implement an iterator or generator for my Delegate class?

I implemented a Delegate class in Python 3 which wraps one or more function objects in an object instance. It's possible to register multiple function objects on one delegate (in .NET terminology, a MulticastDelegate). Assuming all registered functions accept the same parameters, it's possible to invoke the delegate and call all functions at once.
Delegate implementation:
class Delegate:
    def __init__(self, *funcs):
        self.__invocationList__ = []
        for func in funcs:
            self.__invocationList__.append(func)

    def __iadd__(self, func):
        self.__invocationList__.append(func)
        return self

    def __isub__(self, func):
        self.__invocationList__.remove(func)
        return self

    def __call__(self, *args, **kwargs):
        if (len(self.__invocationList__) == 1):
            return self.__invocationList__[0](*args, **kwargs)
        else:
            res = {}
            for func in self.__invocationList__:
                res[func] = func(*args, **kwargs)
            return res

    @property
    def isMulticast(self):
        return (len(self.__invocationList__) > 1)
Usage examples:
def test1():
    return 5

def test2(a, b):
    return a + b

def test3(a, b):
    return a * b + 15

delegate = Delegate(test1)
result = delegate()
print("test1: {0}".format(result))

delegate = Delegate(test2)
result = delegate(3, 8)
print("test2: {0}".format(result))

delegate += test3
results = delegate(2, 9)
print("test2: {0}".format(results[test2]))
print("test3: {0}".format(results[test3]))
I would like to implement an iterator or generator for this class, so that the delegate can be used in for loops.
What could that look like?
# loop over every result from delegate, call with parameters 4 and 18
for result in delegate(4, 18):
    print("function={0} result={1}".format(*result))
The iterator's __next__() method should return a tuple consisting of the function object and its return value.
What I tried so far:
class Delegate:
    # ...
    # see code from above

    def __iter__(self):
        print("Delegate.__iter__():")

        class iter:
            def __init__(self2, *args, **kwargs):
                print(str(args))
                self2.__args = args
                self2.__kwargs = kwargs
                self2.__index = 0

            def __iter__(self2):
                return self2

            def __next__(self2):
                if (self2.__index == len(self.__invocationList__)):
                    raise StopIteration
                func = self.__invocationList__[self2.__index]
                self2.__index += 1
                return func(*self2.__args, **self2.__kwargs)

        return iter()
Because the constructor method is already in use by the Delegate creation itself, I implemented the iterator as a nested class. But unfortunately, I cannot pass the call parameters *args and **kwargs to the iterator.
So my questions:
Is it possible and wise to implement an iterator/generator pattern for delegates?
What should I change to get it working?
I have only tried to implement the iterator pattern so far. If it works, I would like to upgrade it to a generator - if possible :)
I'm not familiar with this, but I gave it a shot. It is not well tested, but it should help you on the way to solving your task. Here is the code:
class Delegate:
    class IterDelegate:
        def __init__(this, invocationList, *args, **kwargs):
            this.__args = args
            this.__kwargs = kwargs
            this._invocationList = invocationList

        def __iter__(this):
            this.__index = 0
            return this

        def __next__(this):
            if this.__index < len(this._invocationList):
                func = this._invocationList[this.__index]
                this.__index += 1
                return (func.__name__, func(*this.__args, **this.__kwargs))
            raise StopIteration

    def __init__(self, func):
        if isinstance(func, list):  # accept either a list of callables or a single callable
            self._invocationList = func
        else:
            self._invocationList = [func]

    def __call__(self, *args, **kwargs):
        return self.IterDelegate(self._invocationList, *args, **kwargs)

    def __iadd__(self, func):
        self._invocationList.append(func)
        return self

    def __isub__(self, func):
        self._invocationList.remove(func)
        return self

def test2(a, b):
    return a + b

def test1(*args):
    return 6

delegate = Delegate(test2)
delegate += test1
results = delegate(2, 3)
for r in results:
    print("function={0} result={1}".format(*r))
This gives the following results:
function=test2 result=5
function=test1 result=6
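If you want the generator upgrade the question mentions, here is a sketch (my variation, not part of the answer above): __call__ can simply be a generator function that yields (function, result) pairs, which removes the need for a hand-written iterator class entirely. The trade-off is that calling the delegate no longer returns a single result directly, even when only one function is registered.

class Delegate:
    def __init__(self, *funcs):
        self._invocation_list = list(funcs)

    def __iadd__(self, func):
        self._invocation_list.append(func)
        return self

    def __isub__(self, func):
        self._invocation_list.remove(func)
        return self

    def __call__(self, *args, **kwargs):
        # A generator function: each registered callable is invoked lazily,
        # one per iteration step.
        for func in self._invocation_list:
            yield func, func(*args, **kwargs)

def test2(a, b):
    return a + b

def test1(*args):
    return 6

delegate = Delegate(test2)
delegate += test1
for func, result in delegate(2, 3):
    print("function={0} result={1}".format(func.__name__, result))
# function=test2 result=5
# function=test1 result=6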
