Python - Wrapping object with a class and preserving its methods - python-3.x

I have a scenario where I accept different objects (classes or functions) and wrap them in a class to enhance their capabilities, while still being able to access their native methods (which I didn't write).
With the __call__ method I can easily pass arguments through to the native __call__, but how can I also route methods I don't know about beforehand to the wrapped object?
For example:
import modules.i.didnt.write as some_classes
import modules.i.didnt.write2 as some_functions

class Wrapper:
    def __init__(self, module, attr_name):
        self.obj = getattr(module, attr_name)
        self.extra_args = ....

    def __call__(self, *args, **kwargs):
        return self.obj(*args, **kwargs)

    def added_functionality(self, ...):
        ....
wrapped_class = Wrapper(some_classes, 'class_a')
wrapped_function = Wrapper(some_functions, 'func_a')
wrapped_class(a=1, b=2)
wrapped_function(a=10, b=20)
wrapped_class.native_method(c=10) # <--------------
In this example, the last call will fail because native_method does not exist on the Wrapper class, but it does exist on the original class_a.
How can I support the native functionality while adding my own?
Am I taking the wrong approach? Is there a better way to do it? Is it even possible?
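One common way to get this behaviour (a sketch, not from the original thread) is to implement __getattr__, which Python consults only after normal attribute lookup fails, and forward unknown names to the wrapped object:

class Wrapper:
    def __init__(self, module, attr_name):
        self.obj = getattr(module, attr_name)

    def __call__(self, *args, **kwargs):
        return self.obj(*args, **kwargs)

    def __getattr__(self, name):
        # reached only when the attribute is not found on Wrapper itself,
        # so added methods and instance attributes still take precedence
        return getattr(self.obj, name)

With that in place, wrapped_class.native_method(c=10) resolves on the wrapped class_a, while Wrapper's own methods shadow colliding names.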

Related

Using parent class method and child instance, call the most concrete method (child's method)

Consider this code:
class Interface():
    def my_method(self, a):
        print(f"Interface: {a=}")

class ChildA(Interface):
    def my_method(self, a):
        print(f"ChildA test: {a=}")

method = Interface.my_method
my_instance = ChildA()
method(my_instance, a=42)
This prints Interface: a=42. How can I make it print ChildA test: a=42?
Obviously I do not want to use ChildA.my_method, I want a solution that works for any new child classes that are implemented in the future.
The best you can do is just write a helper function:
def methodcaller(name):
    def inner(obj, *args, **kwargs):
        return getattr(obj, name)(*args, **kwargs)
    return inner

method = methodcaller('my_method')
Knowing the method name and the fact that the object will be an instance of Interface doesn't let you perform any part of the method resolution in advance, unfortunately. You would need to know the concrete class in advance. (Also, operator.methodcaller doesn't work, because it requires all method arguments to be provided at methodcaller creation time.)
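A quick check that the helper dispatches to the most derived implementation:

method = methodcaller('my_method')
my_instance = ChildA()
method(my_instance, a=42)   # prints: ChildA test: a=42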

MetaClass in Python

I am trying to create a metaclass for my class.
I print information about my class in the metaclass, and then create two objects of my class, but the second object gets created without going through my metaclass.
Does the metaclass get called only once per class?
Any help will be appreciated. Thanks.
class Singleton(type):
    def __new__(cls, name, bases, attr):
        print(f"name {name}")
        print(f"bases {bases}")
        print(f"attr {attr}")
        print("Space Please")
        return super(Singleton, cls).__new__(cls, name, bases, attr)

class Multiply(metaclass=Singleton):
    pass

objA = Multiply()
objB = Multiply()
print(objA)
print(objB)
Yes - the metaclass's __new__ and __init__ methods are called only when the class is created. After that, in your example, the class is bound to the Multiply name; in many respects it is just an object like any other in Python. When you do objA = Multiply() you are not creating a new instance of type(Multiply), which is the metaclass - you are creating a new instance of Multiply itself: Multiply.__new__ and Multiply.__init__ are called.
Now, there is this: the mechanism in Python that makes __new__ and __init__ run when an instance is created lives in the metaclass's __call__ method. Just as obj() invokes type(obj).__call__(obj) for any instance whose class defines __call__, doing Multiply() invokes (in this case) Singleton.__call__(Multiply).
Since Singleton does not implement __call__, the __call__ of its superclass - type - runs instead, and it is in there that Multiply.__new__ and __init__ are called.
That said, there is nothing in the code above that would make your classes behave as "singletons". More importantly, you don't need a metaclass to have a singleton in Python. I don't know who invented this thing, but it keeps circulating around.
First, if you really need a singleton, all you need to do is write a plain class, nothing special, create your single instance, and document that the instance should be used. Just as people use None - no one keeps getting a reference to NoneType and calling it to obtain a None reference:
class _Multiply:
    ...

# document that the code should use this instance:
Multiply = _Multiply()
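Client code then just imports and uses that instance; mymodule and some_method below are placeholder names:

# elsewhere in the package
from mymodule import Multiply

Multiply.some_method()   # always the same, single instance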
Second: alternatively, if your code does need to instantiate the singleton class at the point of use, you can use the class's own __new__ method to control instantiation - no need for a metaclass:
class Multiply:
    _instance = None

    def __new__(cls):
        if not cls._instance:
            cls._instance = super().__new__(cls)
            # insert any code that would go in `__init__` here:
            ...
        ...
        return cls._instance
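A quick sanity check of this variant:

a = Multiply()
b = Multiply()
print(a is b)   # True - every call hands back the one cached instance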
Third, just for demonstration purposes (please don't use this): the metaclass mechanism for singletons can be built in the __call__ method:
class Singleton(type):
    registry = {}

    def __new__(mcls, name, bases, attr):
        print(f"name {name}")
        print(f"bases {bases}")
        print(f"attr {attr}")
        print("Class created")
        print("Space Please")
        return super(Singleton, mcls).__new__(mcls, name, bases, attr)

    def __call__(cls, *args, **kw):
        registry = type(cls).registry
        if cls not in registry:
            print(f"{cls.__name__} being instantiated for the first time")
            registry[cls] = super().__call__(*args, **kw)
        else:
            print(f"Attempting to create a new instance of {cls.__name__}. Returning single instance instead")
        return registry[cls]

class Multiply(metaclass=Singleton):
    pass
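Instantiating it twice shows the registry at work:

objA = Multiply()     # Multiply being instantiated for the first time
objB = Multiply()     # Attempting to create a new instance of Multiply. Returning single instance instead
print(objA is objB)   # True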

Singleton only containing a dict, statically accessible

I want to provide a class that contains a dictionary and that should be accessible all over my package. This class should be initialized by another class, a database connector.
I retrieve the mapping from the database, but I want to do this only once, on initialization of the database connector. Furthermore, this mapping should then be available to all other modules in my package without passing the database connector instance through all function calls.
I thought about using a singleton pattern and tried some stuff from this SO post, but I can't find a working solution.
I tried it this way with a metaclass:
The mapping class:
class Singleton(type):
    _instances = {}

    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:
            cls._instances[cls] = super(Singleton, cls).__call__(*args, **kwargs)
        return cls._instances[cls]

class CFMapping(metaclass=Singleton):
    def __init__(self, cf_mapping: dict):
        self._cf_mapping = cf_mapping

    @classmethod
    def get_cf_by_name(cls, name: str) -> str:
        return cls._cf_mapping.get(name)
The database connector
class DBConnector:
    def __init__(....):
        # some init logic, connection to db etc...
        self.cf_mapping = self.get_cf_mapping()  # just returning a dict from a rest call
Now I expect the mapping to be accessible via the DBConnector instance.
But in other scripts, where I don't have this instance, I would like to access the mapping through a static/class method like this:
CFMapping.get_cf_by_name("someName")
# leads to AttributeError as CFMapping has no attribute _cf_mapping
Is there a way to get this construct to work the way I want, or is there a better approach for a problem like this?
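The AttributeError comes from get_cf_by_name being a classmethod while __init__ stores the mapping on the instance. A minimal sketch of one fix, assuming the DBConnector instantiates CFMapping exactly once at startup, is to store the mapping on the class itself:

class CFMapping(metaclass=Singleton):
    _cf_mapping: dict = {}

    def __init__(self, cf_mapping: dict):
        # store on the class, not the instance, so the classmethod sees it
        type(self)._cf_mapping = cf_mapping

    @classmethod
    def get_cf_by_name(cls, name: str) -> str:
        return cls._cf_mapping.get(name)

After the connector has run CFMapping(mapping) once, any module can call CFMapping.get_cf_by_name("someName") without holding a reference to the connector.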

Change class parent dynamically from attributes python 3

I want the new class to dynamically inherit from different parents depending on an attribute given when creating an instance. So far I've tried something like this:
class Meta(type):
    chooser = None

    def __call__(cls, *args, **kwargs):
        if kwargs['thingy'] == 'option':
            Meta.choose = option
        return super().__call__(*args, **kwargs)

    def __new__(cls, name, parents, attrs):
        if Meta.choose == option:
            bases = (parent1)
        return super().__new__(cls, name, parents, attrs)
It doesn't work, is there a way that, depending on one of the parameters of the instance, I can dynamically choose a parent for the class?
First, let's fix a trivial mistake in the code before digging into the real problem: the bases parameter needs to be a tuple. When you write bases = (parent1), the right-hand side is not a tuple - it is merely a parenthesized expression that evaluates to parent1 itself.
Change that to bases = (parent1,) whenever you need to create a one-element tuple of bases.
The second mistake is more conceptual, and is probably why you didn't get it to work across various attempts: the __call__ method of the metaclass is not something one usually fiddles with. To keep a long story short, the __call__ method of a class is what coordinates the calls to __new__ and __init__ when instances of that class are created - Python does that automatically, and it is type's __call__ that carries this mechanism. Transposing that to your metaclass, you may realize that the metaclass's __call__ is not used when the metaclass's own __new__ and __init__ are about to be called (i.e. when a class is defined). In other words, the __call__ that runs at class-definition time belongs to the "meta-metaclass" (which is, again, type).
The __call__ method you wrote will instead run when instances of your custom classes are created (which is what you intended), but it has no effect on class creation, as it won't invoke the metaclass's __new__ - just the class's own __new__ (which is not what you intended).
So, inside __call__ you must not call super().__call__ with the same arguments you received: that passes cls straight to type's __call__, and the bases of cls were baked in when the metaclass's __new__ ran - which happens when the class body is declared.
Instead, in this __call__ you would have to dynamically create a new class (or pick one from a pre-filled table) and pass that dynamically created class to type.__call__.
But at the end of the day, all of this can be done with a plain factory function, so there is no need for this super-complicated metaclass mechanism - and other Python tools such as linters and static analyzers (as embedded in an IDE you or your colleagues may be using) will likely work better with it.
Solution using a factory function:

def factory(cls, *args, options=None, **kwargs):
    if options == 'thingy':
        # copy the class namespace: type() needs a real dict, and the
        # __dict__/__weakref__ slot descriptors cannot be re-created
        namespace = {k: v for k, v in vars(cls).items()
                     if k not in ('__dict__', '__weakref__')}
        cls = type(cls.__name__, (option1,), namespace)
    elif options == 'other':
        ...
    return cls(*args, **kwargs)
If you don't want to create a new class on every call, but want to reuse the dynamically created classes, just create a cache dictionary and use the dict's setdefault method:

class_cache = {}

def factory(cls, *args, options=None, **kwargs):
    if options == 'thingy':
        namespace = {k: v for k, v in vars(cls).items()
                     if k not in ('__dict__', '__weakref__')}
        cls = class_cache.setdefault((cls.__name__, options),
                                     type(cls.__name__, (option1,), namespace))
    elif options == 'other':
        ...
    return cls(*args, **kwargs)

(The setdefault method stores the second argument in the dict if the key (name, options) does not exist yet.)
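A hypothetical usage sketch, with Base and option1 standing in for the real classes from the question:

class option1:
    def extra(self):
        return "behaviour mixed in by the factory"

class Base:
    pass

obj = factory(Base, options='thingy')
print(type(obj).__bases__)   # (<class '__main__.option1'>,) - a freshly built class
print(obj.extra())           # behaviour mixed in by the factory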
Using a metaclass:
Updated - after breakfast :-) I came up with this: make your metaclass's __new__ inject a __new__ function into the created class that will either create a new class with the desired bases dynamically, or reuse a cached one for the same options. Unlike the factory examples, the metaclass records the original parameters of the class creation, so the derived class can be rebuilt from them:
parameter_buffer = {}
derived_classes = {}

class Meta(type):
    def __new__(metacls, name, bases, namespace):
        cls = super().__new__(metacls, name, bases, namespace)
        parameter_buffer[cls] = (name, bases, namespace)
        original_new = cls.__new__  # keep the inherited __new__ around

        def __new__(cls, *args, option=None, **kwargs):
            if option is None:
                return original_new(cls, *args, **kwargs)
            name, bases, namespace = parameter_buffer[cls]
            if option == 'thingy':
                bases = (option1,)
            elif option == 'thingy2':
                ...
            if (cls, bases) not in derived_classes:
                derived_classes[cls, bases] = type(name, bases, namespace)
            return derived_classes[cls, bases](*args, **kwargs)

        cls.__new__ = __new__
        return cls
To keep the example short, this simply overwrites any explicit __new__ method on the class that uses this metaclass. Also, the derived classes created this way do not themselves carry the same capability, since they are created by calling type, and the metaclass is discarded in the process. Both things could be taken care of with more careful code, but that would get too complicated for an example here.
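A small demonstration, with option1 and Base as hypothetical stand-ins:

class option1:
    def extra(self):
        return "mixed in via option"

class Base(metaclass=Meta):
    pass

plain = Base()                    # no option: an ordinary Base instance
special = Base(option='thingy')   # rebuilt on the fly with option1 as base
print(type(special).__bases__)    # (<class '__main__.option1'>,)
print(special.extra())            # mixed in via option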

Show docstrings on every function call

Let's say I have a code like this:
import unittest

class CommonMethods():
    def create_account(self, account):
        """Creating account"""
        ...

class NewTestCase(unittest.TestCase, CommonMethods):
    def setUp(self):
        self.shortDescription()

    def test_01_sample_test(self):
        """Testing something"""
        self.create_account(self.arg['account'])
        assert ...
    ...

if __name__ == '__main__':
    unittest.main(verbosity=2, warnings='ignore')
I want to show the docstrings of all methods defined by me ('Testing something' and 'Creating account'), but the execution shows 'Testing something' only. Any tips?
Maybe there is an option for that in the unittest module, but I doubt it; otherwise, how would the module distinguish between your methods and functions and all sorts of library functions?
What you could do is use another function that modifies the existing functions to print their docstring and/or other useful information whenever they are called. You could make this a decorator, or just call the function manually before running the tests.
This one should 'verbosify' all the methods of a given class (only slightly tested!), and you could write similar ones for individual functions or entire modules.
def verbosify(clazz):
    for name in dir(clazz):
        attr = getattr(clazz, name)
        if not name.startswith("__") and callable(attr):
            # bind name/attr as default arguments: a plain closure would
            # late-bind, and every wrapper would report the last method
            def attr_verbose(*args, _name=name, _attr=attr, **kwargs):
                print("Calling", _name, args, kwargs)
                print(_attr.__doc__)
                return _attr(*args, **kwargs)
            setattr(clazz, name, attr_verbose)
Just call verbosify(CommonMethods) in your main block.
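For instance, with a toy class (the names here are illustrative):

class Greeter:
    def greet(self, name):
        """Say hello to someone."""
        print(f"Hello, {name}!")

verbosify(Greeter)
Greeter().greet(name="World")
# Calling greet (<__main__.Greeter object at 0x...>,) {'name': 'World'}
# Say hello to someone.
# Hello, World!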
