I want the new class to dynamically inherit from different parents depending on an attribute given when creating an instance. So far I've tried something like this:
class Meta(type):
    chooser = None
    def __call__(cls, *args, **kwars):
        if kwargs['thingy'] == 'option':
            Meta.choose = option
        return super().__call__(*args, **kwargs)
    def __new__(cls, name, parents, attrs):
        if Meta.choose == option:
            bases = (parent1)
        return super().__new__(cls, name, parents, attrs)
It doesn't work. Is there a way that, depending on one of the parameters of the instance, I can dynamically choose a parent for the class?
First, let's fix a trivial mistake in the code, and then dig into the "real problem": the bases parameter needs to be a tuple. When you do bases = (option), the right-hand side is not a tuple - it is merely a parenthesized expression that will be resolved and passed on as the non-tuple option.
Change that to bases=(option,) whenever you need to create a tuple for the bases.
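A quick illustration (option here is just a stand-in class):

class option:  # stand-in base class
    pass

not_a_tuple = (option)        # just option, parenthesized
print(not_a_tuple is option)  # True - no tuple was created

bases = (option,)             # the trailing comma makes it a 1-tuple
print(type(bases))            # <class 'tuple'>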
The second mistake is more conceptual, and is probably why you didn't get it to work across various attempts: the __call__ method of the metaclass is not something one usually fiddles with. To keep a long story short, the __call__ method of a class is what is invoked to coordinate the calling of the __new__ and __init__ methods of instances of that class - that is done by Python automatically, and it is the __call__ from type that has this mechanism. When you transpose that to your metaclass, you may realise that the __call__ method of your metaclass is not used when the __new__ and __init__ methods of the metaclass itself are about to be called (when a class is defined). In other words, the __call__ that is used then lives on the "meta-meta" class (which is, again, type).
The __call__ method you wrote will instead be used when instances of your custom classes are created (which is what you intended), but it has no effect on class creation, as it won't invoke the metaclass' __new__ - only the class' own __new__ (which is not what you intended).
So, what you need is, from inside __call__, not to call super().__call__ with the same arguments you received: that would pass cls on to type's __call__, and the bases of cls were baked in when the metaclass __new__ ran - which happens when the class body itself is declared.
Instead, this __call__ must dynamically create a new class (or pick one from a pre-filled table) and then pass that dynamically created class on to type.__call__.
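A minimal sketch of that idea, with option1 as a hypothetical extra base and no caching (note that type() needs a real dict, hence the copy of the mappingproxy):

class option1:  # hypothetical extra base, a stand-in name
    pass

class Meta(type):
    def __call__(cls, *args, thingy=None, **kwargs):
        if thingy == 'option':
            # build a one-off class with the desired base
            cls = type(cls.__name__, (option1,), dict(cls.__dict__))
            # the new class' metaclass is plain type, so calling it goes
            # straight through type.__call__, running __new__ and __init__
            return cls(*args, **kwargs)
        return super().__call__(*args, **kwargs)

class Base(metaclass=Meta):
    pass

obj = Base(thingy='option')
print(isinstance(obj, option1))  # True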
But, at the end of the day, one can see that all of this can be done with a factory function, so there is no need for this super-complicated metaclass mechanism - and other Python tools such as linters and static analysers (as embedded in an IDE you or your colleagues may be using) will likely work better with that.
Solution using factory function:
def factory(cls, *args, options=None, **kwargs):
    if options == 'thingy':
        # type() requires a real dict, so copy the class' mappingproxy
        cls = type(cls.__name__, (option1,), dict(cls.__dict__))
    elif options == 'other':
        ...
    return cls(*args, **kwargs)
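Hypothetical usage, with stand-in classes (option1 here is whatever extra base you want mixed in):

class option1:  # stand-in mixin providing extra behaviour
    def extra(self):
        return 'extra!'

class Base:
    pass

obj = factory(Base, options='thingy')
print(obj.extra())  # 'extra!' - the instance's class has option1 as a base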
If you don't want to create a new class on every call, but want to share a couple of pre-existing classes with common bases, just create a cache-dictionary, and use the dict's setdefault method:
class_cache = {}

def factory(cls, *args, options=None, **kwargs):
    if options == 'thingy':
        cls = class_cache.setdefault(
            (cls.__name__, options),
            type(cls.__name__, (option1,), dict(cls.__dict__)),
        )
    elif options == 'other':
        ...
    return cls(*args, **kwargs)
(The setdefault method will store the second argument on the dict if the key (name, options) does not exist yet).
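For example:

cache = {}
print(cache.setdefault('k', 1))  # key missing: stores 1 and returns it
print(cache.setdefault('k', 2))  # key present: ignores 2, returns the stored 1

One design note: setdefault evaluates its second argument eagerly, so the type(...) call above still runs even when the class is already cached; if class creation ever becomes expensive, an explicit "if key not in class_cache" test avoids the wasted work.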
Using a metaclass:
Updated: after breakfast :-) I came up with this:
Make your metaclass __new__ inject a __new__ function on the created class itself, which will either dynamically create a new class with the desired bases or use a cached one for the same options. But unlike the factory example, use the metaclass to annotate the original parameters of the class creation, so they can be reused to create the derived class:
parameter_buffer = {}
derived_classes = {}
class Meta(type):
    def __new__(metacls, name, bases, namespace):
        cls = super().__new__(metacls, name, bases, namespace)
        parameter_buffer[cls] = (name, bases, namespace)
        original_new = cls.__new__  # keep a reference before overwriting

        def __new__(cls, *args, option=None, **kwargs):
            if option is None:
                if original_new is object.__new__:
                    return original_new(cls)  # object.__new__ takes no extra args
                return original_new(cls, *args, **kwargs)
            name, bases, namespace = parameter_buffer[cls]
            if option == 'thingy':
                bases = (option1,)
            elif option == 'thingy2':
                ...
            if (cls, bases) not in derived_classes:
                derived_classes[cls, bases] = type(name, bases, namespace)
            return derived_classes[cls, bases](*args, **kwargs)

        cls.__new__ = __new__
        return cls
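A hypothetical usage sketch (option1 is a stand-in base, as above):

class option1:
    def extra(self):
        return 'extra!'

class Base(metaclass=Meta):
    pass

plain = Base()                    # no option: an ordinary Base instance
special = Base(option='thingy')   # instance of a derived class based on option1
print(special.extra())            # 'extra!'
print(type(special).__name__)     # 'Base' - the derived class reuses the name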
To keep the example short, this simply overwrites any explicit __new__ method on the class that uses this metaclass. Also, the derived classes created this way do not themselves carry the same capability, since they are created by calling type directly and the metaclass is discarded in the process. Both things could be taken care of with more careful code, but that would make this example too complicated.
I can't have 2 __init__ methods in one class because of function overloading. However, why is it possible that, when initializing a subclass, I'm able to define a new __init__ method and use super().__init__ (or the parent class' __init__) within the subclass' __init__ method? I'm just a little confused by the concept of 2 __init__ methods functioning at the same time.
class Employee:
    emps = 0

    def __init__(self, name, age, pay):
        self.name = name
        self.age = age
        self.pay = pay

class Developer(Employee):
    def __init__(self, name, age, pay, level):
        Employee.__init__(self, name, age, pay)
        self.level = level
I can't have 2 __init__ methods in one class because of function overloading.
Partially true. You can't have 2 __init__ methods in the same class because the language lacks function overloading. (Libraries can partially restore a limited form of function overloading; see functools.singledispatchmethod for an example.)
I'm just a little confused by the concept of 2 __init__ methods functioning at the same time.
But you aren't trying to overload __init__. You are overriding __init__, providing a different definition for Developer than the definition it inherits from Employee. (In fact, Employee is overriding __init__ as well, using its own definition in place of the one it inherits from object.) Each class has only one definition.
In your definition of Developer.__init__, you are simply making an explicit call to the inherited method to do the initialization common to all instances of Employee, before doing the Developer-specific initialization on the same object.
Using super, you are using a form of dynamic lookup that lets the method resolution order (MRO) for instances of Developer decide which "next" version of __init__ Developer should call. For single inheritance, the benefit is little more than avoiding a hard-coded reference to Employee. But for multiple inheritance, super is crucial to ensuring that all inherited methods (both the ones you know about and the ones you may not) get called, and more importantly, are called in the right order.
A full discussion of how to properly use super is beyond the scope of this question, I think, but I'll show your two classes rewritten to make the best use of super, and refer you to Python's super() considered super! for more information.
# Main rules:
# 1. *All* classes use super().__init__, even if you are only inheriting
#    from object, because you don't know who will use you as a base class.
# 2. __init__ should use keyword arguments, and be prepared to accept any
#    keyword arguments.
# 3. All keyword arguments that don't get assigned to your own parameters
#    are passed on to an inherited __init__() to process.
class Employee:
    emps = 0

    def __init__(self, *, name, age, pay, **kwargs):
        super().__init__(**kwargs)
        self.name = name
        self.age = age
        self.pay = pay

class Developer(Employee):
    def __init__(self, *, level, **kwargs):
        super().__init__(**kwargs)
        self.level = level

d1 = Developer(name="Alice", age=30, pay=85000, level=1)
To whet your appetite for the linked article, consider
class A:
    def __init__(self, *, x, **kwargs):
        super().__init__(**kwargs)
        self.x = x

class B:
    def __init__(self, *, y, **kwargs):
        super().__init__(**kwargs)
        self.y = y

class C1(A, B):
    pass

class C2(B, A):
    pass

c1 = C1(x=1, y=2)
c2 = C2(x=4, y=3)
assert c1.x == 1 and c1.y == 2
assert c2.x == 4 and c2.y == 3
The assertions all pass, and both A.__init__ and B.__init__ are called as intended when c1 and c2 are created.
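You can see why by inspecting the method resolution order of each class:

print([c.__name__ for c in C1.__mro__])  # ['C1', 'A', 'B', 'object']
print([c.__name__ for c in C2.__mro__])  # ['C2', 'B', 'A', 'object']

Each super().__init__ call steps to the next class in that list, so every __init__ runs exactly once, ending at object.__init__.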
The super() function is used to give access to methods and properties of a parent or sibling class. Check out: https://www.geeksforgeeks.org/python-super/
I am trying to create a metaclass for my class.
I have tried to print information about my class in the metaclass.
Now I have created two objects of my class, but the second object gets created without referencing my metaclass.
Does the metaclass get called only once per class?
Any help will be appreciated.
Thanks
class Singleton(type):
    def __new__(cls, name, bases, attr):
        print(f"name {name}")
        print(f"bases {bases}")
        print(f"attr {attr}")
        print("Space Please")
        return super(Singleton, cls).__new__(cls, name, bases, attr)

class Multiply(metaclass=Singleton):
    pass

objA = Multiply()
objB = Multiply()
print(objA)
print(objB)
Yes - the metaclass's __new__ and __init__ methods are called only when the class is created. After that, in your example, the class will be bound to the Multiply name. In many aspects, it is just an object like any other in Python. When you do objA = Multiply() you are not creating a new instance of type(Multiply) (which is the metaclass) - you are creating a new instance of Multiply itself: Multiply.__new__ and Multiply.__init__ are called.
Now, there is this: the mechanism in Python that makes __new__ and __init__ be called when creating an instance is the code in the metaclass' __call__ method. That is, just as using any instance with the calling syntax obj() will invoke type(obj).__call__(obj), when you do Multiply() what is called (in this case) is Singleton.__call__(Multiply).
Since Singleton does not implement __call__, the __call__ method of its superclass, type, is used instead - and it is in there that Multiply.__new__ and __init__ are called.
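You can verify this relationship directly:

print(type(Multiply) is Singleton)  # True - the class is an instance of its metaclass
# these two lines are equivalent ways of instantiating Multiply:
objC = Multiply()
objD = type(Multiply).__call__(Multiply)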
That said, there is nothing in the code above that would make your classes behave as "singletons". And, more importantly, you don't need a metaclass to have a singleton in Python. I don't know who invented this thing, but it keeps circulating around.
First, if you really need a singleton, all you need to do is write a plain class, nothing special, create your single instance, and document that that instance is the one to be used. Just as people use None - no one keeps getting a reference to NoneType and calling it to get a None reference:
class _Multiply:
    ...

# document that the code should use this instance:
Multiply = _Multiply()
Second: alternatively, if your code has a need, whatsoever, to instantiate the class that should be a singleton where it is used, you can use the class' __new__ method itself to control instantiation - no need for a metaclass:
class Multiply:
    _instance = None

    def __new__(cls):
        if not cls._instance:
            cls._instance = super().__new__(cls)
            # insert any code that would go in `__init__` here:
            ...
        return cls._instance
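With that, repeated instantiation hands back the same object:

a = Multiply()
b = Multiply()
print(a is b)  # True - both names refer to the single cached instance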
Third, just for demonstration purposes (please don't use this): the metaclass mechanism for singletons can be built in the __call__ method:
class Singleton(type):
    registry = {}

    def __new__(mcls, name, bases, attr):
        print(f"name {name}")
        print(f"bases {bases}")
        print(f"attr {attr}")
        print("Class created")
        print("Space Please")
        return super(Singleton, mcls).__new__(mcls, name, bases, attr)

    def __call__(cls, *args, **kw):
        registry = type(cls).registry
        if cls not in registry:
            print(f"{cls.__name__} being instantiated for the first time")
            registry[cls] = super().__call__(*args, **kw)
        else:
            print(f"Attempting to create a new instance of {cls.__name__}. Returning single instance instead")
        return registry[cls]

class Multiply(metaclass=Singleton):
    pass
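Now the singleton behaviour shows up at instantiation time:

objA = Multiply()    # prints: Multiply being instantiated for the first time
objB = Multiply()    # prints the "Returning single instance instead" message
print(objA is objB)  # True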
I have got one question: why do I need to call super().__init__() in metaclasses? Because a metaclass is a factory of classes, I think we don't need to call initialization for making objects of class Shop. Or is it that with super().__init__ we are initializing the class itself? (My IDE says that I should call it, but without super().__init__ nothing breaks - my class works without mistakes.)
Can you explain why?
Thanks in advance!
from collections import OrderedDict

class Descriptor:
    _counter = 0

    def __init__(self):
        self.attr_name = f'Descriptor attr#{Descriptor._counter}'
        Descriptor._counter += 1

    def __get__(self, instance, owner):
        return self if instance is None else instance.__dict__[self.attr_name]

    def __set__(self, instance, value):
        if value > 0:
            instance.__dict__[self.attr_name] = value
        else:
            msg = 'Value must be > 0!'
            raise AttributeError(msg)

class Meta(type):
    def __init__(cls, name, bases, attr_dict):
        super().__init__(name, bases, attr_dict)  # <- this is that func. call
        for key, value in attr_dict.items():
            if isinstance(value, Descriptor):
                # Here I rename the attribute name of the descriptor's object.
                value.attr_name = key

    @classmethod
    def __prepare__(metacls, name, bases):
        return OrderedDict()

class Shop(metaclass=Meta):
    weight = Descriptor()
    price = Descriptor()

    def __init__(self, name, price, weight):
        self.name = name
        self.price = price
        self.weight = weight

    def __repr__(self):
        return f'{self.name}: price - {self.price} weight - {self.weight}'

    def buy(self):
        return self.price * self.weight
You don't "need" to - and if your code use no other custom metaclasses, not calling the metaclass'__init__.super() will work just the same.
But if one needs to combine your metaclass with another through inheritance, it won't work "out of the box" without the super() call: the super() call is the way to ensure all methods in the inheritance chain are called.
And if at first it looks like metaclasses are extremely rare, and that combining metaclasses would likely never take place: a few libraries and frameworks have their own metaclasses, including Python's "abc"s (abstract base classes), PyQt, ORM frameworks, and so on. If any metaclass under your control is well behaved, with proper super() calls in the __new__, __init__ and __call__ methods (if you override those), then combining it with another metaclass into a working one can be done in a single line:
CompatibleMeta = type("CompatibleMeta", (Meta, type(OtherClassBase)), {})
This way, for example, if you want to use the mechanisms in your metaclass in a class using the ABCMeta functionalities in Python, you just do it. The __init__ method in your Meta will call the other metaclass' __init__. Without the super() call it would not run, some subtle unexpected thing would not be initialized in your classes, and that could be a very hard to find bug.
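For example, combining the Meta above with abc.ABCMeta (a sketch; MyBase and must_implement are made-up names):

import abc

CompatibleMeta = type("CompatibleMeta", (Meta, abc.ABCMeta), {})

class MyBase(metaclass=CompatibleMeta):
    @abc.abstractmethod
    def must_implement(self): ...

# MyBase() now raises TypeError because of the abstract method, showing
# that ABCMeta's machinery ran - and Meta's descriptor renaming still works,
# because its super().__init__ call keeps the chain to ABCMeta/type intact.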
On a side note: there is no need to declare __prepare__ in a metaclass if all it does is create an OrderedDict, on any Python newer than 3.6: since that version, the dictionaries used as the "locals()" while executing class bodies are ordered by default. Also, if another metaclass you are combining with also has a __prepare__, there is no way to make that work automatically via super() - you have to check the code and decide which of the two __prepare__s should be used, or create a new mapping type with features that attend to both metaclasses.
What does this code mean?
class Singleton(object):
    _instances = {}

    def __new__(class_, *args, **kwargs):
        if class_ not in class_._instances:
            class_._instances[class_] = super(Singleton, class_).__new__(class_, *args, **kwargs)  # noqa E501
        return class_._instances[class_]
This is a parent class for creating Singleton classes. The Singleton pattern means that there is only one instance of a class. (For example, None is the only instance of the NoneType class).
This works by creating a map of classes to instances, _instances. It has overridden the default __new__ method so that whenever someone tries to create a new instance, it either uses the existing instance from the map or stores the new instance in the map.
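For example, any class inheriting from it gets the behaviour (Config and Other are made-up subclasses):

class Config(Singleton):
    pass

a = Config()
b = Config()
print(a is b)  # True - the second call returned the stored instance

class Other(Singleton):
    pass

print(Other() is a)  # False - each subclass gets its own entry in _instances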
I have a scenario where I accept different objects (classes or functions) and wrap them with a class to enhance their capabilities, and I still want to be able to access their native methods (which I didn't write).
With the __call__ function, I can easily pass the arguments to the native __call__ function, but how can I still route the functions I don't know beforehand to their native functions?
For example:
import modules.i.didnt.write as some_classes
import modules.i.didnt.write2 as some_functions

class Wrapper:
    def __init__(self, module, attr_name):
        self.obj = getattr(module, attr_name)
        self.extra_args = ....

    def __call__(self, *args, **kwargs):
        return self.obj(*args, **kwargs)

    def added_functionality(self, ...):
        ....

wrapped_class = Wrapper(some_classes, 'class_a')
wrapped_function = Wrapper(some_functions, 'func_a')

wrapped_class(a=1, b=2)
wrapped_function(a=10, b=20)

wrapped_class.native_method(c=10)  # <--------------
In this example, the last line will fail, because native_method does not exist in the Wrapper class, but it does exist in the original class_a.
How can I support the native functionality while adding my own?
Am I taking the wrong approach? Is there a better way to do it? Is it even possible?
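One common way to get that last line working (a sketch of the usual approach, not code from this thread) is to add a __getattr__ to Wrapper: it is only consulted when normal attribute lookup fails, so your added methods still take precedence, and everything else is routed to the wrapped object:

class Wrapper:
    def __init__(self, module, attr_name):
        self.obj = getattr(module, attr_name)

    def __call__(self, *args, **kwargs):
        return self.obj(*args, **kwargs)

    def __getattr__(self, name):
        # reached only when 'name' was not found on the Wrapper itself:
        # fall back to the wrapped class/function
        return getattr(self.obj, name)

With this, wrapped_class.native_method resolves to exactly what class_a.native_method would give you - note that since self.obj is the class itself (not an instance of it), that is the plain function, just as it would be if you accessed it on class_a directly.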