I have connected a method to a combobox like this:

def run(self):
    GetAllLayers(self)  # custom method
    # attach index-changed event / passing a parametric method
    self.dlg.cbLayerNamesAll.currentIndexChanged.connect(lambda arg=self: LayersValueChange(arg))

I am getting an error here:

def LayersValueChange(self):
    layers = self.iface.legendInterface().layers()  # error here

And the error is:

layers = self.iface.legendInterface().layers()
AttributeError: 'int' object has no attribute 'iface'

self should be an object, but it is being received as an int.
Assuming LayersValueChange is an external function and not a method of the same class, you can connect the signal like this:
self.dlg.cbLayerNamesAll.currentIndexChanged.connect(
    lambda: LayersValueChange(self))
This simply ignores the parameters sent by the signal, and creates a closure that allows self to be referenced later (so there's no need to use arg=self).
If you also want the index sent by the signal, you will have to change the signature of the function, and then connect the signal like this:
self.dlg.cbLayerNamesAll.currentIndexChanged.connect(
    lambda index: LayersValueChange(self, index))

def LayersValueChange(self, index):
    layers = self.iface.legendInterface().layers()
    print(index)
However, a much better design would be to make all the functions methods of the same class. Then your code would look like this:
class MyClass(QtGui.QMainWindow):
    def __init__(self, parent=None):
        super(MyClass, self).__init__(parent)
        ...
        self.dlg.cbLayerNamesAll.currentIndexChanged.connect(
            self.layersValueChange)

    def run(self):
        self.getAllLayers()

    def layersValueChange(self, index):
        layers = self.iface.legendInterface().layers()

    def getAllLayers(self):
        ...
The full signature of your signal is currentIndexChanged(int index) (see the doc). So the argument your lambda receives is of type int, and that int is what ends up bound to the self parameter of LayersValueChange.
You need to do two things:
Don't pass self as a default parameter. In fact, you don't need any default parameter, because currentIndexChanged always provides one.
Change your slot signature to correctly accept the int parameter:
def LayersValueChange(self, index):
    …
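Putting both fixes together, here is a minimal runnable sketch of the pitfall and the cure. It uses PyQt5 and a stand-in object instead of the QGIS plugin, so the names are illustrative only; the mechanics are identical in PyQt4.

import sys
from PyQt5.QtWidgets import QApplication, QComboBox

def layers_value_change(owner, index):
    # `owner` is the object captured in the closure; `index` comes from the signal.
    print(f'owner={owner!r}, index={index}')

app = QApplication(sys.argv)
combo = QComboBox()
combo.addItems(['a', 'b', 'c'])

sentinel = object()  # stands in for `self` from the original code

# Broken: the signal's int overwrites the default argument,
# so the slot receives 1 where it expected `sentinel`.
combo.currentIndexChanged.connect(lambda arg=sentinel: print('arg is', arg))

# Fixed: capture `sentinel` in the closure, forward the signal's int explicitly.
combo.currentIndexChanged.connect(lambda index: layers_value_change(sentinel, index))

combo.setCurrentIndex(1)  # triggers both slots; the broken one prints 'arg is 1'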
I am interested in patching a classmethod called _validate in a Schema class, and in having the replacement function use the value of cls and the other arguments.
For context, ArrayHoldingAnyType inherits from Schema, and _validate is called when it is instantiated.
When I try it with the code below, the value for cls is not a class. How do I fix the cls variable?
def test_validate_called_n_times(self):
    def replacement_validate(cls, *args):
        # code which will return the correct values
        ...

    with patch.object(Schema, '_validate', new=replacement_validate) as mock_validate:
        path_to_schemas = ArrayHoldingAnyType(['a'])
        # I will check that the mock was called a certain number of times here with specific inputs
So the problem here was that the classmethod decorator was missing from replacement_validate.
This fixes it:
def test_validate_called_n_times(self):
    @classmethod
    def replacement_validate(cls, *args):
        # code which will return the correct values
        ...

    with patch.object(Schema, '_validate', new=replacement_validate) as mock_validate:
        path_to_schemas = ArrayHoldingAnyType(['a'])
        # I will check that the mock was called a certain number of times here with specific inputs
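For illustration, here is a self-contained sketch of the same fix; the Schema and ArrayHoldingAnyType classes below are simplified stand-ins for the real library, not its actual code:

from unittest.mock import patch

class Schema:
    @classmethod
    def _validate(cls, *args):
        return args

class ArrayHoldingAnyType(Schema):
    def __init__(self, data):
        self._validate(data)

def test_validate_called():
    calls = []

    @classmethod
    def replacement_validate(cls, *args):
        # cls is a real class now, because the replacement is also a classmethod
        calls.append((cls, args))

    with patch.object(Schema, '_validate', new=replacement_validate):
        ArrayHoldingAnyType(['a'])

    assert calls == [(ArrayHoldingAnyType, (['a'],))]

test_validate_called()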
I have one question: why do I need to call super().__init__() in a metaclass? Since a metaclass is a factory of classes, I thought we don't need to call initialization for making objects of class Shop. Or do we initialize the class itself by using super().__init__()? (My IDE says that I should call it, but without super().__init__() nothing bad happens; my class works without mistakes.)
Can you explain why?
Thanks in advance!
from collections import OrderedDict

class Descriptor:
    _counter = 0

    def __init__(self):
        self.attr_name = f'Descriptor attr#{Descriptor._counter}'
        Descriptor._counter += 1

    def __get__(self, instance, owner):
        return self if instance is None else instance.__dict__[self.attr_name]

    def __set__(self, instance, value):
        if value > 0:
            instance.__dict__[self.attr_name] = value
        else:
            msg = 'Value must be > 0!'
            raise AttributeError(msg)

class Shop():
    weight = Descriptor()
    price = Descriptor()

    def __init__(self, name, price, weight):
        self.name = name
        self.price = price
        self.weight = weight

    def __repr__(self):
        return f'{self.name}: price - {self.price} weight - {self.weight}'

    def buy(self):
        return self.price * self.weight

class Meta(type):
    def __init__(cls, name, bases, attr_dict):
        super().__init__(name, bases, attr_dict)  # <- this is that func. call
        for key, value in attr_dict.items():
            if isinstance(value, Descriptor):
                # Here I rename the attr_name of the descriptor's object.
                value.attr_name = key

    @classmethod
    def __prepare__(metacls, name, bases):
        return OrderedDict()
You don't "need" to, and if your code uses no other custom metaclasses, not calling super().__init__() in the metaclass will work just the same.
But if one needs to combine your metaclass with another through inheritance, it won't work "out of the box" without the super() call: the super() call is the way to ensure all methods in the inheritance chain are called.
And while at first it may look like metaclasses are extremely rare and combining them would likely never take place, quite a few libraries and frameworks have their own metaclasses, including Python's "abc" (abstract base classes), PyQt, ORM frameworks, and so on. If every metaclass under your control is well behaved, with proper super() calls in its __new__, __init__ and __call__ methods (if you override those), combining your metaclass with another into a single working metaclass can be done in one line:

CompatibleMeta = type("CompatibleMeta", (Meta, type(OtherClassBase)), {})

This way, for example, if you want to use the mechanisms in your metaclass in a class that uses ABCMeta's functionality, you just do it: the __init__ method in your Meta will call the other metaclass's __init__. Without the super() call it would not run, some subtle thing would not be initialized in your classes, and that could be a very hard-to-find bug.
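A minimal sketch of that one-liner in action, combining a simplified version of the question's Meta (the Descriptor renaming loop is omitted) with abc.ABCMeta:

import abc

class Meta(type):
    def __init__(cls, name, bases, attr_dict):
        # well-behaved super() call lets other metaclasses in the MRO run too
        super().__init__(name, bases, attr_dict)

# one line combines both metaclasses
CompatibleMeta = type("CompatibleMeta", (Meta, abc.ABCMeta), {})

class Base(metaclass=CompatibleMeta):
    @abc.abstractmethod
    def buy(self):
        ...

class Shop(Base):
    def buy(self):
        return 42

print(Shop().buy())  # 42: ABCMeta's machinery still works
# Base() would raise TypeError: can't instantiate abstract class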
On a side note: there is no need to declare __prepare__ in a metaclass if all it does is create an OrderedDict, on Python 3.6 or newer: since that version, the dictionaries used as the "locals()" while executing class bodies are ordered by default. Also, if another metaclass you are combining with also has a __prepare__, there is no way to make that work automatically via super(): you have to check the code and decide which of the two __prepare__s should be used, or create a new mapping type with the features needed by both metaclasses.
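To illustrate that side note, on Python 3.6+ attribute definition order is already preserved in the class namespace without any __prepare__:

class Example:
    b = 1
    a = 2

# definition order is preserved: 'b' comes before 'a'
names = [n for n in Example.__dict__ if not n.startswith('__')]
print(names)  # ['b', 'a']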
Let's say I want to set up functions for each class in a module named 'MacroMethods'. So I've set up singledispatch after seeing it in 'Fluent Python', like this:
from functools import singledispatch

import MacroMethods

@singledispatch
def addMethod(self, obj):
    print(f'Wrong Object {str(obj)} supplied.')
    return obj

...

@addMethod.register(MacroMethods.Wait)
def _(self, obj):
    print('adding object wait')
    obj.delay = self.waitSpin.value
    obj.onFail = None
    obj.onSuccess = None
    return obj
The desired behavior: when an instance of class 'MacroMethods.Wait' is given as an argument, singledispatch runs the function registered for that class.
Instead, it runs the default function rather than the registered one:

>>> Wrong Object <MacroMethods.Wait object at 0x0936D1A8> supplied.

However, type() clearly shows the instance is of class 'MacroMethods.Wait', and the registry's dict_keys also contains it:

>>> dict_keys([<class 'object'>, ..., <class 'MacroMethods.Wait'>])

I suspect all the custom classes I made count as type 'object', so the registered functions never run.
Is there any way to solve this problem? The entire code is here.
Update
I've managed to mimic singledispatch's behavior as follows:
from functools import wraps

def state_deco(func_main):
    """
    Decorator that mimics singledispatch for ease of interaction expansions.
    """
    # assuming no args are needed for interaction functions.
    func_main.dispatch_list = {}  # collect decorated functions

    @wraps(func_main)
    def wrapper(target):
        # dispatch target to the destination interaction function.
        try:
            # find and run the callable registered for target's type
            return func_main.dispatch_list[type(target)]()
        except KeyError:
            # if no matching case is found, run the main decorated function instead.
            return func_main()

    def register(target):
        # A decorator that registers the decorated function with the main decorated function.
        def decorate(func_sub):
            func_main.dispatch_list[target] = func_sub

            def register_wrapper(*args, **kwargs):
                return func_sub(*args, **kwargs)

            return register_wrapper
        return decorate

    wrapper.register = register
    return wrapper
Used like:
@state_deco
def general():
    return "A's reaction to undefined others."

@general.register(StateA)
def _():
    return "A's reaction of another A"

@general.register(StateB)
def _():
    return "A's reaction of B"
But it's still not singledispatch, so I feel it might be inappropriate to post this as an answer.
I wanted to do something similar and ran into the same trouble. It looks like we have bumped into a Python bug. I found a write-up that describes this situation.
Here is the link to the Python Bug Tracker.
Python 3.7 breaks on singledispatch_function.register(pseudo_type), which Python 3.6 accepted
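One aside not raised in the thread: plain singledispatch dispatches on the first positional argument, which for a method like addMethod(self, obj) is self, not obj. On Python 3.8+, functools.singledispatchmethod dispatches on the first argument after self. A sketch with a stand-in Wait class (not the real MacroMethods module):

from functools import singledispatchmethod

class Wait:  # stand-in for MacroMethods.Wait
    pass

class MacroEditor:
    @singledispatchmethod
    def addMethod(self, obj):
        print(f'Wrong Object {obj} supplied.')
        return obj

    @addMethod.register
    def _(self, obj: Wait):
        print('adding object wait')
        return obj

editor = MacroEditor()
editor.addMethod(Wait())  # adding object wait
editor.addMethod(123)     # Wrong Object 123 supplied.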
I'm writing some tests for a library using pytest. I want to try a number of test cases for each function exposed by the library, so I've found it convenient to group the tests for each method in a class. All of the functions I want to test have the same signature and return similar results, so I'd like to use a helper method defined in a superclass to do some assertions on the results. A simplified version would run like so:
class MyTestCase:
    function_under_test: Optional[Callable[[str], Any]] = None

    def assert_something(self, input_str: str, expected_result: Any) -> None:
        if self.function_under_test is None:
            raise AssertionError(
                "To use this helper method, you must set the function_under_test "
                "class variable within your test class to the function to be called.")
        result = self.function_under_test.__func__(input_str)
        assert result == expected_result
        # various other assertions on result...

class FunctionATest(MyTestCase):
    function_under_test = mymodule.myfunction

    def test_whatever(self):
        self.assert_something("foo bar baz", "some expected result")
In assert_something, it's necessary to access __func__ on the function, since assigning a function to a class attribute makes it a bound method of that class; otherwise self would be passed through as the first argument to the external library function, where it doesn't make any sense.
This code works as intended. However, it yields the MyPy error:
"Callable[[str], Any]" has no attribute "__func__"
Based on my annotation, it's correct that this isn't a safe operation: an arbitrary Callable may not have a __func__ attribute. However, I can't find any type annotation that would indicate that the function_under_test variable refers to a method and thus will always have __func__. Am I overlooking one, or is there another way to tweak my annotations or accesses to get this working with type-checking?
Certainly, there are plenty of other ways I could get around this, some of which might even be cleaner (use an Any type, skip type checking, use a private method to return the function under test rather than making it a class variable, make the helper method a function, etc.). I'm more interested in whether there's an annotation or other mypy trick that would get this code working.
Callable only makes sure that your object has the __call__ method.
Your problem is the call self.function_under_test.__func__(input_str); you should just call your function as self.function_under_test(input_str).
Below is your example without mypy complaints (v0.910):
from typing import Any

class MyTestCase:
    def myfunction_wrap(self, *args, **kwargs):
        # subclasses override this to call the function under test
        raise NotImplementedError

    def assert_something(self, input_str: str, expected_result: Any) -> None:
        result = self.myfunction_wrap(input_str)
        assert result == expected_result
        # various other assertions on result...

def myfunction(a: str) -> None:
    ...

class FunctionATest(MyTestCase):
    def myfunction_wrap(self, *args, **kwargs):
        return myfunction(*args, **kwargs)

    def test_whatever(self):
        self.assert_something("foo bar baz", None)
Edit 1: I missed the point of the question; I've moved the function call inside a wrapper method.
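For completeness, one more pattern keeps the original class-variable design: wrapping the function in staticmethod stops it from being bound in the first place, so no __func__ access is needed. A sketch assuming mymodule.myfunction from the question; recent mypy versions understand this pattern, older ones may still complain:

from typing import Any, Callable, ClassVar, Optional

import mymodule  # the library under test, as in the question

class MyTestCase:
    # staticmethod prevents binding, so the attribute stays a plain function
    function_under_test: ClassVar[Optional[Callable[[str], Any]]] = None

    def assert_something(self, input_str: str, expected_result: Any) -> None:
        assert self.function_under_test is not None, "set function_under_test"
        result = self.function_under_test(input_str)
        assert result == expected_result

class FunctionATest(MyTestCase):
    function_under_test = staticmethod(mymodule.myfunction)

    def test_whatever(self):
        self.assert_something("foo bar baz", "expected result")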
class Player():
    def __init__(self):
        ...

    def moveHandle(self, event):
        self.anything = ...

box.bind("<Key>", Player.moveHandle)
The bind function passes self as the event variable and ignores (or throws an error for) event. I can't find a way to pass the event argument to the correct parameter while keeping self for that function, even if I use *args. I can do one or the other, but not both.
I'm probably just lacking some basic knowledge about classes, to be honest; I taught them to myself and didn't do it very thoroughly.
If I made a syntax mistake, it's because I rewrote the code incorrectly here; in my program, the code works until the variables get passed.
The problem is that you are trying to use an instance method as a class method.
Consider the following:
class Player():
    def __init__(self):
        ...

    def moveHandle(self, event):
        self.anything = ...

box.bind("<Key>", Player.moveHandle)
Here box is an instance of something, but Player is not an instance, it's the class itself.
Instead, this:
class Player():
    def __init__(self):
        ...

    def moveHandle(self, event):
        self.anything = ...

p = Player()
box.bind("<Key>", p.moveHandle)
creates an instance of the Player class, and then binds to the instance's method, not the class's.
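A self-contained version of the working pattern (using a plain tkinter Entry as box, since the original widget isn't shown):

import tkinter as tk

class Player:
    def __init__(self):
        self.last_key = None

    def moveHandle(self, event):
        # self arrives automatically because p.moveHandle is a bound method;
        # tkinter supplies the event object as the single argument.
        self.last_key = event.keysym
        print('key pressed:', self.last_key)

root = tk.Tk()
box = tk.Entry(root)
box.pack()
box.focus_set()

p = Player()
box.bind('<Key>', p.moveHandle)

root.mainloop()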