MyPy type annotation for a bound method? - python-3.x

I'm writing some tests for a library using pytest. I want to try a number of test cases for each function exposed by the library, so I've found it convenient to group the tests for each method in a class. All of the functions I want to test have the same signature and return similar results, so I'd like to use a helper method defined in a superclass to do some assertions on the results. A simplified version would run like so:
import mymodule  # the external library under test
from typing import Any, Callable, Optional

class MyTestCase:
    function_under_test: Optional[Callable[[str], Any]] = None

    def assert_something(self, input_str: str, expected_result: Any) -> None:
        if self.function_under_test is None:
            raise AssertionError(
                "To use this helper method, you must set the function_under_test "
                "class variable within your test class to the function to be called.")
        result = self.function_under_test.__func__(input_str)
        assert result == expected_result
        # various other assertions on result...

class FunctionATest(MyTestCase):
    function_under_test = mymodule.myfunction

    def test_whatever(self):
        self.assert_something("foo bar baz")
In assert_something, it's necessary to go through __func__, since assigning a function to a class attribute makes it a bound method of that class -- otherwise self would be passed through as the first argument to the external library function, where it doesn't make any sense.
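A minimal illustration of that binding behaviour (with made-up names, not part of the test suite):

def external(s):                 # stands in for mymodule.myfunction
    return s.upper()

class Holder:
    func = external              # plain function stored as a class attribute

h = Holder()
print(Holder.func)               # <function external ...> -- plain function via the class
print(h.func)                    # <bound method external of <Holder ...>> -- bound via the instance
print(h.func.__func__("hi"))     # 'HI' -- __func__ bypasses the implicit self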
The test code above works as intended. However, it yields the MyPy error:
"Callable[[str], Any]" has no attribute "__func__"
Based on my annotation, it's correct that this isn't a safe operation: an arbitrary Callable may not have a __func__ attribute. However, I can't find any type annotation that would indicate that the function_under_test variable refers to a method and thus will always have __func__. Am I overlooking one, or is there another way to tweak my annotations or accesses to get this working with type-checking?
Certainly, there are plenty of other ways I could get around this, some of which might even be cleaner (use an Any type, skip type checking, use a private method to return the function under test rather than making it a class variable, make the helper method a function, etc.). I'm more interested in whether there's an annotation or other mypy trick that would get this code working.

Callable only guarantees that your object has a __call__ method.
Your problem is the call self.function_under_test.__func__(input_str); you should just call the function directly: self.function_under_test(input_str).
Below is your example without mypy complaints (v0.910):
from typing import Any, Callable, Optional

class MyTestCase:
    function_under_test: Optional[Callable] = None

    def myfunction_wrap(self, *args, **kwargs):
        raise NotImplementedError

    def assert_something(self, input_str: str, expected_result: Any) -> None:
        if self.function_under_test is None:
            raise AssertionError(
                "To use this helper method, you must set the function_under_test "
                "class variable within your test class to the function to be called.")
        result = self.myfunction_wrap(input_str)
        assert result == expected_result
        # various other assertions on result...

def myfunction(a: str) -> None:
    ...

class FunctionATest(MyTestCase):
    def myfunction_wrap(self, *args, **kwargs):
        return myfunction(*args, **kwargs)

    def test_whatever(self):
        self.assert_something("foo bar baz")
Edit1: missed the point of the question; moved the function call inside a wrapper method.

Related

Usage of __setattr__ to rewrite whole method of library class issue: missing 1 required positional argument: 'self'

I've got some imported packages with a tricky structure, and I need to call a method that relies on lots of other methods with non-default parameters that are not class attributes themselves (like a pipeline in sklearn).
A minimal example of this module structure:
class Library_class:
    def __init__(self, defined_class_options):
        self.defined_class_options = defined_class_options

    def method1(self, default_non_class_arg=12):
        assert self.defined_class_options == 3
        return default_non_class_arg

    def method2(self, image):
        return image / self.method1()
Default usage:
class_instance = Library_class(3)
class_instance.method2(36)
> 3.0
I need to set default_non_class_arg to 6 for example.
I've tried multiple approaches:
Analogous to https://stackoverflow.com/a/35634198/7607734
class_instance.method2(36, method1__default_non_class_arg=3)
TypeError: method2() got an unexpected keyword argument 'method1__default_non_class_arg'
It doesn't work, probably because the class definitely doesn't have set_params.
With setattr on a redefined function:
class_instance.__setattr__('method1', Library_class.new_method1)
class_instance.method2( 36 )
TypeError: new_method1() missing 1 required positional argument: 'self'
Both your snippets and question are quite messy, almost to the point of being unreadable.
Anyway, if you want to replace method1 with another function, say new_method1, in a specific instance, just do that. Your call to .__setattr__ does that, but it is not needed at all (and if it were -- say, because you only know the name of the method to be replaced at run time and take it as a parameter -- it would be more correct to call the built-in setattr, not the instance method: setattr(class_instance, "method1", new_method1)).
Ordinarily, if you know at code-writing time that you have to replace method1 in an instance, the assignment operator will do it:
class_instance.method1 = new_method1
What went wrong in your example is that when you assign a method to an instance, instead of a class, you bypass the mechanism Python uses to insert the self argument into the call -- so your new_method1 needs a different signature (and this is exactly what the error message "TypeError: new_method1() missing 1 required positional argument: 'self'" is saying):
class MyClass:
    ...
    def method1(self, param1=36):
        ...
    ...

def new_method1(param1=6):  # <-- written outside of any class body, sans self
    ...

my_instance = MyClass()
my_instance.method1 = new_method1
This will work.
new_method1 could be written in a class body as well, and could be swapped in just the same, but you would still have to write it without the self parameter, and then it would not work straight away as a normal method.
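For example (a sketch illustrating the point above; the self-less replacement only works once it is copied onto the instance, not when called as a regular method):

class MyClass:
    def method1(self, param1=36):
        ...

    def new_method1(param1=6):        # defined in the class body, but without self
        ...

my_instance = MyClass()
my_instance.method1 = MyClass.new_method1   # fine: my_instance.method1() calls new_method1()
# my_instance.new_method1()                 # not usable as a normal method: the instance would be passed as param1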
OR, you can insert the self argument yourself at assignment time -- the functools.partial call is a convenient way to do that:
from functools import partial

class MyClass:
    ...
    def method1(self, param1=36):
        ...

    def new_method1(self, param1=6):
        ...
    ...

my_instance = MyClass()
my_instance.method1 = partial(MyClass.new_method1, my_instance)
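Applied to the Library_class from the question, the same idea looks roughly like this (a sketch; new_method1 here is a hypothetical replacement that only changes the default value):

from functools import partial

def new_method1(self, default_non_class_arg=6):
    assert self.defined_class_options == 3
    return default_non_class_arg

class_instance = Library_class(3)
class_instance.method1 = partial(new_method1, class_instance)  # bind self by hand
print(class_instance.method2(36))                              # 6.0 instead of 3.0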
Now, this should answer what you are asking, but it would not be honest of me to end the answer without saying that this is not a good design. The better option is to pull the parameter from somewhere else, for example an instance attribute, instead of replacing the method entirely just to change a default.
Since, for normal attributes, Python will read the class attribute if no instance attribute exists, this happens naturally: all you have to do is set the new default value on your instance.
class MyClass:
    default_param_1 = 36  # class attribute; valid for every instance unless overridden

    ...
    def method1(self, param1=None):
        if param1 is None:
            param1 = self.default_param_1  # automatically fetched from the class if not set on the instance
        ...
    ...

my_instance = MyClass()
my_instance.default_param_1 = 6
...
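Translated back to the Library_class from the question, this approach would look something like the following (a hypothetical rewrite, only possible if the library class can be patched or subclassed):

class Library_class:
    default_non_class_arg = 12  # class-level default; override per instance as needed

    def __init__(self, defined_class_options):
        self.defined_class_options = defined_class_options

    def method1(self, default_non_class_arg=None):
        if default_non_class_arg is None:
            default_non_class_arg = self.default_non_class_arg
        assert self.defined_class_options == 3
        return default_non_class_arg

    def method2(self, image):
        return image / self.method1()

class_instance = Library_class(3)
class_instance.default_non_class_arg = 6
print(class_instance.method2(36))  # 6.0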

Python use __new__ with no default init

I have the code below to create a Fruit object both when a parameter is passed and when no parameter is passed.
This is what I am trying to achieve, but I get TypeError: object() takes no parameters:
class Fruit:
    def __new__(cls, *args, **kwargs):
        return object.__new__(cls, "empty name")

    def __init__(self, fruitname):
        self.__fruit_name = fruitname
The code I expect to work:
apple = Fruit("apple")
empty = Fruit()
The correct way to do this would be to use super() and not pass the extra argument:
class Fruit(object):
    def __new__(cls, *args, **kw):
        return super().__new__(cls)
What you will likely see with your code is:
TypeError: object.__new__() takes exactly one argument (the type to instantiate)
Because you're passing an extra parameter that isn't supported.
But the __new__ is redundant. There is no need for you to use it at all here, and unless you're doing some fancy metaprogramming stuff, it's likely you never need to override __new__ at all.
Just initialise all your variables in __init__, and you should be fine.
I think the following should be sufficient for what you have described:
class Fruit:
    def __init__(self, fruitname="empty name"):
        self.__fruit_name = fruitname
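With that default in place, both calls from the question work (a quick check; the name-mangled attribute is only inspected here to verify the stored value):

apple = Fruit("apple")
empty = Fruit()
print(empty._Fruit__fruit_name)  # 'empty name'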
The reason for the error is an improper use of object.__new__() (that function does not accept any extra parameters beyond the class), which is actually not necessary here.

How can I type cast a non-primitive, custom class in Python?

How can I cast a var into a CustomClass?
In Python, I can use float(var), int(var) and str(var) to cast a variable into primitive data types but I can't use CustomClass(var) to cast a variable into a CustomClass unless I have a constructor for that variable type.
Example with inheritance.
class CustomBase:
    pass

class CustomClass(CustomBase):
    def foo(self):
        pass

def bar(var: CustomBase):
    if isinstance(var, CustomClass):
        # customClass = CustomClass(var) <-- Would like to cast here...
        # customClass.foo() <-- to make it clear that I can call foo here.
        ...
In the process of writing this question I believe I've found a solution.
Python uses duck typing.
Therefore it is not necessary to cast before calling a function.
I.e. the following is functionally fine:
def bar(var):
    if isinstance(var, CustomClass):
        var.foo()
What I actually wanted was static type casting on variables.
I want this so that I can continue to get all the lovely benefits of the typing PEP in my IDE, such as checking function input types, warnings for non-existent class methods, autocompleting methods, etc.
For this I believe re-typing (not sure if this is the correct term) is a suitable solution:
class CustomBase:
    pass

class CustomClass(CustomBase):
    def foo(self):
        pass

def bar(var: CustomBase):
    if isinstance(var, CustomClass):
        customClass: CustomClass = var
        customClass.foo()  # Now my IDE doesn't report this method call as a warning.
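For the record, typing.cast expresses the same narrowing explicitly, also with no runtime effect (an alternative sketch, not from the original post):

from typing import cast

def bar(var: CustomBase) -> None:
    if isinstance(var, CustomClass):
        custom = cast(CustomClass, var)  # tells the type checker to treat var as CustomClass
        custom.foo()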

Using singledispatch with a custom class (CPython 3.8.2)

Let's say I want to set up a function for each class in a module named 'MacroMethods'. So I've set up singledispatch after seeing it in 'Fluent Python', like this:
from functools import singledispatch
import MacroMethods

@singledispatch
def addMethod(self, obj):
    print(f'Wrong Object {str(obj)} supplied.')
    return obj

...

@addMethod.register(MacroMethods.Wait)
def _(self, obj):
    print('adding object wait')
    obj.delay = self.waitSpin.value
    obj.onFail = None
    obj.onSuccess = None
    return obj
The desired behavior is: when an instance of class 'MacroMethods.Wait' is given as the argument, singledispatch runs the function registered for that class.
Instead, it runs the default function:
>>> Wrong Object <MacroMethods.Wait object at 0x0936D1A8> supplied.
However, type() clearly shows the instance is of class 'MacroMethods.Wait', and the registry's dict_keys also contains it:
>>> dict_keys([<class 'object'>, ..., <class 'MacroMethods.Wait'>])
I suspect all the custom classes I made count as the 'object' type and as a result don't run the desired functions.
Is there any way to solve this problem? The entire code is here.
Update
I've managed to mimic singledispatch's behavior as follows:
from functools import wraps

def state_deco(func_main):
    """
    Decorator that mimics singledispatch for ease of interaction expansions.
    """
    # assuming no args are needed for interaction functions.
    func_main.dispatch_list = {}  # collect decorated functions

    @wraps(func_main)
    def wrapper(target):
        # dispatch target to the destination interaction function.
        nonlocal func_main
        try:
            # find and run the callable registered for target
            return func_main.dispatch_list[type(target)]()
        except KeyError:
            # if no matching case is found, the main decorated function runs instead.
            return func_main()

    def register(target):
        # a decorator that registers the decorated function with the main decorated function.
        def decorate(func_sub):
            nonlocal func_main
            func_main.dispatch_list[target] = func_sub

            def register_wrapper(*args, **kwargs):
                return func_sub(*args, **kwargs)

            return register_wrapper
        return decorate

    wrapper.register = register
    return wrapper
Used like:
@state_deco
def general():
    return "A's reaction to undefined others."

@general.register(StateA)
def _():
    return "A's reaction of another A"

@general.register(StateB)
def _():
    return "A's reaction of B"
But it's still not singledispatch, so I feel it might be inappropriate to post this as an answer.
I wanted to do something similar and had the same trouble. It looks like we have bumped into a Python bug; I found a write-up that describes this situation.
Here is the relevant issue on the Python bug tracker:
Python 3.7 breaks on singledispatch_function.register(pseudo_type), which Python 3.6 accepted

Get an attribute of an instance with a decorator [duplicate]

This question already has an answer here: Access self from decorator
I have a class defined as follows:
class SomeViewController(BaseViewController):
#requires('id', 'param1', 'param2')
#ajaxGet
def create(self):
#do something here
Is it possible to write a decorator function that:
Takes a list of args, and possibly kwargs, and
Accesses the instance of the class the method it's decorating is defined in?
So for the @ajaxGet decorator, there is an attribute in self called type which contains the value I need to check.
Thanks
Yes. In fact, in the sense you seem to mean, there isn't really a way to write a decorator that doesn't have access to self. The decorated function wraps the original function, so it has to accept at least the arguments that that function accepts (or some arguments from which those can be derived), otherwise it couldn't pass the right arguments to the underlying function.
There is nothing special you need to do to do this, just write an ordinary decorator:
def deco(func):
    def wrapper(self, *args, **kwargs):
        print("I am the decorator, I know that self is", self, "and I can do whatever I want with it!")
        print("I also got other args:", args, kwargs)
        func(self)
    return wrapper

class Foo(object):
    @deco
    def meth(self):
        print("I am the method, my self is", self)
Then you can just use it:
>>> f = Foo()
>>> f.meth()
I am the decorator, I know that self is <__main__.Foo object at 0x0000000002BCBE80> and I can do whatever I want with it!
I also got other args: () {}
I am the method, my self is <__main__.Foo object at 0x0000000002BCBE80>
>>> f.meth('blah', stuff='crud')
I am the decorator, I know that self is <__main__.Foo object at 0x0000000002BCBE80> and I can do whatever I want with it!
I also got other args: ('blah',) {'stuff': 'crud'}
I am the method, my self is <__main__.Foo object at 0x0000000002BCBE80>
