Variadic generic type alias - python-3.x

I'm writing a Python typing stub for use with mypy. There are a lot of functions that take callback parameters of the form Callable[[*foo], Any], where *foo represents zero or more types. I would like to use a generic type alias to reduce the repetition.
Generic type aliases are documented here, but I don't see how it would be possible to have a list of types as a parameter.
I know that this can be done with a concrete number of arguments:
T = TypeVar('T')
Callback0 = Callable[[], Any]
Callback1 = Callable[[T], Any]
def foo(f: Callback0): ...
def bar(f: Callback1[str]): ...
What I'd like to declare instead is something like:
def foo(f: Callback[]): ...
def bar(f: Callback[str]): ...
If it matters, the code is for Python 3.3, and I'm running mypy with Python 3.7.

What about a Protocol?
from typing import Any, TypeVar
from typing_extensions import Protocol  # typing.Protocol on Python 3.8+

T = TypeVar("T")

class MyAwesomeProtocol(Protocol[T]):
    def __call__(self, a: T) -> Any:
        ...

def foo(f: MyAwesomeProtocol):
    ...

def bar(f: MyAwesomeProtocol[str]):
    ...
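For what it's worth, a small usage sketch continuing from the definitions above (the callback's parameter is deliberately named a, since mypy matches parameter names when checking a plain function against a __call__ protocol whose parameter is positional-or-keyword):
def on_text(a: str) -> None:
    print(a.upper())

def takes_callback(f: MyAwesomeProtocol[str]) -> None:
    f("hello")

takes_callback(on_text)  # OK: (a: str) -> None is compatible with __call__(self, a: str) -> Any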

Related

Rust version of Python's __getitem__

Python has a special class method __getitem__ that is basically syntactic sugar. For example,
class MyList:
    def __init__(self):
        self.list = []

    def __getitem__(self, key):
        return self.list[key]
will allow you to get elements of MyList objects like this: mylist[2]. I wonder if there is a trait in Rust that does the same kind of thing, just like the Ord trait in Rust allows you to use >, <, and ==.
You can implement Index for your struct, which allows you to use the bracket operator to get an immutable reference to the data. If you want mutable access, you also need IndexMut.

How to create a wrapper function with the same annotations and parameter names as the wrapped one in Python

Say I have a function:
def foo(x: Type1, y: Type2):
    # do something...
And I want another function, say:
def bar(f: Callable, args..):
    # do something
    # return another function Fizz
And I want the returned function (Fizz) to have a signature like:
def Fizz(x: Type1, y: Type2, ...):
    pass
One solution I found to work is:
from inspect import signature

def foo(x: Type1, y: Type2):
    # do something...

def bar(func: Callable):
    func_signature = signature(func)
    params_it = iter(func_signature.parameters.values())
    x = next(params_it)
    y = next(params_it)

    def fizz(x: x.annotation, y: y.annotation):
        pass

    return fizz
But the limitation of that approach is that it assumes foo's parameter names (i.e. it can't deduce them the way it deduces the annotations).
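If the goal is only that the returned function presents foo's parameter names and annotations to introspection tools (rather than to mypy), a common alternative is functools.wraps, which copies __name__, __doc__, and __annotations__ and sets __wrapped__, so inspect.signature reports the wrapped function's signature. A minimal sketch; the int/str annotations below stand in for Type1/Type2:
from functools import wraps
from inspect import signature
from typing import Any, Callable

def bar(func: Callable) -> Callable:
    @wraps(func)  # copies metadata and sets __wrapped__
    def fizz(*args: Any, **kwargs: Any) -> Any:
        # do something, then delegate to the original function
        return func(*args, **kwargs)
    return fizz

def foo(x: int, y: str):
    ...

print(signature(bar(foo)))  # (x: int, y: str) -- signature() follows __wrapped__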

Python & MyPy - Passing On Kwargs To Complex Functions

Consider the following attempt at adding type hints to the functions parent and child:
def parent(*, a: Type1, b: Type2):
    ...

def child(*, c: Type3, d: Type4, **kwargs):
    parent(**kwargs)
    ...
MyPy complains that kwargs has the type Dict[str, Any] but that the arguments a and b require Type1 and Type2 respectively.
I understand that the solution to this error is to rewrite child in the following way:
def child(*, a: Type1, b: Type2, c: Type3, d: Type4, **kwargs):
    parent(a=a, b=b)
    ...
However, what if the argument list of parent is much longer, or there is a function grandchild which has its own argument list and must call child? Are you required to enumerate the arguments and their types from all the downstream functions, or is there a graceful way to handle the "per-key" typing of **kwargs?
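One possible way out, assuming typing_extensions and a reasonably recent mypy with PEP 692 support: declare parent's keyword arguments once as a TypedDict and unpack it into **kwargs, so child can forward them without re-listing each one. A sketch, with concrete types standing in for Type1..Type4:
from typing_extensions import TypedDict, Unpack

class ParentKwargs(TypedDict):
    a: int  # placeholder for Type1
    b: str  # placeholder for Type2

def parent(**kwargs: Unpack[ParentKwargs]) -> None:
    ...

def child(*, c: float, d: bytes, **kwargs: Unpack[ParentKwargs]) -> None:
    parent(**kwargs)  # OK: kwargs is known to contain exactly a and b
    ...

child(a=1, b="x", c=2.0, d=b"y")  # each keyword is checked individually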

python typing: How to inherit self type (with static typing)

I want a child class to inherit its parent's methods that return self. While doing so, the type checker (mypy) by default keeps the return type as the parent class. I want it to automatically infer the child class as the return type. For simple cases I found the following code to work:
import typing

Self = typing.TypeVar("Self", bound="Foo")

class Foo:
    def foo(self) -> "Foo":
        return self

    def bar(self: Self) -> Self:
        return self

class Bar(Foo):
    ...

# The naive implementation reports the type as "Foo".
# This is not what I want.
reveal_type(Bar().foo())

# Using a generic `Self` type seems to work for simple cases.
# Revealed type is 'Bar*'
reveal_type(Bar().bar())
This "solution" breaks down if I try to use a context-manager:
import contextlib
import typing

Self = typing.TypeVar("Self")

class Foo(typing.ContextManager):
    def __enter__(self: Self) -> Self:
        return self

class Bar(Foo):
    ...

with Bar() as f:
    # Revealed type is 'Bar*'
    reveal_type(f)

with contextlib.ExitStack() as cx:
    f2 = cx.enter_context(Bar())
    # Revealed type is 'Any'
    reveal_type(f2)
It works in the first case but not in the second. I think this is because I
did not specify the type parameter of typing.ContextManager. If I do though, mypy reveals both types as Any:
class Foo(typing.ContextManager[Self]):
    def __enter__(self: Self) -> Self:
        return self
As far as I understand, this happens because Self isn't bound to any concrete type at this point. I am kind of lost right now; I did not find any way to make it work ... Is this even possible?
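For reference, a sketch of the newer spelling using PEP 673's Self (typing.Self on Python 3.11+, typing_extensions.Self otherwise), assuming a mypy version that supports it; this avoids parametrizing typing.ContextManager by hand. Whether ExitStack.enter_context also infers the subclass depends on the mypy and typeshed versions in use:
from types import TracebackType
from typing import Optional, Type
from typing_extensions import Self  # typing.Self on Python 3.11+

class Foo:
    def __enter__(self) -> Self:
        return self

    def __exit__(
        self,
        exc_type: Optional[Type[BaseException]],
        exc: Optional[BaseException],
        tb: Optional[TracebackType],
    ) -> Optional[bool]:
        return None

class Bar(Foo):
    ...

with Bar() as f:
    reveal_type(f)  # expected: Bar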

How do I annotate the type of __add__ (double underscore) method on a class?

I'm trying to annotate the __add__ method of a class so that I can add instances of it together. However, I'm unable to specify that this method takes another instance of the same type:
class Foo:
    def __init__(self, v: int) -> None:
        self.v = v

    def __add__(self, other: Foo) -> Foo:
        return Foo(self.v + other.v)
However, I get an error about Foo not yet being defined, since it references Foo within one of its own method definitions:
def __add__(self, other: Foo) -> Foo:
NameError: name 'Foo' is not defined
How can I annotate this correctly so that mypy warns me whenever there is code attempting to add instances of this class to instances of other classes?
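One standard fix, for reference: quote the forward reference (or add from __future__ import annotations on Python 3.7+, which defers evaluation of all annotations), so the class can refer to itself; mypy then rejects additions with other types:
class Foo:
    def __init__(self, v: int) -> None:
        self.v = v

    def __add__(self, other: "Foo") -> "Foo":  # quoted forward reference
        return Foo(self.v + other.v)

Foo(1) + Foo(2)   # OK
# Foo(1) + 2      # flagged by mypy: unsupported operand types for +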
