PyCharm: 'Function Doesn't Return Anything' - python-3.x

I just started working with PyCharm Community Edition 2016.3.2 today. Every time I assign a value from my function at_square, it warns me that 'Function at_square doesn't return anything,' but it definitely does in every instance unless an error is raised during execution, and every use of the function is behaving as expected. I want to know why PyCharm thinks it doesn't and if there's anything I can do to correct it. (I know there is an option to suppress the warning for that particular function, but it does so by inserting a commented line in my code above the function, and I find it just as annoying to have to remember to take that out at the end of the project.)
This is the function in question:
def at_square(self, square):
    """ Return the value at the given square """
    if type(square) == str:
        file, rank = Board.tup_from_an(square)
    elif type(square) == tuple:
        file, rank = square
    else:
        raise ValueError("Expected tuple or AN str, got " + str(type(square)))
    if not 0 <= file <= 7:
        raise ValueError("File out of range: " + str(file))
    if not 0 <= rank <= 7:
        raise ValueError("Rank out of range: " + str(rank))
    return self.board[file][rank]
If it matters, this is more precisely a method of an object. I stuck with the term 'function' because that is the language PyCharm is using.
My only thought is that my use of error raising might be confusing PyCharm, but that seems too simple. (Please feel free to critique my error raising, as I'm not sure this is the idiomatic way to do it.)
Update: Humorously, if I remove the return line altogether, the warning goes away and returns immediately when I put it back. It also goes away if I replace self.board[file][rank] with a constant value like 8. Changing file or rank to constant values does not remove the warning, so I gather that PyCharm is somehow confused about the nature of self.board, which is a list of 8 other lists.
Update: Per the suggestion of @StephenRauch, I created a minimal example that reflects everything relevant to data assignment done by at_square:
class Obj:
    def __init__(self):
        self.nested_list = [[0], [1]]

    @staticmethod
    def tup_method(data):
        return tuple(data)

    def method(self, data):
        x, y = Obj.tup_method(data)
        return self.nested_list[x][y]

    def other_method(self, data):
        value = self.method(data)
        print(value)

x = Obj()
x.other_method([1, 2])
PyCharm doesn't give any warnings for this. In at_square, I've tried commenting out every single line down to the two following:
def at_square(self, square):
    file, rank = Board.tup_from_an(square)
    return self.board[file][rank]
PyCharm gives the same warning. If I leave only the return line, then and only then does the warning disappear. PyCharm appears to be confused by the simultaneous assignment of file and rank via tup_from_an. Here is the code for that method:
@staticmethod
def tup_from_an(an):
    """ Convert a square in algebraic notation into a coordinate tuple """
    if an[0] in Board.a_file_dict:
        file = Board.a_file_dict[an[0]]
    else:
        raise ValueError("Invalid an syntax (file out of range a-h): " + str(an))
    if not an[1].isnumeric():
        raise ValueError("Invalid an syntax (rank out of range 1-8): " + str(an))
    elif int(an[1]) - 1 in Board.n_file_dict:
        rank = int(an[1]) - 1
    else:
        raise ValueError("Invalid an syntax (rank out of range 1-8): " + str(an))
    return file, rank
Update: In its constructor, the class Board (which is the parent class for all these methods) saves a reference to the instance in a static variable named instance. Calling self.at_square(square) gives the warning, while Board.instance.at_square(square) does not. I'm still going to use the former where appropriate, but that could shed some light on what PyCharm is thinking.
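For reference, here is a minimal sketch of that pattern; the attribute names and board contents are reconstructed for illustration and are not copied from the project:
class Board:
    instance = None  # class-level reference, set by the constructor

    def __init__(self):
        self.board = [[0] * 8 for _ in range(8)]  # placeholder contents
        Board.instance = self  # save a reference to this instance

    def at_square(self, square):
        file, rank = square
        return self.board[file][rank]
Inside other methods, self.at_square(square) is the call that triggers the warning in the real project, while Board.instance.at_square(square) does not.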

PyCharm assumes a missing return value if the return value statically evaluates to None. This can happen when values are initialised with None and their type is changed later on.
class Foo:
    def __init__(self):
        self.qux = [None]  # infers type for Foo().qux as List[None]

    def bar(self):
        return self.qux[0]  # infers return type as None
At this point, Foo.bar is statically inferred as (self: Foo) -> None. Dynamically changing the type of qux via side-effects does not update this:
foo = Foo()
foo.qux = [2] # *dynamic* type of foo.bar() is now ``(self: Foo) -> int``
foo_bar = foo.bar() # Function 'bar' still has same *static* type
The problem is that you are overwriting a statically inferred class attribute by means of a dynamically assigned instance attribute. That is simply not feasible for static analysis to catch in general.
You can fix this with an explicit type hint.
import typing

class Foo:
    def __init__(self):
        self.qux = [None]  # type: typing.List[int]

    def bar(self):
        return self.qux[0]  # infers return type as int
Since Python 3.5 (and, for variable annotations like the one on qux below, Python 3.6), you can also use inline type hints. These are especially useful for return types.
import typing

class Foo:
    def __init__(self):
        # initial type hint to enable inference
        self.qux: typing.List[int] = [None]

    # explicit return type hint to override inference
    def bar(self) -> int:
        return self.qux[0]  # infers return type as int
Note that it is still a good idea to rely on inference where it works! Annotating only self.qux makes it easier to change the type later on. Annotating bar is mostly useful for documentation and to override incorrect inference.
If you need to support pre-3.5, you can also use stub files. Say your class is in foomodule.py, create a file called foomodule.pyi. Inside, just add the annotated fields and function signatures; you can (and should) leave out the bodies.
import typing

class Foo:
    # type hint for fields
    qux: typing.List[int]

    # explicit return type hint to override inference
    def bar(self) -> int:
        ...

Type hinting as of Python 3.6
The style in the example below is now recommended:
from typing import List

class Board:
    def __init__(self):
        self.board: List[List[int]] = []
Quick Documentation
PyCharm's 'Quick Documentation' shows whether you got the typing right. Place the cursor in the middle of the object of interest and hit Ctrl+Q. I suspect the types from tup_from_an(an) are not going to be as desired. You could try to type hint all the args and internal objects, but it may be better value to type hint just the function return types. Type hinting means I don't need to trawl external documentation, so I focus the effort on objects that will be used by external users and try not to do too much internal stuff. Here's both arg and return type hinting:
@staticmethod
def tup_from_an(an: str) -> Tuple[int, int]:
    ...
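Applied back to the original question, a minimal sketch would be to hint just the return type of at_square, which overrides the incorrect inference; int is an assumption here, so use whatever type the board actually stores:
def at_square(self, square) -> int:
    """ Return the value at the given square """
    ...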
Clear Cache
PyCharm can lock onto outdated definitions. It doesn't hurt to go Help > Find Action... > Clear Caches.
Nobody's perfect
Python is constantly improving (type hinting was updated again in 3.7), and PyCharm is constantly improving too. The price of the fast pace of development on these relatively immature, advanced features is that checking, or submitting to, the PyCharm issue tracker may be the next step.

Related

Typing and pint

I'm using pint to use and convert units. I wanted to create classes which restrict quantities to the "[time]" or "[length]" dimensions only, so as a first approach I did the following:
from pint import Quantity, DimensionalityError

class Time(Quantity):
    def __new__(cls, v: str | Quantity) -> Quantity:
        obj = Quantity(v)
        if not obj.check("[time]"):
            raise DimensionalityError(v, "[time]")
        return obj

class Length(Quantity):
    def __new__(cls, v: str | Quantity) -> Quantity:
        obj = Quantity(v)
        if not obj.check("[length]"):
            raise DimensionalityError(v, "[length]")
        return obj
At runtime it works as expected, i.e: I can do the following:
one_hour = Time("1h")         # Works ok, one_hour contains `<Quantity(1, 'hour')>`
bad = Time("1meter")          # As expected, raises pint.errors.DimensionalityError: Cannot convert from '1meter' to '[time]'
one_meter = Length("1meter")  # Ok
bad_again = Length("1h")      # Ok, raises DimensionalityError
However, from a typing perspective, something is wrong:
def myfunc(t: Time) -> str:
return f"The duration is {t}"
print(myfunc(Time("1h"))) # Ok
print(myfunc(Length("1m"))) # Type error?
The second call to myfunc() is a type error, since I'm passing a Length instead of a Time. However mypy is happy with the code. So I have some questions:
Why doesn't mypy catch the error?
How to do it properly?
For 1, I guess that something fishy is happening in pint's Quantity implementation. I tried:
foo = Quantity("3 pounds")
reveal_type(foo)
and the revealed type is Any instead of Quantity which is very suspicious.
So I tried removing the base class Quantity from my Time and Length classes (i.e. they now derive from object instead of Quantity), and in this case mypy correctly catches the typing errors.
But it fails again as soon as I try something like Length("60km")/Time("1h"): mypy complains that Length does not implement the method required for that division (although the code works fine at runtime because, after all, the __new__() methods of Length and Time return a Quantity object, which does implement the arithmetic operations).
So, again, is there any workaround for making the idea work both at run-time and for mypy?
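One possible direction, sketched under the assumption that only a handful of operations are needed: wrap the Quantity instead of subclassing it, so Time and Length stay distinct nominal types for mypy, and delegate the arithmetic explicitly. The wrapper layout and the single delegated __truediv__ below are illustrative, not a verified pint recipe:
from __future__ import annotations

from pint import DimensionalityError, Quantity

class Time:
    def __init__(self, v: str | Quantity) -> None:
        q = Quantity(v)
        if not q.check("[time]"):
            raise DimensionalityError(v, "[time]")
        self.q = q

class Length:
    def __init__(self, v: str | Quantity) -> None:
        q = Quantity(v)
        if not q.check("[length]"):
            raise DimensionalityError(v, "[length]")
        self.q = q

    def __truediv__(self, other: Time) -> Quantity:
        # delegate the arithmetic to the wrapped quantities
        return self.q / other.q

speed = Length("60km") / Time("1h")  # works at runtime and type-checks
With this layout, passing a Length where a Time is expected is a mypy error, since the two classes are unrelated; the trade-off is that every Quantity operation you rely on has to be delegated by hand.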

Is there any problem in calling a static method from type name in python?

Should I be aware of any problem that could arise from doing this?
Example:
class A(object):
    def __init__(self, a):
        self.a = a

    @staticmethod
    def add1(a):
        return a + 1

x = A(1)
y = type(x).add1(2)
My use case would be calling a static method that processes data that was generated by an object that we cannot use anymore.
Simple test for identity gives us:
x = A(1)
print(type(x) is A)
True
print(type(x).add1 is A.add1)
True
So based on that there should not be any problem, but I am not 100% sure. I would probably go with accessing the x.__class__ attribute, though, which is in my opinion more intuitive.
EDIT: From Python documentation regarding the type function:
With one argument, return the type of an object. The return value is a type object and generally the same object as returned by object.__class__.
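For completeness, a small sketch of the x.__class__ alternative mentioned above, reusing the class A from the question:
x = A(1)
y = type(x).add1(2)      # via type()
z = x.__class__.add1(2)  # via __class__; equivalent here
assert y == z == 3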

How to specify type based on runtime condition

I am using mypy and protocols and have run into a spot where I would like type hinting if possible, but I am unable to figure out how I should set it up so mypy doesn't error.
Consider the following example:
from typing import Protocol
from attr import attrs

class TProtocol(Protocol):
    t: str

@attrs(auto_attribs=True)
class T:
    t: str
    t2: str

@attrs(auto_attribs=True)
class T2:
    t: str

def func(var: TProtocol) -> None:
    if some_condition:
        var: T
        reveal_type(var)
    else:
        reveal_type(var)
While very contrived, it illustrates my goal: there is some runtime condition that, if met, tells me (based on knowledge of the code base) what the type of that variable is. I then want to pass this knowledge on to mypy so that further type checking uses that type.
The same example could be rephrased with a Union: some runtime check occurs which tells me explicitly which one of the types I have, based on knowledge of the code base, and I then want to tell mypy explicitly which type that is for further type checking.
The above example raises an error stating that var is already defined. I tried the allow_redefinition option and it didn't change the output.
Use typing.cast to forcefully declare a variable as a specific type. This disregards any other typing information, and works on runtime-branches as well.
def func(var: TProtocol) -> None:
    reveal_type(var)         # line 21
    if some_condition:
        var = cast(T, var)   # line 23
        reveal_type(var)     # line 24
    else:
        var = cast(T2, var)  # line 26
        reveal_type(var)     # line 27
This makes mypy treat each cast occurrence of var differently, i.e. before and inside each branch:
type_tests.py:21: note: Revealed type is 'type_tests.TProtocol'
type_tests.py:24: note: Revealed type is 'type_tests.T'
type_tests.py:27: note: Revealed type is 'type_tests.T2'

mypy 0.6.4 return type Optional[str] but sometimes you have prior knowledge about the type you will get

I have a function that returns either a class instance or None, depending on some logic.
In some places in the code I know for sure that this function will not return None, but mypy complains.
I made a minimal example that reproduces the situation described above.
I would like to avoid marking a_string as a_string: Optional[str] = "". I know I can also overcome the problem using cast or # type: ignore, but somehow I feel there might be a better way.
Any recommendations on how to handle this situation?
For this example I am using mypy 0.641 and python 3.7
"""
Function returns either an object or none
"""
from typing import Optional, cast
RET_NONE = False
def minimal_example() -> Optional[str]:
if RET_NONE:
return None
else:
return "my string"
a_string = ""
maybe_string = minimal_example()
a_string = maybe_string
# mypy doesn't complain if I do the following
a_string = cast(str, maybe_string)
a_string = maybe_string # type: ignore
Mypy complains as follows:
❯❯❯ mypy mypy_none_or_object.py
mypy_none_or_object.py:19: error: Incompatible types in assignment (expression has type "Optional[str]", variable has type "str")
Mypy is designed to treat function signatures as the "source of truth". If you indicate that some function returns an Optional[str], then mypy will assume that will always be the case. It won't attempt to see how any global variables may or may not alter that function signature.
The easiest possible way of working around this is to add an assert or isinstance check:
maybe_string = minimal_example()
reveal_type(maybe_string) # Revealed type is Optional[str]
assert maybe_string is not None  # Or use: if isinstance(maybe_string, str)
reveal_type(maybe_string) # Revealed type is str
(If you're not aware, mypy will special-case the reveal_type(...) function: whenever mypy encounters it, mypy prints out the type of whatever expression you provide. This is useful for debugging, but you should remember to delete the pseudo-function after you're done since it doesn't exist at runtime.)
Alternatively, you could redesign your code so that your function's return value is more normalized: it always returns a string instead of only sometimes returning one.
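For example, one sketch of such a normalisation is to raise instead of returning None, so the annotation can simply be -> str (the exception type here is just an illustration):
def minimal_example() -> str:
    if RET_NONE:
        raise ValueError("no string available")
    return "my string"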
If RET_NONE is meant to be a more-or-less immutable global (e.g. something like "enable debug mode" or "assume we're running on Windows"), you could take advantage of mypy's --always-true and --always-false flags and provide two different definitions of minimal_example. For example:
RET_NONE = False

if RET_NONE:
    def minimal_example() -> None:
        return None
else:
    def minimal_example() -> str:
        return "my string"
You then invoke mypy using mypy --always-true RET_NONE or mypy --always-false RET_NONE to match how your variable is defined. You can find more info about these flags in the mypy command-line documentation.
A fourth alternative you could explore is using function overloads: https://mypy.readthedocs.io/en/latest/more_types.html#function-overloading
However, I don't know if that really works in your case: you can't define overloads where only the return type differs; the argument arity or types of each overload need to be distinguishable from each other in some way.
Both solutions, cast() and # type: ignore, effectively turn off mypy checks for the variable. This can hide bugs and should be avoided when possible.
In your case mypy cannot know the value of RET_NONE, since it can be changed at runtime from False to anything else, hence the error.
I suggest adding an assertion:
a_string = ""
maybe_string = minimal_example()
assert maybe_string is not None # <- here
a_string = maybe_string
Now mypy is sure that on the next line maybe_string definitely won't be None. I covered this in the Constraining types section of my blog post about typing.

Mypy reports an incompatible supertype error with overridden method

Below is a simplified example of a problem I've encountered with mypy.
The A.transform method takes an iterable of objects, transforms each one (the transformation itself is defined in the subclass B, and potentially in other subclasses) and returns an iterable of transformed objects.
from typing import Iterable, TypeVar

T = TypeVar('T')

class A:
    def transform(self, x: Iterable[T]) -> Iterable[T]:
        raise NotImplementedError()

class B(A):
    def transform(self, x: Iterable[str]) -> Iterable[str]:
        return [x.upper() for x in x]
However mypy says:
error: Argument 1 of "transform" incompatible with supertype "A"
error: Return type of "transform" incompatible with supertype "A"
If I remove [T] from A.transform(), then the error goes away. But that seems like the wrong solution.
After reading about covariance and contravariance, I thought that setting
T = TypeVar('T', covariant=True) might be a solution, but this produces the same error.
How can I fix this? I have considered binning the design altogether and replacing the A class with a higher order function.
Making T covariant or contravariant isn't really going to help you in this case. Suppose that the code you had in your question was allowed by mypy, and suppose a user wrote the following snippet of code:
def uses_a_or_subclass(foo: A) -> None:
    # This looks perfectly typesafe against A's signature
    # (though it'll crash at runtime when foo is a B)
    print(foo.transform([3]))

# Uh-oh! B.transform expects strings, so we just broke typesafety!
uses_a_or_subclass(B())
The golden rule to remember is that when you override or redefine a function (when subclassing, for example, as you're doing), functions are contravariant in their parameters and covariant in their return type. This means that when you redefine a function, it's legal to make a parameter broader (a superclass of the original parameter type), but not narrower (a subclass).
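As a quick illustration of that rule (my own example, not from the question), widening a parameter type in an override is accepted, while narrowing it is flagged:
class Animal: ...
class Dog(Animal): ...
class Puppy(Dog): ...

class Base:
    def feed(self, pet: Dog) -> None: ...

class Wider(Base):
    def feed(self, pet: Animal) -> None: ...  # OK: parameter made broader

class Narrower(Base):
    def feed(self, pet: Puppy) -> None: ...   # mypy error: violates Liskov substitution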
One possible fix is to make your entire class generic with respect to T. Then, instead of subclassing A (which is now equivalent to subclassing A[Any] and is probably not what you want if you'd like to stay perfectly typesafe), you'd subclass A[str].
Now, your code is perfectly typesafe, and your redefined function respects function variance:
from typing import Iterable, TypeVar, Generic

T = TypeVar('T')

class A(Generic[T]):
    def transform(self, x: Iterable[T]) -> Iterable[T]:
        raise NotImplementedError()

class B(A[str]):
    def transform(self, x: Iterable[str]) -> Iterable[str]:
        return [x.upper() for x in x]
Now, our uses_a_or_subclass function from up above should be rewritten to either be generic, or to accept specifically classes that subtype A[str]. Either way works, depending on what you're trying to do.
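A sketch of those two options for uses_a_or_subclass (my naming; it reuses the imports from the block above):
def uses_a_str(foo: A[str]) -> None:
    print(foo.transform(["hello"]))  # accepts B, or any other A[str]

U = TypeVar('U')

def uses_any_a(foo: A[U], items: Iterable[U]) -> None:
    print(foo.transform(items))      # generic over whatever the A handles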
