[I posted this in the typing-sig last week but haven't yet received much input. Reposting here for visibility.]
Since the introduction of PEP 647 (User-defined Type Guards), we've received a steady stream of input from users saying that they don't like the limitation that type narrowing is applied only in the positive case and is not applied in the negative case.
In general, type narrowing is not safe to perform if a user-defined type guard returns False, so I've stood by the original decision not to provide type narrowing in the negative case, but there are cases where such narrowing is safe and desirable.
In a recent thread (#996) where this was discussed in some detail, @ikamensh proposed a solution. The proposal is to extend the existing TypeGuard to support an optional second type argument. If present, the second argument indicates the narrowed type in the negative narrowing case.
Here's a simple (admittedly contrived) example:
```python
def is_str(val: str | int) -> TypeGuard[str, int]:
    return isinstance(val, str)

def func(val: str | int):
    if is_str(val):
        reveal_type(val)  # str
    else:
        reveal_type(val)  # int
```
I've implemented this proposal in pyright so folks can experiment with it and see if they like it. If there's general consensus that it's the right approach, I can file an amendment for the existing PEP 647 to include this functionality. It was trivial to add to pyright, so I'm optimistic that it would likewise be easy to add to the other type checkers.
Another common request for PEP 647 is the desire for an "assert" form of TypeGuard — a way to indicate that a function performs runtime validation of a type, raising an exception if the type is incorrect. It occurred to me that this two-argument form of TypeGuard could also be used to handle this use case. The second argument would be specified as a NoReturn. I've provisionally implemented this in pyright as well.
Here's what this would look like:
```python
def validate_sequence(val: Sequence[_T | None]) -> TypeGuard[Sequence[_T], NoReturn]:
    """Verifies that the sequence contains no None values"""
    if any(x is None for x in val):
        raise Exception()
    return True

def func(val: Sequence[int | None]):
    validate_sequence(val)
    reveal_type(val)  # Sequence[int]
```
I'm interested in input on these proposals.
-Eric
--
Eric Traut
Contributor to Pyright & Pylance
Microsoft
-
Based on feedback, it sounds like the above proposal does not sufficiently meet the needs or expectations of users who are dissatisfied with the current TypeGuard mechanism documented in PEP 647. Here's a discussion of another case where the current TypeGuard has been used incorrectly in a recent release of numpy, and the resulting behavior doesn't do what was intended.
Taking in all of the feedback, here's an alternative proposal that involves the introduction of another form of TypeGuard which has "strict" type narrowing semantics. It would be less flexible than the existing TypeGuard, but the added constraints would allow it to be used for these alternate use cases.
StrictTypeGuard
This proposal would introduce a new flavor of TypeGuard that would have strict narrowing semantics. We'd need to come up with a new name for this, perhaps StrictTypeGuard, ExactTypeGuard, or PreciseTypeGuard. Other suggestions are welcome.
This new flavor of type guard would be similar to the more flexible version defined in PEP 647 with the following three differences:
- The type checker would enforce the requirement that the type guard type (specified as a return type of the type guard function) is a subtype of the input type (the declared type of the first parameter to the type guard function). In other words, the type guard type must be strictly narrower than the input type. This precludes some of the use cases anticipated in the original PEP 647.
```python
def is_marsupial(val: Animal) -> StrictTypeGuard[Kangaroo | Koala]:
    # This is allowed
    return isinstance(val, Kangaroo | Koala)

def has_no_nones(val: list[T | None]) -> StrictTypeGuard[list[T]]:
    # Error: "list[T]" cannot be assigned to "list[T | None]"
    return None not in val
```
- Type narrowing would be applied in the negative ("else") case. This may still lead to incorrect assumptions, but it's less likely to be incorrect with restriction 1 in place. Consider, for example,
```python
def is_black_cat(val: Animal) -> StrictTypeGuard[Cat]:
    return isinstance(val, Cat) and val.color == Color.Black

def func(val: Cat | Dog):
    if is_black_cat(val):
        reveal_type(val)  # Cat
    else:
        reveal_type(val)  # Dog (potentially wrong here)
```
- If the return type of the type guard function includes a union, the type checker would apply additional type narrowing based on the type of the argument passed to the type guard call, eliminating union elements that are impossible given the argument type. For example:
```python
def is_cardinal_direction(val: str) -> StrictTypeGuard[Literal["N", "S", "E", "W"]]:
    return val in ("N", "S", "E", "W")

def func(direction: Literal["NW", "E"]):
    if is_cardinal_direction(direction):
        # The type cannot be "N", "S" or "W" here because of the argument type
        reveal_type(direction)  # Literal["E"]
    else:
        reveal_type(direction)  # Literal["NW"]
```
TypeAssert and StrictTypeAssert
As mentioned above, there has also been a desire to support a "type assert" function. This is similar to a "type guard" function except that it raises an exception (similar to an assert statement, except that its behavior is not dependent on 'debug' mode) if the input's type doesn't match the declared return type. This is analogous to "assertion functions" supported in TypeScript, like the following:
```typescript
function assertAuthenticated(user: User | null): asserts user is User {
    if (user === null) {
        throw new Error('Unauthenticated');
    }
}
```
We propose to add two new forms TypeAssert and StrictTypeAssert, which would be analogous to TypeGuard and StrictTypeGuard. While type guard functions are expected to return a bool value, "type assert" forms would return None (if the input type matches) or raise an exception (if the input type doesn't match). Type narrowing in the negative ("else") case doesn't apply for type assertions because it is assumed that an exception is raised in this event.
Here are some examples:
```python
def verify_no_nones(val: list[None | T]) -> TypeAssert[list[T]]:
    if None in val:
        raise ValueError()

def func(x: list[int | None]):
    verify_no_nones(x)
    reveal_type(x)  # list[int]
```
and
```python
def assert_is_one_dimensional(val: tuple[T, ...] | T) -> StrictTypeAssert[tuple[T] | T]:
    if isinstance(val, tuple) and len(val) != 1:
        raise ValueError()

def func(x: float, y: tuple[float, ...]):
    assert_is_one_dimensional(x)
    reveal_type(x)  # float

    assert_is_one_dimensional(y)
    reveal_type(y)  # tuple[float]
```
Thoughts? Suggestions?
If we move forward with the above proposal (or some subset thereof), it will probably require a new PEP, as opposed to a modification to PEP 647.
-
I'm wondering how a StrictTypeGuard[str] should behave if its declared input type is Any and it's called with an int argument:
```python
def tg(x: Any) -> StrictTypeGuard[str]:
    return isinstance(x, str)

def foo(y: int):
    if tg(y):
        # do we raise here because we narrow y's type to the empty set?
        print("got string")
    else:
        print("got int")
```
Effectively the narrowed type is the empty set ("it can't return True for this input type"). Does this mean raising a type error, or just ignoring that branch of the if/else?
-
In your example above, the type of y is narrowed to Never in the positive case and int in the negative case.
In the latest published version of pyright (1.1.210), I've removed support for the two-argument form of TypeGuard (based on the lukewarm-to-negative feedback it garnered) and implemented StrictTypeGuard. If you're interested in this topic, please give it a try and provide feedback.
-
I just hit the need for StrictTypeGuard, which brought me here, and the experimental support in pyright has solved my problem, thanks, so it would be great if it were standardised. Reading about TypeAssert, however, I've realised that it would be great too. I currently approximate TypeAssert like this:
```python
def to_str(x: object) -> str:
    if isinstance(x, str):
        return x
    raise TypeError("expected str")

def process_part_of_json(input: object):
    input = to_str(input)
    use(input)
```
which is different from the way I'd naturally write untyped Python:
```python
def check_is_str(x):
    if not isinstance(x, str):
        raise TypeError("expected str")

def process_part_of_json(input: object):
    check_is_str(input)
    use(input)
```
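To make the desired ergonomics concrete, here is a runnable sketch of that second style as it would work under the proposal. `TypeAssert` does not exist yet, so it appears only in comments, and `shout` is an invented caller for illustration:

```python
def check_is_str(x: object) -> None:
    # Under the proposal this could be annotated `-> TypeAssert[str]`,
    # letting a type checker narrow x to str in the caller after this call.
    if not isinstance(x, str):
        raise TypeError("expected str")

def shout(value: object) -> str:
    check_is_str(value)
    # A TypeAssert-aware checker would treat value as str here;
    # today this line needs a cast or an ignore comment to type-check.
    return value.upper()  # type: ignore
```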
-
As I've adopted regular TypeGuards, I've also realized that I have a need for what you call StrictTypeGuard more often than not. It would be a great addition.
-
After playing around with type guards, I've noticed that the 3rd point is often insufficient and not what I would expect, especially when trying to write generic type guards. The problem is that types are allowed to expand beyond their original types, at least in mypy.
```python
def isiterable(x: Any) -> TypeGuard[Iterable]:
    return isinstance(x, Iterable)

x: list[int]
if isiterable(x):
    reveal_type(x)  # Got Iterable[Any], expected list[int]
if isinstance(x, Iterable):
    reveal_type(x)  # Got expected list[int]
```
My expectation is that type guards should at least work like isinstance in the positive case.
-
It's not clear how I can use TypeGuard or StrictTypeGuard to imply a negative assertion for something like this (which currently does not pass mypy validation):
```python
import typing as t
import typing_extensions as te
from collections import abc
from typing import overload

@overload
def is_collection(item: str) -> te.Literal[False]: ...
@overload
def is_collection(item: bytes) -> te.Literal[False]: ...
@overload
def is_collection(item: t.Collection) -> te.Literal[True]: ...
def is_collection(item: t.Any) -> bool:
    """Return True if the item is a collection class but not a string."""
    return not isinstance(item, (str, bytes)) and isinstance(item, abc.Collection)

def is_present(collection: t.Set, item: t.Union[str, t.Collection]) -> bool:
    if is_collection(item):
        # item is still a union here and mypy will flag `str` as unusable for `&`
        return bool(collection & item)
    return item in collection
```
-
The & operator (which translates to the magic method set.__and__) requires an AbstractSet operand. It doesn't work with an arbitrary Collection, at least not according to the builtins.pyi stub.
```python
class set(MutableSet[_T], Generic[_T]):
    ...
    def __and__(self, __s: AbstractSet[object]) -> set[_T]: ...
```
So I don't think it's enough to narrow item to Collection. You need to verify that it's an instance of AbstractSet. Something like this...
```python
def is_present(collection: set[Any], item: Collection[Any]) -> bool:
    if isinstance(item, AbstractSet):
        return bool(collection & item)
    return item in collection
```
-
I would like to add my support for this StrictTypeGuard feature. A notable use case is my attempt to properly annotate pd.isnull and pd.notnull in the pandas-stubs repo. These have strict type narrowing semantics on a value which can be a Union of several null-equivalent types and a non-null value type. If these type guards return true, the current approach largely works; however, it falls short when the passed-in value's type is a subset of the potential types: the checker should decompose the Union to eliminate impossible members. If the type guards return false, there is no narrowing, but there should be in the negative case.
For instance, I have tests which currently document the present behavior, with commented-out code indicating what the desired behavior should be. A feature that can enable this would be a substantial improvement.
-
All this stuff seems like syntactic sugar, for example see @parched's example here. For a large codebase something like to_str is going to be in a shared utility library anyway, so this adds several new things to the language to save a couple lines in code where no one will care much how it's implemented. This is reasonable for huge organizations with lots of Python experts, but it should be balanced with the problem of making typing even more confusing for newcomers than it already is.
Also the assert stuff seems like a step on the path of making a statically-typed version of Python. With code like:
```python
# validate_* functions use TypeAssert
def my_func(x: str, y: int) -> None:
    validate_str(x)
    validate_int(y)
```
it makes sense to want sugar in the function signature to validate all inputs at once, and there we are. I like Python and static typing so I'm fine with that, but in that case the question is whether TypeAssert is a correct step in that direction. I don't know whether it is or not, I'm just asking whether this has been considered.
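The "validate all inputs at once" sugar gestured at above can already be approximated at runtime with a decorator. This is a rough sketch under stated assumptions: `validate_inputs` is an invented helper, and it only checks parameters whose annotations are plain classes (parameterized generics like `list[int]` are skipped):

```python
import inspect
from functools import wraps

def validate_inputs(func):
    # Hypothetical sugar: check every simply-annotated parameter at call
    # time, roughly what a batch of hand-written TypeAssert calls would do.
    sig = inspect.signature(func)

    @wraps(func)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        for name, value in bound.arguments.items():
            ann = func.__annotations__.get(name)
            # Only enforce annotations that are runtime-checkable classes.
            if isinstance(ann, type) and not isinstance(value, ann):
                raise TypeError(f"{name} must be {ann.__name__}")
        return func(*args, **kwargs)

    return wrapper

@validate_inputs
def my_func(x: str, y: int) -> str:
    return x * y
```

Unlike a TypeAssert annotation, this tells the type checker nothing; it only moves the runtime checks out of the function body.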
-
[we can discuss this through the PEP process]
@erictraut created this discussion to get feedback on the proposed feature, so I'm giving feedback here. It's a little strange that people are responding to my feedback by saying there should be a PEP where, presumably, I can go repeat the same feedback.
The is_int/is_positive_int situation is a known problem. Wouldn't it make more sense to try to sort that out here instead of submitting a PEP with a known problem and figuring it out then?
I really just see this as an extension of a feature that was already approved into the language.
Others (not me) have emphasized in this discussion that StrictTypeGuard is a separate thing from TypeGuard, not an extension, so that's something you can discuss directly with them. Personally I feel the relationship between StrictTypeGuard and TypeGuard is part of why this will be confusing to non-experts, and in particular (going back to @erictraut's comment here) that makes this proposed Python change more confusing than what's in TypeScript, which only has the StrictTypeGuard approach.
[provisional support in Pyright]
I really don't want to get too sidetracked or upset anyone, but you all seem to be saying or implying that provisional support in a major type checker is a necessary but not sufficient condition to typing PEP acceptance. This gives the type checker devs, maybe in some cases a particular person, a lot of influence. Maybe that's as it should be, I couldn't say, but the fact remains.
-
@harahu I'm sorry, I'm not really sure what you mean by "validated" in your comment. However, I'm saying the difficulty is that novice developers can get unintuitive results by importing StrictTypeGuard code they didn't write. If you look again at the nested logic in func2 in my earlier comment, you can imagine that a developer could import is_int and is_positive_int from some library, naively use them and then get weird typing results. To figure this out they'd have to somehow know these is_* functions are the culprit, go look up their source, and then figure out what StrictTypeGuard is doing. (I wrote something similar in another earlier comment here.)
For your second point about pandas, also in the func2 comment I wrote:
So, on one hand we have a new feature that is useful to big projects like NumPy with lots of Python experts, while on the other hand it feels like opening the door to allow bad code that can trip up even experienced users for years to come.
I'm not sure what I think about it. Ideally we'd also want to hear from all the people down the road who will be confused by this new stuff, or who will have to debug bad library code that uses it, not just the experts in this discussion.
So it's not that this super advanced feature is only useful to something like pandas, it's that, practically speaking, only projects with Python typing experts will be able to use StrictTypeGuard effectively. All the other projects will not use it or will be confused by it when they find it in library code. In other words it advantages experts and is neutral to or disadvantages everyone else.
-
@jeremyn By validation I'm referring to the practice of having type checkers make sure that the definition of a function is in line with its annotation. Just like a type checker wouldn't be ok with the return type annotation of my meaning_of_life function example above, it shouldn't be ok with your is_positive_int either.
My point is that for your is_positive_int example to be a valid point, it must somehow be realistic. And, I think it is only realistic if it is "easy" to create such invalid StrictTypeGuards. The moment the type checker validates them, and complains if it doesn't agree that some function is "safe" as a StrictTypeGuard, the probability of seeing is_positive_int (and its brethren) annotated as StrictTypeGuard goes down enormously. After all, it would imply either getting a cast or # type: ignore past code review.
This, in my mind, makes the problem you're pointing out not very different from one we already have. Functions you haven't written yourself can have too narrow return annotations due to casting. Should you use them and encounter a type error, you have to go through the process of discovering that a certain external function is the culprit, and you have to work out how it doesn't do what it says on the tin. I don't think this process is made significantly harder (or more likely to occur) by the introduction of a validated StrictTypeGuard.
only projects with Python typing experts will be able to use StrictTypeGuard effectively. All the other projects will not use it or will be confused by it when they find it in library code.
I'm not a typing expert. I'm also not confused by this concept, and I want to make use of it. What gives?
-
@harahu Your meaning_of_life function is easy to check because it's returning a str when it should get an int. However as I understand it a *TypeGuard function just expects a bool and that's what it's getting in every case we've discussed, so that basic check won't work. It sounds like you want the checker to logically reason about whether the function is coming up with the boolean in a sensible way, which would require near-human intelligence, especially if you still permit stuff like is to find None, and ==/</etc to narrow down Literals. If you have a simple proposal for implementing that validation, I would be interested in hearing it.
Also anyone who has found their way to this forum and can meaningfully discuss these things is a typing expert compared to the vast majority. Also note that you only discovered a need for StrictTypeGuard after using TypeGuard a lot, TypeGuard itself being described as an "advanced feature" by @erictraut.
-
Sorry for forgetting to come back to you, @jeremyn. I was reminded of this discussion by an issue touching upon a use case for this feature: rustedpy/result#69
It sounds like you want the checker to logically reason about whether the function is coming up with the boolean in a sensible way, which would require near-human intelligence [...]
It wouldn't have to be very intelligent, no. I wouldn't mind having to use casts or ignore statements in cases where I am absolutely confident in the guard, but the type checker is not. The point is more that this is a psychological boundary that people feel bad about crossing unless they know what they're doing. It will mean that you end up using a StrictTypeGuard only when it is easy to statically verify that it is indeed correct, or when it is easy for you to make the case that the type checker is wrong in not trusting your code. And the latter won't be the case if you don't really understand the concept.
-
Without StrictTypeGuard, the only workaround that implements strict type guard behavior with zero performance cost is a pattern like:
```python
A = TypeVar("A")
B = TypeVar("B")

def is_A(x: A | B) -> TypeGuard[A]:
    raise NotImplementedError

def after_is_A(x: A | B) -> TypeGuard[B]:
    return True

def test(x: A | B):
    if is_A(x):
        reveal_type(x)
        return
    assert after_is_A(x)
    reveal_type(x)
    return
```
so that everything is enclosed between `if is_A(x): ...` and `assert after_is_A(x)`.
(The assert statements can be automatically dropped in Python's optimized mode.)
I think this is very redundant, but if there is no other workaround, it may be the only way without StrictTypeGuard.
-
I'd like to revive this thread. There continues to be a need for a TypeGuard that supports narrowing in the negative case. The topic comes up about once a month in the python/typing forum or in the pyright or mypy issue trackers. For example, this issue was posted to mypy's issue tracker today.
In this discussion thread, I proposed a StrictTypeGuard mechanism that appears to meet everyone's needs. @rchen152 presented this proposal at the PyCon 2022 typing summit, and I was told that it was well received in that forum.
StrictTypeGuard is more limiting than the existing TypeGuard in that it requires that the type guard return type be a subtype of the un-narrowed (input) type. With this constraint, it's safe(r) to perform type narrowing in the negative case. It's also safe(r) to apply narrowing to subtypes within a union as discussed above. I've prototyped StrictTypeGuard in pyright (you can try it today if you want to play with it), and it seems to work well.
If we want to move forward with this proposal, I think there are two approaches we could consider:
- Introduce a new `StrictTypeGuard` special form as proposed above.
- Make a small change to the existing `TypeGuard` behavior such that if the output type is a subtype of the input type, the "strict type guard" semantics are assumed. This has the advantage of not requiring a new special form (and the cognitive burden that imposes on users). It would also require no new support in the `typing` module or `typing_extensions`, so it would be backward compatible with older versions of Python. It has the disadvantage that this is theoretically a breaking change, although I think it's unlikely that any existing code would actually break because of it.
The first option would definitely require a new PEP; the second option probably would too (although perhaps not, since it would affect only type checking behavior, not runtime behavior).
I have a slight preference for option 2, but I'd love to hear other thoughts.
Another (somewhat orthogonal) question is whether we should add TypeAssert as part of this same effort. There hasn't been as much demand for this, so I think the answer is "no", but it may be worth considering as part of a new PEP.
-
As someone who introduced type checking into the CI process at my current technology group of ~50 developers, and who has had to help a number of people who had never used types before understand how they work from the ground up, I am strongly in favor of any solution that (1) matches users' natural intuition and (2) reduces cognitive burden, especially for concepts with a lot of edge cases like narrowing.
To that extent, I have a strong preference for option 2. The example I gave above, and am repeating here, is the annotation of pd.isna in pandas. Basic intuition would lead one to believe that if it returns True, the value is narrowed to the null-equivalent members of the Union, and to the actual value type in the negative case.
In my opinion enhancements to type narrowing are one of the highest ROI enhancements one can make to the type system. Almost all of these kinds of enhancements have the effect of improving a user's ability to represent known facts at runtime statically in code. Even today there are still gaps in where one lacks the necessary syntax to be able to represent something through types, or alternatively must jump through excessive hoops like assert or cast to get things that we know to be true, to be understood to be true by type checkers. Any time we can eliminate one of those, it is a massive step forward.
-
What exactly do you mean by TypeAssert?
Recently I needed to assert that values are not None inside a generator expression and narrow the type accordingly. I came up with the following solution:
```python
from typing import TypeVar

ValueT = TypeVar('ValueT')

def assert_not_none(value: ValueT | None) -> ValueT:
    """Raise an exception if the value is None.

    Args:
        value: The value to check.

    Raises:
        ValueError: If the value is None.

    Returns:
        The value, guaranteed to not be None.
    """
    if value is None:
        raise ValueError("Value cannot be None.")
    return value
```
Note: I am still not sure if the signature assert_not_none(value: ValueT | None) -> ValueT will work for any type in place of ValueT.
Usage:
```python
def pass_int(value: int) -> int:
    return value

sequence: list[int | None] = [1, 2, 3, 4, None, 5]
values = (pass_int(assert_not_none(value)) for value in sequence)
```
I would certainly welcome a more general version of this asserting type guard in the standard library.
I have also found an interesting discussion in this issue: Request: an AssertingTypeGuard type for TypeGuard-like semantics #930 There you will see other needs for such an asserting type guard.
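In the same spirit, a slightly more general runtime helper can be written today. This is a sketch, not a standard-library API; `assert_type_of` is an invented name:

```python
from typing import TypeVar

T = TypeVar("T")

def assert_type_of(value: object, typ: type[T]) -> T:
    # Hypothetical generalization of assert_not_none: return the value
    # narrowed to typ, or raise if it isn't an instance of typ.
    if not isinstance(value, typ):
        raise TypeError(f"expected {typ.__name__}, got {type(value).__name__}")
    return value
```

Note that isinstance can't check parameterized generics such as list[int], which is part of why a declarative TypeAssert form handled by the checker is attractive.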
-
Hi, I'd like to write the PEP for Eric's second option above: #1013 (comment)
We think modifying TypeGuard in specific cases would provide most of the value in this discussion: when the type checker can determine that the return type is a strict subtype of the input type, the negative case can be narrowed as well.
Example:
```python
A = TypeVar("A")
B = TypeVar("B")

def is_A(x: A | B) -> TypeGuard[A]:
    return isinstance(x, A)

def foo(x: A | B):
    if is_A(x):
        reveal_type(x)  # A
    else:
        reveal_type(x)  # B
```
This is only true because the TypeGuard function is narrowing the input type.
This example however would not know anything about the negative case.
```python
def is_A_nonstrict(x: Any) -> TypeGuard[A]:
    return isinstance(x, A)

def foo(x: A | B):
    if is_A_nonstrict(x):
        reveal_type(x)  # A
    else:
        reveal_type(x)  # A | B
```
-
Typescript seems to behave the way we're suggesting TypeGuard should behave:
```typescript
function is_positive_int(value: number | string): value is number {  // Equivalent of TypeGuard[int]
    return typeof value === 'number' && value > 0;
}

function output(val: number | string) {
    if (is_positive_int(val)) {
        console.log(val + 1);
    } else {
        console.log(val - 1);  // Error here because the assumption is that val is a string
    }
}
```
So, there's a precedent for assuming the negative type even when it might not make sense.
-
Sorry for the unsolicited comment, but I wanted to chime in regarding Guido's example. In my mind, the reason it is really contrived is that if one is trying to distinguish positive ints as a type (hence using static typing), they should declare it as such. The real-world example should therefore look like this:
```python
PosInt = NewType("PosInt", int)

def is_positive_int(a: PosInt | str) -> TypeGuard[PosInt]:
    return isinstance(a, int) and a >= 0

def test2(x: str | PosInt):
    if is_positive_int(x):
        reveal_type(x)
    else:
        reveal_type(x)  # Pyright (at least the reference implementation) says x is 'str' here
```
This would also disallow (from the static type checker's perspective) passing a regular int from the upstream code.
This is just to support the notion that the else branch behavior should narrow the type without an introduction of another StrictTypeGuard. After all, the reasoning to not enforce strict narrowing in the original PEP says:
However, there are many ways a determined or uninformed developer can subvert type safety – most commonly by using cast or Any. If a Python developer takes the time to learn about and implement user-defined type guards within their code, it is safe to assume that they are interested in type safety and will not write their type guard functions in a way that will undermine type safety or produce nonsensical results.
Why not apply the same principle here? If someone shoots themselves in the foot by creating type guards that "narrow" to implicit derived types (like the positive int), then it should just be considered a "known" choice.
-
> Why not apply the same principle here? If someone shoots themselves in the foot by creating type guards that "narrow" to implicit derived types (like the positive int), then it should just be considered a "known" choice.
I guess my argument is it wasn't a "known" choice before. So now people have to know that this is the new behavior. If the new behavior is counter-intuitive, it's harder for people to guess what's going to happen.
I'm having a hard time finding any examples where type guards are used with unions that wouldn't work with the new concept though.
-
Oh and @mishamsk, thanks for the feedback. I'm going to use your example in the PEP as an illustration of how people should handle the case Guido raised:
```python
PosInt = NewType('PosInt', int)

def is_positive_int(val: PosInt | int | str) -> TypeGuard[PosInt]:
    return isinstance(val, int) and val > 0

def func(val: int | str):
    if is_positive_int(val):
        ...  # Type checker assumes PosInt here
    else:
        ...  # Type checker assumes str | int here
```
-
@rchiodo happy to help. Hope this PEP will get through!
-
@JelleZijlstra would you be willing to sponsor a PEP based on Eric's idea above?
I've created a new PEP here:
python/peps#3266
-
Could we use existing Literal and @overload syntax?
```python
@overload
def is_cat(val: Cat) -> Literal[True]: ...
@overload
def is_cat(val: Dog) -> Literal[False]: ...
def is_cat(val: Cat | Dog) -> bool:
    return isinstance(val, Cat)

def func(val: Cat | Dog):
    if is_cat(val):
        reveal_type(val)  # Cat
    else:
        reveal_type(val)  # Dog
```
-
I think something deserving at least a mention among the considered ideas is TypeGuard[A, B], where B, the False-case type, is optional. TypeGuard[A] would behave as before, and users who need it could specify TypeGuard[A, B] with all the bells and whistles they want, such as covering the exact semantics of is_positive_int.
Advantages:
- This would be backwards compatible
- Avoids an arbitrary preference for True over False and supports any way of writing code: `is_str -> True` is as good as `is_not_str -> False`. Given that typing is added to existing code and is optional in Python, the argument that preferring True will force users to write more "logical" code is weaker.
- Completeness: it's possible to annotate code whose else case is too complex for StrictTypeGuard.
- More explicit: the logic of activating the "strict" sense of a type guard under certain conditions is harder to teach, and potentially trickier to apply when the types are not trivial.
Disadvantages:
- More verbose; users get nothing for free
- If people assume a type guard is strict, the type system doesn't match their assumption. This is already the case today, but unlike the proposal in the PEP, it doesn't get fixed
- When called with a narrower type than in the declaration, this additional type narrowing is not used (or at least it would take quite complicated logic to do so). See the link in the comment from Eric.
-
@ikamensh, this idea was proposed earlier. I implemented it in prototype form, and the feedback was quite negative. The problem is that it doesn't address all of the needs. It provides support for narrowing in the negative case, but it doesn't support narrowing of union types as explained in point number 3 in this post.
-
Thanks for the hint. Not using information from the narrower type at actual call sites is indeed very significant.
-
Thinking about it a bit more, I guess there could be a StrictTypeGuard[A, B] that narrows types in a non-mutually-exclusive way on both the True and False branches. This would not have the limitation of ignoring the narrower type at actual call sites.
-
As I said, that was previously proposed and prototyped, and it received negative feedback for reasons I discuss above. The PEP can discuss it in the "Rejected Ideas" section, but I don't think there's a need to discuss it any further.
-
I'd like to contribute an example of when supporting narrowing in the negative case would be useful:
```python
import asyncio
from collections.abc import Coroutine
from typing import Any, Callable, ParamSpec, TypeVar, overload

async def coroutine_func() -> str:
    return 'hello world!'

def func() -> str:
    return 'hello world!'

P = ParamSpec('P')
R = TypeVar('R')

@overload
def async_or_non_async_wrapper(
    func: Callable[P, Coroutine[Any, Any, R]]
) -> Callable[P, Coroutine[Any, Any, R]]: ...
@overload
def async_or_non_async_wrapper(func: Callable[P, R]) -> Callable[P, R]: ...
def async_or_non_async_wrapper(
    func: Callable[P, R] | Callable[P, Coroutine[Any, Any, R]]
) -> Callable[P, R] | Callable[P, Coroutine[Any, Any, R]]:
    if asyncio.iscoroutinefunction(func):
        reveal_type(func)  # Type of "func" is "(**P@async_or_non_async_wrapper) -> Coroutine[Any, Any, Any]"

        async def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
            return await func(*args, **kwargs)

        return wrapper

    reveal_type(func)  # Type of "func" is "((**P@async_or_non_async_wrapper) -> R@async_or_non_async_wrapper) | ((**P@async_or_non_async_wrapper) -> Coroutine[Any, Any, R@async_or_non_async_wrapper])"

    def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
        # Error: Type "R@async_or_non_async_wrapper | Coroutine[Any, Any, R@async_or_non_async_wrapper]"
        # cannot be assigned to type "R@async_or_non_async_wrapper"
        return func(*args, **kwargs)

    return wrapper
```
-
Note that all major type checkers now support TypeIs (https://peps.python.org/pep-0742/), which does allow narrowing in the negative case.
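A minimal sketch of TypeIs narrowing both branches (the import is guarded with TYPE_CHECKING so the snippet also runs on interpreters older than 3.13, where TypeIs has to come from typing_extensions):

```python
from __future__ import annotations  # keep annotations lazy on older interpreters
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from typing import TypeIs  # Python 3.13+; typing_extensions.TypeIs before that

def is_str(val: str | int) -> TypeIs[str]:
    return isinstance(val, str)

def describe(val: str | int) -> str:
    if is_str(val):
        return val.upper()   # narrowed to str in the positive branch
    else:
        return str(val + 1)  # narrowed to int in the negative branch
```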
-
Thoughts about updating the typeshed hints for iscoroutinefunction?
```python
@overload
def iscoroutinefunction(
    obj: Callable[_P, Coroutine[Any, Any, _T]]
) -> TypeIs[Callable[_P, Coroutine[Any, Any, _T]]]: ...
@overload
def iscoroutinefunction(
    obj: Callable[_P, Awaitable[_T]]
) -> TypeIs[Callable[_P, Coroutine[Any, Any, _T]]]: ...
@overload
def iscoroutinefunction(obj: object) -> TypeIs[Callable[..., Coroutine[Any, Any, Any]]]: ...
```
-
I believe that would be unsafe, because there may be callables that return a coroutine but that iscoroutinefunction() does not recognize.
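A small illustration of why: an instance whose `__call__` is a coroutine function produces a coroutine when called, yet the iscoroutinefunction checks report False for it.

```python
import asyncio
import inspect

class AsyncCallable:
    # Calling an instance returns a coroutine, via the async __call__.
    async def __call__(self) -> str:
        return "hello"

obj = AsyncCallable()

# The instance itself is not recognized as a coroutine function...
print(inspect.iscoroutinefunction(obj))  # False

# ...even though calling it produces an awaitable coroutine.
print(asyncio.run(obj()))  # hello
```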
-
That is a relatively recent PEP, but why wasn't it mentioned before? I think PEP 742 simply settles this whole discussion! Does anybody disagree?
And now that the feature exists, turning existing TypeGuards into TypeIs should be tracked as new issues in the projects that ship each TypeGuard.