Initial attempt at a caching protocol implementation #86
Conversation
Highlighted some questions/areas of concern. Nope. I committed a completely broken/incomplete thing. I fixed that and will highlight questions/areas of concern in the proper places. Apologies for the stealth-edit/commit rewrite.
For the future, please let me know if you prefer that I keep my commit(s) tidy (i.e., squash/rewrite as I go), or whether you prefer to preserve a longer commit chain here (complete with blunders and missteps) and tidy up at the end. Different maintainers have different preferences, so I want to make sure I don't mess up your flow.
UPDATE: I've been preserving commits in case I need to revert anything, but happy to tidy up later as desired.
*Force-pushed from 20da92a to 8216029.*
Houston, we have a hiccup.¹ I forgot (or willed from my mind) some thorny issues around `Protocol`s that may frustrate one of the goals of this PR, namely: creating a drop-in replacement for `typing.Protocol`.

I'm not an expert, but as far as I can tell, `typing.Protocol` is a special animal. Really special. Like Chimera special. For Mypy to work, I'm pretty sure it needs to think it's dealing with things directly derived from `typing.Protocol` and nothing else. Buuut for a world where others can gain access to caching behavior through mere subclassing, they need a parent with our metaclass implementation, which almost certainly isn't `typing.Protocol`.

I'm pretty sure the first import of `typing.Protocol` as `Protocol` "hides" our implementation from Mypy (hence the various `# type: ignore [no-redef]` comments), which tricks it into allowing static checking to proceed as normal. I think this is why we don't get static type-check errors for `test_api_typing.py`, even though it uses non-compliant "protocols" (i.e., those that derive from `beartype.typing.Protocol` without also deriving from `typing.Protocol`) all over the place. That's a happy accident … for now. (I have no idea if that holds for other static type checkers.)

UPDATE: I believe I have settled on a solution (see my subsequent comment) with one minor compromise by putting implementations in their own module (`_protocol.py` for now) and conditionally importing them into `typing.py`.

The bigger issue is generic protocols. As far as I know, one can't have arbitrary generic placeholders. But `typing.Protocol` can. It behaves like `typing.Generic` in that regard. In other words, you can totally do this:
```python
from abc import abstractmethod
from typing import Protocol, TypeVar, runtime_checkable

_AT_co = TypeVar("_AT_co", covariant=True)
_BT_co = TypeVar("_BT_co", covariant=True)
…
_NT_co = TypeVar("_NT_co", covariant=True)

@runtime_checkable
class SupportsOneTwo(Protocol[_AT_co, _BT_co, …, _NT_co]):  # <-- WTF?!
    @abstractmethod
    def a(self) -> _AT_co:
        pass

    @abstractmethod
    def b(self) -> _BT_co:
        pass

    …

    @abstractmethod
    def n(self) -> _NT_co:
        pass
```
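To make the elided sketch above concrete, here is a runnable two-parameter instance using only the stdlib `typing` module, confirming both the arbitrary subscripting and the `runtime_checkable` structural check. The names `SupportsOneTwo` here mirrors the sketch; `OneTwo` is an illustrative consumer, not part of the PR:

```python
from abc import abstractmethod
from typing import Protocol, TypeVar, runtime_checkable

_AT_co = TypeVar("_AT_co", covariant=True)
_BT_co = TypeVar("_BT_co", covariant=True)

@runtime_checkable
class SupportsOneTwo(Protocol[_AT_co, _BT_co]):
    @abstractmethod
    def a(self) -> _AT_co: ...

    @abstractmethod
    def b(self) -> _BT_co: ...

class OneTwo:
    def a(self) -> int:
        return 1

    def b(self) -> str:
        return "two"

# Structural, not nominal: OneTwo never subclasses SupportsOneTwo,
# yet it satisfies the protocol at runtime.
assert isinstance(OneTwo(), SupportsOneTwo)

# Subscripting with concrete types also works, exactly as with Generic.
SupportsOneTwoAlias = SupportsOneTwo[int, str]
```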
What I don't know how to do is replicate `typing.Protocol`'s ability to subscript arbitrary type parameters in our own class without requiring subclasses to explicitly include `Generic`. So one could do this …

```python
from typing import Generic
from beartype.typing import Protocol

class SupportsOneTwo(Protocol, Generic[_AT_co, _BT_co, …, _NT_co]):
    ...
```

… or this …

```python
from typing import Protocol, runtime_checkable
from beartype.typing import _CachingProtocolMeta

@runtime_checkable
class SupportsOneTwo(Protocol[_AT_co, _BT_co, …, _NT_co], metaclass=_CachingProtocolMeta):
    ...
```

… or this …

```python
import typing
from beartype.typing import Protocol

class SupportsOneTwo(Protocol, typing.Protocol[_AT_co, _BT_co, …, _NT_co]):
    ...
```

… but one cannot do this …

```python
from beartype.typing import Protocol

class SupportsOneTwo(Protocol[_AT_co, _BT_co, …, _NT_co]):
    ...
```

You'll get a module loading error claiming that `beartype.typing.Protocol` is not generic. I'm stumped on how to get a syntax-compatible version. My hunch is that the best thing we can do in this case is tell customers to use `Generic` and not `Protocol` (i.e., the first alternative I presented above).
¹ Or "hiccough", if one prefers it. 🧐
beartype/typing.py (outdated):

```python
# Define our own caching protocol
from typing import Protocol as _Protocol
# TODO: This is...a wart. Obviously, there is a fragility here. We should
```
At worst, we could copy over the implementation from the standard library.
Asking a related question here. UPDATE: @ktbarrett directed my attention to this tiny, but important bit from PEP 544: … As he points out, there doesn't seem to be any other purpose for allowing …

obsolete example:

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from typing import Protocol
else:
    from typing import Protocol as _Protocol

    class Protocol(_Protocol, metaclass=_CachingProtocolMeta): …
```

UPDATE: As mentioned elsewhere in this PR, 6ad8629 pulls implementations into …
6ad8629 pulls out the implementation into … The one wart here is that Mypy won't detect this problem:

```python
>>> from beartype.typing import Protocol, TypeVar
>>> _T = TypeVar("_T")
>>> class MyProtocol(Protocol[_T]):  # <-- can't do this
>>>     pass
```

Users of … need to do this instead:

```python
>>> from beartype.typing import Generic, Protocol, TypeVar
>>> _T = TypeVar("_T")
>>> class MyProtocol(Protocol, Generic[_T]):
>>>     pass
```
This right here is a straight-up Naruto run. Yet again, @posita went above and beyond the bear call of duty. Aren't bears supposed to be sleeping in a fitful hibernate slumber filled with berry bunches, bloodied muzzles, and swaggering female bear haunches this time of year anyway!?!? Yet again, I am beyond the wall of sleep myself. This means I will remain verbose yet disconcertingly incoherent. Let's begin.
...eye spasmodically twitches. Blackest heresy! There is only one stylistic preference – and that is mine, I believe. All others are merely accidents their authors have yet to openly acknowledge. I'm kidding! Really. I'm not the unseemly jerk our neighbours continually claim me to be (however factual their video evidence may be). Alternately, I'm obsessive-compulsive to the detriment of every codebase within keyboard reach of my pudgy fingers. I dispute neither claim.
...very well. We reserve the Comfy Chair for another occasion then.
You have done well, "young" protégé. I see your code is as big as mine. Seriously, though. This rocks. For those who are about to code, I salute you and then genuflect until my back cracks in every single vertebra simultaneously.
Checkmate, mypy. Checkmate.
vigorous nod nod nod
...wut u say!?!?
You are, of course, always correct. vigorous nod nod nod
...heh. In hindsight, I probably should have documented that. With only minimal hand-waving, I think. Maybe. Actually, let's not quote me on that.
Hindu elephant deity Ganesha preserve us. That smells suspiciously like a poo-smelling deal-breaker. Let's see if we can't resolve this without any further reference to the Comfy Chair. Are you perhaps hitting this exception buried within the bowels of …

```python
def _check_generic(cls, parameters, elen):
    """Check correct count for parameters of a generic cls (internal helper).

    This gives a nice error message in case of count mismatch.
    """
    if not elen:
        # v--- THIS BAD DUDE, RIGHT?
        raise TypeError(f"{cls} is not a generic class")
    alen = len(parameters)
    if alen != elen:
        raise TypeError(f"Too {'many' if alen > elen else 'few'} arguments for {cls};"
                        f" actual {alen}, expected {elen}")
```

Since … With the well-honed intuition of one who started violating privacy encapsulation when he was only 5, I intuit that the core issue is that …

...just get to it already: the `@_tp_cache`-cached `__class_getitem__()`:

```python
@_tp_cache
def __class_getitem__(cls, params):
    if not isinstance(params, tuple):
        params = (params,)
    if not params and cls is not Tuple:
        raise TypeError(
            f"Parameter list to {cls.__qualname__}[...] cannot be empty")
    params = tuple(_type_convert(p) for p in params)
    if cls in (Generic, Protocol):  # <-- THIS BAD DUDE RIGHT HERE
        # Generic and Protocol can only be subscripted with unique type variables.
        if not all(isinstance(p, (TypeVar, ParamSpec)) for p in params):
            raise TypeError(
                f"Parameters to {cls.__name__}[...] must all be type variables "
                f"or parameter specification variables.")
        if len(set(params)) != len(params):
            raise TypeError(
                f"Parameters to {cls.__name__}[...] must all be unique")
    else:
        # Subscripting a regular Generic subclass.
        if any(isinstance(t, ParamSpec) for t in cls.__parameters__):
            params = _prepare_paramspec_params(cls, params)
        else:
            _check_generic(cls, params, len(cls.__parameters__))
    return _GenericAlias(cls, params,
                         _typevar_types=(TypeVar, ParamSpec),
                         _paramspec_tvars=True)
```

Let's circumvent that. Now hear me out here. Things gonna get ugly. Ever seen Reservoir Dogs? We're reproducing the ending right now.

```python
class Protocol(...):
    ...
    def __class_getitem__(cls, params):
        super().__class_getitem__(_Protocol, params)  # <-- MAD CACKLING HERE
```

Pretty sure that'll work – but dead certain that'll fail. When it does, we consider options. Options that are labelled "DO NOT PUSH THIS SICKENINGLY RED BUTTON" include:

```python
class Protocol(...):
    ...
    def __class_getitem__(cls, params):
        Protocol_old = typing.Protocol
        typing.Protocol = cls  # <-- MAD CACKLING HERE
        item = super().__class_getitem__(cls, params)  # <-- HERE AS WELL
        typing.Protocol = Protocol_old  # <-- CACKLING STILL AUDIBLE
        return item
```
Thank you for the education on … I ain't too proud to beg [a system to do what I want by runtime patching¹]. At the risk of hitting the jolly, candy-like history eraser button, after looking at the implementation of …, is there a practical difference between this (adapted from above) …

```python
class Protocol(...):
    ...
    def __class_getitem__(cls, params):
        Protocol_old = typing.Protocol
        try:
            typing.Protocol = cls
            return super().__class_getitem__(params)
        finally:
            typing.Protocol = Protocol_old
```

… and this (also adapted from above) …

```python
class Protocol(...):
    ...
    def __class_getitem__(cls, params):
        _Protocol.__class_getitem__(params)
```

… ?² One remote problem with the first approach might be a lack of thread safety. I assume that doesn't come up very often, but who knows? Something something … web frameworks … something something … runtime (re)loading of packages in separate threads … something something … dependency injection … [remainder deleted in the interest of time].

I think both encounter the LRU cache while masquerading as …

UPDATE: I must have been tired. They're completely different. One invokes …

That being said, I see your Reservoir Dogs ending and raise you a Taxi Driver with:

```python
class Protocol(_Protocol, …):
    # …
    def __class_getitem__(cls, params):
        gen_alias = _Protocol.__class_getitem__(params)
        return type(gen_alias)(cls, *gen_alias.__args__)
```

UPDATE: I am also spent (and I haven't even gotten to "Beyond the Wall of Sleep" yet), but I did …

¹ I concede that starts to sound less like begging and more like coercing, but let's leave that be for the time being.

² Mad cackling omitted for brevity, but assume it persists throughout the quoted material.

³ I have a lot of skepticism around my own knowledge in this area, if you didn't catch that theme.
*Force-pushed from 8094469 to fdd4a82.*
Codecov Report

```diff
@@            Coverage Diff             @@
##             main      #86      +/-   ##
==========================================
- Coverage   96.69%   96.59%   -0.10%
==========================================
  Files         118      119       +1
  Lines        4387     4464      +77
==========================================
+ Hits         4242     4312      +70
- Misses        145      152       +7
```

Continue to review full report at Codecov.
*Force-pushed from 097c39b to 9a9b4a8.*
This is amazing fiery balls, if you hadn't already noticed. 🌞 But first!

Because... Because mypy is just a pseudonym for He Who Walks Behind the Rows, let's pretend mypy isn't watching like a voyeur in the corn fields for now. As time permits and the cats finally stop howling for food, we can uppercut mypy's glass jaw with a spray of … For now, we glare sternly in mypy's general direction. But next!

```python
def __class_getitem__(cls, params):
    gen_alias = _Protocol.__class_getitem__(params)
    # I'm pretty sure we need this nudge. Otherwise our inheritors end
    # up with the wrong metaclass (i.e., type(typing.Protocol) instead
    # of the desired _CachingProtocolMeta). Luddite alert: I don't
    # fully understand the mechanics here.
    return type(gen_alias)(cls, *gen_alias.__args__)
```

You have now gone to a place where none but eagles dare. You almost got there without burning your microlight wings made of greasy (yet patented) paraffin wax in the positronic radiation field emitted by the standard … The burning concern here is the instantiation of …

Under Python 3.10, here's how …

```python
@_tp_cache
def __class_getitem__(cls, params):
    ...
    return _GenericAlias(cls, params,
                         _typevar_types=(TypeVar, ParamSpec),
                         _paramspec_tvars=True)
```

So I'm now squinting suspiciously at …

Because I'm now squinting suspiciously at everything, let's slowly backtrack to the last point in the autumn woods not inhabited by unsettling effigies, cawing crows, and a furtive shadow at the corner of your eye that keeps shifting even as you uneasily pretend "...it's really not there, it's not really there." This is that point:

```python
class Protocol(...):
    ...
    def __class_getitem__(cls, params):
        Protocol_old = typing.Protocol
        try:
            typing.Protocol = cls
            return super().__class_getitem__(params)
        finally:
            typing.Protocol = Protocol_old
```

This is galaxy brain. Unlike my plebeian proposal, your upper-class approach cleverly accounts for unexpected exceptions. Let's rock this like Rosemary's baby in the cradle.
*Force-pushed from 9a9b4a8 to ea7b861.*
Your wish is my command…eventually. Allow me first to draw attention to one (perceived) flaw of Rosemary's Galaxy Baby, which may amount to a nothingburger, as well as one last delve into my misguided hijack-the-generic-alias approach.

**Theoretical race condition present in the we're-going-to-replace-that-faberge-egg-with-a-fake-but-just-for-a-minute-no-one-will-notice-we-promise approach**

Here is the race I am worried about with the above (which I'll rewrite slightly).

```python
class Protocol(...):
    ...
    def __class_getitem__(cls, params):
        # Thread 1
        Protocol_old = typing.Protocol
        typing.Protocol = cls
        # Thread 2 (interleaved: its Protocol_old now captures Thread 1's
        # cls rather than the original typing.Protocol)
        Protocol_old = typing.Protocol
        typing.Protocol = cls
        # Thread 1
        try:
            return super().__class_getitem__(params)
        finally:
            typing.Protocol = Protocol_old
        # Thread 2 (restores Thread 1's cls, leaking it into typing)
        try:
            return super().__class_getitem__(params)
        finally:
            typing.Protocol = Protocol_old
```

Consider:

```python
import typing
from abc import abstractmethod
from functools import partial
from threading import Thread
from typing import TypeVar

from beartype.typing import Protocol

T = TypeVar("T")

contrived_example = """
class ThisIsNutsButMayBePossible{}(Protocol[T]):
    @abstractmethod
    def foo(self) -> None:
        pass
"""

def compileit(i: int):
    exec(contrived_example.format(i))

threads = [Thread(target=partial(compileit, i)) for i in range(100)]

for thread in threads:
    thread.start()

for thread in threads:
    thread.join()

print(typing.Protocol)  # <-- what result?
```

If I run that, I get the output …
|
*Force-pushed from ea7b861 to fbb2cc6.*
Ohnoes. All pending work for … Tragically, I took so long to review this that this probably won't land in … Thankfully, this is my top priority for the inevitable …
I think the delay is good, and I wouldn't kill myself to squeeze this into 0.10.1. First, we might want to consider waiting until CPython gets its …
@posita: On second thought, this is good to go. Don't sweat the overly fine minutiae and my banal code review. I'm all for banal, let's be clear – but you've already been around the bend (and back again) with @TeamSpen210. I'll add a few internal …
If heaven can wait (...Alexa says it can), so can this. Thunderous applause for this third out of three tremendous contributions, @posita. Likewise, thanks so much for the grand critique that helped fuel this along in my awkward and questionable absence, @TeamSpen210. You both are the stuff open-source dreams are made of. 🕺 💃
☝️ This. Your points are well taken. If I get some time to try to nudge things into a better place, I will (although …
Just got there fifteen minutes ago. Thanks to …

True worst-case end times story: when our galaxy collides with Andromeda in 5 billion years, armchair Wikipedia astrologists believe that "...the Sun might be brought near the centre of the combined galaxy, potentially coming near one of the black holes before being ejected entirely out of the galaxy.[13] Alternatively, the Sun might approach one of the black holes a bit closer and be torn apart by its gravity. Parts of the former Sun would be pulled into the black hole." Is this supposed to reassure us, Wikipedia? Are we just ping-pong balls to you, Wikipedia?

Back to the issue front. I realize I no longer understand the …

```python
def _check_only_my_attrs(cls, inst: Any) -> bool:
    # So, "dict" actually supports set operators under
    # Python >= 3.9. Ignoring Python < 3.9 just cause,
    # this *probably* reduces to this one-liner:
    #     attrs = set(cls.__dict__ | cls.__dict__.get("__annotations__", {}))
    # We may not even need to coerce "attrs" to a set.
    # Dictionaries are now pretty much sets, but hotter.
    # *ANYWAY.* This makes sense; we're getting a
    # set of all unique attribute names. Good!
    attrs = set(cls.__dict__)
    attrs.update(cls.__dict__.get("__annotations__", {}))

    #!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
    # Inlining _get_protocol_attrs(): it begins.
    #!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

    # Uh... This is why we should inline. Let's pretend
    # this redefinition never happens. Eye is twitching.
    base_attrs = set()

    # *UH OH.* For all classes, the following tautology
    # holds:
    #     >>> cls.__mro__[0] is cls
    #     True
    # This means that the first "base" visited by this
    # iteration is just the passed "cls". But, wait. We
    # just introspected the attributes of "cls" above!
    # Clearly, we don't need to do that above. Let's
    # centralize all of the attribute inspection here.
    #
    # Very well. We progress deeper into the dimly
    # lit cavern from which grunting can be heard.
    for base in cls.__mro__[:-1]:  # without object
        # Guido, you cause me pain.
        if base.__name__ in ('Protocol', 'Generic'):
            continue

        # *WTFBRO.* What is this madness? Coercing
        # "KeyView" containers into lists and then
        # appending them into an even larger list?
        # This is a facepalm. This is why we inline.
        # Lastly, "{}" is not a singleton –
        # unlike "()", which is. Don't ask. So, we
        # declare a stupid empty global dictionary
        # singleton as a stupid hidden parameter: e.g.,
        #     def _check_only_my_attrs(
        #         cls, inst: Any, _empty_dict = {}) -> bool:
        #
        # This should then suffice with speed and glamour:
        #     base_dict = base.__dict__
        #     cls_attrs = base_dict | base_dict.get('__annotations__', _empty_dict)
        #     for attr in cls_attrs: ...
        #
        # Grunting sounds segue into mewling, hissing,
        # and a blood-curdling falsetto abruptly cut short.
        annotations = getattr(base, '__annotations__', {})
        for attr in list(base.__dict__.keys()) + list(annotations.keys()):
            # This is fine. It is the only thing that is.
            if not attr.startswith('_abc_') and attr not in EXCLUDED_ATTRIBUTES:
                base_attrs.add(attr)

    #!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
    # Inlining _get_protocol_attrs(): it ends.
    #!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

    # I begin to question basic assumptions like
    # whether my 30 year-old username being a
    # portmanteau of leyline and Cecil is really
    # such a good idea after all.
    #
    # I've now officially lost the entire plot.
    # Doesn't this statement just reduce to:
    #     attrs = attrs
    # That is, doesn't intersection_update()
    # just remove all attribute names that are
    # *NOT* in both sets? But "base_attrs" is
    # the proper superset of "attrs". So, this
    # just removes all attribute names except
    # those directly defined by the passed "cls".
    # In other words, the entire doubly-nested
    # "for" loop we performed above was a noop.
    #
    # The sound of wet bloody meat being forcibly
    # dragged across a rough surface of stalagmites
    # percolates through the fetid crypt-like air.
    attrs.intersection_update(base_attrs)  # type: ignore [attr-defined]

    # And... we're iterating above. Pretty sure we can
    # just perform this test in the iteration above. That
    # is, I *would* be pretty sure if I was pretty sure
    # about anything here. But I'm not.
    #
    # Oppressive grunting sounds resume.
    for attr in attrs:
        if (
            not hasattr(inst, attr) or
            (
                callable(getattr(cls, attr, None)) and
                getattr(inst, attr) is None
            )
        ):
            return False

    return True
```

Kinda uncertain about core scruples and basic worldview anymore, but suspect this alternative trash compactor might get us past the abyss:

```python
from typing import Generic

_BASES_IGNORABLE = frozenset((
    Generic, Protocol, _Protocol))

def _check_only_my_attrs(
    cls,
    inst: Any,
    _empty_dict = {},
) -> bool:
    for base in cls.__mro__[:-1]:  # without object
        # Guido, you cause me pain.
        if base in _BASES_IGNORABLE:
            continue
        base_dict = base.__dict__
        base_annotations = base_dict.get('__annotations__', _empty_dict)
        base_attrs = (
            base_dict | base_annotations
            if IS_PYTHON_AT_LEAST_3_9 else
            dict(base_dict, **base_annotations)
        )
        for attr in base_attrs:
            if (
                # If this attribute name is unignorable *AND*...
                (
                    not attr.startswith('_abc_') and
                    attr not in EXCLUDED_ATTRIBUTES
                ) and
                # Either...
                (
                    # This instance does not define this attribute *OR*
                    not hasattr(inst, attr) or (
                        # This attribute is a callable method defined on
                        # this class instead *AND*...
                        callable(getattr(cls, attr, None)) and
                        #FIXME: Must confess, I still have no idea what
                        #this specific test is on about. Back to the abyss!
                        # This attribute is... none? I don't. I just don't.
                        getattr(inst, attr) is None
                    )
                )
            ):
                # This instance fails to define this attribute and thus
                # fails to satisfy this protocol. In this case, we fail.
                return False

    # Else, this instance defines all attributes and thus satisfies this
    # protocol. In this case, we succeed.
    return True
```

In theory, being optimistic here fellahs, we don't even need to keep a running …
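The attribute-merging one-liner debated in the comments above can be pinned down with a small stdlib-only sketch. The `EXCLUDED` set here is an illustrative stand-in for typing's real exclusion list, and `own_attr_names` is a hypothetical helper, not code from the PR:

```python
import sys

# Illustrative stand-in for typing's exclusion list of ignorable names.
EXCLUDED = frozenset({
    '__abstractmethods__', '__annotations__', '__dict__', '__doc__',
    '__init__', '__module__', '__new__', '__slots__', '__subclasshook__',
    '__weakref__', '__class_getitem__',
})

def own_attr_names(cls: type) -> set:
    # Merge a class's declared attributes with its annotation-only
    # attributes, then drop ABC machinery and ignorable names.
    d = cls.__dict__
    ann = d.get('__annotations__', {})
    # "dict | dict" is the Python >= 3.9 spelling discussed above;
    # dict(d, **ann) is the portable fallback.
    merged = dict(d) | dict(ann) if sys.version_info >= (3, 9) else dict(d, **ann)
    return {
        name for name in merged
        if not name.startswith('_abc_') and name not in EXCLUDED
    }

class Example:
    x: int  # annotation-only attribute

    def foo(self) -> None: ...

assert own_attr_names(Example) == {'x', 'foo'}
```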
Rictus grimace: triggered. 😬 Since you understandably find no deeper fulfilment in life than reporting CPython issues (i am this way too), would you mind taking one for all humanity by throwing yourself onto the sword that is the public CPython bug tracker? @TeamSpen210 and I loudly cheer from the sidelines at a distance. It's a disturbing shocker that this is still in CPython 3.10. The tenuous compassion for all mankind I normally, of course, feel has slipped yet another femtometer closer to the abyssal cliff of Nietzschean despair.
ohisortagetitnowbutnotreally

You... you are bloody ingenious, @posita. We already knew that but I had to personally confirm that. Specifically, I had to judiciously pollute your …

The ominous key appears to be that CPython implicitly calls the metaclass … That's fascinating if true. I'd assumed CPython would only call …

EDIT: ZOMG. Of course, my donkey-like assumption wasn't without merit. CPython does only call … That... that's insane but so clever my brains are leaking out my itchy scalp, bro. I'd better document all of this immediately before I lose my feeble grip and succumb to this shadow madness.

**Please Send Help for I Am Tired and It's Cold**

I still have no clear idea what exactly this does, though:

```python
attrs.intersection_update(_get_protocol_attrs(cls))  # type: ignore [attr-defined]
```

Clearly, that's essential. I know this because the Abyss opens up when I comment that out. Clearly, I'm also incapable of fully comprehending the eternal verities that you have authored here. Please enlighten, Beneficent Protocol Guru, that I may comment this and then inline …
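As a small aside on the question above: `set.intersection_update()` mutates the left-hand set in place, keeping only elements present in both sets. A toy illustration with made-up attribute names:

```python
# intersection_update() prunes "attrs" down to the names also present
# in the second set — here, the genuine protocol attributes.
attrs = {"a", "b", "_private_helper"}
protocol_attrs = {"a", "b", "c"}
attrs.intersection_update(protocol_attrs)
assert attrs == {"a", "b"}
```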
Hah! We see you deleting branches over there. (≖_≖ )
Buh-yah. Finally untangled the rat's nest. Behold a type-checking subroutine for caching protocols respectful of privacy (and faster on that first uncached hit, too):

```python
def _check_only_my_attrs(cls, inst: Any, _EMPTY_DICT = {}) -> bool:
    cls_attr_name_to_value = cls.__dict__
    cls_attr_name_to_hint = cls_attr_name_to_value.get(
        '__annotations__', _EMPTY_DICT)
    cls_attr_names = (
        cls_attr_name_to_value | cls_attr_name_to_hint
        if IS_PYTHON_AT_LEAST_3_9 else
        dict(cls_attr_name_to_value, **cls_attr_name_to_hint)
    )

    # For the name of each attribute declared by this protocol class...
    for cls_attr_name in cls_attr_names:
        # If...
        if (
            # This name implies this attribute to be unignorable *AND*...
            #
            # Specifically, if this name is neither...
            not (
                # A private attribute specific to dark machinery defined by
                # the "ABCMeta" metaclass for abstract base classes *OR*...
                cls_attr_name.startswith('_abc_') or
                # That of an ignorable non-protocol attribute...
                cls_attr_name in _PROTOCOL_ATTR_NAMES_IGNORABLE
            # This attribute is either...
            ) and (
                # Undefined by the passed object *OR*...
                #
                # This method has been specifically "blocked" (i.e.,
                # ignored) by the passed object from being type-checked as
                # part of this protocol. For unknown and presumably
                # indefensible reasons, PEP 544 explicitly supports a
                # fragile, unreadable, and error-prone idiom enabling
                # objects to leave methods "undefined." What is this madness!
                not hasattr(inst, cls_attr_name) or
                (
                    # A callable *AND*...
                    callable(getattr(cls, cls_attr_name, None)) and
                    # The passed object nullified this method. *facepalm*
                    getattr(inst, cls_attr_name) is None
                )
            )
        ):
            # Then the passed object violates this protocol. In this case,
            # return false.
            return False

    # Else, the passed object satisfies this protocol. In this case, return
    # true.
    return True
```

@posita: Uhohs. Turns out caching protocols fail to support a standard use case of subscription by non-type variables satisfying a type variable: e.g.,

```python
# This is fine.
>>> from typing import AnyStr, Protocol
>>> class GoodProtocol(Protocol[AnyStr]): pass
>>> class BadProtocol(GoodProtocol[str]): pass

# This blows fine chunks across my keyboard.
>>> from beartype.typing import AnyStr, Protocol
>>> class GoodProtocol(Protocol[AnyStr]): pass
>>> class BadProtocol(GoodProtocol[str]): pass
Traceback (most recent call last):
  File "mopy.py", line 7, in <module>
    class BadProtocol(GoodProtocol[str]): pass
  File "/home/leycec/py/beartype/beartype/typing/_typingpep544.py", line 403, in __class_getitem__
    gen_alias = _Protocol.__class_getitem__.__wrapped__(
  File "/usr/lib/python3.8/typing.py", line 890, in __class_getitem__
    raise TypeError(
TypeError: Parameters to Protocol[...] must all be type variables
```

Do …
Wow…that is some heavy lifting! I hope you did not suffer permanent injury. My apologies for my lack of response on this thread. (Pesky day job has me runnin' wild.) I promise to circle back as soon as I get a minute! 😅 P.S. …
Oh, no worries whatsoever! Work-life balance is no balance when you're teetering on the edge. I just accidentally resolved the above issue, too. I don't entirely understand the solution, which is par for the course around here:

```python
# Newer, better, faster, harder __class_getitem__()
# builds a brighter world despite watering eyes and
# increased sanity loss.
def __class_getitem__(cls, item):
    # We have to redefine this method because typing.Protocol's version
    # is very persnickety about only working for typing.Generic and
    # typing.Protocol. That's an exclusive club, and we ain't in it.
    # (RIP, GC.) Let's see if we can sneak in, shall we?

    # FIXME: Once <https://bugs.python.org/issue46581> is addressed,
    # consider replacing the madness below with something like:
    #     cached_gen_alias = _Protocol.__class_getitem__(_Protocol, params)
    #     our_gen_alias = cached_gen_alias.copy_with(params)
    #     our_gen_alias.__origin__ = cls
    #     return our_gen_alias

    # We can call typing.Protocol's implementation directly to get the
    # resulting generic alias. We need to bypass any memoization cache
    # to ensure the object on which we're about to perform surgery
    # isn't visible to anyone but us.
    if hasattr(_Protocol.__class_getitem__, '__wrapped__'):
        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
        # @posita: New magic happens here.
        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
        base_cls = (_Protocol if _Protocol in cls.__bases__ else cls)
        gen_alias = super().__class_getitem__.__wrapped__(
            base_cls, item)
    else:
        # We shouldn't ever be here, but if we are, we're making the
        # assumption that typing.Protocol.__class_getitem__ no longer
        # caches. Heaven help us if it ever uses some proprietary
        # memoization implementation we can't see anymore because it's
        # not based on functools.wraps.
        gen_alias = _Protocol.__class_getitem__(item)

    # Now perform origin-replacement surgery. As-created,
    # gen_alias.__origin__ is (unsurprisingly) typing.Protocol, but we
    # need it to be our class. Otherwise our inheritors end up with
    # the wrong metaclass for some reason (i.e., type(typing.Protocol)
    # instead of the desired _CachingProtocolMeta). Luddite alert: I
    # don't fully understand the mechanics here. I suspect no one does.
    gen_alias.__origin__ = cls

    # We're done! Time for a honey brewskie break. We earned it.
    return gen_alias
```

Because this makes my eyes water, this might make your eyes water, too. I recant everything I said about @beartype and …

Would you like to unify our two approaches? As I dimly recall, our current caching protocol API almost satisfied your use cases – except for …

Because you're Bear Clan, you have full carte blanche to: …
It's a full moon out there and we're howling at it. 🌝
For better or worse, this just got pushed out the door when no one was looking with …

You decide:

```shell
$ pip install --upgrade beartype
```
Playing catch-up…. Do I understand from … ? In other words, is your question: "Did I just trigger the doomsday machine by releasing this into the world, or are we still cool and I should learn to stop worrying and love the bomb?" I want to make sure I'm responding to the right thing….

UPDATE: I now have a branch that strips out the base implementation of the caching protocol from … 🙌
... and separate override functionality from caching functionality. This will be useful if beartype/beartype#86 lands and we decide to use that version (somehow) instead of replicate our own.
Now that beartype/beartype#86 has landed (and made it into a release), we can remove our own base implementation and use that one instead. We still provide our own implementation that allows overriding runtime checking, but it neatly derives from ``beartype``'s.
You understand everything. Thus, you understand this. Actually, I'm jelly at how fast you grok code even in the midst of a feverish working pitch, full-throttle family fun, and open-source commitments beyond even my most wide-eyed idealism. You dah bro. That's what I'm saying here.
🤕 On the bright side, Python 3.7 hits EOL in a year-and-a-half. On the dark side, something worked and now it doesn't. I'm actually curious how you achieved that when... oh. That is right. I recall it like it was yesterday's fetid goat cheese. The private …
It goes without saying, but I'll say it: "Please feel free to have a go at everything in the …" Relatedly:
🥳 Wait just a hot minute! That's not something to celebrate.
You make the call. I'll roll the dice and tackle the issue tracker uproar.
Initial stab at porting `numerary.types.CachingProtocolMeta` to `beartype`. It only implements a naive cache that will grow indefinitely. (I.e., it does not include an LRU mechanism.) That can be addressed in a subsequent PR (before this one lands, if necessary). Further, this port does not include `numerary`'s `include`/`exclude` mechanism, which should keep this implementation simpler.

Eventually, this should fix beartype/numerary#8.
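Should the naive grow-forever cache mentioned above ever need bounding, one plausible direction is an LRU eviction policy over the same mapping. A minimal stdlib-only sketch with hypothetical names (`BoundedCache` is illustrative, not part of this PR):

```python
from collections import OrderedDict

_MISS = object()  # sentinel distinguishing "absent" from a cached False

class BoundedCache:
    # Hypothetical LRU-bounded replacement for a naive grow-forever
    # dict: OrderedDict preserves insertion order, and move_to_end()
    # plus popitem(last=False) give us least-recently-used eviction.
    def __init__(self, maxsize: int = 128) -> None:
        self._data: "OrderedDict[object, object]" = OrderedDict()
        self._maxsize = maxsize

    def get(self, key, default=None):
        value = self._data.get(key, _MISS)
        if value is _MISS:
            return default
        # Mark as recently used so hot protocol/type pairs stay cached.
        self._data.move_to_end(key)
        return value

    def put(self, key, value) -> None:
        self._data[key] = value
        self._data.move_to_end(key)
        # Evict the least recently used entry once over capacity.
        if len(self._data) > self._maxsize:
            self._data.popitem(last=False)
```

The sentinel matters here because cached values can legitimately be `False` (a type that fails the protocol check), which a plain `dict.get(key)` miss would be indistinguishable from.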