Remove `on_device_shape` and `logical_on_device_shape` #5546

Conversation
LGTM.
LGTM
Once the openXLA change to add
Force-pushed from 9e1c0da to e180883
The upstream commit merged: openxla/xla@6308dba. Added the hash to the patch in this commit. Let's wait until we're done moving the OpenXLA pin before merging this (see #5592).
Do we want this PR in the 2.1 branch?
No. This is a risky change, so we should not backport it.
These were removed from the PJRT C API client recently in this commit: tensorflow/tensorflow@840633c

Instead, we can rebuild the shapes from the decomposed components. In the future, we should see if we can remove `Shape` from `ComputationClient::Data` entirely in favor of `dimensions`, `layout`, etc.