remove upper transformers version limit #2048
Conversation
LGTM, neuron is pinned to transformers 4.43.2 anyway so...
"exporters": ["onnx", "onnxruntime", "timm"], | ||
"exporters-gpu": ["onnx", "onnxruntime-gpu", "timm"], | ||
"exporters": ["onnx", "onnxruntime", "timm", "transformers<4.46.0"], | ||
"exporters-gpu": ["onnx", "onnxruntime-gpu", "timm", "transformers<4.46.0"], | ||
"exporters-tf": [ | ||
"tensorflow>=2.4,<=2.12.1", | ||
"tf2onnx", |
Maybe we can also lift the explicit restrictions here, since optimum-neuron will enforce a pinned version anyway (cc @JingyaHuang):
"neuron": ["optimum-neuron[neuron]>=0.0.20", "transformers>=4.36.2,<4.42.0"],
"neuronx": ["optimum-neuron[neuronx]>=0.0.20", "transformers>=4.36.2,<4.42.0"],
let me know if you'd like me to include this change in today's release @JingyaHuang
@dacorvo you mean something like:
"neuron": ["optimum-neuron[neuron]>=0.0.20", "transformers>=4.36.2,<=4.43.2"],
"neuronx": ["optimum-neuron[neuronx]>=0.0.20", "transformers>=4.36.2,<=4.43.2"],
? Or would you rather we remove transformers from the extra? If so, wouldn't we end up installing a very old optimum-neuron again when doing
pip install optimum[neuronx]
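To illustrate the concern: the resolver can only select an optimum-neuron release whose own transformers pin admits whatever transformers version it resolves, so dropping the pin from the extra may make it backtrack to a much older optimum-neuron. A self-contained check with the packaging library (the per-release pins below are illustrative assumptions, not taken from the real packages):

from packaging.specifiers import SpecifierSet

# Hypothetical transformers pins carried by two optimum-neuron releases
# (illustrative values only).
optimum_neuron_pins = {
    "0.0.20": SpecifierSet(">=4.36.2,<=4.43.2"),
    "0.0.10": SpecifierSet(">=4.30.0,<4.36.0"),
}

installed_transformers = "4.46.0"
for release, pin in optimum_neuron_pins.items():
    allowed = pin.contains(installed_transformers)
    print(f"optimum-neuron {release}: transformers {installed_transformers} allowed? {allowed}")

# Neither pin admits 4.46.0, so a resolver pairing the latest transformers
# with optimum[neuronx] would keep walking back through older releases.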
Merging this PR, as this discussion is not directly related to it; let me know if you want me to include any neuron-related modification in today's release!
To not block a release from optimum-intel, optimum-neuron, optimum-habana and others supporting the latest transformers release (removing the need to sync it with an optimum release).