Add support for Falcon model to export to ONNX #1172
Comments
Hi @mindest, yes, the easiest way would be to first have transformers support that we can base ourselves on. An alternative is indeed to use the custom modeling from tiiuae/falcon-7b and custom ONNX configs for the export (available only on …).
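A rough sketch of that alternative path, assuming optimum's programmatic `main_export` entry point (the custom ONNX config itself is hypothetical and omitted; without it this still hits the unsupported-architecture error quoted later in this issue):

```python
from optimum.exporters.onnx import main_export

# Sketch only: without a custom ONNX config for the architecture, this
# raises the "refinedwebmodel is not supported" KeyError this issue reports.
main_export(
    "tiiuae/falcon-7b",        # repo ships custom modeling code
    output="falcon_onnx",
    task="text-generation",
    trust_remote_code=True,    # allow the custom modeling code to load
    # custom_onnx_configs={...},  # a hypothetical FalconOnnxConfig would go here
)
```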
@mindest Actually, huggingface/transformers#24523 has been merged, so we're just waiting on a release from transformers :)
Yo, just curious: why do we have to wait for a proper release (i.e. on pip, etc.) before this can be resolved? I was able to build from source and it works just fine.
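For anyone wanting to try the same before a release, installing transformers from source typically looks like this (pinning a specific commit is optional):

```bash
pip install git+https://github.com/huggingface/transformers.git
```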
Our CI relies on the latest transformers release.
Hey John, I am looking for a solution to this. Can you please share how you were able to convert it?
up
Hi, support was added in #1391.
Feature request
Add support so that we can export the Falcon model to ONNX format using `optimum-cli`, like the command below.
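A sketch of the kind of command meant, following `optimum-cli`'s ONNX export syntax (the output directory name is illustrative):

```bash
optimum-cli export onnx --model tiiuae/falcon-7b falcon_onnx/
```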
Motivation
I wanted to run the Falcon model using ONNX Runtime. Currently, an error is raised when exporting the Falcon model to ONNX using the above command:
KeyError: "refinedwebmodel is not supported yet. Only {'electra', 'mt5', ...} are supported. If you want to support refinedwebmodel please propose a PR or open up an issue."
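For context, what this would enable, sketched with optimum's Python API (the checkpoint and prompt are just examples; `export=True` assumes ONNX export support exists for the architecture):

```python
from transformers import AutoTokenizer
from optimum.onnxruntime import ORTModelForCausalLM

model_id = "tiiuae/falcon-7b"  # example checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)
# export=True converts the checkpoint to ONNX on the fly and loads it
# on the ONNX Runtime backend; it fails while Falcon export is unsupported.
model = ORTModelForCausalLM.from_pretrained(model_id, export=True)

inputs = tokenizer("Hello, Falcon!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```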
Your contribution
Support may depend on updates in transformers, such as huggingface/transformers#24523.