
Add support for Falcon model to export to ONNX #1172

Closed
mindest opened this issue Jul 7, 2023 · 7 comments

Comments


mindest commented Jul 7, 2023

Feature request

Add support so that we can export the Falcon model to ONNX format using optimum-cli, e.g.:

optimum-cli export onnx -m tiiuae/falcon-7b falcon_7b --trust-remote-code

Motivation

I want to run the Falcon model using ONNX Runtime. Currently, exporting the Falcon model to ONNX with the above command raises an error:
KeyError: "refinedwebmodel is not supported yet. Only {'electra', 'mt5', ...} are supported. If you want to support refinedwebmodel please propose a PR or open up an issue."
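
For reference, this is roughly the usage I have in mind once the export works (a sketch only; it assumes the falcon_7b directory produced by the optimum-cli command above and the standard ORTModelForCausalLM loading pattern from optimum.onnxruntime):

from transformers import AutoTokenizer
from optimum.onnxruntime import ORTModelForCausalLM

# Load the ONNX export produced by the optimum-cli command above
# ("falcon_7b" is just the output directory used in that command).
tokenizer = AutoTokenizer.from_pretrained("tiiuae/falcon-7b")
model = ORTModelForCausalLM.from_pretrained("falcon_7b")

inputs = tokenizer("The Falcon model is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))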

Your contribution

Support may depend on updates in transformers, such as huggingface/transformers#24523.

fxmarty (Contributor) commented Jul 11, 2023

Hi @mindest, yes, the easiest way would be to first have transformers support that we can base ourselves on. An alternative is indeed to use the custom modeling from tiiuae/falcon-7b and custom ONNX configs for the export (available only on main for now, not in a release: https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#customize-the-export-of-transformers-models-with-custom-modeling)
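
To illustrate that second route, here is a rough sketch (untested; FalconOnnxConfig and the "n_layer"/"n_head" attribute names are assumptions based on the custom RefinedWeb config, and the custom_onnx_configs key should be checked against the guide linked above):

from transformers import AutoConfig
from optimum.exporters.onnx import main_export
from optimum.exporters.onnx.config import TextDecoderOnnxConfig
from optimum.utils import NormalizedTextConfig

# Hypothetical ONNX config for the custom RefinedWeb/Falcon architecture;
# the normalized attribute names ("n_layer", "n_head") are assumptions.
class FalconOnnxConfig(TextDecoderOnnxConfig):
    DEFAULT_ONNX_OPSET = 13
    NORMALIZED_CONFIG_CLASS = NormalizedTextConfig.with_args(
        num_layers="n_layer", num_attention_heads="n_head"
    )

model_config = AutoConfig.from_pretrained("tiiuae/falcon-7b", trust_remote_code=True)

main_export(
    "tiiuae/falcon-7b",
    output="falcon_7b_onnx",
    task="text-generation",
    trust_remote_code=True,
    # "model" is assumed to be the submodel name for a decoder-only export.
    custom_onnx_configs={"model": FalconOnnxConfig(model_config, task="text-generation")},
)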

fxmarty (Contributor) commented Jul 11, 2023

@mindest Actually, huggingface/transformers#24523 has been merged, so we are just waiting on a transformers release :)

@Johnno1011

Yo, just curious: why do we have to wait for a proper release (i.e. via pip, etc.) before this can be resolved? I was able to build from source and it works just fine.
Thanks.

fxmarty (Contributor) commented Aug 8, 2023

Our CI relies on the latest transformers release.


Pradipve2011 commented Aug 11, 2023

@Johnno1011

Hey John,

I am looking for a solution to this. Could you please share how you were able to convert it?

@WilliamTambellini

up

fxmarty (Contributor) commented Oct 18, 2023

Hi, support was added in #1391

fxmarty closed this as completed Oct 18, 2023