This repository has been archived by the owner on Aug 28, 2023. It is now read-only.

inference on google coral usb accelerator #45

Open
co-manifold opened this issue Jun 12, 2023 · 1 comment
Comments


co-manifold commented Jun 12, 2023

Hi, I noticed on this thread that there was a request for a TFLite model that could be run on the Google Coral USB Accelerator. I believe there is an issue with the model that currently prevents this, namely the presence of Flex ops in the Whisper TFLite implementation. Would it be possible to change the TFLite model so that inference on a Google Coral USB Accelerator becomes possible? Many thanks.
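
For context, Flex ops usually enter a .tflite file when the converter is allowed to fall back to TensorFlow kernels, and the Edge TPU compiler cannot map those ops. The sketch below is not from this repository; the SavedModel path and filenames are placeholders, and it only illustrates the converter setting that introduces Flex ops and one way to check an existing model for them (the Analyzer API exists in recent TF versions).

```python
import tensorflow as tf

# Placeholder path to a Whisper SavedModel export (not from this repo).
converter = tf.lite.TFLiteConverter.from_saved_model("whisper_saved_model")

# Allowing SELECT_TF_OPS is what pulls Flex (TensorFlow fallback) ops into
# the .tflite file; models containing them cannot be compiled for the Edge TPU.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,   # native TFLite kernels
    tf.lite.OpsSet.SELECT_TF_OPS,     # TensorFlow fallback kernels (Flex ops)
]
tflite_model = converter.convert()
with open("whisper.tflite", "wb") as f:
    f.write(tflite_model)

# Inspect which ops ended up in the model; Flex ops show up with a
# "Flex" prefix in the printed op list.
tf.lite.experimental.Analyzer.analyze(model_path="whisper.tflite")
```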

@jayyang-zigbang

Actually, the Whisper TFLite model works well on the Google Coral; it is just that the Edge TPU runtime is still not available for it. If you are only interested in CPU inference, please just upgrade your Python to 3.9.
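
For CPU-only inference, a minimal sketch with the standard TFLite interpreter could look like the following; the model filename is a placeholder and a dummy tensor stands in for a real preprocessed log-mel spectrogram. Note that the full tensorflow package bundles the Flex delegate needed for models containing Flex ops, whereas the slim tflite_runtime wheel does not.

```python
import numpy as np
import tensorflow as tf

# "whisper.tflite" is a placeholder filename for the converted model.
interpreter = tf.lite.Interpreter(model_path="whisper.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input with the model's declared shape/dtype, standing in for a
# real log-mel spectrogram computed from audio.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)

interpreter.invoke()
output = interpreter.get_tensor(output_details[0]["index"])
print(output.shape)
```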
