Replies: 19 comments (from chizhang, dr0ptp4kt, nmstoker, lissyx, and alphac; the reply bodies were not preserved in this archive)
>>> chizhang
[May 8, 2019, 10:19am]
Hi, I have exported my model to TensorFlow Lite format, but I am stuck on inference because the prebuilt deepspeech binary only supports the .pb model.
I noticed that deepspeech.cc does have a USE_TFLITE flag to enable TFLite model inference. Do I need to rebuild it? Could anybody give me some guidance?
[This is an archived TTS discussion thread from discourse.mozilla.org/t/prebuild-deep-speech-binary-for-tensorflow-lite-model-on-raspberry-pi-3]
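
For anyone landing here from the archive: yes, a rebuild is required, since the .pb and .tflite code paths in deepspeech.cc are compiled in alternately behind the USE_TFLITE guard, so a binary built for one cannot load the other. Below is a minimal sketch of such a rebuild. The config and flag names (the rpi3 cross-compile configs, --define=runtime=tflite, and the TARGET=rpi3 make variable) are assumptions based on the DeepSpeech native_client build docs of that era and should be verified against native_client/README in your checkout.

```sh
# Sketch only: config and flag names are assumptions; verify them
# against native_client/README for your DeepSpeech version.

# 1. From Mozilla's TensorFlow fork (with DeepSpeech/native_client
#    symlinked into the tree, per the build docs), build the library
#    with the TFLite runtime selected instead of full TensorFlow;
#    this is what enables the USE_TFLITE path in deepspeech.cc.
bazel build --config=monolithic --config=rpi3 --config=rpi3_opt \
    --define=runtime=tflite \
    //native_client:libdeepspeech.so

# 2. Rebuild the deepspeech client binary against the new library,
#    cross-compiled for the Raspberry Pi 3.
cd ../DeepSpeech/native_client
make TARGET=rpi3 deepspeech
```

Copy the resulting libdeepspeech.so and deepspeech binary to the Pi and point the client at the .tflite model instead of the .pb one.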