I tried to add layer normalization in layers.py by setting layer_norm to True as the default value, using "listener.py" as my encoder.
These modifications hurt the model's performance: training takes longer, accuracy dropped, and WER increased dramatically.
I have seen other users on Stack Overflow reporting similar problems with tf.contrib.rnn.LayerNormBasicLSTMCell.
Am I missing something else?
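For reference, LayerNormBasicLSTMCell normalizes the cell's internal activations across the feature dimension before the nonlinearities. A minimal NumPy sketch of that per-step normalization (illustrative only, not the project's actual layers.py code; gamma/beta are the learned scale and shift):

```python
import numpy as np

def layer_norm(x, gamma, beta, eps=1e-6):
    """Normalize each row of x across its feature dimension,
    then apply a learned scale (gamma) and shift (beta)."""
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps) * gamma + beta

# Toy example: a batch of 2 hidden states with 4 features each.
h = np.array([[1.0, 2.0, 3.0, 4.0],
              [10.0, 0.0, -10.0, 0.0]])
gamma = np.ones(4)   # identity scale
beta = np.zeros(4)   # zero shift
out = layer_norm(h, gamma, beta)
# Each row now has (approximately) zero mean and unit variance.
```

One thing worth checking when layer norm degrades results: the normalization is applied every timestep, so learning-rate and initialization settings tuned for the plain LSTM cell often need retuning.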
Hi Vincent, I used the TIMIT dataset with the LAS recipe and it worked fine in my case.
I have attached my validation loss curves (blue is without layer normalization, orange is with it).
Why didn't it work in your case?
P.S.: I used the Kaldi ASR project to process the dataset.
Previously, I pointed out that it didn't work with LibriSpeech, but I suspect that was due to a mistake I may have made while processing the data.