Expected:
When mousing over a token, you should see the top 10 tokens at that generation step and their probabilities.
Observed behaviour:
When running with llama.cpp release b4281, using the llama.cpp API option in mikupad[eb98d22], every token in the generated text shows only the top tokens for the first generated token. Additionally, a JavaScript error appears in the browser console indicating that e.completion_probabilities is missing. The error is also displayed under the "predict" button.
Possible workaround / clue for cause of issue:
I have only verified that llama.cpp release b4242 still works. The regression must have been introduced in a later commit, but I do not have time to bisect for now.
This seems to be a bug in llama.cpp: inspecting the API response, it's possible to see that every received token carries the probabilities of the first token, for example:
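The original response sample was not preserved. As a hypothetical sketch only, a response exhibiting the described behavior might look like the following (field names such as completion_probabilities, probs, tok_str, and prob follow the llama.cpp server's /completion response format of that era; the token strings and probability values are purely illustrative):

```json
{
  "content": " the cat",
  "completion_probabilities": [
    {
      "content": " the",
      "probs": [
        { "tok_str": " the", "prob": 0.42 },
        { "tok_str": " a",   "prob": 0.17 }
      ]
    },
    {
      "content": " cat",
      "probs": [
        { "tok_str": " the", "prob": 0.42 },
        { "tok_str": " a",   "prob": 0.17 }
      ]
    }
  ]
}
```

Note how the second entry's probs array is an exact duplicate of the first entry's, instead of containing the candidates for " cat" at its own generation step.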