Is there an existing issue for this?
Current Behavior
I call model.stream_chat() to generate output with do_sample turned off, so inference on the same input should normally produce identical answers. In actual testing, however, the output differs between inferring that input on its own and inferring it after first asking another question, even though history is reset to empty before each call, which has no effect. What is the cause?
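A minimal sketch of the two runs being compared, assuming the stock `THUDM/chatglm-6b` checkpoint loaded through transformers; the model path and the query strings are placeholders, not taken from the report:

```python
from transformers import AutoModel, AutoTokenizer

# Placeholder checkpoint path (not from the report); adjust to the actual setup.
MODEL_PATH = "THUDM/chatglm-6b"
tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH, trust_remote_code=True)
model = AutoModel.from_pretrained(MODEL_PATH, trust_remote_code=True).half().cuda().eval()

def last_response(query):
    """Stream a reply with do_sample=False and an empty history, return the final text."""
    response = ""
    for response, _history in model.stream_chat(tokenizer, query, history=[], do_sample=False):
        pass  # keep only the last streamed response
    return response

QUERY = "placeholder: the question whose output is being compared"
OTHER = "placeholder: an unrelated question asked first in the second run"

# Run 1: infer QUERY by itself.
out_alone = last_response(QUERY)

# Run 2: ask OTHER first, then QUERY again, with history reset to [] on every call.
last_response(OTHER)
out_after_other = last_response(QUERY)

# With do_sample=False the two outputs are expected to match, but in practice they differ.
print(out_alone == out_after_other)
```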
Expected Behavior
Steps To Reproduce
1
Environment

- OS:
- Python:
- Transformers:
- PyTorch: 1.13.1
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`): 12.0

Anything else?
No response