Fix llama.cpp loader not being random (thanks @reydeljuego12345)
oobabooga committed Oct 14, 2024
1 parent 49dfa0a commit c9a9f63
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion in modules/llamacpp_model.py

```diff
@@ -136,7 +136,7 @@ def generate(self, prompt, state, callback=None):
             prompt=prompt,
             max_tokens=state['max_new_tokens'],
             temperature=state['temperature'],
-            top_p=state['top_p'],
+            top_p=state['top_p'] if state['top_p'] < 1 else 0.999,
             min_p=state['min_p'],
             typical_p=state['typical_p'],
             frequency_penalty=state['frequency_penalty'],
```
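The change clamps `top_p` to just under 1.0 before it reaches the llama.cpp sampler, since passing a value of exactly 1 was linked to the non-random output this commit fixes. A minimal sketch of the clamping logic, extracted into a standalone helper (the function name is hypothetical, not part of the repository):

```python
def clamp_top_p(top_p: float) -> float:
    """Hypothetical helper mirroring the diff's one-line fix.

    Values at or above 1 are replaced with 0.999 so the top-p
    sampler stays active instead of producing non-random output.
    """
    return top_p if top_p < 1 else 0.999
```

In the actual diff this logic is written inline as a conditional expression on the `top_p` keyword argument, rather than as a separate function.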
