
Commit

fix bug that always sets temperature to 0.02 or lower for vllm (UKGov…
tadamcz authored Aug 22, 2024
1 parent 5667bf8 commit b5a9248
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion src/inspect_ai/model/_providers/vllm.py
@@ -161,7 +161,7 @@ def get_sampling_params(self, config: GenerateConfig, chat: str) -> SamplingParams

     if config.temperature is not None:
         # for some reason vllm doesn't generate anything for 0 < temperature < 0.02
-        if 0 < config.temperature > 0.02:
+        if 0 < config.temperature < 0.02:
             config.temperature = 0.02
         kwargs["temperature"] = config.temperature
     if config.top_p is not None:
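As a standalone sketch (not the provider code itself), the snippet below illustrates why the original chained comparison clamped every temperature above 0.02, while the corrected comparison only raises values in the range where vLLM produces no output.

# Minimal sketch, separate from the actual provider code, showing the effect
# of the chained comparisons before and after this commit.

def clamp_buggy(temperature: float) -> float:
    # `0 < temperature > 0.02` chains to `0 < temperature and temperature > 0.02`,
    # so it is True for any temperature above 0.02 and the clamp fires.
    if 0 < temperature > 0.02:
        temperature = 0.02
    return temperature

def clamp_fixed(temperature: float) -> float:
    # `0 < temperature < 0.02` only matches the range where vLLM generates
    # nothing, so ordinary temperatures pass through unchanged.
    if 0 < temperature < 0.02:
        temperature = 0.02
    return temperature

assert clamp_buggy(0.7) == 0.02   # bug: 0.7 was silently reduced to 0.02
assert clamp_fixed(0.7) == 0.7    # fixed: 0.7 is preserved
assert clamp_fixed(0.01) == 0.02  # tiny positive values are still raised to 0.02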
