Commit 9c87700
Clamps inf values in prompt_logprobs
Signed-off-by: Rafael Vasquez <[email protected]>
rafvasq committed Dec 10, 2024
1 parent d1c2e15 commit 9c87700
Showing 1 changed file with 6 additions and 0 deletions.
vllm/entrypoints/openai/serving_completion.py (6 additions, 0 deletions)

@@ -392,6 +392,12 @@ def request_output_to_completion_response(
     prompt_token_ids = final_res.prompt_token_ids
     assert prompt_token_ids is not None
     prompt_logprobs = final_res.prompt_logprobs
+    if prompt_logprobs:
+        for logprob_dict in prompt_logprobs:
+            if logprob_dict:
+                for logprobs in logprob_dict.values():
+                    if logprobs.logprob == float('-inf'):
+                        logprobs.logprob = -9999.0
     prompt_text = final_res.prompt

     token_ids: GenericSequence[int]
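The motivation for the clamp is that strict JSON has no representation for `-Infinity`, so a `-inf` prompt logprob breaks serialization of the completion response. Below is a minimal, self-contained sketch of the same pass. It is not vLLM's actual code path: plain `{token_id: logprob}` dicts stand in for vLLM's `Logprob` objects (which hold the value on a `.logprob` attribute), and the token ids are made up; only the `-9999.0` sentinel is taken from the commit.

```python
import json

# Hypothetical stand-in for final_res.prompt_logprobs: one entry per
# prompt token. The first token has no logprobs (None); other entries
# map token_id -> logprob. Real vLLM entries are Logprob objects with
# a .logprob attribute; plain floats are used here for illustration.
prompt_logprobs = [None, {101: -0.5, 102: float('-inf')}]

# Clamp -inf to a large negative finite value, mirroring the commit,
# so the response survives strict JSON serialization.
if prompt_logprobs:
    for logprob_dict in prompt_logprobs:
        if logprob_dict:
            for token_id, lp in logprob_dict.items():
                if lp == float('-inf'):
                    logprob_dict[token_id] = -9999.0

# allow_nan=False enforces strict JSON: it would raise ValueError
# if any -inf had been left in place.
serialized = json.dumps(prompt_logprobs, allow_nan=False)
```

Assigning to an existing key while iterating `items()` is safe in Python; only adding or removing keys during iteration would raise.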
