Fix for judge pipeline when data is filled
Signed-off-by: Igor Gitman <[email protected]>
Kipok committed Dec 16, 2024
1 parent d94bc47 commit 919bfe3
Showing 1 changed file with 3 additions and 0 deletions.
3 changes: 3 additions & 0 deletions nemo_skills/inference/llm_math_judge.py
@@ -142,6 +142,9 @@ def llm_math_judge(cfg: LlmMathJudgeConfig):
     # additionally, skipping whatever is pre-filled, assuming offset didn't change
     data = data[starting_idx:]
 
+    if len(data) == 0:  # we might not have any examples if skip_filled=True
+        return
+
     prompt = get_prompt(cfg.prompt_config, cfg.prompt_template, examples_type=cfg.examples_type)
     LOG.info("Prompt used: %s", prompt)
     LOG.info("Example prompt:\nData dictionary: %s\nPrompt: %s", data[0], prompt.fill(data[0]))
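For context, the added guard prevents an IndexError on the `data[0]` logging call that follows it: when pre-filled entries are skipped, a rerun over an already-completed output leaves nothing after the starting_idx slice. Below is a minimal sketch of that failure mode and the fix; the judge() helper and its arguments are hypothetical stand-ins for the real config-driven data loading in llm_math_judge.py.

# Minimal sketch; judge() and its signature are hypothetical stand-ins
# for the actual LlmMathJudgeConfig-driven loading in llm_math_judge.py.
def judge(data: list[dict], starting_idx: int) -> None:
    # Skip whatever was already filled on a previous run.
    data = data[starting_idx:]

    # Without this guard, data[0] below raises IndexError when every
    # example was already filled (e.g. skip_filled=True on a full rerun).
    if len(data) == 0:
        return

    print("Example data dictionary:", data[0])

# A fully completed rerun: starting_idx == len(data), so the judge exits early.
judge([{"problem": "1+1", "expected_answer": "2"}], starting_idx=1)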
