
Fix accuracy issue in vLLM LLAMA2 offline scenario #3

Triggered via pull request July 24, 2024 11:18
@arjunsuresh opened #3
Status: Failure
Total duration: 14s
Artifacts: none

cla.yml

on: pull_request_target

cla-check (2s)

Annotations

1 warning
cla-check
The following actions use a deprecated Node.js version and will be forced to run on node20: mlcommons/cla-bot@master. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
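
For context, a minimal sketch of what the cla.yml workflow referenced by this run might look like. The file name, the pull_request_target trigger, the cla-check job name, and the mlcommons/cla-bot@master reference come from the run details above; the job body, runner, and any inputs are assumptions and the actual workflow in the repository may differ.

    # Hypothetical sketch of cla.yml, reconstructed from this run's details.
    name: cla-check

    on: pull_request_target

    jobs:
      cla-check:
        runs-on: ubuntu-latest
        steps:
          # Referencing the action's master branch is what produces the
          # deprecated-Node warning above; pinning a node20-compatible
          # release of the action, once available, would silence it.
          - uses: mlcommons/cla-bot@master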