Build llama-cpu-nightly #85
Workflow file: build-llama-cpu-nightly.yml
Trigger: schedule
Job: nightly-build-and-push-llama-cpu (7m 4s)

Annotations
1 error and 4 warnings
Error: nightly-build-and-push-llama-cpu
buildx failed with: ERROR: failed to solve: process "/bin/sh -c git clone https://github.com/oobabooga/GPTQ-for-LLaMa.git -b cuda /app/repositories/GPTQ-for-LLaMa" did not complete successfully: exit code: 128
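Exit code 128 from git clone generally means the clone itself failed (for example, the repository or the requested branch could not be reached), rather than a problem with buildx itself. The commands below are a diagnostic sketch for checking this locally; they are not part of the workflow, and the target directory is arbitrary.

    # Check whether the 'cuda' branch still exists on the remote; an empty result
    # means the Dockerfile's clone step will keep failing until it is updated.
    git ls-remote --heads https://github.com/oobabooga/GPTQ-for-LLaMa.git cuda

    # Reproduce the failing build step outside the container (path is arbitrary).
    git clone https://github.com/oobabooga/GPTQ-for-LLaMa.git -b cuda /tmp/GPTQ-for-LLaMa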
Warning: nightly-build-and-push-llama-cpu
Failed to download action 'https://api.github.com/repos/docker/build-push-action/tarball/0a97817b6ade9f46837855d676c4cca3a2471fc9'. Error: Response status code does not indicate success: 500 (Internal Server Error).
Warning: nightly-build-and-push-llama-cpu
Back off 26.807 seconds before retry.
Warning: nightly-build-and-push-llama-cpu
Failed to download action 'https://api.github.com/repos/docker/build-push-action/tarball/0a97817b6ade9f46837855d676c4cca3a2471fc9'. Error: Response status code does not indicate success: 500 (Internal Server Error).
Warning: nightly-build-and-push-llama-cpu
Back off 28.917 seconds before retry.
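The four warnings show the runner hitting transient 500 responses from the GitHub API while downloading the pinned docker/build-push-action tarball, then backing off and retrying; the download evidently succeeded on a later attempt, since the job went on to run the Docker build. As a rough manual check (purely diagnostic, not something the workflow runs), the same tarball URL from the warnings can be probed directly:

    # Print the HTTP status for the pinned action tarball; 200 suggests the pin is
    # valid and the 500s were transient on GitHub's side.
    curl -sSL -o /dev/null -w '%{http_code}\n' \
      https://api.github.com/repos/docker/build-push-action/tarball/0a97817b6ade9f46837855d676c4cca3a2471fc9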