chore: ⬆️ Update ggerganov/llama.cpp to 924518e2e5726e81f3aeb2518fb85963a500e93a
#5766
Triggered via pull request on January 12, 2025 20:06

Status: Failure
Total duration: 1h 34m 38s
Artifacts: 1
Annotations: 1 error
extras-image-build (cublas, 12, 0, linux/amd64, false, -cublas-cuda12-ffmpeg, true, extras, arc-r... / reusable_image-build
buildx failed with: ERROR: failed to solve: process "/bin/sh -c if [ \"${BUILD_TYPE}\" = \"cublas\" ] || [ \"${BUILD_TYPE}\" = \"hipblas\" ]; then SKIP_GRPC_BACKEND=\"backend-assets/grpc/llama-cpp-avx backend-assets/grpc/llama-cpp-avx2\" make build; else make build; fi" did not complete successfully: exit code: 2
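For readability, the failing step can be read as a Dockerfile fragment. This is a sketch reconstructed from the buildx error above; the base image, the ARG declaration, and the line layout are assumptions, while the shell conditional itself comes verbatim from the log:

```dockerfile
# Base image is an assumption for illustration only.
FROM ubuntu:22.04

ARG BUILD_TYPE

# For cublas/hipblas images, skip building the CPU-only AVX llama.cpp gRPC
# backends and build everything else; otherwise run a plain "make build".
# This is the step that exited with code 2 in the CI run.
RUN if [ "${BUILD_TYPE}" = "cublas" ] || [ "${BUILD_TYPE}" = "hipblas" ]; then \
        SKIP_GRPC_BACKEND="backend-assets/grpc/llama-cpp-avx backend-assets/grpc/llama-cpp-avx2" make build; \
    else \
        make build; \
    fi
```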
Artifacts

Produced during runtime

Name | Size
---|---
mudler~LocalAI~E7GZ6Y.dockerbuild | 254 KB