lvm-video-llama docker image build fail
lvm-video-llama:
  build:
    dockerfile: comps/lvms/src/integrations/dependency/video-llama/Dockerfile
  image: ${REGISTRY:-opea}/lvm-video-llama:${TAG:-latest}
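For reference, a direct docker build equivalent of this compose entry would look roughly as follows; the build context is assumed to be the GenAIComps repository root, and the default REGISTRY/TAG values are expanded:

# run from the root of a GenAIComps checkout (assumed build context)
docker build --no-cache \
  -f comps/lvms/src/integrations/dependency/video-llama/Dockerfile \
  -t opea/lvm-video-llama:latest .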
Possibly same as #1207
I verified that I'm able to build the LVM video llama docker image locally from main.
Priority
P1-Stopper
OS type
Ubuntu
Hardware type
Xeon-SPR
Installation method
Deploy method
Running nodes
Single Node
What's the version?
53fcc6f
Description
The lvm-video-llama docker image fails to build in CI.
Job link: https://github.com/opea-project/GenAIComps/actions/runs/12900670469/job/35971527833#step:5:2964
Reproduce steps
docker compose -f .github/workflows/docker/compose/lvms-compose.yaml build --no-cache
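To rebuild only the failing image rather than every service in the compose file, a single-service variant should also reproduce the failure (service name taken from the compose entry above; whether CI invokes it exactly this way is an assumption):

docker compose -f .github/workflows/docker/compose/lvms-compose.yaml build --no-cache lvm-video-llama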
Raw log
Attachments
No response