Actions: vllm-project/vllm

All workflows

Showing runs from all workflows
108,726 workflow run results

Making vLLM compatible with Mistral fp8 weights.
yapf #27620: Pull request #10229 synchronize by akllm
November 11, 2024 20:12 · 1m 49s · akllm:vllmfp8mistral

Making vLLM compatible with Mistral fp8 weights.
codespell #410: Pull request #10229 synchronize by akllm
November 11, 2024 20:12 · 21s · akllm:vllmfp8mistral

Making vLLM compatible with Mistral fp8 weights.
ruff #26094: Pull request #10229 synchronize by akllm
November 11, 2024 20:12 · 21s · akllm:vllmfp8mistral

Making vLLM compatible with Mistral fp8 weights.
mypy #21105: Pull request #10229 synchronize by akllm
November 11, 2024 20:12 · 52s · akllm:vllmfp8mistral

[LoRA] Adds support for bias in LoRA
codespell #409: Pull request #5733 synchronize by followumesh
November 11, 2024 20:09 · 22s · followumesh:bias-for-lora

[LoRA] Adds support for bias in LoRA
ruff #26093: Pull request #5733 synchronize by followumesh
November 11, 2024 20:09 · 20s · followumesh:bias-for-lora

[LoRA] Adds support for bias in LoRA
mypy #21104: Pull request #5733 synchronize by followumesh
November 11, 2024 20:09 · 47s · followumesh:bias-for-lora

[LoRA] Adds support for bias in LoRA
yapf #27619: Pull request #5733 synchronize by followumesh
November 11, 2024 20:09 · 1m 47s · followumesh:bias-for-lora

Making vLLM compatible with Mistral fp8 weights.
mypy #21103: Pull request #10229 synchronize by akllm
November 11, 2024 20:01 · 47s · akllm:vllmfp8mistral

Making vLLM compatible with Mistral fp8 weights.
codespell #408: Pull request #10229 synchronize by akllm
November 11, 2024 20:01 · 20s · akllm:vllmfp8mistral

Making vLLM compatible with Mistral fp8 weights.
ruff #26092: Pull request #10229 synchronize by akllm
November 11, 2024 20:01 · 23s · akllm:vllmfp8mistral

Making vLLM compatible with Mistral fp8 weights.
yapf #27618: Pull request #10229 synchronize by akllm
November 11, 2024 20:01 · 1m 45s · akllm:vllmfp8mistral

[V1] Support VLMs with fine-grained scheduling
yapf #27617: Pull request #9871 synchronize by WoosukKwon
November 11, 2024 19:59 · 1m 42s · v1-vlm-sched

[V1] Support VLMs with fine-grained scheduling
codespell #407: Pull request #9871 synchronize by WoosukKwon
November 11, 2024 19:59 · 20s · v1-vlm-sched

[V1] Support VLMs with fine-grained scheduling
ruff #26091: Pull request #9871 synchronize by WoosukKwon
November 11, 2024 19:59 · 21s · v1-vlm-sched

[V1] Support VLMs with fine-grained scheduling
mypy #21102: Pull request #9871 synchronize by WoosukKwon
November 11, 2024 19:59 · 44s · v1-vlm-sched

[V1] Enable custom ops with piecewise CUDA graphs (#10228)
ruff #26090: Commit 9d5b4e4 pushed by WoosukKwon
November 11, 2024 19:58 · 21s · main

[V1] Enable custom ops with piecewise CUDA graphs (#10228)
codespell #406: Commit 9d5b4e4 pushed by WoosukKwon
November 11, 2024 19:58 · 20s · main

[V1] Enable custom ops with piecewise CUDA graphs (#10228)
yapf #27616: Commit 9d5b4e4 pushed by WoosukKwon
November 11, 2024 19:58 · 1m 52s · main

[V1] Enable custom ops with piecewise CUDA graphs (#10228)
mypy #21101: Commit 9d5b4e4 pushed by WoosukKwon
November 11, 2024 19:58 · 43s · main

Making vLLM compatible with Mistral fp8 weights.
codespell #405: Pull request #10229 opened by akllm
November 11, 2024 19:57 · 22s · akllm:vllmfp8mistral

Making vLLM compatible with Mistral fp8 weights.
ruff #26089: Pull request #10229 opened by akllm
November 11, 2024 19:57 · 21s · akllm:vllmfp8mistral

Making vLLM compatible with Mistral fp8 weights.
mypy #21100: Pull request #10229 opened by akllm
November 11, 2024 19:57 · 42s · akllm:vllmfp8mistral