
Commit

new base images
dakinggg committed Feb 12, 2024
1 parent 122f965 commit 7175d66
Showing 2 changed files with 5 additions and 13 deletions.
10 changes: 2 additions & 8 deletions .github/workflows/docker.yaml
```diff
@@ -17,17 +17,11 @@ jobs:
     strategy:
       matrix:
         include:
-          - name: "2.1.0_cu121"
-            base_image: mosaicml/pytorch:2.1.0_cu121-python3.10-ubuntu20.04
-            dep_groups: "[gpu]"
           - name: "2.1.0_cu121_flash2"
-            base_image: mosaicml/pytorch:2.1.0_cu121-python3.10-ubuntu20.04
+            base_image: mosaicml/pytorch:2.1.2_cu121-python3.10-ubuntu20.04
             dep_groups: "[gpu-flash2]"
-          - name: "2.1.0_cu121_aws"
-            base_image: mosaicml/pytorch:2.1.0_cu121-python3.10-ubuntu20.04-aws
-            dep_groups: "[gpu]"
           - name: "2.1.0_cu121_flash2_aws"
-            base_image: mosaicml/pytorch:2.1.0_cu121-python3.10-ubuntu20.04-aws
+            base_image: mosaicml/pytorch:2.1.2_cu121-python3.10-ubuntu20.04-aws
             dep_groups: "[gpu-flash2]"
     steps:
       - name: Maximize Build Space on Worker
```
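The matrix entries above presumably feed a templated image build, with each entry producing one published tag. A minimal sketch of that mapping; note the build-arg names `BASE_IMAGE` and `DEP_GROUPS` are illustrative assumptions, not taken from the workflow shown here:

```shell
# Sketch only: how one matrix entry might drive the image build.
# BASE_IMAGE / DEP_GROUPS build-arg names are hypothetical, not from this diff.
name="2.1.0_cu121_flash2"
base_image="mosaicml/pytorch:2.1.2_cu121-python3.10-ubuntu20.04"
dep_groups="[gpu-flash2]"

# Compose the build command; echoed rather than executed so it can run anywhere.
build_cmd="docker build --build-arg BASE_IMAGE=${base_image} --build-arg DEP_GROUPS=${dep_groups} -t mosaicml/llm-foundry:${name}-latest ."
echo "${build_cmd}"
```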
8 changes: 3 additions & 5 deletions README.md
```diff
@@ -113,11 +113,9 @@ You can select a specific commit hash such as `mosaicml/llm-foundry:1.13.1_cu117
 
 | Docker Image                                           | Torch Version | Cuda Version      | LLM Foundry dependencies installed? |
 | ------------------------------------------------------ | ------------- | ----------------- | ----------------------------------- |
-| `mosaicml/pytorch:2.1.0_cu121-python3.10-ubuntu20.04`  | 2.1.0         | 12.1 (Infiniband) | No                                  |
-| `mosaicml/llm-foundry:2.1.0_cu121-latest`              | 2.1.0         | 12.1 (Infiniband) | Yes (flash attention v1. Warning: Support for flash attention v1 has been deprecated.) |
-| `mosaicml/llm-foundry:2.1.0_cu121_flash2-latest`       | 2.1.0         | 12.1 (Infiniband) | Yes (flash attention v2. Note: We recommend using flash attention v2.) |
-| `mosaicml/llm-foundry:2.1.0_cu121_aws-latest`          | 2.1.0         | 12.1 (EFA)        | Yes (flash attention v1. Warning: Support for flash attention v1 has been deprecated.) |
-| `mosaicml/llm-foundry:2.1.0_cu121_flash2_aws-latest`   | 2.1.0         | 12.1 (EFA)        | Yes (flash attention v2. Note: We recommend using flash attention v2.) |
+| `mosaicml/pytorch:2.1.2_cu121-python3.10-ubuntu20.04`  | 2.1.2         | 12.1 (Infiniband) | No                                  |
+| `mosaicml/llm-foundry:2.1.0_cu121_flash2-latest`       | 2.1.0         | 12.1 (Infiniband) | Yes                                 |
+| `mosaicml/llm-foundry:2.1.0_cu121_flash2_aws-latest`   | 2.1.0         | 12.1 (EFA)        | Yes                                 |
 
 
 # Installation
```
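To use one of the images that remain in the table, pulling by tag works as usual. A sketch; the command is echoed rather than executed so it does not require a Docker daemon:

```shell
# Pull one of the llm-foundry images listed in the README table.
image="mosaicml/llm-foundry:2.1.0_cu121_flash2-latest"
echo "docker pull ${image}"
# docker pull "${image}"   # uncomment where Docker is available
```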
