[doc] updated installation command (#5389)
FrankLeeeee authored Feb 19, 2024
1 parent 69e3ad0 commit 705a62a
Showing 3 changed files with 12 additions and 12 deletions.
8 changes: 4 additions & 4 deletions README.md
@@ -398,10 +398,10 @@ pip install colossalai

**Note: only Linux is supported for now.**

-However, if you want to build the PyTorch extensions during installation, you can set `CUDA_EXT=1`.
+However, if you want to build the PyTorch extensions during installation, you can set `BUILD_EXT=1`.

```bash
-CUDA_EXT=1 pip install colossalai
+BUILD_EXT=1 pip install colossalai
```

**Otherwise, CUDA kernels will be built during runtime when you actually need them.**
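The `BUILD_EXT=1` prefix in the commands above sets the variable only for that single `pip` invocation, which the package's build script reads at build time. A minimal sketch of that scoping behavior (plain POSIX shell, no ColossalAI required):

```shell
# A VAR=value prefix exports the variable to that one command only;
# it does not persist in the surrounding shell session.
BUILD_EXT=1 sh -c 'echo "inside: BUILD_EXT=${BUILD_EXT}"'   # → inside: BUILD_EXT=1
echo "after: BUILD_EXT=${BUILD_EXT:-unset}"                  # → after: BUILD_EXT=unset
```

This is why the variable must be placed on the same line as `pip install`, rather than set on a previous line without `export`.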
@@ -429,7 +429,7 @@ By default, we do not compile CUDA/C++ kernels. ColossalAI will build them durin
If you want to install and enable CUDA kernel fusion (compulsory installation when using fused optimizer):

```shell
-CUDA_EXT=1 pip install .
+BUILD_EXT=1 pip install .
```

For Users with CUDA 10.2, you can still build ColossalAI from source. However, you need to manually download the cub library and copy it to the corresponding directory.
@@ -445,7 +445,7 @@
```shell
unzip 1.8.0.zip
cp -r cub-1.8.0/cub/ colossalai/kernel/cuda_native/csrc/kernels/include/

# install
-CUDA_EXT=1 pip install .
+BUILD_EXT=1 pip install .
```
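The cub copy step above is easy to get wrong, so a quick check that the headers landed where the build expects them may help (path taken from the snippet above; the assumption that `cub.cuh` is cub's umbrella header is mine, not from this diff):

```shell
# Verify the cub headers are in the include path used by the kernels.
# Run from the ColossalAI repository root after the cp step above.
if test -f colossalai/kernel/cuda_native/csrc/kernels/include/cub/cub.cuh; then
  echo "cub headers in place"
else
  echo "cub headers missing"
fi
```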

<p align="right">(<a href="#top">back to top</a>)</p>
6 changes: 3 additions & 3 deletions docs/source/en/get_started/installation.md
@@ -23,7 +23,7 @@ pip install colossalai
If you want to build PyTorch extensions during installation, you can use the command below. Otherwise, the PyTorch extensions will be built during runtime.

```shell
-CUDA_EXT=1 pip install colossalai
+BUILD_EXT=1 pip install colossalai
```
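Building the extensions at install time needs the CUDA toolchain on `PATH`; a small hedged sketch for checking this before opting in (it assumes only that `nvcc` ships with the CUDA toolkit, which is standard):

```shell
# Check for the CUDA compiler before opting into a source build of the kernels.
if command -v nvcc >/dev/null 2>&1; then
  echo "nvcc available: ok to use BUILD_EXT=1"
else
  echo "nvcc missing: install the CUDA toolkit first, or skip BUILD_EXT"
fi
```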


@@ -39,7 +39,7 @@
```shell
cd ColossalAI
pip install -r requirements/requirements.txt

# install colossalai
-CUDA_EXT=1 pip install .
+BUILD_EXT=1 pip install .
```

If you don't want to install and enable CUDA kernel fusion (compulsory installation when using fused optimizer), just don't specify the `CUDA_EXT`:
@@ -61,7 +61,7 @@
```shell
unzip 1.8.0.zip
cp -r cub-1.8.0/cub/ colossalai/kernel/cuda_native/csrc/kernels/include/

# install
-CUDA_EXT=1 pip install .
+BUILD_EXT=1 pip install .
```

<!-- doc-test-command: echo "installation.md does not need test" -->
10 changes: 5 additions & 5 deletions docs/source/zh-Hans/get_started/installation.md
@@ -20,10 +20,10 @@ pip install colossalai

**Note: only Linux is supported for now.**

-If you want to build the PyTorch extensions during installation, you can add `CUDA_EXT=1`. Otherwise, the extensions will be built automatically at runtime.
+If you want to build the PyTorch extensions during installation, you can add `BUILD_EXT=1`. Otherwise, the extensions will be built automatically at runtime.

```shell
-CUDA_EXT=1 pip install colossalai
+BUILD_EXT=1 pip install colossalai
```

## Install from Source
@@ -38,10 +38,10 @@
```shell
cd ColossalAI
pip install -r requirements/requirements.txt

# install colossalai
-CUDA_EXT=1 pip install .
+BUILD_EXT=1 pip install .
```

-If you don't want to install and enable CUDA kernel fusion (compulsory when using the fused optimizer), you can omit `CUDA_EXT=1`:
+If you don't want to install and enable CUDA kernel fusion (compulsory when using the fused optimizer), you can omit `BUILD_EXT=1`:

```shell
pip install .
```
@@ -60,7 +60,7 @@
```shell
unzip 1.8.0.zip
cp -r cub-1.8.0/cub/ colossalai/kernel/cuda_native/csrc/kernels/include/

# install
-CUDA_EXT=1 pip install .
+BUILD_EXT=1 pip install .
```

<!-- doc-test-command: echo "installation.md does not need test" -->
