Document the CUDA plugin in README (#7345)
will-cromar authored Jun 25, 2024
1 parent 53c77e2 commit b505288
Showing 1 changed file with 14 additions and 2 deletions.
README.md: 14 additions & 2 deletions
@@ -19,9 +19,9 @@ started:
 * [Distributed PyTorch/XLA
   Basics](https://github.com/pytorch/xla/blob/master/contrib/kaggle/distributed-pytorch-xla-basics-with-pjrt.ipynb)

-## Getting Started
+## Installation

-**PyTorch/XLA is now on PyPI!**
+### TPU

 To install PyTorch/XLA stable build in a new TPU VM:

@@ -36,6 +36,18 @@ pip3 install --pre torch torchvision --index-url https://download.pytorch.org/wh
 pip3 install https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-nightly-cp310-cp310-linux_x86_64.whl
 ```

+### GPU Plugin (beta)
+
+PyTorch/XLA now provides GPU support through a plugin package similar to `libtpu`:
+
+```
+pip install torch~=2.3.0 torch_xla~=2.3.0 https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla_cuda_plugin-2.3.0-py3-none-any.whl
+```
+
+To use the plugin, set `XLA_REGISTER_INSTALLED_PLUGINS=1` or call `torch_xla.experimental.plugins.use_dynamic_plugins()` in your script.
+
+## Getting Started
+
 To update your existing training loop, make the following changes:

 ```diff
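The training-loop diff referenced above is collapsed in this view. As a rough illustration only, not the README's own example, a minimal single-device PyTorch/XLA loop typically looks like the sketch below; the model, data, and hyperparameters are placeholders:

```python
# Rough single-device PyTorch/XLA pattern (placeholder model and data,
# not the README's collapsed example).
import torch
import torch_xla.core.xla_model as xm

device = xm.xla_device()                      # instead of torch.device("cuda")
model = torch.nn.Linear(10, 2).to(device)     # move the model to the XLA device
loss_fn = torch.nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Tiny in-memory dataset so the sketch is self-contained.
loader = [(torch.randn(8, 10), torch.randint(0, 2, (8,))) for _ in range(3)]

for data, target in loader:
    data, target = data.to(device), target.to(device)   # move each batch to the XLA device
    optimizer.zero_grad()
    loss = loss_fn(model(data), target)
    loss.backward()
    optimizer.step()
    xm.mark_step()    # materialize the lazily recorded XLA graph for this step
```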
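For the GPU plugin section added above, here is a minimal sketch of exercising the two documented activation options. The `XLA_REGISTER_INSTALLED_PLUGINS=1` variable and `use_dynamic_plugins()` come from the README change itself; `PJRT_DEVICE=CUDA` and setting variables before importing `torch_xla` are assumptions about the runtime, not part of this commit:

```python
# Minimal check that the CUDA plugin is picked up.
import os

os.environ["XLA_REGISTER_INSTALLED_PLUGINS"] = "1"   # option 1: env var (from the README change)
os.environ.setdefault("PJRT_DEVICE", "CUDA")         # assumed device selection, not stated in this commit

import torch
import torch_xla.core.xla_model as xm

# Option 2 (instead of the env var), as described in the README change:
# import torch_xla.experimental.plugins as plugins
# plugins.use_dynamic_plugins()

device = xm.xla_device()
x = torch.ones(2, 2, device=device) + 1
print(x.device)   # e.g. xla:0 when the GPU plugin is active
```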
