From ea9c6481d618f3e4b1fad536eb3773ebc180993a Mon Sep 17 00:00:00 2001
From: Pei Zhang
Date: Wed, 31 Jan 2024 17:56:13 -0800
Subject: [PATCH] update README.md doc to 2.2 (#6437)

---
 README.md | 22 ++++++++++++++--------
 docs/gpu.md | 4 ++--
 2 files changed, 16 insertions(+), 10 deletions(-)

diff --git a/README.md b/README.md
index b2a5110b5bb..fa5cd88c5ec 100644
--- a/README.md
+++ b/README.md
@@ -30,7 +30,7 @@ started:
To install PyTorch/XLA on a new TPU VM:

```
-pip install torch~=2.1.0 torch_xla[tpu]~=2.1.0 -f https://storage.googleapis.com/libtpu-releases/index.html
+pip install torch~=2.2.0 torch_xla[tpu]~=2.2.0 -f https://storage.googleapis.com/libtpu-releases/index.html
```

To update your existing training loop, make the following changes:
@@ -145,10 +145,12 @@ pip install torch_xla[tpu] -f https://storage.googleapis.com/libtpu-releases/ind
GPU, XRT (legacy runtime), and nightly builds are available in our public GCS
bucket.

-| Version | Cloud TPU VMs Wheel |
+| Version | Cloud TPU/GPU VMs Wheel |
| --- | ----------- |
-| 2.1 (CUDA 12.1 + Python 3.8) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.1.0-cp38-cp38-manylinux_2_28_x86_64.whl` |
-| 2.1 (XRT + Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/xrt/tpuvm/torch_xla-2.1.0%2Bxrt-cp310-cp310-manylinux_2_28_x86_64.whl` |
+| 2.2 (Python 3.8) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.2.0-cp38-cp38-manylinux_2_28_x86_64.whl` |
+| 2.2 (Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.2.0-cp310-cp310-manylinux_2_28_x86_64.whl` |
+| 2.2 (CUDA 12.1 + Python 3.8) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.2.0-cp38-cp38-manylinux_2_28_x86_64.whl` |
+| 2.2 (CUDA 12.1 + Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.2.0-cp310-cp310-manylinux_2_28_x86_64.whl` |
| nightly (Python 3.8) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-nightly-cp38-cp38-linux_x86_64.whl` |
| nightly (Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-nightly-cp310-cp310-linux_x86_64.whl` |
| nightly (CUDA 12.1 + Python 3.8) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-nightly-cp38-cp38-linux_x86_64.whl` |
@@ -159,6 +161,8 @@ bucket.

| Version | Cloud TPU VMs Wheel |
|---------|-------------------|
+| 2.1 (XRT + Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/xrt/tpuvm/torch_xla-2.1.0%2Bxrt-cp310-cp310-manylinux_2_28_x86_64.whl` |
+| 2.1 (Python 3.8) | `https://storage.googleapis.com/tpu-pytorch/wheels/tpuvm/torch_xla-2.1-cp38-cp38-linux_x86_64.whl` |
| 2.0 (Python 3.8) | `https://storage.googleapis.com/tpu-pytorch/wheels/tpuvm/torch_xla-2.0-cp38-cp38-linux_x86_64.whl` |
| 1.13 | `https://storage.googleapis.com/tpu-pytorch/wheels/tpuvm/torch_xla-1.13-cp38-cp38-linux_x86_64.whl` |
| 1.12 | `https://storage.googleapis.com/tpu-pytorch/wheels/tpuvm/torch_xla-1.12-cp38-cp38-linux_x86_64.whl` |
@@ -168,19 +172,19 @@ bucket.
Note: For TPU Pod customers using XRT (our legacy runtime), we have custom
-wheels for `torch`, `torchvision`, and `torch_xla` at
+wheels for `torch` and `torch_xla` at
`https://storage.googleapis.com/tpu-pytorch/wheels/xrt`.

| Package | Cloud TPU VMs Wheel (XRT on Pod, Legacy Only) |
| --- | ----------- |
-| torch_xla | `https://storage.googleapis.com/tpu-pytorch/wheels/xrt/torch_xla-2.0-cp38-cp38-linux_x86_64.whl` |
-| torch | `https://storage.googleapis.com/tpu-pytorch/wheels/xrt/torch-2.0-cp38-cp38-linux_x86_64.whl` |
-| torchvision | `https://storage.googleapis.com/tpu-pytorch/wheels/xrt/torchvision-2.0-cp38-cp38-linux_x86_64.whl` |
+| torch_xla | `https://storage.googleapis.com/pytorch-xla-releases/wheels/xrt/tpuvm/torch_xla-2.1.0%2Bxrt-cp310-cp310-manylinux_2_28_x86_64.whl` |
+| torch | `https://storage.googleapis.com/pytorch-xla-releases/wheels/xrt/tpuvm/torch-2.1.0%2Bxrt-cp310-cp310-linux_x86_64.whl` |

| Version | GPU Wheel + Python 3.8 |
| --- | ----------- |
+| 2.1 + CUDA 11.8 | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/11.8/torch_xla-2.1.0-cp38-cp38-manylinux_2_28_x86_64.whl` |
| 2.0 + CUDA 11.8 | `https://storage.googleapis.com/tpu-pytorch/wheels/cuda/118/torch_xla-2.0-cp38-cp38-linux_x86_64.whl` |
| 2.0 + CUDA 11.7 | `https://storage.googleapis.com/tpu-pytorch/wheels/cuda/117/torch_xla-2.0-cp38-cp38-linux_x86_64.whl` |
| 1.13 | `https://storage.googleapis.com/tpu-pytorch/wheels/cuda/112/torch_xla-1.13-cp38-cp38-linux_x86_64.whl` |
@@ -224,6 +228,7 @@ This is only required on Cloud TPU VMs.

| Version | Cloud TPU VMs Docker |
| --- | ----------- |
+| 2.2 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.2.0_3.10_tpuvm` |
| 2.1 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.1.0_3.10_tpuvm` |
| 2.0 | `gcr.io/tpu-pytorch/xla:r2.0_3.8_tpuvm` |
| 1.13 | `gcr.io/tpu-pytorch/xla:r1.13_3.8_tpuvm` |
@@ -233,6 +238,7 @@ This is only required on Cloud TPU VMs.

| Version | GPU CUDA 12.1 Docker |
| --- | ----------- |
+| 2.2 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.2.0_3.10_cuda_12.1` |
| 2.1 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.1.0_3.10_cuda_12.1` |
| nightly | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:nightly_3.8_cuda_12.1` |
| nightly at date | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:nightly_3.8_cuda_12.1_YYYYMMDD` |
diff --git a/docs/gpu.md b/docs/gpu.md
index 011bff8f806..d0ce667d46f 100644
--- a/docs/gpu.md
+++ b/docs/gpu.md
@@ -61,8 +61,8 @@ source ~/.bashrc
### Wheel

```
-pip3 install torch==2.1
-pip3 install https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.1.0-cp38-cp38-manylinux_2_28_x86_64.whl
+pip3 install torch==2.2.0
+pip3 install https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.2.0-cp38-cp38-manylinux_2_28_x86_64.whl
```

## Run a simple model
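
After installing one of the 2.2 wheels or Docker images updated above, a quick sanity check confirms that the expected versions are in place and that an XLA device is reachable. The snippet below is a minimal sketch and is not part of the patch; it only relies on the standard `torch_xla` API (`torch_xla.core.xla_model.xla_device()`), and the device it reports (TPU, CUDA, or CPU fallback) depends on the runtime available on the machine.

```
# Illustrative post-install check; not part of the patch above.
import torch
import torch_xla
import torch_xla.core.xla_model as xm

print("torch:", torch.__version__)          # expect a 2.2.x version after the upgrade
print("torch_xla:", torch_xla.__version__)  # expect a 2.2.x version after the upgrade

device = xm.xla_device()                    # default XLA device for this runtime
x = torch.randn(3, 3, device=device)
print(x @ x.T)                              # printing materializes the result on the XLA device
```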