diff --git a/README.md b/README.md
index 67bb386a62c7..4528dc6ce3ca 100644
--- a/README.md
+++ b/README.md
@@ -26,14 +26,14 @@ started:
 To install PyTorch/XLA stable build in a new TPU VM:
 
 ```
-pip install torch~=2.5.0 torch_xla[tpu]~=2.5.0 -f https://storage.googleapis.com/libtpu-releases/index.html
+pip install torch==2.5.1 torch_xla[tpu]==2.5.1 -f https://storage.googleapis.com/libtpu-releases/index.html
 ```
 
 To install PyTorch/XLA nightly build in a new TPU VM:
 
 ```
 pip3 install --pre torch torchvision --index-url https://download.pytorch.org/whl/nightly/cpu
-pip install 'torch_xla[tpu] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.5.0.dev-cp310-cp310-linux_x86_64.whl' -f https://storage.googleapis.com/libtpu-releases/index.html
+pip install 'torch_xla[tpu] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.6.0.dev-cp310-cp310-linux_x86_64.whl' -f https://storage.googleapis.com/libtpu-releases/index.html
 ```
 
 ### GPU Plugin
@@ -41,7 +41,7 @@ pip install 'torch_xla[tpu] @ https://storage.googleapis.com/pytorch-xla-release
 PyTorch/XLA now provides GPU support through a plugin package similar to `libtpu`:
 
 ```
-pip install torch~=2.5.0 torch_xla~=2.5.0 https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla_cuda_plugin-2.5.0-py3-none-any.whl
+pip install torch==2.5.1 torch_xla==2.5.1 https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla_cuda_plugin-2.5.1-py3-none-any.whl
 ```
 
 ## Getting Started
@@ -154,12 +154,12 @@ GPU and nightly builds are available in our public GCS bucket.
 
 | Version | Cloud GPU VM Wheels |
 | --- | ----------- |
-| 2.5 (CUDA 12.1 + Python 3.9) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.0-cp39-cp39-manylinux_2_28_x86_64.whl` |
-| 2.5 (CUDA 12.1 + Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.0-cp310-cp310-manylinux_2_28_x86_64.whl` |
-| 2.5 (CUDA 12.1 + Python 3.11) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.0-cp311-cp311-manylinux_2_28_x86_64.whl` |
-| 2.5 (CUDA 12.4 + Python 3.9) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.4/torch_xla-2.5.0-cp39-cp39-manylinux_2_28_x86_64.whl` |
-| 2.5 (CUDA 12.4 + Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.4/torch_xla-2.5.0-cp310-cp310-manylinux_2_28_x86_64.whl` |
-| 2.5 (CUDA 12.4 + Python 3.11) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.4/torch_xla-2.5.0-cp311-cp311-manylinux_2_28_x86_64.whl` |
+| 2.5.1 (CUDA 12.1 + Python 3.9) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.1-cp39-cp39-manylinux_2_28_x86_64.whl` |
+| 2.5.1 (CUDA 12.1 + Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.1-cp310-cp310-manylinux_2_28_x86_64.whl` |
+| 2.5.1 (CUDA 12.1 + Python 3.11) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.1-cp311-cp311-manylinux_2_28_x86_64.whl` |
+| 2.5.1 (CUDA 12.4 + Python 3.9) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.4/torch_xla-2.5.1-cp39-cp39-manylinux_2_28_x86_64.whl` |
+| 2.5.1 (CUDA 12.4 + Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.4/torch_xla-2.5.1-cp310-cp310-manylinux_2_28_x86_64.whl` |
+| 2.5.1 (CUDA 12.4 + Python 3.11) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.4/torch_xla-2.5.1-cp311-cp311-manylinux_2_28_x86_64.whl` |
 | nightly (Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.6.0.dev-cp310-cp310-linux_x86_64.whl` |
 | nightly (CUDA 12.1 + Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.6.0.dev-cp310-cp310-linux_x86_64.whl` |
 
@@ -193,6 +193,7 @@ The torch wheel version `2.5.0.dev20240820+cpu` can be found at https://download
 
 | Version | Cloud TPU VMs Wheel |
 |---------|-------------------|
+| 2.5 (Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.5.0-cp310-cp310-manylinux_2_28_x86_64.whl` |
 | 2.4 (Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.4.0-cp310-cp310-manylinux_2_28_x86_64.whl` |
 | 2.3 (Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.3.0-cp310-cp310-manylinux_2_28_x86_64.whl` |
 | 2.2 (Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.2.0-cp310-cp310-manylinux_2_28_x86_64.whl` |
@@ -203,6 +204,12 @@ The torch wheel version `2.5.0.dev20240820+cpu` can be found at https://download
 
 | Version | GPU Wheel |
 | --- | ----------- |
+| 2.5.1 (CUDA 12.1 + Python 3.9) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.1-cp39-cp39-manylinux_2_28_x86_64.whl` |
+| 2.5.1 (CUDA 12.1 + Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.1-cp310-cp310-manylinux_2_28_x86_64.whl` |
+| 2.5.1 (CUDA 12.1 + Python 3.11) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.1-cp311-cp311-manylinux_2_28_x86_64.whl` |
+| 2.5.1 (CUDA 12.4 + Python 3.9) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.4/torch_xla-2.5.1-cp39-cp39-manylinux_2_28_x86_64.whl` |
+| 2.5.1 (CUDA 12.4 + Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.4/torch_xla-2.5.1-cp310-cp310-manylinux_2_28_x86_64.whl` |
+| 2.5.1 (CUDA 12.4 + Python 3.11) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.4/torch_xla-2.5.1-cp311-cp311-manylinux_2_28_x86_64.whl` |
 | 2.5 (CUDA 12.1 + Python 3.9) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.0-cp39-cp39-manylinux_2_28_x86_64.whl` |
 | 2.5 (CUDA 12.1 + Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.0-cp310-cp310-manylinux_2_28_x86_64.whl` |
 | 2.5 (CUDA 12.1 + Python 3.11) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.5.0-cp311-cp311-manylinux_2_28_x86_64.whl` |
@@ -226,6 +233,7 @@ The torch wheel version `2.5.0.dev20240820+cpu` can be found at https://download
 
 | Version | Cloud TPU VMs Docker |
 | --- | ----------- |
+| 2.5.1 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.5.1_3.10_tpuvm` |
 | 2.5 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.5.0_3.10_tpuvm` |
 | 2.4 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.4.0_3.10_tpuvm` |
 | 2.3 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.3.0_3.10_tpuvm` |
@@ -243,6 +251,7 @@ docker run --privileged --net host --shm-size=16G -it us-central1-docker.pkg.dev
 
 | Version | GPU CUDA 12.4 Docker |
 | --- | ----------- |
+| 2.5.1 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.5.1_3.10_cuda_12.4` |
 | 2.5 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.5.0_3.10_cuda_12.4` |
 | 2.4 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.4.0_3.10_cuda_12.4` |
 
@@ -251,6 +260,7 @@ docker run --privileged --net host --shm-size=16G -it us-central1-docker.pkg.dev
 
 | Version | GPU CUDA 12.1 Docker |
 | --- | ----------- |
+| 2.5.1 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.5.1_3.10_cuda_12.1` |
 | 2.5 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.5.0_3.10_cuda_12.1` |
 | 2.4 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.4.0_3.10_cuda_12.1` |
 | 2.3 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.3.0_3.10_cuda_12.1` |
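
After installing one of the wheels above, a quick way to confirm the runtime works is to allocate a tensor on an XLA device. The snippet below is only a minimal sketch, assuming a TPU VM with the stable `torch==2.5.1` and `torch_xla[tpu]==2.5.1` wheels installed; adjust the expected version string for other builds.

```
# Minimal post-install sanity check (sketch): assumes torch==2.5.1 and
# torch_xla[tpu]==2.5.1 were installed on a TPU VM as shown above.
import torch
import torch_xla.core.xla_model as xm

device = xm.xla_device()              # default XLA device, e.g. a TPU core
x = torch.randn(2, 2, device=device)  # tensor lives on the XLA device
y = x @ x                             # traced lazily, compiled by XLA
print(torch.__version__)              # expected to start with 2.5.1
print(device, y.cpu())                # .cpu() forces execution and transfer
```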