OS: Ubuntu 22.04.3 LTS
Kernel: 6.6.56+
Image: CONTAINER_NAME=v157, BUILD_DATE=20250205
total used free shared buff/cache available
Mem: 31Gi 871Mi 21Gi 1.0Mi 8.7Gi 30Gi
Swap: 0B 0B 0B
My NVIDIA driver version is 560.35.03.
lrwxrwxrwx 1 root root 22 Nov 10 2023 cuda -> /etc/alternatives/cuda
lrwxrwxrwx 1 root root 25 Nov 10 2023 cuda-12 -> /etc/alternatives/cuda-12
drwxr-xr-x 1 root root 4096 Nov 10 2023 cuda-12.2
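For reference, the environment report above can be reproduced with a few standard commands (a sketch; the exact flags the reporter used are an assumption):

```shell
# Sketch of the commands behind the report above (flags are assumptions).
. /etc/os-release && echo "$PRETTY_NAME"    # OS release string
uname -r                                    # kernel version
free -h                                     # memory / swap usage
# Driver version; guarded because nvidia-smi is absent on non-GPU hosts
command -v nvidia-smi >/dev/null 2>&1 && \
    nvidia-smi --query-gpu=driver_version --format=csv,noheader
# CUDA toolkit symlinks and directories; || true keeps a clean exit on non-CUDA hosts
ls -l /usr/local | grep cuda || true
```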
2.5.1+cu121 True 12.1 ['sm_50', 'sm_60', 'sm_70', 'sm_75', 'sm_80', 'sm_86', 'sm_90']
(7, 5) _CudaDeviceProperties(name='Tesla T4', major=7, minor=5, total_memory=15095MB, multi_processor_count=40, uuid=60e7409d-d0f8-c0a8-4de6-2eee400354ad, L2_cache_size=4MB)
12.1
True 90100
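The four lines above look like the output of the standard `torch` introspection calls. A minimal sketch that degrades gracefully on machines without torch or a GPU (assumption: the reporter may have printed these with slightly different calls):

```python
# Sketch of the torch introspection calls that appear to have produced the
# report above (version/arch list, device properties, cuDNN availability).
import importlib.util


def collect_torch_info() -> list[str]:
    """Return environment-report lines; degrades gracefully without torch/GPU."""
    if importlib.util.find_spec("torch") is None:
        return ["torch not installed"]
    import torch

    lines = [
        f"{torch.__version__} {torch.cuda.is_available()} "
        f"{torch.version.cuda} {torch.cuda.get_arch_list()}"
    ]
    if torch.cuda.is_available():
        dev = torch.cuda.current_device()
        lines.append(
            f"{torch.cuda.get_device_capability(dev)} "
            f"{torch.cuda.get_device_properties(dev)}"
        )
    lines.append(
        f"{torch.backends.cudnn.is_available()} {torch.backends.cudnn.version()}"
    )
    return lines


if __name__ == "__main__":
    print("\n".join(collect_torch_info()))
```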
pytorch-ignite 0.5.1
pytorch-lightning 2.5.0.post0
torch 2.5.1+cu121
torchaudio 2.5.1+cu121
torchinfo 1.8.0
torchmetrics 1.6.1
torchsummary 1.5.1
torchtune 0.5.0
torchvision 0.20.1+cu121
Looking in indexes: https://flashinfer.ai/whl/cu121/torch2.5/
Collecting flashinfer-python==0.2.1.post1
Using cached https://github.com/flashinfer-ai/flashinfer/releases/download/v0.2.1.post1/flashinfer_python-0.2.1.post1%2Bcu121torch2.5-cp38-abi3-linux_x86_64.whl (527.1 MB)
INFO: pip is looking at multiple versions of flashinfer-python to determine which version is compatible with other requirements. This could take a while.
ERROR: Could not find a version that satisfies the requirement torch==2.5.* (from flashinfer-python) (from versions: none)
ERROR: No matching distribution found for torch==2.5.*
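One likely cause (an assumption, not confirmed by the log): passing the flashinfer wheel index as the sole index replaces PyPI entirely, so pip cannot resolve the `torch==2.5.*` dependency from that index and reports "(from versions: none)". A workaround sketch, assuming torch 2.5.1+cu121 is already installed as shown in the package list above:

```shell
# Keep PyPI available for dependency resolution instead of replacing it:
pip install flashinfer-python==0.2.1.post1 \
    -i https://flashinfer.ai/whl/cu121/torch2.5/ \
    --extra-index-url https://pypi.org/simple

# Alternatively, skip dependency resolution entirely, since torch is already present:
# pip install --no-deps flashinfer-python==0.2.1.post1 \
#     -i https://flashinfer.ai/whl/cu121/torch2.5/
```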
Environment: Kaggle notebook with 2× Tesla T4 GPUs.