Commit 82a61b5
Limit trl version in example (#12332)
1 parent 923d696

15 files changed, +29 -29 lines changed
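Every hunk in this commit replaces a bare `trl` requirement with the upper-bounded specifier `"trl<0.12.0"`. Two details are worth noting: pip compares versions under PEP 440 (an exclusive upper bound, so 0.12.0 itself is rejected), and the specifier must be quoted because an unquoted `<` is interpreted by POSIX shells as input redirection, leaving pip to see only `pip install trl`. A minimal sketch of the bound check, assuming simple dotted release versions; the helper names are illustrative, not part of pip or trl:

```python
def parse(version: str) -> tuple:
    """Turn a dotted release version like '0.11.4' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def satisfies_upper_bound(version: str, bound: str = "0.12.0") -> bool:
    """True if `version` would be accepted by the specifier '<bound'.

    Tuple comparison mirrors PEP 440 ordering for plain release
    versions (no pre-/post-release segments handled here).
    """
    return parse(version) < parse(bound)

print(satisfies_upper_bound("0.11.4"))  # True: installable under "trl<0.12.0"
print(satisfies_upper_bound("0.12.0"))  # False: the bound is exclusive
```

In practice, `pip install "trl<0.12.0"` resolves to the newest release strictly below 0.12.0.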

python/llm/example/CPU/HF-Transformers-AutoModels/Model/glm-4v/README.md (+2 -2)

@@ -19,7 +19,7 @@ conda activate llm
 # install ipex-llm with 'all' option
 pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu

-pip install torchvision tiktoken transformers==4.42.4 trl
+pip install torchvision tiktoken transformers==4.42.4 "trl<0.12.0"
 ```

 On Windows:
@@ -30,7 +30,7 @@ conda activate llm

 pip install --pre --upgrade ipex-llm[all]

-pip install torchvision tiktoken transformers==4.42.4 trl
+pip install torchvision tiktoken transformers==4.42.4 "trl<0.12.0"
 ```

 ### 2. Run

python/llm/example/CPU/HF-Transformers-AutoModels/Model/glm4/README.md (+2 -2)

@@ -18,7 +18,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu

 # install packages required for GLM-4
-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
 ```

 On Windows:
@@ -29,7 +29,7 @@ conda activate llm

 pip install --pre --upgrade ipex-llm[all]

-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
 ```

 ## 2. Run

python/llm/example/CPU/HF-Transformers-AutoModels/Model/llama3.1/README.md (+2 -2)

@@ -20,7 +20,7 @@ pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pyt

 # transformers>=4.43.1 is required for Llama3.1 with IPEX-LLM optimizations
 pip install transformers==4.43.1
-pip install trl
+pip install "trl<0.12.0"
 ```
 On Windows:

@@ -31,7 +31,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[all]

 pip install transformers==4.43.1
-pip install trl
+pip install "trl<0.12.0"
 ```

 ### 2. Run

python/llm/example/CPU/HF-Transformers-AutoModels/Model/minicpm-v-2_6/README.md (+2 -2)

@@ -18,7 +18,7 @@ conda activate llm
 # install ipex-llm with 'all' option
 pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
 pip install torchvision==0.16.2 --index-url https://download.pytorch.org/whl/cpu
-pip install transformers==4.40.0 trl
+pip install transformers==4.40.0 "trl<0.12.0"
 ```
 On Windows:

@@ -28,7 +28,7 @@ conda activate llm

 pip install --pre --upgrade ipex-llm[all]
 pip install torchvision==0.16.2 --index-url https://download.pytorch.org/whl/cpu
-pip install transformers==4.41.0 trl
+pip install transformers==4.41.0 "trl<0.12.0"
 ```

 ### 2. Run

python/llm/example/CPU/PyTorch-Models/Model/glm4/README.md (+2 -2)

@@ -21,7 +21,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu

 # install packages required for GLM-4
-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
 ```

 On Windows:
@@ -32,7 +32,7 @@ conda activate llm

 pip install --pre --upgrade ipex-llm[all]

-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
 ```

 ### 2. Run

python/llm/example/GPU/HuggingFace/LLM/gemma2/README.md (+3 -3)

@@ -4,7 +4,7 @@ In this directory, you will find examples on how you could apply IPEX-LLM INT4 o
 ## Requirements
 To run these examples with IPEX-LLM on Intel GPUs, we have some recommended requirements for your machine, please refer to [here](../../../README.md#requirements) for more information.

-**Important: According to Gemma2's requirement, please make sure you have installed `transformers==4.43.1` and `trl` to run the example.**
+**Important: According to Gemma2's requirement, please make sure you have installed `transformers==4.43.1` and `trl<0.12.0` to run the example.**

 ## Example: Predict Tokens using `generate()` API
 In the example [generate.py](./generate.py), we show a basic use case for a Gemma2 model to predict the next N tokens using `generate()` API, with IPEX-LLM INT4 optimizations on Intel GPUs.
@@ -19,7 +19,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte

 # According to Gemma2's requirement, please make sure you are using a stable version of Transformers, 4.43.1 or newer.
 pip install "transformers>=4.43.1"
-pip install trl
+pip install "trl<0.12.0"
 ```

 #### 1.2 Installation on Windows
@@ -33,7 +33,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte

 # According to Gemma2's requirement, please make sure you are using a stable version of Transformers, 4.43.1 or newer.
 pip install "transformers>=4.43.1"
-pip install trl
+pip install "trl<0.12.0"
 ```

 ### 2. Configures OneAPI environment variables for Linux

python/llm/example/GPU/HuggingFace/LLM/glm4/README.md (+2 -2)

@@ -14,7 +14,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

 # install packages required for GLM-4
-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
 ```

 ### 1.2 Installation on Windows
@@ -27,7 +27,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

 # install packages required for GLM-4
-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
 ```

 ## 2. Configures OneAPI environment variables for Linux

python/llm/example/GPU/HuggingFace/LLM/llama3.1/README.md (+2 -2)

@@ -17,7 +17,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte

 # transformers>=4.43.1 is required for Llama3.1 with IPEX-LLM optimizations
 pip install transformers==4.43.1
-pip install trl
+pip install "trl<0.12.0"
 ```

 #### 1.2 Installation on Windows
@@ -31,7 +31,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte

 # transformers>=4.43.1 is required for Llama3.1 with IPEX-LLM optimizations
 pip install transformers==4.43.1
-pip install trl
+pip install "trl<0.12.0"
 ```

 ### 2. Configures OneAPI environment variables for Linux

python/llm/example/GPU/HuggingFace/LLM/llama3.2/README.md (+2 -2)

@@ -17,7 +17,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte

 pip install transformers==4.45.0
 pip install accelerate==0.33.0
-pip install trl
+pip install "trl<0.12.0"
 ```

 #### 1.2 Installation on Windows
@@ -31,7 +31,7 @@ pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-exte

 pip install transformers==4.45.0
 pip install accelerate==0.33.0
-pip install trl
+pip install "trl<0.12.0"
 ```

 ### 2. Configures OneAPI environment variables for Linux

python/llm/example/GPU/HuggingFace/Multimodal/MiniCPM-Llama3-V-2_5/README.md (+2 -2)

@@ -15,7 +15,7 @@ conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

-pip install transformers==4.41.0 trl
+pip install transformers==4.41.0 "trl<0.12.0"
 ```

 #### 1.2 Installation on Windows
@@ -27,7 +27,7 @@ conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

-pip install transformers==4.41.0 trl
+pip install transformers==4.41.0 "trl<0.12.0"
 ```

 ### 2. Configures OneAPI environment variables for Linux

python/llm/example/GPU/HuggingFace/Multimodal/MiniCPM-V-2_6/README.md (+2 -2)

@@ -15,7 +15,7 @@ conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

-pip install transformers==4.40.0 trl
+pip install transformers==4.40.0 "trl<0.12.0"
 ```

 #### 1.2 Installation on Windows
@@ -27,7 +27,7 @@ conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

-pip install transformers==4.40.0 trl
+pip install transformers==4.40.0 "trl<0.12.0"
 ```

 ### 2. Configures OneAPI environment variables for Linux

python/llm/example/GPU/HuggingFace/Multimodal/glm-4v/README.md (+2 -2)

@@ -15,7 +15,7 @@ conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

-pip install tiktoken transformers==4.42.4 trl
+pip install tiktoken transformers==4.42.4 "trl<0.12.0"
 ```

 #### 1.2 Installation on Windows
@@ -27,7 +27,7 @@ conda activate llm
 # below command will install intel_extension_for_pytorch==2.1.10+xpu as default
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

-pip install tiktoken transformers==4.42.4 trl
+pip install tiktoken transformers==4.42.4 "trl<0.12.0"
 ```

 ### 2. Configures OneAPI environment variables for Linux

python/llm/example/GPU/LLM-Finetuning/QLoRA/trl-example/README.md (+1 -1)

@@ -19,7 +19,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
 pip install transformers==4.36.0 datasets
 pip install peft==0.10.0
-pip install bitsandbytes scipy trl
+pip install bitsandbytes scipy "trl<0.12.0"
 ```

 ### 2. Configures OneAPI environment variables

python/llm/example/GPU/Lightweight-Serving/README.md (+1 -1)

@@ -41,7 +41,7 @@ pip install gradio # for gradio web UI
 conda install -c conda-forge -y gperftools=2.10 # to enable tcmalloc

 # for glm-4v-9b
-pip install transformers==4.42.4 trl
+pip install transformers==4.42.4 "trl<0.12.0"

 # for internlm-xcomposer2-vl-7b
 pip install transformers==4.31.0

python/llm/example/GPU/PyTorch-Models/Model/glm4/README.md (+2 -2)

@@ -16,7 +16,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

 # install packages required for GLM-4
-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
 ```

 #### 1.2 Installation on Windows
@@ -29,7 +29,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

 # install packages required for GLM-4
-pip install "tiktoken>=0.7.0" transformers==4.42.4 trl
+pip install "tiktoken>=0.7.0" transformers==4.42.4 "trl<0.12.0"
 ```

 ### 2. Configures OneAPI environment variables for Linux
