
The provided lr scheduler LambdaLR doesn't follow PyTorch's LRScheduler API. #219

Open
benninkcorien opened this issue Mar 30, 2023 · 5 comments


@benninkcorien

  • pytorch 2.0.0
  • python 3.10.10
  • cuda 11.7

I'm trying to run the demo code, but I keep getting this error when it calls ai.train(). Is this an issue with aitextgen or with the pytorch/lightning package?

pytorch_lightning.utilities.exceptions.MisconfigurationException: 
The provided lr scheduler `LambdaLR` doesn't follow PyTorch's LRScheduler API. 
You should override the `LightningModule.lr_scheduler_step` hook with your own logic if you are using a custom LR scheduler.
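
From the message, it sounds like Lightning wants the training module itself to override lr_scheduler_step. Below is a minimal sketch of what that override would look like for the Lightning 1.7.x in my traceback (the hook takes only (scheduler, metric) in Lightning 2.0). This is my own guess at the workaround, not aitextgen's actual training module:

import pytorch_lightning as pl

class PatchedTrainModule(pl.LightningModule):  # hypothetical name, not aitextgen's class
    def lr_scheduler_step(self, scheduler, optimizer_idx, metric):
        # LambdaLR only needs a plain step(); the metric argument is unused here.
        scheduler.step()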

The demo script (with some prints added to see where it fails)

from aitextgen.TokenDataset import TokenDataset
from aitextgen.tokenizers import train_tokenizer
from aitextgen.utils import GPT2ConfigCPU
from aitextgen import aitextgen

file_name = "F:\\AITextGen\\myfile.txt"

print("train tokenizer")

train_tokenizer(file_name)
tokenizer_file = "aitextgen.tokenizer.json"
ai = aitextgen(tokenizer_file=tokenizer_file)

print("build dataset for training")
data = TokenDataset(file_name, tokenizer_file=tokenizer_file, block_size=64)

print("Train using my own text")

ai.train(data, batch_size=8, num_steps=50000, generate_every=1000, save_every=1000)

print("generate response")
# Generate text from it!
ai.generate(10, prompt="My Prompt:")
@benninkcorien
Author

Full error message:

Traceback (most recent call last):
  File "F:\LanguageCodeNLP\AITextGen\trainwithgpu.py", line 33, in <module>
    ai.train(data, batch_size=8, num_steps=50000, generate_every=5000, save_every=5000)
  File "C:\Users\Corien\AppData\Local\Programs\Python\Python310\lib\site-packages\aitextgen\aitextgen.py", line 752, in train
    trainer.fit(train_model)
  File "C:\Users\Corien\AppData\Local\Programs\Python\Python310\lib\site-packages\pytorch_lightning\trainer\trainer.py", line 696, in fit
    self._call_and_handle_interrupt(
  File "C:\Users\Corien\AppData\Local\Programs\Python\Python310\lib\site-packages\pytorch_lightning\trainer\trainer.py", line 650, in _call_and_handle_interrupt
    return trainer_fn(*args, **kwargs)
  File "C:\Users\Corien\AppData\Local\Programs\Python\Python310\lib\site-packages\pytorch_lightning\trainer\trainer.py", line 735, in _fit_impl
    results = self._run(model, ckpt_path=self.ckpt_path)
  File "C:\Users\Corien\AppData\Local\Programs\Python\Python310\lib\site-packages\pytorch_lightning\trainer\trainer.py", line 1147, in _run
    self.strategy.setup(self)
  File "C:\Users\Corien\AppData\Local\Programs\Python\Python310\lib\site-packages\pytorch_lightning\strategies\single_device.py", line 74, in setup
    super().setup(trainer)
  File "C:\Users\Corien\AppData\Local\Programs\Python\Python310\lib\site-packages\pytorch_lightning\strategies\strategy.py", line 153, in setup
    self.setup_optimizers(trainer)
  File "C:\Users\Corien\AppData\Local\Programs\Python\Python310\lib\site-packages\pytorch_lightning\strategies\strategy.py", line 141, in setup_optimizers
    self.optimizers, self.lr_scheduler_configs, self.optimizer_frequencies = _init_optimizers_and_lr_schedulers(
  File "C:\Users\Corien\AppData\Local\Programs\Python\Python310\lib\site-packages\pytorch_lightning\core\optimizer.py", line 194, in _init_optimizers_and_lr_schedulers
    _validate_scheduler_api(lr_scheduler_configs, model)
  File "C:\Users\Corien\AppData\Local\Programs\Python\Python310\lib\site-packages\pytorch_lightning\core\optimizer.py", line 351, in _validate_scheduler_api
    raise MisconfigurationException(
pytorch_lightning.utilities.exceptions.MisconfigurationException: The provided lr scheduler `LambdaLR` doesn't follow PyTorch's LRScheduler API. You should override the `LightningModule.lr_scheduler_step` hook with your own logic if you are using a custom LR scheduler.
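
If I understand the check that's failing, older Lightning validates schedulers with an isinstance test against torch.optim.lr_scheduler._LRScheduler, and in torch 2.0 LambdaLR inherits from the new public LRScheduler base class instead, so that test fails. A quick way to check this locally (my own snippet, assuming that's the cause):

import torch
from torch.optim.lr_scheduler import LambdaLR, _LRScheduler

print(torch.__version__)
# True on torch 1.x; False on torch 2.0, where LambdaLR derives from the
# public LRScheduler base class rather than the old _LRScheduler.
print(issubclass(LambdaLR, _LRScheduler))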

@benninkcorien
Author

          I managed to resolve the error on Google Colab by running this:
          !pip install -qq pytorch-lightning==1.7.0 transformers==4.21.3 aitextgen==0.6.0
          Please let me know if it also solves your issue on Colab.

Originally posted by @analyticray in #218 (comment)

That seems to fix things in Colab for me, but not in the local install.
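
For reference, this is roughly the local equivalent I would expect in a fresh virtual environment (a sketch only, using Windows paths and an arbitrary env name; the final torch downgrade is my guess, since Colab was still shipping torch 1.x at the time, and I'm not sure which CUDA build is right for a local GPU install):

python -m venv aitextgen-env
aitextgen-env\Scripts\activate
pip install -qq pytorch-lightning==1.7.0 transformers==4.21.3 aitextgen==0.6.0
pip install torch==1.13.1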

@darrinh

darrinh commented Mar 30, 2023

On Ubuntu 20.04, installing the versions above results in the same error as the OP.

@darrinh

darrinh commented Apr 4, 2023

The combo !pip install -qq pytorch-lightning==1.7.0 transformers==4.21.3 aitextgen==0.6.0 does not work with Python 3.10.6.

@darrinh

darrinh commented Apr 5, 2023

On Python 3.11, installing pytorch-lightning==1.7.0 results in:

error: can't find Rust compiler
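
My guess is that pip is falling back to building the tokenizers wheel from source here (the pinned transformers 4.21.3 predates Python 3.11 wheels for its tokenizers dependency), and that build needs a Rust toolchain. If that's the case, installing Rust first, or staying on Python 3.10, should presumably get past it:

curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
pip install -qq pytorch-lightning==1.7.0 transformers==4.21.3 aitextgen==0.6.0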
