Hi 🤗

Diffusers 🧨 noticed some failing tests starting with `v4.49.0` in `Kolors`, one of our models that uses a custom text encoder.

### Reproduction

This works on `v4.48.3` but fails on `v4.49.0`.

The issue seems to be that the config in the test model and in checkpoints like `Kwai-Kolors/Kolors-diffusers` contains `torch_dtype` as a string.

On the Diffusers end, explicitly setting `torch_dtype` when using `ChatGLMModel`, and setting a default `torch_dtype` for `from_pretrained` paths, works (huggingface/diffusers#10816). The effects are mainly internal, since `torch_dtype` wasn't being passed for some tests; end users should generally be unaffected, as they would normally pass `torch_dtype` themselves.

---

Hey @hlky! Thanks for reporting this. It seems to have been broken by the recent addition of `dtype` for composite configs. I will submit a PR to fix it.
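To make the failure mode concrete, here is a minimal, self-contained sketch of the kind of normalization a config loader needs when `torch_dtype` is serialized as a string. The helper name `resolve_torch_dtype` is hypothetical (it is not the actual transformers fix), and a `SimpleNamespace` stands in for the `torch` module so the sketch runs without a PyTorch install; with real torch you would pass the imported module instead.

```python
from types import SimpleNamespace

# Stand-in for the torch module so this sketch runs without torch installed;
# in real code you would pass the imported ``torch`` module here instead.
fake_torch = SimpleNamespace(float16="f16", bfloat16="bf16", float32="f32")


def resolve_torch_dtype(value, torch_mod=fake_torch, default="f32"):
    """Hypothetical helper: accept a dtype object, a string name such as
    "float16" (as serialized in configs like Kwai-Kolors/Kolors-diffusers),
    or None, in which case a default dtype is returned."""
    if value is None:
        # Mirrors the Diffusers-side mitigation of defaulting torch_dtype
        # on from_pretrained paths (huggingface/diffusers#10816).
        return default
    if isinstance(value, str):
        name = value.removeprefix("torch.")  # tolerate "torch.float16" too
        if not hasattr(torch_mod, name):
            raise ValueError(f"unknown torch_dtype string: {value!r}")
        return getattr(torch_mod, name)
    return value  # already a dtype object; pass it through unchanged


config = {"torch_dtype": "float16"}  # string, as stored in the checkpoint
print(resolve_torch_dtype(config["torch_dtype"]))  # prints f16
```

The design point is that both call sites keep working: callers that already pass a real dtype object see it returned untouched, while configs that round-tripped through JSON (where dtypes become strings) are mapped back before use.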