
Commit 8629d31

Add info on novel fine-tuning method
1 parent bdadc89 commit 8629d31

File tree

1 file changed: +11 −0 lines changed


README.md

@@ -8,6 +8,17 @@ This repo is for fine-tuning CLIP in the command line. It does not add custom no
### 👇 Scroll all the way down for step-by-step instructions with ComfyUI! 👇

### ‼️ Don't want to fine-tune? You can download the model here: [https://huggingface.co/zer0int](https://huggingface.co/zer0int)

-------

## Changes 09-MAR-2025:
⚠️ A new way to fine-tune CLIP: 🌟 [github.com/zer0int/CLIP-fine-tune-registers-gated](https://github.com/zer0int/CLIP-fine-tune-registers-gated) 🌟
- But: Is it for you? 🤔
- You want a Text Encoder for T2I / T2V / Gen-AI, or the best zero-shot accuracy? No / not necessarily. ❌

- You want a CLIP that is compatible with everything (no architecture change)? No / stick with this repo. ❌
- You are frustrated by the modality gap and want a retrieval CLIP? Absolutely yes! [CLICK ME](https://github.com/zer0int/CLIP-fine-tune-registers-gated)
- In a nutshell: the new CLIP has +20M params, register tokens, and Gated MLP / Fusion.

- Modality Gap (OpenAI pre-trained): 0.8276 --> (NEW CLIP): 0.4740 👈🤯

- Attention heatmaps are finally meaningful, not "burnt-in artifacts".

- Check out the models on my HuggingFace: [huggingface.co/zer0int/CLIP-Registers-Gated_MLP-ViT-L-14](https://huggingface.co/zer0int/CLIP-Registers-Gated_MLP-ViT-L-14)
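For intuition, the modality-gap numbers above can be measured with a simple sketch. One common definition (an assumption here; the repo may compute it differently) is the Euclidean distance between the centroids of L2-normalized image and text embeddings. The embeddings below are random stand-ins, not real CLIP outputs, and `modality_gap` is a hypothetical helper:

```python
import numpy as np

def modality_gap(image_embs: np.ndarray, text_embs: np.ndarray) -> float:
    """Euclidean distance between the centroids of L2-normalized
    image and text embeddings (one common modality-gap metric)."""
    img = image_embs / np.linalg.norm(image_embs, axis=1, keepdims=True)
    txt = text_embs / np.linalg.norm(text_embs, axis=1, keepdims=True)
    return float(np.linalg.norm(img.mean(axis=0) - txt.mean(axis=0)))

# Random stand-ins with ViT-L/14's 768-dim shared embedding size:
rng = np.random.default_rng(0)
gap = modality_gap(rng.normal(size=(100, 768)), rng.normal(size=(100, 768)))
```

A better-aligned model pushes this value toward 0; identical image and text embedding sets give exactly 0.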
-----
## Changes 11/NOV/2024:
- Added a new model saver: Saves either as GmP + full model object (default, legacy behavior)
- Optional conversion to .weight (converting back with extra script no longer needed)
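The two saver modes above roughly correspond to PyTorch's standard serialization options. A minimal sketch, assuming a generic `nn.Linear` as a stand-in for the fine-tuned CLIP model (filenames are made up, and the repo's actual GmP-to-`.weight` conversion logic is not reproduced here):

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 2)  # stand-in for the fine-tuned CLIP model

# Default / legacy behavior: pickle the entire model object.
# Loading this later requires the defining class to be importable.
torch.save(model, "clip-full-object.pt")

# Converted form: a plain state dict with standard `.weight` / `.bias`
# keys, loadable into any architecture-compatible model without an
# extra back-conversion script.
torch.save(model.state_dict(), "clip-state-dict.pt")

# Round-trip check: restore the weights into a fresh module.
restored = nn.Linear(4, 2)
restored.load_state_dict(torch.load("clip-state-dict.pt"))
```

The state-dict form is generally the more portable choice, which is why converting back with an extra script is no longer needed.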

0 commit comments
