An error occurred when merging LoRA weights with the initial weights #809
Comments
Can you provide your merge command? I think this may be caused by an error in your LoRA model.
python tools/llama/merge_lora.py --lora-config r_8_alpha_16 --base-weight /home/root123/workspace/lwl/fish-speech/tools/checkpoints/fish-speech-1.5/ --lora-weight /home/root123/workspace/lwl/fish-speech/results/checkpoints/step_000000600.ckpt --output /home/root123/workspace/lwl/fish-speech/tools/checkpoints/result/fish-speech-1.5-yth-lora/
This is my command.
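For reference, here is a minimal sketch of what merging LoRA weights into base weights means mathematically, using the common Linear-layer convention W_merged = W_base + (alpha / r) * B @ A. This is not the actual tools/llama/merge_lora.py implementation; the key suffixes ".lora_A" / ".lora_B" / ".weight" and the r=8, alpha=16 values (matching the "r_8_alpha_16" config name) are assumptions, and embedding layers may need transposed shapes in a real implementation.

```python
import torch

def merge_lora_state_dict(base_sd: dict, lora_sd: dict, r: int = 8, alpha: int = 16) -> dict:
    """Sketch: fold LoRA low-rank updates into the base state dict."""
    scale = alpha / r
    merged = dict(base_sd)
    for key, lora_a in lora_sd.items():          # lora_a: (r, in_features), assumed layout
        if not key.endswith(".lora_A"):
            continue
        prefix = key[: -len(".lora_A")]
        lora_b = lora_sd[prefix + ".lora_B"]     # (out_features, r), assumed layout
        base_key = prefix + ".weight"            # assumed base parameter name
        if base_key in merged:
            merged[base_key] = merged[base_key] + scale * (lora_b @ lora_a)
    return merged
```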
I just used the latest code to fine-tune, and no errors occurred during fine-tuning itself, but when the model was loaded at the beginning of training I got an error like "Loaded weights with error: _IncompatibleKeys(missing_keys=['embeddings.lora_A', 'embeddings.lora_B', 'codebook_embeddings.lora_A', 'codebook_embeddings.lora_B'])". The model still trained normally.
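A hedged sketch of how a message like "_IncompatibleKeys(missing_keys=...)" can arise: torch.nn.Module.load_state_dict(strict=False) does not raise on mismatched keys, it returns an _IncompatibleKeys result that some training code simply logs. The tiny module below is a stand-in for illustration, not the fish-speech model.

```python
import torch
import torch.nn as nn

class TinyLoraEmbedding(nn.Module):
    """Stand-in module with a base weight plus LoRA parameters."""
    def __init__(self, vocab: int = 16, dim: int = 4, r: int = 2):
        super().__init__()
        self.weight = nn.Parameter(torch.zeros(vocab, dim))
        self.lora_A = nn.Parameter(torch.zeros(r, dim))
        self.lora_B = nn.Parameter(torch.zeros(vocab, r))

model = TinyLoraEmbedding()
# Simulate a checkpoint that was saved without the LoRA tensors.
old_ckpt = {"weight": torch.zeros(16, 4)}

result = model.load_state_dict(old_ckpt, strict=False)
print(result.missing_keys)     # ['lora_A', 'lora_B'] -> analogous to the reported message
print(result.unexpected_keys)  # []
```

Missing LoRA keys at load time usually mean the checkpoint predates those layers or the LoRA config does not match; training can still proceed because the LoRA tensors fall back to their freshly initialized values.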
This issue is stale because it has been open for 30 days with no activity. |
Self Checks
Cloud or Self Hosted
Cloud
Environment Details
python=3.10
Steps to Reproduce
✔️ Expected Behavior
No response
❌ Actual Behavior
No response