lora_conversion_utils: replace lora up/down with a/b even if transformer. in key #12101
Conversation
Bisect shows the proj_out problem commit as bc34fa8, so I can open an issue for that if need be.
Can you show an example state dict? The changes you're introducing might be backwards-breaking.
> The changes you're introducing might be backwards-breaking.

I assumed this would be impossible because lora_down and lora_up aren't read by PEFT anywhere? The diffusers loader mixin has a check for lora_down.weight that is hardcoded to use the SD1/XL UNet converter, which for Flux models results in an empty rank dict and later an index error because there are no UNet blocks.
Reference: diffusers/src/diffusers/loaders/peft.py, lines 227 to 230 at 4b17fa2
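The routing described above is roughly this shape (a minimal sketch, not the actual diffusers source; the converter below is a stand-in):

```python
def _convert_sd_unet_lora(state_dict):
    # Stand-in for the hardcoded SD1/XL UNet conversion path. Run against a
    # Flux state dict, that path yields an empty rank dict and later an
    # IndexError, because there are no UNet blocks to match.
    raise NotImplementedError


def maybe_convert(state_dict):
    # Any "lora_down.weight" key triggers the Kohya/SD1-XL assumption.
    if any("lora_down.weight" in k for k in state_dict):
        return _convert_sd_unet_lora(state_dict)
    return state_dict
```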
> Can you show an example state dict?
https://huggingface.co/Beinsezii/peft_kohya_lora/blob/main/pytorch_lora_weights.safetensors
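For reference, one way to see the key naming in a file like this (a sketch using the safetensors API; the local filename is assumed to match the upload):

```python
from safetensors import safe_open

# Print a few keys to show the mixed PEFT-path / Kohya-suffix naming.
with safe_open("pytorch_lora_weights.safetensors", framework="pt") as f:
    for key in sorted(f.keys())[:8]:
        print(key)
```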
I understand now. Thanks!
HuggingFaceDocBuilderDev commented on Aug 8, 2025
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Nice, looks like a8e4797 fixed the proj_out too.
Yes, hopefully, we will not run into those nasty issues for a while :)
What does this PR do?
Saw some Flux.DEV LoRAs in our DB with keys that carry the transformer. prefix but use Kohya-style lora_down/lora_up names. So basically PEFT layers but Kohya adapter names. This might be a mistake on the trainer's part, but after poking around in the converter for a bit I figured out it can be an easy one-line fix, so that's what I've done here (see the sketch below). I don't have the civit.ai URLs at the moment, so I don't have a public link to weights.
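A minimal sketch of the idea (illustrative only: the exact keys from the DB aren't reproduced in this thread, so the example key below is an assumption based on the description above):

```python
# Hypothetical key shape: PEFT-style module path with a Kohya-style suffix,
# e.g. transformer.single_transformer_blocks.0.attn.to_q.lora_down.weight
# The one-line idea: rename Kohya's lora_down/lora_up to PEFT's lora_A/lora_B
# even when the key already starts with "transformer."
def rename_kohya_suffixes(state_dict):
    return {
        k.replace(".lora_down.", ".lora_A.").replace(".lora_up.", ".lora_B."): v
        for k, v in state_dict.items()
    }
```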
The proj_out layers still fail, but so does every other PEFT LoRA with proj_out layers against main currently, so I think that's an unrelated bug.
Who can review?
@sayakpaul