Update LoRA with Dropout & Conv2d Support #880

Merged: 18 commits into d8ahazard:dev on Jan 31, 2023

Conversation

ExponentialML
Collaborator

This PR updates LoRA with dropout and Conv2d support. More information on the changes is available here: cloneofsimo/lora#133.

It also adds custom per-layer scaling of ranks for higher quality, along with descriptions for the new settings.
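
For anyone curious how Conv2d support typically looks, here is a minimal sketch of a LoRA-injected convolution with dropout and a per-layer scale. The class name and signature are illustrative, not the implementation merged in this PR (see cloneofsimo/lora#133 for the real changes):

```python
import torch
import torch.nn as nn

class LoraInjectedConv2d(nn.Module):
    """Illustrative LoRA-style Conv2d: frozen base conv plus a low-rank update."""

    def __init__(self, in_channels, out_channels, kernel_size,
                 stride=1, padding=0, r=4, dropout_p=0.1, scale=1.0):
        super().__init__()
        # Frozen base convolution (weights come from the pretrained model).
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size,
                              stride=stride, padding=padding)
        # Low-rank pair: the down-projection carries the full kernel,
        # the up-projection is a 1x1 conv back to the output channels.
        self.lora_down = nn.Conv2d(in_channels, r, kernel_size,
                                   stride=stride, padding=padding, bias=False)
        self.lora_up = nn.Conv2d(r, out_channels, 1, bias=False)
        self.dropout = nn.Dropout(dropout_p)
        self.scale = scale  # per-layer scaling knob for the low-rank update
        nn.init.normal_(self.lora_down.weight, std=1 / r)
        nn.init.zeros_(self.lora_up.weight)  # update starts at zero

    def forward(self, x):
        return self.conv(x) + self.lora_up(self.dropout(self.lora_down(x))) * self.scale
```

Initializing `lora_up` to zero means the injected layer starts out identical to the frozen base convolution, so training moves away from the pretrained model gradually.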

@FurkanGozukara
Contributor

You're making awesome updates, but I really need configuration help to prepare a good tutorial video @ExponentialML :)

@ExponentialML
Collaborator Author

> You're making awesome updates, but I really need configuration help to prepare a good tutorial video @ExponentialML :)

Thanks! This PR is actually ready to go. Just use LoRA as you normally would, and test slightly lower learning rates (5e-6).
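
A minimal illustration of that suggestion, assuming a generic PyTorch optimizer setup (the extension's actual settings UI may expose this differently):

```python
import torch
import torch.nn as nn

# Stand-in for the trainable LoRA parameters; in practice these come from
# the LoRA-injected UNet and text encoder.
lora_module = nn.Linear(320, 320)

# The suggestion from this thread: start around 5e-6, slightly lower than
# a typical LoRA learning rate.
optimizer = torch.optim.AdamW(lora_module.parameters(), lr=5e-6)
```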

@saunderez
Collaborator

I don't seem to be able to generate checkpoints manually using a LoRA; it errors out with a tensor size mismatch. I receive this error (rank 16 for the UNet and text encoder, for reference).

RuntimeError: The size of tensor a (16) must match the size of tensor b (320) at non-singleton dimension 1

I'm trying to generate it using the model that created the LoRA, so the tensor sizes should match.

I have checkpoints set to generate at the end of the job only, and I've cancelled the last few jobs before reaching that point, so I can't say whether it works that way.

Full error below.

```
Exception compiling checkpoint: The size of tensor a (16) must match the size of tensor b (320) at non-singleton dimension 1
Traceback (most recent call last):
  File "E:\sd\extensions\sd_dreambooth_extension\dreambooth\diff_to_sd.py", line 403, in compile_checkpoint
    merge_loras_to_pipe(loaded_pipeline, lora_path, lora_alpha=config.lora_weight, lora_txt_alpha=config.lora_txt_weight)
  File "E:\sd\extensions\sd_dreambooth_extension\lora_diffusion\lora.py", line 1060, in merge_loras_to_pipe
    collapse_lora(pipline.unet, lora_alpha)
  File "E:\sd\extensions\sd_dreambooth_extension\lora_diffusion\lora.py", line 588, in collapse_lora
    _child_module.lora_up.weight.data
RuntimeError: The size of tensor a (16) must match the size of tensor b (320) at non-singleton dimension 1
```
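
For context on why this error can appear: merging ("collapsing") a LoRA folds the low-rank product back into the frozen weight, roughly W' = W + alpha * (up @ down). For a Conv2d layer the up factor has shape (out_channels, r, 1, 1), so if a lone factor is combined with the (out_channels, in_channels, kH, kW) base weight before the two factors are composed, dimension 1 compares r (16) against in_channels (320), which matches the RuntimeError above. A minimal sketch under those assumed shapes (not the extension's actual collapse_lora code):

```python
import torch

W = torch.zeros(320, 320, 3, 3)    # base conv weight: (out, in, kH, kW)
up = torch.zeros(320, 16, 1, 1)    # LoRA up factor, rank r = 16
down = torch.zeros(16, 320, 3, 3)  # LoRA down factor with the full kernel

# Combining a lone factor with the base weight reproduces the report:
# up + W  ->  RuntimeError: The size of tensor a (16) must match the size
#             of tensor b (320) at non-singleton dimension 1

# Composing the factors first yields a delta with the base weight's shape:
delta = (up.flatten(1) @ down.flatten(1)).reshape(W.shape)  # (320, 320, 3, 3)
W += 1.0 * delta  # alpha = 1.0
```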

@ExponentialML
Collaborator Author

Hey @saunderez. I recently caught this bug and will be pushing a fix fairly quickly.

d8ahazard merged commit b95e67c into d8ahazard:dev on Jan 31, 2023