I have fine-tuned the prior and UNet models for Kandinsky 2.2 inpainting with LoRA on about 15 images. The results are poor compared to SD 1.5 and 2.0.
At inference, when I load the prior model weights, I get a missing-keys error. I just wanted to know whether this is expected.
```python
import torch
from diffusers.models.attention_processor import LoRAAttnProcessor, LoRAAttnAddedKVProcessor

# Attach a rank-4 LoRA attention processor to every attention layer of the prior.
lora_attn_procs = {}
for name in pipe_prior.prior.attn_processors.keys():
    lora_attn_procs[name] = LoRAAttnProcessor(hidden_size=2048, rank=4).to('cuda')
pipe_prior.prior.set_attn_processor(lora_attn_procs)

# Load the fine-tuned prior checkpoint; strict=False reports the missing keys shown below.
pipe_prior.prior.load_state_dict(torch.load('./arbonne-model/kardinsky-prior/checkpoint-300/pytorch_model.bin'), strict=False)
IncompatibleKeys(missing_keys=['positional_embedding', 'prd_embedding', 'clip_mean', 'clip_std', 'time_embedding.linear_1.weight', 'time_embedding.linear_1.bias', 'time_embedding.linear_2.weight', 'time_embedding.linear_2.bias', 'proj_in.weight', 'proj_in.bias', 'embedding_proj.weight', 'embedding_proj.bias', 'encoder_hidden_states_proj.weight', 'encoder_hidden_states_proj.bias', 'transformer_blocks.0.norm1.weight', 'transformer_blocks.0.norm1.bias', 'transformer_blocks.0.attn1.to_q.weight', 'transformer_blocks.0.attn1.to_q.bias', 'transformer_blocks.0.attn1.to_k.weight', 'transformer_blocks.0.attn1.to_k.bias', 'transformer_blocks.0.attn1.to_v.weight', 'transformer_blocks.0.attn1.to_v.bias', 'transformer_blocks.0.attn1.to_out.0.weight', 'transformer_blocks.0.attn1.to_out.0.bias', 'transformer_blocks.0.norm3.weight', 'transformer_blocks.0.norm3.bias', 'transformer_blocks.0.ff.net.0.proj.weight', 'transformer_blocks.0.ff.net.0.proj.bias', 'transformer_blocks.0.ff.net.2.weight', 'transformer_blocks.0.ff.net.2.bias', 'transformer_blocks.1.norm1.weight', 'transformer_blocks.1.norm1.bias', 'transformer_blocks.1.attn1.to_q.weight', 'transformer_blocks.1.attn1.to_q.bias', 'transformer_blocks.1.attn1.to_k.weight', 'transformer_blocks.1.attn1.to_k.bias', 'transformer_blocks.1.attn1.to_v.weight', 'transformer_blocks.1.attn1.to_v.bias', 'transformer_blocks.1.attn1.to_out.0.weight', 'transformer_blocks.1.attn1.to_out.0.bias', 'transformer_blocks.1.norm3.weight', 'transformer_blocks.1.norm3.bias', 'transformer_blocks.1.ff.net.0.proj.weight', 'transformer_blocks.1.ff.net.0.proj.bias', 'transformer_blocks.1.ff.net.2.weight', 'transformer_blocks.1.ff.net.2.bias', 'transformer_blocks.2.norm1.weight', 'transformer_blocks.2.norm1.bias', 'transformer_blocks.2.attn1.to_q.weight', 'transformer_blocks.2.attn1.to_q.bias', 'transformer_blocks.2.attn1.to_k.weight', 'transformer_blocks.2.attn1.to_k.bias', 'transformer_blocks.2.attn1.to_v.weight', 'transformer_blocks.2.attn1.to_v.bias', 'transformer_blocks.2.attn1.to_out.0.weight', 'transformer_blocks.2.attn1.to_out.0.bias', 'transformer_blocks.2.norm3.weight', 'transformer_blocks.2.norm3.bias', 'transformer_blocks.2.ff.net.0.proj.weight', 'transformer_blocks.2.ff.net.0.proj.bias', 'transformer_blocks.2.ff.net.2.weight', 'transformer_blocks.2.ff.net.2.bias', 'transformer_blocks.3.norm1.weight', 'transformer_blocks.3.norm1.bias', 'transformer_blocks.3.attn1.to_q.weight', 'transformer_blocks.3.attn1.to_q.bias', 'transformer_blocks.3.attn1.to_k.weight', 'transformer_blocks.3.attn1.to_k.bias', 'transformer_blocks.3.attn1.to_v.weight', 'transformer_blocks.3.attn1.to_v.bias', 'transformer_blocks.3.attn1.to_out.0.weight', 'transformer_blocks.3.attn1.to_out.0.bias', 'transformer_blocks.3.norm3.weight', 'transformer_blocks.3.norm3.bias', 'transformer_blocks.3.ff.net.0.proj.weight', 'transformer_blocks.3.ff.net.0.proj.bias', 'transformer_blocks.3.ff.net.2.weight', 'transformer_blocks.3.ff.net.2.bias', 'transformer_blocks.4.norm1.weight', 'transformer_blocks.4.norm1.bias', 'transformer_blocks.4.attn1.to_q.weight', 'transformer_blocks.4.attn1.to_q.bias', 'transformer_blocks.4.attn1.to_k.weight', 'transformer_blocks.4.attn1.to_k.bias', 'transformer_blocks.4.attn1.to_v.weight', 'transformer_blocks.4.attn1.to_v.bias', 'transformer_blocks.4.attn1.to_out.0.weight', 'transformer_blocks.4.attn1.to_out.0.bias', 'transformer_blocks.4.norm3.weight', 'transformer_blocks.4.norm3.bias', 'transformer_blocks.4.ff.net.0.proj.weight', 'transformer_blocks.4.ff.net.0.proj.bias', 
'transformer_blocks.4.ff.net.2.weight', 'transformer_blocks.4.ff.net.2.bias', 'transformer_blocks.5.norm1.weight', 'transformer_blocks.5.norm1.bias', 'transformer_blocks.5.attn1.to_q.weight', 'transformer_blocks.5.attn1.to_q.bias', 'transformer_blocks.5.attn1.to_k.weight', 'transformer_blocks.5.attn1.to_k.bias', 'transformer_blocks.5.attn1.to_v.weight', 'transformer_blocks.5.attn1.to_v.bias', 'transformer_blocks.5.attn1.to_out.0.weight', 'transformer_blocks.5.attn1.to_out.0.bias', 'transformer_blocks.5.norm3.weight', 'transformer_blocks.5.norm3.bias', 'transformer_blocks.5.ff.net.0.proj.weight', 'transformer_blocks.5.ff.net.0.proj.bias', 'transformer_blocks.5.ff.net.2.weight', 'transformer_blocks.5.ff.net.2.bias', 'transformer_blocks.6.norm1.weight', 'transformer_blocks.6.norm1.bias', 'transformer_blocks.6.attn1.to_q.weight', 'transformer_blocks.6.attn1.to_q.bias', 'transformer_blocks.6.attn1.to_k.weight', 'transformer_blocks.6.attn1.to_k.bias', 'transformer_blocks.6.attn1.to_v.weight', 'transformer_blocks.6.attn1.to_v.bias', 'transformer_blocks.6.attn1.to_out.0.weight', 'transformer_blocks.6.attn1.to_out.0.bias', 'transformer_blocks.6.norm3.weight', 'transformer_blocks.6.norm3.bias', 'transformer_blocks.6.ff.net.0.proj.weight', 'transformer_blocks.6.ff.net.0.proj.bias', 'transformer_blocks.6.ff.net.2.weight', 'transformer_blocks.6.ff.net.2.bias', 'transformer_blocks.7.norm1.weight', 'transformer_blocks.7.norm1.bias', 'transformer_blocks.7.attn1.to_q.weight', 'transformer_blocks.7.attn1.to_q.bias', 'transformer_blocks.7.attn1.to_k.weight', 'transformer_blocks.7.attn1.to_k.bias', 'transformer_blocks.7.attn1.to_v.weight', 'transformer_blocks.7.attn1.to_v.bias', 'transformer_blocks.7.attn1.to_out.0.weight', 'transformer_blocks.7.attn1.to_out.0.bias', 'transformer_blocks.7.norm3.weight', 'transformer_blocks.7.norm3.bias', 'transformer_blocks.7.ff.net.0.proj.weight', 'transformer_blocks.7.ff.net.0.proj.bias', 'transformer_blocks.7.ff.net.2.weight', 'transformer_blocks.7.ff.net.2.bias', 'transformer_blocks.8.norm1.weight', 'transformer_blocks.8.norm1.bias', 'transformer_blocks.8.attn1.to_q.weight', 'transformer_blocks.8.attn1.to_q.bias', 'transformer_blocks.8.attn1.to_k.weight', 'transformer_blocks.8.attn1.to_k.bias', 'transformer_blocks.8.attn1.to_v.weight', 'transformer_blocks.8.attn1.to_v.bias', 'transformer_blocks.8.attn1.to_out.0.weight', 'transformer_blocks.8.attn1.to_out.0.bias', 'transformer_blocks.8.norm3.weight', 'transformer_blocks.8.norm3.bias', 'transformer_blocks.8.ff.net.0.proj.weight', 'transformer_blocks.8.ff.net.0.proj.bias', 'transformer_blocks.8.ff.net.2.weight', 'transformer_blocks.8.ff.net.2.bias', 'transformer_blocks.9.norm1.weight', 'transformer_blocks.9.norm1.bias', 'transformer_blocks.9.attn1.to_q.weight', 'transformer_blocks.9.attn1.to_q.bias', 'transformer_blocks.9.attn1.to_k.weight', 'transformer_blocks.9.attn1.to_k.bias', 'transformer_blocks.9.attn1.to_v.weight', 'transformer_blocks.9.attn1.to_v.bias', 'transformer_blocks.9.attn1.to_out.0.weight', 'transformer_blocks.9.attn1.to_out.0.bias', 'transformer_blocks.9.norm3.weight', 'transformer_blocks.9.norm3.bias', 'transformer_blocks.9.ff.net.0.proj.weight', 'transformer_blocks.9.ff.net.0.proj.bias', 'transformer_blocks.9.ff.net.2.weight', 'transformer_blocks.9.ff.net.2.bias', 'transformer_blocks.10.norm1.weight', 'transformer_blocks.10.norm1.bias', 'transformer_blocks.10.attn1.to_q.weight', 'transformer_blocks.10.attn1.to_q.bias', 'transformer_blocks.10.attn1.to_k.weight', 'transformer_blocks.10.attn1.to_k.bias', 
'transformer_blocks.10.attn1.to_v.weight', 'transformer_blocks.10.attn1.to_v.bias', 'transformer_blocks.10.attn1.to_out.0.weight', 'transformer_blocks.10.attn1.to_out.0.bias', 'transformer_blocks.10.norm3.weight', 'transformer_blocks.10.norm3.bias', 'transformer_blocks.10.ff.net.0.proj.weight', 'transformer_blocks.10.ff.net.0.proj.bias', 'transformer_blocks.10.ff.net.2.weight', 'transformer_blocks.10.ff.net.2.bias', 'transformer_blocks.11.norm1.weight', 'transformer_blocks.11.norm1.bias', 'transformer_blocks.11.attn1.to_q.weight', 'transformer_blocks.11.attn1.to_q.bias', 'transformer_blocks.11.attn1.to_k.weight', 'transformer_blocks.11.attn1.to_k.bias', 'transformer_blocks.11.attn1.to_v.weight', 'transformer_blocks.11.attn1.to_v.bias', 'transformer_blocks.11.attn1.to_out.0.weight', 'transformer_blocks.11.attn1.to_out.0.bias', 'transformer_blocks.11.norm3.weight', 'transformer_blocks.11.norm3.bias', 'transformer_blocks.11.ff.net.0.proj.weight', 'transformer_blocks.11.ff.net.0.proj.bias', 'transformer_blocks.11.ff.net.2.weight', 'transformer_blocks.11.ff.net.2.bias', 'transformer_blocks.12.norm1.weight', 'transformer_blocks.12.norm1.bias', 'transformer_blocks.12.attn1.to_q.weight', 'transformer_blocks.12.attn1.to_q.bias', 'transformer_blocks.12.attn1.to_k.weight', 'transformer_blocks.12.attn1.to_k.bias', 'transformer_blocks.12.attn1.to_v.weight', 'transformer_blocks.12.attn1.to_v.bias', 'transformer_blocks.12.attn1.to_out.0.weight', 'transformer_blocks.12.attn1.to_out.0.bias', 'transformer_blocks.12.norm3.weight', 'transformer_blocks.12.norm3.bias', 'transformer_blocks.12.ff.net.0.proj.weight', 'transformer_blocks.12.ff.net.0.proj.bias', 'transformer_blocks.12.ff.net.2.weight', 'transformer_blocks.12.ff.net.2.bias', 'transformer_blocks.13.norm1.weight', 'transformer_blocks.13.norm1.bias', 'transformer_blocks.13.attn1.to_q.weight', 'transformer_blocks.13.attn1.to_q.bias', 'transformer_blocks.13.attn1.to_k.weight', 'transformer_blocks.13.attn1.to_k.bias', 'transformer_blocks.13.attn1.to_v.weight', 'transformer_blocks.13.attn1.to_v.bias', 'transformer_blocks.13.attn1.to_out.0.weight', 'transformer_blocks.13.attn1.to_out.0.bias', 'transformer_blocks.13.norm3.weight', 'transformer_blocks.13.norm3.bias', 'transformer_blocks.13.ff.net.0.proj.weight', 'transformer_blocks.13.ff.net.0.proj.bias', 'transformer_blocks.13.ff.net.2.weight', 'transformer_blocks.13.ff.net.2.bias', 'transformer_blocks.14.norm1.weight', 'transformer_blocks.14.norm1.bias', 'transformer_blocks.14.attn1.to_q.weight', 'transformer_blocks.14.attn1.to_q.bias', 'transformer_blocks.14.attn1.to_k.weight', 'transformer_blocks.14.attn1.to_k.bias', 'transformer_blocks.14.attn1.to_v.weight', 'transformer_blocks.14.attn1.to_v.bias', 'transformer_blocks.14.attn1.to_out.0.weight', 'transformer_blocks.14.attn1.to_out.0.bias', 'transformer_blocks.14.norm3.weight', 'transformer_blocks.14.norm3.bias', 'transformer_blocks.14.ff.net.0.proj.weight', 'transformer_blocks.14.ff.net.0.proj.bias', 'transformer_blocks.14.ff.net.2.weight', 'transformer_blocks.14.ff.net.2.bias', 'transformer_blocks.15.norm1.weight', 'transformer_blocks.15.norm1.bias', 'transformer_blocks.15.attn1.to_q.weight', 'transformer_blocks.15.attn1.to_q.bias', 'transformer_blocks.15.attn1.to_k.weight', 'transformer_blocks.15.attn1.to_k.bias', 'transformer_blocks.15.attn1.to_v.weight', 'transformer_blocks.15.attn1.to_v.bias', 'transformer_blocks.15.attn1.to_out.0.weight', 'transformer_blocks.15.attn1.to_out.0.bias', 'transformer_blocks.15.norm3.weight', 
'transformer_blocks.15.norm3.bias', 'transformer_blocks.15.ff.net.0.proj.weight', 'transformer_blocks.15.ff.net.0.proj.bias', 'transformer_blocks.15.ff.net.2.weight', 'transformer_blocks.15.ff.net.2.bias', 'transformer_blocks.16.norm1.weight', 'transformer_blocks.16.norm1.bias', 'transformer_blocks.16.attn1.to_q.weight', 'transformer_blocks.16.attn1.to_q.bias', 'transformer_blocks.16.attn1.to_k.weight', 'transformer_blocks.16.attn1.to_k.bias', 'transformer_blocks.16.attn1.to_v.weight', 'transformer_blocks.16.attn1.to_v.bias', 'transformer_blocks.16.attn1.to_out.0.weight', 'transformer_blocks.16.attn1.to_out.0.bias', 'transformer_blocks.16.norm3.weight', 'transformer_blocks.16.norm3.bias', 'transformer_blocks.16.ff.net.0.proj.weight', 'transformer_blocks.16.ff.net.0.proj.bias', 'transformer_blocks.16.ff.net.2.weight', 'transformer_blocks.16.ff.net.2.bias', 'transformer_blocks.17.norm1.weight', 'transformer_blocks.17.norm1.bias', 'transformer_blocks.17.attn1.to_q.weight', 'transformer_blocks.17.attn1.to_q.bias', 'transformer_blocks.17.attn1.to_k.weight', 'transformer_blocks.17.attn1.to_k.bias', 'transformer_blocks.17.attn1.to_v.weight', 'transformer_blocks.17.attn1.to_v.bias', 'transformer_blocks.17.attn1.to_out.0.weight', 'transformer_blocks.17.attn1.to_out.0.bias', 'transformer_blocks.17.norm3.weight', 'transformer_blocks.17.norm3.bias', 'transformer_blocks.17.ff.net.0.proj.weight', 'transformer_blocks.17.ff.net.0.proj.bias', 'transformer_blocks.17.ff.net.2.weight', 'transformer_blocks.17.ff.net.2.bias', 'transformer_blocks.18.norm1.weight', 'transformer_blocks.18.norm1.bias', 'transformer_blocks.18.attn1.to_q.weight', 'transformer_blocks.18.attn1.to_q.bias', 'transformer_blocks.18.attn1.to_k.weight', 'transformer_blocks.18.attn1.to_k.bias', 'transformer_blocks.18.attn1.to_v.weight', 'transformer_blocks.18.attn1.to_v.bias', 'transformer_blocks.18.attn1.to_out.0.weight', 'transformer_blocks.18.attn1.to_out.0.bias', 'transformer_blocks.18.norm3.weight', 'transformer_blocks.18.norm3.bias', 'transformer_blocks.18.ff.net.0.proj.weight', 'transformer_blocks.18.ff.net.0.proj.bias', 'transformer_blocks.18.ff.net.2.weight', 'transformer_blocks.18.ff.net.2.bias', 'transformer_blocks.19.norm1.weight', 'transformer_blocks.19.norm1.bias', 'transformer_blocks.19.attn1.to_q.weight', 'transformer_blocks.19.attn1.to_q.bias', 'transformer_blocks.19.attn1.to_k.weight', 'transformer_blocks.19.attn1.to_k.bias', 'transformer_blocks.19.attn1.to_v.weight', 'transformer_blocks.19.attn1.to_v.bias', 'transformer_blocks.19.attn1.to_out.0.weight', 'transformer_blocks.19.attn1.to_out.0.bias', 'transformer_blocks.19.norm3.weight', 'transformer_blocks.19.norm3.bias', 'transformer_blocks.19.ff.net.0.proj.weight', 'transformer_blocks.19.ff.net.0.proj.bias', 'transformer_blocks.19.ff.net.2.weight', 'transformer_blocks.19.ff.net.2.bias', 'norm_out.weight', 'norm_out.bias', 'proj_to_clip_embeddings.weight', 'proj_to_clip_embeddings.bias'], unexpected_keys=[]
```
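For reference, a quick way to check whether this is just `strict=False` reporting base-model parameters that were never saved (i.e. the checkpoint only holds the LoRA processor weights) is to compare the checkpoint's keys against the prior's `state_dict()`. A minimal sketch, assuming `pipe_prior` is already loaded and using the checkpoint path from above:

```python
import torch

# Inspect the checkpoint: if all of its keys are LoRA processor weights,
# the long missing_keys list is just the base prior parameters that the
# checkpoint never contained, which strict=False tolerates.
ckpt = torch.load(
    './arbonne-model/kardinsky-prior/checkpoint-300/pytorch_model.bin',
    map_location='cpu',
)
model_keys = set(pipe_prior.prior.state_dict().keys())
ckpt_keys = set(ckpt.keys())

print('sample checkpoint keys:', sorted(ckpt_keys)[:5])
print('keys in checkpoint but missing from model:', ckpt_keys - model_keys)
print('base-model keys absent from checkpoint:', len(model_keys - ckpt_keys))
```

If every checkpoint key points at a `processor` (LoRA) weight, the reported `missing_keys` would be expected; if the checkpoint was supposed to contain the full prior, something went wrong at save time.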