How was 256x256_diffusion_uncond.pt trained? #156
Comments
Hello, did you solve this? I want to fine-tune on my own data, but the loaded weights are wrong.
Not yet. Do you have any ideas?
Assignment received.
MODEL_FLAGS="--image_size 256 --attention_resolutions 32,16,8 --num_channels 256 --num_head_channels 64 --num_res_blocks 2 --num_heads 4 --resblock_updown true --learn_sigma True --use_scale_shift_norm true --learn_sigma true --timestep_respacing 250 --use_fp16 false --use_kl false " DIFFUSION_FLAGS="--diffusion_steps 1000 --noise_schedule linear --rescale_learned_sigmas False" TRAIN_FLAGS="--lr 1e-4 --microbatch 4 --dropout 0.0" |
Why is the checkpoint I get from training with these flags only about 300 MB, while the pretrained 256x256_diffusion_uncond.pt is 2.21 GB? Did you run into this problem?
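One way to narrow down a size gap like this is to compare parameter counts of the two files directly. Below is a minimal check, assuming PyTorch is installed and both files are plain state dicts; the checkpoint paths are placeholders.

import torch

def checkpoint_params(path):
    # Load onto CPU and count every tensor element stored in the state dict.
    state = torch.load(path, map_location="cpu")
    return sum(t.numel() for t in state.values() if torch.is_tensor(t))

for path in ["my_trained_model.pt", "256x256_diffusion_uncond.pt"]:
    n = checkpoint_params(path)
    print(f"{path}: {n / 1e6:.1f}M params, ~{n * 4 / 1e9:.2f} GB in fp32")

If the counts differ, the model flags used for training do not reproduce the released architecture; if they match, the difference is more likely precision or extra contents of the file.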
How was 256x256_diffusion_uncond.pt trained? Is using the officially released weights the only option?