Prior loss weight question #265
-
Please help me understand how the prior loss weight works and how to control it. I also have a question: how does Keep N tokens work?
Replies: 2 comments
-
Not sure about prior loss weight, but Keep N tokens is a setting for when you have shuffle captions turned on. During training, additional attention is given to whatever is at the start of the prompt, so Keep N tokens makes it so that the specified number of tags are kept at the start of the text prompt and are not shuffled.
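Roughly, the interaction between shuffle captions and Keep N tokens described above can be sketched like this (the function name and tag handling are illustrative, not the trainer's actual code):

```python
import random

def shuffle_caption(caption: str, keep_n_tokens: int) -> str:
    """Shuffle comma-separated tags, but keep the first N tags fixed
    at the start of the prompt (what Keep N tokens does)."""
    tags = [t.strip() for t in caption.split(",")]
    head = tags[:keep_n_tokens]   # these stay in place
    tail = tags[keep_n_tokens:]   # only these get shuffled
    random.shuffle(tail)
    return ", ".join(head + tail)
```

So with `keep_n_tokens=2` and a caption like `"1girl, solo, red hair, smiling"`, the tags `1girl, solo` always lead the prompt and only the remaining tags change order between steps.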
-
Prior loss weight is a value from 0 to 1 (default 1) that only applies to regularization images. If you set it to 0.8, every reg image will only affect the model at 80% of the strength of your non-reg images.
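In other words, it's just a scaling factor on the loss for reg-image batches. A minimal sketch of that idea, assuming a kohya-style trainer that flags which samples are regularization images (the function name is made up for illustration):

```python
def apply_prior_loss_weight(loss: float, is_reg: bool,
                            prior_loss_weight: float = 1.0) -> float:
    """Scale the training loss for regularization images.
    Non-reg (training) images are unaffected."""
    if is_reg:
        return loss * prior_loss_weight
    return loss
```

With `prior_loss_weight=0.8`, a reg image's gradient contribution is 80% of what an identical training image would produce, which matches the 80%-strength description above.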