Relationship between classifier_free_dropout (0.2) and classifier_scale (1.0)? #15
Comments
classifier_scale is the scale of classifier-free guidance at inference time. classifier_free_dropout means the condition is dropped with probability 0.1 during training. You can refer to the Classifier-Free Guidance paper.
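To make the role of `classifier_scale` concrete, here is a minimal sketch of how the conditional and unconditional predictions are typically combined at sampling time. This assumes the common convention `eps = eps_uncond + s * (eps_cond - eps_uncond)`; some codebases use a different parameterization (e.g. `(1 + w) * eps_cond - w * eps_uncond`), so check the repository's sampling code. The function name and arguments here are illustrative, not taken from this repo.

```python
import numpy as np

def guided_eps(eps_cond, eps_uncond, classifier_scale):
    """Combine conditional and unconditional noise predictions
    with classifier-free guidance (one common convention).

    With classifier_scale = 1.0 this reduces to the purely
    conditional prediction; scales > 1.0 push the sample further
    toward the condition."""
    return eps_uncond + classifier_scale * (eps_cond - eps_uncond)
```

Note that under this convention a scale of 1.0 simply recovers the conditional model's output, which is one plausible reason it is used as a default.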
Hi @ZGCTroy,
Thank you.
Yes. Specifically, classifier-free guidance is only a condition-enhancement technique used at sampling time. However, it requires both a conditional model and an unconditional model during training. Instead of training two models, they propose to train a single conditional model by dropping out the condition with a fixed probability.
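The condition dropout described above can be sketched as follows. This is a minimal illustration, not this repository's implementation; `null_cond` stands in for however the codebase represents the "no condition" input (e.g. a learned null embedding or a zero vector).

```python
import random

def maybe_drop_condition(cond, null_cond, p_drop=0.2):
    """During training, replace the condition with a null condition
    with probability p_drop, so a single model learns both the
    conditional and the unconditional prediction."""
    return null_cond if random.random() < p_drop else cond
```

At sampling time the same model is then queried twice per step, once with `cond` and once with `null_cond`, and the two predictions are blended by the guidance scale.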
Hi @ZGCTroy,
Hi author,
I noticed that your classifier_free_dropout is 0.2, but what is the reason for setting classifier_scale to 1.0? Thanks.