
Question about the weight of DiffLoss #15

Open
Lirrr2829 opened this issue May 19, 2024 · 0 comments

Comments

@Lirrr2829
If I read the paper correctly, the DiffLoss should become larger in order to push the shared features away from the private features. So shouldn't beta_weight be a negative number rather than a positive one?
Many people have noticed that the DiffLoss drops to zero very quickly. In fact, both source_private_feature and target_private_feature collapse to zero, because with a positive beta_weight that is the easiest way to make the loss smaller. At that point the DSN model degenerates into plain DANN.
I tried beta_weight=-0.0001, and the accuracy can sometimes reach 77.5%, which is not a big improvement, but at least the private features no longer collapse to zero.

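For reference, below is a minimal sketch of how the DiffLoss is usually computed in DSN implementations (the squared Frobenius norm of the correlation between L2-normalized shared and private features). I'm assuming this repo follows that formulation from the DSN paper; the class and argument names here are only illustrative. It shows why the loss can reach zero either through orthogonality or, trivially, through the private features collapsing to zero, which is the degenerate case described above:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DiffLoss(nn.Module):
    """Difference loss from DSN: || S^T P ||_F^2 over L2-normalized
    shared (S) and private (P) feature matrices. It is zero when the
    two feature sets are orthogonal -- and also when the private
    features are all zero."""

    def forward(self, shared_feat, private_feat):
        # Flatten to (batch, dim)
        shared = shared_feat.view(shared_feat.size(0), -1)
        private = private_feat.view(private_feat.size(0), -1)

        # L2-normalize each feature vector; the eps inside F.normalize
        # avoids division by zero, so an all-zero private feature is
        # mapped to ~zero and the loss still vanishes.
        shared = F.normalize(shared, p=2, dim=1)
        private = F.normalize(private, p=2, dim=1)

        # Squared correlation between the two feature spaces,
        # averaged over the entries of S^T P.
        correlation = shared.t().mm(private)
        return correlation.pow(2).mean()
```

With total_loss = task_loss + beta_weight * DiffLoss(...) and a positive beta_weight, the optimizer only needs to drive this term toward zero, and collapsing the private features is the cheapest way to do that unless something else (e.g. the reconstruction loss) keeps them informative.
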