
loss_diff decreases to zero very fast #10

Open

rshaojimmy opened this issue Mar 8, 2021 · 2 comments

Comments

@rshaojimmy

Thanks for your nice code!

During my training, the difference loss (loss_diff) decreases to 0 within the first 100 steps and remains 0 afterwards.

May I ask whether you have also encountered this?

Thanks.

@FLAWLESSJade

Hi, have you solved it? I'm running into the same situation: loss_dif and loss_simse stay at 0 from epoch 0 through 99, which has me confused.

@chenxi52

chenxi52 commented Sep 12, 2021

I also encountered the same problem, and found that the private encoder learns nothing; its outputs are all zeros. I didn't use the original experimental datasets directly, but I don't think that matters much.
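
For anyone debugging this, here is a minimal sketch of a DSN-style difference (orthogonality) loss, assuming PyTorch; the class and variable names are illustrative and not necessarily the ones this repo uses. It shows why an all-zero private representation is a degenerate minimizer: at that point the loss and its gradient are both exactly zero, so loss_diff alone never pushes the private encoder away from it.

```python
import torch
import torch.nn as nn

class DiffLoss(nn.Module):
    """Mean squared entry of (H_shared^T @ H_private) after per-batch
    zero-centering and per-sample L2 normalization."""

    def forward(self, shared_feat, private_feat):
        b = shared_feat.size(0)
        hs = shared_feat.view(b, -1)
        hp = private_feat.view(b, -1)

        # Zero-center each feature dimension across the batch.
        hs = hs - hs.mean(dim=0, keepdim=True)
        hp = hp - hp.mean(dim=0, keepdim=True)

        # L2-normalize each sample. The epsilon avoids division by zero,
        # but if private_feat is exactly zero everywhere, hp stays zero,
        # and the loss below is 0 with zero gradient -- the collapse
        # described in this thread.
        hs = hs / (hs.norm(p=2, dim=1, keepdim=True) + 1e-6)
        hp = hp / (hp.norm(p=2, dim=1, keepdim=True) + 1e-6)

        return (hs.t() @ hp).pow(2).mean()


loss_fn = DiffLoss()
shared = torch.randn(8, 64)
print(loss_fn(shared, torch.zeros(8, 64)))  # tensor(0.) -- degenerate minimum
print(loss_fn(shared, torch.randn(8, 64)))  # small positive value
```

Since the reconstruction loss is what is supposed to rule out this degenerate solution, if loss_simse is also stuck at 0 from the start, the usual suspects are the relative weights of the loss terms or an initialization/activation bug that zeroes the private encoder's outputs. This is just one plausible reading of the symptoms reported above.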
