Hi, has anyone solved this? I'm running into the same situation... T_T
loss_dif and loss_simse stay at 0 from epoch 0 through 99, which leaves me quite confused :(
I also ran into the same problem and found that the private code learns nothing: it is all zeros. I didn't use the original experimental datasets directly, but I don't think that matters much.
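For what it's worth, the all-zeros observation above would explain the behaviour directly. Here is a minimal sketch of a DSN-style difference loss (simplified, without the per-row normalization some implementations apply; `diff_loss`, `shared`, and `private` are my own names, not necessarily the repository's): an all-zero private code minimizes it trivially.

```python
import torch

def diff_loss(shared: torch.Tensor, private: torch.Tensor) -> torch.Tensor:
    """Orthogonality (difference) loss: squared Frobenius norm of the
    cross-correlation between the shared and private codes."""
    correlation = shared.t() @ private        # shape: (d_shared, d_private)
    return (correlation ** 2).sum()

# If the private encoder collapses to all zeros, the loss is exactly 0
# even though the private branch has learned nothing.
shared = torch.randn(32, 100)
private = torch.zeros(32, 100)
print(diff_loss(shared, private))             # tensor(0.)
```

So loss_diff sitting at exactly 0 is more a symptom of the private branch collapsing than of the loss itself being broken; in the DSN objective the reconstruction term is what is supposed to give the private code an incentive to stay non-zero, so it is worth checking that it is actually contributing to training.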
Thanks for your nice code!
During my training, the difference loss decreases to 0 within the first 100 steps and remains 0 afterwards.
May I ask whether you have also encountered this case?
Thanks.
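In case it helps to compare numbers, here is a rough sketch of the scale-invariant MSE used as the reconstruction loss in the DSN paper (variable names are mine; the repository's implementation may differ in detail):

```python
import torch

def simse(pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Scale-invariant MSE: the mean squared error minus the squared mean
    of the error, so a constant offset between prediction and target is
    not penalized."""
    diff = target - pred
    n = diff.numel()
    return (diff ** 2).sum() / n - (diff.sum() ** 2) / (n ** 2)

x = torch.randn(8, 28 * 28)
print(simse(x + 3.0, x))   # ~0: a constant shift is ignored
```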