
When I am training the ArbRCAN model, the training loss is very large. Is this a normal situation? #24

Open
RayTan183 opened this issue Oct 24, 2023 · 2 comments

@RayTan183

When I am training the ArbRCAN model, the training loss is very large. Is this a normal situation? Sometimes the loss is so large that the batch is skipped. I also tried testing with intermediate weights such as 70.pth, but the results were even worse than the original pre-trained model.
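The batch-skipping behavior mentioned above matches the EDSR/RCAN-style trainer pattern, where a batch whose loss explodes relative to the previous epoch's average loss is discarded instead of applied. A minimal sketch of that rule, assuming a `skip_threshold` hyperparameter (the function name and signature here are illustrative, not the repository's exact code):

```python
# Sketch of an EDSR/RCAN-style batch-skip rule (illustrative):
# a batch is skipped when its loss is at least skip_threshold times
# the average loss recorded for the previous epoch (error_last).
def should_skip(batch_loss: float, error_last: float,
                skip_threshold: float = 1e8) -> bool:
    """Return True when the batch loss has exploded and the
    optimizer step for this batch should be skipped."""
    return batch_loss >= skip_threshold * error_last
```

With the default threshold, only truly exploded losses are skipped (e.g. `should_skip(1e12, 1.0)` is `True`, while `should_skip(0.5, 1.0)` is `False`), so frequent skips usually point at divergence (learning rate, data scaling, or a bad checkpoint) rather than normal training noise.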

@RayTan183
Author

(image attachment)

@hjdihs

hjdihs commented Jul 18, 2024

Is it a normal situation?
