RuntimeError: CUDA out of memory. Tried to allocate 144.00 MiB #2
Comments
Faced the same issue while training.

Can you try further decreasing the mini-batch size?
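If a smaller per-step batch hurts training quality, gradient accumulation can keep the effective batch size while holding only a small micro-batch in GPU memory at a time. Below is a minimal sketch of the idea for a standard PyTorch training loop; `model`, `optimizer`, `train_loader`, and `compute_loss` are hypothetical placeholders, not names from the GreaseLM code.

```python
# Sketch: gradient accumulation with a standard PyTorch loop.
# Assumes `model`, `optimizer`, `train_loader`, and `compute_loss`
# are defined by the actual training script (hypothetical here).
accumulation_steps = 4  # effective batch = micro-batch size * accumulation_steps

optimizer.zero_grad()
for step, batch in enumerate(train_loader):
    loss = compute_loss(model, batch) / accumulation_steps  # scale so gradients average correctly
    loss.backward()                                          # gradients add up across micro-batches
    if (step + 1) % accumulation_steps == 0:
        optimizer.step()
        optimizer.zero_grad()
```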
This method doesn't work for me.

Oh, exactly the same problem. Have you guys solved this?
When we try to run greaselm.py we hit this issue even with the minimum batch size of 8. We tried every batch size from 128 down to 8, and it throws the error (reporting a different amount of free memory each time) after some epochs. Can you help us solve this issue and run the code?
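Since the error only shows up after some epochs rather than on the first step, one common cause (independent of batch size) is accidentally keeping loss tensors and their computation graphs alive, for example by appending them to a Python list for logging. The sketch below illustrates the fix under that assumption; `model`, `optimizer`, `train_loader`, and `compute_loss` are again hypothetical placeholders rather than names from this repository.

```python
# Sketch: avoid GPU memory growth across epochs by logging plain floats,
# not graph-holding tensors. Placeholder names, not GreaseLM APIs.
import torch

epoch_losses = []
for batch in train_loader:
    optimizer.zero_grad()
    loss = compute_loss(model, batch)
    loss.backward()
    optimizer.step()
    epoch_losses.append(loss.item())  # .item() detaches the value, so the graph is freed

torch.cuda.empty_cache()  # optional: release cached, unused blocks between epochs
```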