Hi, I'm trying to train with sympointv1, but I'm seeing a difference in results between single-GPU and multi-GPU runs. On 8×A100 the loss drops effectively, but on a single GPU with batch_size set to 16, training does not converge the same way. How should I train with a single card?
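One likely cause of the gap is that under data-parallel training the *effective* batch size is the per-GPU batch times the number of GPUs, so a single-GPU run with batch_size 16 may use a different effective batch (and thus need a rescaled learning rate) than the 8-GPU setup. The sketch below illustrates the arithmetic; the numbers and the linear-scaling rule are assumptions for illustration, not values from the sympointv1 configs.

```python
# Hypothetical illustration: effective batch size and linear LR scaling.
# None of these numbers come from the sympointv1 repo; they only show the
# relationship to check when moving from 8 GPUs to 1.

def effective_batch_size(num_gpus: int, per_gpu_batch: int) -> int:
    """Total samples per optimizer step under data-parallel training."""
    return num_gpus * per_gpu_batch

def scale_lr(base_lr: float, base_batch: int, new_batch: int) -> float:
    """Linear scaling rule: learning rate proportional to effective batch."""
    return base_lr * new_batch / base_batch

# Example: if the 8-GPU config used per-GPU batch 16, the effective batch
# was 128; a single GPU with batch 16 then needs the LR scaled down 8x.
multi = effective_batch_size(8, 16)    # 128
single = effective_batch_size(1, 16)   # 16
lr_single = scale_lr(1e-3, multi, single)
```

If your per-GPU batch in the 8-GPU run was different, substitute the actual value; when the effective batch sizes already match, the learning rate should not need rescaling.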