Conformance test add validation with batch_size #2652
Conversation
Jobs 372 and 373 with a fixed host are running.
Based on the results, the median speed-up per model is ~40%.
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

@@            Coverage Diff             @@
##           develop    #2652       +/-   ##
============================================
- Coverage    46.80%   30.06%   -16.75%
============================================
  Files          493      493
  Lines        45445    45540      +95
============================================
- Hits         21272    13692    -7580
- Misses       24173    31848    +7675

See 373 files with indirect coverage changes.

Flags with carried forward coverage won't be shown.
tests/post_training/model_scope.py (Outdated)
"batch_size": 32, | ||
"validation_batch_size": 1, # Validation is slower with batch_size > 1 |
Why do we have separate batch sizes for calibration and validation? If some models should be validated with batch_size=1, they should be calibrated with the same batch size.
Makes sense.
Set batch_size=1 for Swin.
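For illustration only, a hypothetical sketch of how the resolved entries in tests/post_training/model_scope.py could look once calibration and validation share a single batch_size. Only the idea of a shared "batch_size" comes from the discussion above; the other keys and the model names are assumptions, not the file's actual schema.

# Hypothetical model scope entries; only the shared "batch_size" idea is
# taken from the review thread, the remaining keys and names are assumed.
TEST_MODELS = [
    {
        "reported_name": "timm/swin_base_patch4_window7_224",
        "batch_size": 1,   # Swin: calibrate and validate with the same small batch
    },
    {
        "reported_name": "timm/mobilenet_v3_small",
        "batch_size": 32,  # other models keep the larger shared batch size
    },
]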
@@ -122,13 +122,14 @@ def prepare_calibration_dataset(self):

     def _validate(self):
         val_dataset = datasets.ImageFolder(root=self.data_dir / "imagenet" / "val", transform=self.transform)
-        val_loader = torch.utils.data.DataLoader(val_dataset, batch_size=1, num_workers=2, shuffle=False)
+        val_loader = torch.utils.data.DataLoader(
+            val_dataset, batch_size=self.validation_batch_size, num_workers=2, shuffle=False
+        )
Are 2 workers enough for data loading with a batch size greater than 1?
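For reference, a minimal sketch of how a batched top-1 accuracy loop could be built around this DataLoader. The function name validate_batched and the explicit model/data_dir parameters are assumptions for illustration, not the pipeline's actual code.

# A minimal, hypothetical sketch of batched validation matching the
# DataLoader change above; model/device handling is an assumption.
import torch
import torchvision.datasets as datasets


def validate_batched(model, data_dir, transform, validation_batch_size, num_workers=2):
    """Compute top-1 accuracy on the ImageNet val split with an arbitrary batch size."""
    val_dataset = datasets.ImageFolder(root=data_dir / "imagenet" / "val", transform=transform)
    val_loader = torch.utils.data.DataLoader(
        val_dataset, batch_size=validation_batch_size, num_workers=num_workers, shuffle=False
    )
    model.eval()
    correct, total = 0, 0
    with torch.no_grad():
        for images, targets in val_loader:
            logits = model(images)              # shape: [batch, num_classes]
            predictions = logits.argmax(dim=1)  # top-1 class index per sample
            correct += (predictions == targets).sum().item()
            total += targets.numel()
    # Accuracy is independent of the batch size, so results should match batch_size=1.
    return correct / total

With larger batches the loader has to decode validation_batch_size images per step, so num_workers may need to grow accordingly; whether 2 workers keep the device fed is exactly the question raised above.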
Force-pushed from bfe4e2f to 03606b1.
Job with updates for Swin: 375.
Closing the PR: after obtaining correct validation performance results, it was observed that these changes have no significant impact on performance.
Changes
Add support for an arbitrary batch_size in the validation dataloader.
Reason for changes
Speed up validation
Related tickets
138556
Tests
Jobs 372, 373, and 375 (see the conversation above).