CodSpeed - performance benchmarks #471
base: master
Conversation
Codecov Report

Attention: Patch coverage is

Additional details and impacted files

```
@@           Coverage Diff            @@
##           master     #471    +/-   ##
==========================================
+ Coverage   63.85%   64.09%   +0.24%
==========================================
  Files          40       40
  Lines        3591     3724    +133
  Branches      774      790     +16
==========================================
+ Hits         2293     2387     +94
- Misses        772      800     +28
- Partials     526      537     +11
```

☔ View full report in Codecov by Sentry.
Force-pushed from 61f4aa8 to 5ec5337
Force-pushed from 9265999 to 99fa170
Force-pushed from f649e7f to d0e1548
Force-pushed from 9b797e7 to bc6e897
`workflow_dispatch` allows CodSpeed to trigger backtest performance analysis to generate initial data. See also: https://docs.codspeed.io/ci/github-actions#2-create-the-benchmarks-workflow
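For illustration, a `workflow_dispatch` run can also be triggered manually through GitHub's REST API. This is a sketch using `requests`; `OWNER`, `REPO`, and the `benchmarks.yml` filename are placeholders, not taken from this PR:

```python
# Hypothetical sketch: manually trigger a workflow_dispatch run via
# GitHub's REST API. OWNER, REPO, and benchmarks.yml are placeholders.
import os

import requests

resp = requests.post(
    "https://api.github.com/repos/OWNER/REPO/actions/workflows/benchmarks.yml/dispatches",
    headers={
        "Accept": "application/vnd.github+json",
        "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",
    },
    json={"ref": "master"},  # branch to run the workflow against
)
resp.raise_for_status()  # GitHub returns 204 No Content on success
```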
mypy flags the plugin import:

```
tests/sync/test_git.py:11: error: Skipping analyzing "pytest_codspeed.plugin": module is installed, but missing library stubs or py.typed marker  [import-untyped]
tests/sync/test_git.py:11: note: See https://mypy.readthedocs.io/en/stable/running_mypy.html#missing-imports
```
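One way to address this is an inline ignore at the import site, sketched below; the exact import is assumed, not taken from the PR, and an `ignore_missing_imports` override for `pytest_codspeed.*` in the mypy configuration is the equivalent config-level fix:

```python
# Hypothetical sketch: silence the missing-stubs error at the import site.
# Equivalent fix: set `ignore_missing_imports = true` for `pytest_codspeed.*`
# in the mypy configuration.
from pytest_codspeed.plugin import BenchmarkFixture  # type: ignore[import-untyped]
```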
# Problem

Git, Mercurial, and Subversion repositories are unnecessarily reinitialized for each test.

- We're not utilizing session-based scoping.
- A single initial repo could be created, then copied to [`tmp_path`](https://docs.pytest.org/en/8.3.x/how-to/tmp_path.html#the-tmp-path-fixture) using [`shutil.copytree`](https://docs.python.org/3/library/shutil.html#shutil.copytree) ([source](https://github.com/python/cpython/blob/v3.13.0/Lib/shutil.py#L550-L605)); a sketch of this pattern follows at the end of this comment.

Issue #471 highlighted this inefficiency, where benchmarks showed tens of thousands of redundant function calls.

# Improvement

```
❯ hyperfine -L branch master,pytest-plugin-fixture-caching 'git checkout {branch} && py.test'
Benchmark 1: git checkout master && py.test
  Time (mean ± σ):     32.062 s ±  0.869 s    [User: 41.391 s, System: 9.931 s]
  Range (min … max):   30.878 s … 33.583 s    10 runs

Benchmark 2: git checkout pytest-plugin-fixture-caching && py.test
  Time (mean ± σ):     14.659 s ±  0.495 s    [User: 16.351 s, System: 4.433 s]
  Range (min … max):   13.990 s … 15.423 s    10 runs

Summary
  git checkout pytest-plugin-fixture-caching && py.test ran
    2.19 ± 0.09 times faster than git checkout master && py.test
```

# Changes

## Pytest fixtures overhaul

1. Create a base VCS repo.
2. For subsequent tests, copy and modify from this template.
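Below is a minimal sketch of that session-scoped template-repo pattern. The fixture names (`base_git_repo`, `git_repo`) and the bare `git init` setup are illustrative, not this PR's actual API:

```python
# A minimal sketch of the fixture-caching approach: build one template
# repo per session, then copy it into each test's tmp_path.
import pathlib
import shutil
import subprocess

import pytest


@pytest.fixture(scope="session")
def base_git_repo(tmp_path_factory: pytest.TempPathFactory) -> pathlib.Path:
    """Initialize one template git repo for the whole test session."""
    repo = tmp_path_factory.mktemp("base") / "repo"
    subprocess.run(["git", "init", str(repo)], check=True)
    return repo


@pytest.fixture
def git_repo(base_git_repo: pathlib.Path, tmp_path: pathlib.Path) -> pathlib.Path:
    """Copy the session-wide template into this test's tmp_path."""
    dst = tmp_path / "repo"
    shutil.copytree(base_git_repo, dst)
    return dst
```

Copying a prebuilt directory tree avoids re-running `git init` (and any seeding commits) per test, which is where the redundant calls came from.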
Force-pushed from 7f69eab to c480bb4
@sourcery-ai review
Reviewer's Guide by Sourcery

This pull request introduces performance benchmarks using CodSpeed. It sets up the CodSpeed configuration and adds the benchmarks CI workflow.

Sequence diagram for benchmark execution flow:

```mermaid
sequenceDiagram
    participant Dev as Developer
    participant CI as CI Pipeline
    participant CS as CodSpeed
    Dev->>CI: Push code changes
    activate CI
    CI->>CI: Run tests with pytest
    CI->>CI: Execute benchmarks
    CI->>CS: Send benchmark results
    activate CS
    CS->>CS: Analyze performance
    CS-->>Dev: Report performance changes
    deactivate CS
    deactivate CI
```
Hey @tony - I've reviewed your changes - here's some feedback:
Overall Comments:
- Remember to add the `benchmark` annotation/fixture as noted in your TODO. This is needed for proper benchmark function identification.
Here's what I looked at during the review
- 🟢 General issues: all looks good
- 🟢 Security: all looks good
- 🟢 Testing: all looks good
- 🟢 Complexity: all looks good
- 🟢 Documentation: all looks good
Help me be more useful! Please click 👍 or 👎 on each comment and I'll use the feedback to improve your reviews.
Note

TODO: add the `benchmark` fixture (`BenchmarkFixture`) to the benchmark tests.
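For reference, a minimal sketch of a benchmark using pytest-codspeed's `benchmark` fixture; `clone_repo` is a hypothetical stand-in for the code under test, not a function from this repo:

```python
# A minimal pytest-codspeed benchmark sketch using the `benchmark` fixture.
from pytest_codspeed import BenchmarkFixture


def clone_repo() -> None:
    """Hypothetical placeholder for the operation being measured."""


def test_clone_performance(benchmark: BenchmarkFixture) -> None:
    # The fixture runs (and instruments) the callable under CodSpeed.
    benchmark(clone_repo)
```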
Problem
It's difficult to catch performance degradation or improvements over time or within a single PR.
Changes

- Add performance benchmarks (TBD)
- Set up CodSpeed: configure on the website, set the secret, etc.
- py(deps[test]): add `pytest-codspeed`

See also:
Summary by Sourcery

Add performance benchmarks using CodSpeed. Integrate CodSpeed into the CI workflow to automatically run performance tests and report results.

- CI: integrate CodSpeed into the CI workflow to run benchmarks and report results.
- Tests: add `pytest-codspeed` to enable performance testing.