
chore: Make the progress bar disappear at the end of their execution #1255

Merged (7 commits) on Jan 30, 2025

Conversation

@ictorv (Contributor) commented Jan 29, 2025

closes #1251

Make sure that progress bars are transient and disappear after completion.
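For context, "transient" here plausibly maps to rich's `Progress(transient=True)`, which redraws the bar in place and erases it when the context exits (skore's coverage report lists a `utils/_progress_bar.py` helper). The stdlib-only sketch below illustrates that disappearing behavior; it is a hypothetical illustration, not skore's actual code:

```python
import io
import time

def run_with_transient_bar(total: int, stream) -> None:
    """Draw a simple progress bar on `stream`, then erase it when done.

    Mimics a transient bar: the line is redrawn with carriage returns
    while work runs, and cleared on completion instead of leaving a
    finished bar behind.
    """
    for done in range(1, total + 1):
        time.sleep(0.001)  # stand-in for real work
        filled = int(20 * done / total)
        bar = "#" * filled + "-" * (20 - filled)
        stream.write(f"\r[{bar}] {done}/{total}")
    # "\r" returns to the line start; the ANSI sequence "\x1b[2K" erases
    # the whole line, so nothing from the bar remains after execution.
    stream.write("\r\x1b[2K")

buf = io.StringIO()
run_with_transient_bar(5, buf)
print(buf.getvalue().endswith("\r\x1b[2K"))  # True: the bar ends erased
```

With rich itself, the equivalent is passing `transient=True` when constructing the `Progress` object, so no manual escape codes are needed.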

@ictorv (Contributor, author) commented Jan 29, 2025

Removed the docstring.

@glemaitre (Member) commented:

@ictorv You do not need to reopen a separate PR at each commit; just push to your branch whenever we provide a review. Let me make a pass on the PR.

@ictorv (Contributor, author) commented Jan 29, 2025

> @ictorv You do not need to reopen a separate PR at each commit. just need to push into your branch whenever we provide a review. Let me make a pass on the PR.

Okay 👍 :)

@glemaitre (Member) commented:

Also, do not hesitate to write a proper description in the summary; I edited this one. Sometimes the people reviewing will wish to come back to the original issue, and the PR should have enough context ;)

@glemaitre glemaitre changed the title Fix progress bar docstring chore: Make the progress bar disappear at the end of their execution Jan 29, 2025
@glemaitre glemaitre self-requested a review January 29, 2025 18:06
Comment on lines 15 to 16

> during execution. The progress bar automatically disappears after completion.

(Member) The lint job is failing because you have trailing space here.

Suggested change (the two lines look identical; they differ only by the invisible trailing space, which the suggestion removes):

during execution. The progress bar automatically disappears after completion.
during execution. The progress bar automatically disappears after completion.

@ictorv (Contributor, author) commented Jan 29, 2025

> As well, do not hesitate to write a proper description in the summary. I did edit this one. Sometimes you will have people reviewing that would wish to come back to the original issue and the PR should have enough context ;)

Okay, from now on I will write complete descriptions :)

github-actions bot commented Jan 29, 2025

Documentation preview @ cc44134

@glemaitre (Member) commented:

The following tests are still failing.

FAILED src/skore/sklearn/_cross_validation/metrics_accessor.py::skore.sklearn._cross_validation.metrics_accessor._MetricsAccessor.r2
FAILED src/skore/sklearn/_cross_validation/metrics_accessor.py::skore.sklearn._cross_validation.metrics_accessor._MetricsAccessor.roc_auc
FAILED src/skore/sklearn/_cross_validation/metrics_accessor.py::skore.sklearn._cross_validation.metrics_accessor._MetricsAccessor.brier_score
FAILED src/skore/sklearn/_estimator/report.py::skore.sklearn._estimator.report.EstimatorReport.cache_predictions
FAILED src/skore/sklearn/_cross_validation/metrics_accessor.py::skore.sklearn._cross_validation.metrics_accessor._MetricsAccessor.custom_metric
FAILED src/skore/sklearn/_cross_validation/report.py::skore.sklearn._cross_validation.report.CrossValidationReport.clear_cache
FAILED src/skore/sklearn/_cross_validation/metrics_accessor.py::skore.sklearn._cross_validation.metrics_accessor._MetricsAccessor.report_metrics
FAILED src/skore/sklearn/_cross_validation/metrics_accessor.py::skore.sklearn._cross_validation.metrics_accessor._MetricsAccessor.precision
FAILED src/skore/sklearn/_cross_validation/metrics_accessor.py::skore.sklearn._cross_validation.metrics_accessor._PlotMetricsAccessor.prediction_error
FAILED src/skore/sklearn/_cross_validation/metrics_accessor.py::skore.sklearn._cross_validation.metrics_accessor._MetricsAccessor.log_loss
FAILED src/skore/sklearn/_cross_validation/metrics_accessor.py::skore.sklearn._cross_validation.metrics_accessor._PlotMetricsAccessor.roc
FAILED src/skore/sklearn/_cross_validation/metrics_accessor.py::skore.sklearn._cross_validation.metrics_accessor._PlotMetricsAccessor.precision_recall
FAILED src/skore/sklearn/_cross_validation/metrics_accessor.py::skore.sklearn._cross_validation.metrics_accessor._MetricsAccessor.recall
FAILED src/skore/sklearn/_cross_validation/metrics_accessor.py::skore.sklearn._cross_validation.metrics_accessor._MetricsAccessor.rmse
FAILED src/skore/sklearn/_cross_validation/metrics_accessor.py::skore.sklearn._cross_validation.metrics_accessor._MetricsAccessor.accuracy
FAILED src/skore/sklearn/_cross_validation/report.py::skore.sklearn._cross_validation.report.CrossValidationReport.cache_predictions
FAILED src/skore/sklearn/_estimator/report.py::skore.sklearn._estimator.report.EstimatorReport.clear_cache

It means that you need to go to the associated files and remove the tests associated with a progress bar, as you did for _estimator/report.py and _estimator/metrics_accessor.py. Now they are mainly located in the _cross_validation/ module.
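For context on why these appear as doctest failures: once the bar is transient, no progress-bar line survives the call, so any docstring example that still expects bar output no longer matches. The sketch below uses a hypothetical function (illustrative names, not skore's real API) whose doctest passes once the stale expectation is removed:

```python
import doctest

def accuracy():
    """Return a dummy accuracy score.

    With a transient progress bar the call leaves no bar output behind,
    so the doctest only needs to expect the return value:

    >>> accuracy()
    0.95
    """
    return 0.95

# A stale doctest that still expected a bar line such as
# "Processing |████| 100%" before the value would fail, which is
# the kind of mismatch the FAILED entries above report.
result = doctest.testmod()
print(result.failed)  # 0 once the expectations match the real output
```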

@ictorv (Contributor, author) commented Jan 29, 2025

> The following tests are still failing. […same FAILED list as above…]
>
> It means that you need to go to the associated file and remove the test that is also associated with a progress bar […]

Okay, fixing these things. Can you please tell me how to run these checks (e.g. the lint and the other test cases) locally? 😊 So that I can verify properly before committing.

@glemaitre
Copy link
Member

This was the remark of @thomass-dev regarding pre-commit. You need to install it; once installed, it will modify the files for those linters, if necessary, at commit time.

If you really want to run the checks without triggering a commit, I think the command would be something like:

pre-commit install && pre-commit run -v --all-files --show-diff-on-failure

pre-commit install is necessary only the first time.

@ictorv (Contributor, author) left a comment:

Kindly go through these changes; I fixed all the doctests related to the disappearance of the progress bar.

@glemaitre glemaitre self-requested a review January 30, 2025 07:25
@glemaitre (Member) commented:

I quickly pushed a commit to fix the 2 remaining docstrings that were missing. Now I'm waiting for the CI. @ictorv Be aware that you can run the tests locally. Basically, from the root directory, the following command should do the job:

pytest -v skore


Coverage

Coverage Report for backend
File | Stmts | Miss | Cover | Missing
venv/lib/python3.12/site-packages/skore
   __init__.py | 14 | 0 | 100% |
   __main__.py | 8 | 8 | 0% | 3–19
   exceptions.py | 3 | 3 | 0% | 4–19
venv/lib/python3.12/site-packages/skore/cli
   __init__.py | 5 | 5 | 0% | 3–8
   cli.py | 16 | 16 | 0% | 3–59
   color_format.py | 49 | 49 | 0% | 3–116
venv/lib/python3.12/site-packages/skore/persistence
   __init__.py | 0 | 0 | 100% |
venv/lib/python3.12/site-packages/skore/persistence/item
   __init__.py | 59 | 4 | 91% | 89, 100–103
   altair_chart_item.py | 24 | 1 | 93% | 15
   cross_validation_reporter_item.py | 109 | 12 | 89% | 28–41, 125–126, 259, 262
   item.py | 24 | 1 | 96% | 86
   matplotlib_figure_item.py | 33 | 1 | 95% | 18
   media_item.py | 24 | 0 | 100% |
   numpy_array_item.py | 29 | 1 | 94% | 16
   pandas_dataframe_item.py | 31 | 1 | 94% | 14
   pandas_series_item.py | 31 | 1 | 94% | 14
   pickle_item.py | 33 | 0 | 100% |
   pillow_image_item.py | 32 | 1 | 94% | 16
   plotly_figure_item.py | 25 | 1 | 93% | 15
   polars_dataframe_item.py | 29 | 1 | 94% | 14
   polars_series_item.py | 24 | 1 | 93% | 14
   primitive_item.py | 25 | 2 | 91% | 13–15
   sklearn_base_estimator_item.py | 31 | 1 | 94% | 15
   skrub_table_report_item.py | 10 | 1 | 86% | 11
venv/lib/python3.12/site-packages/skore/persistence/repository
   __init__.py | 3 | 0 | 100% |
   item_repository.py | 59 | 5 | 91% | 15–16, 202–203, 226
   view_repository.py | 19 | 2 | 86% | 9–10
venv/lib/python3.12/site-packages/skore/persistence/storage
   __init__.py | 4 | 0 | 100% |
   abstract_storage.py | 22 | 0 | 100% |
   disk_cache_storage.py | 33 | 1 | 95% | 44
   in_memory_storage.py | 20 | 0 | 100% |
venv/lib/python3.12/site-packages/skore/persistence/view
   __init__.py | 2 | 0 | 100% |
   view.py | 5 | 0 | 100% |
venv/lib/python3.12/site-packages/skore/project
   __init__.py | 2 | 0 | 100% |
   project.py | 62 | 1 | 99% | 237
venv/lib/python3.12/site-packages/skore/sklearn
   __init__.py | 5 | 0 | 100% |
   _base.py | 140 | 4 | 97% | 91, 94, 168–>173, 183–184
   find_ml_task.py | 45 | 1 | 96% | 71–>87, 86
   types.py | 2 | 0 | 100% |
venv/lib/python3.12/site-packages/skore/sklearn/_cross_validation
   __init__.py | 6 | 0 | 100% |
   metrics_accessor.py | 163 | 0 | 100% |
   report.py | 87 | 0 | 100% |
venv/lib/python3.12/site-packages/skore/sklearn/_estimator
   __init__.py | 6 | 0 | 100% |
   metrics_accessor.py | 265 | 4 | 97% | 149–158, 183–>236, 191, 434–>437, 783–>786
   report.py | 120 | 0 | 99% | 213–>219, 221–>223
   utils.py | 11 | 11 | 0% | 1–19
venv/lib/python3.12/site-packages/skore/sklearn/_plot
   __init__.py | 4 | 0 | 100% |
   precision_recall_curve.py | 121 | 1 | 98% | 229–>246, 317
   prediction_error.py | 97 | 0 | 100% |
   roc_curve.py | 128 | 0 | 100% |
   utils.py | 88 | 0 | 100% |
venv/lib/python3.12/site-packages/skore/sklearn/cross_validation
   __init__.py | 2 | 0 | 100% |
   cross_validation_helpers.py | 47 | 4 | 90% | 104–>136, 123–126
   cross_validation_reporter.py | 35 | 1 | 95% | 187
venv/lib/python3.12/site-packages/skore/sklearn/cross_validation/plots
   __init__.py | 0 | 0 | 100% |
   compare_scores_plot.py | 29 | 24 | 16% | 10, 28–116
   timing_plot.py | 29 | 24 | 17% | 10, 26–123
venv/lib/python3.12/site-packages/skore/sklearn/train_test_split
   __init__.py | 0 | 0 | 100% |
   train_test_split.py | 36 | 2 | 94% | 16–17
venv/lib/python3.12/site-packages/skore/sklearn/train_test_split/warning
   __init__.py | 8 | 0 | 100% |
   high_class_imbalance_too_few_examples_warning.py | 17 | 3 | 78% | 16–18, 80
   high_class_imbalance_warning.py | 18 | 2 | 88% | 16–18
   random_state_unset_warning.py | 11 | 1 | 87% | 15
   shuffle_true_warning.py | 9 | 0 | 91% | 44–>exit
   stratify_is_set_warning.py | 11 | 1 | 87% | 15
   time_based_column_warning.py | 22 | 1 | 89% | 17, 69–>exit
   train_test_split_warning.py | 5 | 1 | 80% | 21
venv/lib/python3.12/site-packages/skore/ui
   __init__.py | 0 | 0 | 100% |
   app.py | 30 | 7 | 71% | 48–53, 65–67
   dependencies.py | 4 | 0 | 100% |
   launch.py | 26 | 15 | 39% | 36–57
   project_routes.py | 62 | 0 | 100% |
venv/lib/python3.12/site-packages/skore/utils
   __init__.py | 6 | 0 | 100% |
   _accessor.py | 7 | 0 | 100% |
   _logger.py | 21 | 4 | 84% | 14–18
   _patch.py | 11 | 5 | 46% | 19–35
   _progress_bar.py | 30 | 0 | 100% |
   _show_versions.py | 31 | 0 | 100% |
TOTAL | 2694 | 236 | 91% |

Tests: 587 | Skipped: 3 💤 | Failures: 0 ❌ | Errors: 0 🔥 | Time: 4m 33s ⏱️

@glemaitre (Member) left a comment:

LGTM on my side now.

@glemaitre (Member) commented:

@ictorv I would also recommend that you set up a GPG key to sign your commits: https://docs.github.com/en/authentication/managing-commit-signature-verification/signing-commits
It is part of good practice.
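For reference, the setup the linked guide describes boils down to two git configuration commands; the key id below is a placeholder (list your own with gpg --list-secret-keys --keyid-format=long), not anything specific to this repository:

```shell
# Tell git which GPG key to sign with (placeholder key id)
git config --global user.signingkey 3AA5C34371567BD2
# Sign every commit by default
git config --global commit.gpgsign true
# Individual commits can also be signed explicitly:
#   git commit -S -m "chore: make progress bars transient"
```

With commit.gpgsign set, plain git commit produces signed commits that GitHub shows as "Verified" once the public key is added to the account.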

@glemaitre glemaitre merged commit 03f914c into probabl-ai:main Jan 30, 2025
18 checks passed
@ictorv (Contributor, author) commented Jan 30, 2025

> I quickly pushed a commit to fix the 2 remaining docstrings that were missing. Now I expect the CI. @ictorv Be aware that you can run the tests locally. Basically, from the root directory, the following command should do the job:
>
> pytest -v skore

Thanks for fixing :)

Yeah, I tried it, but I always get this error instead of what I was expecting related to the doctests.
(screenshot of the error)

I want to explore this project, but I can't get the proper error output.

@ictorv (Contributor, author) commented Jan 30, 2025

> @ictorv also I would recommend that you setup a GPG to sign your commit: https://docs.github.com/en/authentication/managing-commit-signature-verification/signing-commits It is part of good practice.

Yes, I am setting it up. Thank you.

@glemaitre (Member) commented:

> Yeah I tried it but always gets this error

Uhm, it looks like the package is not installed properly. Our contribution guide is not yet super friendly in this regard: it uses a Makefile, and I saw you have PowerShell, meaning that you are on Windows, so the Makefile will not help. At least pip install -e . should install the dependencies and the package in editable mode.

@ictorv (Contributor, author) commented Jan 30, 2025

> Uhm, it looks like the package is not installed properly. […] At least a pip install -e . should install the dependencies and the package in editable mode.

Okay, I will try testing this project with the specified OS :)

Successfully merging this pull request may close these issues:
enh: hide progress bar once the execution is done

2 participants