
WIP Initial commit to get some things logged to mlflow. #170

Merged
drewoldag merged 3 commits into main from issue/169/add-mlflow-support on Jan 22, 2025

Conversation

@drewoldag (Collaborator) commented Jan 18, 2025

If testing locally, be sure to `pip install mlflow`, and optionally `pynvml` if you want to collect GPU metrics.
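For reference, GPU utilization could be polled with pynvml and logged as MLflow metrics along the lines of the sketch below. This is purely an illustration of the two libraries together, not necessarily how this PR wires up GPU metric collection.

```python
# Illustrative sketch only -- not necessarily how this PR collects GPU metrics.
import mlflow
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

with mlflow.start_run():
    for step in range(10):  # e.g. poll once per training step
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        mlflow.log_metric("gpu_utilization_pct", util.gpu, step=step)
        mlflow.log_metric("gpu_memory_used_bytes", mem.used, step=step)

pynvml.nvmlShutdown()
```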

You should be able to log model parameters and metrics immediately without any initial setup. To view the results, you'll need to start the MLflow server from the command line:
`mlflow ui --backend-store-uri file://<.../results/mlflow>`

Pass in the full path to the directory /<where your results are>/results/mlflow with a leading /.

Then go to `http://127.0.0.1:5000` and you should be presented with the MLflow UI.
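As a quick sanity check, something like the following writes a run into the same file-backed store that the `mlflow ui` command above points at. The tracking path, run name, and logged values here are placeholders, not anything from the PR itself:

```python
# Hypothetical smoke-test run against the file-backed MLflow store.
import mlflow

# Assumed path -- substitute your own /<where your results are>/results/mlflow.
mlflow.set_tracking_uri("file:///path/to/results/mlflow")
mlflow.set_experiment("notebook")  # the default experiment name used here

with mlflow.start_run(run_name="smoke-test"):
    mlflow.log_param("model", "example_cnn")
    mlflow.log_metric("train_loss", 0.42, step=1)
```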

The default experiment name is `notebook`. You can change the experiment name in your config using:

[train]
experiment_name = "fav_experiment_name"
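Under the hood, a config value like this would presumably just be handed to `mlflow.set_experiment`. A rough sketch follows; the config file name and loading code are assumptions for illustration, not fibad's actual API:

```python
# Rough sketch of mapping the [train] config section to an MLflow experiment.
# The config file name and loading code are hypothetical.
import mlflow
import tomllib  # Python 3.11+; use the tomli package on older versions

with open("fibad_config.toml", "rb") as f:
    config = tomllib.load(f)

experiment_name = config.get("train", {}).get("experiment_name", "notebook")
mlflow.set_experiment(experiment_name)
```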

In this example, I used port 8080 by starting the MLflow server like so: `mlflow ui --backend-store-uri ... --port 8080`
(Screenshot: mlflow_example)

Example showing the parameter logging: by default, it will log all the keys under `[model]` plus the criterion- and optimizer-specific parameters from the configuration used for the run. Other parameters can be added. The screenshot below shows only the model parameters.
(Screenshot: mlflow_example2)
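For context, flattening the `[model]`, criterion, and optimizer sections of a run config into MLflow params could look roughly like this. The config dictionary shape and key names are assumptions for illustration, not the exact code in `src/fibad/train.py`:

```python
# Sketch: log the [model] section plus criterion/optimizer params for a run.
# The config dictionary shape here is an assumption for illustration.
import mlflow

config = {
    "model": {"name": "example_cnn", "layers": 4},
    "criterion": {"name": "CrossEntropyLoss"},
    "optimizer": {"name": "SGD", "lr": 0.01, "momentum": 0.9},
}

with mlflow.start_run():
    # Prefix keys by section so e.g. model.name and optimizer.name don't collide.
    for section in ("model", "criterion", "optimizer"):
        params = {f"{section}.{key}": value for key, value in config[section].items()}
        mlflow.log_params(params)
```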

@drewoldag drewoldag self-assigned this Jan 18, 2025
@drewoldag drewoldag linked an issue Jan 18, 2025 that may be closed by this pull request

codecov bot commented Jan 18, 2025

Codecov Report

Attention: Patch coverage is 0% with 24 lines in your changes missing coverage. Please review.

Project coverage is 39.51%. Comparing base (d745694) to head (cb99663).
Report is 1 commit behind head on main.

| Files with missing lines | Patch % | Lines |
| --- | --- | --- |
| src/fibad/train.py | 0.00% | 21 Missing ⚠️ |
| src/fibad/pytorch_ignite.py | 0.00% | 3 Missing ⚠️ |
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #170      +/-   ##
==========================================
- Coverage   39.98%   39.51%   -0.48%     
==========================================
  Files          23       23              
  Lines        1908     1931      +23     
==========================================
  Hits          763      763              
- Misses       1145     1168      +23     

☔ View full report in Codecov by Sentry.


github-actions bot commented Jan 18, 2025

| Before [d745694] | After [0729658] | Ratio | Benchmark (Parameter) |
| --- | --- | --- | --- |
| 1.88±1s | 2.54±1s | ~1.35 | benchmarks.time_computation |
| 4.02k | 2.44k | 0.61 | benchmarks.mem_list |


@drewoldag drewoldag marked this pull request as ready for review January 19, 2025 03:52
@mtauraso (Collaborator) left a comment


LGTM

@drewoldag drewoldag merged commit 86ff5fb into main Jan 22, 2025
6 of 8 checks passed
@drewoldag drewoldag deleted the issue/169/add-mlflow-support branch January 22, 2025 03:32
Development

Successfully merging this pull request may close these issues.

Limitations with Model Comparison Capabilities