
[FEATURE REQUEST] Abstract reusable code in run_*_benchmark.py scripts #102

Closed
yantosca opened this issue Dec 1, 2020 · 2 comments
Labels
  • category: Feature Request (New feature or request)
  • topic: Benchmark Plots and Tables (Issues pertaining to generating plots/tables from benchmark output)

Comments

@yantosca
Contributor

yantosca commented Dec 1, 2020

In the benchmarking driver scripts, i.e.

  • benchmark/run_1mo_benchmark.py
  • benchmark/run_1yr_tt_benchmark.py
  • benchmark/run_1yr_fullchem_benchmark.py

there is an opportunity to abstract code that is repeated with different arguments. The repeated blocks could be factored into shared functions that each script then calls further down with its own arguments.
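
For example, a minimal sketch of the idea (the function name and arguments here are hypothetical, not taken from the actual scripts): the block that each run_*_benchmark.py repeats with different arguments becomes one shared function, and each driver script supplies only its own arguments.

```python
# Hypothetical helper; the name and argument list are illustrative only.
def make_benchmark_plots(ref, refstr, dev, devstr, dst, overwrite=False):
    """Create the standard set of Ref-vs-Dev benchmark plots and tables."""
    # The plotting/table calls repeated in each driver script would go here.
    print(f"Comparing {refstr} vs. {devstr}; writing output to {dst}")


# Each run_*_benchmark.py script then reduces to a call with its own arguments:
make_benchmark_plots(
    ref="/path/to/Ref/OutputDir",
    refstr="Ref",
    dev="/path/to/Dev/OutputDir",
    devstr="Dev",
    dst="./BenchmarkResults",
    overwrite=True,
)
```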

yantosca added the category: Feature Request label on Dec 1, 2020
yantosca added the never stale label on Jan 13, 2021
msulprizio removed the never stale label on Mar 3, 2021
@yantosca
Contributor Author

The modifications made in #66 will facilitate abstracting repeated code from the benchmark scripts. Perhaps these functions could be stored in a new benchmark.py module, and the benchmark scripts could then call them with the appropriate arguments.
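
A sketch of that layout, assuming hypothetical module and function names (this is not the actual GCPy API):

```python
# Hypothetical layout; module path and helper name are illustrative only.
#
# gcpy/benchmark.py               <- shared helper functions live here
#     def make_benchmark_plots(ref, refstr, dev, devstr, dst, ...): ...
#
# benchmark/run_1mo_benchmark.py  <- each driver imports and calls them
#     from gcpy.benchmark import make_benchmark_plots
#     make_benchmark_plots(ref=..., refstr=..., dev=..., devstr=..., dst="./1mo_benchmark")
```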

yantosca added the topic: Benchmark Plots and Tables label on Oct 6, 2022
@yantosca
Contributor Author

yantosca commented Oct 6, 2022

I will close out this issue, as it is now discussed in #141.

yantosca closed this as completed on Oct 6, 2022