Make using JAX (or any accelerator) an Instanced Python class (and toggle) #509

Open · wants to merge 14 commits into base: master
2 changes: 1 addition & 1 deletion .github/workflows/CI.yaml
@@ -62,7 +62,7 @@ jobs:
- name: Run tests (pytest)
shell: bash -l {0}
run: |
- pytest -v --cov=$PACKAGE --cov-report=xml --color=yes --doctest-modules $PACKAGE/
+ pytest -v --cov=$PACKAGE --cov-report=xml --color=yes --doctest-modules --doctest-ignore-import-errors $PACKAGE/

- name: Run examples
shell: bash -l {0}
7 changes: 7 additions & 0 deletions README.md
@@ -91,6 +91,13 @@
PyMBAR needs 64-bit floats to provide reliable answers. JAX by default uses
32-bit floats. PyMBAR will turn on JAX's 64-bit mode, which may cause issues with separate uses of JAX in the same code as PyMBAR,
such as existing Neural Network (NN) models for machine learning.

If you would like to use JAX in 32-bit mode and PyMBAR in the same script, instantiate your MBAR with the `accelerator="numpy"`
option, e.g.
```python
mbar = MBAR(..., accelerator="numpy")
```
replacing `...` with your other options.
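To illustrate why the 64-bit requirement mentioned above matters, here is a small plain-NumPy sketch (not pymbar code) of the precision loss that 32-bit floats introduce into log-sum-exp style accumulations of reduced energies:

```python
import numpy as np

# Sum of exponentials of "reduced energies", as in free-energy estimates.
u = np.linspace(0.0, 20.0, 100_000)
lse64 = np.log(np.sum(np.exp(-u.astype(np.float64))))
lse32 = np.log(np.sum(np.exp(-u.astype(np.float32)), dtype=np.float32))

# float32 accumulates rounding error over many terms; float64 keeps it
# far smaller at this scale, which is why PyMBAR insists on 64-bit mode.
print(abs(lse64 - float(lse32)))
```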

Authors
-------
* Kyle A. Beauchamp <[email protected]>
1 change: 1 addition & 0 deletions devtools/conda-envs/test_env_jax.yaml
@@ -22,4 +22,5 @@ dependencies:
- xlrd
# Docs
- numpydoc
- sphinx <7
- sphinxcontrib-bibtex
19 changes: 15 additions & 4 deletions pymbar/mbar.py
@@ -96,6 +96,7 @@ def __init__(
n_bootstraps=0,
bootstrap_solver_protocol=None,
rseed=None,
accelerator=None,
Contributor review comment (on `accelerator=None`): I think this name is good
):
"""Initialize multistate Bennett acceptance ratio (MBAR) on a set of simulation data.

@@ -186,6 +187,13 @@ def __init__(
We usually just do steps of adaptive sampling without. "robust" would be the backup.
Default: dict(method="adaptive", options=dict(min_sc_iter=0)),

accelerator: str, optional, default=None
Set the accelerator library. Attempts to use the named accelerator for the solvers, and
stores the resolved accelerator on the instance. Not case-sensitive. "numpy" means no
acceleration and always works. If nothing is specified, defaults to JAX when JAX is
installed, otherwise NumPy.
(Valid options: jax, numpy)

Notes
-----
The reduced potential energy ``u_kn[k,n] = u_k(x_{ln})``, where the reduced potential energy ``u_l(x)`` is
@@ -225,6 +233,9 @@ def __init__(

"""

# Set the accelerator methods for the solvers
self.solver = mbar_solvers.get_accelerator(accelerator)

# Store local copies of necessary data.
# N_k[k] is the number of samples from state k, some of which might be zero.
self.N_k = np.array(N_k, dtype=np.int64)
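For context on the `get_accelerator` call above, a hypothetical sketch of how such a resolver can behave, following the docstring's rules (illustrative only; pymbar's real `mbar_solvers.get_accelerator` returns an object exposing the solver routines rather than a string):

```python
import importlib.util

def get_accelerator(name=None):
    """Hypothetical resolver: case-insensitive, defaulting to "jax"
    when JAX is installed, otherwise "numpy"."""
    if name is None:
        # No preference given: prefer JAX if it is importable.
        return "jax" if importlib.util.find_spec("jax") is not None else "numpy"
    name = str(name).lower()
    if name not in ("jax", "numpy"):
        raise ValueError(f"unknown accelerator {name!r}; valid options: jax, numpy")
    return name
```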
@@ -407,7 +418,7 @@ def __init__(
else:
np.random.seed(rseed)

- self.f_k = mbar_solvers.solve_mbar_for_all_states(
+ self.f_k = self.solver.solve_mbar_for_all_states(
self.u_kn, self.N_k, self.f_k, self.states_with_samples, solver_protocol
)

@@ -431,7 +442,7 @@ def __init__(
# If we initialized with BAR, then BAR, starting from the provided initial_f_k as well.
if initialize == "BAR":
f_k_init = self._initialize_with_bar(self.u_kn[:, rints], f_k_init=self.f_k)
- self.f_k_boots[b, :] = mbar_solvers.solve_mbar_for_all_states(
+ self.f_k_boots[b, :] = self.solver.solve_mbar_for_all_states(
self.u_kn[:, rints],
self.N_k,
f_k_init,
@@ -449,7 +460,7 @@

# bootstrapped weight matrices not generated here, but when expectations are needed
# otherwise, it's too much memory to keep
- self.Log_W_nk = mbar_solvers.mbar_log_W_nk(self.u_kn, self.N_k, self.f_k)
+ self.Log_W_nk = self.solver.mbar_log_W_nk(self.u_kn, self.N_k, self.f_k)

# Print final dimensionless free energies.
if self.verbose:
@@ -904,7 +915,7 @@ def compute_expectations_inner(
f_k[0:K] = self.f_k_boots[n - 1, :]
ri = self.bootstrap_rints[n - 1]
u_kn = self.u_kn[:, ri]
- Log_W_nk[:, 0:K] = mbar_solvers.mbar_log_W_nk(u_kn, self.N_k, f_k[0:K])
+ Log_W_nk[:, 0:K] = self.solver.mbar_log_W_nk(u_kn, self.N_k, f_k[0:K])
# Pre-calculate the log denominator: Eqns 13, 14 in MBAR paper

states_with_samples = self.N_k > 0
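Taken together, these changes replace module-level `mbar_solvers.*` calls with per-instance `self.solver.*` dispatch. A minimal self-contained sketch of that pattern (toy names only, not pymbar's real classes or solver functions):

```python
import types
import numpy as np

# Two toy "backends" exposing the same interface; the PR swaps between
# NumPy- and JAX-backed solver modules in the same way.
numpy_backend = types.SimpleNamespace(name="numpy", square=np.square)
alt_backend = types.SimpleNamespace(  # stand-in for a JAX-backed module
    name="jax", square=lambda x: np.asarray(x) ** 2
)
BACKENDS = {"numpy": numpy_backend, "jax": alt_backend}

class Estimator:
    """Toy analogue of MBAR: resolve a backend once, dispatch through it."""

    def __init__(self, data, accelerator=None):
        # Resolve once per instance (mirrors self.solver = get_accelerator(...));
        # different instances in the same script may use different backends.
        key = (accelerator or "numpy").lower()
        if key not in BACKENDS:
            raise ValueError(f"unknown accelerator {key!r}")
        self.solver = BACKENDS[key]
        self.data = np.asarray(data, dtype=np.float64)

    def compute(self):
        # All numerical work goes through self.solver, analogous to
        # self.solver.solve_mbar_for_all_states(...) in the diff above.
        return float(self.solver.square(self.data).sum())
```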