Official implementation of Fisher-Flow Matching (NeurIPS 2024).

Fisher Flow Matching

All our dependencies are listed in environment.yaml (for Conda) and requirements.txt (for pip). Please also install DGL separately:

pip install -r requirements.txt
pip install dgl -f https://data.dgl.ai/wheels/torch-2.1/cu121/repo.html

Our code contains parts of FlowMol by Dunn and Koes [1] (most of the QM9 experiments), Riemannian-FM by Chen et al. [2], and, for the baselines, DFM by Stark et al. [3].

Toy Experiment

For the DFM toy experiment, run:

python -m src.train experiment=toy_dfm_bmlp data.dim=100 trainer=gpu trainer.max_epochs=500

The dimension argument can be varied, and the configuration files allow changing the manifold ("simplex" or "sphere") and toggling optimal transport on/off ("exact" or "None").
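To illustrate the geometry the manifold option referses to: the probability simplex can be embedded into the positive orthant of the unit sphere via the square-root map, under which Fisher-Rao geodesics become great-circle arcs. This is a minimal NumPy sketch of that mapping and interpolation, not the repo's implementation:

```python
import numpy as np

def simplex_to_sphere(p):
    """sqrt map: probability vector -> positive orthant of the unit sphere."""
    return np.sqrt(p)

def sphere_geodesic(x0, x1, t):
    """Great-circle (slerp) interpolation between unit vectors x0 and x1.

    After the sqrt map, Fisher-Rao geodesics on the simplex correspond
    to these spherical arcs.
    """
    theta = np.arccos(np.clip(x0 @ x1, -1.0, 1.0))
    if theta < 1e-8:          # points (almost) coincide
        return x0.copy()
    return (np.sin((1 - t) * theta) * x0 + np.sin(t * theta) * x1) / np.sin(theta)

# Interpolate between two categorical distributions
p0 = np.array([0.7, 0.2, 0.1])
p1 = np.array([0.1, 0.1, 0.8])
x_t = sphere_geodesic(simplex_to_sphere(p0), simplex_to_sphere(p1), 0.5)
p_t = x_t ** 2  # pull back to the simplex; still sums to 1
```

Squaring the interpolated point returns it to the simplex, so every point along the path is a valid categorical distribution.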

Promoter and Enhancer DNA Experiment

To download the datasets, it suffices to follow the steps of Stark et al. [3]. To evaluate the FBD, you also need to download their weights from their workdir.zip. To run the promoter dataset experiment, use:

python -m src.train experiment=promoter_sfm_promdfm trainer.max_epochs=200 trainer=gpu data.batch_size=128
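For reference, FBD-style metrics compare the Gaussian statistics (mean and covariance) of embeddings of generated versus real sequences, using the closed-form Fréchet distance between Gaussians. The sketch below assumes such mean/covariance embeddings are available; it is a generic implementation of the formula, not the repo's evaluation code:

```python
import numpy as np
from scipy.linalg import sqrtm

def frechet_distance(mu1, cov1, mu2, cov2):
    """Squared Frechet distance between N(mu1, cov1) and N(mu2, cov2):
    ||mu1 - mu2||^2 + Tr(cov1 + cov2 - 2 (cov1 cov2)^(1/2)).
    """
    diff = mu1 - mu2
    covmean = sqrtm(cov1 @ cov2)
    if np.iscomplexobj(covmean):  # discard tiny imaginary parts from sqrtm
        covmean = covmean.real
    return float(diff @ diff + np.trace(cov1 + cov2 - 2.0 * covmean))
```

Identical statistics give a distance of zero; larger values indicate a mismatch between the generated and reference embedding distributions.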

For the enhancer MEL2 experiment, run:

python -m src.train experiment=enhancer_mel_sfm_cnn trainer.max_epochs=800 trainer=gpu

and for the FlyBrain DNA one:

python -m src.train experiment=enhancer_fly_sfm_cnn trainer.max_epochs=800 trainer=gpu

QM9 experiment

To set up the QM9 dataset, we have included process_qm9.py from FlowMol; it suffices to follow the steps indicated in their README. Then run:

python -m src.train experiment=qm_clean_sfm trainer=gpu

References

[1] FlowMol, by Dunn and Koes.
[2] Riemannian-FM, by Chen et al.
[3] DFM, by Stark et al.
