Conversation

@francesding (Contributor)

Main change

Updates the LaMBO2 class to optionally accept a config object at initialization.

Rationale

Currently, LaMBO2 only accepts a path to a config file and calls hydra.initialize_config_dir during initialization. Since hydra.initialize_config_dir can only be called once per process, this prevents the user from using hydra configs to specify other parameters outside of the LaMBO2 settings. For example, the user might want to use hydra to set hyperparameters for different Ehrlich black boxes as well as for the LaMBO2 model. This change allows the user to initialize a hydra config elsewhere and then pass the relevant config object to LaMBO2.
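
A minimal sketch of the intended usage, assuming a hypothetical import path and a `config` keyword on the LaMBO2 constructor (neither is verified against the repository; only the overall pattern follows the PR description):

```python
# Sketch only: the import path, the `config` keyword, and the config names
# below are assumptions for illustration, not the verified poli-baselines API.
from hydra import compose, initialize_config_dir
from poli_baselines.solvers.bayesian_optimization.lambo2 import LaMBO2  # assumed path

# The caller initializes Hydra exactly once, at the application level, so the
# same config tree can also hold non-LaMBO2 settings (e.g. Ehrlich black-box
# hyperparameters).
with initialize_config_dir(config_dir="/abs/path/to/configs", version_base=None):
    cfg = compose(config_name="experiment")

# Previously LaMBO2 accepted only a path and called hydra.initialize_config_dir
# itself; now the compiled config object can be passed in directly.
# Other constructor arguments (black box, initial data, ...) are omitted here.
solver = LaMBO2(config=cfg.lambo2)
```

Because Hydra is initialized exactly once by the caller, the same cfg tree can configure both the black box and the solver.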

Other minor changes

  • Added an fft_expansion_factor hyperparameter to the candidate point selection in LaMBO2 (previously hard-coded to 2), and added a corresponding parameter to the hydra config (see the sketch after this list).
  • Allowed an optional logger to be attached to the LaMBO2 solver at initialization (for example, lightning.pytorch.loggers.WandbLogger), also shown in the sketch below.
  • Fixed formatting for x_0.
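
A continuation of the sketch above covering the first two bullets; the `logger` keyword and the `lambo2.fft_expansion_factor` config key are assumptions for illustration:

```python
# Sketch only: the `logger` keyword and the `lambo2.fft_expansion_factor`
# config key follow the PR description but are not verified signatures.
from hydra import compose, initialize_config_dir
from lightning.pytorch.loggers import WandbLogger
from poli_baselines.solvers.bayesian_optimization.lambo2 import LaMBO2  # assumed path

with initialize_config_dir(config_dir="/abs/path/to/configs", version_base=None):
    # fft_expansion_factor used to be hard-coded to 2; now it can be set
    # (or overridden) through the hydra config.
    cfg = compose(
        config_name="experiment",
        overrides=["lambo2.fft_expansion_factor=4"],
    )

solver = LaMBO2(
    config=cfg.lambo2,                                 # as in the sketch above
    logger=WandbLogger(project="lambo2-experiments"),  # optional metrics logger
)
```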

Frances Ding added 5 commits February 27, 2025 17:05
…logger

Previously, LaMBO2 only accepted a path to a config file at initialization and ran hydra initialization to compile this config. To allow users to run hydra initialization themselves (and potentially include other config parameters besides those related to LaMBO2, such as black-box parameters), this updates the initialization to optionally accept a compiled config as well. It also adds an optional logger to track metrics and exposes the fft expansion factor as a config parameter, which used to be hard-coded to 2.
@miguelgondu (Collaborator)

Hi Frances, thanks for the contribution! Looks good to me, and the CI is passing, so I'll merge.

@miguelgondu merged commit 6b8930f into MachineLearningLifeScience:main on Mar 3, 2025
7 checks passed