Gryffin: An algorithm for Bayesian optimization of categorical variables informed by expert knowledge

@inproceedings{Hase2020GryffinAA,
  title={Gryffin: An algorithm for Bayesian optimization of categorical variables informed by expert knowledge},
  author={Florian Hase and Matteo Aldeghi and Riley J. Hickman and Lo{\"i}c M. Roch and Al{\'a}n Aspuru-Guzik},
  year={2020}
}
Florian Häse, Matteo Aldeghi, Riley J. Hickman, Loïc M. Roch, and Alán Aspuru-Guzik. Affiliations: Department of Chemistry and Chemical Biology, Harvard University, Cambridge, Massachusetts 02138, USA; Vector Institute for Artificial Intelligence, Toronto, ON M5S 1M1, Canada; Department of Computer Science, University of Toronto, Toronto, ON M5S 3H6, Canada; Department of Chemistry, University of Toronto, Toronto, ON M5S 3H6, Canada; Atinary Technologies Sàrl, 1006 Lausanne…

Citations

Nanoparticle synthesis assisted by machine learning
Many properties of nanoparticles are governed by their shape, size, polydispersity and surface chemistry. To apply nanoparticles in chemical sensing, medical diagnostics, catalysis, thermoelectrics, …
Modeling the Multiwavelength Variability of Mrk 335 Using Gaussian Processes
The optical and UV variability of the majority of active galactic nuclei may be related to the reprocessing of rapidly changing X-ray emission from a more compact region near the central black hole.
Uncertainty-aware Mixed-variable Machine Learning for Materials Design
TLDR
This work surveys frequentist and Bayesian approaches to uncertainty quantification of machine learning with mixed variables, investigating the machine learning models’ predictive and uncertainty estimation capabilities, and provides interpretations of the observed performance differences.
Multi-objective Bayesian Optimization with Heuristic Objectives for Biomedical and Molecular Data Analysis Workflows
TLDR
A novel MOBO method is proposed that adaptively updates the scalarization function using properties of the posterior of a multi-output Gaussian process surrogate function, allowing the functional form of each objective to guide optimization.
Materiomically Designed Polymeric Vehicles for Nucleic Acids: Quo Vadis?
TLDR
Recent developments in combinatorial polymer synthesis, high-throughput screening of polymeric vectors, omics-based approaches to polymer design, barcoding schemes for pooled in vitro and in vivo screening, and materiomics-inspired research directions that will realize the long-unfulfilled clinical potential of polymer carriers in gene therapy are summarized.
ODBO: Bayesian Optimization with Search Space Prescreening for Directed Protein Evolution
TLDR
An experimental design-oriented closed-loop optimization framework for protein directed evolution, termed ODBO, which employs a combination of novel low-dimensional protein encoding strategy and Bayesian optimization enhanced with search space prescreening via outlier detection is proposed.
Bayesian optimization with known experimental and design constraints for chemistry applications
Riley J. Hickman, Matteo Aldeghi, Florian Häse, and Alán Aspuru-Guzik. Affiliations: Chemical Physics Theory Group, Department of Chemistry, University of Toronto, Toronto, …
Routescore: Punching the Ticket to More Efficient Materials Development
TLDR
The RouteScore is used to determine the most efficient synthetic route to a well-known pharmaceutical and to simulate a self-driving laboratory that finds the most easily synthesizable organic laser molecule with specific photophysical properties from a space of ∼3500 possible molecules.
...

References

Showing 1–10 of 150 references
Batch Bayesian Optimization via Local Penalization
TLDR
A simple heuristic based on an estimate of the Lipschitz constant is investigated that captures the most important aspect of this interaction at negligible computational overhead and compares well, in running time, with much more elaborate alternatives.
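The penalization heuristic described above can be sketched in a few lines: each point already chosen for the batch down-weights the acquisition inside a Lipschitz-derived exclusion ball around it, steering subsequent picks elsewhere. This is a simplified illustration only (a hard penalizer over 1-D candidates; `acquisition` and `predict_mean` are placeholder callables), not the authors' implementation, which uses a soft Gaussian-CDF penalizer and estimates the Lipschitz constant from the surrogate.

```python
def select_batch(candidates, acquisition, predict_mean, best_y,
                 lipschitz, batch_size):
    """Greedy batch selection with a hard local penalizer (sketch).

    candidates: iterable of 1-D candidate points
    acquisition: callable x -> acquisition value (maximized)
    predict_mean: callable x -> surrogate mean of the objective
    best_y: best (lowest) objective value observed so far (minimization)
    lipschitz: assumed Lipschitz constant of the objective
    """
    chosen = []
    for _ in range(batch_size):
        def penalized(x):
            a = acquisition(x)
            for xj in chosen:
                # exclusion radius: the ball around xj that could still
                # hide a better optimum if f is Lipschitz-continuous
                r = max(predict_mean(xj) - best_y, 0.0) / lipschitz
                if r > 0:
                    a *= min(1.0, abs(x - xj) / r)  # penalize nearby points
            return a
        chosen.append(max(candidates, key=penalized))
    return chosen
```

With an acquisition peaked at 0.5, the second batch member is pushed away from the first pick rather than duplicating it.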
Mixed-Variable Bayesian Optimization
TLDR
MiVaBo is introduced, a novel BO algorithm for the efficient optimization of mixed-variable functions combining a linear surrogate model based on expressive feature representations with Thompson sampling, making MiVaBo the first BO method that can handle complex constraints over the discrete variables.
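The surrogate-plus-Thompson-sampling idea in that summary can be illustrated with a minimal Bayesian linear model: fit a Gaussian posterior over the weights of a linear model on feature representations, draw one plausible weight vector, and evaluate the candidate that this sample ranks highest. This is a generic sketch of Thompson sampling with a linear surrogate, not the MiVaBo code; the function name and the noise/prior parameters are illustrative choices.

```python
import numpy as np

def thompson_sample_next(X_feat, y, candidates_feat,
                         noise=0.1, prior_var=1.0, rng=None):
    """One Thompson-sampling step with a Bayesian linear surrogate.

    X_feat: (n, d) feature representations of evaluated points
    y: (n,) observed objective values (maximization)
    candidates_feat: (m, d) features of candidate points
    Returns the index of the candidate to evaluate next.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    d = X_feat.shape[1]
    # Gaussian posterior over weights under a conjugate N(0, prior_var*I) prior
    A = X_feat.T @ X_feat / noise**2 + np.eye(d) / prior_var
    cov = np.linalg.inv(A)
    mean = cov @ X_feat.T @ y / noise**2
    w = rng.multivariate_normal(mean, cov)  # sample one plausible model
    return int(np.argmax(candidates_feat @ w))
```

Sampling a weight vector, rather than using the posterior mean, is what gives Thompson sampling its built-in exploration: uncertain directions occasionally win the argmax.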
Gryffin: An algorithm for Bayesian optimization for categorical variables informed by physical intuition with applications to chemistry
TLDR
Gryffin is introduced as a general-purpose optimization framework for the autonomous selection of categorical variables driven by expert knowledge; it augments Bayesian optimization with kernel density estimation, using smooth approximations to categorical distributions.
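The density-estimation flavor of categorical optimization can be illustrated with a deliberately simpler relative of the approach described: a TPE-style ratio of smoothed "good" vs. "bad" densities over the categorical options. This is not Gryffin's kernel-density model (which builds smooth densities over the full parameter space); the pseudo-count `smoothing` stands in for the smooth approximation that keeps unseen categories reachable.

```python
import random
from collections import defaultdict

def density_based_select(options, observations, smoothing=1.0):
    """Pick the next categorical option via a good/bad density ratio.

    observations: list of (option, objective_value); lower is better.
    smoothing: pseudo-count added to every option so that categories
    with no observations still have nonzero density.
    """
    if not observations:
        return random.choice(options)
    values = sorted(v for _, v in observations)
    threshold = values[len(values) // 4]  # top quartile counts as 'good'
    good, bad = defaultdict(float), defaultdict(float)
    for opt, v in observations:
        (good if v <= threshold else bad)[opt] += 1.0
    # favor options that appear often among good runs, rarely among bad
    return max(options,
               key=lambda o: (good[o] + smoothing) / (bad[o] + smoothing))
```

Expert knowledge enters such schemes by biasing the initial densities, so that a priori promising categories are tried first without being locked in.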
Practical Bayesian Optimization of Machine Learning Algorithms
TLDR
This work describes new algorithms that take into account the variable cost of learning algorithm experiments and that can leverage the presence of multiple cores for parallel experimentation and shows that these proposed algorithms improve on previous automatic procedures and can reach or surpass human expert-level optimization for many algorithms.
Scalable Bayesian Optimization Using Deep Neural Networks
TLDR
This work shows that performing adaptive basis function regression with a neural network as the parametric form performs competitively with state-of-the-art GP-based approaches, but scales linearly with the number of data rather than cubically, which allows for a previously intractable degree of parallelism.
Auto-WEKA: combined selection and hyperparameter optimization of classification algorithms
TLDR
This work considers the problem of simultaneously selecting a learning algorithm and setting its hyperparameters, going beyond previous work that attacks these issues separately and shows classification performance often much better than using standard selection and hyperparameter optimization methods.
Making a Science of Model Search: Hyperparameter Optimization in Hundreds of Dimensions for Vision Architectures
TLDR
This work proposes a meta-modeling approach to support automated hyperparameter optimization, with the goal of providing practical tools that replace hand-tuning with a reproducible and unbiased optimization process.
Are we Forgetting about Compositional Optimisers in Bayesian Optimisation?
TLDR
This paper highlights the empirical advantages of the compositional approach to acquisition function maximisation across 3958 individual experiments comprising synthetic optimisation tasks as well as tasks from the 2020 NeurIPS competition on Black-Box Optimisation for Machine Learning.
Bayesian reaction optimization as a tool for chemical synthesis.
TLDR
The development of a framework for Bayesian reaction optimization and an open-source software tool that allows chemists to easily integrate state-of-the-art optimization algorithms into their everyday laboratory practices are reported, demonstrating that Bayesian optimization outperforms human decision-making in both average optimization efficiency and consistency.
...