Two-dimensional materials from high-throughput computational exfoliation of experimentally known compounds
The largest available database of potentially exfoliable 2D materials has been obtained via high-throughput calculations using van der Waals density functional theory.
Molecular Transformer: A Model for Uncertainty-Calibrated Chemical Reaction Prediction
This work shows that a multi-head attention Molecular Transformer model outperforms all algorithms in the literature, achieving a top-1 accuracy above 90% on a common benchmark data set. The model handles inputs without a reactant–reagent split and including stereochemistry, which makes the method universally applicable.
“Found in Translation”: predicting outcomes of complex organic chemistry reactions using neural sequence-to-sequence models
- P. Schwaller, T. Gaudin, D. Lanyi, C. Bekas, T. Laino
- Computer Science, Chemical Science
- 13 November 2017
Using a text-based representation of molecules, chemical reactions are predicted with a neural machine-translation model borrowed from natural language processing, in contrast to graph-based models of how molecules behave.
Predicting retrosynthetic pathways using transformer-based models and a hyper-graph exploration strategy
An extension of the Molecular Transformer model, combined with a hyper-graph exploration strategy, enables automatic retrosynthesis route planning without human intervention; the end-to-end framework performs excellently, with its few weaknesses related to the training data.
Prediction of chemical reaction yields using deep learning
- P. Schwaller, Alain C. Vaucher, T. Laino, J. Reymond
- Computer Science, Mach. Learn. Sci. Technol.
- 5 August 2020
The application of natural language processing architectures is extended to predict reaction properties from a text-based representation of the reaction: an encoder transformer model combined with a regression layer demonstrates outstanding prediction performance on two high-throughput experimental reaction data sets.
Molecular Transformer for Chemical Reaction Prediction and Uncertainty Estimation
- P. Schwaller, T. Laino, T. Gaudin, P. Bolgar, C. Bekas, A. Lee
- Computer Science, Chemistry, ArXiv
- 6 November 2018
This work treats reaction prediction as a machine translation problem between SMILES strings of reactants-reagents and the products, and shows that a multi-head attention Molecular Transformer model outperforms all algorithms in the literature, achieving a top-1 accuracy above 90% on a common benchmark dataset.
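Several of these entries treat a chemical reaction as plain text via SMILES strings, so the translation framing reduces to mapping one token sequence to another. As a minimal illustration (a sketch in the spirit of the SMILES tokenizer described alongside the Molecular Transformer, not the authors' exact code), a reaction SMILES can be split into atom- and bond-level tokens with a regular expression:

```python
import re

# Illustrative SMILES tokenization pattern (an assumption modeled on the
# regex published with the Molecular Transformer, not the verbatim one):
# bracket atoms, two-letter halogens, organic-subset atoms, bonds,
# branches, the '>>' reaction arrow, ring-closure digits, etc.
SMI_REGEX = (
    r"(\[[^\]]+]|Br?|Cl?|N|O|S|P|F|I|b|c|n|o|s|p"
    r"|\(|\)|\.|=|#|-|\+|\\|/|:|~|@|\?|>>?|\*|\$|%[0-9]{2}|[0-9])"
)

def tokenize_reaction(rxn_smiles: str) -> list[str]:
    """Split a reaction SMILES into tokens; '>>' separates the
    reactant/reagent side from the product side."""
    tokens = re.findall(SMI_REGEX, rxn_smiles)
    # Sanity check: tokenization should be lossless.
    assert "".join(tokens) == rxn_smiles
    return tokens

# Esterification of acetic acid with ethanol, written as a reaction SMILES.
print(tokenize_reaction("CC(=O)O.OCC>>CC(=O)OCC"))
```

The resulting token sequences are what a sequence-to-sequence transformer consumes on the source side (reactants and reagents) and emits on the target side (products).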
Transfer learning enables the molecular transformer to predict regio- and stereoselective reactions on carbohydrates
It is shown that transfer learning of the general patent reaction model with a small set of carbohydrate reactions produces a specialized model returning predictions for carbohydrate reactions with remarkable accuracy.
Exploring Chemical Space using Natural Language Processing Methodologies for Drug Discovery
Extraction of organic chemistry grammar from unsupervised learning of chemical reactions
- P. Schwaller, Benjamin Hoover, J. Reymond, H. Strobelt, T. Laino
- Computer Science, Science Advances
- 1 April 2021
This work demonstrates that Transformer Neural Networks learn atom-mapping information between products and reactants without supervision or human labeling, and provides the missing link between data-driven and rule-based approaches for numerous chemical reaction tasks.
Mapping the Space of Chemical Reactions Using Attention-Based Neural Networks
It is shown that transformer-based models can infer reaction classes from non-annotated, simple text-based representations of chemical reactions, and that the learned representations can be used as reaction fingerprints which capture fine-grained differences between reaction classes better than traditional reaction fingerprints.