Corpus ID: 245668667

Rxn Hypergraph: a Hypergraph Attention Model for Chemical Reaction Representation

Mohammadamin Tavakoli, Alexander Shmakov, Francesco Ceccarelli, Pierre Baldi
It is fundamental for science and technology to be able to predict chemical reactions and their properties. Achieving this requires good representations of chemical reactions, or good deep learning architectures that can learn such representations automatically from data. There is currently no universal, widely adopted method for robustly representing chemical reactions. Most existing methods suffer from one or more drawbacks, such as: (1) lacking universality…
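To make the titular idea concrete, here is a purely illustrative sketch (not the paper's actual Rxn Hypergraph construction) of encoding a reaction as a hypergraph: nodes are atoms, and hyperedges group atoms by molecule and by reactant/product role. The function name `reaction_hypergraph` and the node/edge naming scheme are hypothetical choices for this sketch.

```python
from collections import defaultdict

def reaction_hypergraph(reactants, products):
    """Build a toy reaction hypergraph.

    Each molecule is given as a list of atom symbols. Every atom becomes a
    node; every molecule contributes one hyperedge over its atoms, and two
    role hyperedges group all reactant atoms and all product atoms.
    Returns (nodes, edges) where edges maps hyperedge name -> set of node ids.
    """
    nodes = []
    edges = defaultdict(set)
    for role, mols in (("reactant", reactants), ("product", products)):
        for mi, atoms in enumerate(mols):
            for ai, sym in enumerate(atoms):
                nid = f"{role}{mi}:{ai}:{sym}"  # unique per atom occurrence
                nodes.append(nid)
                edges[f"mol:{role}{mi}"].add(nid)  # molecule hyperedge
                edges[f"role:{role}"].add(nid)     # reactant/product hyperedge
    return nodes, dict(edges)

# Example: 2 H2 + O2 -> 2 H2O (atoms listed per molecule, bonds omitted)
nodes, edges = reaction_hypergraph(
    reactants=[["H", "H"], ["H", "H"], ["O", "O"]],
    products=[["H", "H", "O"], ["H", "H", "O"]],
)
```

An attention model over such a structure could then exchange messages between nodes and the hyperedges they belong to, which is the general flavor of hypergraph attention.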

ChemAlgebra: Algebraic Reasoning on Chemical Reactions

This work proposes ChemAlgebra, a benchmark that measures the reasoning capabilities of deep learning models through the prediction of stoichiometrically balanced chemical reactions, arguing that it can serve as a useful test bed for the next generation of machine reasoning models and as a promoter of their development.

Hypergraph Factorisation for Multi-Tissue Gene Expression Imputation

The HYFA framework can accelerate effective and scalable integration of tissue and cell-type gene expression biorepositories and can be used to identify regulatory genetic variations (eQTLs) with substantial gains over the original incomplete dataset.

Mapping the Space of Chemical Reactions Using Attention-Based Neural Networks

It is shown that transformer-based models can infer reaction classes from non-annotated, simple text-based representations of chemical reactions, and that the learned representations can be used as reaction fingerprints which capture fine-grained differences between reaction classes better than traditional reaction fingerprints.

Prediction of chemical reaction yields using deep learning

The application of natural language processing architectures is extended to predict reaction properties given a text-based representation of the reaction, using an encoder transformer model combined with a regression layer, demonstrating outstanding prediction performance on two high-throughput experimental reaction sets.

Analyzing Learned Molecular Representations for Property Prediction

A graph convolutional model is introduced that consistently matches or outperforms models using fixed molecular descriptors as well as previous graph neural architectures on both public and proprietary data sets.

Deep learning for chemical reaction prediction

This work discusses an alternative approach to predicting electron sources and sinks using recurrent neural networks, specifically long short-term memory (LSTM) architectures, operating directly on SMILES strings, which has shown promising preliminary results.

Quantum Mechanics and Machine Learning Synergies: Graph Attention Neural Networks to Predict Chemical Reactivity

This work first uses DFT to calculate MCA* and MAA* for more than 2,400 organic molecules, thereby establishing a large dataset of chemical reactivity scores, and then designs deep learning methods to predict the reactivity of molecular structures, training them on this curated dataset in combination with different representations of molecular structures.

Learning to Predict Chemical Reactions

This work describes single mechanistic reactions as interactions between coarse approximations of molecular orbitals (MOs), uses topological and physicochemical attributes as descriptors, and proposes a new approach to reaction prediction utilizing elements from each pole.

Extraction of organic chemistry grammar from unsupervised learning of chemical reactions

This work demonstrates that Transformer Neural Networks learn atom-mapping information between products and reactants without supervision or human labeling, and provides the missing link between data-driven and rule-based approaches for numerous chemical reaction tasks.

MolGAN: An implicit generative model for small molecular graphs

MolGAN is introduced, an implicit, likelihood-free generative model for small molecular graphs that circumvents the need for expensive graph matching procedures or node ordering heuristics of previous likelihood-based methods.

Molecular Transformer: A Model for Uncertainty-Calibrated Chemical Reaction Prediction

This work shows that a multihead attention Molecular Transformer model outperforms all algorithms in the literature, achieving a top-1 accuracy above 90% on a common benchmark data set and is able to handle inputs without a reactant–reagent split and including stereochemistry, which makes the method universally applicable.

Modeling Relational Data with Graph Convolutional Networks

It is shown that factorization models for link prediction such as DistMult can be improved by enriching them with an encoder model to accumulate evidence over multiple inference steps in the relational graph, demonstrating a large improvement of 29.8% on FB15k-237 over a decoder-only baseline.