Transforming the Language of Life: Transformer Neural Networks for Protein Prediction Tasks
- Ananthan Nambiar, Simon Liu, Mark Hopkins, Maeve Heflin, S. Maslov, Anna M. Ritz
- Computer Science, Biology · bioRxiv
- 16 June 2020
A Transformer neural network pre-trains task-agnostic sequence representations for protein prediction tasks, performing comparably to existing state-of-the-art approaches for protein family classification while being far more general than other architectures.
SemEval-2019 Task 10: Math Question Answering
- Mark Hopkins, Ronan Le Bras, Cristian Petrescu-Prahova, Gabriel Stanovsky, Hannaneh Hajishirzi, Rik Koncel-Kedziorski
- Computer Science · International Workshop on Semantic Evaluation
- 1 June 2019
This work provides a question set derived from Math SAT practice exams (2778 training questions and 1082 test questions), along with SMT-LIB logical-form annotations and an interpreter that can solve these logical forms.
Beyond Sentential Semantic Parsing: Tackling the Math SAT with a Cascade of Tree Transducers
- Mark Hopkins, Cristian Petrescu-Prahova, Roie Levin, Ronan Le Bras, Alvaro Herrasti, V. Joshi
- Computer Science · Conference on Empirical Methods in Natural…
- 1 September 2017
We present an approach for answering questions that span multiple sentences and exhibit sophisticated cross-sentence anaphoric phenomena, evaluating on a rich source of such questions – the math…
Transformer Neural Networks for Protein Family and Interaction Prediction Tasks
A Transformer neural network pre-trains task-agnostic sequence representations for protein prediction tasks and outperforms other approaches on protein interaction prediction in two of the three generated scenarios.