Corpus ID: 204798846

Smart Vet: Autocompleting Sentences in Veterinary Medical Records

Samuel Ginn
Every day, veterinarians write tens of thousands of medical records, mostly in standard formats following the SOAP structure: “Subjective”, “Objective”, “Assessment”, and “Plan”. These notes record the findings of their physical exams and observations of their patients, and take countless hours to write. We present in this paper a new system that we call “Smart Vet” that assists veterinarians in the writing of their notes by suggesting autocompletions for their sentences as they are writing…
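Smart Vet itself drives its suggestions with a neural language model; as a toy illustration of the underlying idea only — completing a typed prefix against sentences seen in past notes — one can sketch a frequency-ranked prefix matcher. All function names and the sample sentences below are invented for illustration and are not from the paper:

```python
from collections import Counter

def build_completer(corpus_sentences):
    """Return a function that suggests completions for a typed prefix,
    ranked by how often each full sentence appeared in past notes.
    (Illustrative only -- Smart Vet uses a neural language model.)"""
    freq = Counter(s.strip() for s in corpus_sentences)

    def complete(prefix, k=3):
        # candidate sentences that extend (not merely equal) the prefix
        matches = [(n, s) for s, n in freq.items()
                   if s.startswith(prefix) and s != prefix]
        matches.sort(key=lambda t: -t[0])  # most frequent first
        return [s for _, s in matches[:k]]

    return complete

# Hypothetical mini-corpus of past exam notes:
notes = [
    "Mucous membranes pink and moist.",
    "Mucous membranes pink and moist.",
    "Mucous membranes pale.",
    "Heart and lungs auscult normally.",
]
suggest = build_completer(notes)
```

Calling `suggest("Mucous membranes p")` then returns the two matching sentences, most frequent first; a neural model generalizes the same interface beyond exact repeats of prior sentences.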
1 Citation


A Predictive Text System for Medical Recommendations in Telemedicine: A Deep Learning Approach in the Arabic Context

A deep learning-based language generation model is proposed that simplifies the process of writing medical recommendations for doctors in an Arabic context, to improve service satisfaction and patient-doctor interactions.

Learning to Write Notes in Electronic Health Records

This work proposes a new language modeling task predicting the content of notes conditioned on past data from a patient's medical record, including patient demographics, labs, medications, and past notes, and trains generative models using the public, de-identified MIMIC-III dataset.

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

A new language representation model, BERT, designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers, which can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of tasks.

Attention is All you Need

A new, simple network architecture, the Transformer, based solely on attention mechanisms and dispensing with recurrence and convolutions entirely, is proposed; it generalizes well to other tasks, applying successfully to English constituency parsing with both large and limited training data.

Improving Language Understanding by Generative Pre-Training

The general task-agnostic model outperforms discriminatively trained models that use architectures specifically crafted for each task, improving upon the state of the art in 9 out of the 12 tasks studied.

Smart Reply: Automated Response Suggestion for Email

In this paper we propose and investigate a novel end-to-end method for automatically generating short email responses, called Smart Reply. It generates semantically diverse suggestions that can be used as complete email responses with just one tap on mobile.

Tethered to the EHR: Primary Care Physician Workload Assessment Using EHR Event Log Data and Time-Motion Observations

EHR event logs can identify areas of EHR-related work that could be delegated away from primary care physicians, thereby reducing workload, improving professional satisfaction, and decreasing burnout.

Tensor2Tensor for Neural Machine Translation

Tensor2Tensor is a library for deep learning models that is well-suited for neural machine translation and includes the reference implementation of the state-of-the-art Transformer model.

The Impact of Physician EHR Usage on Patient Satisfaction

The authors used existing data sources to describe the relationship between the amount of time physicians spend logged in to the EHR, both during daytime hours and after clinic hours, and performance on a validated patient satisfaction survey; they found no relationship between increased EHR time and patient satisfaction.

Allocation of Physician Time in Ambulatory Practice: A Time and Motion Study in 4 Specialties

The goal was to describe time allocation and practice characteristics for physicians in the era of EHRs and federal incentive and penalty programs, drawing on a participant base representative of a large and inclusive set of physicians.

A new algorithm for data compression

This article describes a simple general-purpose data compression algorithm, called Byte Pair Encoding (BPE), which provides almost as much compression as the popular Lempel, Ziv, and Welch method.
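The byte-pair replacement idea described in this entry — repeatedly substituting a new symbol for the most frequent adjacent pair — can be sketched in a few lines. This is a minimal, illustrative rendering of Gage's scheme (function and variable names are invented), not the article's implementation:

```python
from collections import Counter

def bpe_compress(data: bytes, num_merges: int = 10):
    """Byte pair encoding: repeatedly replace the most frequent
    adjacent symbol pair with a fresh symbol, recording each merge."""
    seq = list(data)
    next_symbol = 256   # symbols >= 256 stand in for merged pairs
    table = {}          # new symbol -> (left, right) pair it replaces
    for _ in range(num_merges):
        pairs = Counter(zip(seq, seq[1:]))
        if not pairs:
            break
        pair, count = pairs.most_common(1)[0]
        if count < 2:
            break  # no pair repeats, nothing left to gain
        table[next_symbol] = pair
        merged, i = [], 0
        while i < len(seq):
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                merged.append(next_symbol)  # collapse the pair
                i += 2
            else:
                merged.append(seq[i])
                i += 1
        seq = merged
        next_symbol += 1
    return seq, table

def bpe_decompress(seq, table):
    """Expand merged symbols back to their pairs, newest symbol first,
    so that nested merges unfold in the right order."""
    out = list(seq)
    for sym in sorted(table, reverse=True):
        expanded = []
        for s in out:
            expanded.extend(table[sym]) if s == sym else expanded.append(s)
        out = expanded
    return bytes(out)
```

For example, `bpe_compress(b"aaabdaaabac", 10)` shortens the sequence by merging the frequent `aa` and `ab` pairs, and `bpe_decompress` recovers the original bytes exactly; the same merge-table idea underlies the subword vocabularies used by models such as GPT and BERT cited above.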