Derivatives, mostly in the form of gradients and Hessians, are ubiquitous in machine learning. Automatic differentiation (AD) is a technique for efficiently and accurately calculating derivatives of numeric functions expressed as computer programs, used in fields such as computational fluid dynamics, nuclear engineering, and atmospheric sciences. Despite …
The TREC Definition and Relationship questions are evaluated on the basis of information nuggets that may be contained in system responses. Human evaluators provide informal descriptions of each nugget, and judgements (assignments of nuggets to responses) for each response submitted by participants. While human evaluation is the most accurate way to …
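To make the nugget-based setup concrete, here is a minimal sketch (not the official TREC scorer) of how judgements that assign nuggets to responses yield a per-system nugget recall; the question, system, and nugget identifiers and the dictionary layout are assumptions for illustration.

```python
# Illustrative sketch only: given human judgements that assign nuggets to
# system responses, compute per-system nugget recall for a question.
gold_nuggets = {"q1": {"n1", "n2", "n3"}}          # nuggets identified for each question
judgements = {                                      # nuggets a human judged present in each response
    ("q1", "systemA"): {"n1", "n3"},
    ("q1", "systemB"): {"n2"},
}

def nugget_recall(question_id: str, system_id: str) -> float:
    """Fraction of a question's nuggets covered by the judged response."""
    covered = judgements.get((question_id, system_id), set())
    total = gold_nuggets[question_id]
    return len(covered & total) / len(total) if total else 0.0

print(nugget_recall("q1", "systemA"))  # 0.666...
print(nugget_recall("q1", "systemB"))  # 0.333...
```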
Reasoning with probabilistic models is a widespread and successful technique in areas ranging from computer vision to natural language processing to bioinformatics. Currently, these reasoning systems are either coded from scratch in general-purpose languages or use formalisms such as Bayesian networks that have limited expressive power. In both cases, the …
In this dissertation I propose a shift in the foundations of computation. Modern programming systems are not expressive enough. The traditional image of a single computer that has global effects on a large memory is too restrictive. The propagation paradigm replaces this with computing by networks of local, independent, stateless machines interconnected …
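As a rough illustration of the propagation paradigm, the sketch below models cells that hold values and propagators as local, stateless machines attached to them; the class names and the simplified merge semantics (plain equality, no partial information) are assumptions for illustration, not the dissertation's actual system.

```python
# Minimal sketch of the propagation idea: cells hold information and alert the
# propagators attached to them; each propagator is a local machine that reads
# some cells and writes others. Simplified semantics for illustration only.

class Cell:
    def __init__(self):
        self.content = None
        self.neighbors = []          # propagators to alert when content changes

    def add_content(self, value):
        if value is None or value == self.content:
            return
        if self.content is not None:
            raise ValueError("contradiction")
        self.content = value
        for p in self.neighbors:
            p()

def propagator(inputs, output, fn):
    """Attach a stateless machine that recomputes whenever an input cell changes."""
    def run():
        if all(c.content is not None for c in inputs):
            output.add_content(fn(*(c.content for c in inputs)))
    for cell in inputs:
        cell.neighbors.append(run)
    run()

# A tiny network: fahrenheit -> celsius.
f, c = Cell(), Cell()
propagator([f], c, lambda deg_f: (deg_f - 32) * 5 / 9)
f.add_content(212)
print(c.content)  # 100.0
```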
Forward Automatic Differentiation (AD) is a technique for augmenting programs to both perform their original calculation and compute its directional derivative. The essence of Forward AD is to attach a derivative value to each number, and propagate these through the computation. When derivatives are nested, the distinct derivative calculations, and …
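A minimal sketch of that essence, using a dual number that pairs each value with its derivative and overloads arithmetic to propagate both; only addition and multiplication are shown, and the tagging needed to keep nested derivative calculations distinct is deliberately omitted.

```python
# Forward-AD sketch: pair each number with its derivative ("dual number") and
# propagate both through the computation by overloading arithmetic.
# Only + and * are shown; a real implementation covers the full numeric API.

class Dual:
    def __init__(self, primal, tangent=0.0):
        self.primal, self.tangent = primal, tangent

    @staticmethod
    def _lift(x):
        return x if isinstance(x, Dual) else Dual(x)

    def __add__(self, other):
        other = Dual._lift(other)
        return Dual(self.primal + other.primal, self.tangent + other.tangent)
    __radd__ = __add__

    def __mul__(self, other):
        other = Dual._lift(other)
        return Dual(self.primal * other.primal,
                    self.tangent * other.primal + self.primal * other.tangent)
    __rmul__ = __mul__

def derivative(f, x):
    """Directional derivative of f at x along dx = 1."""
    return f(Dual(x, 1.0)).tangent

print(derivative(lambda x: 3 * x * x + 2 * x + 1, 2.0))  # 14.0
```

Operator overloading is only one way to realize this; source transformation achieves the same propagation at compile time.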
We describe an implementation of the Farfel Fortran77 AD extensions (Radul et al.). These extensions integrate forward and reverse AD directly into the programming model, with attendant benefits to flexibility, modularity, and ease of use. The implementation we describe is a "prepreprocessor" that generates input to existing Fortran-based AD tools. In …
Gaussian Processes (GPs) are widely used tools in statistics, machine learning, robotics, computer vision, and scientific computation. However, despite their popularity, they can be difficult to apply; all but the simplest classification or regression applications require specification and inference over complex covariance functions that do not admit simple …
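For concreteness, a minimal GP regression sketch with a squared-exponential covariance function; the fixed hyperparameters (lengthscale, signal and noise variance) and the NumPy-based layout are illustrative assumptions rather than anything from the paper.

```python
# Minimal GP regression sketch: squared-exponential covariance, fixed
# hyperparameters, zero-mean prior. Illustrative only.
import numpy as np

def sq_exp_kernel(a, b, lengthscale=1.0, signal_var=1.0):
    """Squared-exponential covariance between 1-D arrays of scalar inputs."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return signal_var * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(x_train, y_train, x_test, noise_var=1e-2):
    """Posterior mean and variance of a zero-mean GP at the test inputs."""
    K = sq_exp_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
    K_s = sq_exp_kernel(x_train, x_test)
    K_ss = sq_exp_kernel(x_test, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v**2, axis=0)
    return mean, var

x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = np.sin(x)
xs = np.linspace(-3, 3, 7)
mu, var = gp_posterior(x, y, xs)
print(np.round(mu, 2))
```

Even this toy case shows where the difficulty lies: the kernel choice and its hyperparameters are hard-coded here, whereas realistic applications must specify and infer them.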