Formal languages for probabilistic modeling enable re-use, modularity, and descriptive clarity, and can foster generic inference techniques. We introduce Church, a universal language for describing stochastic generative processes. Church is based on the Lisp model of lambda calculus, containing a pure Lisp as its deterministic subset. The semantics of …
I introduce a new set of natively probabilistic computing abstractions, including probabilistic generalizations of Boolean circuits, backtracking search, and pure Lisp. I show how these tools let one compactly specify probabilistic generative models, generalize and parallelize widely used sampling algorithms like rejection sampling and Markov chain Monte Carlo …
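The rejection-sampling idiom these probabilistic languages build on can be illustrated with a minimal Python sketch (not Church itself, and not the authors' implementation; `flip`, `model`, and `rejection_query` are illustrative names): run the generative process repeatedly and keep only runs that satisfy the conditioning predicate.

```python
import random

def flip(p=0.5):
    """A weighted coin flip: the basic random primitive."""
    return random.random() < p

def model():
    # A tiny generative process: two independent fair coin flips.
    a = flip(0.5)
    b = flip(0.5)
    return a, b

def rejection_query(model, condition, max_tries=100000):
    """Sample from `model` conditioned on `condition` holding, by
    rerunning the generative process until the condition is satisfied."""
    for _ in range(max_tries):
        sample = model()
        if condition(sample):
            return sample
    raise RuntimeError("condition never satisfied")
```

For example, conditioning on `a or b` and inspecting the frequency of `a` among accepted samples approximates P(a | a or b) = 2/3.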
We investigate the class of computable probability distributions and explore the fundamental limitations of using this class to describe and compute conditional distributions. In addition to proving the existence of noncomputable conditional distributions, and thus ruling out the possibility of generic probabilistic inference algorithms (even inefficient ones) …
Human intelligence is a product of cooperation among many different specialists. Much of this cooperation must be learned, but we do not yet have a mechanism that explains how this might happen for the "high-level" agile cooperation that permeates our daily lives. I propose that the various specialists learn to cooperate by learning to communicate …
The collection and analysis of user data drives improvements in the app and web ecosystems, but comes with risks to privacy. This paper examines discrete distribution estimation under local privacy, a setting wherein service providers can learn the distribution of a categorical statistic of interest without collecting the underlying data. We present new …
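A standard baseline in this local-privacy setting is k-ary randomized response: each user perturbs their own category before reporting, and the provider debiases the aggregate counts. The paper's own mechanisms may differ; this is a minimal sketch of the idea, with illustrative function names.

```python
import math
import random

def perturb(value, k, epsilon):
    """k-ary randomized response: report the true category (0..k-1)
    with probability p, otherwise one of the other k-1 categories
    uniformly; p is set from the privacy parameter epsilon."""
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if random.random() < p:
        return value
    other = random.randrange(k - 1)
    return other if other < value else other + 1  # skip the true value

def estimate_counts(reports, k, epsilon):
    """Unbiased estimate of the true per-category counts from the
    perturbed reports: invert the known perturbation probabilities."""
    n = len(reports)
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    q = (1 - p) / (k - 1)
    raw = [reports.count(c) for c in range(k)]
    # E[raw_c] = n_c * p + (n - n_c) * q, so solve for n_c:
    return [(raw[c] - n * q) / (p - q) for c in range(k)]
```

The provider only ever sees the perturbed reports; each individual report carries plausible deniability, while the debiased counts converge to the true distribution as n grows.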
Many online services require some form of trust between users – trust that a seller will deliver goods as advertised, trust that an author's thoughts are worth the time spent on reading them. To accommodate an internet community where users are constantly interacting with strangers, online services often construct proprietary reputation management systems …
Learning of new words is assisted by contextual information. This context can come in several forms, including observations in non-linguistic semantic domains, as well as the linguistic context in which the new word was presented. We outline a general architecture for word learning, in which structural alignment coordinates this contextual information in …
Secure Aggregation protocols allow a collection of mutually distrusting parties, each holding a private value, to collaboratively compute the sum of those values without revealing the values themselves. We consider training a deep neural network in the Federated Learning model, using distributed stochastic gradient descent across user-held training data on …
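The core idea behind such protocols can be sketched with pairwise additive masks that cancel in the sum: each pair of users shares a random mask, one adds it and the other subtracts it, so the server learns only the total. This is a toy illustration, not the paper's protocol, which derives masks via key agreement and uses secret sharing to handle dropouts.

```python
import random

def pairwise_masks(user_ids, modulus, seed=0):
    """For each pair (i, j), draw a shared random mask r; user i adds r
    and user j subtracts it, so all masks cancel in the overall sum.
    (Toy version: a trusted RNG stands in for pairwise key agreement.)"""
    rng = random.Random(seed)
    masks = {u: 0 for u in user_ids}
    for a in range(len(user_ids)):
        for b in range(a + 1, len(user_ids)):
            i, j = user_ids[a], user_ids[b]
            r = rng.randrange(modulus)
            masks[i] = (masks[i] + r) % modulus
            masks[j] = (masks[j] - r) % modulus
    return masks

def masked_report(value, mask, modulus):
    """What each user sends: its private value plus its net mask."""
    return (value + mask) % modulus
```

Individually, each report is uniformly random and reveals nothing about the underlying value; only the modular sum of all reports equals the sum of the private values.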
2007 Sigma Xi Grant: How does the brain compensate for vision loss? (with N. Kanwisher)
2005 Sigma Xi Grant: From retina to awareness: tracking the stages of processing in the visual system (with D. MacLeod)
Workshop organization: "Rational Process Models" at the 31st Annual Cognitive Science Society Conference (2009); "Bounded-rational analyses of human …