Formal languages for probabilistic modeling enable re-use, modularity, and descriptive clarity, and can foster generic inference techniques. We introduce Church, a universal language for describing stochastic generative processes. Church is based on the Lisp model of lambda calculus, containing a pure Lisp as its deterministic subset. The semantics of …
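Church itself is a Lisp; the sketch below only illustrates, in Python, the core idea the abstract describes: a model is an ordinary stochastic procedure, and a generic query (here, naive rejection sampling) can condition any such procedure on an observation. The names `flip`, `coin_model`, and `rejection_query` are illustrative, not Church's actual API.

```python
import random

def flip(p=0.5):
    """Elementary random primitive (Church provides a similar `flip`)."""
    return random.random() < p

def coin_model():
    """A tiny generative process: draw a fair or trick coin, flip it once."""
    fair = flip(0.5)
    heads = flip(0.5 if fair else 0.9)
    return fair, heads

def rejection_query(model, condition, trials=20000):
    """Condition a generative process on an observation by rejection
    sampling -- the simplest inference strategy that works generically
    over any model written as a stochastic procedure."""
    return [s for s in (model() for _ in range(trials)) if condition(s)]

# Posterior P(fair | heads) = (0.5 * 0.5) / (0.5 * 0.5 + 0.5 * 0.9) ≈ 0.357
```

Because the model is just a procedure, the same `rejection_query` works unchanged for any other generative process one writes.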
The collection and analysis of user data drives improvements in the app and web ecosystems, but comes with risks to privacy. This paper examines discrete distribution estimation under local privacy, a setting wherein service providers can learn the distribution of a categorical statistic of interest without collecting the underlying data. We present new …
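The truncated abstract does not specify the paper's mechanisms, but the classic baseline for this setting is k-ary randomized response: each user perturbs their own category before reporting, and the provider inverts the known noise to recover an unbiased estimate of the distribution. A minimal sketch under that assumption:

```python
import math
import random

def randomized_response(true_value, k, epsilon):
    """k-ary randomized response: report the true category with probability
    e^eps / (e^eps + k - 1), otherwise a uniformly chosen other category.
    Each report satisfies epsilon-local differential privacy."""
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if random.random() < p:
        return true_value
    other = random.randrange(k - 1)          # uniform over the k-1 others
    return other if other < true_value else other + 1

def estimate_distribution(reports, k, epsilon):
    """Unbiased frequency estimates: invert the known mixing
    E[count_i / n] = p * f_i + (1 - f_i) * q."""
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    q = 1.0 / (math.exp(epsilon) + k - 1)
    n = len(reports)
    counts = [0] * k
    for r in reports:
        counts[r] += 1
    return [((c / n) - q) / (p - q) for c in counts]
```

The provider never sees raw values, only the perturbed reports, yet the corrected estimates converge to the true frequencies as the number of users grows.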
Learning of new words is assisted by contextual information. This context can come in several forms, including observations in non-linguistic semantic domains, as well as the linguistic context in which the new word was presented. We outline a general architecture for word learning, in which structural alignment coordinates this contextual information in …
Secure Aggregation protocols allow a collection of mutually distrusting parties, each holding a private value, to collaboratively compute the sum of those values without revealing the values themselves. We consider training a deep neural network in the Federated Learning model, using distributed stochastic gradient descent across user-held training data on …
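The core trick behind such protocols is additive masking: each pair of users shares randomness, one adds the pairwise mask and the other subtracts it, so individual reports look random but the masks cancel in the server's sum. A deliberately simplified sketch: the real protocol derives each pairwise mask via key agreement and uses secret sharing to tolerate dropouts, whereas `round_seed` below is a stand-in for that shared randomness.

```python
import random

def masked_update(user_id, value, all_users, modulus, round_seed):
    """One user's contribution to a secure sum: blend the private value
    with pairwise masks that cancel when all users' reports are summed
    modulo `modulus`."""
    masked = value % modulus
    for other in all_users:
        if other == user_id:
            continue
        lo, hi = min(user_id, other), max(user_id, other)
        # Both members of a pair derive the same mask from shared randomness.
        pair_rng = random.Random(f"{round_seed}:{lo}:{hi}")
        mask = pair_rng.randrange(modulus)
        # The lower-id user adds the mask; the higher-id user subtracts it.
        if user_id == lo:
            masked = (masked + mask) % modulus
        else:
            masked = (masked - mask) % modulus
    return masked
```

The server simply sums the masked reports modulo `modulus`: every mask appears once with each sign, so only the true total survives, while each individual report is statistically uninformative on its own.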
This thesis investigates the bidirectional exchange of information between linguistic and non-linguistic semantic inputs containing ambiguities. Such exchange is critical to Cognitively Complete Systems, in which collections of related representations and processes cooperate for their mutual problem-solving benefit. The exchange paradigm of reconciliation …
If we are ever to have intelligent systems, they will need memory. Memory is the core of learning; intelligence is about entering, extracting, and synthesizing its contents. What makes the memory problem difficult is that memory is not a simple collection of facts. The how and why of where those facts were acquired is a key part of how they are internalized …