
- Ben Taskar, Carlos Guestrin, Daphne Koller
- NIPS
- 2003

In typical classification tasks, we seek a function which assigns a label to a single object. Kernel-based approaches, such as support vector machines (SVMs), which maximize the margin of confidence of the classifier, are the method of choice for many such tasks. Their popularity stems both from the ability to use high-dimensional feature spaces, and from…
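The margin-maximization idea this abstract mentions can be illustrated with a plain linear SVM trained by subgradient descent on the regularized hinge loss. This is a generic toy sketch, not the paper's method; the data and hyperparameters are invented for illustration.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """Full-batch subgradient descent on the regularized hinge loss
    lam/2 * ||w||^2 + (1/n) * sum_i max(0, 1 - y_i * (w.x_i + b))."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        active = y * (X @ w + b) < 1          # points violating the margin
        grad_w = lam * w - (y[active, None] * X[active]).sum(axis=0) / n
        grad_b = -y[active].sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Invented, linearly separable toy data.
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = train_linear_svm(X, y)
```

Only points inside the margin contribute to the subgradient, which is what makes the resulting classifier depend on a sparse set of support vectors.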

Given a water distribution network, where should we place sensors to quickly detect contaminants? Or, which blogs should we read to avoid missing important stories?
These seemingly different problems share common structure: Outbreak detection can be modeled as selecting nodes (sensor locations, blogs) in a network, in order to detect the spreading of a…
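The node-selection problem described above is commonly cast as maximizing a coverage-style submodular objective, for which simple greedy selection carries a (1 - 1/e) approximation guarantee. A toy sketch of that greedy step, with an invented coverage instance (the literature in this line of work also uses faster lazy-greedy variants):

```python
def greedy_select(coverage, k):
    """coverage: node -> set of outbreaks/stories that node detects.
    Pick k nodes greedily by marginal gain in total coverage."""
    chosen, covered = [], set()
    for _ in range(k):
        remaining = [v for v in coverage if v not in chosen]
        best = max(remaining, key=lambda v: len(coverage[v] - covered))
        chosen.append(best)
        covered |= coverage[best]
    return chosen, covered

# Invented toy instance: 'a' covers the most alone, but 'c' adds more
# marginal coverage than 'b' once 'a' is placed.
coverage = {'a': {1, 2, 3}, 'b': {3, 4}, 'c': {4, 5}, 'd': {1}}
chosen, covered = greedy_select(coverage, k=2)
```

Note the greedy picks by *marginal* gain, not standalone coverage: 'b' overlaps with what 'a' already detects, so 'c' wins the second slot.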

While high-level data parallel frameworks, like MapReduce, simplify the design and implementation of large-scale data processing systems, they do not naturally or efficiently support many important data mining and machine learning algorithms and can lead to inefficient learning systems. To help fill this critical void, we introduced the GraphLab abstraction…

- Aapo Kyrola, Guy E. Blelloch, Carlos Guestrin
- OSDI
- 2012

Current systems for graph computation require a distributed computing cluster to handle very large real-world problems, such as analysis on social networks or the web graph. While distributed computational resources have become more accessible, developing distributed graph algorithms still remains challenging, especially to non-experts. In this work, we…
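For a flavor of the vertex-centric computations such single-machine graph systems target (this sketch ignores the out-of-core techniques that are the paper's actual contribution), here is a toy in-memory PageRank over an edge list; the graph and parameters are invented.

```python
def pagerank(edges, n, d=0.85, iters=50):
    """Toy PageRank over an edge list (src, dst); assumes every
    vertex has at least one outgoing edge."""
    out_deg = [0] * n
    for s, _ in edges:
        out_deg[s] += 1
    rank = [1.0 / n] * n
    for _ in range(iters):
        contrib = [0.0] * n
        for s, t in edges:               # each vertex scatters rank along out-edges
            contrib[t] += rank[s] / out_deg[s]
        rank = [(1 - d) / n + d * c for c in contrib]
    return rank

# Invented 3-cycle: by symmetry every vertex keeps rank 1/3.
rank = pagerank([(0, 1), (1, 2), (2, 0)], n=3)
```

Real systems in this space stream the edge list from disk in blocks rather than holding it in memory, which is exactly the scaling problem the abstract raises.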

Designing and implementing efficient, provably correct parallel machine learning (ML) algorithms is challenging. Existing high-level parallel abstractions like MapReduce are insufficiently expressive while low-level tools like MPI and Pthreads leave ML experts repeatedly solving the same design challenges. By targeting common patterns in ML, we developed…

- Carlos Guestrin, Daphne Koller, Ronald Parr, Shobha Venkataraman
- J. Artif. Intell. Res.
- 2003

This paper addresses the problem of planning under uncertainty in large Markov Decision Processes (MDPs). Factored MDPs represent a complex state space using state variables and the transition model using a dynamic Bayesian network. This representation often allows an exponential reduction in the representation size of structured MDPs, but the complexity of…
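The exponential saving the abstract refers to comes from writing the transition model as a product of small per-variable conditionals instead of one table over all 2^n states. A minimal sketch of that factorization, with an invented two-variable example:

```python
def transition_prob(s, s_next, a, cpds):
    """P(s' | s, a) as a product of per-variable conditionals, the
    factorization a dynamic Bayesian network encodes.  Each cpds[i]
    returns P(X_i' = 1 | s, a) and may look only at X_i's parents."""
    p = 1.0
    for i, cpd in enumerate(cpds):
        p1 = cpd(s, a)
        p *= p1 if s_next[i] == 1 else 1.0 - p1
    return p

# Invented example: X_0' depends only on the action, X_1' only on the
# current X_0 -- two small tables instead of a full 4x4 joint matrix.
cpds = [
    lambda s, a: 0.9 if a == 1 else 0.1,
    lambda s, a: 0.8 if s[0] == 1 else 0.2,
]
p = transition_prob((1, 1), (1, 1), 1, cpds)   # 0.9 * 0.8
```

Each conditional here has a constant number of parents, so storage grows linearly in the number of state variables rather than exponentially.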

Declarative queries are proving to be an attractive paradigm for interacting with networks of wireless sensors. The metaphor that "the sensornet is a database" is problematic, however, because sensors do not exhaustively represent the data in the real world. In order to map the raw sensor readings onto physical reality, a model of that reality is required…

- Andreas Krause, Ajit Paul Singh, Carlos Guestrin
- Journal of Machine Learning Research
- 2008

When monitoring spatial phenomena, which can often be modeled as Gaussian processes (GPs), choosing sensor locations is a fundamental task. There are several common strategies to address this task, for example, geometry or disk models, placing sensors at the points of highest entropy (variance) in the GP model, and A-, D-, or E-optimal design. In this paper,…
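The "highest entropy (variance)" strategy the abstract mentions can be sketched as repeatedly placing a sensor at the candidate point with largest GP posterior variance. A hedged toy sketch; the RBF kernel choice, noise level, and 1-D candidate grid are all invented for illustration.

```python
import numpy as np

def rbf(a, b, ls=1.0):
    """Squared-exponential kernel between row-stacked points."""
    d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2 * ls ** 2))

def greedy_variance_placement(candidates, k, noise=1e-6):
    """Greedily place k sensors at the candidate of highest GP
    posterior variance (the 'entropy' strategy)."""
    chosen = []
    for _ in range(k):
        if not chosen:
            var = np.ones(len(candidates))    # prior variance of an RBF GP
        else:
            A = candidates[chosen]
            K_AA = rbf(A, A) + noise * np.eye(len(chosen))
            K_xA = rbf(candidates, A)
            # posterior variance: k(x,x) - k(x,A) K_AA^{-1} k(A,x)
            var = 1.0 - np.einsum('ij,jk,ik->i', K_xA, np.linalg.inv(K_AA), K_xA)
        var[chosen] = -np.inf                 # never re-pick a placed sensor
        chosen.append(int(np.argmax(var)))
    return chosen

# Invented 1-D candidate grid; the greedy spreads sensors apart,
# since variance grows with distance from already-placed sensors.
candidates = np.linspace(0.0, 4.0, 5).reshape(-1, 1)
chosen = greedy_variance_placement(candidates, k=2)
```

A known weakness of this pure-entropy criterion, which motivates alternatives in this line of work, is that it pushes sensors toward the boundary of the space where uncertainty is highest but observations are least informative about the rest of the field.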

We consider large margin estimation in a broad range of prediction models where inference involves solving combinatorial optimization problems, for example, weighted graph-cuts or matchings. Our goal is to learn parameters such that inference using the model reproduces correct answers on the training data. Our method relies on the expressive power of convex…
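The abstract's goal, learning parameters so that argmax inference reproduces the training answers, can be illustrated with a structured-perceptron update. This is a simpler stand-in than the paper's convex max-margin formulation, and the two-structure example is invented.

```python
import numpy as np

def structured_perceptron(examples, n_feats, epochs=10):
    """examples: (candidates, feats, gold) triples, where feats maps
    each candidate structure to a feature vector.  When argmax
    inference picks the wrong structure, shift w toward the gold
    structure's features and away from the prediction's."""
    w = np.zeros(n_feats)
    for _ in range(epochs):
        for candidates, feats, gold in examples:
            pred = max(candidates, key=lambda s: w @ feats[s])
            if pred != gold:
                w += feats[gold] - feats[pred]
    return w

# Invented example with two candidate structures.
feats = {'a': np.array([1.0, 0.0]), 'b': np.array([0.0, 1.0])}
examples = [(['a', 'b'], feats, 'b')]
w = structured_perceptron(examples, n_feats=2)
```

In realistic settings the candidate set is exponentially large (all matchings, all cuts), so the `max` is replaced by a combinatorial solver, which is precisely the inference problem the abstract highlights.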

- Joseph K. Bradley, Aapo Kyrola, Danny Bickson, Carlos Guestrin
- ICML
- 2011

We propose Shotgun, a parallel coordinate descent algorithm for minimizing L1-regularized losses. Though coordinate descent seems inherently sequential, we prove convergence bounds for Shotgun which predict linear speedups, up to a problem-dependent limit. We present a comprehensive empirical study of Shotgun for Lasso and sparse logistic regression. Our…
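For context, here is the sequential cyclic coordinate-descent baseline for the Lasso objective; Shotgun's contribution is running updates like these in parallel across coordinates. The soft-thresholding update is standard, but the toy data is invented.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * max(abs(z) - t, 0.0)

def lasso_cd(X, y, lam, iters=100):
    """Cyclic coordinate descent for 0.5*||y - Xw||^2 + lam*||w||_1.
    Each coordinate update is a 1-D problem solved in closed form."""
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(iters):
        for j in range(d):
            r = y - X @ w + X[:, j] * w[j]   # residual with coordinate j removed
            w[j] = soft_threshold(X[:, j] @ r, lam) / col_sq[j]
    return w

# Invented toy problem with orthonormal columns, where the exact
# solution is simply soft-thresholding each entry of y.
X = np.eye(3)
y = np.array([3.0, 0.5, -2.0])
w = lasso_cd(X, y, lam=1.0)
```

The problem-dependent speedup limit the abstract alludes to comes from correlation between columns of X: the more correlated the features, the more parallel coordinate updates interfere with one another.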