
This paper is concerned with model reduction for complex Markov chain models. The Kullback–Leibler divergence rate is employed as a metric to measure the difference between the Markov model and its approximation. For a certain relaxation of the bi-partition model reduction problem, the solution is shown to be characterized by an associated eigenvalue…
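The metric named in this abstract, the Kullback–Leibler divergence rate between a stationary Markov chain P and an approximation Q, is R(P‖Q) = Σᵢ πᵢ Σⱼ Pᵢⱼ log(Pᵢⱼ/Qᵢⱼ), with π the stationary distribution of P. A minimal sketch of that formula (the matrices P and Q below are illustrative, not from the paper):

```python
import numpy as np

def stationary_dist(P):
    """Stationary distribution of row-stochastic P (left eigenvector for eigenvalue 1)."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return pi / pi.sum()

def kl_divergence_rate(P, Q):
    """R(P || Q) = sum_i pi_i sum_j P_ij log(P_ij / Q_ij), pi stationary for P.
    Assumes Q_ij > 0 wherever P_ij > 0."""
    pi = stationary_dist(P)
    ratio = np.where(P > 0, P / np.where(P > 0, Q, 1.0), 1.0)  # log(1) = 0 where P_ij = 0
    return float(np.sum(pi[:, None] * P * np.log(ratio)))

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
Q = np.array([[0.8, 0.2],
              [0.3, 0.7]])
print(kl_divergence_rate(P, P))  # 0.0 -- a chain is at zero divergence rate from itself
print(kl_divergence_rate(P, Q))  # positive for any mismatched approximation
```

The divergence rate weights each row's KL divergence by how often the chain visits that state, which is why it is a natural distance for aggregation-based model reduction.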

Active machine learning algorithms are used when large numbers of unlabeled examples are available and getting labels for them is costly (e.g. requiring consulting a human expert). Many conventional active learning algorithms focus on refining the decision boundary, at the expense of exploring new regions that the current hypothesis misclassifies. We…

Significant changes in the instance distribution or associated cost function of a learning problem require one to reoptimize a previously learned classifier to work under new conditions. We study the problem of reoptimizing a multi-class classifier based on its ROC hypersurface and a matrix describing the costs of each type of prediction error. For a binary…
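In the binary special case, cost-based reoptimization reduces to moving the decision threshold: with false-positive cost c_fp and false-negative cost c_fn, predicting positive minimizes expected cost exactly when the posterior exceeds c_fp / (c_fp + c_fn). A minimal sketch of that threshold rule (the cost values and posteriors are illustrative, not from the paper):

```python
def cost_optimal_threshold(c_fp, c_fn):
    """Posterior threshold minimizing expected cost for a binary classifier:
    predict positive iff p(y=1|x) > c_fp / (c_fp + c_fn)."""
    return c_fp / (c_fp + c_fn)

def reoptimize(posteriors, c_fp, c_fn):
    """Re-threshold an already-trained classifier's posteriors under new costs."""
    t = cost_optimal_threshold(c_fp, c_fn)
    return [1 if p > t else 0 for p in posteriors]

# False negatives 4x as costly as false positives: the threshold drops to 0.2,
# so the classifier becomes more willing to predict positive.
print(cost_optimal_threshold(1.0, 4.0))      # 0.2
print(reoptimize([0.1, 0.3, 0.6], 1.0, 4.0)) # [0, 1, 1]
```

The multi-class problem the abstract studies is harder because the operating points live on an ROC hypersurface rather than a curve, but the binary threshold above is the degenerate case.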

In active learning, a machine learning algorithm is given an unlabeled set of examples U, and is allowed to request labels for a relatively small subset of U to use for training. The goal is then to judiciously choose which examples in U to have labeled in order to optimize some performance criterion, e.g. classification accuracy. We study how active…
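A common baseline for choosing which example in U to query is uncertainty sampling: request the label of the example whose most-likely class has the lowest posterior probability. A minimal sketch under that assumption (the posteriors below are hypothetical; the paper's own query strategy may differ):

```python
import numpy as np

def least_confident(probs):
    """Index of the unlabeled example whose most-likely class probability is lowest."""
    return int(np.argmin(probs.max(axis=1)))

# Hypothetical posteriors for four unlabeled examples over two classes.
U_probs = np.array([[0.95, 0.05],
                    [0.55, 0.45],   # closest to the decision boundary
                    [0.80, 0.20],
                    [0.70, 0.30]])
print(least_confident(U_probs))  # 1
```

Each round, the chosen example is labeled, moved into the training set, and the model is retrained before the next query.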

In active learning, where a learning algorithm has to purchase the labels of its training examples, it is often assumed that there is only one labeler available to label examples, and that this labeler is noise-free. In reality, it is possible that there are multiple labelers available (such as human labelers in the online annotation tool Amazon Mechanical…
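With several noisy labelers, one simple strategy is repeated labeling: buy several labels for the same example and take a majority vote. A minimal simulation of that idea, with hypothetical per-labeler noise rates (this is a generic baseline, not necessarily the strategy the paper proposes):

```python
import random
from collections import Counter

def majority_label(labelers, x, k):
    """Query k randomly chosen labelers on example x and return the majority vote."""
    votes = [random.choice(labelers)(x) for _ in range(k)]
    return Counter(votes).most_common(1)[0][0]

random.seed(0)
true_label = 1

def make_labeler(flip_p):
    """Labeler that flips the true label with probability flip_p (hypothetical noise model)."""
    return lambda x: true_label if random.random() > flip_p else 1 - true_label

labelers = [make_labeler(0.1), make_labeler(0.3), make_labeler(0.45)]
votes = [majority_label(labelers, None, 9) for _ in range(100)]
accuracy = sum(v == true_label for v in votes) / 100
print(accuracy)  # the aggregated vote is far more reliable than the noisiest labeler
```

The interesting budget question, which the abstract raises, is when repeated labels for old examples are worth more than first labels for new ones.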

This paper is concerned with an information-theoretic framework to aggregate a large-scale Markov chain to obtain a reduced-order Markov model. The Kullback–Leibler (K-L) divergence rate is employed as a metric to measure the distance between two stationary Markov chains. Model reduction is obtained by considering an optimization problem w.r.t. this…

- Joseph S Niedbalski, Kun Deng, Prashant G Mehta, Sean Meyn, 2008

This paper is concerned with model reduction for a complex Markov chain using state aggregation. The work is motivated in part by the need for reduced-order estimation of occupancy in a building during evacuation. We propose and compare two distinct model reduction techniques, each of which is based on the potential matrix for the Markov semigroup. The…

We explore the problem of budgeted machine learning, in which the learning algorithm has free access to the training examples' labels but has to pay for each attribute that is specified. This learning model is appropriate in many areas, including medical applications. We present new algorithms for choosing which attributes to purchase of which examples in…
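The setting above can be made concrete with a simple round-robin baseline: cycle over attributes and spend one unit of budget per purchase on a not-yet-purchased (example, attribute) cell. This is only an illustrative baseline; the paper's algorithms choose purchases adaptively:

```python
import random

def round_robin_purchase(n_examples, n_attrs, budget):
    """Spend a fixed budget of attribute purchases by cycling over attributes and
    revealing that attribute for a random example whose value is still unknown."""
    purchased = set()  # set of revealed (example_index, attribute_index) cells
    attr = 0
    while budget > 0 and len(purchased) < n_examples * n_attrs:
        candidates = [i for i in range(n_examples) if (i, attr) not in purchased]
        if candidates:
            purchased.add((random.choice(candidates), attr))
            budget -= 1
        attr = (attr + 1) % n_attrs
    return purchased

random.seed(1)
bought = round_robin_purchase(n_examples=5, n_attrs=3, budget=6)
print(len(bought))  # 6 -- exactly the budget is spent
```

A learner would then train on the partially observed attribute matrix, treating unpurchased cells as missing values.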

This paper is concerned with modeling, analysis and optimization/control of occupancy evolution in a large building. The main concern is efficient evacuation of a building in the event of threat or emergency. Complexity arises from the curse of dimensionality in a large building, as well as the uncertain and nonlinear dynamics of individuals. In this…