We make an analogy between images and statistical mechanics systems. Pixel gray levels and the presence and orientation of edges are viewed as states of atoms or molecules in a lattice-like physical system. The assignment of an energy function in the physical system determines its Gibbs distribution. Because of the equivalence between the Gibbs distribution and Markov random fields, …
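The energy-to-Gibbs-distribution correspondence can be made concrete with local sampling. The Python sketch below is a minimal illustration only: a single Gibbs-sampling sweep over a binary lattice with an Ising-style smoothness prior and a simple noisy-observation term. The binary states, the parameters beta and lam, and the function name are assumptions for the example, not the gray-level model or annealing schedule of the work itself.

    import numpy as np

    def gibbs_sweep(x, y, beta=1.0, lam=2.0, rng=None):
        """One Gibbs-sampling sweep over a binary image x in {-1,+1},
        given a noisy observation y in {-1,+1} on the same lattice."""
        rng = rng or np.random.default_rng(0)
        h, w = x.shape
        for i in range(h):
            for j in range(w):
                # Interaction with the 4-neighborhood (the lattice term of the energy).
                nb = 0
                if i > 0:     nb += x[i - 1, j]
                if i < h - 1: nb += x[i + 1, j]
                if j > 0:     nb += x[i, j - 1]
                if j < w - 1: nb += x[i, j + 1]
                # Energy E(x) = -beta * sum_<s,t> x_s x_t - lam * sum_s x_s y_s,
                # so the local conditional is a logistic function of the neighborhood.
                delta = 2.0 * (beta * nb + lam * y[i, j])
                p_plus = 1.0 / (1.0 + np.exp(-delta))
                x[i, j] = 1 if rng.random() < p_plus else -1
        return x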
We explore a new approach to shape recognition based on a virtually infinite family of binary features ("queries") of the image data, designed to accommodate prior information about shape invariance and regularity. Each query corresponds to a spatial arrangement of several local topographic codes ("tags") which are in themselves too primitive and common to …
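To make the notion of a query concrete, the toy function below answers one such binary question: whether a tag of one type appears in a loose spatial relation to a tag of another type. The tag types, the offset ranges, and the function name are hypothetical illustrations, not details taken from the paper.

    def arrangement_query(tags_a, tags_b, dx_range=(2, 20), dy_range=(2, 20)):
        """Binary feature: does any tag of type B lie below and to the right of
        some tag of type A, within a loose range of offsets?
        tags_a, tags_b: lists of (x, y) image locations of the two tag types."""
        for (xa, ya) in tags_a:
            for (xb, yb) in tags_b:
                if dx_range[0] <= xb - xa <= dx_range[1] and dy_range[0] <= yb - ya <= dy_range[1]:
                    return 1  # the spatial arrangement is present in the image
        return 0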
We present a new approach for tracking roads from satellite images, and thereby illustrate a general computational strategy ("active testing") for tracking 1D structures and other recognition tasks in computer vision. Our approach is related to recent work in active vision on "where to look next" and motivated by the "divide-and-conquer" strategy of parlor games …
One popular method for the recovery of an ideal intensity image from corrupted or indirect measurements is regularization: minimize an objective function that enforces a roughness penalty in addition to coherence with the data. Linear estimates are relatively easy to compute but generally introduce systematic errors; for example, they are incapable of …
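As a point of reference for the linear estimates mentioned above, the sketch below shows the simplest quadratic-penalty (Tikhonov-style) regularizer in one dimension. The penalty weight, the first-difference roughness operator, and the 1-D setting are illustrative assumptions; this is the kind of linear estimator whose systematic errors the passage points out, not the method proposed in the work.

    import numpy as np

    def tikhonov_denoise(y, alpha=5.0):
        """Linear estimate minimizing ||x - y||^2 + alpha * ||D x||^2,
        where D is the first-difference operator (the roughness penalty)."""
        n = len(y)
        D = np.diff(np.eye(n), axis=0)      # (n-1) x n first-difference matrix
        A = np.eye(n) + alpha * D.T @ D     # normal equations of the quadratic objective
        return np.linalg.solve(A, y)

    # Example: recover a smooth 1-D signal from noisy samples.
    rng = np.random.default_rng(1)
    y = np.sin(np.linspace(0, 3, 100)) + 0.3 * rng.standard_normal(100)
    x_hat = tikhonov_denoise(y)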
MOTIVATION: Various studies have shown that cancer tissue samples can be successfully detected and classified by their gene expression patterns using machine learning approaches. One of the challenges in applying these techniques for classifying gene expression data is to extract accurate, readily interpretable rules providing biological insight as to how …
High-throughput technologies are widely used, for example to assay genetic variants, gene and protein expression, and epigenetic modifications. One often overlooked complication with such studies is batch effects, which occur because measurements are affected by laboratory conditions, reagent lots and personnel differences. This becomes a major problem when …
We use a statistical framework for finding boundaries and for partitioning scenes into homogeneous regions. The model is a joint probability distribution for the array of pixel gray levels and an array of "labels." In boundary finding, the labels are binary, zero or one, representing the absence or presence of boundary elements. In partitioning, the …
We present a new approach to molecular classification based on mRNA comparisons. Our method, referred to as the top-scoring pair(s) (TSP) classifier, is motivated by current technical and practical limitations in using gene expression microarray data for class prediction, for example to detect disease, identify tumors, or predict treatment response. Accurate …
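A compact sketch of the pairwise-comparison idea behind a top-scoring-pair rule follows: score every gene pair by how differently the two classes order the pair's expression values, then classify a new profile from that single comparison. The function names are invented and the scoring and tie handling are simplified assumptions; the published classifier also handles ties and multiple top pairs.

    import numpy as np
    from itertools import combinations

    def fit_tsp(X, y):
        """X: samples x genes expression matrix; y: binary labels (0/1).
        Returns (score, i, j, class0_favors_lt) for the top-scoring pair."""
        X0, X1 = X[y == 0], X[y == 1]
        best = None
        for i, j in combinations(range(X.shape[1]), 2):
            p0 = np.mean(X0[:, i] < X0[:, j])   # P(gene i < gene j | class 0)
            p1 = np.mean(X1[:, i] < X1[:, j])   # P(gene i < gene j | class 1)
            score = abs(p0 - p1)
            if best is None or score > best[0]:
                best = (score, i, j, p0 > p1)   # remember which class favors "i < j"
        return best

    def predict_tsp(rule, x_new):
        _, i, j, class0_favors_lt = rule
        return 0 if (x_new[i] < x_new[j]) == class0_favors_lt else 1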
We explore the theoretical foundations of a "twenty questions" approach to pattern recognition. The object of the analysis is the computational process itself rather than probability distributions (Bayesian inference) or decision boundaries (statistical learning). Our formulation is motivated by applications to scene interpretation in which there are a …
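One familiar way to read a "twenty questions" strategy is greedy question selection by expected information gain; the sketch below illustrates that reading on a finite hypothesis set. The dictionary representation, the function names, and the purely greedy rule are assumptions made for illustration, not the sequential analysis carried out in the work.

    import math

    def entropy(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def next_query(posterior, queries):
        """posterior: dict hypothesis -> probability (sums to 1).
        queries: dict name -> set of hypotheses for which the answer is 'yes'.
        Returns the query with the largest expected reduction in entropy."""
        h0 = entropy(posterior.values())
        best_name, best_gain = None, -1.0
        for name, yes_set in queries.items():
            p_yes = sum(p for h, p in posterior.items() if h in yes_set)
            if p_yes <= 0.0 or p_yes >= 1.0:
                continue  # the answer is already determined, so the query is uninformative
            h_yes = entropy([p / p_yes for h, p in posterior.items() if h in yes_set])
            h_no = entropy([p / (1 - p_yes) for h, p in posterior.items() if h not in yes_set])
            gain = h0 - (p_yes * h_yes + (1 - p_yes) * h_no)
            if gain > best_gain:
                best_name, best_gain = name, gain
        return best_name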