Shape Quantization and Recognition with Randomized Trees

@article{Amit1997ShapeQA,
  title={Shape Quantization and Recognition with Randomized Trees},
  author={Yali Amit and Donald Geman},
  journal={Neural Computation},
  year={1997},
  volume={9},
  pages={1545-1588}
}
  • Y. Amit, D. Geman
  • Published 1 October 1997
  • Computer Science
  • Neural Computation
We explore a new approach to shape recognition based on a virtually infinite family of binary features (queries) of the image data, designed to accommodate prior information about shape invariance and regularity. Each query corresponds to a spatial arrangement of several local topographic codes (or tags), which are in themselves too primitive and common to be informative about shape. All the discriminating power derives from relative angles and distances among the tags. The important attributes… 
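As a rough illustration of such a query (a sketch, not the authors' implementation), a binary feature can be written as a predicate over pairs of tag locations, answering whether two tags occur at a relative distance and angle within given ranges; the tag names, ranges, and tolerance below are hypothetical:

```python
import math

def binary_query(tags, tag_a, tag_b, min_dist, max_dist, angle, tol):
    """Return 1 if some instance of tag_a and some instance of tag_b
    occur at a relative distance in [min_dist, max_dist] and a relative
    angle within tol degrees of `angle`, else 0.

    `tags` maps a tag label to a list of (x, y) image locations.
    """
    for (xa, ya) in tags.get(tag_a, []):
        for (xb, yb) in tags.get(tag_b, []):
            d = math.hypot(xb - xa, yb - ya)
            if not (min_dist <= d <= max_dist):
                continue
            theta = math.degrees(math.atan2(yb - ya, xb - xa))
            # Wrap the angular difference into (-180, 180] before comparing.
            if abs((theta - angle + 180) % 360 - 180) <= tol:
                return 1
    return 0

# Hypothetical example: an "end" tag diagonally offset from a "junction" tag.
tags = {"junction": [(10, 10)], "end": [(18, 18)]}
binary_query(tags, "junction", "end", 5, 15, 45, 20)  # -> 1
```

Note that the query depends only on relative geometry, not absolute position, which is what gives features of this kind their invariance to translation.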
Joint Induction of Shape Features and Tree Classifiers
TLDR
A very large family of binary features for two-dimensional shapes determined by inductive learning during the construction of classification trees is introduced, which makes it possible to narrow the search for informative ones at each node of the tree.
Image Classification using Random Forests and Ferns
TLDR
It is shown that selecting the ROI adds about 5% to the performance and, together with the other improvements, the result is about a 10% improvement over the state of the art for Caltech-256.
Coarse-to-Fine Visual Selection
TLDR
The approach is sequential testing which is coarse-to-fine in both the exploration of poses and the representation of objects; the spatial distribution of processing is highly skewed and detection is rapid, but at the expense of false alarms which, presumably, could be eliminated with localized, more intensive, processing.
Coarse-to-Fine Face Detection
TLDR
The approach is sequential testing which is coarse-to-fine in both the exploration of poses and the representation of objects; the spatial distribution of processing is highly skewed and detection is rapid, but at the expense of (isolated) false alarms which could be eliminated with localized, more intensive, processing.
A Framework for Learning Visual Discrimination
TLDR
A method is presented that allows incremental learning of discriminative features in a feature space of juxtapositions of oriented local edge pieces, parameterized by the relative angles and distances between the edgels.
A Graph Lattice Approach to Maintaining and Learning Dense Collections of Subgraphs as Image Features
  • E. Saund
  • Computer Science
    IEEE Transactions on Pattern Analysis and Machine Intelligence
  • 2013
TLDR
This paper shows how large families of complex image features in the form of subgraphs can be built out of simpler ones through construction of a graph lattice - a hierarchy of related subgraphs linked in a lattice.
Distance sets for shape filters and shape recognition
TLDR
This work addresses two problems that are often encountered in object recognition: object segmentation, for which a distance sets shape filter is formulated, and shape matching, which is illustrated on printed and handwritten character recognition and detection of traffic signs in complex scenes.
A Computational Model for Visual Selection
TLDR
The model was not conceived to explain brain functions, but it does cohere with evidence about the functions of neurons in V1 and V2, such as responses to coarse or incomplete patterns and to scale and translation invariance in IT.
Graded Learning for Object Detection
TLDR
The goal is to detect all instances of a generic object class, such as a face, in greyscale scenes by learning a hierarchy of spatial arrangements of edge fragments, graded by their size (sparsity).
A Memory Efficient Discriminative Approach for Location Aided Recognition
TLDR
This paper investigates a new approach to mobile visual recognition that would involve uploading only GPS coordinates to a server, following which a compact location specific classifier would be downloaded to the client and recognition would be computed completely on the client.

References

Showing 1-10 of 79 references
Joint Induction of Shape Features and Tree Classifiers
TLDR
A very large family of binary features for two-dimensional shapes determined by inductive learning during the construction of classification trees is introduced, which makes it possible to narrow the search for informative ones at each node of the tree.
A Computational Model for Visual Selection
TLDR
The model was not conceived to explain brain functions, but it does cohere with evidence about the functions of neurons in V1 and V2, such as responses to coarse or incomplete patterns and to scale and translation invariance in IT.
Memory-based character recognition using a transformation invariant metric
  • P. Simard, Yann LeCun, J. Denker
  • Computer Science
    Proceedings of the 12th IAPR International Conference on Pattern Recognition, Vol. 3 - Conference C: Signal Processing (Cat. No.94CH3440-5)
  • 1994
TLDR
A new distance measure which can be made locally invariant to any set of transformations of the input; and can be computed efficiently is proposed.
Shape and Texture Recognition by a Neural Network
Solving Multiclass Learning Problems via Error-Correcting Output Codes
TLDR
It is demonstrated that error-correcting output codes provide a general-purpose method for improving the performance of inductive learning programs on multiclass problems.
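The error-correcting output codes idea above can be sketched in a few lines: each class is assigned a binary codeword, one binary classifier is trained per codeword bit, and a test example is assigned to the class whose codeword is nearest in Hamming distance to the predicted bits. The 4-class, 6-bit code matrix below is illustrative only, not taken from the paper:

```python
# Illustrative code matrix: each row is a class codeword, each column
# the target labels for one binary subproblem (minimum distance 4 here,
# so any single bit error is corrected).
CODEWORDS = {
    "A": [0, 0, 1, 1, 0, 1],
    "B": [0, 1, 0, 1, 1, 0],
    "C": [1, 0, 0, 0, 1, 1],
    "D": [1, 1, 1, 0, 0, 0],
}

def decode(bit_predictions):
    """Return the class whose codeword is nearest in Hamming distance,
    so a few erring binary classifiers can still yield the right class."""
    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))
    return min(CODEWORDS, key=lambda c: hamming(CODEWORDS[c], bit_predictions))

# The fifth binary classifier errs (0 instead of 1), but "B" still wins.
decode([0, 1, 0, 1, 0, 0])  # -> "B"
```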
Invariant Descriptors for 3D Object Recognition and Pose
TLDR
A model-based vision system that recognizes curved plane objects irrespective of their pose is demonstrated and the stability of a range of invariant descriptors to measurement error is treated in detail.
What Size Net Gives Valid Generalization?
TLDR
It is shown that if m ≥ O((W/ε) log(N/ε)) random examples can be loaded on a feedforward network of linear threshold functions with N nodes and W weights, so that at least a fraction 1 − ε/2 of the examples are correctly classified, then one has confidence approaching certainty that the network will correctly classify a fraction 1 − ε of future test examples drawn from the same distribution.