We present an automatic approach to tree annotation in which basic nonterminal symbols are alternately split and merged to maximize the likelihood of a training treebank. Starting with a simple X-bar grammar, we learn a new grammar whose nonterminals are subsymbols of the original nonterminals. In contrast with previous work, we are able to split various …
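The splitting step described above can be sketched in a few lines. The fragment below is a minimal illustration, not the paper's implementation: it splits every nonterminal of a toy PCFG into two subsymbols, copies each rule to all subsymbol combinations, and adds a small random perturbation so that EM (not shown) could later differentiate the copies. The grammar encoding and the helper name `split_rules` are assumptions for the sketch.

```python
import random

def split_rules(rules, eps=0.01, seed=0):
    """Split every nonterminal X into X_0 and X_1.

    `rules` maps a parent symbol to {rhs_tuple: probability}; symbols that
    appear as keys are nonterminals, everything else is a terminal. Each rule
    is copied to all combinations of child subsymbols, with small random noise
    to break symmetry (without it, EM could never tell the copies apart)."""
    rng = random.Random(seed)
    nonterms = set(rules)

    def subs(sym):
        return [f"{sym}_0", f"{sym}_1"] if sym in nonterms else [sym]

    new_rules = {}
    for parent, prods in rules.items():
        for p in subs(parent):
            dist = {}
            for rhs, prob in prods.items():
                # enumerate every combination of child subsymbols
                combos = [()]
                for sym in rhs:
                    combos = [c + (t,) for c in combos for t in subs(sym)]
                for c in combos:
                    dist[c] = prob / len(combos) * (1 + rng.uniform(-eps, eps))
            # renormalize so each parent's rules still sum to one
            z = sum(dist.values())
            new_rules[p] = {rhs: q / z for rhs, q in dist.items()}
    return new_rules

# toy X-bar-style grammar with three nonterminals
toy = {"S": {("NP", "VP"): 1.0},
       "NP": {("she",): 1.0},
       "VP": {("sleeps",): 1.0}}
split = split_rules(toy)
```

After one split, `S` becomes `S_0` and `S_1`, each with four slightly perturbed versions of the original `S -> NP VP` rule; the merge step of the full algorithm would later undo splits that do not improve treebank likelihood.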
Linear perspective is a good approximation to the format in which the human visual system conveys 3D scene information to the brain. Artists expressing 3D scenes, however, create nonlinear projections that balance their linear perspective view of a scene with elements of aesthetic style, layout and relative importance of scene objects. Manipulating the many …
To perform automatic, unconscious inference, the human brain must solve the binding problem by correctly grouping properties with objects. Temporal binding models like SHRUTI already suggest much of how this might be done in a connectionist and localist way by using temporal synchrony. We propose a set of alternatives to temporal synchrony mechanisms that …
While most work on parsing with PCFGs has focused on local correlations between tree configurations, we attempt to model non-local correlations using a finite mixture of PCFGs. A mixture grammar fit with the EM algorithm shows improvement over a single PCFG, both in parsing accuracy and in test data likelihood. We argue that this improvement comes from the …
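The EM fit mentioned above can be sketched for the simplest part of the model, the mixture weights. In the sketch below, the per-sentence likelihoods under each component grammar are stand-in numbers (in the full algorithm they would come from the inside algorithm, and the M-step would also re-estimate each component's rule probabilities, which this fragment omits); the function name `em_mixture_weights` is an assumption.

```python
def em_mixture_weights(liks, iters=50):
    """EM for the weights of a finite mixture of PCFGs.

    `liks[i][k]` is the likelihood of sentence i under component grammar k.
    Returns the converged mixture weights."""
    K = len(liks[0])
    w = [1.0 / K] * K  # start from a uniform mixture
    for _ in range(iters):
        counts = [0.0] * K
        for row in liks:
            # E-step: posterior responsibility of each component for the sentence
            joint = [w[k] * row[k] for k in range(K)]
            z = sum(joint)
            for k in range(K):
                counts[k] += joint[k] / z
        # M-step: new weights proportional to expected counts
        w = [c / len(liks) for c in counts]
    return w

# hypothetical sentence likelihoods under two component grammars
liks = [[0.9, 0.1], [0.8, 0.2], [0.2, 0.7]]
weights = em_mixture_weights(liks)
```

Since two of the three hypothetical sentences are better explained by the first grammar, the learned weight of component 0 ends up larger than that of component 1, illustrating how the mixture can specialize components to different subsets of the data.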
Artificial Intelligence has had disappointments before, but promising new efforts are now arising again. In the early days of computers, it was believed that they would soon do everything that a human could. Researchers promised great breakthroughs, such as fluent natural language translation, that have still not occurred. In part, this failure …