Learn More
Pollack (1991) demonstrated that second-order recurrent neural networks can act as dynamical recognizers for formal languages when trained on positive and negative examples, and observed both phase transitions in learning and IFS-like fractal state sets. Follow-on work focused mainly on the extraction and minimization of a finite state automaton (FSA) from …
Recurrent neural network processing of regular languages is reasonably well understood. Recent work has examined the less familiar question of context-free languages. Previous results regarding the language a^n b^n suggest that while it is possible for a small recurrent network to process context-free languages, learning them is difficult. This paper …
In recent years it has been shown that first-order recurrent neural networks trained by gradient descent can learn not only regular but also simple context-free and context-sensitive languages. However, the success rate was generally low and severe instability issues were encountered. The present study examines the hypothesis that a combination of …
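Why is a^n b^n within reach of a *small* recurrent network at all? A single counter suffices to decide membership, and trained networks typically discover a counting solution rather than a stack. The sketch below (my own illustrative code, not from the papers above) makes the counting argument explicit.

```python
def recognize_anbn(s: str) -> bool:
    """Decide membership in { a^n b^n : n >= 1 } with one counter.

    Mirrors the counting strategy small recurrent nets tend to learn
    for this language: increment on 'a', decrement on 'b', reject any
    'a' after a 'b', and accept iff the counter returns to zero.
    """
    count = 0
    seen_b = False
    for ch in s:
        if ch == 'a':
            if seen_b:
                return False  # an 'a' after a 'b' breaks a^n b^n order
            count += 1
        elif ch == 'b':
            seen_b = True
            count -= 1
            if count < 0:
                return False  # more b's than a's so far
        else:
            return False  # symbol outside the alphabet {a, b}
    return seen_b and count == 0
```

Because only the counter's value matters, not a full stack, the state space a network needs is one unbounded dimension; this is also why generalization to longer strings is the standard test of whether a trained net found the counting solution or merely memorized training lengths.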