Charles P. Dolan

TTS-MUC3 incorporates semi-automated lexicon generation and almost fully automated phrase pattern generation. Associative retrieval from a case memory provides raw data for computing set fills and string fills. TTS-MUC3's modular process model integrates the results of case memory retrieval over sentences from multiple stories, extracts the date and …
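As a rough illustration of the associative case-memory idea described above, the sketch below retrieves stored cases by feature overlap and votes a set-fill value from them. Every function name, feature, and slot label is hypothetical and only stands in for the paper's actual representation.

```python
# Hedged sketch of associative case-memory retrieval for set fills.
# The case store, features, and slot names are made up for illustration.
from collections import Counter

def retrieve_cases(sentence_features, case_memory, k=5):
    """Return the k stored cases whose feature sets overlap most with the input."""
    scored = sorted(
        case_memory,
        key=lambda case: len(case["features"] & sentence_features),
        reverse=True,
    )
    return scored[:k]

def propose_set_fill(retrieved_cases, slot):
    """Vote a categorical (set-fill) slot value from the retrieved cases."""
    votes = Counter(case["slots"][slot] for case in retrieved_cases
                    if slot in case["slots"])
    return votes.most_common(1)[0][0] if votes else None

# Example usage with invented data
case_memory = [
    {"features": {"bomb", "embassy", "explode"}, "slots": {"INCIDENT-TYPE": "BOMBING"}},
    {"features": {"kidnap", "mayor"},            "slots": {"INCIDENT-TYPE": "KIDNAPPING"}},
]
cases = retrieve_cases({"bomb", "embassy"}, case_memory, k=1)
print(propose_set_fill(cases, "INCIDENT-TYPE"))  # -> "BOMBING"
```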
Figure 1 gives the official results for the Hughes Trainable Text Skimmer used for MUC-3 (TTS-MUC3). TTS is a largely statistical system, using a K-Nearest Neighbor classifier with the output of a shallow parser as features. (See the System Summary section of this volume for a detailed description of TTS-MUC3.) The performance, on a slot-by-slot basis …
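For a concrete picture of "a K-Nearest Neighbor classifier with the output of a shallow parser as features", here is a minimal modern stand-in. scikit-learn is assumed purely for brevity (the original system predates it), and the shallow-parse feature strings are invented.

```python
# Hedged sketch: KNN classification over bag-of-features vectors that mimic
# shallow-parser output. Not the original system's code or data.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

# Training "documents" are space-joined shallow-parse features (hypothetical).
train_features = ["NP:bomb VP:exploded NP:embassy",
                  "NP:guerrillas VP:kidnapped NP:mayor"]
train_labels = ["BOMBING", "KIDNAPPING"]

model = make_pipeline(CountVectorizer(), KNeighborsClassifier(n_neighbors=1))
model.fit(train_features, train_labels)
print(model.predict(["NP:car_bomb VP:exploded"]))  # -> ['BOMBING']
```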
The objective of the Hughes Trainable Text Skimmer (TTS) Project is to create text skimming software that: (1) can be easily re-configured for new applications, (2) improves its performance with use, and (3) is fast enough to process several megabytes of text per day. The TTS-MUC4 system is our second full-scale prototype. It is an adaptation of the …
  • C. P. Dolan
  • International 1989 Joint Conference on Neural…
  • 1989
Summary form only given, as follows. The great majority of computational cognitive models have adhered to the physical symbol system hypothesis (PSSH) of Newell. An approach that seems to be incompatible with the PSSH is that of parallel distributed processing (PDP), or connectionism. It is a controversial issue as to whether PSSH or PDP is a better …
Table 1 shows the official template-by-template score results for the Hughes Trainable Text Skimmer used for MUC-4 (TTS-MUC4) on TST3. TTS is a largely statistical system, using a set of Bayesian classifiers with the output of a shallow parser as features. (See the System Summary section of this volume for a detailed description of TTS-MUC4.) …
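The MUC-4 version replaces the nearest-neighbor decision with Bayesian classifiers. The sketch below shows one plausible reading: a small naive Bayes classifier over shallow-parse features with add-one smoothing. The class names, features, and smoothing choice are assumptions for illustration, not the system's documented design.

```python
# Hedged sketch: a per-slot naive Bayes decision over shallow-parse features.
# All names and data are hypothetical; this is not the MUC-4 system's code.
import math
from collections import defaultdict

class NaiveBayesSlotClassifier:
    def __init__(self):
        self.class_counts = defaultdict(int)
        self.feature_counts = defaultdict(lambda: defaultdict(int))
        self.vocab = set()

    def train(self, examples):
        """examples: list of (feature_set, slot_value) pairs."""
        for features, label in examples:
            self.class_counts[label] += 1
            for f in features:
                self.feature_counts[label][f] += 1
                self.vocab.add(f)

    def predict(self, features):
        total = sum(self.class_counts.values())
        best, best_score = None, float("-inf")
        for label, count in self.class_counts.items():
            # log P(label) + sum of log P(feature | label), add-one smoothed
            score = math.log(count / total)
            denom = sum(self.feature_counts[label].values()) + len(self.vocab)
            for f in features:
                score += math.log((self.feature_counts[label][f] + 1) / denom)
            if score > best_score:
                best, best_score = label, score
        return best

clf = NaiveBayesSlotClassifier()
clf.train([({"bomb", "exploded"}, "BOMBING"), ({"kidnapped", "mayor"}, "KIDNAPPING")])
print(clf.predict({"bomb"}))  # -> "BOMBING"
```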