Bryan McEleney

The development of speech tools suitable for use in real-world environments requires collaboration between computational linguistics and new implementation fields, e.g. robotics, and the incorporation of new AI techniques to improve overall system performance. In this paper we present the core development concepts of SAID (Speaking Autonomous Intelligent …)
The taking of initiative has significance in spoken-language dialogue systems and in human-computer interaction. A system that takes no initiative may fail to seize important opportunities, but a system that always takes the initiative may not allow the user to take the actions he favours. We have implemented a mixed-initiative planning system that …
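The truncated abstract does not say how the initiative decision is made, but the trade-off it describes can be illustrated with a minimal sketch. The decision rule below, the three quantities it compares, and the `override_cost` parameter are all assumptions for illustration, not the paper's actual criterion.

```python
def should_take_initiative(expected_system_gain, expected_user_gain, override_cost=0.2):
    """Illustrative mixed-initiative rule: take the initiative only when the
    expected benefit of the system's action exceeds what the user's own
    preferred action is expected to achieve, plus a cost for overriding the user.
    All quantities here are hypothetical stand-ins."""
    return expected_system_gain > expected_user_gain + override_cost

# Hypothetical usage: the system sees a valuable opportunity the user is about to miss.
print(should_take_initiative(expected_system_gain=0.9, expected_user_gain=0.4))  # True
```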
A multi-agent planner is described that accounts for the replanning that occurs when one agent's action is observed by another. A nested belief model is used to generate an expectation of the other agent's response. Using the planner's output, a dialogue system is being developed that decides whether uncertainties in the belief model should be resolved …
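A nested belief model of the kind mentioned here can be sketched as a recursive structure: an agent's own beliefs plus its model of the other agent's beliefs, consulted to predict the other agent's response to an observed action. The `BeliefState` class, the rule format, and the lookup below are illustrative assumptions, not the paper's representation.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class BeliefState:
    """One level of a nested belief model: this agent's own facts plus a model
    of what it thinks the other agent believes (recursively)."""
    facts: set = field(default_factory=set)
    about_other: Optional["BeliefState"] = None

def expected_response(self_beliefs, observed_action, response_rules):
    """Predict the other agent's replanning response to an observed action by
    consulting the nested model of that agent's beliefs.
    `response_rules` maps an action to (required_belief, predicted_response)
    pairs; both the format and the lookup are hypothetical."""
    other = self_beliefs.about_other
    if other is None:
        return None
    for required_belief, predicted_response in response_rules.get(observed_action, []):
        # the other agent is expected to replan using only what we believe it believes
        if required_belief in other.facts:
            return predicted_response
    return None

# Hypothetical usage: the hearer is believed to know the door is locked,
# so approaching the door should prompt an offer of the key.
hearer = BeliefState(facts={"door_locked"})
speaker = BeliefState(facts={"door_locked", "has_key"}, about_other=hearer)
rules = {"approach_door": [("door_locked", "offer_key")]}
print(expected_response(speaker, "approach_door", rules))  # -> offer_key
```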
A machine learning approach to interpreting utterances in spoken interfaces is described, where evidence from the utterance and from the dialogue context is combined to estimate a probability distribution over interpretations. The algorithm for the utterance evidence uses nearest-neighbour classification on a set of training examples, while the contextual …
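A minimal sketch of this kind of evidence combination is given below: a nearest-neighbour distribution from training utterances, multiplied with a contextual prior and renormalised. The feature-overlap similarity, the naive-Bayes-style product combination, and all names are assumptions; the truncated abstract does not give the paper's actual metric or combination scheme.

```python
from collections import Counter

def nn_distribution(utterance_features, training_examples, k=5):
    """Nearest-neighbour evidence: a distribution over interpretations from the
    k training utterances most similar to the new one (similarity here is simple
    feature overlap, an assumed choice)."""
    neighbours = sorted(
        training_examples,
        key=lambda ex: len(set(utterance_features) & set(ex["features"])),
        reverse=True,
    )[:k]
    counts = Counter(ex["interpretation"] for ex in neighbours)
    total = sum(counts.values())
    return {label: n / total for label, n in counts.items()}

def combine_evidence(utterance_dist, context_prior):
    """Combine utterance and context evidence by multiplying and renormalising
    (an assumed combination rule)."""
    labels = set(utterance_dist) | set(context_prior)
    scores = {l: utterance_dist.get(l, 1e-6) * context_prior.get(l, 1e-6) for l in labels}
    z = sum(scores.values())
    return {l: s / z for l, s in scores.items()}

# Hypothetical usage on a toy training set.
training = [
    {"features": {"book", "table"}, "interpretation": "make_reservation"},
    {"features": {"cancel", "table"}, "interpretation": "cancel_reservation"},
    {"features": {"book", "flight"}, "interpretation": "make_reservation"},
]
utt = nn_distribution({"book", "a", "table"}, training, k=2)
print(combine_evidence(utt, {"make_reservation": 0.7, "cancel_reservation": 0.3}))
```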
A design is presented for a negotiating agent that can construct coherent joint plans with human or artificial agents. In negotiation there is always a trade-off between plan quality and dialogue length. In dynamic conditions and with human partners, length becomes critical. The approach to efficient negotiation is to use an acquaintance model that predicts …
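The quality-versus-length trade-off can be illustrated with a small negotiation loop that stops proposing once the predicted improvement no longer justifies another turn. The acceptance model, the per-turn cost, the acceptance threshold, and the loop itself are all hypothetical; they stand in for, rather than reproduce, the acquaintance model described in the abstract.

```python
def negotiate(candidate_plans, acceptance_model, turn_cost=0.1):
    """Illustrative negotiation loop: keep proposing higher-quality joint plans
    only while the expected quality gain, weighted by the predicted chance of
    acceptance, outweighs the cost of another dialogue turn."""
    agreed, agreed_quality = None, 0.0
    for plan in sorted(candidate_plans, key=lambda p: p["quality"], reverse=True):
        p_accept = acceptance_model(plan)
        expected_gain = p_accept * (plan["quality"] - agreed_quality)
        if expected_gain <= turn_cost:
            break  # a longer dialogue is no longer worth the predicted improvement
        if p_accept >= 0.5:  # assumed threshold for treating the proposal as accepted
            agreed, agreed_quality = plan, plan["quality"]
    return agreed

# Hypothetical usage: the best plan is likely to be rejected, the second-best accepted.
plans = [{"name": "A", "quality": 1.0}, {"name": "B", "quality": 0.8}]
model = lambda p: 0.2 if p["name"] == "A" else 0.9
print(negotiate(plans, model))  # -> plan B
```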