Context-sensitive Multiple Task Learning, or csMTL, is presented as a method of inductive transfer that uses a single-output neural network and additional contextual inputs for learning multiple tasks. Motivated by problems with the application of MTL networks to machine lifelong learning systems, csMTL encoding of multiple task examples was developed and …
Context-sensitive Multiple Task Learning, or csMTL, is presented as a method of inductive transfer that uses a single-output neural network and additional contextual inputs for learning multiple tasks. The csMTL method is tested on three task domains and shown to produce hypotheses for a primary task that are significantly better than standard MTL …
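The core idea of the abstracts above can be sketched in a few lines: instead of one output node per task (standard MTL), csMTL uses a single output and identifies the task through extra context inputs. The sketch below is illustrative only; the layer sizes, the one-hot context encoding, and all names are assumptions, not the papers' actual implementation.

```python
import numpy as np

def csmtl_encode(x, task_id, num_tasks):
    """Append a one-hot task-context vector to the primary inputs.

    In csMTL, examples from all tasks share one network; the task
    identity enters as additional context inputs rather than as
    separate output nodes (as in standard MTL).
    """
    context = np.zeros(num_tasks)
    context[task_id] = 1.0
    return np.concatenate([x, context])

# Hypothetical single-output MLP; sizes and weights are illustrative.
rng = np.random.default_rng(0)
n_features, n_tasks, n_hidden = 4, 3, 8
W1 = rng.normal(size=(n_features + n_tasks, n_hidden))
w2 = rng.normal(size=n_hidden)

def forward(x_encoded):
    h = np.tanh(x_encoded @ W1)  # shared hidden layer across all tasks
    return float(h @ w2)         # single output node for every task

x = rng.normal(size=n_features)
y0 = forward(csmtl_encode(x, task_id=0, num_tasks=n_tasks))
y1 = forward(csmtl_encode(x, task_id=1, num_tasks=n_tasks))
# Same primary inputs with different context generally yield
# different outputs, which is how one network serves many tasks.
```

Note the design point: because every task shares the same weights and output, knowledge transfer happens through the shared hidden representation, conditioned by the context inputs.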
An approach to the continued practice of tasks is proposed in the context of previous work on lifelong machine learning using a system of back-propagation neural networks. The method compensates for small numbers of training examples per practice session and the concurrent practice of unrelated tasks. Knowledge from each new practice session is integrated …
Fundamental to the problem of lifelong machine learning is how to consolidate the knowledge of a learned task within a long-term memory structure (domain knowledge) without the loss of prior knowledge. We investigate the effect of curriculum, i.e., the order in which tasks are learned, on the consolidation of task knowledge. Relevant background material on …