Context-sensitive Multiple Task Learning, or csMTL, is presented as a method of inductive transfer that uses a single-output neural network with additional contextual inputs for learning multiple tasks. The csMTL method is tested on three task domains and shown to produce hypotheses for a primary task that are significantly better than standard MTL …
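The core idea described above can be illustrated with a minimal sketch, assuming a csMTL-style architecture: all tasks share one network with a single output, and the task identity is supplied as extra one-hot "context" inputs concatenated to the feature vector. All names, sizes, and the random-weight setup here are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy dimensions (not from the paper).
n_features, n_tasks, n_hidden = 4, 3, 8

# Shared weights: the input layer sees features PLUS task-context inputs.
W1 = rng.standard_normal((n_features + n_tasks, n_hidden)) * 0.1
b1 = np.zeros(n_hidden)
W2 = rng.standard_normal((n_hidden, 1)) * 0.1
b2 = np.zeros(1)

def csmtl_forward(x, task_id):
    """Forward pass: concatenate a one-hot task context to the input,
    run one shared hidden layer, and emit a single shared output."""
    context = np.eye(n_tasks)[task_id]   # one-hot encoding of the task
    z = np.concatenate([x, context])     # context-augmented input
    h = np.tanh(z @ W1 + b1)             # shared hidden representation
    return (h @ W2 + b2)[0]              # one output unit for ALL tasks

x = rng.standard_normal(n_features)
# The same input yields different outputs under different task contexts.
y0 = csmtl_forward(x, task_id=0)
y1 = csmtl_forward(x, task_id=1)
```

The design point the sketch makes: instead of one output head per task (standard MTL), the task is moved into the input space, so transfer between tasks happens through the entire shared weight set.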
Fundamental to the problem of lifelong machine learning is how to consolidate the knowledge of a learned task within a long-term memory structure (domain knowledge) without the loss of prior knowledge. We investigate the effect of curriculum, i.e., the order in which tasks are learned, on the consolidation of task knowledge. Relevant background material on …
An approach to the continued practice of tasks is proposed in the context of previous work on lifelong machine learning using a system of back-propagation neural networks. The method compensates for small numbers of training examples per practice session and the concurrent practice of unrelated tasks. Knowledge from each new practice session is integrated …
This report presents a case study of five online non-traditional undergraduate students who created their own video discussion posts for a class. The study attempted to discover the influence that this practice had on establishing social presence in online courses, which is often described as one of the critical factors in student success and …