Context-Sensitive MTL Networks for Machine Lifelong Learning

Context-sensitive Multiple Task Learning, or csMTL, is presented as a method of inductive transfer that uses a single-output neural network with additional contextual inputs for learning multiple tasks. The csMTL method is tested on three task domains and shown to produce hypotheses for a primary task that are significantly better than standard MTL hypotheses when learning in the presence of related and unrelated tasks. A new measure of task relatedness, based on the context input weights, is shown to have promise. The paper also outlines a machine lifelong learning system that uses csMTL for sequentially learning multiple tasks. The approach satisfies a number of important requirements for knowledge retention and inductive transfer, including the elimination of redundant outputs, representational transfer for rapid but effective short-term learning, and functional transfer via task rehearsal for long-term consolidation.
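The core architectural idea — one shared output, with the task selected by one-hot context inputs rather than by separate output heads — can be sketched as a small feed-forward network. This is a minimal illustrative sketch, not the paper's implementation: the layer sizes, weight initialization, and the cosine-similarity form of the relatedness measure are assumptions (the paper only states that relatedness is derived from the context input weights).

```python
import numpy as np

rng = np.random.default_rng(0)

n_features = 4   # primary (x) inputs
n_tasks = 3      # tasks, each encoded as a one-hot context input
n_hidden = 8

# One hidden layer sees the concatenation [x ; context];
# the output layer is a SINGLE unit shared by all tasks.
W1 = rng.normal(scale=0.1, size=(n_features + n_tasks, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_hidden, 1))
b2 = np.zeros(1)

def csmtl_forward(x, task_id):
    """Single shared output; the task is selected via context inputs."""
    context = np.eye(n_tasks)[task_id]      # one-hot task encoding
    z = np.concatenate([x, context])        # augmented input vector
    h = np.tanh(z @ W1 + b1)                # shared hidden representation
    return float((h @ W2 + b2)[0])          # one output for every task

def task_relatedness(t1, t2):
    """Hypothetical relatedness measure: cosine similarity between the
    hidden-layer weight rows attached to two tasks' context inputs."""
    a = W1[n_features + t1]
    b = W1[n_features + t2]
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# The same primary input yields different outputs under different
# task contexts, even though the output unit is shared.
x = rng.normal(size=n_features)
outputs = [csmtl_forward(x, t) for t in range(n_tasks)]
```

Because the network has no per-task output units, adding a new task only adds one context input, which is how the approach eliminates redundant outputs when tasks are learned sequentially.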
Daniel L. Silver, Ryan Poirier
Added: 02 Oct 2010
Updated: 02 Oct 2010
Type: Conference
Year: 2007