Show simple item record

dc.contributor.author    Amis, Gregory P.    en_US
dc.contributor.author    Carpenter, Gail A.    en_US
dc.date.accessioned    2011-11-14T18:17:10Z
dc.date.available    2011-11-14T18:17:10Z
dc.date.issued    2009-05    en_US
dc.identifier.uri    http://hdl.handle.net/2144/1970
dc.description.abstract    Computational models of learning typically train on labeled input patterns (supervised learning), unlabeled input patterns (unsupervised learning), or a combination of the two (semi-supervised learning). In each case input patterns have a fixed number of features throughout training and testing. Human and machine learning contexts present additional opportunities for expanding incomplete knowledge from formal training, via self-directed learning that incorporates features not previously experienced. This article defines a new self-supervised learning paradigm to address these richer learning contexts, introducing a neural network called self-supervised ARTMAP. Self-supervised learning integrates knowledge from a teacher (labeled patterns with some features), knowledge from the environment (unlabeled patterns with more features), and knowledge from internal model activation (self-labeled patterns). Self-supervised ARTMAP learns about novel features from unlabeled patterns without destroying partial knowledge previously acquired from labeled patterns. A category selection function bases system predictions on known features, and distributed network activation scales unlabeled learning to prediction confidence. Slow distributed learning on unlabeled patterns focuses on novel features and confident predictions, defining classification boundaries that were ambiguous in the labeled patterns. Self-supervised ARTMAP improves test accuracy on illustrative low-dimensional problems and on high-dimensional benchmarks. Model code and benchmark data are available from: http://techlab.bu.edu/SSART/.    en_US
dc.description.sponsorship    SyNAPSE program of the Defense Advanced Research Projects Agency (Hewlett-Packard Company, subcontract under DARPA prime contract HR0011-09-3-0001; HRL Laboratories LLC, subcontract #801881-BS under DARPA prime contract HR0011-09-C-0001); CELEST, an NSF Science of Learning Center (SBE-0354378)    en_US
dc.language.iso    en_US    en_US
dc.publisher    Boston University Center for Adaptive Systems and Department of Cognitive and Neural Systems    en_US
dc.relation.ispartofseries    BU CAS/CNS Technical Reports; CAS/CNS-TR-2009-006    en_US
dc.rights    Copyright 2009 Boston University. Permission to copy without fee all or part of this material is granted provided that: 1. The copies are not made or distributed for direct commercial advantage; 2. the report title, author, document number, and release date appear, and notice is given that copying is by permission of BOSTON UNIVERSITY TRUSTEES. To copy otherwise, or to republish, requires a fee and/or special permission.    en_US
dc.subject    Self-supervised learning    en_US
dc.subject    Supervised learning    en_US
dc.subject    Adaptive Resonance Theory (ART)    en_US
dc.subject    ARTMAP    en_US
dc.subject    Unsupervised learning    en_US
dc.subject    Machine learning    en_US
dc.title    Self-Supervised ARTMAP    en_US
dc.type    Technical Report    en_US
dc.rights.holder    Boston University Trustees    en_US


