Modern robots face a challenge shared by biological systems: how to learn and adaptively express multiple sensorimotor skills. A key aspect of this is developing an internal model of expected sensorimotor experiences to detect and react to unexpected events, guiding self-preserving behaviors. Associative skill memories (ASMs) address this by linking movement primitives to sensory feedback, but existing implementations rely on hard-coded libraries of individual skills. It remains unresolved how a single neural network can learn a repertoire of skills while supporting integrated fault detection and context-aware execution. Here we introduce neural associative skill memories (neural ASMs), a framework that uses self-supervised temporal predictive coding to integrate skill learning and expression through biologically plausible local learning rules. Unlike traditional ASMs, which require explicit skill selection, neural ASMs implicitly recognize and express skills through contextual inference, enabling fault detection via "predictive surprise" across the entire learned repertoire. Compared with recurrent neural networks trained by backpropagation through time, our model achieves comparable qualitative performance in skill memory expression while relying only on local learning rules, and it predicts a biologically relevant speed-versus-accuracy trade-off. By integrating fault detection, reactive control, and skill expression into a single energy-based architecture, neural ASMs contribute to safer, self-preserving robotics and provide a computational lens on biological sensorimotor learning.
Journal article
2025-12-22
Keywords: Humans, Robotics, Neural Networks, Computer, Memory, Feedback, Sensory, Association Learning, Motor Skills, Learning
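To make the core ideas in the abstract concrete, the following is a minimal sketch, not the authors' implementation: a single weight matrix stores two toy "skills" as one-step sensory predictions, the weights are trained with a purely local delta rule (prediction error times presynaptic activity), and "predictive surprise" (the squared prediction error) flags an unexpected observation as a fault. The class name NeuralASMSketch, the circular toy trajectories, the learning rate, and the use of an explicit skill context (rather than the contextual inference described above) are all assumptions made for illustration.

```python
# Illustrative sketch only: one-step temporal prediction with a local learning
# rule and surprise-based fault detection. All names and the toy task are
# assumptions; this is not the architecture described in the abstract.
import numpy as np

class NeuralASMSketch:
    """A single weight matrix stores several skills, selected here by an
    explicit one-hot context (the paper infers context implicitly instead)."""

    def __init__(self, obs_dim, n_skills, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.n_skills = n_skills
        self.lr = lr
        # One matrix over context-gated features holds the whole repertoire.
        self.W = rng.normal(0.0, 0.1, size=(obs_dim, obs_dim * n_skills))

    def step(self, x, skill, x_next=None, learn=True):
        """Predict the next observation; if a target is given, optionally apply
        a local delta-rule update and return the predictive surprise."""
        c = np.zeros(self.n_skills)
        c[skill] = 1.0
        feat = np.kron(c, x)                        # context-gated copy of the observation
        x_pred = self.W @ feat
        if x_next is None:
            return x_pred, None
        err = x_next - x_pred                       # prediction error: a locally available signal
        if learn:
            self.W += self.lr * np.outer(err, feat)  # local rule: error x presynaptic activity
        return x_pred, float(err @ err)             # "predictive surprise" = squared error norm

def circle(direction, n=60):
    """Toy skill: a point moving around the unit circle (cw or ccw)."""
    t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    return np.stack([np.cos(direction * t), np.sin(direction * t)], axis=1)

model = NeuralASMSketch(obs_dim=2, n_skills=2)
skills = [circle(+1), circle(-1)]                   # two skills, one network

# Self-supervised training: the targets are simply the next observations.
for _ in range(300):
    for k, traj in enumerate(skills):
        for x, x_next in zip(traj[:-1], traj[1:]):
            model.step(x, k, x_next)

# Fault detection: an unexpected sensory outcome produces a large surprise.
x, x_next = skills[0][5], skills[0][6]
_, nominal = model.step(x, 0, x_next, learn=False)
_, fault = model.step(x, 0, x_next + np.array([0.8, -0.8]), learn=False)
print(f"surprise nominal={nominal:.4f}  fault={fault:.4f}")
```

In this toy setting the nominal transition yields near-zero surprise while the perturbed one does not. The neural ASM framework described in the abstract goes further: it infers the skill context rather than receiving it, and couples the surprise signal to reactive control within an energy-based architecture, none of which this sketch attempts to reproduce.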