Combining Hebbian and predictive plasticity yields invariant object representations in deep nets

Discriminating distinct objects and concepts from sensory stimuli is essential for survival. Our brains accomplish this feat by forming meaningful internal representations in deep sensory networks with plastic synaptic connections. Experience-dependent plasticity presumably exploits temporal contingencies between sensory inputs to build these internal representations. However, the precise plasticity mechanisms remain elusive. We derive a local synaptic plasticity model inspired by self-supervised machine learning techniques that shares a deep conceptual connection to Bienenstock-Cooper-Munro (BCM) theory and is consistent with experimentally observed plasticity rules. We show that our plasticity model yields disentangled object representations in deep neural networks without the need for supervision or implausible negative examples. In response to altered visual experience, our model qualitatively captures neuronal selectivity changes observed in the monkey inferotemporal cortex in vivo. Our work suggests a plausible learning rule to drive learning in sensory networks while making concrete, testable predictions.
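The abstract names two ingredients: a BCM-like Hebbian term with a sliding threshold, and a predictive term that exploits temporal contingency between successive sensory inputs. Below is a minimal, hypothetical sketch of how these two terms could be combined in a single local weight update for one rate-based layer; it is only an illustration of the ingredients mentioned above, not the paper's derived rule, and all hyperparameters and the ReLU layer are assumptions.

```python
import numpy as np

# Hypothetical sketch: a BCM-like Hebbian update combined with a simple
# temporal-prediction term. This is NOT the paper's derived plasticity rule,
# only an illustration of the two ingredients named in the abstract.

rng = np.random.default_rng(0)

n_in, n_out = 100, 10
W = rng.normal(scale=0.1, size=(n_out, n_in))   # feedforward weights
theta = np.ones(n_out)                          # sliding BCM threshold
eta, tau_theta, lam = 1e-3, 100.0, 0.5          # assumed hyperparameters


def plasticity_step(x_prev, x_curr):
    """Apply one local update given two consecutive inputs x_prev, x_curr."""
    global W, theta
    y_prev = np.maximum(W @ x_prev, 0.0)        # ReLU activations
    y_curr = np.maximum(W @ x_curr, 0.0)

    # BCM-like Hebbian term: potentiation when postsynaptic activity exceeds
    # the sliding threshold, depression when it falls below.
    hebb = (y_curr * (y_curr - theta))[:, None] * x_curr[None, :]

    # Predictive term: pull the current response toward the previous one,
    # exploiting temporal contingency between successive views of an object.
    pred = (y_prev - y_curr)[:, None] * x_curr[None, :]

    W += eta * (hebb + lam * pred)

    # Slowly adapt the sliding threshold toward the mean squared activity.
    theta += (y_curr ** 2 - theta) / tau_theta


# Example: two noisy "views" of the same underlying stimulus.
x0 = rng.normal(size=n_in)
plasticity_step(x0 + 0.1 * rng.normal(size=n_in),
                x0 + 0.1 * rng.normal(size=n_in))
```

In this toy picture, the Hebbian term drives selectivity while the predictive term encourages responses to remain stable across temporally adjacent inputs, which is one way invariant representations could emerge without supervision or negative examples.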


Timetable

Day: Thu, 05.05.2022
Time: 16:00 - 17:00
Location: Disco

Moderator

Friedemann Zenke

Members

Younes Bouhadjar
Jörg Conradt
Edward Jones
James Knight
Dylan Muir
Mattias Nilsson
Jens Egholm Pedersen
Margherita Ronchini