
Flexible task abstractions emerge in linear networks with fast and bounded units

Sandbrink, Kai; Bauer, Jan; Proca, Alexandra M; Saxe, Andrew; Summerfield, Christopher; Hummos, Ali; (2024) Flexible task abstractions emerge in linear networks with fast and bounded units. In: Advances in Neural Information Processing Systems 37 (NeurIPS 2024). NeurIPS: Vancouver, Canada. (In press).

2411.03840v2.pdf - Published Version (19MB). Access restricted to UCL open access staff until 25 July 2025.

Abstract

Animals survive in dynamic environments changing at arbitrary timescales, but such data distribution shifts are a challenge to neural networks. To adapt to change, neural systems may adjust a large number of parameters, a slow process that involves forgetting past information. In contrast, animals leverage distribution changes to segment their stream of experience into tasks and associate them with internal task abstractions. Animals can then respond flexibly by selecting the appropriate task abstraction. However, how such flexible task abstractions may arise in neural systems remains unknown. Here, we analyze a linear gated network where the weights and gates are jointly optimized via gradient descent, but with neuron-like constraints on the gates, including a faster timescale, nonnegativity, and bounded activity. We observe that the weights self-organize into modules specialized for the tasks or sub-tasks encountered, while the gating layer forms unique representations that switch in the appropriate weight modules (task abstractions). We analytically reduce the learning dynamics to an effective eigenspace, revealing a virtuous cycle: fast-adapting gates drive weight specialization by protecting previous knowledge, while weight specialization in turn increases the update rate of the gating layer. Task switching in the gating layer accelerates as a function of curriculum block size and task training, mirroring key findings in cognitive neuroscience. We show that the discovered task abstractions support generalization through both task and subtask composition, and we extend our findings to a nonlinear network switching between two tasks. Overall, our work offers a theory of cognitive flexibility in animals as arising from joint gradient descent on synaptic and neural gating in a neural network architecture.
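
The setup described in the abstract can be illustrated with a minimal sketch: a gated linear model of the form y = sum_k g_k W_k x, where weights and gates are jointly trained by gradient descent on a blocked two-task curriculum, but the gates use a larger learning rate (faster timescale) and are clipped to stay nonnegative and bounded. This is not the authors' code; the module count, dimensions, learning rates, teacher tasks, and clipping scheme below are illustrative assumptions.

# Sketch of a linear gated network with fast, bounded gates (NumPy).
# Assumptions: 2 weight modules, random linear "teacher" tasks, gates in [0, 1].
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out, n_modules = 10, 10, 2           # assumed sizes
lr_w, lr_g = 1e-2, 1e-1                      # gates learn on a faster timescale
W = [rng.normal(scale=0.1, size=(d_out, d_in)) for _ in range(n_modules)]
g = np.full(n_modules, 0.5)                  # nonnegative, bounded gate activities

# Two toy tasks: random linear teachers standing in for the blocked curriculum.
teachers = [rng.normal(size=(d_out, d_in)) for _ in range(2)]

def forward(x, W, g):
    """Gated linear readout: y_hat = sum_k g_k * W_k @ x."""
    return sum(gk * Wk @ x for gk, Wk in zip(g, W))

block_size = 500                             # trials per task block
for step in range(4 * block_size):
    task = (step // block_size) % 2          # tasks switch in blocks
    x = rng.normal(size=d_in)
    y = teachers[task] @ x

    y_hat = forward(x, W, g)
    err = y_hat - y                          # gradient of 0.5 * ||err||^2

    # Joint gradient descent on weights and gates.
    for k in range(n_modules):
        grad_Wk = g[k] * np.outer(err, x)
        grad_gk = err @ (W[k] @ x)
        W[k] -= lr_w * grad_Wk
        g[k] -= lr_g * grad_gk               # faster gate updates

    # Neuron-like constraints on the gates: nonnegativity and bounded activity.
    g = np.clip(g, 0.0, 1.0)

    if (step + 1) % block_size == 0:
        print(f"block end (task {task}): gates = {np.round(g, 2)}")

Under these assumptions, the fast, clipped gates tend to suppress the module not matching the current block, so weight updates concentrate in the active module; this is only meant to convey the mechanism the abstract describes, not to reproduce the paper's results.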

Type: Proceedings paper
Title: Flexible task abstractions emerge in linear networks with fast and bounded units
Event: Neural Information Processing Systems
Location: Vancouver, Canada
Dates: 10 Dec 2024 - 15 Dec 2024
Publisher version: https://proceedings.neurips.cc/
Language: English
UCL classification: UCL
UCL > Provost and Vice Provost Offices > School of Life and Medical Sciences
UCL > Provost and Vice Provost Offices > School of Life and Medical Sciences > Faculty of Life Sciences
UCL > Provost and Vice Provost Offices > School of Life and Medical Sciences > Faculty of Life Sciences > Gatsby Computational Neurosci Unit
URI: https://discovery-pp.ucl.ac.uk/id/eprint/10203917
