
Conference | Brain


NeuroSpin Conferences

Data-driven models of learning in distributed networks

12 January 2026
NeuroSpin amphitheater + Zoom

Talk by Alex Cayco Gajic - ENS Paris / Group for Neural Theory

Short abstract:

Understanding how learning reshapes neural representations requires both novel analytical tools and conceptual models that can interpret and guide experimental findings. In this talk, I will present our recent efforts on both fronts. First, advances in chronic neural recordings now allow us to track population activity over the extended timescales required to study learning. However, analyzing these high-dimensional datasets remains a major challenge. I will introduce new tensor-based methods we developed to disentangle changes in neural representations across trials and over time, yielding interpretable insights into learning dynamics. Second, learning complex tasks requires coordination across multiple brain regions. To address this, we propose a modular, multi-area model that combines a recurrent controller with a feedforward adapter network. This architecture separates learning into rapid adaptation and slower consolidation, offering a candidate mechanism for cerebello-cortical interactions during motor adaptation. The model also generates testable predictions about learning dynamics following targeted lesions. Together, these approaches aim to bridge data-driven analysis and theoretical modeling to deepen our understanding of learning in distributed networks.
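The abstract does not specify which tensor-based method is used. A common baseline for trial-structured neural recordings is a CP (canonical polyadic) decomposition of a trials × neurons × time array, fitted by alternating least squares, which factors population activity into per-trial, per-neuron, and per-timepoint components. The following NumPy sketch illustrates that generic technique only; it is not the speakers' method, and all function names are ours:

```python
import numpy as np

def unfold(T, mode):
    # Matricize tensor T along the given mode (C-order on the remaining modes).
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    # Column-wise Kronecker product of two factor matrices (I x R, J x R -> IJ x R).
    R = A.shape[1]
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, R)

def cp_als(T, rank, n_iter=100, seed=0):
    """Rank-`rank` CP decomposition of a 3-way tensor via alternating least squares."""
    rng = np.random.default_rng(seed)
    factors = [rng.standard_normal((s, rank)) for s in T.shape]
    for _ in range(n_iter):
        for mode in range(3):
            others = [factors[m] for m in range(3) if m != mode]
            kr = khatri_rao(others[0], others[1])           # matches the C-order unfolding
            G = (others[0].T @ others[0]) * (others[1].T @ others[1])
            factors[mode] = unfold(T, mode) @ kr @ np.linalg.pinv(G)
    return factors

# Toy usage: a synthetic rank-2 "trials x neurons x time" tensor.
rng = np.random.default_rng(1)
true = [rng.standard_normal((s, 2)) for s in (20, 15, 30)]
data = np.einsum('ir,jr,kr->ijk', *true)
trial_f, neuron_f, time_f = cp_als(data, rank=2, n_iter=200)
# trial_f tracks how each component's strength evolves across trials (learning),
# neuron_f and time_f describe which cells and temporal profiles carry it.
```

Changes in the trial-factor loadings across a session are one way such a decomposition can expose learning dynamics in an interpretable form.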

Practical information

Replay not yet available
