WIP + Methods
Matan Mazor, Sungjun Cho
Wednesday, 04 June 2025, 12pm to 1pm
Hybrid via Teams and in the Cowey Room, WIN Annexe
Hosted by Polytimi Frangou, Michiel Cottaar
Model-based self-simulation in memory reconstruction / Matan Mazor, Tobias Gerstenberg & Sanjay Manohar
Presented by Matan Mazor
Humans hold beliefs not only about the external world and other agents, but also about their own minds. These beliefs about ourselves, our “self-models”, are useful for guiding behaviour and for making inferences from noisy sensory data. One aspect of cognition that critically relies on access to a self-model is memory. When we remember a past episode from our life, we often fill in missing information by relying on schemas, or models, specifying how things usually are. Here, we are interested in the role of one particular schema in such model-based memory reconstruction: a schema of the self. This research project will examine the neural substrates that contribute to the use of a self-schema in remembering our past actions.
DyNeStE: Discrete Representation of Long-Range Brain Network Dynamics via Generative Modelling
Presented by Sungjun Cho
Abstract: Recent studies provide compelling evidence that functional brain networks and their dynamic interactions are reliable proxies for cognition and behaviour. Consequently, accurately estimating these networks from electrophysiological data has become a core challenge in neuroscience. While the Hidden Markov Model (HMM) and deep learning approaches (e.g., Dynamic Network Modes) are widely used, both have limitations: the HMM struggles to capture long-range temporal dependencies, whereas neural networks often encode brain network dynamics in a continuous latent space, compromising the ability to interpret the brain as being in a discrete state at each time point.
In this talk, I will introduce Dynamic Network States (DyNeStE), a novel generative model that learns categorical representations of network dynamics and aims to address these limitations. DyNeStE retains the interpretability of mutually exclusive states while gaining the ability to model long-range temporal dependencies using recurrent neural networks (LSTMs). Using both simulated and real MEG datasets, I will show that DyNeStE infers plausible resting-state networks while successfully capturing long-range dynamics.
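
For readers curious what a model of this kind might look like, below is a minimal sketch combining the two ingredients named in the abstract: mutually exclusive (categorical) latent states and an LSTM that carries long-range temporal context. It is not the authors' implementation; the class name, layer sizes, Gumbel-softmax relaxation, and state-mean observation model are all illustrative assumptions.

# Minimal sketch (not the DyNeStE code) of a generative model with
# mutually exclusive latent states and an LSTM prior over state sequences.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CategoricalStateModel(nn.Module):
    def __init__(self, n_states=8, n_channels=64, hidden=128):
        super().__init__()
        # LSTM prior: predicts the next state from the state history,
        # giving access to long-range temporal dependencies that a
        # first-order HMM transition matrix cannot capture.
        self.prior = nn.LSTM(n_states, hidden, batch_first=True)
        self.to_logits = nn.Linear(hidden, n_states)
        # Each state owns a mean activity pattern over channels
        # (an assumed, deliberately simple observation model).
        self.state_means = nn.Parameter(torch.randn(n_states, n_channels))

    def forward(self, states_onehot, temperature=0.5):
        # states_onehot: (batch, time, n_states) one-hot state history.
        h, _ = self.prior(states_onehot)
        logits = self.to_logits(h)
        # Gumbel-softmax keeps states (near-)discrete yet differentiable,
        # preserving the "one state at each time point" interpretation.
        next_states = F.gumbel_softmax(logits, tau=temperature, hard=True)
        # Observations are generated from the active state's pattern.
        predicted_obs = next_states @ self.state_means
        return next_states, predicted_obs

# Example: run the prior forward from a random one-hot state history.
model = CategoricalStateModel()
history = F.one_hot(torch.randint(0, 8, (1, 100)), num_classes=8).float()
states, obs = model(history)

The design choice this sketch illustrates is the one the abstract highlights: the latent space stays categorical (hard one-hot states after the Gumbel-softmax), so interpretability is retained, while the recurrence supplies the temporal memory that the HMM lacks.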