Repositorio Académico UOH

Universidad de O'Higgins Libraries



Show simple item record

dc.contributor.author Pereira-Obilinovic, U
dc.contributor.author Aljadeff, J
dc.contributor.author Brunel, N
dc.date.accessioned 2024-01-17T15:54:32Z
dc.date.available 2024-01-17T15:54:32Z
dc.date.issued 2023
dc.identifier.uri https://repositorio.uoh.cl/handle/611/539
dc.description.abstract Attractor networks are an influential theory for memory storage in brain systems. This theory has recently been challenged by the observation of strong temporal variability in neuronal recordings during memory tasks. In this work, we study a sparsely connected attractor network where memories are learned according to a Hebbian synaptic plasticity rule. After recapitulating known results for the continuous, sparsely connected Hopfield model, we investigate a model in which new memories are learned continuously and old memories are forgotten, using an online synaptic plasticity rule. We show that for a forgetting timescale that optimizes storage capacity, the qualitative features of the network's memory retrieval dynamics are age dependent: the most recent memories are retrieved as fixed-point attractors, while older memories are retrieved as chaotic attractors characterized by strong heterogeneity and temporal fluctuations. Fixed-point and chaotic attractors therefore coexist in the network's phase space. The network presents a continuum of statistically distinguishable memory states, in which chaotic fluctuations appear abruptly above a critical age and then increase gradually until the memory disappears. We develop a dynamical mean-field theory to analyze the age-dependent dynamics and compare the theory with simulations of large networks. We compute the optimal forgetting timescale for which the number of stored memories is maximized, and we find that the maximum age at which memories can be retrieved is set by an instability at which old memories destabilize and the network converges instead to a more recent one. Our numerical simulations show that a high degree of sparsity is necessary for the dynamical mean-field theory to accurately predict the network capacity. To test the robustness and biological plausibility of our results, we numerically study the online learning scenario in a network whose learning rules and transfer function are inferred from in vivo data; all aspects of the dynamics characterized analytically in the simpler model also hold in this network, and the results are highly robust to noise. Finally, our theory provides specific predictions for delayed-response tasks with aging memoranda: it predicts a higher degree of temporal fluctuation in retrieval states associated with older memories, and that these fluctuations should be faster for older memories. Overall, our theory of attractor networks that continuously learn new information at the price of forgetting old memories can account for the observed diversity of retrieval states in the cortex, and in particular for the strong temporal fluctuations of cortical activity.
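The abstract describes an online Hebbian rule in which each newly learned memory is imprinted on a sparse synaptic matrix while the traces of older memories decay with a characteristic forgetting timescale. The following is a minimal Python sketch of that idea, not the authors' code: the network size, connection probability, forgetting timescale, gain, tanh transfer function, and the simple palimpsest-style update are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

N = 2000    # neurons (illustrative)
c = 0.05    # connection probability: sparse connectivity
P = 30      # memories presented sequentially
tau = 20.0  # forgetting timescale, in units of presented memories
lam = np.exp(-1.0 / tau)  # per-memory decay of older synaptic traces

# Fixed sparse connectivity mask (no self-connections)
mask = (rng.random((N, N)) < c).astype(float)
np.fill_diagonal(mask, 0.0)

# Online Hebbian learning with forgetting ("palimpsest" rule):
# each new pattern is imprinted while all older contributions decay by lam.
patterns = rng.choice([-1.0, 1.0], size=(P, N))
J = np.zeros((N, N))
for xi in patterns:
    J = lam * J + np.outer(xi, xi) / (c * N)
J *= mask

def retrieve(cue, gain=1.5, steps=100):
    """Iterate simple rate dynamics r <- tanh(gain * J @ r) from a cue."""
    r = cue.astype(float)
    for _ in range(steps):
        r = np.tanh(gain * (J @ r))
    return r

# Cue each stored pattern with 10% of its bits flipped and measure the
# retrieval overlap m = mean(xi * r); age 0 is the most recently learned.
for age in range(P):
    xi = patterns[P - 1 - age]
    flip = np.where(rng.random(N) < 0.1, -1.0, 1.0)
    m = float(np.mean(xi * retrieve(xi * flip)))
    print(f"memory age {age:2d}: overlap m = {m:+.3f}")

In this toy version, recently learned patterns are recovered with overlap close to 1, while the overlap of older patterns degrades as their synaptic trace decays; the age-dependent transition to chaotic retrieval analyzed in the paper requires the full rate dynamics and the dynamical mean-field treatment, which this sketch does not reproduce.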
dc.description.sponsorship Swartz Foundation
dc.description.sponsorship National Science Foundation (NSF)
dc.description.sponsorship DARPA Award (United States Department of Defense)
dc.description.sponsorship Office of Naval Research (ONR)
dc.relation.uri http://dx.doi.org/10.1103/PhysRevX.13.011009
dc.title Forgetting Leads to Chaos in Attractor Networks
dc.type Article
uoh.revista PHYSICAL REVIEW X
dc.identifier.doi 10.1103/PhysRevX.13.011009
dc.citation.volume 13
dc.citation.issue 1
uoh.indizacion Web of Science


Files in this item

Files Size Format View

There are no files associated with this item.

This item appears in the following collection(s)


