Topological and dynamical structures induced by Hebbian learning in random neural networks
Hugues Berry
Alchemy, INRIA Orsay, France
Benoît Siri
Alchemy, INRIA Orsay, France
Bruno Cessac
Institut Non Linéaire de Nice, UMR CNRS 6618, Nice, France
Bruno Delord
ANIM, UMR 742 Inserm/Université Pierre & Marie Curie, Paris, France
Mathias Quoy
ETIS, UMR 8051, ENSEA-Université de Cergy-Pontoise, Cergy-Pontoise, France
Last modified: July 28, 2006
Abstract
In recent years, a vast amount of work concerning dynamical systems interacting on complex networks has focused on the influence of network topology on the global dynamics. In this framework, neural networks are particularly interesting because the dynamics of the neurons (the network nodes) depends on synaptic weights (the network links) that themselves vary over time ("learning") as a function of the neuron dynamics. This mutual coupling between node dynamics and network topology remains poorly understood.
Here, we study the consequences of such a coupling on dynamics and architecture. To this end, we investigate the influence of learning on the topology of random recurrent neural networks, which exhibit learning and dynamical behaviors yielding associative memory properties that mimic those observed in the olfactory bulb. The state of the neurons evolves over time through classical firing-rate dynamics. We investigate several learning rules for updating synaptic strengths. These rules are simple implementations of Hebb's rule for learning in biological neurons (i.e. neurons that fire together become more tightly coupled).
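The coupling between firing-rate dynamics and Hebbian weight updates described above can be illustrated by a minimal sketch. This is not the paper's model: the network size, sigmoid transfer function, learning rate and decay term are all illustrative assumptions, chosen only to show the two interleaved update loops (node states and link weights).

```python
import numpy as np

rng = np.random.default_rng(0)

N = 50                                               # hypothetical network size
W = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))   # random initial synaptic weights
x = rng.uniform(0.0, 1.0, size=N)                    # initial firing rates in [0, 1]

eta = 0.01     # Hebbian learning rate (illustrative value)
decay = 0.001  # passive weight decay, keeps weights bounded (assumption)

def step(x, W):
    """One iteration of classical firing-rate dynamics: x <- sigmoid(W x)."""
    return 1.0 / (1.0 + np.exp(-W @ x))

for t in range(200):
    x = step(x, W)
    # Hebb's rule: neurons that fire together become more tightly coupled,
    # so each weight grows with the product of pre- and post-synaptic rates.
    W += eta * np.outer(x, x) - decay * W

print(W.shape)                                       # weights stay an N x N matrix
print(bool(x.min() >= 0.0 and x.max() <= 1.0))       # rates stay in [0, 1]
```

Because the weight update uses the current rates, and the next rates use the updated weights, topology and dynamics co-evolve at every step, which is exactly the mutual coupling the abstract refers to.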
Due to the aforementioned coupling, learning shapes the network dynamics, topology and function. We show that the modifications of the dynamics can be related to changes in the local loop content. We further show that, because of these local structural alterations, the global network topology changes as well. Indeed, under the influence of learning, the distribution of the strong synapses on the network is no longer homogeneous: two neurons have an increasing probability of being strongly coupled if they are both connected to a third neuron by strong synapses. Hence the resulting network is highly clustered. Moreover, its mean shortest path remains low, so that these learning rules organize the network into a small-world one. We derive criteria that discriminate rules giving rise to such organization from those that do not.
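The small-world diagnosis above rests on two standard graph measures: a high clustering coefficient together with a low mean shortest path. A self-contained sketch of both, computed on the undirected graph of strong synapses (the thresholding into 0/1 adjacency and the function names are assumptions for illustration, not the paper's procedure):

```python
import numpy as np
from collections import deque

def clustering_and_path_length(A):
    """Mean local clustering coefficient and mean shortest-path length
    for a symmetric 0/1 adjacency matrix A with no self-loops."""
    N = len(A)
    # Clustering: fraction of a node's neighbor pairs that are themselves linked.
    cc = []
    for i in range(N):
        nbrs = np.flatnonzero(A[i])
        k = len(nbrs)
        if k < 2:
            continue
        links = A[np.ix_(nbrs, nbrs)].sum() / 2   # edges among the neighbors
        cc.append(links / (k * (k - 1) / 2))
    # Mean shortest path: breadth-first search from every node.
    dists = []
    for s in range(N):
        d = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in np.flatnonzero(A[u]):
                if v not in d:
                    d[v] = d[u] + 1
                    q.append(v)
        dists += [d[v] for v in d if v != s]
    return float(np.mean(cc)), float(np.mean(dists))

# Sanity check on a triangle: fully clustered, every pair one hop apart.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]])
print(clustering_and_path_length(A))  # (1.0, 1.0)
```

A network counts as small-world when, relative to a random graph with the same size and density, the first value is much larger while the second stays comparable.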
Hence, these findings raise the hypothesis that small-worldness in natural neural networks may be a spontaneous consequence of the learning scheme governing the network links. Moreover, we show that a pattern-recognition capability emerges from this mutual coupling, thus questioning the relevance of small-world architectures for storing and processing information.


