New England Complex Systems Institute
International Conference on Complex Systems (ICCS2006)

Topological and dynamical structures induced by Hebbian learning in random neural networks

Hugues Berry
Alchemy, INRIA Orsay, France

Benoît Siri
Alchemy, INRIA Orsay, France

Bruno Cessac
Institut Non Linéaire de Nice, UMR CNRS 6618, Nice, France

Bruno Delord
ANIM - UMR 742 Inserm/Université Pierre & Marie Curie, Paris, France

Mathias Quoy
ETIS, UMR 8051 ENSEA-Université de Cergy-Pontoise, Cergy-Pontoise, France

     Full text: PDF
     Last modified: July 28, 2006

"Topological and dynamical structures induced by Hebbian learning in random neural networks"

B. Siri, H. Berry, B. Cessac, B. Delord, M. Quoy

In recent years, a large body of work on dynamical systems interacting on complex networks has focused on the influence of network topology on the global dynamics. In this framework, neural networks are particularly interesting because the dynamics of the neurons (the network nodes) depend on synaptic weights (the network links) that themselves vary over time ("learning") as a function of the neuron dynamics. This mutual coupling between node dynamics and network topology remains poorly understood.
Here, we study the consequences of such a coupling for dynamics and architecture. To this end, we investigate the influence of learning on the topology of random recurrent neural networks, which exhibit learning and dynamical behaviors yielding associative memory properties that mimic those observed in the olfactory bulb. The state of the neurons evolves over time through classical firing-rate dynamics. We investigate several learning rules for updating synaptic strengths. These rules are simple implementations of Hebb's rule for learning in biological neurons (i.e., neurons that fire together become more tightly coupled).
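The combination of firing-rate dynamics with a Hebbian weight update can be sketched as follows. This is a minimal illustration, not the authors' model: the tanh activation, the additive outer-product form of the Hebbian rule, and all parameter values (network size, learning rate, number of steps) are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100                                     # number of neurons (illustrative)
W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))  # random initial synaptic weights
x = rng.uniform(-1.0, 1.0, N)               # neuron states (firing rates)
eps = 1e-3                                  # learning rate (illustrative)

for t in range(1000):
    x = np.tanh(W @ x)            # firing-rate dynamics: rates driven by weighted input
    W += eps * np.outer(x, x)     # Hebb's rule: co-active neurons strengthen their coupling
```

Because the weight update depends on the states and the states depend on the weights, this loop exhibits exactly the mutual coupling between node dynamics and link structure discussed above.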
Due to the aforementioned coupling, learning shapes the network dynamics, topology, and function. We show that the modifications of the dynamics can be related to changes in the local loop content. We further show that, because of these local structural alterations, the global network topology changes as well. Indeed, under the influence of learning, the distribution of the strong synapses on the network is no longer homogeneous: two neurons have an increased probability of being strongly coupled if they are both connected to a third neuron by strong synapses. Hence the resulting network is highly clustered. Moreover, its mean shortest path remains low, so that these learning rules organize the network into a small-world one. We obtain criteria that discriminate rules giving rise to such organization from those that do not.
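The two small-world indicators named above can be measured directly on the graph of strong synapses. The sketch below is illustrative and self-contained: the choice to threshold the top 10% of weights by magnitude and to symmetrize the graph are assumptions for the example, not the paper's procedure.

```python
import numpy as np
from collections import deque

def adjacency_from_weights(W, frac=0.1):
    """Keep the top `frac` strongest weights by magnitude, symmetrized (assumption)."""
    thr = np.quantile(np.abs(W), 1.0 - frac)
    A = np.abs(W) >= thr
    A = A | A.T                     # treat strong coupling as undirected
    np.fill_diagonal(A, False)      # ignore self-connections
    return A

def clustering(A):
    """Average local clustering coefficient: fraction of a node's neighbor pairs that are linked."""
    coeffs = []
    for i in range(A.shape[0]):
        nbrs = np.flatnonzero(A[i])
        k = len(nbrs)
        if k < 2:
            continue
        links = A[np.ix_(nbrs, nbrs)].sum() / 2   # edges among neighbors
        coeffs.append(links / (k * (k - 1) / 2))
    return float(np.mean(coeffs)) if coeffs else 0.0

def mean_shortest_path(A):
    """Mean shortest-path length over connected pairs, via BFS from each node."""
    total, pairs = 0, 0
    for s in range(A.shape[0]):
        dist = {s: 0}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in np.flatnonzero(A[u]):
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(d for n, d in dist.items() if n != s)
        pairs += len(dist) - 1
    return total / pairs if pairs else float("inf")

rng = np.random.default_rng(1)
W = rng.normal(size=(60, 60))       # stand-in weight matrix for illustration
A = adjacency_from_weights(W)
print(clustering(A), mean_shortest_path(A))
```

A small-world network combines high clustering (relative to a random graph of the same density) with a mean shortest path that stays low, which is the signature the abstract attributes to the learned networks.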
Hence, these findings support the hypothesis that small-worldness in natural neural networks may be a spontaneous consequence of the learning scheme governing the network links. Moreover, we show that a pattern-recognition capability emerges from this mutual coupling, raising the question of the relevance of small-world architectures for storing and processing information.

