Neural Networks


The dynamics of neural networks and the dynamics of classical spin systems are strikingly similar in several respects. The storage of information (firing patterns) or of stimulus-response schemes in neural networks, for instance, can formally be described as the construction of attractors in the dynamics of spin systems. This construction is achieved by giving suitable values to the exchange couplings between the spins, which take the role of synapses in neural networks. The excitatory or inhibitory nature of a synapse corresponds in this construction to a ferromagnetic or antiferromagnetic spin-spin interaction. The search for suitable values of the synaptic interactions in a neural net can be done iteratively through a process of learning or training.
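A minimal sketch of this correspondence is the Hopfield model, where patterns are written into the exchange couplings by a Hebbian prescription and the zero-temperature spin dynamics then pulls a corrupted pattern back onto the stored attractor. The network size, pattern number, and noise level below are illustrative choices, not values from our work:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100  # number of spins (neurons)
P = 5    # number of stored patterns, well below the capacity limit ~0.14 N

# random +/-1 firing patterns to be stored
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu; these play the
# role of the synapses, with sign ~ excitatory/inhibitory
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)  # no self-coupling

def relax(s, steps=10):
    """Deterministic (zero-temperature) parallel spin dynamics."""
    for _ in range(steps):
        s = np.sign(J @ s)
        s[s == 0] = 1  # break ties consistently
    return s

# corrupt a stored pattern by flipping 10% of its spins ...
s = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
s[flip] *= -1

# ... and let the dynamics fall back into the attractor
recovered = relax(s)
overlap = recovered @ patterns[0] / N  # overlap m = 1 means perfect recall
```

At this low storage load the dynamics typically restores the pattern exactly, i.e. the overlap returns to (or very near) 1.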

In the course of the last ten years or so, we have addressed questions concerning the storage capacity of neural networks, the neural code (low-activity and hierarchically organized patterns), the storage and representation of sequences, analog or graded-response neuron systems, learning algorithms, preprocessing in feed-forward systems, the role of dream sleep (unlearning), and more.

Very recently, we have used neural-network-type modelling to investigate mechanisms that might underlie the process of cell reprogramming.

Recent talks and a list of publications can be found below.

Recent Talks



rk  09.12.2019
