Seminar: Probabilistic Foundations of Neural Networks


We meet on May 23, 12:15 in room B139 to distribute topics and determine dates.


The human brain is a complex network of roughly 10^11 neurons interconnected by roughly 10^15 synapses. How does information propagate between distant regions of the brain? How can the process of learning influence the brain's network structure? What are the mechanisms for efficient memory formation and retrieval? Because these questions are so diverse, the mathematical modeling of neural networks combines a variety of fundamental subfields of probability theory that would otherwise seem unrelated.

A substantial part of the seminar will be devoted to developing the mathematical foundations of classical models from statistical physics, such as Gibbsian systems, bootstrap percolation, and random processes with reinforcement. In the final talks, we discuss more recent articles adapting these models to the special features that are characteristic of neural networks.
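To give a first impression of the kind of models treated in the seminar, here is a toy simulation of 2-neighbour bootstrap percolation on a finite grid. This is purely illustrative; the function name and parameters are our own choices, not taken from the references.

```python
import random

def bootstrap_percolation(n=20, p=0.1, threshold=2, seed=0):
    """Toy bootstrap percolation on an n x n grid.

    Each site starts active independently with probability p; an
    inactive site becomes active once at least `threshold` of its
    (up to 4) grid neighbours are active.  Iterate until no site
    changes and return the final fraction of active sites.
    """
    rng = random.Random(seed)
    active = [[rng.random() < p for _ in range(n)] for _ in range(n)]
    changed = True
    while changed:
        changed = False
        for i in range(n):
            for j in range(n):
                if active[i][j]:
                    continue
                # Count active neighbours inside the grid.
                count = sum(
                    active[x][y]
                    for x, y in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                    if 0 <= x < n and 0 <= y < n
                )
                if count >= threshold:
                    active[i][j] = True
                    changed = True
    return sum(map(sum, active)) / n**2
```

Since the dynamics are monotone, the final density is increasing in the initial density p (for a fixed random seed), which makes the sharp phase transition studied in [1] easy to observe experimentally.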

Target audience. Master Mathematics, Master Financial and Insurance Mathematics, TMP.

Prerequisite. An introductory course in probability theory.

Preliminary plan of talks

Bootstrap percolation [1]
Bootstrap percolation with inhibition [2, Sec. 1-3]
Foundations of perfect simulation [3]
Clan-of-Ancestors method [4]
Stochastic model for spiking neurons [5, Sec. 1-4]
Phase transition in spin glasses [6, Sec. 1, 2]
Storage capacity of the Hopfield model [7, Sec. 1, 2]
Dynamics of stochastic approximation algorithms I [8, Sec. 1-4]
Dynamics of stochastic approximation algorithms II [8, Sec. 5-7]
Strongly reinforced Pólya process on networks [9]
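To give a flavour of the reinforcement models in the final talks, here is a toy two-colour strongly reinforced urn. This is only an illustrative sketch, not the graph-based model of [9]; the function name and parameters are our own.

```python
import random

def reinforced_urn(alpha=2.0, steps=2000, seed=0):
    """Toy strongly reinforced Pólya urn with two colours.

    A colour currently holding c balls is reinforced with probability
    proportional to c**alpha.  For alpha > 1 ("strong" reinforcement),
    one colour eventually receives all but finitely many
    reinforcements.  Returns the final counts of both colours.
    """
    rng = random.Random(seed)
    counts = [1, 1]
    for _ in range(steps):
        w0 = counts[0] ** alpha
        w1 = counts[1] ** alpha
        # Reinforce colour 0 with probability w0 / (w0 + w1).
        if rng.random() < w0 / (w0 + w1):
            counts[0] += 1
        else:
            counts[1] += 1
    return counts
```

Running this for alpha well above 1 typically shows one colour dominating quickly, while alpha = 1 recovers the classical Pólya urn, whose proportions converge to a random limit.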


[1] Aizenman, M.; Lebowitz, J.L. Metastability effects in bootstrap percolation, Journal of Physics A: Mathematical and General 21 (1988), 3801--3813.

[2] Einarsson H., Mousset F., Lengler J., Panagiotou K., Steger A. Bootstrap percolation with inhibition. arXiv:1410.3291.

[3] Møller, J. A review of perfect simulation in stochastic geometry. In: Selected Proceedings of the Symposium on Inference for Stochastic Processes, IMS Lecture Notes Monogr. Ser. 37, 333--355.

[4] Ferrari, P. A.; Fernández R.; Garcia, N. L. Perfect simulation for interacting point processes, loss networks and Ising models, Stochastic Processes and their Applications 102 (2002), 63--88.

[5] Galves, A.; Löcherbach, E. Infinite Systems of Interacting Chains with Memory of Variable Length -- A Stochastic Model for Biological Neural Nets, J. Stat. Phys. 151 (2013), 896--921.

[6] Bovier, A. A short course in mean field spin glasses. In: Spin Glasses: Statics and Dynamics, Progress in Probability, vol. 62.

[7] Bovier, A.; Gayrard, V. Rigorous bounds on the storage capacity of the dilute Hopfield model, J. Stat. Phys. 69 (1992), 597--627.

[8] Benaïm, M. Dynamics of stochastic approximation algorithms. In: Séminaire de Probabilités XXXIII, Lecture Notes in Mathematics, vol. 1709, 1--68.

[9] van der Hofstad, R.; Holmes, M.; Kuznetsov, A.; Ruszel, W. Strongly reinforced Pólya urns with graph-based competition, Annals of Applied Probability 26 (2016), 2494--2539.