CECAM-MARVEL Classics in molecular and materials modelling: Giorgio Parisi and Marc Mézard
In this series, methods that have become fundamental tools in computational physics and chemistry are presented by their originators at a level appropriate for master's and graduate students. The lectures are followed by an interview: we ask our guests to recall for us the period, problems, people and circumstances that accompanied the creation of the milestone methods and algorithms that we now routinely use.
This time, the second part of the session will be moderated by L. Berthier, Université de Montpellier.
Join us to share this exciting opportunity to learn first-hand from our pioneers and get to know better the genesis of work that is now recorded in books!
15:00 – Introduction
15:05 – 3D simulations of spin glasses on dedicated architectures (G. Parisi)
15:45 – Spin glasses concepts and algorithms in hard constraint satisfaction problems (M. Mézard)
16:25 – Break
16:35 – Interview and recollections (moderator L. Berthier)
17:30 – End
3D simulations of spin glasses on dedicated architectures
Giorgio Parisi, Università di Roma La Sapienza
In this talk I will present very large-scale simulations of spin glasses in three and four dimensions.
After an introduction to the theoretical framework based on replica symmetry breaking, I will present simulations, both at equilibrium and out of equilibrium, focusing on the large-scale behaviour.
Spin glasses concepts and algorithms in hard constraint satisfaction problems
Marc Mézard, Università Bocconi Milano
Spin glass theory has had a large impact on many fields. Among them, a new field of research is rapidly expanding at the crossroads of statistical physics, information theory and combinatorial optimization. It deals with problems that are very important in each of these fields, such as spin glasses, error correction, and satisfiability. This talk will review how the cavity method, initially developed to understand spin glass theory in a framework more transparent than the replica method, can be transformed into message-passing algorithms that turn out to be quite efficient for several large-scale problems of constraint satisfaction and statistical inference.
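To give a flavour of the cavity-to-algorithm connection described in the abstract, here is a minimal sketch (not taken from the talk) of cavity message passing on the simplest possible model, an Ising chain, where the method is exact. The couplings, fields, and inverse temperature are made-up illustrative numbers; the messages are the effective "cavity fields" each spin receives from its left and right neighbours, and the resulting magnetisations are checked against exact enumeration.

```python
import itertools
import math

# Illustrative sketch: cavity (message-passing) computation of spin
# magnetisations on an Ising chain, compared with brute-force enumeration.
# All numbers below are arbitrary examples, not from the lecture.

beta = 1.0                    # inverse temperature
J = [0.5, -0.3, 0.8]          # couplings between neighbouring spins
h = [0.2, 0.0, -0.1, 0.4]     # local fields on 4 spins
n = len(h)

# u[i]: effective cavity field on spin i coming from its left neighbour;
# v[i]: same, coming from the right.  The recursion below is the standard
# cavity update for pairwise Ising interactions on a chain.
u = [0.0] * n
for i in range(1, n):
    u[i] = math.atanh(
        math.tanh(beta * J[i - 1]) * math.tanh(beta * (h[i - 1] + u[i - 1]))
    ) / beta

v = [0.0] * n
for i in range(n - 2, -1, -1):
    v[i] = math.atanh(
        math.tanh(beta * J[i]) * math.tanh(beta * (h[i + 1] + v[i + 1]))
    ) / beta

# Magnetisation of each spin from its total local field.
m_bp = [math.tanh(beta * (h[i] + u[i] + v[i])) for i in range(n)]

# Exact check: enumerate all 2^n configurations of the chain.
def energy(s):
    e = -sum(J[i] * s[i] * s[i + 1] for i in range(n - 1))
    e -= sum(h[i] * s[i] for i in range(n))
    return e

Z = 0.0
m_exact = [0.0] * n
for s in itertools.product([-1, 1], repeat=n):
    w = math.exp(-beta * energy(s))
    Z += w
    for i in range(n):
        m_exact[i] += s[i] * w
m_exact = [m / Z for m in m_exact]

print("cavity:", m_bp)
print("exact: ", m_exact)
```

On a chain (or any tree) the two sets of magnetisations coincide; the point of the methods discussed in the talk is that the same message-passing structure remains a powerful approximation, and an efficient algorithm, on the sparse random graphs underlying hard constraint satisfaction problems.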
About the speakers
Giorgio Parisi is professor emeritus of Theoretical Physics at the University of Rome La Sapienza and associate researcher at the INFN National Institute of Nuclear Physics. From 2018 to 2021 he was President of the Accademia Nazionale dei Lincei, and currently acts as President of the Class of Physical Sciences, Mathematics and Natural Sciences and Vice President of the Academy. Born in Rome in 1948, Parisi completed his studies at the Sapienza University of Rome, where he graduated in physics in 1970. Throughout his scientific career, Giorgio Parisi has made many decisive and widely recognized contributions in different areas of physics: particle physics, statistical mechanics, fluid dynamics, condensed matter, and supercomputing. He has also written articles on neural networks, immune systems and the collective movement of groups of animals. Among other recognitions, Parisi was awarded the Boltzmann Medal in 1992 for his contributions to the theory of disordered systems, the Dirac Medal in Theoretical Physics in 1999, the Max Planck Medal in 2011, the Lars Onsager Prize of the American Physical Society in 2016, the Nature Award for Mentoring in Science in 2013, and the Wolf Prize in Physics in 2021. In 2021 Giorgio Parisi was awarded the Nobel Prize in Physics.
Marc Mézard is a Professor of Theoretical Physics. He studied physics at the Ecole normale supérieure in Paris and obtained his PhD in 1984. He then joined the CNRS in Paris and was Research Director at Université Paris-Sud. From 2012 to 2022 he was Director of the Ecole normale supérieure, after which he joined Bocconi University as a professor in the newly created department of computational sciences. Prof. Mézard's work focuses on the statistical physics of disordered systems, with applications in various fields such as information theory, computer science, machine learning, and biophysics. In recent years his research has focused on information processing in neural networks, machine learning and deep networks, with specific interest in the theoretical impact of data structure on learning strategies and generalization performance.
Previous CECAM and MARVEL lectures can be found at: