Information and Criticality in Complex Stochastic Systems / Nicoletti, Giorgio. - (2023 Apr 27).

Information and Criticality in Complex Stochastic Systems

NICOLETTI, GIORGIO
2023

Abstract

In this Thesis, we explore how tools from Statistical Physics and Information Theory can help us describe and understand complex systems. We often deal with minimal and paradigmatic models characterized by stochastic processes, employing both analytical techniques and numerical simulations.

In the first part of this Thesis, we focus on the interplay between internal interactions and environmental changes, and on how they shape the properties of the many degrees of freedom of a complex system. We model the environment as an independent stochastic process that affects the parameters of the system, a paradigm that has proven fruitful in several fields, such as changing carrying capacities in ecosystems, switching environments in chemical systems, mutating strategies in microbial communities, different regimes of neural activity, and diffusion in disordered or inhomogeneous media. Since the environmental state is not known at any given time, we are interested in the joint distribution of the internal degrees of freedom marginalized over the environmental ones. By computing the mutual information of this marginalized distribution in different scenarios, we explicitly characterize the internal and effective dependencies arising in the system, showing how mutual information encodes both internal and environmental processes and how it can help us disentangle them. Yet, we are often unable to observe and describe all the internal dependencies of a complex system, and we typically resort to effective representations. By building information-preserving projections, and focusing on the paradigmatic case of underdamped systems, we show that these optimal effective representations may be unexpectedly singular and undergo abrupt changes. These results help us understand the fundamental limits of approximating complex models with simpler effective ones while attempting to preserve their dependencies, which may arise from unknown environmental changes.

In the second part of this Thesis, we leverage this approach, together with ideas from criticality, and apply it to neural systems at different scales. Building on the hypothesis that brain dynamics may be poised near a critical point, we find seemingly scale-free correlations and power-law distributed neuronal avalanches in data from the somatosensory barrel cortex of rats. However, we show that an environment-like latent variable modeling unobserved neural activity may produce power-law neuronal avalanches even in the absence of criticality. Remarkably, the properties of mutual information suggest that, whereas avalanches may emerge from an external stochastic modulation, interactions between neural populations are the fundamental biological mechanism behind the seemingly scale-free correlations. We further explore the role of the structure of interactions at a different but equally relevant scale: whole-brain dynamics. We develop a stochastic continuous-time formulation of a well-known cellular automaton, showing that its mean-field limit predicts a bistability region. On an interaction network, however, a continuous transition appears instead, with localized oscillations that provide a dynamical mechanism for the emergence of the clusters of activity known to generate functional networks. Our results shed light on the role of the interaction network, its topological features, and unobserved modulation in the emergence of collective patterns of brain activity.
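As a concrete aside, the environmental-marginalization idea above can be illustrated with a minimal Python sketch (not taken from the Thesis; all parameter values are illustrative). Two degrees of freedom x1 and x2 are independent given a two-state environment E that sets their variance; marginalizing over E couples them, and the mutual information of the resulting joint distribution, I(x1; x2) = E[log p(x1, x2) - log p(x1) - log p(x2)], is strictly positive even though no direct interaction is present:

    # Minimal sketch: environment-induced mutual information between two
    # conditionally independent Gaussian variables. Illustrative parameters.
    import numpy as np

    rng = np.random.default_rng(0)

    p_env = np.array([0.5, 0.5])    # stationary occupation of the two environmental states
    sigmas = np.array([0.5, 2.0])   # state-dependent standard deviation of x1 and x2

    def gauss(x, s):
        return np.exp(-x**2 / (2 * s**2)) / np.sqrt(2 * np.pi * s**2)

    # Sample from the joint: pick an environmental state, then draw x1 and x2
    # independently given that state.
    n = 200_000
    env = rng.choice(2, size=n, p=p_env)
    x1 = rng.normal(0.0, sigmas[env])
    x2 = rng.normal(0.0, sigmas[env])

    def marginal(x):
        # p(x) = sum_e p(e) N(x; 0, sigma_e^2), the environment-averaged marginal
        return sum(p * gauss(x, s) for p, s in zip(p_env, sigmas))

    # p(x1, x2) = sum_e p(e) p(x1|e) p(x2|e): conditional independence given E
    p_joint = sum(p * gauss(x1, s) * gauss(x2, s) for p, s in zip(p_env, sigmas))

    # Monte Carlo estimate of I(x1; x2) over samples from the joint
    mi = np.mean(np.log(p_joint) - np.log(marginal(x1)) - np.log(marginal(x2)))
    print(f"environment-induced mutual information: {mi:.4f} nats")  # > 0

If the two environmental variances were equal, the mixture would collapse to a single Gaussian and the estimated mutual information would vanish, which is one way to see that the dependency here is purely of environmental origin.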
Finally, these ideas lead us to a phenomenological coarse-graining procedure for neural time series. We test it on equilibrium and non-equilibrium models, as well as on models of conditionally independent variables in a stochastic environment, which, as we show throughout this Thesis, often display non-trivial and unexpected features. In doing so, we further probe the fascinating hypothesis that the Statistical Physics of phase transitions may serve as a powerful and universal framework for understanding biological and living systems.
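To give a flavor of what such a coarse-graining can look like in practice, here is a minimal, generic Python sketch in the spirit of real-space coarse-graining of neural activity: maximally correlated variables are greedily paired and summed, and one tracks how statistics such as the variance evolve under repeated iterations. This is an illustrative scheme, not necessarily the exact procedure developed in the Thesis.

    # Minimal sketch of one phenomenological coarse-graining step for time series.
    import numpy as np

    def coarse_grain_step(x):
        """One step: greedily pair the most correlated rows of x
        (shape: n_variables x n_timepoints) and replace each pair by its sum.
        If the number of variables is odd, the leftover one is dropped."""
        c = np.corrcoef(x)
        np.fill_diagonal(c, -np.inf)          # never pair a variable with itself
        free = set(range(x.shape[0]))
        coarse = []
        while len(free) > 1:
            idx = sorted(free)
            sub = c[np.ix_(idx, idx)]
            a, b = np.unravel_index(np.argmax(sub), sub.shape)
            i, j = idx[a], idx[b]             # most correlated remaining pair
            coarse.append(x[i] + x[j])
            free -= {i, j}
        return np.array(coarse)

    # Usage on synthetic data: track how the variance of the coarse-grained
    # variables grows under repeated iterations of the procedure.
    rng = np.random.default_rng(1)
    x = rng.normal(size=(64, 5000))
    for step in range(4):
        x = coarse_grain_step(x)
        print(f"step {step + 1}: {x.shape[0]} variables, "
              f"mean variance {x.var(axis=1).mean():.3f}")

For independent variables the variance simply doubles at each step; nontrivial scaling of such statistics under coarse-graining is the kind of signature such procedures are designed to probe.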

Use this identifier to cite or link to this document: https://hdl.handle.net/11577/3485780