Dante R. Chialvo

This project approaches brain function and behavior from a complex networks perspective. Networks are sets of nodes linked by connections. The nodes and connections may represent persons and their social relations, molecules and their interactions, or web pages and hyperlinks, often numbering in the thousands or millions. What makes such networks complex is not their size but the interplay of architecture (the network's connection topology) and dynamics (the behavior of the individual network nodes), giving rise to "emergent" properties. Since the complex network is the backbone of the complex system, the understanding of brain and cognitive function can be illuminated by focusing the analysis on the network's structure and dynamics, as discussed in a recent review.

Any observable human behavior results from the concerted action of several brain cortical areas. Given the brain's extensive inter-connectivity, it is safe to assume that for any given area that is activated some others must be inhibited. Only the achievement of an appropriate cooperative-competitive balance can prevent the two extremes: complete silence or explosive activity. This stability problem of the cortex is unsolved, but it is attracting increasing attention at both the experimental and theoretical levels. The basic point is to uncover the rules by which the dynamics of a system of such connectivity and size remain bounded in the "healthy" state while at the same time responding swiftly to even minute demands. We have argued that as long as the brain remains close to a **second order phase transition**, cooperative-competitive correlation patterns, as well as other relevant properties, will naturally emerge from the **critical** character of the dynamics. These ideas are explored in this project using theoretical arguments, mathematical models and fMRI brain imaging in normal subjects and chronic pain patients.
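The two extremes and the critical balance between them can be illustrated with a standard textbook toy, a branching process; this is only a generic sketch of criticality, not the project's own model, and all parameter values below are illustrative. Each active unit activates each of its `n_children` descendants with probability `p`, so the branching ratio `sigma = p * n_children` controls whether activity dies out (sub-critical, "complete silence"), explodes (super-critical, "explosive activity"), or fluctuates over all scales at the critical point `sigma = 1`.

```python
import random

def avalanche_size(sigma, n_children=2, cap=5_000, rng=None):
    """Total number of activations in one avalanche of a branching process.

    Each active unit independently activates each of its n_children
    descendants with probability p = sigma / n_children. The avalanche
    is truncated at `cap` activations so super-critical runs terminate.
    """
    rng = rng or random.Random(0)
    p = sigma / n_children
    active, total = 1, 1
    while active and total < cap:
        active = sum(1 for _ in range(active * n_children) if rng.random() < p)
        total += active
    return total

def mean_size(sigma, runs=300, seed=1):
    """Average avalanche size over many independent runs."""
    rng = random.Random(seed)
    return sum(avalanche_size(sigma, rng=rng) for _ in range(runs)) / runs
```

Comparing `mean_size(0.5)`, `mean_size(1.0)` and `mean_size(1.5)` shows the three regimes: small bounded avalanches, scale-free fluctuations near the critical point, and runaway activity limited only by the cap.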

This project is concerned with mathematical models of learning and memory working *near the regime of criticality*. Learning *is* easier in highly *susceptible* dynamical states, and one of the properties of the critical state is precisely its susceptibility to even minute perturbations. Turing was the first to point out that the complexity of the functioning brain must require some sort of "critical" balance, such that during purpose-directed behavior it stays in a barely sub-critical state, neither super-critical nor deeply sub-critical (highly correlated). With the late Professor Per Bak, we explored this line of thinking by analyzing simple models of self-organized learning. Counter to current paradigms of learning, we have proposed models where the dynamical process of self-organized learning does not involve reinforcing connections, but rather the opposite. With fewer constraints than all previous neural networks, our theoretical formalism outperforms them. It learns from mistakes (not from examples), evolving during learning to a "critical state" (not to a global minimum) from which it can rapidly escape if the connection matrix no longer fits the requirements. An issue of current interest is to elucidate how the critical nature of the environment could progressively shape the structure of neuronal nets. Another aspect concerns the formal understanding and modeling of recent experimental results of Plenz demonstrating neuronal avalanches in cortex.
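The "learning from mistakes" idea can be sketched in a few lines. This is a minimal illustration in the spirit of the model, not the published formulation: the layer sizes, the winner-take-all routing, and the random depression amounts are all assumptions made for the demo. Signals follow the strongest connection from input to middle to output, and the only plasticity rule is to *depress* the connections just used when the output is wrong; correct paths are never reinforced.

```python
import random

def learn_by_mistakes(n_in=4, n_mid=20, n_out=4, max_trials=2000, seed=0):
    """Winner-take-all network that learns a fixed input->output map
    purely by punishing the active path after each mistake.
    Returns the number of trials until all mappings are correct."""
    rng = random.Random(seed)
    w1 = [[rng.random() for _ in range(n_mid)] for _ in range(n_in)]
    w2 = [[rng.random() for _ in range(n_out)] for _ in range(n_mid)]
    target = {i: i % n_out for i in range(n_in)}  # arbitrary desired map

    def answer(i):
        # signal follows the strongest ("extremal") connection at each layer
        m = max(range(n_mid), key=lambda j: w1[i][j])
        o = max(range(n_out), key=lambda k: w2[m][k])
        return m, o

    for trial in range(max_trials):
        if all(answer(i)[1] == target[i] for i in range(n_in)):
            return trial                      # every input mapped correctly
        i = rng.randrange(n_in)
        m, o = answer(i)
        if o != target[i]:
            # learning from mistakes: depress (never reinforce) the used path
            w1[i][m] -= rng.random()
            w2[m][o] -= rng.random()
    return max_trials
```

Because wrong paths are only ever weakened, the system wanders until it finds a configuration that produces no errors, and it can just as quickly abandon that configuration if the required map changes.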

We studied the role of stochastic fluctuations in neural function and uncovered peculiar roles for noise, in the context of the celebrated stochastic resonance phenomenon. More recent work in this project predicts the dynamic response of neural tissue driven simultaneously by several frequencies and noise. In a recent manuscript, "How we hear what isn't there: A neural mechanism for the missing fundamental illusion", we work out a problem with a very long tradition, dating back at least to Pythagoras' investigation of which physical properties cause the pitch of a given sound. The manuscript shows, for the first time, that the intrinsic nonlinear dynamics of noisy neurons *are enough* to extract the "pitch" of a complex sound by a stochastic resonance process happening at a "ghost" frequency not present in the complex input. The theory provides *quantitative* *predictions* which agree extremely well with data from psycho-acoustic and physiological experiments. In addition, the results are of general validity in a variety of fields, essentially whenever periodic signals interfere. Work in this area is used to articulate a neural-based theory for consonance. A recent unexpected spin-off relates similar ghost resonances to sudden climatic changes with a periodicity of 1470 years, in the absence of such forcing periods, as discussed further in a recent paper.
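The ghost-resonance mechanism can be sketched with a noisy threshold detector; this is a toy caricature, not the model of the manuscript, and the harmonic choice (f1 = 2·f0, f2 = 3·f0), threshold, noise amplitude and refractory time are illustrative assumptions. Two harmonics of an absent fundamental f0 sum to a waveform whose tallest peak recurs once per period 1/f0; noise makes a threshold unit fire preferentially at those peaks, so the firing intervals cluster at the "ghost" period even though no input component oscillates at f0.

```python
import math
import random

def ghost_spike_intervals(f0=1.0, threshold=2.5, noise=0.4, dt=0.001,
                          t_max=200.0, refractory=0.5, seed=1):
    """Drive a noisy threshold detector with two harmonics (2*f0, 3*f0)
    of a missing fundamental f0; return the inter-spike intervals."""
    rng = random.Random(seed)
    times = []
    for n in range(int(t_max / dt)):
        t = n * dt
        # complex input: harmonics of f0, with f0 itself absent
        s = (math.sin(2 * math.pi * 2 * f0 * t)
             + math.sin(2 * math.pi * 3 * f0 * t))
        if s + rng.gauss(0.0, noise) > threshold:
            if not times or t - times[-1] > refractory:
                times.append(t)  # spike, then refractory silence
    return [b - a for a, b in zip(times, times[1:])]
```

The median inter-spike interval comes out near 1/f0, the perceived pitch, i.e. the detector responds at the ghost frequency rather than at either input frequency.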