Abstract: Most systems in nature operate far from equilibrium, exhibiting time-asymmetric, irreversible dynamics and producing entropy as they exchange energy and matter with their environment. In neuroscience, effective information processing entails flexible architectures that integrate multiple sensory streams varying in time with internal and external events. Physically, neural computation is, in thermodynamic terms, an out-of-equilibrium, non-stationary process. Cognitively, nonequilibrium neural activity results in dynamic changes of sensory streams and internal states. In contrast, classical neuroscience theory focuses on stationary, equilibrium information paradigms (e.g., efficient coding theory), which often fail to capture the role of nonequilibrium fluctuations in neural processes. Inspired by the success of the equilibrium Ising model in the study of disordered systems and related associative-memory neural networks, we study the nonequilibrium thermodynamics of the asymmetric Sherrington-Kirkpatrick system as a prototypical model of large-scale nonequilibrium networks. We employ a path-integral method to compute a generating functional over trajectories, deriving exact solutions for the order parameters, the conditional entropy of trajectories, and the steady-state entropy production of infinitely large networks. We find that entropy production peaks at a critical order-disorder phase transition but is more prominent in a regime of quasi-deterministic disordered dynamics. While entropy production is increasingly used to characterize various complex systems as well as neural activity, our results reveal that increased entropy production can arise in radically different scenarios, and that combining multiple thermodynamic quantities yields a more precise picture of the system. These results contribute to an exact analytical theory for studying the thermodynamic properties of large-scale nonequilibrium systems and their phase transitions.
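
To make the setting concrete, the following is a minimal numerical sketch of the kind of system discussed above: an asymmetric Sherrington-Kirkpatrick network of binary spins whose couplings J_ij and J_ji are drawn independently, evolved under parallel Glauber dynamics, with the steady-state entropy production estimated by Monte Carlo as the average log-ratio of forward to time-reversed transition probabilities. The update rule, function name, parameter names, and default values here are illustrative assumptions; they are not the exact path-integral solution summarized in the abstract.

import numpy as np

def entropy_production_rate(N=500, beta=1.0, J0=1.0, Js=0.5, H0=0.0,
                            steps=2000, burn_in=500, seed=0):
    # Minimal sketch (assumed parameters): asymmetric SK couplings with
    # mean J0/N and standard deviation Js/sqrt(N), with J_ij and J_ji
    # statistically independent (fully asymmetric network).
    rng = np.random.default_rng(seed)
    J = J0 / N + (Js / np.sqrt(N)) * rng.standard_normal((N, N))
    H = np.full(N, H0)
    s = rng.choice([-1.0, 1.0], size=N)

    samples = []
    for t in range(steps):
        h = H + J @ s                              # effective fields from s(t)
        # Parallel Glauber update: P(s_i(t+1) = +1) = (1 + tanh(beta*h_i)) / 2
        s_new = np.where(rng.random(N) < 0.5 * (1.0 + np.tanh(beta * h)), 1.0, -1.0)
        if t >= burn_in:
            # Entropy production of one step: log-probability of the forward
            # transition s(t) -> s(t+1) minus that of the time-reversed one.
            log_fwd = np.sum(beta * s_new * h - np.log(2.0 * np.cosh(beta * h)))
            h_rev = H + J @ s_new
            log_bwd = np.sum(beta * s * h_rev - np.log(2.0 * np.cosh(beta * h_rev)))
            samples.append(log_fwd - log_bwd)
        s = s_new

    return float(np.mean(samples))                 # entropy production per step (nats)

if __name__ == "__main__":
    for beta in (0.5, 1.0, 2.0):
        print(f"beta = {beta}: sigma = {entropy_production_rate(beta=beta):.3f}")

Because the couplings are fully asymmetric, detailed balance is broken and the estimated entropy production is strictly positive away from the trivial high-temperature limit; sweeping beta in the usage example gives a rough numerical counterpart to the temperature dependence analyzed exactly in the paper.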