Abstract: One of the central concerns of computer science is how the resources needed to perform a given computation depend on that computation. Moreover, one of the major resources required by computers, ranging from biological cells to human brains to high-performance (engineered) computers, is the energy used to run them, i.e., the thermodynamic cost of running them. These thermodynamic costs of performing a computation have been a long-standing focus of research in physics, going back (at least) to the early work of Landauer and colleagues. However, one of the most prominent features of computers is that they are inherently non-equilibrium systems. Unfortunately, the research by Landauer and co-workers on the thermodynamics of computation was done when non-equilibrium statistical physics was still in its infancy, which severely limited the scope and formal detail of their analyses. Recent breakthroughs in non-equilibrium statistical physics hold the promise of letting us go beyond those limitations. Here I present some initial results along these lines, concerning the entropic costs of running (loop-free) digital circuits and Turing machines. These results reveal new, challenging engineering problems concerning how to design computers with minimal thermodynamic costs. They also allow us to begin combining computer science theory and stochastic thermodynamics at a foundational level, thereby expanding both fields.
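As a quantitative baseline for the "thermodynamic costs" alluded to above, the classical result of Landauer (a standard fact of the field, not stated explicitly in this abstract) lower-bounds the expected heat that must be dissipated into a thermal reservoir when one bit of information is erased:

\[
\langle Q \rangle \;\geq\; k_B T \ln 2 ,
\]

where $k_B$ is Boltzmann's constant and $T$ is the temperature of the surrounding heat bath. The entropic costs of (loop-free) digital circuits and Turing machines announced here can be read as extending this kind of bound from single-bit erasure to full computational architectures.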