We define a “complex” system to be one whose collective properties emerge from the dynamics of a very large number of constituent parts or components that operate over much shorter length and time scales. No general theoretical framework exists to describe such multi-component, multiscale systems unless they are at or near a physical equilibrium or steady state, so when this is not the case, modeling is particularly nuanced and challenging. Constructing an approximate model that is simple enough to be tractably analyzed yet sophisticated enough to be capable of meaningful, novel predictions is a balancing act that requires ingenuity as well as a diverse toolbox of mathematical techniques. It is as much an art as a science, an unraveling of a Gordian knot that many scientists would prefer to cleave with a blade of silicon.

Technology can be an empowering and informative tool when used responsibly, but, just as a reliance on calculators has deprived many primary school students of basic arithmetic skills, a reliance on high-powered computing has robbed many scientists of the capacity to develop keen insights into the elegant physics underlying many complex phenomena. Bioinformatics, atomistic simulations, and “big data” approaches all have scientific merit, but too often they are used as a crutch, a sort of intellectual capitulation to the complexity of a problem.

The EGSB team recognizes that, like a fractal, most complexity in nature arises from fairly simple physical principles or rules, and we let this philosophy guide and inform our approach to mathematical modeling. It is our contention that a three-parameter model that only qualitatively fits experimental data, but which enhances our fundamental understanding of a complicated problem, is worth far more than a thirty-parameter model that quantitatively fits the same data while teaching us nothing.