# Simulations bubbling like a stew

Peter Turchin writes very effectively about quantitative modeling and analytical methods in biology. So every so often I like to post an illuminating quote. Here’s his description of maximum likelihood estimation, from *Quantitative Analysis of Movement*:

> Simpler, more direct analyses may make unwarranted assumptions, but they are better at revealing important patterns in the data, and their results can suggest what variables and functional forms to use in the modeling of data. Eventually, however, direct methods of analysis get beyond the bounds of their competence. The general approach discussed in this section can in principle estimate parameters of any model, given infinite amounts of informative data and infinite computer power.
>
> The basic approach is to construct a detailed simulation model (better even, a series of models) and fit it to the data using nonlinear estimation techniques. Jon Schnute colorfully describes a detailed simulation as a "stew" of calculations from which observable quantities (to be compared with the actual data) bubble up to the surface (quoted from Hilborn and Mangel 1997). Nonlinear estimation is the process of adjusting the parameters of the stew (adding more or less salt, increasing or decreasing temperature, etc.) until the stuff that bubbles up resembles the actual data the best. The crudest approach is to change parameters in the simulation by the method of trail [sic] and error and to compare the simulation results to data by eye. A more refined approach is to use some quantitative measure of goodness of fit and a nonlinear minimization routine to search for the best fit automatically (Turchin 1998:295).
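The workflow Turchin describes, where a simulation is run, a goodness-of-fit measure compares its output to data, and a minimization routine adjusts the parameters, can be sketched in a few lines. This is a minimal illustration, not Turchin's own model: the exponential-decay "simulation" and its parameters are hypothetical, chosen only to show the stew-tasting loop.

```python
# A toy version of the "stew" approach: run a simulation, measure how
# badly its output matches the data, and let a nonlinear minimizer
# adjust the parameters until the match improves. The model here is a
# hypothetical exponential decay, stood in for any detailed simulation.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def simulate(params, t):
    """Toy simulation: exponential decay with scale and rate parameters."""
    scale, rate = params
    return scale * np.exp(-rate * t)

# Hypothetical "observed" data: generated from known parameters plus noise,
# so we can check that the fitting procedure recovers them.
t = np.linspace(0.0, 5.0, 50)
observed = simulate((2.0, 0.7), t) + rng.normal(0.0, 0.05, t.size)

def badness_of_fit(params):
    """Quantitative measure of (lack of) fit: sum of squared deviations."""
    return np.sum((simulate(params, t) - observed) ** 2)

# The nonlinear minimization routine searches for the best fit
# automatically, replacing fitting "by eye".
result = minimize(badness_of_fit, x0=(1.0, 1.0), method="Nelder-Mead")
print(result.x)  # estimates near the true (2.0, 0.7)
```

In a real application the least-squares criterion would typically be replaced by a likelihood, and the simulation would be far costlier than a one-line formula, which is exactly why the amount of computer power matters.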

The quote has some relevance to yesterday’s discussion of the Neandertal population structure paper. I’m philosophically reluctant to turn to simulations until I exhaust my analytical options. This is a matter of trusting myself – if I really had a lot of confidence in my ability to choose the right assumptions to underlie my simulations, I might turn to them first. But assumptions are tricky. Analytical models have their own assumptions, but those have the advantage of transparency: I didn’t pick them; they are fundamental to the models.

Still, in some cases it doesn’t take long to exhaust the analytical options. So we let the observable quantities “bubble to the surface” of simulations.