I’ll probably have some more material on quantitative analysis of dispersal in the next few days. Here’s a quote from Peter Turchin (1998:17-18):
Of course, we do not know that animals truly move at random, like flipping coins to decide whether to turn right or left. Each individual could be a perfect automaton, rigidly reacting to environmental cues and its internal states in accordance with some set of behavioral rules. However, even if this were true, we might still choose to model behavior of such animals stochastically, because we would not have the perfect knowledge of all the deterministic rules driving these animals. Even if we did, we might not want to include them all in our dispersal model, since such a model would have an enormous number of parameters and would require a very accurate representation of all environmental "micro-cues." **The point is that randomness is a modeling convention.** Because it is impractical, and not even helpful, to attempt to model individual movement deterministically, we use a more parsimonious probabilistic model.
I’m pausing the quote to point out my boldface. It has become computationally feasible in the last few years to model enormously complicated scenarios with individuals acting pseudo-deterministically. The most popular use of such modeling is to try to constrain dispersal models by geographic conditions, such as local habitat richness, rainfall, or altitude (see also “One model, hold the extra parameters”). Of course, animals really do disperse in ways that depend on such geographic parameters. The question is whether any datasets are sufficient to test models involving so many parameters.
This approach is aptly termed behavioral minimalism (Lima and Zollner 1996). In essence, we adopt a thermodynamic approach: the behavior of individuals is erratic, or irregular, but the redistribution process at the population level has many regular features. There is a direct analogy with thermodynamic theory. The motion of each gas molecule is chaotic and essentially unpredictable, and can only be described probabilistically. When dealing with large numbers of molecules, however, the laws at the aggregate level are for all intents and purposes deterministic. Similarly, the problem of biological dispersal can be treated by starting with a probabilistic description of individual movements (in other words, formulating the problem as a random walk), and then approximating the redistribution process of the ensemble of individuals with a deterministic equation, diffusion.
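To make the random-walk-to-diffusion correspondence concrete, here is a minimal sketch in Python (my own illustration, with arbitrary parameters, not anything from Turchin's book): each walker's path is individually unpredictable, but the ensemble's mean squared displacement grows linearly with time, which is exactly what a diffusion equation predicts.

```python
import random

def simulate_walkers(n_walkers=10000, n_steps=100, step=1.0, seed=42):
    """Independent 1-D random walkers; each step is +step or -step at random."""
    rng = random.Random(seed)
    positions = [0.0] * n_walkers
    for _ in range(n_steps):
        for i in range(n_walkers):
            positions[i] += step if rng.random() < 0.5 else -step
    return positions

positions = simulate_walkers()
msd = sum(x * x for x in positions) / len(positions)
# No single trajectory is predictable, but the ensemble's mean squared
# displacement is approximately step**2 * n_steps (here ~100): the
# diffusive signature, with effective coefficient D = step**2 / 2 per step.
```

Any one walker's final position tells us almost nothing; the aggregate statistic is the deterministic regularity at the population level that Turchin describes.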
The effective scale of stochastic versus deterministic processes is important. I’m chiefly interested in the dispersal of adaptive genes in human populations, for which the deterministic approximation has become increasingly relevant over time, as the sizes of regional populations grew. Still, the present pattern in many cases may reflect the stochasticity of populations from earlier time periods, when they were smaller. And formerly important deterministic processes, such as the adoption of agriculture, may no longer be directly observable. So how do we model variance?
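One way to get a feel for the scale question (a toy sketch of my own, not a model of any real population; all parameters are arbitrary) is to ask how much replicate populations of unbiased random walkers differ from one another. The spread of the population-mean displacement shrinks roughly like 1/√N, so small populations retain a large stochastic signature while large ones behave nearly deterministically:

```python
import random

def mean_final_position(n_individuals, n_steps=50, rng=None):
    """Mean displacement of one population of unbiased 1-D random walkers."""
    rng = rng or random.Random()
    total = 0.0
    for _ in range(n_individuals):
        x = 0.0
        for _ in range(n_steps):
            x += 1.0 if rng.random() < 0.5 else -1.0
        total += x
    return total / n_individuals

def spread_across_replicates(n_individuals, n_replicates=100, seed=1):
    """Standard deviation of the population mean across replicate populations."""
    rng = random.Random(seed)
    means = [mean_final_position(n_individuals, rng=rng)
             for _ in range(n_replicates)]
    mu = sum(means) / len(means)
    return (sum((m - mu) ** 2 for m in means) / len(means)) ** 0.5

small = spread_across_replicates(10)    # spread ~ sqrt(n_steps / 10): large
large = spread_across_replicates(1000)  # spread ~ sqrt(n_steps / 1000): small
```

The same logic suggests why a present-day pattern can preserve the stochastic noise of a small ancestral population even after the population has grown large.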
The thermodynamic approach to dispersal does not have to assume that the movement of each "particle" is completely random. The important feature of this approach is that we can control the degree of realism in the model. Environmental factors that have strong effects on movement can be included explicitly in the model, while other factors that have weak effects (or about which we have no information) are included in the stochastic component.
This would incorporate the geographic modeling approaches mentioned above – deterministic processes related to spatial variance of habitat or dispersal potential. But then the important step must be to find a minimal deterministic model to account for the data, and then test it with other observations – such as more extensive genetic sampling, archaeological information, or historical documentation.
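As an illustration of that decomposition (again my own toy example, with arbitrary parameters), a biased random walk puts the one environmental factor we model explicitly into a deterministic drift term and leaves everything else in the coin flip. At the population level the two separate cleanly into mean displacement (drift) and variance around it (diffusion):

```python
import random

def biased_walkers(n_walkers=5000, n_steps=200, p_right=0.6, seed=7):
    """Biased 1-D walkers: p_right encodes the one explicitly modeled
    environmental factor (a stand-in for, say, a habitat gradient);
    the random draw absorbs every factor we chose not to model."""
    rng = random.Random(seed)
    finals = []
    for _ in range(n_walkers):
        x = 0
        for _ in range(n_steps):
            x += 1 if rng.random() < p_right else -1
        finals.append(x)
    return finals

finals = biased_walkers()
mean = sum(finals) / len(finals)
var = sum((x - mean) ** 2 for x in finals) / len(finals)
# The population-level pattern splits into a deterministic drift,
# mean ~ n_steps * (2 * p_right - 1) = 40, plus diffusion about it,
# variance ~ 4 * p_right * (1 - p_right) * n_steps = 192.
```

A minimal deterministic model in this spirit keeps only the drift term, and the residual variance is what independent data, whether genetic, archaeological, or documentary, would have to account for.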
Lima SL, Zollner PA. 1996. Towards a behavioral ecology of ecological landscapes. Trends Ecol Evol 11:131-135.

Turchin P. 1998. Quantitative Analysis of Movement. Sinauer, Sunderland MA.