The scientific theme of the paper is body size and dimorphism. The species presumed to have made all the trackways is Australopithecus afarensis, the only species so far reported from fossil remains at Laetoli, although the tracks, at 3.66 million years old, are a bit more ancient than any of the fossils. This is the same species as the Lucy skeleton, found at Hadar, Ethiopia, and the “First Family” series of fossils from the Hadar locality known as A.L. 333. In 2010, Yohannes Haile-Selassie and colleagues reported a partial skeleton from Woranso-Mille, Ethiopia, some 3.6 million years old, which also seems to represent a large male individual, one that stood just under 160 cm tall. Based on a regression of foot size to stature, the new footprint trail in test pit L8 represents an individual that probably stood around 165 cm, with 10 cm or so of error either way.
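To make the arithmetic concrete, here is a minimal sketch of how a ratio-based stature estimate works. The 0.15 foot-length-to-stature ratio below is a widely cited rough heuristic, not the regression Masao and colleagues actually fit, and the footprint length is invented for illustration.

```python
# A minimal sketch of ratio-based stature estimation from a footprint.
# The 0.15 ratio (foot length is roughly 15% of stature in humans) is a
# common heuristic, NOT the paper's regression; all values are illustrative.

def stature_from_foot_length(foot_length_cm, ratio=0.15, ratio_sd=0.01):
    """Estimate stature (cm) from foot length, with a rough interval
    reflecting uncertainty in the ratio itself."""
    central = foot_length_cm / ratio
    low = foot_length_cm / (ratio + ratio_sd)   # bigger ratio -> shorter person
    high = foot_length_cm / (ratio - ratio_sd)  # smaller ratio -> taller person
    return central, (low, high)

# Hypothetical 25 cm footprint:
estimate, (low, high) = stature_from_foot_length(25.0)
print(f"stature ~ {estimate:.0f} cm (roughly {low:.0f}-{high:.0f} cm)")
```

Even this toy version shows why the published estimate carries 10 cm or so of error in each direction: small uncertainty in the ratio translates into a wide stature interval.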
Here’s a neat graphic showing stature estimates for early hominins up through early H. erectus:
That’s a bit complicated but the point is pretty clear. A. afarensis overlaps with H. erectus substantially in stature. If we consider only the tiny Lucy skeleton (the lowest “x” in the figure at less than 110 cm), we get a misleading view of body size in this early hominin species. But at the same time, Lucy and some other specimens of A. afarensis really are quite a lot smaller than any H. erectus specimens. Applying some statistics, Masao and colleagues conclude that A. afarensis was more variable and sexually dimorphic than modern humans and H. erectus.
This idea of higher dimorphism in early hominins has been the subject of pointed debate over the past fifteen years, a debate driven by insufficient fossil data. One group of authors has, through the use of increasingly complicated statistical games, tried to show that a tiny sample of fossils is not really as variable as it looks to the eye.
These new data don’t revolutionize the question; they just move the ball downfield slightly. Adding a few more data points, even very large individuals, doesn’t vastly narrow the confidence limits on second- and higher moments of a size distribution. But what these footprints should remind us is that discovering new fossils is a lot more valuable than statistical games. With that in mind, I think we should also be skeptical about whether these footprints were really produced by A. afarensis. That species already has problems at Hadar and at Woranso-Mille, where some researchers now recognize that multiple species are present. At Laetoli, we should probably apply a level of skepticism to the idea that only one fossil species could be present.
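To make the earlier point about confidence limits concrete, here is a small illustration, not taken from the paper: the standard chi-square interval for a standard deviation stays wide at small sample sizes, and adding a few individuals barely narrows it. The sample standard deviation below is invented.

```python
# Illustration (not from the paper) of why a few extra data points barely
# narrow confidence limits on a second moment: the chi-square interval
# for a standard deviation stays wide at small n.
from scipy import stats

def sd_confidence_interval(s, n, level=0.95):
    """95% CI for a population SD, given sample SD s and sample size n,
    assuming an underlying normal distribution."""
    alpha = 1 - level
    df = n - 1
    lower = s * (df / stats.chi2.ppf(1 - alpha / 2, df)) ** 0.5
    upper = s * (df / stats.chi2.ppf(alpha / 2, df)) ** 0.5
    return lower, upper

s = 10.0  # hypothetical sample SD of stature, in cm
for n in (5, 8, 30):
    lo, hi = sd_confidence_interval(s, n)
    print(f"n={n:2d}: true SD could be anywhere from about {lo:.1f} to {hi:.1f} cm")
```

Going from five individuals to eight shrinks the interval only modestly; it takes samples far larger than any hominin fossil series to pin down variance with any precision.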
As I was reading this new paper, what struck me was that the authors had some difficulty being certain that the new footprints represent the same “Footprint Tuff” in which the older track G hominin footprints were found back in the 1970s. They found that the original scientists involved in describing the footprints, including Tim White, Mary Leakey, and Richard Hay, had been very thorough in describing the footprints themselves and some aspects of the geological setting. But those original descriptions just did not present sufficient detail about aspects essential to recognizing the geological layers in the field. There were no published photographs of the stratigraphic sequence, for example, and no description of the color or “eye-scale characteristics” of the tuffs. Without such details, it is difficult to do replicable work on new aspects of the geology. That is, the level of detail sufficient to publish fossils in the 1980s does not meet the basic needs of scientists today. We need better descriptions of the context of fossils, and we need to know when those fossils are really in situ, as these footprints are.
We have come a long way in understanding aspects of microstratigraphy and taphonomy, and we have come to demand greater contextual detail in addition to the basic description of fossils. You can see that shift manifested in this paper by Masao and colleagues, which includes clear field descriptions of all the geological units they encountered.
What we demand in descriptions today is also much more detailed than forty years ago. Then, it was sufficient to publish line drawings of footprints, with a few topographic views. But even then, researchers recognized that publications didn’t provide all the detail that was necessary to really evaluate the science. So they made casts available to allow other researchers to compare the evidence. Of course we all know that some scientists have stopped exercising such care, but for the rest of us, we place higher demands on the evidence we’re willing to accept.
Today, we can make three-dimensional models available instantly. So in addition to the different modes of visualization used in the research paper, Masao and colleagues have placed research-grade models of the footprint trails on MorphoSource for anyone to download. That’s the same mechanism we used to distribute 3D surface models of the Homo naledi discovery, and it is great to see more and more scientists taking advantage of the opportunity to increase the replicability and quality of their science in this way.
It’s just really exciting to see researchers around the world engaging with new discoveries like this, getting them out to the public in open access journals, and building the global support for our science.
Masao, F., Ichumbaki, E. B., Cherin, M., Barili, A., Boschian, G., Iurino, D. A., Menconero, S., Moggi-Cecchi, J., & Manzi, G. (2016). New footprints from Laetoli (Tanzania) provide evidence for marked body size variation in early hominins. eLife, 5, e19568. doi:10.7554/eLife.19568
Haile-Selassie, Y., Latimer, B. M., Alene, M., Deino, A. L., Gibert, L., Melillo, S. M., ... & Lovejoy, C. O. (2010). An early Australopithecus afarensis postcranium from Woranso-Mille, Ethiopia. Proceedings of the National Academy of Sciences, 107(27), 12121-12126.
At the city’s apex in 1100, the population exploded to as many as 30,000 people. It was the largest pre-Columbian city in North America, bigger than London or Paris at the time. Its colorful wooden homes and monuments rose along the eastern side of the Mississippi, eventually spreading across the river to St. Louis. One particularly magnificent structure, known today as Monk’s Mound, marked the center of downtown. It towered 30 meters over an enormous central plaza and had three dramatic ascending levels, each covered in ceremonial buildings. Standing on the highest level, a person speaking loudly could be heard all the way across the Grand Plaza below. Flanking Monk’s Mound to the west was a circle of tall wooden poles, dubbed Woodhenge, that marked the solstices.
Despite its greatness, the city’s name has been lost to time. Its culture is known simply as Mississippian. When Europeans explored Illinois in the 17th century, the city had been abandoned for hundreds of years. At that time, the region was inhabited by the Cahokia, a tribe from the Illinois Confederation. Europeans decided to name the ancient city after them, despite the fact that the Cahokia themselves claimed no connection to it.
She reports from the scene where she participated in some archaeological work, including some of my favorite bits: digging in the houses and structures used by ordinary people, on the outskirts of the monumental area.
Given their intelligence, it seems to me likely that the Neanderthals contemplated, in some way, the mysteries of life. Wouldn’t they have wondered not only about unexpected and surprising weather and sky events, but also about what happens when our lives come to an end? If they thought about these questions, did they do so with awe, dread, or reverence?
I provided a quote for the piece, which discusses what kinds of evidence archaeologists consider. I have come to think that the recognition of mortality, and the cultural practices associated with death, may be among the deepest behavioral aspects of human evolutionary history.
One of the most obvious cases of recent human evolution is the increasing frequency with which individuals don’t develop third molars, a condition called “M3 agenesis”: the third molars, or wisdom teeth, never form at all.
I could be wrong, and I’m sure that a reader will remind me if I am, but I cannot think of an instance of M3 agenesis in hominins outside of modern humans. It was entirely typical for most hominins throughout our evolutionary history to develop and erupt third molars into normal occlusion.
Agenesis of the third molars is different from cases where the M3 fails to erupt normally or remains impacted within the jaw. It is also different from the dental or surgical removal of third molars, which is now carried out on a very high proportion of people in the United States and many other countries. The fraction of students in my courses at the University of Wisconsin who still have their third molars is pretty small, generally less than 20 percent and often less than 10 percent. The majority of third molar removal is for orthodontic purposes. When third molars erupt, they exert forces on the rest of the teeth that can cause crowding and malocclusion, and more rarely, severe pain. One of the easiest ways to maintain straight teeth is to pull the ones in the back.
The problem of crowding of the anterior teeth points to the reason that anthropologists have traditionally given for the change toward a higher incidence of agenesis. Our Pleistocene ancestors very rarely had malocclusions due to dental crowding. The development of our jaws and teeth evolved within a context where people lived hunter-gatherer lifestyles and ate fairly tough foods, even as young children. By contrast, crowding of the dentition and resulting malocclusion have been very common in many Holocene populations, mostly those with agricultural subsistence. Agriculturalists don’t eat as many tough foods; they eat a large fraction of cooked grains and easy-to-chew cooked plants, milk, and meats. The idea is that the development of the jaw is plastic, and a reduction of the forces exerted on the dentition during early childhood can alter the developmental trajectory of the mandible and maxilla. If the jaws don’t develop to as large an adult size, the teeth will tend to be crowded, and the last teeth to initiate development, the third molars, may not form at all.
This is an elegant story in some ways, but in fact we don’t know whether it’s true.
We do know that the teeth develop from an embryonic tissue sheet, which is divided into segments by genes during early development. Those segments expand and wrinkle into the form of tooth germs in a characteristic pattern, which is retained across most mammal orders, although many orders have come to have different numbers of teeth through evolution of this early developmental pattern. Within a species, if an individual ends up with a segment that is too small or doesn’t develop quite right, that tooth will not form. The most common change is to the extreme tooth, which is the third molar. Less commonly, individual teeth of the other types may fail to develop – P4 agenesis is fairly frequent, and I2 agenesis occurs sometimes. Some of these kinds of agenesis are syndromic, meaning that other biological traits covary strongly with them.
How jaw forces might influence the pattern of tooth development is not well understood. Nor is it well understood how heritable M3 agenesis may be – to the extent it seems to run in families, this might reflect similar environments or similar genetics.
We do know quite a bit about the incidence of M3 agenesis in different human populations. The trait is nearly universal – it occurs everywhere in large enough samples of people, even in hunter-gatherer peoples who have continued eating “wild” food diets into recent times. But its frequency differs greatly among populations.
Carter and Worthington (2015) did a meta-analysis of studies that have estimated M3 agenesis frequencies in particular human populations. By looking at 92 studies from different regions of the world, they give a global picture of where humans are more or less likely to have agenesis of the third molars. They limited their analysis to studies of living human populations with radiographic evidence, so that agenesis was clearly documented.
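For readers unfamiliar with how frequencies from many studies get combined, here is a toy sketch of the basic inverse-variance pooling idea. Carter and Worthington’s actual analysis is more sophisticated than this, and the study counts below are made up.

```python
# A toy sketch of pooling prevalence estimates from several studies.
# This shows only the basic fixed-effect, inverse-variance idea; it is
# NOT Carter and Worthington's method, and the numbers are invented.

def pooled_prevalence(studies):
    """studies: list of (cases, sample_size) tuples.
    Returns an inverse-variance weighted pooled proportion."""
    weights, estimates = [], []
    for cases, n in studies:
        p = cases / n
        var = p * (1 - p) / n  # binomial variance of the proportion
        weights.append(1 / var)
        estimates.append(p)
    return sum(w * p for w, p in zip(weights, estimates)) / sum(weights)

# Hypothetical studies: (individuals with M3 agenesis, sample size)
studies = [(30, 400), (75, 500), (12, 250)]
print(f"pooled M3 agenesis frequency ~ {pooled_prevalence(studies):.3f}")
```

The key feature is that larger, more precise studies pull the pooled estimate toward themselves, which is why a meta-analysis can say more than any single survey.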
Here’s the picture summarizing the frequencies in different regions:
The incidence of M3 agenesis is lowest in African populations today. This chart represents only two studies of Africans, but the observation of a very low frequency of M3 agenesis is in accord with studies of skeletal samples in my experience. Other populations have a higher frequency, although the study does not include any populations indigenous to Australia, New Guinea, or the nearby areas of Melanesia, which in my experience also have low frequencies of M3 agenesis. Some of the highest population frequencies are observed in Asia, and these samples are broad, including one from South India and several studies of Japanese, Turkish, Israeli, and Iraqi samples. So the pattern is not specifically East Asian; it spans population samples across the continent.
The most common M3 agenesis is a single missing third molar, and that is what is illustrated in the chart. It is less common to be missing two molars, and very uncommon to be missing three or all four of them. Women are slightly more likely to be missing an M3 than men (Carter and Worthington found a 14 percent greater likelihood in women).
I know that many anthropologists lecture about M3 agenesis as a recent evolutionary change, but it is a complicated one. Few look deeply into the pattern, which is strongly parallel across many human populations, not regionalized. In a broad sense, the frequencies of M3 agenesis across populations covary with molar sizes, and it may simply be that the evolution of smaller tooth size has M3 agenesis as a frequent side effect. The greater incidence of agenesis in women also might be expected as a correlate of smaller molar sizes.
There is no evidence that M3 agenesis is itself an adaptation that has been favored by natural or sexual selection. The possibility of sexual selection presents itself in this context because of the possible effects of a crowded dentition on mating preferences in past populations. But if M3 agenesis owes its recent high frequency to selection, it is likely as a side effect of selection for smaller teeth. However, even that is hardly so simple, as the pattern of size reduction in teeth was not uniform in Holocene populations, and the frequencies of M3 agenesis fluctuate substantially among studies.
And we do not know how much of M3 agenesis may be explained by plasticity of development within current environments. Some studies show a difference within a geographic sample between people who have originated from different immigrant populations (for example, in Singapore between people of Chinese, Malay, and Indian ethnicity), but none have really evaluated the frequencies in second-generation and third-generation immigrants. The genetics of the trait, even heritability, is basically an open question. Of course, if it were entirely explained by environment, M3 agenesis would not be an example of evolutionary change at all.
So M3 agenesis is a fascinating example of recent biological change in human populations, and we know very little about how and why it has changed.
Carter, K., & Worthington, S. (2015). Morphologic and demographic predictors of third molar agenesis: A systematic review and meta-analysis. Journal of Dental Research, 94(7), 886-894. doi:10.1177/0022034515581644
Just a note that ducks provide many great examples of hybridization dynamics, particularly invasive ducks. This recent paper on geese by Jente Ottenburghs and colleagues (“Hybridization in geese: a review”) shows that geese are much the same. Lots of goose species, lots of hybridization, including cross-generic hybridization.
Most hybrid geese are fertile; only in crosses between distantly related species do female hybrids become sterile. This fertility pattern, which is in line with Haldane’s Rule, may facilitate interspecific gene flow between closely related species. The knowledge on hybrid geese should be used, in combination with the information available on hybridization in ducks, to study the process of avian speciation.
“You see ‘African American,’ automatically just circle ‘sickle cell,’” said Nermine Abdelwahab, a first-year student at the University of Minnesota Medical School, recounting tips she’s heard from older classmates describing the “sad reality” of the tests.
Medical school curricula traditionally leave little room for nuanced discussions about the impact of race and racism on health, physicians and sociologists say. Instead, students learn to see race as a diagnostic shortcut, as lectures, textbooks, and scientific journal articles divide patients by racial categories, reinforcing the idea that race is biological. That mind-set can lead to misdiagnoses, such as treating sickle cell anemia as a largely “black” disease.
In an episode of M*A*S*H from the early 1980s, Corporal Klinger starts suffering from a rare side effect of the anti-malarial drug primaquine. The doctors know that the drug has the potential for negative side effects in blacks, but issue it to everyone else. Hawkeye and the other doctors assume Klinger is just goldbricking. But another soldier, Private Goldman, starts to exhibit the same symptoms. The doctors determine that both Klinger and Goldman are suffering from anemia, and take them off the primaquine. At the end of the program, it is revealed that people of Levantine origin (like Klinger) and Ashkenazi Jews (like Goldman) may share the same susceptibility to primaquine side effects owing to their ancestry.
The side effect in question is a breakdown of blood cells and consequent anemia in people with G6PD deficiency, which is indeed very common in sub-Saharan and North Africans, and less common but still notable in people of broader Mediterranean descent.
I like the program quite a lot, and I remember it from the first time it was broadcast. It is a well-scripted illustration of how a physician can make erroneous assumptions about ancestry and genetics that lead to bad treatment. But it also goes to show that there’s very little new in today’s attempts to improve medical school training with respect to race and medicine. These are all ideas that were well known more than forty years ago and have been staples of anthropology.
Of course today we can know from anyone’s genotype data whether they have a susceptibility to some adverse drug reactions, and that includes many that do not have much higher frequencies in one population or another. Whatever there is to be said for genotyping, it beats census categories if you are looking to diagnose most common traits influenced by Mendelian genes. If we are training medical students for the world of five or ten years from now, allowing them to make effective use of this information should be the priority.
The article discusses many issues with museums both large and small. The best museums add context to the objects that they display, putting them into a story that builds knowledge in the museum-goer. But some concepts are incredibly difficult to communicate in that fashion, and others rely so much on place that removing objects to a museum does not convey their context accurately.
And the crowds suck. Nothing is better than a huge museum on a very empty day, and those don’t happen very often.
The problem is, museums don’t scale well. The British Museum sees almost 7 million visitors a year. What would it take to accommodate double that number? Ten times that number? It simply cannot be done, not for any reasonable amount of money.
The Internet, on the other hand, is built for scale. The marginal cost of an extra YouTube viewer or app download is practically zero. That’s how a video about the history of Japan can be made for free, distributed for free, and enjoyed for free by more people in a single month than walk through the doors of the Louvre in an entire year.
Museums are a very important part of human evolution research, both by serving as repositories for the objects we study and for helping the public to understand the importance of our science. I’ve consulted with many museums over the years and have visited a large fraction of the major museums of natural history in the world.
Human genetics is more and more important to how we understand human evolution. Yet this is one of the most difficult parts of science to illustrate in a museum setting. Museums excel at visual material and unique objects. While it is possible to do video or virtual content for genetics, whenever I encounter videos at a museum, I groan. They’re always a chore and rarely hit the mark as well as simple text accompanying an object.
Honestly, I think that museums face much the same problem as movies. Studios invest tremendous sums of money in movies that have bad scripts. There are many reasons for this – sometimes the director has too much power and keeps shifting the script, sometimes the original idea relied upon visuals that cannot be realized, sometimes studio executives ruin a cohesive script by committee. Whatever is the case, the ultimate reason why this situation happens so often is the same: Audience demand for certain kinds of movies is just not very responsive to script quality.
Likewise, public visitation to certain museums doesn’t respond much to the quality of stories they can effectively tell. A museum exhibition with bad videos is regrettable, but most people skip the videos anyway, especially if the first few seconds of the first one don’t connect.
The great thing about the idea of virtual experiences is that they may be laboratories for real innovation in storytelling. The stories that work should be translated into the museums that audiences already value. As someone who has done a lot of museum consulting, I can really imagine having a lot of fun helping make virtual experiences that educate and convey exciting science.
The interesting thing is that many of the flakes are indistinguishable on technical grounds from Oldowan flakes. That raises the possibility that intermittent, possibly local traditions of nut-cracking among some forms of primates might create the appearance of localized flake assemblages.
He also thinks that archaeologists should spend more time looking for the works of ancient monkeys and apes. “[We should] consider what other non-human primates in Africa and elsewhere may have been up to for the past tens of millions of years,” he says. “There is no reason why stone flakes may not be littered throughout primate history, at unknown places and times.”
Actually, I think there is nothing materially different about what these capuchins are doing and what early hominins were doing when they made flakes. One may say that the hominins made flakes “intentionally”, or with a goal in mind to use the flake. But the difference here is not cognitive; it is that the capuchins have not learned socially to use flakes for anything.
Primatologists already observe capuchin monkeys learning socially to lift rocks nearly half their body weight, thrust them down onto a platform with a fruit, and repeat this until yummy bits of nutmeat scatter everywhere. It would not take any great cognitive advance for them to learn how to use flakes, if there were something useful for them to do. The ability to generalize is simply the ability to emulate others.
Notable paper: William Hutchison et al. (2016) A pulse of mid-Pleistocene rift volcanism in Ethiopia at the dawn of modern humans. Nature Communications 7, 13192. doi:10.1038/ncomms13192
Synopsis: Many of today’s lakes and volcanic calderas of the central Ethiopian rift were the outcome of a cluster of volcanic activity between 320,000 and 170,000 years ago.
Interesting because: Early examples of Middle Stone Age (MSA) archaeological industries were developing at this time, and possibly the immediate ancestors of most of today’s gene pool were making some of them. Volcanoes might have affected the local environment.
KILL THIS QUOTE WITH FIRE: “current evidence overwhelming [sic] suggests that all major events in hominin evolution occurred in East Africa”. No. Just no.
Bottom line: It is very hard to test how local environmental shifts may have affected hominin populations, whether by prompting adaptation or by creating population sinks. My guess would be that a region full of active volcanoes probably acted either as a sink or a barrier to gene flow. At the extreme, these volcanoes may have impeded migration from the Afar region further south along the African rift during the later Middle Pleistocene, making this fossil-rich region a relative cul-de-sac. However, in my opinion a 30-km zone of unpleasantness surrounding a volcanic caldera is not much of a barrier to mobile and interconnected hominin populations. Ancient people probably looked in wonder at the great forces within the earth, and watched their children play in the snowing ash.
The opinion is often expressed that species and races are arbitrary categories. This opinion is false. If given the opportunity to secure the necessary data, a biologist is able in a majority of cases to decide beyond a reasonable doubt whether the forms under study are distinct species or only distinct races. Lion, tiger, leopard, and domestic cat are species; Angora cat and alley cat are surely not species but races. However, “borderline cases”, in which it is impossible to decide whether one is dealing with species or with races, do exist. Indeed, their existence was used by Darwin to demonstrate organic evolution. If species are the primordial units of creation, or else if they arise by sudden leaps (as thought by G. St. Hilaire and recently by Goldschmidt), then we should be able to find methods to decide whether any two forms are still races or already species. If, on the other hand, species evolve gradually from races, then the decision will be possible only in some, perhaps in a majority, of cases, but at least some instances must be found in which forms are too distinct to be races but not distinct enough to be species. Evolutionists have concentrated their efforts on proving that such borderline cases do exist; by indirection they conveyed to biologists in general the impression that there are no other but borderline cases.
This essay should be required reading for graduate students: “The problem with p-values”. David Colquhoun writes extensively about science and statistics, and in this essay he brings out many of the biggest misconceptions that drive poor conclusions in scientific practice.
Even quite respectable sources will tell you that the p-value is the probability that your observations occurred by chance. And that is plain wrong.
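The arithmetic behind Colquhoun’s argument can be condensed to a few lines. If only a modest fraction of tested hypotheses are actually true, then even with decent statistical power a large share of “significant” results are false positives; with his illustrative numbers (prior probability 0.1, power 0.8, α = 0.05), roughly a third of p < 0.05 findings are wrong.

```python
# A condensed version of the arithmetic behind Colquhoun's argument.
# Assume some fraction of tested hypotheses are actually true; a p < 0.05
# result can still be wrong far more often than 5% of the time.

def false_discovery_rate(prior=0.1, power=0.8, alpha=0.05):
    """Fraction of 'significant' results that are false positives,
    given the prior probability that a tested effect is real."""
    true_positives = prior * power        # real effects correctly detected
    false_positives = (1 - prior) * alpha # null effects wrongly 'detected'
    return false_positives / (true_positives + false_positives)

print(f"FDR ~ {false_discovery_rate():.2f}")  # about 0.36 with these inputs
```

The p-value alone tells you none of this; the false discovery rate depends on how plausible the hypothesis was to begin with, which is exactly the point the essay drives home.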
Paleoanthropology is one field in which papers that point out poor use of statistics are publishable. The primary fossil data are very sparse, and we work hard to establish what little we can say with confidence. There are generally scientists willing to criticize statistically misleading attempts to answer the unanswerable.
Personally, I’ve spent a good amount of time looking into the basic statistical underpinnings of human evolution datasets. For example, my paper “How much can cladistics tell us about early hominid relationships?” showed that the datasets of most hominin species are simply not big enough to yield confident conclusions about how they are related to each other. A later paper, “No brain expansion in Australopithecus boisei”, worked through statistical issues with time-series data on which paleoanthropologists have often based conclusions about trends in morphological features over time.
My all-time favorite paper outlining statistical problems in human evolution research is by Richard Smith, “Biology and body size in human evolution: statistical inference misapplied.” Smith shows a systematic problem with the most common comparisons of ancient human relatives. Now, twenty years after that paper was first published, most papers that consider body masses of fossil hominins still get this wrong.
That’s a well-worn tale in science. Pointing out statistical errors is sisyphean. Many recent papers in human evolution reflect poor statistical practice. And a good number of “classic” results are based on datasets that today would be statistically doubtful. Science is self-correcting, but it is going to take some hard work to get this stuff straight.
Hawks, J. (2004). How much can cladistics tell us about early hominid relationships? American Journal of Physical Anthropology, 125(3), 207-219. doi:10.1002/ajpa.10280
Hawks, J. (2011). No brain expansion in Australopithecus boisei. American Journal of Physical Anthropology, 146(2), 155-160. doi:10.1002/ajpa.21420
Smith, R. J. (1996). Biology and body size in human evolution: Statistical inference misapplied. Current Anthropology, 37(3), 451-481.
"We realised nobody had directly compared Neanderthal [teeth loss] to modern humans, so we didn't realise Neanderthals had [slightly less] tooth loss," says Weaver.
This flies in the face of previous studies, which suggested that several Neanderthals lived long after losing all, or nearly all, their teeth.
But bizarrely, the finding that Neanderthals apparently had healthy teeth actually suggests something rather negative about them.
I don’t disagree that there is a slight difference in tooth loss. That does not contradict the real observation that some Neandertal individuals had extensive premortem tooth loss.
I agree that some people have made more cultural conclusions from the observation of tooth loss than the data warrant. I wrote about this back in 2005, when the subject of discussion was total premortem tooth loss in the Dmanisi skull D3444: “Caring for the edentulous”. My conclusions today don’t differ from then:
In the case of life history variation, I think that the survival of a small number of individuals under extraordinary circumstances says little about the habitual capabilities of a species.
A handful of Neandertals lived with fairly extensive loss of dental function, but that probably doesn’t tell us much about Neandertals that we do not already know about many primates.
What I really object to in the linked article is the way this topic is framed: This good thing about Neandertals “bizarrely suggests something negative about them.”
Often in human evolution, research is presented with a simple storyline that goes like this: “This bad thing everybody knows about, well, guess what—it’s actually good when you think about it from the evolutionary perspective.”
That’s the storyline of the sickle cell mutation. It causes disease, but it is an adaptation to malaria. It’s also the storyline of the “thrifty genotype” idea: Diabetes is bad, but its occurrence today may be a side effect of ancient adaptations to food scarcity.
That framing doesn’t erase the bad aspects of such biological traits, but it does give people a different way of thinking about why bad things happen. “Everything bad is actually good” is a fairly useful frame for teaching human evolution.
The opposite storyline is also pretty common. “That good thing that everybody knows about? Well, guess what—it’s actually bad when you think about it from the evolutionary perspective.”
This is how the invention of agriculture gets portrayed nowadays. Jared Diamond famously called it, “The worst mistake in the history of the human race.” The idea is that once upon a time, humans were adapted to a hunter-gatherer existence, and agricultural subsistence caused scores of bad unforeseen effects.
This “everything good is actually bad” storyline is especially common when it comes to studying Neandertals. For example, Neandertals seem to have eaten lots of meat. Does that mean they were successful hunters optimizing resources in a harsh environment? No, it means they failed to build knowledge of plant foods, putting them at extreme risk of extinction when times got tough.
Another example: Some Neandertal sites seem to have different assemblages of tools that may be suited to different functional tasks, for example some toolkits include many scrapers for preparing hides, while others lack such a dominance of scrapers. Is this evidence of clever and flexible Neandertals? No, to some archaeologists it was evidence that Neandertals must have behaved like herd animals, with women and children in some camps, and small groups of bachelor males in others.
Every so often, new evidence convincingly debunks one of these Neandertal stereotypes. For example, over the past ten years, a series of papers describing starches and phytoliths in Neandertal dental calculus have documented their use of plant resources, including cooking of some grains and the possible use of medicinal plants. In this case and many others, the press has reported the “surprising” conclusion that Neandertals were very much like modern human subsistence foragers.
Just once, I would like to see a journalist report such results as unsurprising evidence that past archaeologists were incompetent.
Now I don’t want to go overboard in the opposite direction. There probably really were some strange things about some Neandertals. Culture did evolve, and the evolutionary history of Neandertals probably yielded cultural abilities that humans lack, just as modern humans may have abilities that they lacked. In other words, I do not assume that they were merely modern humans with browridges.
But these are among the hardest ideas to test with archaeological evidence. At the same time, ideas about Neandertal cognitive difference align with persistent stereotypes about Neandertals. For that reason, I maintain an attitude of skepticism.
We know a good amount. Some of the cultural behaviors of recent and living modern humans have never been noted in Neandertal sites. But their tools, the traces of animals and plants that they ate, and their use of space show that Neandertal subsistence behavior had a lot in common with modern human subsistence foragers.
Many modern human subsistence foraging groups have left little or no evidence of “symbolic” artifacts, “complex site structure”, musical instruments, projectile weapons or similar trappings. We now know that Neandertals used pigments, engraved objects and rock surfaces, wore ornaments, made and used many kinds of bone tools, used shellfish, birds and small mammals—basically all things that past stereotypes held they didn’t do.
The point is, the difference between Neandertal and modern human behavior is clearly not a yawning chasm. They overlapped.
Studying Neandertal biological traits in combination with archaeology has a lot to offer to understanding their behavior. Dental pathology is a great avenue to understand how their health relates to their subsistence behavior. For example, Neandertals were once believed to have a much higher incidence of developmental dental pathologies than modern humans, traits like linear enamel hypoplasias that result from stress on the developing teeth from nutritional shortfalls or disease. It turns out that many modern human groups have just as high an incidence of such dental traits as Neandertals, including children from many agricultural groups.
Both Neandertals and prehistoric modern human subsistence foragers have a vastly lower incidence of dental pathologies like caries when compared to most agricultural peoples, so it’s interesting to see that tooth loss was actually less common among Neandertals than in the prehistoric modern human groups.
Better teeth may reflect the basic fact that Neandertals died faster than the Upper Paleolithic modern humans that followed them. By our best estimates (provided by Rachel Caspari and Sang-Hee Lee), Neandertal mortality was greater across the adult life span. Most known Neandertal dental remains come from relatively young adults, less than thirty or so years old.
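Caspari and Lee’s measure was a ratio of older to younger adults (the “OY ratio”), counting as “older” those individuals who survived to roughly double the age of first reproduction. Here is a sketch of that computation, with invented age estimates rather than their data:

```python
# Caspari and Lee's OY ratio: older adults (old enough to be grandparents,
# roughly twice the age of first reproduction) divided by younger adults.
# Age thresholds and the sample below are invented for illustration.

def oy_ratio(ages, adult_age=15.0, older_age=30.0):
    """Ratio of older adults to younger adults in a sample of
    estimated ages at death."""
    older = sum(1 for a in ages if a >= older_age)
    younger = sum(1 for a in ages if adult_age <= a < older_age)
    return older / younger

# Hypothetical age-at-death estimates for a fossil sample:
sample = [17, 19, 22, 24, 25, 27, 28, 31, 35]
print(f"OY ratio ~ {oy_ratio(sample):.2f}")  # low value: few older adults
```

A low OY ratio in a fossil sample is what underlies the statement that most Neandertal dental remains come from relatively young adults.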
Such high mortality probably does indicate something about Neandertal social relationships. Human cultures today owe much to the knowledge and experience of older adults, who are common in our societies. If those older adults were rarer among Neandertals, with some groups lacking older adults altogether, Neandertal cultures must have been poorer for it.
But adults in their twenties and thirties are not poor caregivers today. In fact, they are the primary caregivers toward both children and the most aged adults in our societies.
For that reason, I resist the framing that Neandertals’ good teeth may have meant they took less care of the sick. It may well have been true that sick Neandertals did not live as long, or have as good a chance of recovery. But I attribute that to the basic challenges of subsistence, not social incompetence.
Caspari, R., & Lee, S. H. (2004). Older age becomes common late in human evolution. Proceedings of the National Academy of Sciences of the United States of America, 101(30), 10895-10900. doi:10.1073/pnas.0402857101
Speth, J. (2004). News flash: Negative evidence convicts Neanderthals of gross mental incompetence. World Archaeology, 36(4), 519-526. doi:10.1080/0043824042000303692
Local villager Kongo Sakkae found some of the footprints prior to 2006, but the site didn’t reach scientists’ attention until 2008, when Pennsylvania-based conservationist Jim Brett happened to be staying at the Lake Natron Tented Camp, just a few hundred yards from the footprints.
Stunned by what he saw, Brett snapped as many pictures as he could and resolved to pass them along to a scientist he knew he could trust: Liutkus-Pierce, whom he had met when she was a postdoctoral researcher.
The trouble was, Brett picked the worst possible day to call.
“It was April Fool’s Day, I kid you not,” says Liutkus-Pierce. “He called me and said, ‘I think I have found some really cool hominid footprints.’ And I said, ‘Jim, can you call me tomorrow, so I know that this is not a joke?’”
The article gives a very nice account of the realization that these are fossil footprints, notes that they had been known to some local people earlier, discusses how collaborators were brought into the project, does not avoid mentioning a big problem with one of the researchers, and generally does a great job of showing how science is done.