“Neanderthal Museum” by suchosch is licensed under CC BY-SA 2.0

I vividly remember my anthropology professor explaining how Raymond Dart is credited with discovering the first Australopithecine in South Africa in 1924— a strange primate that walked on two legs some 3 million years ago. Technically, the real discoverers were miners who were instructed to bring anything strange they found to Dr. Dart’s attention during the course of their excavations. After explaining the discovery of the Taung Child, as it is called, my professor changed his tone to storytelling mode. Dart wanted to make casts of the spectacular “career-making” fossil in England just in case something catastrophic happened to the actual remains. Dart was unable to go to London at the time, so his wife Dora was responsible for transporting the Taung Child to the lab in London. After arriving in the city, Dora took a cab to her hotel, but accidentally left the box with the skull in the backseat of the cab. The Taung Child then got a whirlwind tour of London. Eventually, the taxi driver discovered the box with the skull, assumed it was the head of a baby, and immediately called the authorities. My professor ended the story abruptly by stating that Dart and his wife divorced soon thereafter.

Paleoanthropology

Paleoanthropology is a specialized field in biological anthropology that investigates human origins. This subject is explored through two main avenues—the study of ancient fossils and genetics. Paleoanthropologists group ancient species into taxonomic groups (just as modern primates are classified into taxonomic groups) according to their morphology and sometimes their genetic makeup. This is done in order to understand evolution, or change over time, in the human lineage.

Fossils are simply the remains of once-living organisms whose tissues (usually bones and teeth) have been replaced by rock. Fossils form only when an organism is quickly buried and conditions are ideal. As a result, the fossil record represents only a small fraction of the populations that were once living. Paleoanthropologists also group species based on their genetics. This approach is only possible when tissues have not been fully mineralized, or fossilized. The sparseness of the fossil record and the scarcity of preserved genetic material make paleoanthropology a challenging discipline.

The processes by which species diversify today, as seen in the rock pocket mouse example, were also at play in the past. Mutations occur randomly and processes like natural selection and founder effect lead to diversification in species. These processes can lead to many branches that derive from a common ancestor. Change over time does not happen in a linear step-wise fashion, but rather in a branching fashion. In this chapter, we will look at some of those species that are thought to be ancestral to humans and why they are considered to be in the human family.

The Problem of Time

It was once thought that there wasn’t much time for a lot to happen on planet Earth. Many scholars attempted to calculate the age of the Earth and/or people on the Earth using the available texts. Archbishop James Ussher’s calculation used, in part, the genealogies and reigns outlined in the Bible along with known dates of rulers. Ussher placed the beginning of creation at 4004 BC, roughly 6,000 years ago. These calculations were taken very seriously, and even Sir Isaac Newton attempted an estimate.

While Ussher’s estimate was extremely influential, the physical evidence did not align well with this relatively short time frame. In the 1790s, John Frere, great-great-grandfather of famous paleoanthropologist Mary Leakey, excavated stone tools four meters (about 13 feet) below the ground surface in ancient lake deposits in Suffolk, England. How did these artifacts get to be so deep in the ground, he wondered? Going against conventional thought, Frere concluded, “The situation in which these weapons were found may tempt us to refer them to a very ancient period indeed.” A similar situation was coming to light in France. In the 1840s, a customs official named Boucher de Perthes uncovered stone tools associated with the remains of extinct elephants (mammoths) deep in ancient Somme River gravels. Like Frere, he concluded that the human presence in Europe must be much older than previously thought.

The problem was that people were finding artifacts and fossils in places where conventional thinking said they shouldn’t be. The deeper the geological layers, the older the fossils. These layers and the fossils they contain provide a record of how life has changed on Earth since its beginnings around 4.5 billion years ago. If the Earth’s history were scaled to a single year, humans would appear in the last hour of the year. We now know, based on geology and paleontology, that the Earth is much older than previously believed, and that humans and our cousins are relatively late arrivals.
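
To get a feel for this calendar-year analogy, here is a quick back-of-the-envelope sketch in Python. It assumes an Earth age of 4.5 billion years and roughly 300,000 years for anatomically modern humans, round numbers used only for illustration.

```python
# Scale Earth's history onto a single calendar year and ask how close to
# midnight on December 31 various events fall. All figures are round
# numbers used for illustration.

EARTH_AGE_YEARS = 4.5e9
MINUTES_PER_YEAR = 365 * 24 * 60


def minutes_before_midnight(years_ago: float) -> float:
    """How long before the end of the scaled year an event would occur."""
    return years_ago / EARTH_AGE_YEARS * MINUTES_PER_YEAR


print(f"Modern humans (~300,000 years ago): "
      f"{minutes_before_midnight(300_000):.0f} minutes before midnight")
print(f"Earliest hominins (~7 million years ago): "
      f"{minutes_before_midnight(7e6) / 60:.1f} hours before midnight")
```

On these numbers, anatomically modern humans show up only about 35 minutes before the year ends, and even the earliest hominins do not appear until the last day of the year.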

Paleoanthropologists study fossils and genetics to understand the human family tree. “Neanderthal Skulls” is licensed under CC BY-NC 2.0

So what kinds of things do paleoanthropologists look for in the fossil record? Paleoanthropologists of course look for fossils that have features of primates. In addition, they look for morphological traits that stand out as distinctively human against the background of other primates. Bipedalism, a large brain relative to body size, a reduced face, and changes in dentition (teeth) are features that are unique to humans within the primate order, and so paleoanthropologists use these to identify human ancestors, or hominins. Bipedalism is an especially important feature because it is the earliest to show up in the fossil record. All hominins are bipedal. Other human-like features, like large brains, a reduced face, and changes in dentition, appear later in the fossil record.

Australopithecines

Some of the earliest and best understood hominins were the Australopithecines, all of which have so far been discovered in Africa. There are several different species of australopithecines: Australopithecus afarensis, Australopithecus africanus, Australopithecus sediba, and several others. Skeletal morphology (the form of the skeleton) reveals that Australopithecines walked on two legs as their main means of locomotion (moving around). No other primate besides humans is habitually bipedal. Chimps, bonobos, and gorillas can walk on two legs, but they are facultative bipeds, meaning they can walk on two legs only for short periods. More typically, they propel themselves forward with their back legs and support the front part of their bodies with their arms on curled-up hands—knuckle-walking.

“Lucy (Australopithecus)” by 120, CC BY 2.5, https://en.wikipedia.org/wiki/Lucy_(Australopithecus)#/media/File:Reconstruction_of_the_fossil_skeleton_of_%22Lucy%22_the_Australopithecus_afarensis.jpg

The Hips Don’t Lie: Evidence for Bipedalism

The most famous Australopithecine is probably “Lucy”, an Australopithecus afarensis discovered by Donald Johanson in Hadar, Ethiopia, in 1974. Lucy dates to about 3–3.5 million years ago (mya). She is remarkable not so much for her age as for the completeness of her skeleton, with 40% of the bones present. Lucy was about 20 years old and just under 4 feet tall when she died. Her brain was small (chimp sized), but she was clearly bipedal. Much of her postcranial skeleton (from the neck down) was present, and her pelvis, or hip bones, was especially informative. Lucy’s hip structure was more “bowl-shaped”, like a human’s, and her femurs and knee joints angled inward, as in humans, to keep the center of gravity under the body while walking upright. In contrast, chimpanzees have an elongated pelvis, and their femurs (upper leg bones) and knees do not angle in. The structure of the Australopithecine hip is therefore an indicator of bipedalism.

 

Other indications of bipedalism in Australopithecines include the absence of a divergent big toe, such as chimps have. The australopithecine big toe is in line with the others, making the foot well suited to bipedal walking. The foramen magnum, the large opening for the spinal cord at the base of the skull, is oriented downward, indicating that the head sat atop the spinal column. Based on the arm bones, however, Australopithecines may have been partially arboreal.

Note that this foramen magnum is oriented toward the back of the skull because the animal is not bipedal. “Walrus Skull” By Travis is licensed under CC BY-NC 2.0

Another piece of evidence for bipedalism in Australopithecines is the famed set of Laetoli footprints in Tanzania, discovered in 1978. At Laetoli, numerous animal prints have been preserved in volcanic ash. Following a volcanic eruption, hominins (and other creatures) walked on the ash, creating footprints, which were subsequently buried by more ash and preserved. The hominin footprints show no evidence of quadrupedalism or knuckle-walking and were dated to around 3.6 million years ago. In all, 70 hominin footprints created by three individuals were discovered. In 2016, two new trackways at Laetoli were reported (Ichumbaki et al. 2016).

“Cast of Footprints, Laetoli Museum” by Teresa is licensed under CC BY-NC 2.0

The shift to bipedalism is thought to be a response to an episode of drying in Africa in which tropical rainforests gave way to savanna/woodland environments. Several ideas have been proposed about why bipedalism, which involves a radical restructuring of the body, was favored. Bipedalism entails costs, namely stress on the lower spine, especially for pregnant females. It must therefore have conferred some advantage given this drawback. Bipedalism may have helped hominins see further, and it could have helped them carry objects, transport food, and make tools. Bipedalism may also have enabled hominins to cool off better in the more open environment. Finally, bipedalism is more efficient than knuckle-walking for traveling long distances, potentially beneficial in a savanna/woodland environment. As it stands, we know that early hominins were bipedal, but we are uncertain of the conditions under which this trait was selected for.

Homo habilis in Middle Earth

The first hominin that is sometimes classified into our own genus—the genus Homo—evolved in eastern Africa from Australopithecine populations around 2.1–1.5 million years ago and is called Homo habilis. The name Homo habilis roughly means “handy man” in Latin. Compared to australopithecines, habilis has more human-like features, including a larger cranial capacity, a less prognathic (forward-projecting) face, and a more vertical forehead.

Habilis is thought to be one of the first hominins to make and use stone tools—the Oldowan stone tool tradition—though stone tool cut marks are known from a pre-habilis site (Lovett 2010). Captive chimps have been taught by human trainers to smash up long bones and use the splinters to puncture containers full of Kool-Aid. Monkeys produce flakes by smashing rocks, but so far, they haven’t used them to cut with. Making and using sharp edges seems to be within the performance capabilities of apes and even monkeys, but applying cutting-edge technology in the wild is not in their repertoire. This propensity for making stone tools is quite a bit more like modern humans than apes. It seems likely that cutting-edge production in hominins was conditioned or selected for by a distinctive subsistence pattern, namely a strong dependence on hunting and/or scavenging meat. It is thought that Homo habilis scavenged for meat.

Habilis was fully bipedal, although it had not yet developed modern human body proportions; it lacks the longer legs and shorter torso adapted for running and striding that we see in later hominins. There are relatively few postcranial fossils from Homo habilis, and different specimens show a range of variation, making it a controversial species. Many researchers lump them together with Australopithecines, considering them all to be a continuous, highly variable species. One paleoanthropologist has suggested that the world of early Homo was like Middle Earth in The Hobbit, inhabited by many very different-looking bipedal primates.

Homo erectus

By about 2.0 to 1.8 mya, Homo erectus had appeared, representing a continuation of anatomical and behavioral trends already present in Homo habilis—increased encephalization, stone tool use, reduced prognathism and teeth, and of course bipedalism. Erectus had a larger relative brain size than habilis, but still smaller than modern humans. Erectus also had a thick skull and a massive brow ridge that continued, shelf-like, across the forehead.

 
“Homo erectus” by BuluLulu on Sketchfab

The biggest change was the development of its nearly modern postcranial skeleton (that is, body proportions and stature). This is thought to be associated with an adaptation for striding, running, and covering large distances during the day while foraging, probably indicating an increased dependence on hunting animals. Erectus is the first hominin to be found outside of Africa. There is variability among erectus fossils, especially between Asia and Africa, and some researchers designate the African erectus as Homo ergaster.

Turkana boy. Note modern body but primitive-looking head.

Our best evidence for reconstructing Homo erectus body build comes from a fossil known as Turkana Boy (or Nariokotome Boy), recovered from near the shore of Lake Turkana in Kenya and dating to about 1.5 mya (note: some researchers consider Turkana Boy to be Homo ergaster, the African contemporary of Homo erectus). Turkana Boy is a near-complete skeleton of a young boy who, based on tooth eruption patterns, was about 10 or 11 years old at the time of his death. He stood about 5 feet tall when he died and would have grown to about 5’ 6” as an adult. It is not known what caused his death or how he came to be so quickly buried by sediments and thereby so incredibly well preserved. Despite his modern body build and stature, his cranial capacity as an adult would have been about 900 cc, about 440 cc less than the average brain size of modern people.

By comparing tibia (the larger lower leg bone) lengths of adult erectus specimens, paleoanthropologists have estimated the average height of Homo erectus to be between 5’5” and 5’7”. This is close to the average height of modern Europeans in the 18th and 19th centuries. Furthermore, this average height stayed stable for about 2 million years, through the appearance of later species. Anatomically modern humans, that is, humans identical in morphology to us, who appeared about 200,000–300,000 years ago in Africa, were slightly taller, averaging about 5’10”, which is taller than the average height of most present-day people. The take-home message here is that our modern body proportions and stature evolved early—much earlier than our modern brain size—and have stayed fairly constant ever since. This suggests that this body build was a fundamental aspect of our adaptation to the environment.
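
Estimates like these come from regression formulas that relate long-bone length to standing height. The following is a minimal sketch of that logic, assuming a simple linear formula in the style of forensic stature equations; the slope, intercept, and tibia length here are illustrative placeholders, not published values for Homo erectus.

```python
# Hypothetical stature estimation from tibia length using a linear formula.
# The slope, intercept, and example tibia length are illustrative placeholders.

CM_PER_INCH = 2.54


def stature_from_tibia(tibia_cm: float,
                       slope: float = 2.5,       # assumed cm of stature per cm of tibia
                       intercept: float = 78.0   # assumed baseline stature in cm
                       ) -> float:
    """Estimate standing height (cm) from tibia length (cm)."""
    return slope * tibia_cm + intercept


def cm_to_feet_inches(cm: float) -> str:
    """Convert centimeters into a feet-and-inches string."""
    feet, inches = divmod(round(cm / CM_PER_INCH), 12)
    return f"{feet}'{inches}\""


# Example: a hypothetical 37 cm fossil tibia.
print(cm_to_feet_inches(stature_from_tibia(37.0)))  # about 5'7" under these assumptions
```

Real formulas differ by population and by which bone is measured, which is one reason published height estimates for fossil species are given as ranges rather than single numbers.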

Homo erectus emerged during a geological time period marked by a series of ice ages and radical fluctuations in the Earth’s climate. This time period, called the Pleistocene, began around 2.5 million years ago and ended around 12,000 years ago. The Pleistocene consisted of a series of glacial periods, periods of long-term reduction in the temperature of the Earth’s surface and atmosphere, resulting in the presence or expansion of continental and polar ice sheets and alpine glaciers, interrupted periodically by warmer interglacials. This cold-warm pattern alternated on a fairly regular cycle about 100,000 years long (see the Milankovitch cycles). The world’s configuration of continents was the same as today, but some land masses were exposed because much of the world’s water was bound up in enormous ice sheets, more than 2 miles thick in places. Ocean levels at one point were 330 feet lower than present levels.

Handaxes

The appearance of Homo erectus coincides with a new, more sophisticated stone tool type called the Acheulean handaxe. These tools could be used for a variety of purposes like digging or cutting. In addition, they could be used to produce sharp flakes of stone that were tools in and of themselves. Acheulean handaxes were widespread and took a sophisticated brain to produce. Kanzi, the bonobo, can produce a simple flake to cut a rope, but no ape in captivity or in the wild has made anything close to an Acheulean handaxe.

Acheulean handaxes were made by Homo erectus and Archaic Humans. “Acheulian Handaxe and iPhone” by Alex Pang is licensed under CC BY-NC-SA 2.0.
“The Swanscombe Hand-axe Sculpture” by mira 66 is licensed under CC BY 2.0

Unlike previous species, fossils and archaeological sites of Homo erectus are found not only in Africa, but also in Europe and Asia. Some erectus fossils have been found on islands in southeast Asia, like Java, which means these hominins could make rafts or boats, perhaps out of the bamboo that grows throughout the region, to cross significant expanses of open sea. They were also able to survive and prevail in temperate climates with extended freezing winter weather in places like England, northern Europe, and northern China, which suggests that they could make and control fire for warmth.

A site at Dmanisi, in the Republic of Georgia, suggests the picture of Homo erectus may be more complex than previously thought. Georgia is a small country between Russia and Turkey on the east coast of the Black Sea. Here, in Pleistocene fossil deposits, five well-preserved Homo erectus crania were discovered. The Homo erectus skulls at Dmanisi are relatively small in size. One of those skulls (Skull 5) looks very primitive, meaning more like earlier fossils, with a very small braincase (546 cc) and prominent canines, more like Homo habilis than erectus. What’s more, the associated stone tools are of the earlier Oldowan industry, not the more developed Acheulean industry typical of Homo erectus. The tools at Dmanisi were associated with butchered bones of large grazing animals typical of a savanna, or grassland, environment. The site dates to as old as 1.8 mya. The discoveries at Dmanisi mean that the genus Homo ventured out of Africa much earlier than had previously been thought, and at a much more primitive level of development with respect to body build, cranial capacity, and tool kit.

The variation at Dmanisi is surprising. Had the fossil skulls been found at different sites, they might have been designated as different species (SciNews 2013). This points to an ongoing problem in assigning fossils to taxonomic categories: it is hard to know whether a set of fossils represents several species or variation within a single species.

Skull 5 at Dmanisi.
The site of Dmanisi is both a medieval site and a Lower Paleolithic site.

Archaic Humans

Around 500,000 years ago, larger-brained hominins evolved from Homo erectus populations in Africa and subsequently spread into Europe and Asia. These hominins are collectively called Archaic Humans and represent several species: Homo heidelbergensis, the Neanderthals, and the Denisovans. Archaic humans are identified by large brains, reduced robustness, reduced prognathism (jutting out of the face), and reduced postorbital (behind the eyes) constriction. Archaic humans continued using the Acheulean handaxe and also developed their own technological innovations. Tool finds include hafted flaked-stone points on spears (from South Africa) and finely crafted wooden spears found preserved in a coal deposit in Germany—both from about 350,000–400,000 years ago. There is direct evidence that Archaic Humans had control of fire and used prepared hearths for warmth and cooking.

Perhaps the most well-known Archaic Human is the Neanderthal. Neanderthals are often classified as Homo sapiens neanderthalensis, a subspecies of humans. The first Neanderthal was found in a quarry at Feldhofer Cave in the Neander Valley of Germany. Neanderthals were more robust than modern humans, shorter and stockier, likely an adaptation to living in Pleistocene conditions. Neanderthals are known mainly from Europe and the Middle East from around 300,000 years ago to around 40,000 years ago. Neanderthal colonization of Europe was facilitated by the control of fire. Compared to other hominins, we know a lot about Neanderthals, and we have even sequenced their complete genome, the complete set of an organism’s DNA (Prufer et al. 2014). Another variety of Archaic Human from Siberia, the Denisovans, is known only from a finger bone and two teeth and was identified through analysis of DNA.

Neanderthals had large brains, somewhat larger than those of modern humans. The morphology of the Neanderthal skull is, however, distinct. Instead of the high, domed skull of modern humans, Neanderthals had long, low braincases and thick brow ridges. Neanderthal faces were more prognathic (projecting forward), and their faces and teeth were larger than those of modern humans. Early Neanderthals used Acheulean handaxes, but later developed more sophisticated stone tools and techniques. They also very clearly controlled fire, as evidenced by the remains of hearths.

An early Neanderthal site is Sima de los Huesos, the “Pit of the Bones”, located in northern Spain. This site is important because it holds the largest collection of fossil hominins in the world, with 28 early Neanderthals represented (Calloway 2016). At 400,000 years old, it is also the oldest fossil site for which we have hominin DNA (Meyer et al. 2016). The site is literally a pit inside a cave, and excavators must rappel down to access it. Sima de los Huesos is also important because some argue that the deposition of the people in the pit was intentional and perhaps even ritual in nature, a definite sign of modern human behavior. A single Acheulean handaxe, dubbed “Excalibur”, was also found among the skeletal elements. As discussed in Chapter 2, ritual behavior is not common in any species besides humans. Paleoanthropologists are interested in looking for traits that are quintessentially human (like ritual, large-scale cooperation, language, and other symbol systems) in order to investigate when humans became ourselves, behaviorally speaking. Others argue that the skeletons at Sima de los Huesos were washed in naturally. The question of natural versus intentional deposition is a common thread in paleoanthropology and continues to be debated.

A 2015 find is even more mysterious than Sima de los Huesos—Rising Star Cave in South Africa (Berger et al. 2015). Paleoanthropologist Lee Berger put out a call through social media for small-bodied paleoanthropologists for a hominin excavation. The six women who signed on had to pass through a 7-inch-wide opening to access the finds. The research resulted in the recovery of more than 1,000 hominin fossils from at least 12 individuals—the largest collection of a single species of hominin in Africa. The skeletal elements are human-like in many respects and more primitive in others. Most notably, Homo naledi, as the species has been called, had a very small braincase. The braincase and postcranial skeleton are more akin to Homo erectus and even earlier fossils.

In 2017, another chamber was discovered with three more Homo naledi individuals (Hawks et al. 2017). Because the chambers are so difficult to access even today with modern equipment and lighting, the researchers suggested that the deposition might be ritual in nature. Given the tiny brain of Homo naledi, others are skeptical. At first, the date of the site was unknown, but in May 2017 the researchers published surprisingly late dates of between 200,000 and 300,000 years ago, based on multiple dating techniques (Dirks et al. 2017).

The other significant aspect of the Homo naledi find was how it was brought to the public. Hominin finds are so rare that often the public doesn’t hear about them until years later, and researchers frequently do not share the fossils with others until their own analysis is completed. With the Homo naledi find, the excavation was “live-tweeted” on Twitter and the analysis published in an open-access journal called eLife. In addition, 3D models of Homo naledi are available for free download. Anthropologist Kristina Killgrove explains how unusual this is: “In the past, fellow researchers and teachers would have to wait multiple years—and pay hundreds of dollars—to get a cast of the new fossil” (Killgrove 2015). Berger and his team ushered in a new era of accessibility by making the Homo naledi find and all of its data accessible to researchers and the public.

Homo naledi is an important recent find. “Fossil discovery announcement, 10 Sept 2015” by GovernmentZA is licensed under CC BY-ND 2.0

We know, for instance, that Neanderthals buried their dead. At Shanidar Cave in the Zagros Mountains of Iraq, there are 10 Neanderthal burials. While there are no grave goods accompanying the dead, the skeletons are fully articulated, or put together. If the dead had not been buried, we would expect scavengers to carry away skeletal elements, such that only fragments would be left behind. One recent burial at Shanidar is of a woman resting her head on her hand, positioned as if asleep. In other cases, like La Ferrassie in France, the bodies of Neanderthals are oriented east to west and head-to-head, suggesting intentional burial. Early pollen samples from Shanidar Cave suggested that flowers might have been laid in the graves, but more recent research indicates the pollen was likely brought in by bees. Importantly, the flowers represented in the pollen samples bloom at different times of the year. Neanderthal burial is suggestive of behavior that is like that of modern humans, and different from what we see in chimpanzees.

Humans regularly care for others who are injured, sick, or elderly. Sometimes we even support others we don’t know through charities like GoFundMe. Another intensely social species does this as well: ants. Matabele ants (Megaponera analis) will drag their injured back to the nest, where their wounds are licked by “nurse ants.” Ants who are treated in this manner have a 90 percent survival rate (Fox-Skelly 2018). There is evidence that Neanderthals did something similar. Shanidar Cave and other Neanderthal sites have skeletons of older and infirm people—some with evidence of trauma, blindness, and toothlessness. Based on the healing of bone injuries, it is thought that Neanderthals cared for others who could not easily have cared for themselves. Care for the injured and infirm suggests human-like compassion. That said, in some human forager groups, older people are abandoned or even killed outright. In other traditional societies they are highly valued, and having grandparents is linked to higher survival rates for children. Jared Diamond describes Papua New Guineans who chew up food so that older, toothless elders are able to eat (Diamond 2013). In contrast, some have written about how elderly people were abandoned in the wake of COVID-19, much like some traditional societies abandon their elderly. At Standing Rock in North Dakota, the loss of elders to COVID has been likened to a “cultural book burning.” As one tribal member put it, “It takes your breath away. The amount of knowledge they held, and connection to our past” (Healy 2021).

There is another side of the coin, however. At the site of Krapina in Croatia (and other sites), there are Neanderthal bones that bear clear signs of stone tool cut marks—what look like butchery marks. We of course don’t know the exact circumstances behind these finds, but the evidence points to Neanderthals cannibalizing other Neanderthals. Humans, too, have practiced cannibalism with some regularity, often for ritual purposes. For the Yanomami of Brazil, the soul can only rest after death if the body is burned and then consumed (Ukiwe 2018).

 

 

Genetic analysis indicates Neanderthals did not evolve into anatomically modern humans (AMHs). Some earlier hominin, perhaps Homo heidelbergensis, evolved into modern humans in Africa. From there, modern humans spread out of Africa and lived at the same time as Neanderthals. Genetic analysis indicates that some modern people carry distinct Neanderthal SNPs (single nucleotide polymorphisms, or single-letter variants in the DNA). This likely means that early modern humans interbred with Neanderthals, and some modern people still carry this Neanderthal legacy. If you have non-sub-Saharan ancestry, then you likely have some Neanderthal SNPs too. I know I do.

 

Fur, Fire, Sweat

We compared and contrasted humans with other animals in a previous chapter. Questions that paleoanthropologists ask include “How old are these traits?” and “How did they come about?” For instance, paleoanthropologists are interested in when complex tool-making and hunting began, when and why big brains arose, when hominins began to use fire, and when we lost our body hair.

Hunted Foods as an Important Resource

Hunter-gatherers known from the recent historical record clearly depend heavily on hunted foods—meat—for sustenance. Is this pattern a defining feature of hominins from the very beginning, or did it develop relatively recently, for example with the appearance of anatomically modern humans? This has been the subject of heated debate among archaeologists and paleoanthropologists. One reason the debate is so important is that hunting is seen as a key factor that makes us human.

No other primate relies significantly on hunting for subsistence. Chimpanzees, the apes most closely related to humans, hunt, kill, and eat small animals—red colobus monkeys are their preferred prey—but meat makes up at most 3% of their diet in terms of caloric intake. What chimps eat the most is ripe fruit—in some parts of their range, over 50% of chimps’ caloric intake comes from ripe figs. In contrast, among modern hunter-gatherers, hunted foods make up between 40% and 60% of the diet on average, depending on the environment. In Arctic environments, meat and fish can make up nearly 100% of the total diet—there are literally no plant foods to collect and eat during most of the year in the Arctic.

Based on stone tool cut marks on animal bones, Homo habilis appears to have been a scavenger of meat some 2 million years ago. The earliest evidence for direct hunting, as opposed to scavenging, comes from the site of Schöningen in Germany (Kouwenhoven 1997). Here there are 400,000-year-old remains of fire-hardened wooden spears alongside butchered horses. So we know that the pattern of heavy reliance on hunting has deep roots going back hundreds of thousands of years. The spears represent not only sophisticated projectile technology, but also very likely cooperation and coordination among a group of hominins for a common purpose.

In addition, hunting had effects on hominin biology, especially the brain. Meat, especially cooked meat, could have helped to fund the energy-expensive human brain. Big brains come with a big price tag: they require a surprising amount of energy, up to 20% of human energy intake, and more for developing infants. The fat in meat provides a concentrated source of calories, which can be used to fund the cost of brain growth. Gorillas, which eat an almost entirely plant-based diet, cannot even get enough calories from their food to grow a bigger brain. As anthropologist Greg Downey puts it, there just aren’t enough hours in the day for a gorilla to grow a bigger brain. Evidence for scavenging and hunting in the paleoanthropological record is of interest to paleoanthropologists because it points toward the human pattern of increased energy input to offset the cost of brain growth. Another important source of energy for early hominins was fire.

Fire and Cooking Made Us Human

Today, we humans are totally reliant on fire, and every culture cooks food. Fire is basically a chemical reaction between carbon (like charcoal) or carbon-containing materials (like wood or grass) and oxygen; the reaction releases heat and light that humans use in virtually every life-sustaining activity we engage in. Fire also releases carbon dioxide, a greenhouse gas. It can be thought of as the first step in our ability to alter the environment on a global scale. The early discovery of the control and use of fire is probably at the top of anyone’s list of crucial developments that made us what we are. Fire was a new and powerful means of harnessing energy from the environment, and can be considered a kind of tool. Anthropologist James C. Scott puts it this way in his book Against the Grain: in the lower Homo erectus levels, there is evidence that big cats were preying on hominins; in the upper levels, where there is evidence for fire, the pattern is reversed. Fire, he argues, enabled hominins to become predator rather than prey. Scott thinks of fire as having been domesticated, in the sense of being tamed.

Although the advantages of fire use seem obvious to us now, the question of what fire actually did for its earliest users—what its specific adaptive advantages were—is an interesting one. At first pass, most people would point to warmth and protection from predators as likely benefits of fire for early humans, and there is little doubt that this was the case. Fire also would have allowed hunter-gatherers to intensify, that is, get more food out of an area of land, by hunting with fire and increasing the productivity of the landscape through burning. Fire might also have allowed humans to colonize new areas by making them more productive, as well as providing safety from new potential predators.

In his book Catching Fire (2009), Harvard anthropologist Richard Wrangham argues that the use of fire for cooking food in order to make it more digestible was a turning point in the development of the modern human brain, physique, and way of life. Fire, in a sense, is a way of “outsourcing” your teeth and gut, by doing the work of digestion literally outside your body. The chimpanzee gut, by comparison, is about three times the size of a human’s, and the chimp has to expend more energy digesting food. In effect, cooking allowed hominins to eat far less food and to eat a wider range of foods. Wrangham points out that cooking increases the proportion of nutrients digested and reduces the energetic costs associated with digestion. This extra energy could be put toward a new project—building a larger brain. Other primates, like gorillas, simply can’t get enough energy from their environment to channel toward building a larger brain. There are not enough hours in the day to eat enough leaves to support the body and have some left over for funding the brain. Fire and cooking, Wrangham argues, allowed hominins to break through that energy barrier and fund the high cost of an expanding brain. Wrangham further points out that humans do not thrive on raw food diets, and that non-essential functions like reproduction shut down on strict raw food diets.

Hominins had fire before fully modern humans came on the scene. The earliest evidence for the use of fire by hominins appears to be at Koobi Fora, a site in northern Kenya associated with Homo ergaster/erectus. Here, baked, reddened earth and charcoal particles are found next to stone tools lying on a buried surface dating to about 1.6 million years ago. Burnt bones of savanna grazing animals are found in association with Oldowan tools and fossils of Homo ergaster/erectus at a site in South Africa called Swartkrans Cave. Both of these contexts are considered fairly compelling evidence for the use of fire by Homo erectus, although there is still some debate over whether the fires could have been natural. It can be difficult to distinguish intentional fires from natural ones, especially early in the record (King 2017). It is also hard to know whether people produced the fire themselves or acquired it from a natural source. Wrangham is cautious about the early dates for fire, although his hypothesis about the importance of cooking in human evolution would seem to require that the control of fire appeared at about the same time as hunting.

Aside from providing warmth, protection, a tool for getting food, and a way to increase energy intake, fire likely brought people together. Hunter-gatherers often congregate around fires to eat together and tell stories. And often fire has a spiritual aspect. Among the Eveny reindeer herders of Siberia, it is common practice to offer vodka to the fire when entering a new camp. This practice is called “feeding the fire”. As an older Eveny woman put it, “The fire is the foundation of life, we feed, we warm ourselves, we’re nourished with its help” (Vitebsky 2005:86).

Sweating It Out

Sophisticated tools, big brains, fire, and bipedalism are all fairly obvious features of humans that can be traced back in the paleoanthropological record. Somewhat less obvious are two other features of humans that stand out among primates: our “furlessness” and our “sweatiness”. Humans are the sweatiest of all the primates, which enables us to stay cool in very hot environments. Our sweat is also more watery, a better cooling system than that of other mammals. Most mammals cool themselves not by sweating but by panting, losing heat through their wet tongues and lungs. With our dense, watery sweat glands, humans can cool off more effectively and have greater endurance in midday heat than other mammals. This is especially important because of our large brains. As skin cancer specialist Sharad Paul (2016) points out, just like computers, “larger brains need larger cooling systems.” With enough persistence, a human hunter can literally run a grazing animal like a deer or antelope to death in midday heat, thanks to our sweatiness. But of course, most foragers rely on subtler methods, like poisoned darts.

According to anthropologist Nina Jablonski (2010), our dense, watery sweat glands were an essential adaptation to the more open savanna/woodland environment in Africa that Homo erectus occupied. Like modern humans, Homo erectus would have avoided confrontations with predators by hunting and scavenging during the heat of the day, when lions and leopards are asleep under the trees. In order to cool off effectively, an increased ability to “sweat it out” was selected for.

The selection for sweat glands came with another related adaptation: our lack of fur. A thick coat of body hair or fur reduces the cooling effect of sweating, and thus, to be effective, sweat glands do better on naked skin. Jablonski argues that furlessness is a response to the need for sweat glands. Human babies do have a covering of fine hair called lanugo in utero, but they lose it shortly before birth, and we are born naked and furless.

The final consequence of our watery sweat and furlessness is skin color. Chimpanzees, beneath their fur, have light skin, not needing further protection from the sun’s damaging UV rays. But if you are furless, protection is needed. Jablonski argues that the loss of fur precipitated another adaptation—the selection of darker skin to protect against damage to the skin by ultraviolet light and the loss of folate, which can lead to birth defects. In effect, the shift to a more open environment created a cascade of consequences: sweatiness, furlessness, and increased pigmentation.

Anatomically Modern Humans

Anatomically Modern Humans (AMHs) are morphologically distinct from earlier Archaic Homo sapiens and Homo erectus forms. AMHs’ average cranial capacity is slightly smaller than that of Neanderthals, the brow ridge is reduced, and the face is smaller. AMHs have a more gracile (less robust) cranial and postcranial skeleton. The modern human skull has a high, vertical forehead, and the face is situated below the frontal braincase (less prognathic). Anatomically modern humans likely arose from some form of Archaic Human living in Africa, though we don’t know which one.

Anatomically modern humans (that is, people who look like us) likely arose in Africa between 200,000 and 300,000 years ago. This is indicated by both fossil and genetic evidence. First, some of the earliest AMH fossils are found in Africa at the sites of Omo and Herto, and, more recently, at the 300,000-year-old site of Jebel Irhoud in Morocco. Second, genetic anthropologists have looked at DNA from people around the world in order to investigate where our early ancestors originated. But the geneticists did not focus on nuclear DNA, the DNA we usually talk about, which we inherit from our mother and father. Rather, they looked at mitochondrial DNA. Mitochondrial DNA (mtDNA) resides in the mitochondria of cells. Mitochondria help cells to use oxygen and produce energy. Cells that require a lot of energy, like muscle cells, have many mitochondria. Because there are multiple mitochondria in each cell, there tend to be many more copies of mtDNA than nuclear DNA, making it ideal for studying ancient genetic variation.

Unlike nuclear DNA with its 3 billion nucleotide bases, mitochondrial DNA has only about 16,500 bases, so it is much smaller. Like bacterial DNA, the mitochondrial genome is a circular chromosome. Also, in contrast to nuclear DNA, which is derived from both the mother and father, mitochondrial DNA is inherited only through the mother. So you, whether male, female, or non-binary, have your mother’s mitochondrial DNA. Another difference is that mitochondrial DNA is not as good as nuclear DNA at checking for mistakes when cells divide. For this reason, mitochondrial DNA has about 20 times the mutation rate of nuclear DNA.

Because of these frequent mutations and this simple pattern of inheritance, we can use mitochondrial DNA to tease apart ancestry. If living people have identical mutations, then it is very likely they are related through their mothers. By the same logic, mitochondrial DNA can be used to connect living people to people from the distant past. Scientists also use what is called a molecular clock (Alex and Moorjani 2017). First, the number of mutations that arise in a single generation is calculated. Then, if you compare two people, you can estimate the time elapsed since they shared a common ancestor by counting the differences between their sequences.
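
The arithmetic behind a molecular clock can be illustrated with a short Python sketch. The mutation rate below is an assumed, illustrative figure, and real analyses correct for many complications this toy version ignores.

```python
# Toy molecular-clock calculation: convert a count of mtDNA differences
# between two people into an estimated time since their shared maternal
# ancestor. The mutation rate is an assumed, illustrative value.

MT_GENOME_LENGTH = 16_500   # approximate length of the mitochondrial genome, in bases
MUTATION_RATE = 2e-8        # assumed mutations per site per year (illustrative only)


def years_to_common_ancestor(n_differences: int) -> float:
    """Mutations accumulate on both lineages after they split, hence the factor of 2."""
    return n_differences / (2 * MUTATION_RATE * MT_GENOME_LENGTH)


# Suppose two people's mtDNA sequences differ at 20 positions:
print(round(years_to_common_ancestor(20)), "years since a shared maternal ancestor")
# -> roughly 30,000 years under these assumed numbers
```

The more differences two sequences have accumulated, the longer ago their shared maternal ancestor lived, which is exactly the logic used to trace maternal lineages back in time.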

 

Syrian Hamsters to the Rescue!

One early concern in linking modern people to past people through mtDNA was that the mutation rate of mtDNA might be too fast, and therefore too messy, to be of any use. Surprisingly, it was shown through domesticated Syrian hamsters that the mutation rate was not too fast. It turns out that all domesticated Syrian hamsters are descended from a single female hamster collected from Aleppo in the 1930s. If the mutation rate were very fast, then it would be expected that the mtDNA of domesticated hamsters would vary a lot, and it would not be very useful for sorting out ancestry. After analyzing the mitochondrial DNA from feces of widely distributed domesticated Syrian hamsters (yes, they collected hamster poo through the mail), researchers found the DNA was identical! Since Syrian hamsters have been domesticated only since 1938, the researchers knew the mutation rate wasn’t too fast and could potentially be used to determine ancestry.

Domesticated Syrian hamsters are descended from a single maternal ancestor as demonstrated by the mtDNA. “Leonard” by Sweet-Rainb0w is licensed under CC BY-NC-ND 2.0

A Revolution in Ancient Genetics

One of the first applications of the idea of connecting modern people with people living in the past comes from the Romanovs, the Imperial Russian family. The Romanovs—the tsar, tsarina, and their five children—were murdered on July 16, 1918 by the Bolsheviks. Following the Revolution, the location of the remains of the Romanovs was a mystery. In 1991, in the Russian Ural Mountains, skeletons fitting the description of the Romanovs came to light. But was it them? Researchers tracked down living maternal-line relatives of the tsar and tsarina and compared the SNPs of their mtDNA to the tsar’s and tsarina’s SNPs. Both were matches. The fact that both the tsar’s and tsarina’s mtDNA matched their modern relatives’ mtDNA, along with the other material evidence, meant this was a closed case; the mystery of the Romanovs had been solved with the help of mtDNA. (Incidentally, Tsarina Alexandra was a grandchild of Queen Victoria, who passed on a gene variant for haemophilia to the tsarina and her son Alexei.) Mitochondrial DNA also helped solve the case of Richard III, who was buried in an unknown location. Richard had scoliosis, so when a body with scoliosis was found beneath a parking lot in Leicester, England in 2012, in a plausible location for Richard’s burial, the body’s mitochondrial DNA was compared to that of a known living relative of Richard’s maternal line. They were a perfect match (King et al. 2014).

The bodies of the Romanov family were identified through mitochondrial DNA. Public domain.

Out of Africa

Analysis of mtDNA can be used to investigate broader patterns of human migration. The region with the greatest mtDNA variation is likely the place where humans originated, based on the idea of increased time for new SNPs to develop in the mother population. In other regions, we should see SNPs that are derived from the original mother population, based on the Founder Effect. When mtDNA was analyzed from people around the world, the region that had the most variation was Africa. Biologist Lewis Spurgin (2013) explains why this is important:

“Travelling across unknown lands is a dangerous business, so we would expect ancient humans to avoid travelling a long way except when they needed to, and they probably didn’t travel en masse. Instead, when humans colonised new lands, it was probably a few intrepid explorers looking for new pastures. And the DNA evidence bears this out. Across the world, we see ‘bottlenecks’ at the genetic level — signatures where a small group of individuals, carrying a relatively small number of genetic variants, have set up new colonies. This is the key to understanding why the most genetically diverse human population can be found in Africa, while the populations of further migrations are descended from a much smaller stock of brave (or desperate) migrants.”
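
The bottleneck pattern Spurgin describes can be mimicked with a toy simulation: each new colony is founded by a handful of migrants drawn from the previous population and then grows back in numbers, so every step of the migration carries forward only a subset of the original variation. The population sizes below are purely illustrative, not estimates of real prehistoric populations.

```python
# Toy founder-effect simulation: each colonization event samples a few
# founders from the parent population, so mtDNA variation drops step by step.
# All numbers are illustrative, not estimates of real population sizes.

import random

random.seed(1)


def found_and_grow(parent_pop: list[str],
                   n_founders: int = 10,
                   colony_size: int = 100) -> list[str]:
    """A few founders leave, then the colony grows from their variants only."""
    founders = random.sample(parent_pop, n_founders)
    return [random.choice(founders) for _ in range(colony_size)]


# Source population of 100 individuals, each carrying a distinct mtDNA variant.
population = [f"variant_{i}" for i in range(100)]
print("source population:", len(set(population)), "variants")

for step in range(1, 5):  # four successive colonizations
    population = found_and_grow(population)
    print(f"after migration {step}:", len(set(population)), "variants")
```

Each successive colony ends up with fewer distinct variants than its source, which is why populations farther along the migration route show less mtDNA diversity than African populations do.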

Both the fossil evidence and the mtDNA evidence point to Africa as the origin of modern humans. From there, humans spread to other parts of the world, encountering Neanderthals, Denisovans, and likely other Archaic Humans. It is not known why no Archaic Humans are left to tell their side of the story. Did modern humans wage war with them? Did they out-compete them? We currently don’t know why we are the only surviving representatives of the genus Homo.
