The evolution of the human diet is an important research topic within physical anthropology and nutritional anthropology. It involves evidence drawn from human biology, nutritional science, the paleoanthropological analysis of hominin fossil remains, and comparative studies in primatology. Key issues that have been investigated to date include the functional relationship of dentition and craniofacial anatomy to diet, behavioral adaptations to diet (such as the use of tools and fire), the metabolic consequences of increased encephalization, and the relative evolutionary importance of meat-eating. Ancient hominin diets are inferred through a wide range of techniques, such as biomechanics, dental microwear analysis, stable isotope analysis, and paleoenvironmental reconstruction. Historically, much of the research has involved comparing the diets of the australopiths Paranthropus robustus and Australopithecus africanus.
The human diet differs from that of other living primates in several important ways. First, humans are highly omnivorous, exploiting a wide range of plant, animal, and fungal foods (although they do not tend to consume plants high in cellulose, unlike some primates). Second, the human diet is comparatively high-quality, or dense in energy and nutrients. Finally, there is not just one human diet, but instead a very wide range of diets situated within a highly diverse set of environments. Although all humans share the same broad dietary requirements for calories, macronutrients, and micronutrients, various populations have discovered or invented significantly different strategies for meeting those requirements.
This creates a conceptual problem for the evolution of the human diet: if modern humans, all of whom are morphologically and physiologically very similar, consume such a wide range of foods, then how is it possible to specify the particular diet of earlier humans? It is certainly unlikely that any extinct hominin species consumed only a very circumscribed set of specific food items. However, it is still true that the combination of a wide variety of analytical techniques, such as those outlined below, can reveal the most important mechanical, nutritional, and ecological characteristics of hominin diets to a high degree of confidence.
The reconstruction of ancient hominin environments provides a valuable foundation for inferring hominin diets, since diets are necessarily shaped and constrained by environmental factors. For example, if an environment is characterized by dry, open grasslands, then it is unlikely that the diet would have contained a large proportion of fruit.
An analysis of extant primate diets can provide clues to the evolution of the human diet, since humans are themselves primates. Additionally, the diet of modern humans living in small-scale foraging societies may be similar to that of the earliest humans. Although such comparisons do not provide direct evidence of hominin diets, they do suggest potentially useful research strategies.
Much of the comparative research has been conducted by nutritional scientists, rather than by physical anthropologists or paleoanthropologists. Many of these studies begin from a standpoint of nutritional epidemiology, and seek to contrast modern industrial human diets with those of earlier societies, or contemporary societies that continue to consume traditional diets. Typically, they emphasize the nutritional benefits of a diet characterized by plentiful fiber, large proportions of fruits and vegetables, and significant quantities of animal foods, but with low levels of fat (particularly saturated and omega-6 fatty acids). Such a dietary composition is thought to be typical for most traditional hunter-gatherer societies, and indeed for all preagricultural societies. From an evolutionary standpoint, the most salient findings of these studies involve the importance of wild animal foods in the human diet, in contrast to the diets of other extant primates.[1]
The physical qualities of the masticatory apparatus - teeth, jaws, and related cranial features - can be used to infer the mechanical properties of the diet. However, most morphological evidence is indicative of what an organism could eat, rather than what it necessarily did eat. Also, in evolutionary terms an organism's morphology is strictly adapted to its ancestors' environment, rather than its own.
Another aspect of morphology directly related to diet is the gastrointestinal tract (GIT). Digestive systems designed to efficiently digest meat, for example, look quite different from ones designed to process large volumes of fibrous plant matter. Intriguingly, modern human GITs appear quite carnivore-like in comparison to extant great apes, a fact which may be related to issues of bioenergetics and brain metabolism. Unfortunately, it is difficult to trace the morphological evolution of the human GIT, because soft tissues do not readily fossilize. However, it may be possible to infer some properties of hominin GITs from other aspects of thoracic morphology (such as the size and shape of the ribcage).
The size and shape of an animal's mandible, in addition to other aspects of cranial morphology related to mastication, provide a ready indication of the animal's diet. Simply put, large, sturdy jaws attached to an equally robust cranium indicate the ability to exert powerful bite forces, which in turn implies a diet of quite hard or tough foods. Less robust craniofacial anatomy, on the other hand, implies a softer diet.
This relatively straightforward inference was the basis of the earliest reconstructions of australopith diets. The "robust" species (such as P. robustus), with their large masticatory complexes, were interpreted as herbivorous hard- and tough-food specialists. On the other hand, the "gracile" species (such as A. africanus), with their relatively smaller masticatory complexes, were interpreted as omnivorous soft-food generalists.[2]
A number of more in-depth studies have employed techniques drawn from physics and engineering to analyze the stress loads and force tolerances of primate and hominin mandibles. Such studies require complex calculations of leverage, torsion, bone mass distribution, and other factors to determine the functional capacity of various jaw morphologies.
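The leverage component of such calculations can be illustrated with a simple third-class lever model of the jaw, in which the mandible rotates about the jaw joint and muscle torque balances bite torque. This is a minimal sketch of the general principle, not any published model; all forces and distances are hypothetical.

```python
def bite_force(muscle_force, muscle_moment_arm, bite_point_distance):
    """Estimate the bite force at a given point along the jaw, modeling the
    mandible as a simple lever rotating about the jaw joint.

    Torque balance about the joint:
        muscle_force * muscle_moment_arm = bite_force * bite_point_distance
    """
    return muscle_force * muscle_moment_arm / bite_point_distance

# Hypothetical values: 400 N of combined jaw-muscle force acting 40 mm
# from the joint, with the bite point at the molars (60 mm from the
# joint) versus the incisors (100 mm from the joint).
molar_force = bite_force(400.0, 0.040, 0.060)    # ~267 N
incisor_force = bite_force(400.0, 0.040, 0.100)  # 160 N
```

Because mechanical advantage is greater nearer the joint, the same muscle force produces a larger bite force at the molars than at the incisors, which is one reason posterior tooth position and jaw robusticity figure so prominently in dietary reconstructions.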
Biomechanical analyses of craniofacial morphology have produced several results contrary to early morphological arguments. For example, it appears that P. robustus and A. africanus both possessed jaws with greater bending indices (the ability to accommodate bending forces) than extant apes and humans, yet the two australopiths were indistinguishable from one another in this respect. In other words, there is currently no evidence that the jaws of P. robustus were regularly subjected to different biomechanical stresses than those of A. africanus. On the other hand, other aspects of cranial morphology relating to muscle attachments, as well as dental proportions and patterns of dental microwear, do indeed distinguish the diets of these two taxa. This suggests that mandibular morphology provides only limited insight into diet, even if there is indeed a functional linkage between them.[3]
The size, shape, and configuration of teeth are strongly indicative of diet. Teeth designed to fracture hard, brittle foods, for example, look quite different from those designed for shearing tough, elastic foods. Furthermore, food items often cause microscopic pits and scratches on tooth enamel, and the structure of these features is determined not only by the physical properties of the food, but also by the mechanics of mastication and external processing.
The most straightforward way of assessing diet through dental morphology is to simply compare the relative sizes of the anterior and posterior teeth. In extant primates, species with relatively small incisors tend to specialize on leaves or small fruits, while species with relatively large incisors commonly eat larger fruits. Also, relatively large molars indicate a tougher, lower-quality diet (i.e. leaves), while relatively small molars reflect a softer, higher-quality diet (i.e. fruits). In light of these dental patterns, the aforementioned early investigations of australopith diets again concluded that P. robustus was specialized for tough, low-quality plant foods, while A. africanus consumed a more balanced, high-quality omnivorous diet.
Another way of quantifying dental functional morphology is to compare the shearing quotients (SQs) of primate molars associated with different diets. Shearing quotients are a measure of mesiodistal crest length compared with occlusal surface length. A high SQ indicates a more "pointed" or "jagged" dental topography, while a lower SQ indicates a "flatter" tooth. Comparative primate studies show that folivores and insectivores exhibit higher SQs than do frugivores, and that within frugivores, soft-fruit feeders have higher SQs than do hard-object feeders. SQ analyses have indicated that P. robustus and A. africanus both had relatively flat, blunt molars compared with extant frugivores, although A. africanus had the higher SQ value of the two. This suggests that neither australopith could have had a very tough diet, but that A. africanus possessed greater shearing ability.[4]
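One common formulation expresses the observed summed crest length as a percentage deviation from the crest length predicted, for a tooth of that size, by a regression fitted to a comparative sample. The sketch below assumes that formulation; the regression coefficients and measurements are purely hypothetical.

```python
import math

def shearing_quotient(crest_length, tooth_length, slope, intercept):
    """Shearing quotient as the percentage deviation of observed summed
    crest length from the length predicted by a log-log regression on
    mesiodistal tooth length:
        ln(expected) = slope * ln(tooth_length) + intercept
    """
    expected = math.exp(slope * math.log(tooth_length) + intercept)
    return 100.0 * (crest_length - expected) / expected

# Hypothetical regression (e.g. fitted to an extant comparative sample)
# and hypothetical measurements in millimetres.
sq = shearing_quotient(crest_length=11.0, tooth_length=10.0,
                       slope=1.0, intercept=0.0)
# With these toy coefficients the expected crest length is 10.0 mm, so
# SQ = +10: crests 10% longer than the comparative baseline.
```

A positive SQ thus indicates relatively long shearing crests (better suited to tough foods), while a negative SQ indicates a blunter tooth, consistent with the flat-molared australopith results described above.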
A problem with this technique is that it requires relatively unworn teeth - once crests or other "landmarks" have worn down, it is no longer possible to accurately calculate the SQ. Since most fossil teeth are indeed worn, the technique is therefore of limited utility for analyzing the diets of extinct species.
This problem can be surmounted through dental topographic analysis, a recent technique that involves scanning a tooth in three dimensions, and analyzing the resultant data through geographic information systems (GIS) software. The advantage of this technique is that it analyzes the entire occlusal surface of a tooth, rather than specific "landmarks." Thus, it can be used with equal effect on both worn and unworn teeth. Dental topographic analysis provides information on the total surface slope, aspect, area, angularity, and other topographic attributes of the tooth surface. These attributes, like SQs, can then be used to characterize functional aspects of occlusal morphology. Comparisons between Australopithecus afarensis and early Homo, for example, indicate that A. afarensis had flatter molars with less occlusal relief, and therefore would not have been able to fracture tough, elastic foods as efficiently as early Homo.[4]
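The slope measure at the heart of dental topographic analysis works much like a GIS slope calculation on a digital elevation model. Below is a minimal sketch, assuming the scanned occlusal surface has been resampled onto a regular height grid; it is an illustration of the principle rather than any particular software package.

```python
import numpy as np

def mean_surface_slope(heightmap, spacing=1.0):
    """Mean slope (in degrees) of an occlusal surface sampled as a
    regular height grid. As in GIS slope analysis, the magnitude of the
    local height gradient gives the steepness at each grid point."""
    dz_dy, dz_dx = np.gradient(heightmap, spacing)
    slopes = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    return float(slopes.mean())

# A perfectly flat surface has zero mean slope; a plane rising one unit
# of height per unit of distance slopes uniformly at 45 degrees.
flat = np.zeros((5, 5))
ramp = np.tile(np.arange(5, dtype=float), (5, 1))
```

Because the statistic is an average over the whole surface rather than a measurement at specific landmarks, it degrades gracefully as the tooth wears, which is what makes the technique applicable to worn fossil teeth.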
Dental morphology strictly indicates only what an animal was capable of eating, rather than what it actually did eat. One way of addressing this problem is dental microwear analysis: the examination of microscopic pits, scratches, and other defects on the occlusal surface of a tooth. Microwear patterns reflect the physical properties of the foods actually consumed by an animal. Hard objects, for example, tend to cause pits and dents, while leaves and other tough objects tend to cause scratches and striations. Furthermore, the orientation and organization of striations on the incisors can reflect the preferred direction of food movement across the incisors.
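Quantitative microwear studies typically classify each feature by its length-to-width ratio and then compare summary statistics, such as the percentage of features that are pits, across taxa. The sketch below assumes the 4:1 threshold sometimes used in the microwear literature; the measurements are hypothetical.

```python
def classify_feature(length_um, width_um, ratio_threshold=4.0):
    """Classify a microwear feature as a 'pit' or a 'scratch' by its
    length-to-width ratio. Elongated features read as scratches (tough
    foods dragged across the enamel); compact features read as pits
    (hard objects crushed in place). The 4:1 threshold is one convention
    from the microwear literature."""
    return "scratch" if length_um / width_um >= ratio_threshold else "pit"

def pit_percentage(features):
    """Percentage of (length, width) features classified as pits - a
    statistic often compared across taxa, since hard-object feeders
    tend to show more pitting."""
    pits = sum(1 for l, w in features if classify_feature(l, w) == "pit")
    return 100.0 * pits / len(features)

# Hypothetical feature measurements (length, width) in micrometres.
sample = [(12.0, 10.0), (40.0, 2.0), (8.0, 6.0), (35.0, 5.0)]
```

On this toy sample, two of the four features are compact enough to count as pits, giving a pit percentage of 50%.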
Analyses of P. robustus and A. africanus molars indicate a high level of microwear variability within each species. However, P. robustus specimens generally display more pitting and more microwear overall, supporting the conception of P. robustus as a hard-object feeder and A. africanus as a soft-fruit eater. Additionally, A. africanus shows a greater density of incisor microwear features, indicating a heavier emphasis on incisal preparation of food than P. robustus.
Several confounding factors complicate dental microwear analyses. First, it is important for researchers to exclude teeth that have been artificially worn by taphonomic processes. While most specimen selection protocols are quite stringent, there is always a possibility that observed microwear patterns might be the result of non-dietary variables. Second, the inclusion of grit or other foreign matter in the diet may cause microwear in addition to that caused by food items themselves. Similarly, if hominins practiced any extra-oral food processing with tools or fire, they could have significantly altered the normal microwear properties of their diet. Finally, individual microwear features are not permanent, but instead only exist for a few days. Thus, the microwear patterns observed on a fossil tooth strictly reflect only what the animal consumed during the last few days of its life. If the animal died because of illness, age, or injury, it is quite possible that the composition of its "last supper" could have been substantially different from its normal diet.[5]
A powerful method of assessing diet is the direct chemical analysis of bones, teeth, and other remains. While the morphology and habitat of an extinct species may provide strong indications of what it probably or usually ate, chemical analyses provide direct evidence of what it actually ate. This is because the chemical properties of an ingested food item are reproduced with high fidelity in the tissues of the animal that consumed it.
It is known that mammal metabolisms discriminate against strontium in the diet, in favor of calcium. Since plants contain both elements, an herbivorous mammal would be expected to express a lower strontium-calcium (Sr/Ca) ratio in its tissues than the plant. If the herbivorous mammal is then itself consumed by a carnivore, the carnivore would be expected to express an even lower Sr/Ca ratio, since its metabolism would further discriminate against the strontium. Thus, Sr/Ca ratios can provide indications of trophic level: high values signify a diet high in plant tissues, while low values signify a diet high in animal tissues. Sr/Ca analyses have indicated that the diet of P. robustus was quite diverse and probably included significant quantities of animal foods, in contrast to earlier morphological reconstructions that suggested a primarily herbivorous diet.
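The trophic-level logic can be made concrete with a simple attenuation model, in which each step up the food chain retains only a fraction of the dietary Sr/Ca ratio. The discrimination factor below is purely illustrative, not an empirical value.

```python
def tissue_sr_ca(plant_sr_ca, trophic_steps, retained_fraction=0.25):
    """Sr/Ca ratio expected in a consumer's tissues after a given number
    of trophic steps above the plant baseline. Each step retains only a
    fraction of the dietary ratio, because mammalian metabolism absorbs
    calcium preferentially over strontium. The 25% retention per step
    used here is purely illustrative."""
    return plant_sr_ca * retained_fraction ** trophic_steps

plant = 1.0                          # baseline ratio (arbitrary units)
herbivore = tissue_sr_ca(plant, 1)   # 0.25: one step above plants
carnivore = tissue_sr_ca(plant, 2)   # 0.0625: two steps above plants
```

The monotonic decline with each step is what allows a measured Sr/Ca ratio to be read as a (rough) indicator of trophic level.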
There are several disadvantages to the Sr/Ca technique. First, since bone tissue is approximately 30% organic, it is very susceptible to diagenesis. The Sr/Ca ratio and other chemical properties of a fossil specimen may therefore be quite different from those of the original bone. Second, there are notable exceptions to the expected Sr/Ca ratios of certain plants and animals. For example, some browsing herbivores exhibit unusually low (carnivore-like) ratios, because the leaves on which they feed contain relatively low levels of strontium compared with other parts of the plant. Thus, the Sr/Ca ratios seen in P. robustus might reflect a diverse herbivorous diet, rather than an omnivorous one - although this conclusion may not be supported by paleoenvironmental or dental evidence.[6]
Despite these disadvantages, Sr/Ca analysis remains historically important, in that it was one of the first attempts to understand hominin diets through tissue chemistry.
More recent chemical analyses involve measuring the relative levels of stable isotopes of carbon in tooth enamel. There are numerous advantages to this technique. First, tooth enamel is only 1% organic. Therefore, it is not significantly affected by diagenesis - enamel is in a sense "pre-fossilized." Also, teeth are quite numerous in the fossil record, because they are more resistant to taphonomic processes than many other parts of the skeleton. Finally, the wide variance in isotopic signatures makes it relatively easy to determine where a sample falls within the possible range.
Carbon isotope analysis depends on the fact that most plants use one of two different photosynthetic pathways in the fixation of atmospheric CO2. The first pathway, known as C3, discriminates heavily against 13C during photosynthesis, resulting in a highly depleted 13C/12C ratio in the plant's tissues. The C4 pathway, on the other hand, discriminates less heavily against 13C, and results in a less depleted 13C/12C ratio.
Since tropical trees, bushes, shrubs, and forbs are C3 plants, while tropical grasses and some sedges are C4 plants, the 13C/12C ratio of a tropical animal's tissues can indicate what types of plants it ate. Importantly, in contrast to the strontium signal, the carbon isotope signature does not become attenuated across trophic levels. Therefore, a tropical animal exhibiting a strong C4 signature may have eaten large proportions of grasses and sedges, or it could have eaten large proportions of animals that had themselves consumed such plants.
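Isotope ratios are conventionally reported as δ13C values - per mil deviations of the sample's 13C/12C ratio from a reference standard (VPDB) - and a sample's position between the two plant groups can be read with a simple two-endmember linear mixing model. In the sketch below, the endmember values are approximate literature averages, and trophic enrichment is ignored for simplicity.

```python
def delta13c(r_sample, r_standard):
    """delta-13C in per mil: the deviation of a sample's 13C/12C ratio
    from a reference standard (conventionally VPDB)."""
    return (r_sample / r_standard - 1.0) * 1000.0

def fraction_c4(delta_sample, delta_c3=-26.0, delta_c4=-12.0):
    """Fraction of C4-derived carbon under a simple two-endmember linear
    mixing model. The endmember values (roughly -26 and -12 per mil for
    C3 and C4 plants) are approximate averages, and trophic enrichment
    between diet and enamel is ignored for simplicity."""
    return (delta_sample - delta_c3) / (delta_c4 - delta_c3)

# A sample at -19 per mil falls midway between the two endmembers,
# implying roughly a 50/50 mix of C3- and C4-derived carbon.
```

Because the carbon signal passes up the food chain largely unattenuated, a 50% C4 fraction in enamel is consistent with eating C4 plants directly, eating animals that ate them, or any mixture of the two.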
According to early reconstructions, both A. africanus and P. robustus would be expected to display a strong C3 profile, reflecting forest-based diets of fleshy fruits and leaves (A. africanus) and smaller, harder fruits and nuts (P. robustus). However, carbon isotope analyses demonstrate that both australopiths display signatures highly distinct from those of C3 browsers and C4 grazers. In other words, both taxa consumed a mixture of C3 and C4 foods. The C4 component is particularly surprising, since it implies the consumption of grasses, seeds, sedges, or grazing animals - food sources not previously thought to have been consumed by australopiths. Additionally, the C4 component contrasts with earlier conceptions that A. africanus was very similar to modern chimpanzees (Pan troglodytes) in both diet and habitat, since chimpanzees do not tend to utilize C4 resources even in open, grassy, savanna environments.[7]
The most recent carbon isotope analyses have employed laser ablation techniques to sample enamel along multiple points on a single tooth. Since teeth grow in physically detectable increments, it is thus possible to investigate dietary variability within the lifetime of an individual hominin. These studies indicate that C4 foods such as grasses, sedges, and grazing animals were an important but highly variable component of the P. robustus diet. The high seasonal and interannual variation in the diet suggests that P. robustus was a dietary generalist, able to adapt to fluctuating environmental conditions, perhaps by using its unique masticatory apparatus to exploit hard, low-quality foods when preferred foods were not available. Again, this contradicts the classic notion of P. robustus as a dietary specialist only capable of eating hard, low-quality foods.[8]
Overall, stable carbon isotope analysis has demonstrated that the diets of A. africanus and P. robustus were far more variable and complex than the simple dichotomy of "high-quality generalist" and "low-quality specialist" suggested by their respective morphologies.
Although the archaeological record of early human evolution is quite sparse (especially before 1.8 mya), artifacts can provide valuable dietary evidence. Specifically, the form and wear patterns of early stone tools can suggest how hominins may have gathered and processed various plant and animal foods. Additionally, examinations of cut marks on preserved animal bones may indicate hominin hunting or scavenging behaviors.
An emerging research domain in the evolution of the human diet is energetics, or the study of how animals utilize food energy for basal metabolism, physical activity, growth, and reproduction. If an animal's total energy expenditure can be estimated, then it should be possible to predict the energetic properties of its diet.
Several influential studies have noted that brain and other neurological tissues are metabolically "expensive," requiring approximately 16 times as much caloric energy to maintain as skeletal muscle. Furthermore, brain metabolism accounts for up to 25% of resting metabolic rate (RMR) in adult Homo sapiens. Since H. sapiens is the most highly encephalized of all extant primates, these facts would seem to predict that it should take in more caloric energy than any other primate of similar body size. However, this is not the case. On the contrary, the energy intake in relation to body size of H. sapiens falls within the predicted values for other primates.
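The scale of the brain's energy demand can be illustrated with rough, back-of-the-envelope numbers. The wattage figures below are approximate values of the kind cited in the energetics literature, not measurements from any particular study.

```python
def brain_share_of_rmr(brain_watts, rmr_watts):
    """Fraction of resting metabolic rate (RMR) consumed by the brain."""
    return brain_watts / rmr_watts

# Approximate, illustrative values: an adult human brain runs at roughly
# 15 W against a whole-body RMR of about 70 W, i.e. around 20-25% of
# resting expenditure - several times the share typically cited for
# other anthropoid primates (roughly 8-10%).
human_share = brain_share_of_rmr(15.0, 70.0)    # ~0.21
primate_share = brain_share_of_rmr(6.0, 70.0)   # ~0.09 (illustrative)
```

The puzzle described above is that despite this outsized brain budget, total human energy intake scales with body size much as it does in other primates, which is what motivates the compensation hypotheses discussed next.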
Several solutions to this problem have been proposed. First, it has been argued that H. sapiens has a relatively small gastrointestinal tract (GIT) for its body size when compared with other primates. Since the GIT is itself an "expensive tissue" like the brain, it is possible that the energy cost of increasing brain size over time within the hominin lineage was compensated for by a decreasing GIT size.[9] However, this evidence has been disputed.[10] Additionally, H. sapiens possesses relatively more adipose tissue and less muscle tissue than other primates of similar body size. Since adipose tissue is metabolically less expensive than muscle tissue, this could also represent an energetic compensation for increased brain size. Furthermore, large amounts of energy-storing adipose tissue may have been of particular evolutionary importance to human infants, who experience a period of metabolically expensive postnatal brain growth, unlike other mammals.
As noted above, relatively smaller GITs are also associated with higher-quality diets, while relatively larger GITs are associated with lower-quality diets. This suggests that hominins may have energetically compensated for larger brains by increasing the quality (caloric density) of their diets; the smaller GIT of H. sapiens may therefore constitute an adaptation for a higher-quality diet, rather than necessarily a reduction in basal metabolic expenditure.[11]
These arguments have strong implications for interpreting the evolution of the human diet in the fossil record, since they are strongly connected to brain size. Given the quite strong paleontological evidence for the cranial capacities of most hominin species, it should be possible to trace the development of dietary quality over time. For example, it appears that the most significant increases in hominin brain size began with the emergence of Homo erectus approximately 1.8 million years ago, which may suggest a corresponding increase in dietary quality at this time.