In their myriad tellings, “scientific” narratives of human evolution have accumulated considerable ideological baggage; accounts of human origins are often more rooted in fiction than fact, and many were spawned before recent archaeological and scientific breakthroughs. Few models are as dominant as the story of “Man the Hunter.” This theory of evolution and human nature argues that human beings (1) are natural carnivores, (2) were always hunters, and (3) are inherently violent and aggressive. Not only prevalent in science, these assumptions spread into culture and everyday life, where they shaped anthropocentric worldviews and sedimented into “common sense.” Yet each element in the Man the Hunter model is a myth that both stems from and perpetuates false concepts of human identity. The prevalent notion of “human nature” has no grounding in historical reality; it is in fact a social construction with troubling implications and consequences.
Clearly, these three assertions sustain and support each other. If humans are natural carnivores, they have to hunt to survive; since hunting, moreover, is impossible without killing, violent behaviors form the basis of social life. To say that humans are natural carnivores is to state that since our hominid beginnings 5-8 million years ago we ate a meat-based diet and killed animals to satisfy our cravings for flesh and blood. But it also makes the stronger claim that human physiology requires meat and cannot flourish or function properly on a vegetarian diet. Meat consumption is primordial, natural, good, and necessary. Thus, humans cannot and should not live without killing animals, and violence is inherently and necessarily a part of their existence. Natural carnivores are therefore born to hunt and kill; they are violent not only toward animals but also toward each other; carnivorism is our original sin.
The Man the Hunter model has influenced many claims about the biological basis and evolution of violence in human life. These are arbitrary claims rooted in speciesist, carnivorous, and patriarchal biases, and we shall take them apart one at a time.
The fact that early hominids ate some animal flesh in no way entails that they were “carnivores.” Our australopithecine ancestors were opportunistic omnivores who ate anything they could, a diet largely composed of fruits, nuts, seeds, and plants, along with any scraps of flesh they could find. For at least three million years they maintained this diet, and the little meat they consumed came from scavenging carrion or eating insects, not from hunting. The jaws, small incisors and canines, and blunt, flat molars of australopithecines were hardly suited for cutting, tearing, and masticating meat.
According to many theorists, however, once the genus Homo emerged over two million years ago, and with it the first primitive tools, the hominid diet changed. Whereas ten percent of australopithecine food intake was meat, this figure doubled for Homo erectus, the first hominid species thought to actively hunt animals. In the turn from scavengers to hunters, many argue, Homo erectus established a dramatic new mode of life whereby human survival was secured not by whatever the environment provided but rather through actively securing sustenance by hunting animals. The Homo erectus diet was far more versatile than that of other hominids, enabling the species to move about freely, independent of the food supply of any specific locale, and thus to begin a dynamic exodus out of Africa to other continents.
For many interpreters, meat was not just a key part of the Homo erectus diet two million years ago, it was a crucial stimulus to the human brain and the evolution of society. In Ape Man: The Story of Human Evolution, Robin McKie writes: “Meat …made us brainy. Easy to digest and rich in energy, meat provided the vital resources that our expanding brains demanded … The new diet provided mothers with high-quality [nutrition] for the brains of their developing babies, and provided continuing neurological sustenance as those infants grew up. And not just meat, but fat and bone marrow – easily digested, energy-rich foods that permitted the evolution of smaller stomachs which in turn saved internal energy… We started to eat meat, got smarter and thought of clever ways to obtain more meat.” As hunting demanded intelligence, stealth, communication, and cooperation, it sparked the development of a more complex social life.
Behold how patriarchal, carnivorist fantasies are projected from the present to prehistory. There is no evidence linking meat consumption and the qualitative advancement of social life and the human brain. Moreover, the gathering or production of any food source surely required as much cooperation as hunting and would logically have brought about the same evolutionary result. Indeed, whereas hunting is framed as an exclusively male activity, gathering plants involved the cooperation of men and women and thus – as a practical activity — should have been a greater catalyst for social cooperation and brain development than hunting.
The claim that human beings are natural carnivores who thrive from eating meat is falsified by mountains of scientific evidence and everyday experience in modern populations plagued by heart disease, cancer, strokes, obesity, osteoporosis, and other diseases. An overwhelming body of scientific data demonstrates that animal fat causes disease processes in the human body, such as prostate and breast cancer, heart disease, diabetes, and strokes. Human meat-eaters are many times more likely to fall victim to these diseases, along with obesity, than vegetarians and vegans. No truly carnivorous animal dies from the fat and protein of another animal. Human physiology is radically different from that of bona fide carnivores such as tigers and hyenas. Humans lack the teeth, saliva, and digestive systems necessary to eat and digest meat efficiently.
Even if humans have been carnivores and killers throughout their history, and even if meat consumption was crucial to the stimulation of the human brain and social evolution, it does not therefore follow that a carnivorous mode of existence continues to be a healthy lifestyle, an ethical diet, or a positive stimulant of social evolution. Appeals to tradition always beg the question of whether or not the tradition is valid and viable and should be perpetuated rather than ended.
The entrenchment of carnivorous lifestyles makes change difficult, to be sure, but neither impossible nor undesirable. Driven by the intense propaganda of the modern meat and dairy industries, consumer appetites for flesh have reached staggeringly high levels on a global scale; in the last century, an omnivorous hominid has mutated into a carnivorous ecomorph. Champions of hunting and meat-eating fail to grasp that what was once a necessary survival mechanism and functional behavior is now (putting aside the debatable exception of the rare subsistence cultures that remain) an unnecessary, unjustifiable, addictive, health-destroying, environment-devastating, dysfunctional behavior and social practice.
Whatever greater adaptability and brain stimulation meat eating might once have provided (a dubious proposition), hunting and meat consumption also endowed Homo species with the technologies to massacre one another, to exterminate countless other animal species, and to colonize the planet. Armed with spears, knives, swords, guns, blades, and forks, Homo sapiens — no longer a vulnerable source of prey — became the most powerful predator on the planet and an agent of mass extinction. From scavenging and hunting to factory farming and the erection of the Global Meat Culture on the ruins of ancient rainforests, humans’ socially constructed carnivorous appetites have become a driving force of social and ecological crisis.
Our hominid ancestors secured their meat primarily through scavenging, not hunting, and therefore were dependent upon the efforts of other species. In contradistinction to the killer-carnivore dogma, Donna Hart and Robert Sussman’s book, Man the Hunted, emphasizes that our ancestors were prey far longer than they were predators, and that this vulnerability sparked the evolution of intelligence. Adults were only 3-5 feet in height, weighed 60-100 pounds, had small teeth and no claws, and lacked tools or weapons. Whether sleeping in caves or walking through savannas, hominids were constantly vulnerable to attack from mega-predators such as hyenas, saber-toothed cats, reptiles, and raptors, who regularly dined on hominids and other primates. Outnumbered, slower, and weaker than the ferocious beasts that hunted them, they had to band together, be versatile, communicate with sounds, guard sleep sites, and on the whole be clever and smart. The status of hominids as the hunted rather than the hunters destroys the image of powerful hominids at the top of the food chain, and it also underscores a key dynamic in human evolution: a coevolution between humans (as prey) and powerful carnivorous animals (as predators).
Thus it was the acquisition of the social skills and smarts necessary to avoid being eaten by deadly predators, not the eating of animal flesh, that stimulated the growth of social complexity and the hominid brain. The first evidence of stone tools dates to 2.3 million years ago; the earliest hominid fossils date back at least seven million years, and thus our ancestors walked about for nearly five million years before inventing tools. As there is no good evidence of our ancestors using fire until 800,000 years ago, and the “first unequivocal evidence of large scale, systematic hunting … is available from paleoarchaeological sites possibly only 60,000-80,000 years old,” Hart and Sussman conclude that “meat consumption could not have been the main or only catalyst in the qualitative leap toward humankind.”
If, as Hart and Sussman argue, large-scale hunting does not begin until 60,000-80,000 years ago, this aspect of human behavior has been grotesquely overemphasized. It is a strange “hunter” species who has hunted for only a small fraction of its existence, who mostly killed insects and small animals, who scavenged more than killed, and who — until very recently in Western nations – obtained the bulk of its calories from plant foods. As Jared Diamond writes in The Third Chimpanzee, “Studies of modern hunter-gatherers with far more effective weapons than early Homo sapiens show that most of a family’s calories come from plant food gathered by women. Men catch rabbits and other small game never mentioned in the heroic campfire stories…I would guess that big-game hunting contributed only modestly to our food intake until after we had evolved fully modern anatomy and behavior. For most of our history we were not mighty hunters but skilled chimps, using stone tools to acquire and prepare plant food and small animals.”
Clearly, the Man the Hunter theory is a patriarchal construct which inflates the role men played in social reproduction and minimizes the contributions of women. Feminist researchers observing chimpanzees have discerned that most food is obtained by gathering, not hunting; analysis of modern human hunter-gatherer cultures shows that tools are used mainly for gathering (plants, eggs, small insects and animals), not hunting, that most of the tools are made and used by women, and that women collect 60-90% of the food. Thus, a far more accurate view of early human history would single out not Man the Hunter but rather Woman the Gatherer, for in early societies women played the more important role in feeding families, socializing the young, and acquiring and sharing knowledge that was passed to subsequent generations.
The unfortunate mythology linking carnivorism, hunting, and violence was spawned in large part by archaeologist Raymond Dart. Looking at the holes and dents in australopithecine skulls, Dart concluded that our ancestors not only hunted and killed prey, but also murdered each other by using the bones of animals as clubs and weapons. In the 1960s, Robert Ardrey popularized Dart’s theory in a number of books that were influential on the public and scientific community alike. Following Dart’s thesis, Ardrey argued that “Man is a predator whose natural instinct is to kill with a weapon.” Killing, he claimed, stimulated the development of big brains, and war and territorialism led to the great accomplishments of Western man.
In the mid-1970s, however, the South African fossil specialist C.K. Brain refuted the killer ape-man theory through meticulous research and common sense. He realized that the bones Dart interpreted as lethal weapons wielded by australopithecines were actually fragments of hominids and other primates discarded by leopards and hyenas. Brain examined the marks and indentations in the skulls of baboons and australopithecines and saw that they were consistent not with weapons used by hominids, but rather with bites from predators such as leopards and hyenas, who dragged their prey into caves. Dart had confused cause and effect: the hominids were the meals, not the diners.
Brain’s critiques began the shift in anthropology away from Dart’s theory, but the killer-ape view persisted in many quarters of science and certainly in the popular imagination. On top of Dart’s initial error, elaborate falsehoods were written about the aggressive, territorial, bloodthirsty human type whose true beastly nature lies simmering beneath the veneer of “civilization” and morality. Forced to hunt and kill throughout their history, the argument goes, humans have a violent nature that can explode at any time and is barely subdued by morality and law.
In his book, Primates and Philosophers, Frans de Waal takes apart the “veneer” model of civilization, which sees animality as inherently violent and brutal, such that civilization succeeds only to the extent that it covers animality over, holds it back, and creates a gauzy and fragile barrier between primates and humans. Morality is a “thin overlay on an otherwise nasty nature.” We are “bad” when we lapse into “nature” and “good” when we stave it off. The veneer model commits two grave errors: (1) it denies the continuity between animals and humans, and (2) it gives a one-dimensional view of animal conduct as selfish and violent, completely missing the empathetic and cooperative side of primate behavior.
While portrayals of humans as natural-born carnivores, hunters, and killers were highly distorted, over the last two centuries Western culture and anthropology have spawned yet another myth. Rejecting Thomas Hobbes’ bellicose view of human nature (initially locked into a “war of all against all”), many theorists turned to the opposite extreme and embraced Jean-Jacques Rousseau’s vision of the “noble savage” and the peaceful nature of preliterate cultures (before the emergence of agricultural societies ten thousand years ago). As discussed in Lawrence Keeley’s War Before Civilization, however, a new body of research suggests that warfare was pervasive throughout prehistory, such that the percentage of the population killed annually exceeded that of any modern society.
The truth of human nature lies somewhere between the Hobbesian view of people as inherently wicked though manageable through coercive social authority, and the Rousseauian belief that humans are innately good but corrupted by society. The very same species that produced the rock paintings in the caves of Lascaux, the Parthenon, Hamlet, the Sistine Chapel, and the Eroica Symphony also operated the ovens of Dachau, dropped atomic weapons on civilian populations in Japan, and fertilized the killing fields of Cambodia with bones and blood. As Homo ambiguous, we are a Janus-faced species capable of peace and warfare, love and hatred, good and evil, compassion and contempt, and creativity and destruction.
We need to acknowledge the dark side and violent tendencies of human nature without lapsing into pessimism and determinism. We should recognize that human violence is more pervasive than often thought, but also that peaceful cultures have existed and that hierarchical societies invariably spread violence, warfare, and ecological destruction. We need a frank evaluation of human nature, social history, and the gravity of the current social and ecological crisis, while also envisioning alternative ethics and social institutions. We must realize that traits which are “natural” are neither unchangeable nor necessarily bad (e.g., reciprocal altruism in primates). Once we grasp that culture itself is a product of nature, that our capacities for language, thought, and ethics stem from potentialities and dynamics inherent in evolution, and that genes require suitable environments to be expressed, the rigid wall between biology and culture comes crashing down. It’s not nature vs. nurture, but rather nature via nurture.
We are not infinitely plastic, pliable, and malleable, but neither are we rigidly fixed and inflexible due to our biological make-up. Everything turns on the basic but crucial distinction between being influenced by genes and being controlled by them; genes shape us within a wider social, cultural, and psychological context, which in turn conditions genes. If we have the capacity to change through learning and education, as has been demonstrated countless times in human history, then we are not defined solely by our biological nature and genetic make-up, and our socialization and cultural practices play a major if not decisive part in who we are and become. This leaves the door wide open for education, moral evolution, and progressive social change.