Should We Thank or Blame the Ancestors? Ancient Farmers Stop the Ice Age

Millennia ago, ancient farmers cleared land to plant wheat and maize, potatoes and squash. And unknowingly, they may have been fundamentally altering the climate of Earth.

New evidence of ancient farming effects

A study published in the journal Scientific Reports provides new evidence that ancient farming practices led to a rise in the atmospheric emission of the heat-trapping gases carbon dioxide and methane - a rise that has continued since, unlike the trend at any other time in Earth's geologic history.

It also shows that without this human influence, by the start of the Industrial Revolution, the planet would likely have been headed for another ice age.

"Had it not been for early agriculture, Earth's climate would be significantly cooler today," says lead author, Stephen Vavrus, a senior scientist in the University of Wisconsin-Madison Center for Climatic Research in the Nelson Institute for Environmental Studies. "The ancient roots of farming produced enough carbon dioxide and methane to influence the environment."


Earth’s climate would be significantly colder today without the effects of ancient farming. (Image: © trattieritratti /Fotolia)

The findings are based on a sophisticated climate model that compared our current geologic time period, called the Holocene, to a similar period 800,000 years ago. They show the earlier period, called MIS19, was already 2.3 degrees Fahrenheit (1.3°C) cooler globally than the equivalent time in the Holocene, around the year 1850. This effect would have been more pronounced in the Arctic, where the model shows temperatures were 9 to 11 degrees Fahrenheit colder.

Using climate reconstructions based on ice core data, the model also showed that while MIS19 and the Holocene began with similar carbon dioxide and methane concentrations, MIS19 saw an overall steady drop in both greenhouse gases while the Holocene reversed direction 5,000 years ago, hitting peak concentrations of both gases by 1850. The researchers deliberately cut the model off at the start of the Industrial Revolution, when sources of greenhouse gas emissions became much more numerous.

Classic climate change cycles

For most of Earth's 4.5-billion-year history, its climate has largely been determined by a natural phenomenon known as Milankovitch cycles, periodic changes in the shape of Earth's orbit around the sun - which fluctuates from more circular to more elliptical - and the way Earth wobbles and tilts on its axis.

Astronomers can calculate these cycles with precision and they can also be observed in the geological and paleoecological records. The cycles influence where sunlight is distributed on the planet, leading to cold glacial periods or ice ages as well as warmer interglacial periods. The last glacial period ended roughly 12,000 years ago and Earth has since been in the Holocene, an interglacial period. The Holocene and MIS19 share similar Milankovitch cycle characteristics.

All other interglacial periods scientists have studied, including MIS19, begin with higher levels of carbon dioxide and methane, which gradually decline over thousands of years, leading to cooler conditions on Earth. Ultimately, conditions cool to a point where glaciation begins.

Ice covered fjord on Baffin Island with Davis Strait in the background. Baffin Island is the largest island in the Canadian Arctic Archipelago and fifth largest island in the world. Credit: NASA / Michael Studinger

Following the gases

Fifteen years ago, study co-author William Ruddiman, emeritus paleoclimatologist at the University of Virginia, was studying methane and carbon dioxide trapped in Antarctic ice going back tens of thousands of years when he observed something unusual.

"I noticed that methane concentrations started decreasing about 10,000 years ago and then reversed direction 5,000 years ago and I also noted that carbon dioxide also started decreasing around 10,000 years ago and then reversed direction about 7,000 years ago," says Ruddiman. "It alerted me that there was something strange about this interglaciation ... the only explanation I could come up with is early agriculture, which put greenhouse gases into the atmosphere and that was the start of it all."

Ruddiman named this the Early Anthropogenic Hypothesis and a number of studies have recently emerged suggesting its plausibility. They document widespread deforestation in Europe beginning around 6,000 years ago, the emergence of large farming settlements in China 7,000 years ago, plus the spread of rice paddies - robust sources of methane - throughout northeast Asia by 5,000 years ago.


The growth of rice cultivation from 5,000 years ago onward contributed to rising methane levels. (Image: CC0)

Ruddiman and others have also been working to test the hypothesis. He has collaborated with Vavrus, an expert in climate modeling, for many years and their newest study used the Community Climate System Model 4 to simulate what would have happened in the Holocene if not for human agriculture. It offers higher resolution than climate models the team has used previously and provides new insights into the physical processes underlying glaciation.

For instance, in a simulation of MIS19, glaciation began with strong cooling in the Arctic and subsequent expansion of sea ice and year-round snow cover. The model showed this beginning in an area known as the Canadian archipelago, which includes Baffin Island, where summer temperatures dropped by more than 5 degrees Fahrenheit.

"This is consistent with geologic evidence," says Vavrus.

Northeast coast of Baffin Island north of Community of Clyde River, Nunavut, Canada. (Image: CC BY-SA 2.0 )

Today, the Arctic is warming. But before we laud ancient farmers for staving off a global chill, Vavrus and Ruddiman caution that this fundamental alteration to our global climate cycle is uncharted territory.

"People say (our work) sends the wrong message, but science takes you where it takes you," says Vavrus. "Things are so far out of whack now, the last 2,000 years have been so outside the natural bounds, we are so far beyond what is natural."

The reality is, we don't know what happens next. And glaciers have long served as Earth's predominant source of freshwater.

"There is pretty good agreement in the community of climate scientists that we have stopped the next glaciation for the long, foreseeable future, because even if we stopped putting carbon dioxide into the atmosphere, what we have now would linger," says Ruddiman. "The phenomenal fact is, we have maybe stopped the major cycle of Earth's climate and we are stuck in a warmer and warmer and warmer interglacial."


Is Farming the Problem?

Here is a story that we tell ourselves. From The Good Ancestor:

“Consider the immense legacy left by our ancestors: those who sowed the first seeds in Mesopotamia 10,000 years ago, who cleared the land, built the waterways and founded the cities where we now live, who made the scientific discoveries, won the political struggles and created the great works of art that have been passed down to us.”

We don’t question this narrative. We simply accept it as “the way things happened”. But read it again with your critical brain engaged. To begin with, this ancestral narrative begins in Mesopotamia. This is not accurate. A few people in the Mesopotamian river basins started writing down what they were doing (mostly with regard to how much grain and gold were passing through their hands), but our cultural story begins long before Mesopotamia and in many different parts of the world, and ultimately, the human story begins in Africa, not the Middle East.

Farming did not begin with sowing seeds. This is a classic chicken-and-egg assertion. What seeds? Where did they come from? How did humans even know to put them in the ground and expect to harvest something humans could eat? We’ll come back to this because this is the focus of this essay.

Let’s consider the assertion that humans built waterways. Yes, there are some canals, some irrigation projects, a few long-distance pipes and aqueducts. These are not generally waterways in the sense of transport, which is, I believe, what is being referenced. Humans have not, in any case, built most of the bodies of water we use. Waterways are a priori part of this planet; humans have done more to break rivers, lakes and oceans than to build them. We have built very little of the solid infrastructure — like waterways — that our culture rests upon. We are not able to build these things even with fossil-fuel driven tools. The planet does this for us, and the planet does not generally allow for massive land alterations that She has not built. Time and geological processes will erase everything we do.

I’d almost give a pass to founding cities and making scientific discoveries. Except most cities are not very ancient, they are not the homes of most humans, and they will definitely shift with human settlement patterns going into the future. As to discoveries, our reflex is to think of Science springing into existence in the Europe of the 17th century. Maybe with a few outliers in time and geography. But like those sown seeds, which had to be created and did not occur naturally, scientific discoveries began long before humans began writing down their accomplishments, and in many different parts of the world. In fact, creating those seeds ranks high on the list of great scientific discoveries. Most of the foundations of our knowledge of the world were laid in deep time. The ways we build, the things we eat, our material culture, language — all these are far more important to our lives and took far more inventiveness and creativity and sheer dogged experimentation than anything done since the Enlightenment.

And then there’s the political struggles “our” ancestors won… Do I even have to say anything? There isn’t a lot of love for the ancestors who won political struggles because most of us come from the ones who lost. And there should not be political struggle; we should be working together, not tearing each other apart. Similarly, which great works of art, and who gets to deem them great?

As you can see, there are many fatal flaws in the accepted narrative. We need to stop telling ourselves this story. It is wrong. It ignores and debases our true ancestry. Most importantly, it is destructive and divisive.

We need new stories (like this one). We need to understand where we come from. We need to know how this world works and what our place is in it. We need to know how we work. We’re still surprisingly vague on a good many things. For today, let’s look at that chicken egg.

I’ve seen assertions that crops and livestock were domesticated very quickly, maybe within a couple dozen generations. There are no domestics and then suddenly — BAM — at 10,000 years ago, domesticated food sources are everywhere in the archeological record. Moreover, very shortly after they appear in the archeological record, foods are being grown in volumes sufficient for trade, not merely for subsistence.

It’s hard to argue with this because this is what we do see. Or, more precisely, this is what we see in that portion of human material culture that is preservable, has survived in settings that indicate use, and has been found and properly interpreted. As you see, the record becomes a bit less set in stone (ha) when you consider all the variables going into it. The kind of things that are preservable in the record are stone and bone tools, teeth and bone from animals and humans, and a few of the toughest plant fibers. These have survived in middens, hearths and various caches. We don’t have much evidence from actual farmed fields or gardens until humans began record-keeping. And of course we haven’t even begun to look for evidence of early farming around most of the globe.

Consider how little information we actually have about the earliest farming. Most tools would not have been made of stone; they would have been wood. Wood is easier to shape and repair, it’s lighter and easier to transport, and most trees can provide useful branches to shape into tools, whereas only a few rock types knap well. Wood tools were very likely dominant, and yet wood does not preserve well except in very dry conditions — which is about the only ecological niche where stone is favored, because trees don’t grow in deserts. So the record is skewed from the outset.

Let’s use logic instead. First, consider our hominid ancestors. Were they farmers? Qualitatively, maybe.

Early hominid anatomy was modified relative to pre-hominids in several key ways that reflect new behaviors. Hominid skeletons show changes in the pelvis, legs and feet for bipedal locomotion; large grinding molars and premolars with reduced canines for effective chewing of fibrous plant foods; minimal sexual dimorphism, because of selection for common behavioral adaptations in both females and males (that is, they were changing to do the same things); and brain expansion and reorganization for developing memory, conceptualization, problem solving, innovativeness and more sophisticated communication, as well as for increased hand skills. These changes both drove and were driven by changes in social relationships and tool use.

All the foregoing are related to gathering plant food on the savanna as the main selection pressure. Females were innovators in gathering. Because of nutritional requirements of pregnancy and nursing and overt demands from hungry children, women had more motivation for technological inventiveness, for creativity in dealing with the environment, for learning about plants, and for developing tools to increase productivity and save time. Selection was for increasingly efficient, time-saving, energy-saving ways of getting food. We still call necessity the “mother” of invention.

So it appears that our earliest hominid ancestors (mostly on our mother’s side) were already manipulating their food resources. I wouldn’t call this farming, though perhaps it is a form of gardening. But where do we draw a line between gardening the existing food plants and breeding — domesticating — new ones? What does domestication entail?

We know that changing plant or animal morphology takes two things — a genetic mutation that codes for a desirable trait and many generations of breeding stock to disperse that trait. The second part is tricky but manageable, especially in the annual plants that were our earliest domesticates. Still, there’s a limit on experimentation: one generation per year. In a human lifespan of maybe 40 years, there’s only so much one can accomplish. And that’s if you’ve managed to find that one plant that has undergone a favorable random mutation. This is perhaps the more difficult part of breeding — especially in a small (human-scale) geographical location. So this is the part of domestication that would have taken a very long time and considerable experimentation in many different localities around the world.
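To get a feel for the two timescales distinguished here, below is a minimal sketch in Python. It is purely illustrative: the starting frequency, selection strength, and the function name are my assumptions, not figures from the essay, and the model is the textbook deterministic haploid selection step, not anything the author describes.

```python
# Illustrative only: deterministic spread of one favorable variant under
# yearly selection. The starting frequency (0.1%) and selection coefficient
# (10%) are assumed numbers for the sketch, not measurements.

def generations_to_spread(p0=0.001, s=0.10, threshold=0.95):
    """Count annual generations until a variant starting at frequency p0,
    favored by selection coefficient s, exceeds the given frequency."""
    p, generations = p0, 0
    while p < threshold:
        # standard haploid selection update
        p = p * (1 + s) / (p * (1 + s) + (1 - p))
        generations += 1
    return generations

if __name__ == "__main__":
    years = generations_to_spread()
    print(f"~{years} annual generations to go from 0.1% to 95% frequency")
```

Under those assumed numbers the spread takes on the order of a century: quick on an archaeological timescale, but well beyond a single 40-year lifespan, and only after the rare favorable mutation has already turned up, which is the step the essay identifies as the真 bottleneck — the long, uncertain wait for the mutation itself.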

Domesticating animals would be even more time-consuming, probably the work of many human generations. Animals don’t produce many offspring in a breeding season; many animals do not breed at all in a given season (too old, too young, etc.); and there are several years between generations. The probability of finding a favorable mutation, even in the fast-breeding goats (among the first domesticated livestock), is much lower than in a square foot of grain. And once that minor miracle has occurred, it takes decades to breed a new population that consistently expresses the new favorable trait. This is the work of a lifetime! To make each adjustment! If you’ve been lucky enough to find that trait occurring in one of your goats.

Not only did our ancestors have to figure out how to grow and manipulate grain-bearing grasses, they also had to figure out how to process the seeds. Our ancestors had more powerful grinding teeth, but our digestive tract hasn’t changed much and it has a hard time extracting nutrition from seeds. That is, after all, the point of a seed — to make sure the germ inside stays intact until it finds the proper conditions to germinate and grow. Seed shells evolved interdependently with the animals that ate them. The seeds that could pass unscathed through an animal’s digestive tract (and incidentally the ones that provided no nutrition to the animal) are the seeds that survived to reproduce.

Human digestion would have required some processing of the seeds before eating them. Which is itself a chicken-egg problem. If seeds don’t readily give up nutrition in their naturally occurring state, why would humans have tried eating them? How would they have known to try to process the seeds, and then how would they have known how to process the seeds? And how many times did this discovery happen only to die out with the discoverer? How many repetitions of these lucky chances combined with brilliant insights would it have taken before humans generally knew how to turn a grass seed into food?

While we are asking questions, consider this one: how long would it take to even realize that any of these things can be done? Domestication implies an active process, an actor with motive and agency, somebody doing things with intention. Even if that is not what archeologists mean precisely, domestication is something humans do, not something that just happens. What human first knew to try to mold a wild grass into grain? What would grain have meant to them before grain existed? What was the objective when the object was nonexistent?

I am not at all convinced that this began as a human, or even hominid, activity. I think it was more like co-evolution. Once humans figured out how to turn the seeds into digestible food, they probably nudged evolution along. Over thousands of years, humans chose the biggest grass seeds in their locality for food, dropping many seeds as they went, probably mostly near their hearths. Over time, wherever there were seed-eating humans, the grass seeds got bigger and changed in other ways that appealed to their human partners. After more time, it is likely they figured out that seeds make new plants that roughly match the plants the seeds came from. All this took many generations of observation and knowledge sharing. Eventually, they were so active in this process of co-evolution that they could be called farmers.

Now, here’s another place to use logic. Would humans have been able to forage in temperate climates year-round? Or would it not have been necessary for humans to have farming skills before they ventured into colder regions? Wouldn’t the humans who walked from Africa to Australia (boats notwithstanding) have needed some way to bring their own food with them? Is it likely that they would have been able to eat from the land during that whole journey through vastly different ecosystems where almost nothing would have been recognizable, let alone digestible? Or is it more likely that they carried seeds with them? Maybe even knew some herding techniques? This would push farming back to at least well before 50,000 years ago. “Well before” because farming had to exist before they left Africa.

All this long process fed into the domestication of food stuffs and to the invention of farming. There were thousands of years of experimentation — yes, science! — before humans even knew what they were doing and thousands more before they had succeeded in creating new species and sub-species of plants and animals. In fact, I suspect there were farmers — those actively engaged in growing food, not just foraging — before there were domesticated strains. Else why would domestication have happened if not because there were farmers?

I do not believe that farming began at about 10,000 years ago. Perhaps agriculture did, however. The distinction I’m making here is between growing food for eating — farming — and growing food for use in generating and maintaining wealth and power over others — agriculture. The main point is that farming existed. Domesticated plants and animals existed. Humans were actively manipulating their environment to control their food supplies long before what is commonly called the advent of agriculture.

No, farming did not begin at 10,000 years ago. However, civilization probably did spring into existence in several places around the globe at that time. Why? Because the climate finally became favorable for long-term human settlement. Ice core records tell us that the climate was wildly unstable up until this time. Every human project would have been interrupted, perhaps cataclysmically, within a few generations. And there were large parts of the globe that could support no human projects at all until the end of the Ice Ages. Humans had been engaged in human projects in the most benign regions for thousands of years before suddenly climate stabilized into the unvarying conditions we consider normal. (Ice cores show us this is not true!) And when this happened humans had the skills, the tool set, the breeding stock, and the knowledge already at hand. They started building more elaborate settlements and farming these places as soon as the climate allowed it.

And right on the heels of elaborate settlements, some humans in some regions began to form hierarchies that benefitted themselves — for which many supporting systems were needed, from record keeping to weaponry manufacture to, yes, agriculture. The easiest way to dominate others is to control the food supply. I believe that what we call the birth of agriculture is actually the birth of these domination systems that required food to be grown for state use, not for food. But the point is: farming existed. Extensively. Well before these states.

Farming gets a bad rap these days. Jared Diamond has called it the “worst mistake in the history of the human race”. Many agree with him. Superficially, I sort of agree with him. My garden experience tells me that soil does not like to be disturbed through annual tilling. Churning up the soil reduces its fertility, turns it into dirt, leads to desiccation and erosion. My instincts tell me that plants and animals will grow best if they are growing in the conditions in which they evolved. So tomatoes like long, hot summers with moderate to high rainfall but not damp conditions — because that’s what the natal region for tomatoes, Central America, is like. Tomatoes do not want to grow in the unnatural conditions of a garden, even if I work really hard to mimic their homelands.

But worst mistake is a strong phrase (considering all and sundry…). And I think it’s imprecise. It’s worth looking at the essay that is titled with this assertion.

Diamond’s essay does show that the common narrative of progress from “hunter-gatherers” to moderns is false. Things do not get linearly better through time. Indeed, there is clear correlation between the adoption of sedentary life-ways and a decrease in most of the measures of good health that can be recorded in the archeological record. In fact, Diamond claims there is some evidence that modern populations still haven’t “recovered” from those times.

For example, skeletons from Greece and Turkey show that the average height of foragers toward the end of the Ice Age was around 5′ 9” for men and 5′ 5” for women. By 3000 BCE, height had dropped to only 5′ 3” for men and 5′ for women. By classical times heights were very slowly on the rise again, but modern Greeks and Turks have still not regained the average height of their distant ancestors. Of course, modern Greeks and Turks are not descended solely from those Ice Age populations, and height is not an equally adaptive trait in all environments… so this is a somewhat spurious claim. But it does show that foragers were not living the “nasty, brutish and short” existence the modern narrative assigns to them.

Diamond places the blame for this decline in health squarely on the adoption of agriculture. But there are problems with this. On the face of it, yes, there is a correlation between a society adopting agriculture and a reduction in health. But almost without fail there are other factors driving both the change in food acquisition strategies and the decreased well-being. The key factor is that there is almost always an elite group presiding over a state apparatus of control. Notably, the archeological remains of elites do not show evidence of declining health. Royal skeletons from Greek tombs at Mycenae were two or three inches taller, with better teeth, than those of commoners. Among Chilean remains, the elite were distinguished not only by costly grave goods but also by a fourfold lower rate of bone lesions caused by disease.

Diamond then presents this personal anecdote to support his claim that contemporary agrarian societies are also unhealthy.

“Women in agricultural societies were sometimes made beasts of burden. In New Guinea farming communities today I often see women staggering under loads of vegetables and firewood while the men walk empty-handed. Once while on a field trip there studying birds, I offered to pay some villagers to carry supplies from an airstrip to my mountain camp. The heaviest item was a 110-pound bag of rice, which I lashed to a pole and assigned to a team of four men to shoulder together. When I eventually caught up with the villagers, the men were carrying light loads, while one small woman weighing less than the bag of rice was bent under it, supporting its weight by a cord across her temples.”

I might counter that this does not implicate farming; it’s the culture of male dominance that is overloading these women. (And it’s not “sometimes made beasts of burden”; it’s always.) I think what Diamond’s own essay shows is that elites have been manipulating food production for their exclusive benefit for a long time. It’s not farming. It’s not even sedentary life-styles. It’s men. Farming is not the worst mistake. Farming is not even a mistake. It’s how we secure our food supply in variable climates. No, the mistake is allowing elites control over farming, over that food supply.

Look at the evidence from what is now called “alternative agriculture”. These are farming practices that are not focused on growing a few commodity crops for trade, which is what we think of as conventional agriculture, even though conventional agriculture is not particularly conventional. Agriculture oscillates between periods when farmers are intent on producing one or two major cash crops and periods when there is no such distinct order. These periods can be associated with societal breakdown and with climate breakdown (often interrelated). But all told, those periods that most resemble what we normalize as conventional farming have been short and sporadic. These conventional agricultures always break down after a few centuries; some last no more than a generation. Furthermore, even on farms with a profit motive, there are numerous breaks with conventional agriculture. Farmers turn to new crops, new methods, new tools. Variation seems to be more common than convention. This tells me that convention doesn’t work. If it did, farmers wouldn’t be changing their routines so frequently.

The narrative of conventional agriculture seems to be more dominant than the practice of conventional agriculture. A survey of book titles, educational programs, and marketing campaigns reveals almost nothing of alternatives — neither contemporary nor historical. To learn of these alternatives takes research effort. That all these disparate ideas of farming are called “alternative”, in opposition to “conventional” practices, when those conventional practices are the outliers, is revealing. Scratch this story and you uncover elite objectives. This is true throughout history. Conventional agriculture is not very good at raising food for community subsistence nor at generating profits for farmers. But it is a highly effective way both to filter wealth upwards and to maintain state control through controlling the food supply.

Which brings us back to stories. History is told by the victors. So our story of ourselves is not ours, but is crafted by and for the elites down through the ages. Did civilization spring into being in Mesopotamia at 10,000 years ago? No. There were cities before there were hierarchies, and human culture existed long before there were cities. This is the story elites crafted to make it seem as though they had a hand in inventing and directing human civil society — and therefore had reason to lead, to be ranked above other humans who had less important roles in creating society, and most importantly to justify taking greater shares of wealth for themselves.

Is farming a mistake? No. But this is a more interesting false narrative. It is not old. It is very recent. Up until a generation or two ago, you would find no people of any background who would make this assertion — even though we’ve been farming for thousands of years and presumably if it was a bad thing it might have occurred to someone else before now to say so. No, this is the tale of modern elites. These are people who do not want to farm, who only reluctantly acknowledge that some form of farming is necessary. These elites are fiercely urban, screen-focused and generally disconnected from biology. Farming is a scapegoat in this narrative. All the wrongs of the modern world are not to be blamed on urbanites and their high levels of consumption but on farming. Look at the waste, they say, cow farts and all. Look at the filth! Look at all the space farming uses, space that could be returned to nature (what they mean by that is always unclear to me).

This story is a smokescreen, created because these urbanites all know on some level that how they live is unsustainable. They fear that they are going to have to do actual work, probably involved in growing food, in the not so distant future. They do not want to farm. So they are demonizing farming with all the hot air in their copious lungs.

But once again, farming is not the problem… it’s still elite men and their bad stories.

Diamond, Jared. 1999. “The Worst Mistake in the History of the Human Race”. Discover Magazine, 1 May 1999. Retrieved from https://www.discovermagazine.com/planet-earth/the-worst-mistake-in-the-history-of-the-human-race on 10 March 2021.

Krznaric, Roman. 2020. The Good Ancestor: A Radical Prescription for Long-Term Thinking. The Experiment: New York City, NY.

Tanner, Nancy Makepeace. 1981. On Becoming Human. Cambridge University Press: Cambridge.

Thirsk, Joan. 1997. Alternative Agriculture: A History. Oxford University Press: Oxford.


History of Civilization and Climate Change: What We Can Learn from the Past

Since 2014, scientists have been warning each year that it was the hottest year since 1880. The Earth’s pre-industrial average temperature was about 56.5° Fahrenheit (13.6° Celsius).

According to numerous sources in the past several years, the planet’s temperature has risen by more than 1 degree Celsius, and an increase of just one more degree could mean a disaster of epic proportions. The Earth’s surface temperature has been rising continually for forty years. In June 2016, the average temperature was 1.3° Celsius above the pre-industrial level, and all signs show that it will continue to rise.
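As a quick worked example of these figures, here is a minimal Python sketch. The baseline and the June 2016 anomaly are simply the numbers quoted above; the Celsius-to-Fahrenheit conversion is the standard formula, and the variable names are my own.

```python
# Turn the article's pre-industrial baseline and the June 2016 anomaly
# into absolute temperatures in both scales. Values taken from the text.

def c_to_f(celsius: float) -> float:
    """Standard Celsius-to-Fahrenheit conversion."""
    return celsius * 9 / 5 + 32

PRE_INDUSTRIAL_C = 13.6   # pre-industrial global average, per the article
ANOMALY_2016_C = 1.3      # June 2016 warming above that baseline, per the article

current_c = PRE_INDUSTRIAL_C + ANOMALY_2016_C
print(f"Baseline:  {PRE_INDUSTRIAL_C:.1f} C = {c_to_f(PRE_INDUSTRIAL_C):.1f} F")
print(f"June 2016: {current_c:.1f} C = {c_to_f(current_c):.1f} F")
# Baseline:  13.6 C = 56.5 F
# June 2016: 14.9 C = 58.8 F
```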

Researchers from all over the world agree that this is our own fault, and that if we wish for our civilization to grow and prosper, we need to take immediate action. As the history of civilization shows, if we create our own doom, we will not be the first: humans before us have managed to influence the Earth’s climate and contribute to their own destruction.

Current state of things

Does the term “crazy scientist” sound familiar? Well, that’s the term most relevant authorities used to describe the scientists who first started pointing out the dangers of prolonged emissions of CO2. For those of you unfamiliar with this gas, it’s what made Venus boiling hot and what drives the greenhouse effect.

Since the start of the industrial age, enormous quantities of this gas have been released into our atmosphere. Nobody knows exactly what effect this will have on the future of our planet, because nothing like it has happened before. What we do know is that our home is warming up. The consequences are numerous, and they include the melting of the polar ice caps, which in turn could cause sea levels to rise by 10 to 40 meters. That scenario alone could wipe out civilization as we know it.

In November of 2015, the world’s great powers met in Paris to discuss strategies for tackling the dangers of global warming. The result was the Paris Agreement, which aims at:

Holding the increase in the global average temperature to well below 2 °C above pre-industrial levels and to pursue efforts to limit the temperature increase to 1.5 °C above pre-industrial levels, recognizing that this would significantly reduce the risks and impacts of climate change.

On April 22, 2016, 179 countries signed the agreement in New York, but only 20 countries ratified it, which is not enough for the treaty to enter into force. This could be a historic point for our planet because it takes a collective effort from all the countries in the world to stop what could be the downfall of our civilization.

Climate change and the birth of culture

In order to get to the point at which we are able to influence Earth’s climate, our species needed a climate in which it could thrive. The Last Glacial Period ended around 12,000 years ago, which allowed our civilization to flourish. The most important factor for the growth of a civilization is a steady source of food. Areas of our planet with warm climates gave birth to the first great civilizations, like Egypt and Mesopotamia.

Science, art, or any other form of culture would be impossible without adequate resources that allow people to dedicate sufficient time to them. Before the first empires, humans were mostly hunters who spent their time in search of food, much like all other animals. Somewhere between 7,000 and 5,000 years ago, the Earth’s climate stabilized, resulting in a period of prosperity.

Still, by burning a lot of trees, our ancestors prevented an ice age some 6,000 years ago. Humans’ ability to interfere with the climate has been around much longer than most of us think, and the declines of the Egyptian and Indus civilizations could be partially related to the impact they had on their environments.

Examples of major climate crises can be traced in much more recent history. The Medieval Warm Period took place from about 900 AD to 1200 AD, and the Little Ice Age lasted from about 1300 AD to 1700 AD. The results of both events included hunger, epidemics, and wars.

Culture and science decline each time climate conditions change, because primal needs and the instinct for survival take over. If the Earth’s atmosphere continues to absorb high amounts of greenhouse gases, we might witness the greatest climate change since the Last Glacial Period.

Despite all of these factors, mankind isn’t the only force behind the great shifts in the Earth’s climate. The amount of sunlight and other cosmic circumstances play a large role in the fate of life on our little planet.

Space and the climate on Earth

There would be no life on Earth without a number of cosmic factors that influence Earth’s climate. Our planet’s distance from the Sun is considered to be perfect for the development of life. More importantly, the tilt of Earth’s axis plays a huge part in determining whether we are headed for an ice age or a warm period.

The angle of the Earth’s axial tilt changes in relation to the plane of the planet’s orbit. It varies between 22.1° and 24.5° over a cycle of approximately 41,000 years. The lower angle brings less insolation to the high latitudes in summer; conversely, the higher angle brings more. Many scientists think that the major swings in climate are paced by these variations in the angle of the Earth’s axial tilt, together with the other orbital cycles described earlier.
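To make the scale of this cycle concrete, here is a minimal Python sketch that treats obliquity as a single sinusoid between 22.1° and 24.5° with a 41,000-year period. This is a toy model, not the real astronomical solution (actual obliquity is the sum of many periodic terms), and the phase is an assumption chosen only so that the present-day value sits near the observed 23.4° on the decreasing branch.

```python
import math

# Toy model of Earth's axial tilt (obliquity): one sinusoid between 22.1 and
# 24.5 degrees with a 41,000-year period. Illustrative only; real obliquity
# follows a more complicated multi-term astronomical series.

MIN_TILT, MAX_TILT = 22.1, 24.5        # degrees, from the text
PERIOD = 41_000.0                      # years, from the text
MEAN = (MIN_TILT + MAX_TILT) / 2       # 23.3 degrees
AMPLITUDE = (MAX_TILT - MIN_TILT) / 2  # 1.2 degrees

# Assumed phase: chosen so the tilt is ~23.4 degrees today and decreasing,
# roughly matching the present-day value.
PHASE = math.acos((23.4 - MEAN) / AMPLITUDE)

def tilt(years_from_now: float) -> float:
    """Approximate obliquity (degrees) at a time offset from the present."""
    return MEAN + AMPLITUDE * math.cos(2 * math.pi * years_from_now / PERIOD + PHASE)

for t in (0, 5_000, 10_000, 20_000):
    print(f"{t:>6} years from now: ~{tilt(t):.2f} degrees")
```

In this toy model the tilt keeps falling for roughly the next ten thousand years; the real timing differs, but the direction of the present trend, toward lower tilt and cooler high-latitude summers, is the sense in which the orbital configuration points toward glaciation, as the next paragraph notes.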

The current shift is going toward the 22.1° angle, and at this point we should be witnessing the first indications of an ice age; but this isn’t happening, because of global warming. Simply put, Earth is heating up instead of cooling down. It seems we are able to disrupt processes that take thousands of years just by producing ridiculous amounts of CO2. The fragile balance of conditions that made life on Earth possible could easily be disrupted if our civilization doesn’t realize what it is doing to the environment that gave us life.

The history of carbon dioxide emissions

The Anthropocene, or anthropogenic era, is an epoch in which civilization is so advanced that it has a significant impact on Earth’s geology and ecosystems. The common opinion is that the anthropogenic age started about 200 years ago with the Industrial Revolution, yet scientific studies have found that human activity first altered the atmospheric levels of CO2 and CH4 thousands of years ago.

Early cultures started the process of deforestation around 8000 BC, which caused the planet to progressively warm up. Processes like this one, which increased the level of greenhouse gases in the Earth’s atmosphere, happened repeatedly in the history of civilization. Conversely, some of the major catastrophes of the past, like the bubonic plague, left their own mark on CO2 concentrations as farmland was abandoned and reverted to forest.

The amounts of greenhouse gases currently being released into the atmosphere are the highest ever recorded. If cutting and burning trees 8,000 years ago stopped an ice age, can you imagine what thousands of cars and factories can do to our planet?

Hunger, wars, and diseases are the optimistic scenario; what is more likely to happen is that temperatures will rise so much that the surface of our planet will become uninhabitable and the air unfit to breathe.

This is a turning point in the history of civilization and it will take a collective effort from all people on Earth to protect our planet and continue the growth of our species. Hopefully, we will not repeat the mistakes of our ancestors, mistakes that caused them years of hardship.


Phiomia (37 Million Years Ago)

LadyofHats / Wikimedia Commons / Public Domain

If you traveled back in time and caught a glimpse of Phosphatherium, you probably wouldn't know if it was fated to evolve into a pig, an elephant, or a hippopotamus. The same can't be said about Phiomia, a ten-foot-long, half-ton, late Eocene proboscidean that resided unmistakably on the elephant family tree. The giveaways, of course, were Phiomia's elongated front teeth and flexible snout, which adumbrated the tusks and trunks of modern elephants.


Mixed Cropping

Mixed cropping, also known as inter-cropping or co-cultivation, is a type of agriculture that involves planting two or more species of plants simultaneously in the same field. Unlike today's monocultural systems, inter-cropping provides a number of benefits, including natural resistance to crop diseases, pest infestations and drought.


Riddles of the Anasazi

The four of us walked slowly down the deep, narrow canyon in southern Utah. It was midwinter, and the stream that ran alongside us was frozen over, forming graceful terraces of milky ice. Still, the place had a cozy appeal: had we wanted to pitch camp, we could have selected a grassy bank beside the creek, with clear water running under the skin of ice, dead cottonwood branches for a fire, and—beneath the 800-foot-high rock walls—shelter from the wind.
 

More than seven centuries ago, however, the last inhabitants of the canyon had made quite a different decision about where to live. As we rounded a bend along the trail, Greg Child, an expert climber from Castle Valley, Utah, stopped and looked upward. “There,” he said, pointing toward a nearly invisible wrinkle of ledge just below the canyon rim. “See the dwellings?” With binoculars, we could just make out the facades of a row of mud-and-stone structures. Up we scrambled toward them, gasping and sweating, careful not to dislodge boulders the size of small cars that teetered on insecure perches. At last, 600 feet above the canyon floor, we arrived at the ledge.
 

The airy settlement that we explored had been built by the Anasazi, a civilization that arose as early as 1500 B.C. Their descendants are today’s Pueblo Indians, such as the Hopi and the Zuni, who live in 20 communities along the Rio Grande, in New Mexico, and in northern Arizona. During the 10th and 11th centuries, Chaco Canyon, in western New Mexico, was the cultural center of the Anasazi homeland, an area roughly corresponding to the Four Corners region where Utah, Colorado, Arizona and New Mexico meet. This 30,000-square-mile landscape of sandstone canyons, buttes and mesas was populated by as many as 30,000 people. The Anasazi built magnificent villages such as Chaco Canyon’s Pueblo Bonito, a tenth-century complex that was as many as five stories tall and contained about 800 rooms. The people laid a 400-mile network of roads, some of them 30 feet wide, across deserts and canyons. And into their architecture they built sophisticated astronomical observatories.
 

For most of the long span of time the Anasazi occupied the region now known as the Four Corners, they lived in the open or in easily accessible sites within canyons. But about 1250, many of the people began constructing settlements high in the cliffs—settlements that offered defense and protection. These villages, well preserved by the dry climate and by stone overhangs, led the Anglo explorers who found them in the 1880s to name the absent builders the Cliff Dwellers.
 

Toward the end of the 13th century, some cataclysmic event forced the Anasazi to flee those cliff houses and their homeland and to move south and east toward the Rio Grande and the Little Colorado River. Just what happened has been the greatest puzzle facing archaeologists who study the ancient culture. Today’s Pueblo Indians have oral histories about their peoples’ migration, but the details of these stories remain closely guarded secrets. Within the past decade, however, archaeologists have wrung from the pristine ruins new understandings about why the Anasazi left, and the picture that emerges is dark. It includes violence and warfare—even cannibalism—among the Anasazi themselves. “After about A.D. 1200, something very unpleasant happens,” says University of Colorado archaeologist Stephen Lekson. “The wheels come off.”
 

This past January and February, Greg Child, Renée Globis, Vaughn Hadenfeldt and I explored a series of canyons in southeast Utah and northern Arizona, seeking the most inaccessible Anasazi ruins we could find. I have roamed the Southwest for the past 15 years and have written a book about the Anasazi. Like Greg, who has climbed Everest and K2, Renée is an expert climber; she lives in Moab, Utah, and has ascended many desert spires and cliffs. Vaughn, a tour guide from Bluff, Utah, has worked on a number of contract excavations and rock art surveys in southeastern Utah.
 

We were intrigued by the question of why the villages were built high in the cliffs, but we were equally fascinated by the “how”—how the Anasazi had scaled the cliffs, let alone lived there. During our outings, we encountered ruins that we weren’t sure we could reach even with ropes and modern climbing gear, the use of which is prohibited at such sites. Researchers believe the Anasazi clambered up felled tree trunks that were notched by stone axes to form minuscule footholds. These log ladders were often propped on ledges hundreds of feet off the ground. (Some of the ladders are still in place.) But they would not have been adequate to reach several of the dwellings we explored. I believe that archaeologists—who are usually not rock climbers—have underestimated the skill and courage it took to live among the cliffs.
 

The buildings that Greg had spotted were easier to get to than most of the sites we explored. But it wasn’t so easy to navigate the settlement itself. As we walked the ledge of the ruin, the first structure we came to was a five-foot-tall stone wall. Four small loopholes—three-inch-wide openings in the wall—would have allowed sentries to observe anyone who approached. Behind this entry wall stood a sturdy building, its roof still intact, that adjoined a granary littered with 700-year-old, perfectly preserved corncobs. Farther along the narrow ledge, we turned a sharp corner only to be blocked by a second ruined wall. We climbed over it and continued. Twice we were forced to scuttle on our hands and knees as the cliff above swelled toward us, pinching down on the ledge like the jaws of a nutcracker. Our feet gripped the edge of the passage: one careless lurch meant certain death. Finally the path widened, and we came upon four splendidly masoned dwellings and another copious granary. Beneath us, the cliff swooped 150 feet down, dead vertical to a slope that dropped another 450 feet to the canyon floor. The settlement, once home to perhaps two families, seemed to exude paranoia, as if its builders lived in constant fear of attack. It was hard to imagine elders and small children going back and forth along such a dangerous passage. Yet the ancients must have done just that: for the Anasazi who lived above that void, each foray for food and water must have been a perilous mission.
 

Despite the fear that apparently overshadowed their existence, these last canyon inhabitants had taken the time to make their home beautiful. The outer walls of the dwellings were plastered with a smooth coat of mud, and the upper facades painted creamy white. Faint lines and hatching patterns were incised into the plaster, creating two-tone designs. The stone overhang had sheltered these structures so well that they looked as though they had been abandoned only within the past decade—not 700 years ago.
 

Vertiginous cliff dwellings were not the Anasazi’s only response to whatever threatened them during the 1200s; in fact, they were probably not all that common in the culture. This became apparent a few days later when Vaughn and I, leaving our two companions, visited Sand Canyon Pueblo in southwest Colorado, more than 50 miles east of our Utah prowlings. Partially excavated between 1984 and 1993 by the not-for-profit Crow Canyon Archaeological Center, the pueblo comprised 420 rooms, 90 to 100 kivas (underground chambers), 14 towers and several other buildings, all enclosed by a stone wall. Curiously, this sprawling settlement, whose well-thought-out architecture suggests the builders worked from a master plan, was created and abandoned in a lifetime, between 1240 and about 1285. Sand Canyon Pueblo looks nothing like Utah’s wildly inaccessible cliff dwellings. But there was a defense strategy built into the architecture nevertheless. “In the late 13th century,” says archaeologist William Lipe of Washington State University, “there were 50 to 75 large villages like Sand Canyon in the Mesa Verde, Colorado, region—canyon-rim sites enclosing a spring and fortified with high walls. Overall, the best defense plan against enemies was to aggregate in bigger groups. In southern Utah, where the soil was shallow and food hard to come by, the population density was low, so joining a big group wasn’t an option. They built cliff dwellings instead.”
 

What drove the Anasazi to retreat to the cliffs and fortified villages? And, later, what precipitated the exodus? For a long time, experts focused on environmental explanations. Using data from tree rings, researchers know that a terrible drought seized the Southwest from 1276 to 1299; it is possible that in certain areas there was virtually no rain at all during those 23 years. In addition, the Anasazi people may have nearly deforested the region, chopping down trees for roof beams and firewood. But environmental problems don’t explain everything. Throughout the centuries, the Anasazi weathered comparable crises—a longer and more severe drought, for example, from 1130 to 1180—without heading for the cliffs or abandoning their lands.
 

Another theory, put forward by early explorers, speculated that nomadic raiders may have driven the Anasazi out of their homeland. But, says Lipe, “There’s simply no evidence [of nomadic tribes in this area] in the 13th century. This is one of the most thoroughly investigated regions in the world. If there were enough nomads to drive out tens of thousands of people, surely the invaders would have left plenty of archaeological evidence.”
 

So researchers have begun to look for the answer within the Anasazi themselves. According to Lekson, two critical factors that arose after 1150—the documented unpredictability of the climate and what he calls “socialization for fear”—combined to produce long-lasting violence that tore apart the Anasazi culture. In the 11th and early 12th centuries there is little archaeological evidence of true warfare, Lekson says, but there were executions. As he puts it, “There seem to have been goon squads. Things were not going well for the leaders, and the governing structure wanted to perpetuate itself by making an example of social outcasts; the leaders executed and even cannibalized them.” This practice, perpetrated by Chaco Canyon rulers, created a society-wide paranoia, according to Lekson’s theory, thus “socializing” the Anasazi people to live in constant fear. Lekson goes on to describe a grim scenario that he believes emerged during the next few hundred years. “Entire villages go after one another,” he says, “alliance against alliance. And it persists well into the Spanish period.” As late as 1700, for instance, several Hopi villages attacked the Hopi pueblo of Awatovi, setting fire to the community, killing all the adult males, capturing and possibly slaying women and children, and cannibalizing the victims. Vivid and grisly accounts of this massacre were recently gathered from elders by Northern Arizona University professor and Hopi expert Ekkehart Malotki.
 

Until recently, because of a popular and ingrained perception that sedentary ancient cultures were peaceful, archaeologists have been reluctant to acknowledge that the Anasazi could have been violent. As University of Illinois anthropologist Lawrence Keeley argues in his 1996 book, War Before Civilization, experts have ignored evidence of warfare in preliterate or precontact societies.
 

During the last half of the 13th century, when war apparently came to the Southwest, even the defensive strategy of aggregation that was used at Sand Canyon seems to have failed. After excavating only 12 percent of the site, the Crow Canyon Center teams found the remains of eight individuals who met violent deaths—six with their skulls bashed in—and others who might have been battle victims, their skeletons left sprawling. There was no evidence of the formal burial that was the Anasazi norm—bodies arranged in a fetal position and placed in the ground with pottery, fetishes and other grave goods.
 

An even more grisly picture emerges at Castle Rock, a butte of sandstone that erupts 70 feet out of the bedrock in McElmo Canyon, some five miles southwest of Sand Canyon. I went there with Vaughn to meet Kristin Kuckelman, an archaeologist with the Crow Canyon Center who co-led a dig at the base of the butte. Here, the Anasazi crafted blocks of rooms and even built structures on the butte’s summit. Crow Canyon Center archaeologists excavated the settlement between 1990 and 1994. They detected 37 rooms, 16 kivas and nine towers, a complex that housed perhaps 75 to 150 people. Tree-ring data from roof beams indicate that the pueblo was built and occupied from 1256 to 1274—an even shorter period than Sand Canyon Pueblo existed. “When we first started digging here,” Kuckelman told me, “we didn’t expect to find evidence of violence. We did find human remains that were not formally buried, and the bones from individuals were mixed together. But it wasn’t until two or three years into our excavations that we realized something really bad happened here.”
 

Kuckelman and her colleagues also learned of an ancient legend about Castle Rock. In 1874, John Moss, a guide who had spent time among the Hopi, led a party that included photographer William Henry Jackson through McElmo Canyon. Moss related a story told to him, he said, by a Hopi elder; a journalist who accompanied the party published the tale with Jackson’s photographs in the New York Tribune. About a thousand years ago, the elder reportedly said, the pueblo was visited by savage strangers from the north. The villagers treated the interlopers kindly, but soon the newcomers “began to forage upon them, and, at last, to massacre them and devastate their farms,” said the article. In desperation, the Anasazi “built houses high upon the cliffs, where they could store food and hide away ’til the raiders left.” Yet this strategy failed. A monthlong battle culminated in carnage, until “the hollows of the rocks were filled to the brim with the mingled blood of conquerors and conquered.” The survivors fled south, never to return.
 

By 1993, Kuckelman’s crew had concluded that they were excavating the site of a major massacre. Though they dug only 5 percent of the pueblo, they identified the remains of at least 41 individuals, all of whom probably died violently. “Evidently,” Kuckelman told me, “the massacre ended the occupation of Castle Rock.”
 

More recently, the excavators at Castle Rock recognized that some of the dead had been cannibalized. They also found evidence of scalping, decapitation and “face removing”—a practice that may have turned the victim’s head into a deboned portable trophy.
 

Suspicions of Anasazi cannibalism were first raised in the late 19th century, but it wasn’t until the 1970s that a handful of physical anthropologists, including Christy Turner of Arizona State University, really pushed the argument. Turner’s 1999 book, Man Corn, documents evidence of 76 different cases of prehistoric cannibalism in the Southwest that he uncovered during more than 30 years of research. Turner developed six criteria for detecting cannibalism from bones: the breaking of long bones to get at marrow, cut marks on bones made by stone knives, the burning of bones, “anvil abrasions” resulting from placing a bone on a rock and pounding it with another rock, the pulverizing of vertebrae, and “pot polishing”—a sheen left on bones when they are boiled for a long time in a clay vessel. To strengthen his argument, Turner refuses to attribute the damage on a given set of bones to cannibalism unless all six criteria are met.
 

Predictably, Turner’s claims aroused controversy. Many of today’s Pueblo Indians were deeply offended by the allegations, as were a number of Anglo archaeologists and anthropologists who saw the assertions as exaggerated and part of a pattern of condescension toward Native Americans. Even in the face of Turner’s evidence, some experts clung to the notion that the “extreme processing” of the remains could have instead resulted from, say, the post-mortem destruction of the bodies of social outcasts, such as witches and deviants. Kurt Dongoske, an Anglo archaeologist who works for the Hopi, told me in 1994, “As far as I’m concerned, you can’t prove cannibalism until you actually find human remains in human coprolite [fossilized excrement].”
 

A few years later, University of Colorado biochemist Richard Marlar and his team did just that. At an Anasazi site in southwestern Colorado called Cowboy Wash, excavators found three pit houses—semi-subterranean dwellings—whose floors were littered with the disarticulated skeletons of seven victims. The bones seemed to bear most of Christy Turner’s hallmarks of cannibalism. The team also found coprolite in one of the pit houses. In a study published in Nature in 2000, Marlar and his colleagues reported the presence in the coprolite of a human protein called myoglobin, which occurs only in human muscle tissue. Its presence could have resulted only from the consumption of human flesh. The excavators also noted evidence of violence that went beyond what was needed to kill: one child, for instance, was smashed in the mouth so hard with a club or a stone that the teeth were broken off. As Marlar speculated to ABC News, defecation next to the dead bodies 8 to 16 hours after the act of cannibalism “may have been the final desecration of the site, or the degrading of the people who lived there.”
 

When the Castle Rock scholars submitted some of their artifacts to Marlar in 2001, his analysis detected myoglobin on the inside surfaces of two cooking vessels and one serving vessel, as well as on four hammerstones and two stone axes. Kuckelman cannot say whether the Castle Rock cannibalism was in response to starvation, but she says it was clearly related to warfare. “I feel differently about this place now than when we were working here,” a pensive Kuckelman told me at the site. “We didn’t have the whole picture then. Now I feel the full tragedy of the place.”
 

That the Anasazi may have resorted to violence and cannibalism under stress is not entirely surprising. “Studies indicate that at least a third of the world’s cultures have practiced cannibalism associated with warfare or ritual or both,” says Washington State University researcher Lipe. “Occasional incidents of ‘starvation cannibalism’ have probably occurred at some time in history in all cultures.”
 

From Colorado, I traveled south with Vaughn Hadenfeldt to the Navajo Reservation in Arizona. We spent four more days searching among remote Anasazi sites occupied until the great migration. Because hiking on the reservation requires a permit from the Navajo Nation, these areas are even less visited than the Utah canyons. Three sites we explored sat atop mesas that rose 500 to 1,000 feet, and each had just one reasonable route to the summit. Although these aeries are now within view of a highway, they seem so improbable as habitation sites (none has water) that no archaeologists investigated them until the late 1980s, when husband-and-wife team Jonathan Haas of Chicago’s Field Museum and Winifred Creamer of Northern Illinois University made extensive surveys and dated the sites by using the known ages of different styles of pottery found there.
 

Haas and Creamer advance a theory that the inhabitants of these settlements developed a unique defense strategy. As we stood atop the northernmost mesa, I could see the second mesa just southeast of us, though not the third, which was farther to the east; yet when we got on top of the third, we could see the second. In the Kayenta Valley, which surrounded us, Haas and Creamer identified ten major villages that were occupied after 1250 and linked by lines of sight. It was not difficulty of access that protected the settlements (none of the scrambles we performed here began to compare with the climbs we made in the Utah canyons), but an alliance based on visibility. If one village was under attack, it could send signals to its allies on the other mesas.
 

Now, as I sat among the tumbled ruins of the northernmost mesa, I pondered what life must have been like here during that dangerous time. Around me lay sherds of pottery in a style called Kayenta black on white, decorated in an endlessly baroque elaboration of tiny grids, squares and hatchings—evidence, once again, that the inhabitants had taken time for artistry. And no doubt the pot makers had found the view from their mesa-top home lordly, as I did. But what made the view most valuable to them was that they could see the enemy coming.
 

Archaeologists now generally agree about what they call the “push” that prompted the Anasazi to flee the Four Corners region at the end of the 13th century. It seems to have originated with environmental catastrophes, which in turn may have given birth to violence and internecine warfare after 1250. Yet hard times alone do not account for the mass abandonment—nor is it clear how resettling in another location would have solved the problem. During the past 15 years, some experts have increasingly insisted that there must also have been a “pull” drawing the Anasazi to the south and east, something so appealing that it lured them from their ancestral homeland. Several archaeologists have argued that the pull was the Kachina Cult. Kachinas are not simply the dolls sold today to tourists in Pueblo gift shops. They are a pantheon of at least 400 deities who intercede with the gods to ensure rain and fertility. Even today, Puebloan life often revolves around Kachina beliefs, which promise protection and procreation.
 

The Kachina Cult, possibly of Mesoamerican origin, may have taken hold among the relatively few Anasazi who lived in the Rio Grande and Little Colorado River areas about the time of the exodus. Evidence of the cult’s presence is found in the representations of Kachinas that appear on ancient kiva murals, pottery and rock art panels near the Rio Grande and in south-central Arizona. Such an evolution in religious thinking among the Anasazi farther south and east might have caught the attention of the farmers and hunters eking out an increasingly desperate existence in the Four Corners region. They could have learned of the cult from traders who traveled throughout the area.
 

Unfortunately, no one can be sure of the age of the Rio Grande and southern Arizona Kachina imagery. Some archaeologists, including Lipe and Lekson, argue that the Kachina Cult arose too late to have triggered the 13th-century migration. So far, they insist, there is no firm evidence of Kachina iconography anywhere in the Southwest before A.D. 1350. In any case, the cult became the spiritual center of Anasazi life soon after the great migration. And in the 14th century, the Anasazi began to aggregate in even larger groups—erecting huge pueblos, some with upwards of 2,500 rooms. Says Stephen Lekson, “You need some sort of social glue to hold together such large pueblos.”
 

The day after exploring the Kayenta Valley, Vaughn and I hiked at dawn into the labyrinth of the Tsegi Canyon system, north of the line-of-sight mesas. Two hours in, we scrambled up to a sizable ruin containing the remains of some 35 rooms. The wall behind the structures was covered with pictographs and petroglyphs of ruddy brown bighorn sheep, white lizard-men, outlines of hands (created by blowing pasty paint from the mouth against a hand held flat on the wall) and an extraordinary, artfully chiseled 40-foot-long snake.
 

One structure in the ruin was the most astonishing Anasazi creation I have ever seen. An exquisitely crafted wooden platform built into a huge flaring fissure hung in place more than 30 feet above us, impeccably preserved through the centuries. It was narrow in the rear and wide in the front, perfectly fitting the contours of the fissure. To construct it, the builders had pounded cup holes in the side walls and wedged the ax-hewn ends of massive cross-beams into them for support. These were overlaid with more beams, topped by a latticework of sticks and finally covered completely with mud. What was the platform used for? No one who has seen it has offered me a convincing explanation. As I stared up at this woodwork masterpiece, I toyed with the fancy that the Anasazi had built it “just because”: art for art’s sake.
 

The Tsegi Canyon seems to have been the last place where the Anasazi hung on as the 13th century drew to a close. The site with the wooden platform has been dated by Jeffrey Dean of the Arizona Tree-Ring Laboratory to 1273 to 1285. Dean dated nearby Betatakin and Keet Seel, two of the largest cliff dwellings ever built, to 1286—the latest dates discovered so far within the abandoned region. It would seem that all the strategies for survival failed after 1250. Just before 1300, the last of the Anasazi migrated south and east, joining their distant kin.
 

“War is a dismal study,” Lekson concludes in a landmark 2002 paper, “War in the Southwest, War in the World.” Contemplating the carnage that had destroyed Castle Rock, the fear that seemed built into the cliff dwellings in Utah, and the elaborate alliances developed in the Kayenta Valley, I would have to agree.
 

Yet my wanderings this past winter in search of 13th-century ruins had amounted to a sustained idyll. However pragmatic the ancients’ motives, terror had somehow given birth to beauty. The Anasazi produced great works of art—villages such as Mesa Verde’s Cliff Palace, hallucinatory petroglyph panels, some of the most beautiful pottery in the world—at the same time that their people were capable of cruelty and violence. Warfare and cannibalism may have been responses to the stresses that peaked in the 13th century, but the Anasazi survived. They survived not only whatever crisis struck soon after 1250, but also the assaults of the Spanish conquest in the 16th century and the Anglo-American invasion that began in the 19th. From Taos Pueblo in New Mexico to the Hopi villages in Arizona, the Pueblo people today still dance their traditional dances and still pray to their own gods. Their children speak the languages of their ancestors. The ancient culture thrives.

About David Roberts

David Roberts is a veteran mountain climber and author of 27 books, including The Mountain of My Fear and Deborah. His latest book, The Lost World of the Old Ones, which chronicles archeological discoveries in the ancient Southwest, is due out this spring.


If You Hate Ice Ages, Thank a Farmer

Idaho Museum of Natural History

University of Virginia climatologist William Ruddiman has spent a good bit of his career studying the Pleistocene cycle of ice ages that began about 2.6 million years ago. Periods of large-scale glaciation and deglaciation are governed by the Milankovitch cycle, in which shifts of the Earth's orbit and its inclination toward the sun change how much sunlight reaches the northern hemisphere to warm the surface. Based solely on these orbital cycles, global average temperatures of our current interglacial period—the Holocene—should be dropping, with the result that glaciers should now be growing in northern Canada and Siberia. That is not happening. Why?

Puzzled by these anomalies, Ruddiman hypothesized nearly two decades ago that an increase in greenhouse gases that began about 8,000 years ago was keeping the onset of a new ice age at bay. Specifically, he noted that the atmospheric concentrations of the two chief greenhouse gases, carbon dioxide (CO2) and methane (CH4), were not following the downward trends observed at similar stages in previous interglacial periods. He pointed out that the ice core data showed no case during past ice ages in which carbon dioxide concentrations rose after peaking at the point of maximum deglaciation.

Based on the trajectory of earlier ice ages, Ruddiman calculated that atmospheric carbon dioxide levels should have fallen from their post-deglaciation peak of around 268 parts per million (ppm) to around 240 ppm by 1800. Instead, pre-industrial carbon dioxide concentrations were actually around 285 ppm. He also identified a similar anomalous upward trend in atmospheric methane. What was the cause of these higher-than-normal concentrations of greenhouse gases?
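The size of that carbon dioxide gap can be made concrete with a quick back-of-the-envelope calculation. The sketch below simply uses the rounded figures quoted in this article; it is an illustration, not part of Ruddiman's own analysis.

```python
# Back-of-the-envelope sketch of the CO2 gap described above.
# All values are the approximate figures quoted in this article.
expected_co2_ppm = 240   # where CO2 "should" have been by 1800, judging from earlier interglacials
observed_co2_ppm = 285   # approximate pre-industrial concentration actually recorded
anomaly_ppm = observed_co2_ppm - expected_co2_ppm
print(f"Pre-industrial CO2 ran roughly {anomaly_ppm} ppm above the expected trajectory")
# prints: Pre-industrial CO2 ran roughly 45 ppm above the expected trajectory
```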

Farmers: carbon-dioxide-sequestering forests were chopped down to grow crops, while the rotting of vegetation in rice paddies boosted global methane concentrations.

In the new Scientific Reports study, Ruddiman and his colleagues use a climate model to compare our own Holocene era with the interglacial period of about 777,000 years ago, whose orbital characteristics most closely match those of the Holocene. They find that without the increase in greenhouse gases caused by farming, global average temperatures around 1850 would likely have been about 1.3 degrees Celsius (2.3 degrees Fahrenheit) lower than they actually were. Arctic temperatures would have been 5 to 6 degrees Celsius (9 to 11 degrees Fahrenheit) colder than they were at that time.

Instead of falling, atmospheric concentrations of carbon dioxide are now at around 405 ppm, and those of methane at more than 1,800 parts per billion. Assuming Ruddiman's and his colleagues' calculations are right, the 0.8 degree Celsius increase in global average temperatures since the 19th century suggests that the Earth is now about 2.1 degrees warmer than it would otherwise have been.
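The "about 2.1 degrees" figure is simply the sum of the two numbers cited above; here is a minimal sketch of that arithmetic, again assuming the article's rounded values rather than the study's own output.

```python
# How the "about 2.1 degrees warmer" figure follows from the two numbers above,
# using the rounded values quoted in this article.
avoided_cooling_c = 1.3      # modeled cooling by ~1850 that early farming is said to have prevented
industrial_warming_c = 0.8   # observed warming since the 19th century
total_offset_c = avoided_cooling_c + industrial_warming_c
print(f"Earth is roughly {total_offset_c:.1f} C warmer than it would otherwise have been")
# prints: Earth is roughly 2.1 C warmer than it would otherwise have been
```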

"There is pretty good agreement in the community of climate scientists that we have stopped the next glaciation for the long, foreseeable future, because even if we stopped putting carbon dioxide into the atmosphere, what we have now would linger," says Ruddiman at Science Daily. "The phenomenal fact is, we have maybe stopped the major cycle of Earth's climate and we are stuck in a warmer and warmer and warmer interglacial."

While it remains to be seen how well future generations will be able to adapt to the warmer world that farmers and factories have bequeathed them, the onset of a new ice age would be disastrous for humanity.



Tens of thousands of years ago, during the Ice Age, a new creature appeared on Earth: the dog. How did this happen? And how has the relationship between humans and dogs changed over the years? Two fascinating articles tell an incredible story that connects to science, history—and of course, lots of adorable doggies.

As you read these articles, look for how dogs, and their relationships with humans, have changed over time.

How the Wolf Became the Dog

Life was tough for humans during the Ice Age. A new kind of friend made things better.

Be happy you didn’t live on Earth 35,000 years ago.

That was a time known as the Ice Age. Large sheets of ice covered much of Europe, Asia, and the Americas. There were no nations yet, no cities or towns. For many of our early human ancestors, life was a daily struggle for survival. They lived in caves or huts made of animal bones. They hunted reindeer with sharpened stones and sticks. Danger lurked everywhere—diseases with no cures, saber-toothed tigers with 11-inch fangs, elephant-like mastodons with swordlike tusks.

But it was during this harsh time that something beautiful was born: the friendship between humans and dogs.

Be glad you didn’t live on Earth 35,000 years ago.

That was a time known as the Ice Age. Large sheets of ice covered much of Europe, Asia, and the Americas. There were no nations yet, no cities or towns. Many of our early human ancestors struggled to survive. They lived in caves or huts made of animal bones. They hunted reindeer with sharpened stones and sticks. Danger was everywhere. There were diseases with no cures. There were saber-toothed tigers with 11-inch fangs. There were elephant-like mastodons with long, sharp tusks.

But during this harsh time, something beautiful was born: the friendship between humans and dogs.

Granger, NYC/The Granger Collection

Beloved ancient Egyptian hunting dogs were often turned into mummies.

Dogs have been guarding us, working with us, and snuggling with us for thousands of years. But scientists are only now starting to understand the long history of dogs. There are many mysteries. One thing is certain though: Every dog has the same ancestor, the gray wolf.

This does not mean that a fierce wolf suddenly and magically morphed into a yapping Chihuahua with a pink bow. The change occurred gradually, over thousands of years. Scientists speculate that the first dog appeared between 15,000 and 38,000 years ago.

At that time, many animals—including the wolf—posed a threat to humans. But at some point, a group of humans and a group of wolves teamed up. How did this happen?

One theory: A few wolves crept into human campsites, lured by tasty food scraps. These wolves were less aggressive than other wolves. But they still helped protect humans from dangerous predators. And so humans let these wolves stick around. The gentler wolves, their bellies full of human food, lived longer than other wolves. They gave birth to even gentler babies, which grew up to have gentle babies of their own. On and on this went, until a new, calmer breed of wolf emerged.

Dogs have been living with humans for thousands of years. But scientists are only now starting to understand the history of dogs. There are many mysteries. But one thing is certain: All dogs have the same ancestor, the gray wolf.

This does not mean that a fierce wolf suddenly morphed into a yapping Chihuahua with a pink bow. The change happened slowly. It took thousands of years. Experts speculate that the first dog appeared between 15,000 and 38,000 years ago.

At that time, many animals posed a threat to humans. Wolves were among them. But at some point, a group of humans and a group of wolves teamed up. How did this happen?

One theory: A few wolves crept into human campsites to eat food scraps. They were less aggressive than other wolves. But they still helped protect humans from other animals. And so the humans let them stay. The gentler wolves ate human food. This helped them live longer than other wolves. They gave birth to even gentler babies, which grew up and had gentle babies too. After a while, there was a new, calmer breed of wolf.

Bettmann Archive/Getty Images

Sergeant Stubby was the most famous dog soldier of World War I.

As the centuries passed, the wolves living near humans continued to change. Their bodies got smaller, their ears floppier. They became friendlier and more eager to please humans. Soon, a new kind of creature had developed: the dog.

Dogs were the first domesticated animals—that is, animals bred and raised to live among us. Today, there are many kinds of domesticated animals—cows that give us milk, chickens that lay eggs, horses that we ride, and sheep that provide wool. But dogs were the first.

Eventually, humans put dogs to work in new ways. Dogs became trained hunters, fighters, and animal herders. Roman warriors marched into battle alongside enormous war dogs. In ancient Egypt, some hunting dogs were so prized that they were turned into mummies and buried with their owners.

Dogs helped in less ferocious ways too. Before people used forks, spoons, and napkins, they’d wipe their greasy hands on dogs that sat by their tables. On icy winter nights, people used dogs as foot warmers. Some European kings wouldn’t take a bite of food until their dog had tasted it first. Only then could they be sure the food hadn’t been poisoned.

Centuries went by. The wolves living near humans continued to change. They got smaller. Their ears got floppier. They became friendlier and more eager to please humans. Over time, a new kind of creature developed: the dog.

Dogs were the first domesticated animals—that is, animals bred and raised to live among us. Today, there are many kinds of domesticated animals. There are cows that give us milk, chickens that lay eggs, horses that we ride, and sheep that provide wool. But dogs were the first.

Humans began putting dogs to work in new ways. They trained dogs to hunt, fight, and herd animals. Roman warriors marched into battle alongside huge war dogs. In ancient Egypt, favorite hunting dogs were turned into mummies and buried with their owners.

Dogs helped in other ways too. Before people used forks, spoons, and napkins, they’d wipe their greasy hands on dogs. On cold nights, people used dogs as foot warmers. In Europe, some kings wouldn’t eat their food until their dog had tasted it first. That way, they could tell if the food had been poisoned.

Bettmann Archive/Getty Images

Balto became a hero for delivering medicine to sick children in Alaska.

In the Americas, dogs have been working alongside humans for thousands of years. Native peoples used dogs as guards and hunting companions. George Washington plotted Revolutionary War battles with his hunting dog Sweetlips by his side. In the early 1800s, explorers Lewis and Clark journeyed across America’s western wilderness with a big black dog named Seaman.

As the centuries have passed, the bond between dogs and people has gotten stronger and stronger. And it all began tens of thousands of years ago, with a family of wolves howling across a dangerous, frozen land.

In the Americas, dogs have been helping humans for many years. Native peoples used dogs as guards and hunting companions. George Washington planned Revolutionary War battles with his hunting dog Sweetlips by his side. In the early 1800s, explorers Lewis and Clark crossed America’s western wilderness with a big black dog named Seaman.

Over time, the bond between dogs and people has grown very strong. And it all began thousands of years ago, with a family of wolves howling across a dangerous, frozen land.

How America Went DOG Crazy

Today, dogs are more than pets. They’re members of the family.

Scout, a little brown dog, seems to be going crazy. He bounces up and down like a furry ball. His tiny pink tongue flaps from his mouth as he licks everyone in sight.

“He’s just excited,” sighs 12-year-old Ruby. “He’s always excited.”

Since Scout’s arrival in Ruby’s home two years ago, the dog has been an endless source of ear-splitting yaps, slobbery licks, smelly indoor puddles, and brown stains on the rug.

Nobody in Ruby’s family ever imagined that they would own such a spoiled, badly behaved little beast. Nor did the family imagine that they could love an animal as much as they love Scout.

“He’s so annoying,” Ruby moans. But then she snatches up the little dog and kisses his slimy black nose.

You can practically see Ruby’s heart melting with love.

Scout, a little brown dog, seems to be going crazy. He bounces up and down like a furry ball. His tongue flaps from his mouth as he licks everyone in sight.

“He’s just excited,” sighs 12-year-old Ruby. “He’s always excited.”

Scout lives with Ruby’s family. He yaps loudly. He slobbers. He leaves puddles on the floor. He stains the rug.

No one in Ruby’s home ever imagined that they would own such a spoiled, badly behaved little beast. Nor did they imagine that they could love an animal as much as they love Scout.

“He’s so annoying,” Ruby moans. But then she grabs Scout and kisses him.

You can almost see Ruby’s heart melting with love.

Granger, NYC/The Granger Collection

President Franklin D. Roosevelt was rarely seen without his terrier, Fala.

Today, nearly 50 percent of American families own at least one dog. Americans spend tens of billions of dollars on their dogs each year—on everything from veterinarian visits and grooming to gourmet treats and high-tech gadgets like doggy treadmills. A 2015 poll found that 38 percent of U.S. dog owners cook special meals for their dogs. It’s not surprising that 96 percent of owners consider their dogs to be members of the family.

Dogs have been by the sides of humans for tens of thousands of years. But until recently, dogs were mainly valued for the work they could do. They could chase foxes away from chicken coops and clear restaurant kitchens of rats. They could hunt for ducks and pull sleds over snowy hills. When fires broke out in cities, firehouse dogs cleared the way for fire wagons pulled by horses.

These hard-working dogs were too dirty and smelly to be allowed indoors. Dogs that became sick or injured either healed on their own or died; most veterinarians provided care only for valuable animals, like horses and cows.

Today, nearly half of all American families own a dog. We spend tens of billions of dollars on our dogs each year. There are vet visits, grooming, gourmet treats, and more. A 2015 poll found that 38 percent of U.S. dog owners cook special meals for their dogs. It’s no surprise that 96 percent of owners think of their dogs as family members.

Dogs have been by the sides of humans for thousands of years. But until recently, dogs were mainly valued for the work they could do. They chased foxes away from chicken coops. They cleared restaurant kitchens of rats. They hunted for ducks. They pulled sleds over snow. When fires broke out in cities, firehouse dogs cleared the way for fire wagons pulled by horses.

These hard-working dogs were too dirty and smelly to live indoors. If they got sick or hurt, they healed on their own or they died. Most vets treated only animals that were seen as valuable at that time, like horses and cows.

Gabi Rona/CBS Photo Archive/Getty Images

In the ‘50s, the show Lassie helped turn dogs into all-American pets.

But by the late 1800s, that was starting to change. America was becoming wealthier. More people could afford to feed and care for a pet. New and powerful soaps scrubbed dogs clean and killed fleas. Companies started selling dog food, which made feeding a dog more convenient. Veterinarians opened offices just for treating dogs and other pets. In the 1950s, some of the most popular TV shows, like Lassie and The Adventures of Rin Tin Tin, helped turn dogs into all-American pets.

Of course, Americans have embraced other pets too. For instance, there are more cats in American homes than dogs. But humans have a uniquely powerful relationship with dogs, one that scientists are just beginning to figure out.

But by the late 1800s, that was changing. America was becoming wealthier. More people could afford to feed and care for a pet. New and powerful soaps scrubbed dogs clean and killed fleas. Companies started selling dog food, which made feeding a dog simpler. Vets opened offices just for treating pets. In the 1950s, TV shows like Lassie and The Adventures of Rin Tin Tin helped turn dogs into popular pets.

Americans love other pets too. There are more cats than dogs in American homes. But humans have a special connection with dogs. Scientists are just starting to figure out this connection.

Studies show that dogs really do improve our lives. Walking a dog several times a day improves the health of elderly people. Dogs can help kids with autism and other challenges cope with stress.

New research is helping to show the scientific basis for our connection to dogs. In 2015, Japanese researchers found that when humans and dogs gaze into each other’s eyes, something happens inside both species’ bodies. Both the human’s and the dog’s brains release a chemical that makes them feel close. This is the same chemical that helps mothers feel close to their babies.

Another study showed that when humans point to something, dogs look where we’re pointing. This shows that dogs try to understand us. Not even our closest animal relative—the chimpanzee—does that naturally.

Today, dogs help humans in many incredible ways. They lead people who can’t see. They find people who are lost. They comfort wounded soldiers.

But most dogs are like Scout, with just one main job: loving us. And for most of us, that’s enough.

Studies show that dogs make our lives better. Dog owners tend to get more exercise; those daily walks make them healthier. Dogs can help kids with autism and other challenges cope with stress.

New research is helping to uncover the scientific reason for our connection to dogs. In 2015, Japanese researchers found that when humans and dogs look into each other’s eyes, something happens inside their bodies. Both the human’s and the dog’s brains release a chemical that makes them feel close. It’s the same chemical that helps mothers feel close to their babies.

Another study showed that when humans point to something, dogs look where we’re pointing. This shows that dogs try to understand us. Not even our closest animal relative, the chimpanzee, does that naturally.

Today, dogs help humans in many ways. They lead people who can’t see. They find people who are lost. They comfort wounded soldiers.

But most dogs, like Scout, have just one main job: to love us. And for most of us, that’s enough.

  • Imagine you could transform yourself into either a wolf or a dog. Which would you be? Write a paragraph explaining your choice and what time period you would like to live in. Find details in the articles, and use your imagination, to describe what your life would be like, what your daily activities might be, and what your relationship with humans would be like.
  • Watch the video "Into the World of Military Working Dogs." As you watch, make a list of all the ways dogs help soldiers. Then use your list to write a thank-you note to a military working dog for being such an important helper.

Synthesizing, vocabulary, text evidence, main idea, key details, tone, compare and contrast, cause and effect, text structure, explanatory writing

“How the Wolf Became the Dog” explains where dogs came from and the history of their relationship with humans. “How America Went DOG Crazy” is about how dogs became popular and beloved pets in the United States.

The first text is mainly chronological. Both texts include cause-and-effect and compare-and-contrast structures.

The articles include challenging academic and domain-specific vocabulary (e.g. ancestors, domesticated, morphed, predators), as well as figurative language like similes and rhetorical questions.

Some knowledge of dog characteristics and behavior will aid in comprehension. The articles also include historical references (George Washington, Lewis and Clark) and mention of old TV shows.

860L (on level), 650L (lower level)

Preview Text Features and Vocabulary (20 minutes)

  • Have students look at the photos and captions in both articles. Ask: What difference do you notice between the dogs featured in the first article and those in the second? (The dogs in the first article have important jobs: hunting, fighting, delivering medicine. The ones in the second article seem to be adored pets.)
  • Distribute the vocabulary activity to introduce challenging terms in the text. Highlighted terms: ancestors, mastodons, morphed, speculate, aggressive, domesticated
  • Call on a student to read aloud the Up Close box on page 16 for the class.

Read and Unpack the Text (45 minutes)

Read the articles as a class. Then put students in groups to answer the close-reading questions.

Discuss the critical-thinking question as a class.

“How the Wolf Became the Dog”

Close-Reading Questions

In the first section, the authors write that “life was a daily struggle for survival” during the Ice Age. What evidence do they give to support this statement? (text evidence) The authors explain that many early humans lived in shelters made of animal bones, hunted using simple tools, suffered from diseases with no cures, and faced threats from fierce animals like saber-toothed tigers.

According to “From Wolf to Dog,” what do scientists know for sure about the history of dogs? (main idea) Scientists know that all dogs have the same animal ancestor, the gray wolf, and that it took thousands of years for wolves to turn into the creatures we know as dogs.

What is one theory about how humans and wolves first teamed up? How did this help both species? (key details) One theory is that a group of less aggressive wolves began sneaking into human campsites to eat food scraps. This helped keep the humans safe from other dangerous predators, and helped the wolves live longer than most other wolves.

Based on “Hunters, Napkins,” what is a domesticated animal? What details in this section help you understand what makes dogs domesticated animals? (vocabulary/key details) A domesticated animal is one that has developed to live among humans, often to serve a useful purpose. The section shows that dogs are domesticated by noting that they are “eager to please humans” and that humans have used them to perform jobs like hunting, herding, and even foot-warming.

“How America Went DOG Crazy”

Close-Reading Questions

In the first section, what is the authors’ tone, or attitude, toward Scout? Why do you think they describe Scout in this way? (tone) The authors’ tone is annoyed and disapproving; they describe Scout as “a spoiled, badly behaved little beast.” This description shows that his owners’ love for him is strong enough to make up for the annoyance.

Reread the section “Too Dirty and Smelly.” How is the way dogs are treated today different from the way they were treated in the past? (compare and contrast) Today, dogs are treated as important members of the family; they’re pampered with treats and rushed to the veterinarian when they’re sick. But in the past, dogs were seen simply as workers. They were kept outside and not considered valuable enough to be taken for medical care.

Based on “From Workers to Pets,” how was America changing in the late 1800s? How did this affect our relationship with dogs? (cause and effect) In the late 1800s, America was becoming wealthier. More people could afford to feed and care for dogs, so dogs became more popular as pets.

Why might the authors have included the section “A Surprising Discovery”? (text structure) The authors likely included this section to help explain one of the article’s main ideas—that humans and dogs have “a uniquely powerful relationship.” Understanding the scientific basis for this relationship helps readers see why dogs are such popular pets.

Critical-Thinking Question

What is the biggest difference between why people own dogs today and why people owned dogs in the past? Use details from both articles in your answer. (synthesizing) Today, most people keep dogs as companions; 96 percent of owners even consider their pet dogs to be members of the family. But in the past, people kept dogs mainly to perform jobs like hunting, herding, and fighting.


Oldest DNA Sequenced Yet Comes From Million-Year-Old Mammoths

Woolly mammoths were icons of the Ice Age. From about 700,000 years ago to just 4,000 years ago, they trundled across the chilly steppe of Eurasia and North America. As ancient glaciers expanded across the Northern Hemisphere, these beasts survived the rapidly cooling temperatures thanks to cold-resistant traits, characteristics they did not evolve on their own, as earlier thought. Woolly mammoths, a new Nature study finds, inherited the traits that made them so successful from a mammoth species closer to a million years old.

The clues come from some incredibly old DNA extracted from a trio of molars uncovered in northeastern Siberia. The oldest is nicknamed the Krestovka mammoth, dated to about 1.2 million years ago. The other two molars are nicknamed the Adycha and Chukochya mammoths, dated to 1 million and 500,000 to 800,000 years old, respectively. The fact that the researchers were able to extract and analyze the DNA from these fossils at all is a landmark. Up until now, the oldest look at ancient genes came from an Ice Age horse that lived over 560,000 years ago. The new mammoth samples double that, taking the title for the oldest DNA yet recovered from fossil remains. “We had to deal with DNA that was significantly more degraded compared to the horse,” says Swedish Museum of Natural History paleogeneticist Love Dalén, an author of the new study.

Understanding such ancient genetic material is a challenge because DNA begins to decay at death. Ancient DNA samples can sometimes become contaminated by modern sources. While preserved snippets of the ancient horse’s DNA were about 78 base pairs long, the fragments of mammoth DNA were about 42-49 base pairs long. Dalén says it can sometimes be difficult to tell which short snippets are from the mammoth and which should be disregarded as modern contamination from bacteria or people. The researchers compared the DNA results of the three teeth to elephants and humans, and discarded any data that seemed like it could have come from humans.

The emerging picture painted by the ancient DNA is different from what researchers expected. “It is indeed a fascinating paper,” says American Museum of Natural History paleontologist Ross MacPhee, who was not involved with the new study, both for setting a new landmark for ancient DNA but also for finding evidence that at least one mammoth species originated as a hybrid.

The story began over a million years ago in Eurasia, home to a large species that preceded the woolly mammoth: the steppe mammoth, Mammuthus trogontherii. These mammoths aren’t as well-known as the woolies, and most of what’s been uncovered about them comes from bones alone rather than carcasses with tatters of soft tissue. No one knew whether these beasts were adapted to the cold or not; the supposition was that the steppe mammoths thrived during warmer interglacial periods and that woolly mammoths evolved from them when the ice expanded its hold on the planet.

Yet the researchers found that the older, million-year-old mammoths had genes for shaggy coats and some other physiological adaptations for life in cold habitats, meaning that the woolies inherited many of their characteristic features. The molar referred to as the Adycha mammoth, at about a million years old and resembling that of a steppe mammoth, contains the genetic markers for these traits even though the mammoth lived hundreds of thousands of years before the woolies. What this finding hints, Dalén says, is that many of the critical traits that allowed mammoths to populate cold regions arose much earlier, perhaps during the evolution of the steppe mammoth from its hypothesized ancestor around 1.7 million years ago.

In their genetic analysis, Dalén and colleagues also examined how the three ancient mammoths related to other known specimens and species. The Krestovka mammoth, at about 1.2 million years old, came out as a unique lineage of mammoth that didn’t fit into any previously known species. And this newly discovered mammoth lineage had an important role to play. The researchers hypothesize that Mammuthus columbi, a huge species that roamed North America from 1.5 million to about 10,500 years ago, originated as a hybrid between the ancestors of the woolly mammoth and the genetic lineage of the Krestovka mammoth. “That certainly came as a complete surprise to us,” Dalén says.

The molar of the Chukochya mammoth, one of the three samples used in the new study, was dated to over 500,000 years old. (Love Dalén)

That Mammuthus columbi originated as a new species, born of a hybridization event, “has major implications for our understanding of the population structure of Pleistocene megabeasts,” MacPhee says. The ancestors of the woolly mammoth and the Krestovka mammoth had diverged from each other for about a million years before a population produced a hybrid that was different from both, giving rise to Mammuthus columbi. More than that, MacPhee notes, “it suggests that mammoths in the Old and New Worlds acted as a hugely distributed metapopulation,” with populations able to interbreed with each other despite looking different from each other.

The study is hardly the final word on the mammoth family tree, of course. Paleogeneticists and paleontologists are just beginning to understand how all these mammoths are related. In North America, for example, some fossils were labeled by 20th-century paleontologist Henry Fairfield Osborn as Jefferson’s mammoth, and sometimes these fossils are categorized as a unique species. The suspicion among experts is that these mammoths are hybrids between woolly mammoths and Mammuthus columbi, an idea that can be tested against the genetic evidence. North American mammoths dated to about 126,000 to 770,000 years ago, Dalén says, might hold additional genetic clues about how mammoth species hybridized with each other to give rise to new forms of mammoth through time.

Ancient genes are revealing that the Ice Age world was very different from our own. Megafauna thrived across the world’s continents, and those animals may have had genetic connections to each other that extinction has obscured. “We don’t think of megabeast species being able to maintain multicontinental ranges these days, but that must at least be partly due to the fact that humans have disrupted their ranges, population structure and mating opportunities for millennia,” MacPhee says.

About Riley Black

Riley Black is a freelance science writer specializing in evolution, paleontology and natural history who blogs regularly for Scientific American.


Do the following statements agree with the information given in Reading Passage?

In boxes 9-14 on your answer sheet write

TRUE if the statement agrees with the information

FALSE if the statement contradicts the information

NOT GIVEN if there is no information on this

9 Some megafauna have been eliminated by humans in the past 100 years.
Answer: FALSE

10 Agriculture is considered a primary cause of global warming today.
Answer: FALSE

11 Ruddiman's idea caused a great deal of argument among scientists.
Answer: TRUE

12 New scientific evidence proves for certain that Ruddiman's theory is correct.
Answer: FALSE

13 The 20th century has seen the greatest ever increase in global temperatures.
Answer: NOT GIVEN

14 Changes in the Earth's orbit can affect global temperatures.
Answer: TRUE

