What To Think About the Anthropocene Debate - from an Insider
For six years, I was a member of the Working Group on the Anthropocene, a panel weighing evidence humans are shaping a new epoch of Earth history. The group's work is almost done, but debate rages on.
From 2010 through 2016, I was a member of the Working Group on the ‘Anthropocene’. (The quotation marks are part of the official name for now, as I’ll explain.)
This is the panel tasked in 2009 by the world’s main geological society with weighing evidence that humans have so jolted Earth’s operating systems that we have left the Holocene (the span of geological time since the last ice age ended 11,700 years ago) and entered a geological epoch of our own making, named for the force creating it, duration as yet unknown.
I was the lone journalist among several dozen stratigraphers, other scientists and one lawyer focused on environmental treaties. The story of how I came to be on this panel is below.
I was involved in key assessments and 2016 votes confirming the legitimacy of the concept. The working group took formal steps in 2019 to reinforce this conclusion and set the starting point of this new chapter in Earth history around 1950.
The group has just reached another milestone, but Anthropocene debates will flow on for a long time to come - and are about far more than rock layers.
And the technical work is far from over. Raymond Zhong of The New York Times has just written a fine update on the long-running scientific effort. On Saturday, he reports, the current membership of the group wrapped up a vote on which spot on Earth should be the site for the eventual Anthropocene “golden spike” - the physical marker the international geological community deploys to denote a rock or sediment layer recording each of the big junctures on the Geologic Time Scale.
The results of the vote on that site and a sequence of other recent and pending votes won’t be made public for some months, Zhong writes. And he lays out the daunting path ahead for scientists aiming to formally enshrine the Anthropocene in the Geologic Time Scale.
I encourage you to read his article and come back here for some uniquely intimate context I can offer.
And I’m eager to field your questions, so comment away!
My advice? Don’t hold your breath waiting to see if this proposal, first made by scientists in 2000, survives a sequence of three votes of the wider geological community.
As Zhong reports, deep divisions persist among geologists over whether such a designation is justified under the rules of stratigraphy, and also over concerns the Anthropocene is more a political than scientific designation.
On the technical question, my main concern is about the challenge in trying to define an Earth epoch just as the starting gun is going off. Here’s how I posed this question in a Skype interview in 2011 with Jan Zalasiewicz, a geologist at the University of Leicester who has led the working group since the beginning:
“Most geology is retrospective…. How can you know something has begun and will be significant when we don’t know how long it will last?”
“That’s one of the many novelties of the Anthropocene geologically. We don’t know how the Anthropocene will pan out. What we do know is that sufficient change has already taken place to say that the course of Earth history has changed and the strata have changed. How much they will change will depend on numerous feedbacks, not the least the kinds that we ourselves as humans will introduce in years and decades and centuries to come.
“I suspect it’s a little bit like standing on the Earth 65 million years ago, a few years after the impact of the meteorite on the Yucatan peninsula in Mexico and seeing that the world has changed but not yet being able to see how it is going to evolve from then on. I think we are somewhere in that kind of position.”
Listen to the full conversation here:
[Insert, Jan 13, 2023 - I forgot to include an important series of papers (explore here: bit.ly/anthropoceneevent) by a batch of scientists spanning an array of disciplines proposing the Anthropocene should be an event, not a strictly defined epoch. The abstract of one paper, published in 2022 in the Journal of Quaternary Science, nicely frames the argument:
Over the course of the last decade the concept of the Anthropocene has become widely established within and beyond the geoscientific literature but its boundaries remain undefined. Formal definition of the Anthropocene as a chronostratigraphical series and geochronological epoch following the Holocene, at a fixed horizon and with a precise global start date, has been proposed, but fails to account for the diachronic nature of human impacts on global environmental systems during the late Quaternary. By contrast, defining the Anthropocene as an ongoing geological event more closely reflects the reality of both historical and ongoing human–environment interactions, encapsulating spatial and temporal heterogeneity, as well as diverse social and environmental processes that characterize anthropogenic global changes. Thus, an Anthropocene Event incorporates a substantially wider range of anthropogenic environmental and cultural effects, while at the same time applying more readily in different academic contexts than would be the case with a rigidly defined Anthropocene Series/Epoch.
This framing resonates with my thinking, as you’ll see below.]
My [a/A]nthropocene journey
For a deeper dive, read the 2016 essay I wrote for the inaugural issue of, yes, Anthropocene Magazine, reposted here. As you’ll see, along with the strictly scientific question, the Anthropocene proposal has spawned an invaluable wider debate about what to call this moment on Earth and, more importantly, what to do about it.
My reporting career has taken me from smoldering, fresh-cut roadsides in the Amazon rain forest to the thinning sea ice around the North Pole, from the White House and Vatican to Nairobi’s vast, still-unlit slums. Throughout most of it, I thought I was writing about environmental and social problems and solutions.
Lately I’ve come to realize that my lifelong beat, in essence, has been one species’ growing pains. After tens of thousands of years of scrabbling by, spreading around the planet, and developing tools of increasing sophistication, humans are in surge mode and have only just started to become aware that something profound is going on. The upside has been astounding. Child and maternal mortality rates have plunged. Access to education has soared. Deep poverty is in sharp retreat. Despite the 24/7 distilled drama online and on TV, violence on scales from war to homicide has been in a long decline.
It’s been only a few decades since science began building a picture of the back story to this spectacular ascent. It’s a story about how humans became such a potent environmental influence that a signature of our doings, for good or ill, will be measurable in layered rock for millions of years to come. By altering climate, landscapes, and seascapes as well as flows of species, genes, energy, and materials, we are sealing the fates of myriad other species. And, without a big shift from business-as-usual, we will undermine our own long-term welfare as well.
In 2000, after a century of earlier efforts by scholars, scientists, and at least one journalist (me) to give a name to humanity’s emerging role as a planet-scale force, one word emerged in a heated moment at a global change conference in Cuernavaca, Mexico — anthropocene.
It appears to be here for the long haul. After 16 years of percolation and debate, anthropocene has become the closest thing there is to common shorthand for this turbulent, momentous, unpredictable, hopeless, hopeful time — duration and scope still unknown.
The word is still so novel that no one has even settled on how to pronounce it; the British stress the second syllable and Americans the first. That seems appropriate, given that reactions to the emergence of the term — let alone the actual environmental changes it aims to describe — have come in all colors and flavors. There’s even been a spirited push for alternatives, some rather biting.
I imagine you’ve heard some of the competing words that have bubbled up. We’re actually in the greed-driven Capitalocene, the trash-choked Plasticene, the combustible Pyrocene, the self-loathing Misanthropocene, the testosterone-dominated Manthropocene — even the Obscene. [I left out a few, including the Homogenocene favored by Charles Mann, Kierán Suckling and others (we’re homogenizing global biology), and the Necrocene of Justin McBrien (we’re killing off a lot of things).]
There’s some merit as well as weakness in every label, including the word that sparked it all.
What name seems most suitable to you? Leave a comment!
The anthropocene (both the word and the unfolding age) has so much Rorschach-like plasticity that all I can offer as guidance are my informed but subjective reflections based on what I’ve learned and unlearned in my long, quirky journey. I’d argue that what matters most is not resolving some common meaning so much as engaging in deeply felt discussions, fresh lines of inquiry, and new proposals for sustaining the human journey — all of which have been sparked by the emergence of this concept.
The Anthropocene origin story
To navigate this terrain, it’s best to start with the foundational anthropocene idea, as blurted out in February 2000 during a scientific meeting on human-caused global change. A prominent participant was Paul J. Crutzen, who’d won a Nobel Prize for helping identify the threat certain synthetic chemicals posed to the planet’s protective ozone layer. At the meeting, his frustration grew as peers described momentous shifts in Earth’s operating systems, but always anchored them in time by mentioning the Holocene. Holocene is the formal name for the “wholly recent” epoch of planetary history that began at the end of the last ice age 11,700 years ago.
At one point, Crutzen couldn’t hold back. He interrupted a colleague, as the scientist Will Steffen later described: “Stop using the word Holocene. We’re not in the Holocene any more. We’re in the … the … the … (searching for the right word) … the Anthropocene!”
In his 2014 book “The Anthropocene,” Christian Schwägerl describes how the room fell silent at first, and then the word became the center of conversation. “The scientists in that conference room in Mexico were profoundly shaken,” Schwägerl wrote. “[O]ne of the most frequently cited natural scientists in the world … was not only describing the past with this new term (something to which geologists are accustomed), but he was also redefining and connecting to the future … a new Earth sculpted by humans.”
Shortly after that meeting, Crutzen learned that Eugene F. Stoermer, an admired analyst of tiny lakebed diatom fossils, had used the word in the 1980s. The two scientists collaborated on an essay for a newsletter for Earth systems scientists. They laid out a scientific rationale for the term and explained why, even though there was no tradition of naming geological spans for their causative elements, in this case it was justified:
“Considering these … major and still growing impacts of human activities on Earth and atmosphere, and at all, including global, scales, it seems to us more than appropriate to emphasize the central role of mankind in geology and ecology by proposing to use the term ‘anthropocene’ for the current geological epoch.”
Crutzen and several collaborators refined the concept in subsequent papers. The term quickly spread, propelled in a dizzying array of directions as if filling a linguistic vacuum. It began popping up in peer-reviewed literature in a variety of disciplines and eventually spawned at least three scientific journals (and one magazine) using “Anthropocene” in their titles.
It’s not hard to see why reverberations, pro and con, built so quickly. It was an audacious notion to recommend that a human age deserved to join the Paleocene, Eocene, Oligocene, Miocene, Pliocene, Pleistocene, and Holocene as the epochs of geological history comprising the Age of Mammals. This stretch of time, more formally called the Cenozoic Era, began 65 million years ago, after the mass extinction that ended the dinosaurs’ age and enabled ours. And it could continue for a very long time — if the most powerful mammal, Homo sapiens, demonstrates it can turn the sapience in its name into a sustainable journey.
The proposal of an Anthropocene epoch was particularly audacious because it came from a chemist and an ecologist, not a stratigrapher. Stratigraphy is the discipline within geology that develops and maintains the official Geologic Time Scale and International Chronostratigraphic Chart.
In 2008, a group of stratigraphers and other earth scientists, led by Jan Zalasiewicz of the University of Leicester, published the first careful assessment of the intriguing Crutzen-Stoermer hypothesis. Indeed, they found a concrete and durable human signature — literally. Tens of billions of tons of concrete are part of that signature, along with vast amounts of smelted aluminum and more exotic alloys, distinctive spherical particles of fly ash from power plants, bomb radioisotopes, 6 billion tons (and counting) of plastic, and so much more. In that paper, Zalasiewicz and his coauthors concluded that there appeared to be “sufficient evidence” for an Anthropocene epoch to be considered for formalization by the international geological community.
But a long road lay ahead. The following year, Zalasiewicz and some colleagues began assembling a working group on the “Anthropocene” at the invitation of one of the 16 subcommissions of the International Commission on Stratigraphy. Those quotation marks around “Anthropocene” in the group’s name won’t disappear until some final judgment on the validity of a new epoch is reached.
In 2010 I was invited to join the working group, largely because of a quirky role I had played in the evolution of this anthropocene idea in 1992, when I essentially predicted Crutzen’s Mexico moment and what has unfolded since. Since 1985, I’d been writing articles about human impacts on the climate system. In 1991, I finally got a chance to synthesize what I’d been learning, in a short book that would accompany the first major museum exhibition on global warming, at the American Museum of Natural History. Closing out a chapter on the growing human impact on Earth, I typed an almost offhand proposal that we’d jolted the planet out of the Holocene:
“Perhaps earth scientists of the future will name this new post-Holocene era for its causative element — for us. We are entering an age that might someday be referred to as, say, the Anthrocene. After all, it is a geological age of our own making. The challenge now is to find a way to act that will make geologists of the future look upon this age as a remarkable time, a time in which a species began to take into account the long-term impact of its actions. The alternative will be to leave a legacy of irresponsibility and neglect that will manifest itself in the fossil record as just one more mass extinction — like the record of bones and empty footprints left behind by the dinosaurs.”
I vaguely recall musing on how to spell my passing reference to a name for this age. (I can’t probe the floppy disks on which any trace of that process sits.) “Anthrocene” seemed more streamlined than other choices, and I was pretty naïve when it came to word roots in scientific terminology. It didn’t really matter. The book was published shortly after the end of the Persian Gulf War and the planet-cooling eruption of Mount Pinatubo. Public attention was focused elsewhere. I’m sure no more than a few thousand people read it, certainly not Crutzen or Stoermer. It now floats on Amazon.com’s used listings for as little as a few dollars (plus shipping, of course) — another kind of anthropocene shard, in a way.
~ ~ ~
Interlude - In 2016, the Australian songwriter Nick Cave included a haunting song titled Anthrocene on his album Skeleton Tree. Listen while you read on.
~ ~ ~
Reflecting now on what I wrote in 1992, I’m quite certain that when I wrote “earth scientists of the future,” I was thinking generations, if not centuries, into the future. But it took just eight years for scientific rigor to be applied to the idea of an anthropogenic geological age. We do live in fast-forward times.
Language constantly evolves. In 2014, the word passed a significant milestone. The Oxford English Dictionary (OED) adds batches of words four times a year. The 171 words added in June that year included all manner of obscurities (“cholestasis”), words reflecting trends of the moment (“selfie,” “flexitarian”), and “Anthropocene.”
According to the dictionary’s definition, the Anthropocene is “the era of geological time during which human activity is considered to be the dominant influence on the environment, climate, and ecology of the earth.”
Before including it, the OED editors had wisely let the word percolate for 14 years after it first entered widespread discourse. But I’d argue that they jumped the gun in one important technical way and missed the main, grander meaning of the word. That second point is not a criticism; it just reflects the plasticity and richness of this still-emerging neologism.
The technical problem with the definition? The word, despite having roots springing so directly from stratigraphic nomenclature, could still end up rejected as a formal “era of geological time.”
Many influential stratigraphers have expressed deep skepticism that the Anthropocene deserves formal standing. For one thing, any new addition to the time scale must be useful to science. Calling an abrupt end to the Holocene could achieve the opposite, creating confusion in the literature. There are significant debates over when to mark the starting point or lower boundary of the Anthropocene in the time scale.
Other scientists are concerned about all those flavors and colors of meaning that surround the word outside of geology — potentially tainting the time scale with environmental messaging. One of the starkest challenges came last spring in a critique written by two influential geologists, Stanley C. Finney and Lucy E. Edwards. Its title laid out what they saw as a murky and open question: The “Anthropocene” epoch: Scientific decision or political statement?
There was some basis for such concerns. Many scientists and others pressing for a more sustainable human relationship with the environment had latched onto the word and idea as a rallying point. In a 2011 interview with Elizabeth Kolbert for National Geographic, Crutzen had put it plainly: “What I hope … is that the term ‘Anthropocene’ will be a warning to the world.”
Now in its seventh year [recall this was 2016], the working group has been under pressure to complete its formal recommendation to the stratigraphic commission. Almost daily, emails fly back and forth among its 35 members, refining drafts of papers (including a response to Finney and Edwards) and planning next steps. There have been three face-to-face meetings of the group’s members, most recently in Oslo in April 2016.
Coincidentally, that meeting kicked off on the 46th Earth Day. We gathered around a long table in an ornate room at the Fridtjof Nansen Institute in a mansion built a century ago by the famed Arctic explorer for whom the institute was named. For two long days, discussions led by Zalasiewicz and Colin Waters of the British Geological Survey centered on a review of the “arguments against formalization.” The 17 bullet points ranged from the technical and straightforward — “stratigraphic record is minimal … based on predictions … ” — to the testy and provocative — “[T]he Anthropocene is political, not scientific.” As if to remind participants of the gravity of the task, there was a plastic-laminated copy of the scale itself at each seat, along with the usual array of writing pads and pens.
My lack of familiarity with norms of stratigraphy prevented me from engaging too deeply, although I’ve been a minor coauthor on several of the group’s papers. What I think I’ve brought to the table is context. In a presentation, I urged the geologists to take comfort in knowing they’re hardly the first discipline to be thrust into policy relevance or to have their norms shaken by disruptive change. I clicked to a slide showing how the “tree of life” envisioned by Darwin had been utterly disrupted now that DNA sequencing allows a more complete view, particularly of microbes. Just days before the Oslo meeting, a new “tree” had been published in which, as Carl Zimmer noted in the New York Times, “All the eukaryotes, from humans to flowers to amoebae, fit on a slender twig” compared to a dizzying spray of lines of bacteria.
And now the revolutionary genetic editing tool CRISPR is poised to imprint humans’ ambitions on that tree at least as profoundly as fossil fuels have changed the physical world. I also noted that the sparring in the stratigraphy community strongly echoed fights that had first erupted in meteorology and climate science 25 years ago, as new lines of evidence and new tools, such as global climate models, pointed to a growing and disruptive human warming influence. “You’re not alone,” I said.
But I stressed, using climate change as an example, that it is possible to separate the “is” of science from the “ought” of society’s choices. With some bumps and bruises, the Intergovernmental Panel on Climate Change had found a way forward. Now it was geology’s turn.
You can explore my slides here.
There was some irony in the stroll each day between our hotel and the Nansen Institute. It took us along the shore in front of a giant Jenga-block scramble of horizontal white towers that belong to Statoil. Norway’s mostly state-owned oil company has contributed substantially not only to Norway’s economy but also to global climate change. Even as Norway was adding incentives for drivers to buy electric vehicles to take advantage of ample domestic hydro-electric power, the company announced plans to expand drilling in the Barents Sea to boost fossil-fuel exports. One got the impression that decisions made in that building would have a bigger impact on world affairs than any conclusions we produced.
But there was a second layer of irony there on the windswept shores of the fjord. The grassy stretch along the sinuous path was also a sculpture park. A vertical slab rose from the grass directly in front of the Statoil building, imprinted with an image of one of Easter Island’s moai — the haunting stone figures carved at the potent pinnacle of the great, but vanished, Rapa Nui civilization.
As August drew to a close, Colin Waters headed to the 35th Congress of the International Union of Geological Sciences in Cape Town, South Africa, to summarize the group’s findings, including the results of a vote of members on critical aspects of the evidence for an Anthropocene Epoch. The key points? “Is the Anthropocene stratigraphically real?” Thirty-four yes, one abstention. “Should the Anthropocene be formalized?” Thirty yes, three no, two abstentions (one of which was me).
In deference to the long chain of approvals that lay ahead, he stressed the work plan, which includes a global quest for an appropriate site for a “golden spike” — an actual physical point displaying the evidence for a Holocene-Anthropocene transition.
While many geologists worry that a human-etched epoch grants us too much power on the basis of too little evidence, a few think the proponents of the geological Anthropocene are thinking way too small. One such expert is Jay Quade of the University of Arizona. After decades of fieldwork and lab analysis on six continents, Quade — whose father and grandfather were geologists — seems to live, breathe, and eat insights from ancient rock. I met him in June at a Santa Fe, New Mexico, gathering of scientists focused on the Quaternary Period. He credited the efforts of Crutzen and scientists such as those in the “Anthropocene” working group for all that they were doing but said his reading of the evidence pointed to an even more massive unfolding geological transition. It could, he believed, be akin to — if not bigger than — the Permian-Triassic mass extinction 250 million years ago and the Cretaceous-Tertiary extinction that cleared out the dinosaurs and led to the Age of Mammals — and us. In his keynote talk, he described the human-driven changes under way on Earth as “creating the mother of all stratigraphic marker horizons.”
One slide took the audience 50 million years into the future, projecting what the human imprint would look like after such a span — kind of like what geologists see now in probing previous great events. Our anthropocene moment appears as a brief pulse of trash, rare earths, and the like — along with a profound constriction of mammal species — followed in future ages by a flourishing of surviving and newly evolved mammals. Are humans among them to assess that record?
Time will tell.
The Lower-Case Anthropocene
To me, the geological discussion, while vital, is not nearly as important as the wider discourse that has emerged around the word and its implications.
What makes this point in entwined human and planetary history special, and has made this word controversial, isn’t our potency. Cyanobacteria, through the evolution of photosynthesis, started flooding the atmosphere with oxygen more than 2.3 billion years ago. Some earth scientists call that the Great Oxygen Catastrophe. The result was a mass extinction followed, over millions of years, by an extraordinary flourishing of life attuned to that new atmosphere.
But cyanobacteria, as far as we know, weren’t aware of their power. And we are, at least haltingly, starting to recognize ours. It remains to be seen whether the current surge of human-generated carbon dioxide, along with our other environmental impacts, creates what future civilizations might call the Great Carbon Dioxide Catastrophe — or not. The wild card is us. The broader meaning of anthropocene, not captured in the Oxford English Dictionary, centers on how awareness (in theory) comes with responsibility.
Is this the beginning of our end, as some have argued, or the turbulent beginning of a potential new age of enlightened cultural and physical evolution? Can the anthropocene, or Anthropocene, be good?
In June 2014, New Yorker staff writer Elizabeth Kolbert addressed this question in a Twitter post. She would win a Pulitzer Prize the following year for The Sixth Extinction. She’d read “The Delusion of the Good Anthropocene,” Clive Hamilton’s biting critique of a talk I’d given on the prospect of a “good” Anthropocene at Pace University.
Hamilton, known for a dark view and a sharp scalpel, is a professor of public ethics at Australia’s Charles Sturt University and the author of Requiem for a Species: Why We Resist the Truth about Climate Change, among other books.
Kolbert’s tweet distilled much.
In a subsequent conversation facilitated by the fine Grist blogger Nathanael Johnson, Hamilton and I clarified differences and found lots of common ground. He wasn’t seeking a “bad” anthropocene, for instance, and I didn’t see this as a good time for global ecology. But we agreed on the uniquely consequential nature of this moment and the value of discourse in search of common ground.
We also agree that the broader implications of humanity’s surging planet-scale impacts can be obscured by technical struggles or disciplinary turf battles over stratigraphic signals. As Hamilton wrote in a commentary in Nature in August 2016, “The new geological epoch does not concern soils, the landscape, or the environment, except inasmuch as they are changed as part of a massive shock to the functioning of Earth as a whole.”
The “Capitalocene” culprit
The idea of the anthropocene resonates loudest within circles tussling over the best ways to chart a sustainable human journey. A leading proponent of the “Capitalocene” alternative, Binghamton University sociologist Jason W. Moore, has written that a focus on the anthropocene could “obscure more than it illuminates.” However well intended its supporters may be, they are — by presenting humanity as a single entity — glossing over the real drivers of both environmental and social degradation: inequality, commodification, imperialism, and more.
His pitch for Capitalocene leaves out environmental ravages committed under Communist regimes in the Soviet Union and China, where destructive policies began under Mao well before that country’s own capitalist tilt. But his point will be vital to consider as discussions flow forward. Who is the “we” when we talk of common human responsibilities?
It took me more than 20 years of regularly using the word “we” in articles or talks on new scientific insights (“we’ve learned”) or global trends (“we’re changing the climate”) before I fully absorbed that, in several important contexts, there is no “we.”
Who is “we”?
It’s humbling for me now to reflect on the naïve, preachy way I framed my “anthrocene” notion in that 1992 climate book.
The passage reads like a sermon. Who was the “we” in that paragraph? Did it include Pacific islanders or rural villagers in India and Africa who scrabble to make a living facing today’s climatic and coastal threats and who contribute no meaningful amount of greenhouse gases to the atmosphere?
The simplest sources of human variability are geographic and economic. If you’re poor and vulnerable or prosperous and protected, an epic storm has completely different meaning. In 2007, I was the lead writer of a special New York Times report describing humanity’s “climate divide” along these lines.
A Dutch woman who had bought a riverside house that floats off its foundation safely in a severe flood said, “We’re looking forward to floating,” as if it were an amusement park ride. A farmer in northeastern India had a very different reaction as he surveyed waterlogged fields following an early spring flood on the Baghmati River. Three acres of wheat — a third of his income — were gone. Barley, mustard, and peas were ruined.
But I also learned in examining behavioral studies that there can be fundamental differences — shaped by deep-rooted behavioral traits — in how individuals, rich or poor, north or south, perceive environmental change. Are you an edge pusher or group hugger? You know the answer. The person next to you likely has a different answer. One body of research calls the source of differences “cultural cognition.” This is why there’ll never be a common comfort level with a word like anthropocene, or with the signals emerging from the biogeophysical world, or with what to do about it. In essence, we are all on different journeys through this consequential juncture in the intertwined history of human beings and their home planet.
There is one other area where the “we” question has emerged. Who is the “we” who should be making judgments on the anthropocene, even within the constrained scientific debate? Just as Jason Moore found fault with an overly simplified anthropocene distillation of human civilization, British environmental economist Kate Raworth found fault with the composition of the “Anthropocene” working group itself.
I had written a blog post following the second meeting of the group, in Berlin in 2014; I noted, somewhat in passing, that it was “very white, western, and male.” Raworth fired an apt salvo on Twitter: “The Anthropocene is bad enough. Spare us a Manthropocene.” She included a photo gallery she’d created of nine female experts in global change. In a welcome move, more women have since been added to the group, including Naomi Oreskes of Harvard University, who combines a geologist’s and a historian’s perspectives.
It’s important not to get too caught up in this rarefied level of discussion. In the real world, however discomfiting this might be to those of us engaged with the word, I’d aggressively wager that at least 90 percent, maybe 95 percent, of humanity has not yet heard the word or considered its implications. Wealthy world citizens are insulated from environmental risk. The poorest are so caught up in survival that the future has little meaning. I haven’t found any polls yet testing awareness of the word “anthropocene.” But try a Google Trends search of “anthropocene, global warming, ISIS” and you can see the relative levels of attention.
In the meantime, whatever you call this period of intertwined human and planetary history, the biogeophysical, and increasingly technological, reality is playing out on scales that aren’t very amenable to old ways of managing risks and opportunities.
Brad Allenby, a longtime analyst of sustainability and technology at Arizona State University, rejects the term Anthropocene entirely because it’s not nearly big enough to encompass what’s going on. He feels referencing geologic time presumes far too much stability and knowledge. “[A]s humans increasingly integrate with the technology around them, and as the evolution of that technology continues to accelerate, it is questionable that what we will have in 50 or 100 years will still be anything like ‘anthro,’” he wrote on the aptly named Future Tense blog earlier this year. “We are trying to tie geologic time to a windstorm.”
A few years ago, after Allenby and I had an onstage discussion of the anthropocene at Arizona State, a member of the audience proposed a hopeful architecture for the coming decades:
“The way I would like to see it is in, say, 100 years in the future the London Geological Society will look back and consider this period … a transition from the lesser Anthropocene to the greater Anthropocene.”
That has a nice feel to it. Fully integrating this awareness into our personal choices and societal norms and policies will take time. I’ve taken to encouraging people to meld urgency and patience, however irrational that might feel.
Reflecting on all that has passed and is to come, I see the prospect of slow but substantial and productive shifts in the human enterprise. They will come along with a rich array of perceptions and responses among and within communities — from the scale of global society to that of the stratigraphic community.
Will this happen fast enough? Who knows. But this is the human way. A big part of engaging with the anthropocene, to my eye, is engaging with and even embracing ourselves as individuals and as a flawed, variegated but amazing species. In 2003, biologists identified “response diversity” as a source of resilience in ecosystems. I’d assert that the same characteristic is an asset in societies as long as they work to level playing fields, foster education and transparency — and communicate.
Perhaps the last thing the world needs is yet another word. But in 2011, I offered a name for that kind of engagement. It might make you chuckle, given my earlier effort at naming something, but here goes. Anthropophilia.
Edward O. Wilson’s Biophilia was a powerful look outward at the characteristics of the natural world that we inherently cherish. Now we need a dose of what I’ve taken to calling anthropophilia as well.
We have to accept ourselves, flaws and all, in order to move beyond what has been something of an unconscious, species-scale pubescent growth spurt enabled by fossil fuels in place of testosterone.
In The World Without Us, Alan Weisman created a haunting, best-selling thought experiment — imagining a planet awakening after the vanishing of its human tormentor. The challenge: There is a real experiment well under way, and we’re all in the test tube.
We’re stuck with the more complex story of The World WITH Us. It’s time to grasp that uncomfortable, but ultimately hopeful, idea.
Shall we form an Anthropophilia Working Group?
Please explore Anthropocene Magazine, which commissioned and first published this article.
Here’s a TEDx talk I gave on the notion of Anthropophilia in Schenectady, N.Y. in 2016:
This was a fascinating read Andy, and it was interesting to know that you coined Anthrocene prior to Anthropocene. (This reminds me of the word the Institute for Humane Education coined and has been promulgating – solutionary – which was also coined prior to us by at least two others, along with someone coining "solutionist.") At any rate, you asked about our responses to the different alternatives, and I have come to find Anthropocene works well, while the other options feel too narrow.
Reading your essay, I also found myself resisting the idea that people shouldn't be categorized as "we" in this context. Of course not all people are causing the problems typified by the Anthropocene, and those who are living in poverty and without power within societal structures suffer the gravest consequences. But as you point out, the great decline in poverty across the globe has been a result of the Anthropocene, too. So it's a mixed bag. And as Jared Diamond's (and others') work suggests, many indigenous peoples throughout the ages have contributed to Anthropocene-caused harms to the environment and other species – causing extinctions long before modern times.
While some societal systems may be more likely to cause harm than others, humans are a species with certain attributes, and it seems our default is to struggle with long-term thinking and decision-making, and to avoid taking an expansive view that doesn't just favor the in-group (in-groups meaning both our human in-groups and the in-group that excludes other species).
Just as all (or the great majority of) humans have the capacity to cause Anthropocene harms if given the power to do so, all (or the great majority of) humans have the capacity to build more sustainable, just, and humane systems if we learn how to do that. Perhaps we should distinguish between capital "We" and lower case "we" so that we don't create a false dichotomy when it comes to understanding what it means to be human.
Finally, I would say that I believe we should indeed strive for a Good Anthropocene, which to me means educating people to be solutionaries who have cultivated the motivation, skills, and thinking capacities to build healthy and peaceful systems. What else would we want to strive for than this?
Every period we have named, WE have named. Naming a "geologic" period for ourselves seems to be totally an act of creating a self-portrait, a "selfie," if you will, that only exacerbates our horrific effect on the planet. Naming a period of time in which our activities become sedimentary layers can only happen after it is over. And who names that period of time is unknowable. And will probably be none of us.