Imagine the world to come.
Imagine we carry on doing too little too late. Imagine we continue to set inadequate emissions targets with no real intention of meeting them, and then keep right on missing them. The parts per million of atmospheric CO2 relentlessly increase. Temperatures rise, releasing the methane naturally sequestered in permafrost and ocean bed clathrates.
The last of the glaciers melt. We lose the polar icecaps, first in the north and then, inexorably, the south. The oceans rise and, since water expands as it warms, rise still more. Low-lying islands disappear. So do the densely populated river deltas. Coastlines retreat. Hundreds of millions flee the inundation. Hundreds of millions more stay behind and die.
Elsewhere deserts form, and agriculture collapses. Daytime temperatures kill. Those who can, take flight.
More refugees. Endless border violence to keep them out.
Skirmishes, then wars, over food and fresh water.
Large-scale geoengineering projects distribute climate amelioration here and foreseen-but-ignored consequences there. Unforeseen consequences fall where they may.
More corpses. Not just human. The sixth great extinction accelerates. As each species diminishes, its environment collapses a little more.
The planet persists.
But we are gone.
Imagine a world haunted not just by the dead, but by the specter of death. Drawn ever closer by the already locked-in consequences of our actions and inaction. Its domain extended by endless escalating catastrophe.
Imagine a future of foreclosed possibilities.
Haunted by all the worlds that were, and all the worlds that could have been.
Then imagine—as Amitav Ghosh suggests in The Great Derangement (2016), the most widely read and highly regarded book on literature and climate change—that somewhere in the middle of all this, in a future in which “sea-level rise has swallowed the Sundarbans and made cities like Kolkata, New York, and Bangkok uninhabitable,” there are still museums and libraries and bookstores. Picture its inhabitants, urgently examining “the art and literature of our time . . . for traces and portents” of the upheavals that made their “substantially altered” world.
And “when they fail to find them,” Ghosh asks, “what should they—what can they—do other than to conclude that ours was a time when most forms of art and literature were drawn into the modes of concealment that prevented people from recognizing the realities of their plight?”
This is, of course, nonsense.
It requires us to imagine tidal surges discriminating enough to sweep away significant portions of the “science,” “politics,” and “science fiction” shelves, not to mention all that gussied-up “climate fiction” too fancypants to consider itself SF.
And it requires us to believe future humans will have really poor reading skills. Because the truth is: auguries abound. The art and literature of our time is pregnant with catastrophe, with weather and water, wildness and weirdness.
But the truth is also that mundane fiction—what Ghosh calls “serious literary fiction”—has mostly failed to engage with climate change, mainly because the mainstream bourgeois novel mistook exclusion for insight, a glitch for a feature. Muteness is its unique selling proposition.
Ghosh’s account of the built-in shortcomings of mundane fiction is pretty persuasive, but where he sees near-universal failure, I see negotiations with the limitations of the form—not silence but expressive aphasia, teeming with tongue-tied questions. Must a text show tides rising in a world that is elsewhere parched, must it mash up Waterworld (1995) and Mad Max: Fury Road (2015), to be “about” climate change? Must it—like Kim Stanley Robinson’s Science in the Capital trilogy (2004–7) and New York 2140 (2017)—map the progress of the storm? Must it depict smart people in white coats telling each other things they all already know, but the reader might not, about greenhouse gases, hockey-stick curves, retreating glaciers, shrinking ice sheets, Gulf Stream deceleration, clathrate outgassing, and so on? Must the observation of environmental shifts be explicitly connected to climate destabilization, as is the radically altered migration of monarch butterflies in Barbara Kingsolver’s Flight Behavior (2012)? Must fiction be immediately and explicitly about climate change for it to be fiction about climate change? Is there no room for the symbolic? The oblique? The estranged? No room to think about the capitalist, imperialist, patriarchal histories, systems and structures that are hitherto and foreseeably responsible for climate destabilization, and through which it has, is, and will be experienced? No room to consider texts that do not say “climate change” aloud? To discover what happens if we stop assuming a text is not about climate change?
And what of the kinds of fiction that lie beyond Ghosh’s self-imposed restrictions? What of the “serious” fictions of the arthouse cinema and the graphic novel? And of the not-so-serious and not-so-literary? Of blockbusters and genre fictions? Cable movies and comics? The popular, the trashy, and the disreputable? The sort of things the “serious” and the “literary” might wish the seas would swallow?
Somewhere in the African veldt, a hominid ape, uplifted into sentience by an alien monolith, hurls a bone into the air; 4 million years later, in 2001, a similarly shaped nuclear weapons platform orbits the Earth.
In my semicolon—in that space between frames in Stanley Kubrick’s 2001: A Space Odyssey (1968)—lies the starting point and almost all the rest of the Anthropocene.
A series of visual echoes follow: another spacecraft, then the Pan-Am Orion shuttle climbing silently into space, then a pen floating in freefall inside its passenger compartment; the curvature of the Earth, the orbital trajectories around it, the twin wheels of a massive space station, which gently spin, producing gravity-like centrifugal force for its inhabitants; that rotation, the circular path of the drifting pen, the waltz of the Orion’s docking maneuvers. This sequence of images reiterates and extends the evolutionary development implied by cinema’s most famous match cut—from apes learning to use tools to the expansion of humanity (well, white men, mostly) into interplanetary space.
This is the consensus future history promulgated by mid-century Anglophone SF, by such rocketry advocates as Wernher von Braun, Walt Disney, and NASA, and by SF-author-turned-Kubrick-screenwriter Arthur C. Clarke. It echoes and extends U.S. frontier mythology upward and outward; indeed, 2001 was initially conceived as a roadshow epic along the lines of How the West Was Won (1962). However, for the skeptical, satirical Kubrick, the masculinist–colonialist Cold War conquest of space is profoundly unheroic—a bathetic affair, conducted by bland bureaucrats and other corporate functionaries.
In the two hours of screen time after that match cut, two years of story time pass. A second monolith, excavated on the Moon, transmits a signal toward Jupiter, where another, much larger monolith awaits. Dave Bowman, sole survivor of the mission sent to investigate, plunges headlong into this Stargate’s unfolding vortex of dazzling lights, strange geometries, and otherworldly landscapes. He wakes in a simulacral suite of baroque rooms. Time distorts. He sees older versions of himself, who in turn see older versions of themselves. A final monolith appears and transforms him into the luminous Star Child. He floats in space, blankly contemplating the Earth below.
Clarke’s novel, written alongside Kubrick’s film, is much less elliptical, but its ominous conclusion is just as ambiguous. The Star Child wills the Earth’s orbiting atomic arsenal to explode, bringing “a brief, false dawn to half the sleeping globe,” after which he “brood[s] over his still untested powers”: “he was not quite sure what to do next. But he would think of something.” Clearly, humanity is soon to be subjected to another nonhuman intervention, to extraterrestrial experimentation, to posthuman becoming.
After which, the time of the Anthropos will, by definition, be over.
But where exactly in that celebrated match cut did the Anthropocene begin?
The stories we tell about the world matter, but it is not always easy to know where to start or what to call them.
The term “Anthropocene” is derived from Anthropos (“human”) and cene (“recent”). It describes the period in which human activity has disrupted significant geological conditions and processes, and/or in which traces of human activity can be discerned in the geological record. The term is usually attributed either to biologist Eugene Stoermer, who used it in the 1980s, or more commonly to atmospheric chemist Paul Crutzen, who independently recoined it in the late 1990s. Subsequently, others have proposed alternative terms to describe this period of unprecedented ecological change or to suggest the different futures we might make, including:
the White (M)anthropocene
the White Supremacy Scene
But this is no mere glossolalia.
This proliferation of terms—some serious, others playful—did not arise from confusion or obfuscation or jargon-for-jargon’s-sake. Nor is it the result of bandwagon-jumpers coining career-making catchphrases for the heady fame and giddying royalties academic publishing bestows, or even for a meagre slice of the ever-shrinking research-funding pie.
Rather, it is what happens when the implications of a technical stratigraphic issue—primarily of interest to geologists and paleontologists—spill out into wider culture. It is trace evidence of an already rich history of thinking through what it means for humans to have become a geological force.
Each of these terms tells the story from a different perspective and with different emphases. Half a dozen of them were coined by historians Christophe Bonneuil and Jean-Baptiste Fressoz specifically to demonstrate the power a name has to construe a narrative, designate a protagonist, indicate an orientation, and shape perception. For example, they only half-jokingly propose that because the United Kingdom and the United States were responsible for at least 50 percent of global cumulative total CO2 emissions until 1980, “the Anthropocene should rather be called an ‘Anglocene’.”
Changing the name changes the story.
As does, of course, deciding where you choose to start it.
Geographer Nigel Clark argues for the longest of long Anthropocenes, beginning 1.6 million years ago on the African savannah, when Homo erectus first used fire “for warmth and light, for keeping predators at bay, and for increasing the available nutrient content of foodstuffs.” Noting that the genus Homo evolved on a planet with a unique “combination of oxygen-rich atmosphere, ignition sources and fuels,” Clark goes so far as to suggest that these “pyrophytic tendencies” of the Earth itself somehow rendered the urge to burn carbon irresistible to certain upright hominids. And just as 2001’s bone-as-weapon leads inevitably to orbital nuclear missiles, so our catastrophic consumption of fossil fuels flows naturally from that first prehuman barbecue. The planet made us—makes us—do it.
Climate scientists Michael R. Raupach and Joseph G. Canadell more reasonably suggest that the Anthropocene should at least start with something closer to the actual Anthropos, when “half a million years ago . . . the ancestors of human-kind learned to derive energy from the controlled combustion of detrital biotic carbon such as wood and peat.” But once more, there is a Kubrickian cut that mythologizes causation and obscures massive differences of scale. We are all, it seems, twisted firestarters, and we just cannot help ourselves.
At the other extreme, the shortest of the short Anthropocenes dates either from the July 16, 1945, Trinity nuclear test in the New Mexico desert—with its global spread of radioactive isotopes—or more generally from the end of World War II, when wartime production was retooled to manufacture mass commodities. Inaugurating a new phase of consumer capitalism, this shift expanded and intensified the use of fossil fuels, both as an energy source and in the production of plastics. This “Great Acceleration” seems the option most likely to be selected by stratigraphers as the point from which to date the Anthropocene in the official Geological Time Scale. And you can see why.
In 2004 climate scientist Will Steffen and ten coauthors compiled twenty-four graphs of changes in human activity and global-scale alterations to the Earth system since 1750. The first dozen chart increases in population, urban population, total real GDP, foreign direct investment, damming of rivers, water use, and consumption (of fertilizer, paper, motor vehicles, telephones, international tourism, and McDonald’s restaurants). The second dozen show the depletion rate of ozone and the increases in atmospheric concentrations of greenhouse gases (CO2, N2O, CH4), average surface temperature, species extinctions, land domestication, coastal zone nitrogen flux, coastal zone structural alterations, and in the number of great floods and fully exploited fisheries. Each graph reproduces that distinctive hockey stick curve—a level, then gradually rising line that abruptly swerves upward—and on all of them that sudden steepening occurs around 1950. Which seems pretty conclusive. Especially if all we are interested in is resolving a technical stratigraphic issue.
However, two other short Anthropocenes should give us pause.
They both begin in the 1400s and are so profoundly interrelated that they are probably just different ways of looking at the same conjuncture.
The European “discovery,” exploitation, extraction, and colonization of the Americas devastated indigenous populations through disease, conquest, enslavement, resettlement, and other forms of colonial violence. And since the Atlantic slave trade developed to replace indigenous forced labor in the looting of the “New World,” it also devastated African populations. Indigenous Americans had no immunity to smallpox and other diseases that leapt ahead of the European invaders, sometimes eradicating entire peoples before there was any direct contact. In the 150 years after Columbus landed, colonizers wiped out probably 50 million indigenous people, and the jungle reclaimed agricultural land so quickly that its increased uptake of atmospheric CO2 is discernible in early seventeenth-century ice cores.
Intertwined with this violent collision of worlds—which also brought together plant and animal species that had evolved on separate continents for millions of years—is the early modern development of capitalism as a world system, beginning with the Dutch and British agricultural revolutions. As historian Jason W. Moore notes, this otherwise well-established early history of capitalism is obscured in—and by—accounts that prefer to start a short Anthropocene with the industrial revolution in eighteenth-century Britain, and thus to shift the blame, consciously or not, onto “industrialization” rather than capitalism and colonialism.
Crutzen, for example, selects 1784—which he associates with “James Watt’s design of the steam engine”—as his starting point because that is when “analyses of air trapped in polar ice [show] the beginning of growing global concentrations of carbon dioxide and methane.” But as Moore scathingly enquires, if the “motive force behind this epochal shift” was “coal and steam,” then what was the “driving force behind coal and steam?”
Not class. Not capital. Not imperialism. Not even culture . . . you guessed it: the Anthropos. Humanity as an undifferentiated whole.
The Anthropocene makes for an easy story. Easy, because it does not challenge the naturalized inequalities, alienation, and violence inscribed in modernity’s strategic relations of power and production. It is an easy story to tell because it does not ask us to think about these relations at all. The mosaic of human activity in the web of life is reduced to an abstract Humanity: a homogeneous acting unit. Inequality, commodification, imperialism, patriarchy, racial formations, and much more, have been largely removed from consideration.
The Anthropocene recognizes humanity as a geological force, but does so indiscriminately. As the Salvage Editorial Collective put it, in “the time of guilt,” it finally admits as human those it denied “in the time of plenty.” Hence Moore and others prefer to call it the Capitalocene.
Yet some would rather stick with the Anthropocene.
For example, in Anthropocene Fictions (2015), Adam Trexler argues that the term avoids such “politically contentious” phrases as “climate change” and “global warming”—for which it is now effectively a euphemism—and moves discussions away from prognosticating outcomes to asserting “a phenomenon that has already been measured and verified across scientific disciplines and conclusively linked to human emissions of fossil fuels.” It refocuses us on a geological process that far exceeds any solution to be found in individual consumer choices, and emphasizes “larger, nonhuman aspects of climate,” such as “the greenhouse gases already in the atmosphere,” that “will continue to act” independently of us.
Adopting the aura of scientific authority is not without risk. It appeals to science’s own ideals of objectivism, universalism, skepticism, and disinterestedness, at the same time obscuring the extent to which science is not only a human and social practice but one that is increasingly dominated, directed, and shaped by corporate and state interests, often in direct contradiction of those ideals. And it is not at all clear how effective this borrowed mantle can be when capital’s ideologues, sponsors, and bagmen conceal scientific findings that might undermine profit margins. When they spend decades and dollars muddying the waters. When their manufactured uncertainty is misreported by the media as conflict between equally valid, equally scientific viewpoints. And when, in the crazy dance of illiberalism and undemocracy, ascendant populisms deride expertise and throw out the baby of scientific consensus with the bathwater of technocratic governance.
Furthermore, talking about a geological epoch invites awestruck recoil at sublime magnitudes, which is not necessarily a bad thing, since hubris should be clobbered once in a while, but also risks evasion and complacency.
Yes, as Timothy Morton’s Hyperobjects (2013) argues, the climate is “massively distributed in time and space relative to humans,” and thus functions on “profoundly different” scales and temporalities than those we are used to. It is so vast as to be “almost impossible to hold in mind.” It showers us with effects and affects, even as it withdraws from our comprehension: we can see rain, but not climate; a banknote, but not the economy. The weather or the dollar bill is but “a flimsy, superficial appearance,” a “mere local representation” of a phenomenon so massive, so extended in space-time, that it finally shatters idiot illusions of linear cause and effect.
But in the face of such debilitating immensity, we cannot merely shrug and take a selfie. We cannot allow the scale of the crises we are already living through, and of those to come, to trump their urgency.
To do so is to condone the impoverishment, immiseration, and deaths of untold billions, human and otherwise.
Editors’ Note: Excerpted from Anthropocene Unconscious: Climate Catastrophe Culture by Mark Bould, published by Verso Books. Copyright © 2021 by Mark Bould.
Mark Bould teaches Film Studies at the University of the West of England, Bristol, and is author of Science Fiction: The Routledge Film Guidebook. He coedits the journal Science Fiction Film and Television.