I was listening to an episode of the BBC podcast In Our Time, on which a group of English scholars was discussing the French philosopher Henri Bergson, when one of them mentioned an essay called “The Unreality of Time,” originally published in 1908, by a philosopher named John McTaggart. The phrase startled me—I was writing a book called The Unreality of Memory. It’s possible I’d heard the title before and forgotten I knew it—as the scholars note, it is a famous essay. (“Is forgotten knowledge knowledge all the same?” is the kind of question we asked in my college philosophy classes.) In any case, I had never read it. I paused the podcast and found the essay online, curious what I’d been referencing.
McTaggart does not use “unreality” in the same way I do, to describe a quality of seeming unrealness in something I assume to be real. Instead, his paper sets out to prove that time literally does not exist. “I believe that time is unreal,” he writes. The paper is interesting (“Time only belongs to the existent” … “The only way in which time can be real is by existing”) but not convincing.
McTaggart’s argument hinges in part on his claim that perception is “qualitatively different” from either memory or anticipation—this is the difference between past, present, and future, the way we apprehend events in time. Direct perceptions are those that fall within the “specious present,” a term coined by E. R. Clay and further developed by William James (a fan of Bergson’s).
“Everything is observed in a specious present,” McTaggart writes, “but nothing, not even the observations themselves, can ever be in a specious present.” It’s illusory—the events are fixed, and there is nothing magically different about “the present” as a point on a timeline. This leads to an irresolvable contradiction, to his mind.
Bergson, for his part, believed that memory and perception were the same, that they occurred simultaneously: “The pure present is an ungraspable advance of the past devouring the future. In truth, all sensation is already memory.” He thought this explained the phenomenon of déjà vu—when you feel something is happening that you’ve experienced before, it’s because a glitch has allowed you to notice the memory forming in real time. The memory—le souvenir du présent—is attached not to a particular moment in the past but to the past in general. It has a past-like feeling; with that comes an impression one knows the future.
Bergson was hugely popular in the early twentieth century. He was friends with Marcel Proust—and married to Proust’s cousin—and his ideas influenced many other Modernist writers and artists. He is less well-known and celebrated now in part because of a years-long debate with Albert Einstein over the nature of time. Bergson believed that “clock-time” and what he called capital-T Time—time as we experience it, a lived duration—were entirely different. It was this other kind of time, time in the mind, that interested Bergson. Einstein thought this was poppycock. “Il n’y a donc pas un temps des philosophes,” he said on April 6, 1922, at an infamous lecture in Paris: There is no philosophers’ time. Einstein felt his theory of relativity was the final word—time is what clocks measure, in their own frames of reference—and that Bergson did not understand the theory. Einstein thought the separation of time and space was dead as a concept, that he’d killed it. He was wrong—we still think of time and space as different, even if we grasp relativity. Nonetheless, many took his side, and it did lasting damage to Bergson’s reputation.
Philosophers and physicists still speak of the specious present. “The true present is a dimensionless speck,” Alan Burdick writes in his book Why Time Flies. “The specious present, in contrast, is ‘the short duration of which we are immediately and incessantly sensible’ ”—he quotes James. The specious present, Burdick adds, “is a proxy measure of consciousness.” It is what we think of as now. Not the general now, as in “the way we live now,” but right now.
And how long is now? In an eight-minute YouTube video with over one million views, called “What exactly is the present?,” the physicist Derek Muller attempts to explain. According to Muller, engineers working on the problem of syncing video and audio in preparation for the first live television broadcasts found that viewers didn’t actually notice if they were a little out of sync, but there was “an asymmetry”—the sound can lag the video by up to 125 milliseconds before people notice something’s wrong, but if the sound leads the video by more than 45 milliseconds, they know it’s off. Of course, sound and “video” aren’t synced in the real world either: When we watch someone walk down the street, away from us, dribbling a basketball, the sound takes longer and longer to reach us, but we still perceive the bouncing sounds and the bouncing visuals as simultaneous. That’s because “now” is not a speck but a span, of about a tenth of a second. During that interval, Muller says, “your brain can perform manipulations that distort your perception of time and rearrange causality”—syncing up the audio and video, like a live broadcast on a slight delay. It’s as though your brain takes in the information and processes it a little before you do. Researchers have exploited this discovery to fool people into thinking a computer program can read their minds, that it knows what they’re going to do before they actually do it. We’re capable of perceiving an effect before we realize we’ve caused it.
The neuroscientist David Eagleman has said, “You’re always living in the past”—meaning not that the past haunts us, though it does, but that what we experience as the present is in fact the past, the very recent past, the just past. In a way, then, time is memory—not clock-time, perhaps. Not Einstein’s time. But human time is human memory.
I started writing The Unreality of Memory in 2016, in what seemed like a state of emergency. In the months leading up to the election, I was following reality like it was TV, as though every day ended in a cliffhanger. There was something addictive about Donald Trump’s incredible rise—incredible in the original sense, unable to be believed.
This profound sense of unreality reached its culmination on the night of the election. Earlier that day, I’d felt light on my feet, optimistic—I risked jinxing it by purchasing proleptic champagne. I remember the moment, late into the night, when a win for Hillary Clinton had become vanishingly unlikely, though not technically impossible. John and I were watching the returns come in on his laptop, and stress-drinking, though not champagne—that stayed in the fridge. We watched a newscaster nervously talk through the maps showing Clinton’s last outs. John turned and looked at me in horror and said, “He’s going to win.” A bottomless moment.
In the summer of 2017, I spoke on a panel called something like “Art in the Age of Trump.” One writer on the panel insisted that the role of the artist is empathy; with an air of limitless patience, he suggested writing a story or a novel from the perspective of Donald Trump—to attempt to understand him. I felt a portion of the audience grow increasingly restless and frustrated. One man cried out, “There’s no time!” I recognized the note in his voice, a note of urgency unto panic.
That panic, for me, has mostly passed. It has not passed for everyone—not for trans people I know, or for immigrants and the families of immigrants. But as scared as I am of the future, I must admit that for now I’m fairly safe, even comfortable. When news of another school shooting hits—the word “another” seems inadequate—or when I read calm, measured reporting of slowly progressing disasters like ice melt in Antarctica (or, or … I hate these placeholder lists of atrocities), I’m disturbed—logically I’m disturbed. I recognize the facts as disturbing, though what’s no longer shocking or even surprising can verge quite horribly into boring. I still find Trump evil, but I no longer find him interesting. And I still have to work (how can it be so, that I have to waste my life this way, when the world is ending?), eat, sleep, and start over again. I move through the days in a flux of anxiety and denial. But that fear in the background changes things. It changes how I make decisions. I can’t say how long this relative safety will last. It feels like a suspended emergency—like the specious present has been extended in both directions. Now feels longer.
Is the world ending? Which end is the end? For a while I told people, facetiously I suppose, that I was writing a book about the end of the world. Once at a family lunch, my aunt asked me what I was writing about, and I said I was writing about disasters. “What about disasters?” she asked, and I wasn’t sure how to answer. My mother stepped in with a much better elevator pitch: “Isn’t it more about how we think about disasters?” My own thinking, at least with regard to the disaster—the end—has shifted. To be clear, I do worry that civilization is doomed. (The word “worry” seems inadequate; I almost wrote “believe.”) But I’m not sure the doom will occur like a moment, like an event, like a disaster. Like the impact of a bomb or an asteroid. I wonder if the way the world gets worse will barely outpace the rate at which we get used to it.
I don’t have faith that my sense of history, from here inside history, is accurate, or that the view through the rickety apparatus of my body is clear. Eagleman notes that “most of what you see, your conscious perception, is computed on a need-to-know basis.” We ignore what our brains—independently!—deem unnecessary. There is no other self to tell yourself what to do. The German biologist Jakob von Uexküll had a term for what animals pick up on in their surroundings: the Umwelt. The Umwelt is always limited by the organism’s equipment, by its immediate needs. Eagleman, explaining Uexküll’s ideas, writes: “In the blind and deaf world of the tick, the important signals are temperature and the odor of butyric acid. For the black ghost knife fish, it’s electrical fields. For the echo-locating bat, it’s air-compression waves. The small subset of the world that an animal is able to detect is its Umwelt. The bigger reality, whatever that might mean, is called the Umgebung.” The Umgebung is the unknown unknown, the unperceived unperceived.
There’s the matter of perspective, and there’s also the matter of scale. A young poet I know noticed that I often write about the self watching the self. He quoted an essay in which I wrote that I fantasize in the third person, connecting this to another piece, which mentions Robert Smithson’s earthwork sculpture Spiral Jetty. “Do you think land artists moreso desired their work to be experienced within (standing on the rocks, beside the hole) or from above (via camera, airplane)?” he asked me in an email. My mind spiraled off.
It’s very hard for me, I told him, to be “present in the moment”—I’m always going meta, narrativizing, thinking about what I’m thinking about, imagining the future—and then in my specious present, I’m comparing what is happening to what I had imagined would happen, my souvenir du présent to my mémoire de l’avenir. I didn’t say that Smithson didn’t mean for Spiral Jetty to be seen at all, or at least not for long—he built it when the water levels in Great Salt Lake were unusually low: a comment on ephemerality at epic scales. Finished in 1970, the jetty had disappeared by the time he died, in 1973, in a plane crash while surveying sites for a new piece. It stayed hidden for thirty years. Since 2002, drought has kept the water levels low, so it is now usually visible. The ephemerality doubles back: The design exposed, it’s Smithson’s intention, human intention, that’s ephemeral.
I’ve grown tired of reading about disasters. Friends send me links, and I click them and skim halfheartedly. One article, published just after the Notre Dame cathedral in Paris was partially destroyed in a fire, references the sociologist Charles Perrow’s 1984 book, Normal Accidents, which notes that safety systems increase the complexity of technology, inevitably leading to unforeseen errors, which can be catastrophic. The Chernobyl meltdown was triggered by a safety test. (In 2019, HBO made a series about Chernobyl, but I didn’t watch it; I’m tired of disaster movies.) Another questions the slippery use of “we” in writing about climate change, as in “We are emitting more carbon dioxide than ever.” “The we responsible for climate change is a fictional construct, one that’s distorting and dangerous,” writes Genevieve Guenther, a writer who founded a volunteer organization called EndClimateSilence.org. “By hiding who’s really responsible for our current, terrifying predicament, we provides political cover for the people who are happy to let hundreds of millions of other people die for their own profit and pleasure.” It provides cover, in other words, for the giant corporations, like ExxonMobil, Shell, and BP, that are responsible for most greenhouse gas emissions. In 2017, the Carbon Majors Report revealed that a hundred companies account for more than 70 percent of those emissions. “Always remember that there are millions, possibly billions, of people on this planet who would rather preserve civilization than destroy it with climate change,” Guenther writes.
“Most people are good.”
That sentence gives me pause. “Most People Are Good” is also the name of a country song I hate: I believe this world ain’t half as bad as it looks, the guy croons in the chorus. The more I think of it, the more I disagree. I don’t think most people are good, or bad, for that matter. I think people are neutral. From a distance, they look almost interchangeable. It seems to me that “good people” can become “bad people” when provided the opportunity within an existing power structure—to claim and exert power at a deadly cost to others and get away with it. It is not an act of empathy for me to say that Trump is not inherently evil, but “we” have created opportunities for him to be evil. To say that most people are powerless—that evil is a role. In some novel I once read, one character reminded another that a “revolution” is simply a turn of the wheel; it doesn’t break the power structure, it just changes who is on top. I think about that all the time. I think about these lines from an Ilya Kaminsky poem: “At the trial of God, we will ask: why did you allow all this? / And the answer will be an echo: why did you allow all this?” We, you and I, are not corporations, but we do give those corporations godlike power. “They” is a dangerous construct, too. There’s no one to dismantle them but us.
I recently read my friend Chip Cheek’s novel about a honeymoon gone wrong. It starts off feeling escapist—the publisher clearly marketed it as a beach read—but it turns into a kind of apocalypse novel. It’s about what ruin really looks like; there are consequences for the couple’s immoral (and stupid) behavior, but in the end we’re denied the pleasure of an all-out catastrophe, the realization of what Sontag called our “fantasies of doom,” our “taste for worst-case scenarios.” The novel is set in the fifties, but even period fiction written now is climate fiction, I realized; it’s always on some level aware of what we’ve reaped. The storms have levels of foreboding.
My research into past disasters—the plagues and the almost nuclear wars—was often oddly comforting. We’re still here, after all. But I can only take so much comfort in the past. This point in history does feel different, like we’re nearing an event horizon. How many times can history repeat itself? It’s generally accepted that our memories are fallible—that they’re missing information, that they include new details we’ve simply made up—and that over time they are less and less reliable, as we keep rewriting the inaccuracies.
We’re more trusting, though, of what we take to be our direct experience, our experience of the present. I’m drawn to Uexküll’s idea of the Umwelt; like a tick or a bat, we only know what we know. I’m drawn to Bergson’s idea that perception and memory are coterminous. It suggests that we don’t experience reality as it is, and then warp it in recall, but that even the first time we live through X, we are already experiencing our warped version of X.
(Source: The Paris Review)