Sunday, 24 February 2019

Florida is drowning. Condos are still being built. Can't humans see the writing on the wall?

People tend to respond to immediate threats and financial consequences – and Florida’s coastal real estate may be on the cusp of delivering that harsh wake-up call

I stood behind a worn shopping center outside of Crystal River, Florida, looking for the refuge where a hundred manatees were gathered for winter. I found them clustered in the emerald-colored spring, trying to enjoy a wedge of sunlight and avoid the hordes of people like me, boxing them in on kayaks and tour boats, leering over wooden decks. The nearby canals were lined with expensive homes and docks with jetskis. One manatee breached the water for a breath, and I could see the propeller scar on its back.

2018 was the second deadliest year on record for manatees. Like many of our coastal species, they’re vulnerable to habitat loss and warming seas, which are more hospitable to algal blooms and red tide. Science has given us the foresight we need to make decisions that will reduce the future suffering of other species and ourselves, but we don’t heed it. Why?

Studies show that humans don’t respond well to abstract projections. We overvalue short-term benefits, such as driving SUVs, burning coal and building waterfront real estate. We choose these extravagances even though they impede beneficial long-term outcomes, such as saving threatened species, or reducing the intensity of climate change.

Humans tend to respond to immediate threats and financial consequences – and coastal real estate, especially in Florida, may be on the cusp of delivering that harsh wake-up call. The peninsula has outsized exposure: nearly 2 million people live in coastal cities. On the list of the 20 urban areas in America that will suffer the most from rising seas, Florida has five: St Petersburg, Tampa, Miami, Miami Beach and Panama City. In 2016, Zillow predicted that one out of eight homes in Florida would be underwater by 2100, a loss of $413bn in property.

[Image: A potential scenario of future sea level rise in South Beach, Miami, Florida. Photograph: Nickolay Lamm/Courtesy Climate Central]

I flew into Miami in early December, and the risk was apparent from the airplane window. Aerial views of Miami and South Beach show high-density construction on flat, sandy slivers of land. A recent National Oceanic and Atmospheric Administration (NOAA) report predicts Miami streets will flood every year by 2070.

South Beach was vibrant and populated, with mega-yachts docked in front of luxury homes, sorbet-colored art deco-era hotels rising a block from the water, cafes misting customers on the sidewalks, neon signs flashing bright in the night sky. But I wondered: given the forecasts, why are people still building new condominiums?

In Florida, you will see a bewildering mix of optimism, opportunism and denial in the real estate market: luxury condominiums going up in flood-prone South Beach, and property values rising in the vulnerable Keys, post-Hurricane Irma. And though the House of Representatives passed a bill to require real estate agents to disclose flood risks, the Senate has not reviewed it, and a culture of “systemic, fraudulent nondisclosure” persists in high flood risk areas.

You will see the massive benefits of privilege, and the way it allows a homeowner, particularly a second home owner, to afford the risk. You will see emerging issues like Miami’s climate gentrification, where previously low-income neighborhoods like Little Haiti are rising in value and under pressure from developers because of their higher ground, resulting in the displacement of people and place-based culture. Haitian playwright and bookstore owner Jan Mapou recently told a reporter: “Gentrification is coming forcefully: developers buying the major corners, raising the rents, forcing renters onto month-to-month leases … We’re not against development or modernization … but respect the people living there, their culture, their history.”

I spoke with a developer who wanted to remain anonymous, given business interests. He told me that he’s surprised that people are still buying, building and investing in coastal Florida. He estimated that a decade ago, only one in 10 buyers asked about the property elevation, or expressed concerns about rising seas. Today, nearly six in 10 ask, and many decide not to buy in these same critical areas. “I’m worried we’re one bad storm away from a rush for the exits,” he told me.

I sought input from the environmental community as well. “Real estate is a huge economic driver here,” Laura Geselbracht, a senior marine scientist with the Nature Conservancy, said. “And it’s at risk from sea level rise. People don’t want to believe it. That’s a normal human condition – suspension of belief.

“If you’re not a millionaire and you own a property in a vulnerable area, it may be a wise decision to think about moving before the masses think about moving,” Geselbracht said. She also owns waterfront property on a canal in Fort Lauderdale, and is deeply invested in her community, but has cautioned her child not to expect the same lifestyle in the future.

She wonders why she doesn’t see more people of means in south Florida buying electric cars, getting solar panels and living more sustainably. “The quicker we take action, the better. We’ve got to be leaders so that we have a longer horizon of survival here,” she said. When she’s approached community leaders in the past, asking them to take steps toward sustainability, she often hears the same response: “Technology will solve it.”

It’s a high-stakes gamble. Consider innovative mitigation in action: raising roads, shoring up sea walls, adding pumps and drainage upgrades, beginning dredging projects, offering complex insurance structures. Proximity to these short-term solutions is not always a plus in a home buyer’s column, but an acute reminder of vulnerability.

While Geselbracht is optimistic about developments like Orlando’s zero emissions goal, and Miami’s forward-looking Forever bond, she’s not ready to pin all her hopes on innovation. She also wonders about the fallacy of “safe” investments elsewhere. “There are air quality issues and forest fires out west, and extreme heat inland.”

I spoke with young farmers who recently decided to purchase a farm away from the coast. “As we looked for farmland to buy, we certainly thought hard about what the climate would be like in 10, 20, 30 years,” one of the owners of Ten Mothers Farm told me. “We knew we didn’t want to be near the coast, and we wondered whether even being in the south-east was a bad idea. Ultimately we decided that the most important thing was to be in a community that’s supportive and that we believe will be resilient.”

While baby boomers may be slow to adjust spending behavior to climate change, the Florida developer told me, millennials will not, and that shift will likely impact the market in the decade to come.


I grew up in two eastern North Carolina towns, Rocky Mount and Atlantic Beach, that have been bludgeoned by hurricanes. There, friends have real estate that falls into an increasingly common, and costly, pattern: flood, repair, rebuild. Many are locked into this expensive, emotionally draining cycle because they can’t sell their homes, which have flooded multiple times.

Browsing real estate in nearby New Bern, which was dramatically flooded by Hurricane Florence last year, reveals the terminology indicative of this practice. Homes are presented as a “blank canvas” and “waiting to be brought back to life”. There are optimistic takes, too, like “circumstances have created great potential”, and “great fishing”.

According to a 2018 report from the Union of Concerned Scientists, it’s not just houses that will flood, but also “roads, bridges, power plants, airports, ports, public buildings, military bases and other critical infrastructure along the coast”. Furthermore, the report indicates that financial markets have not accounted for this future downturn. The economic impact will be “staggering” and the window for towns to maintain creditworthiness and build resilience is “narrowing”.

The Union of Concerned Scientists points out that “nearly 175 communities nationwide can expect significant chronic flooding by 2045” and that of those “nearly 40% – or 67 communities – currently have poverty levels above the national average”. States with areas of particular concern are North Carolina, Maryland and Louisiana, where a significant percentage of at-risk properties are owned by people of color.

The climate change-induced real estate crisis is imminent in the south, and it’s going to have a brutal impact on those who can’t afford new insurance, relocation, lowered property values, or bandages such as private sea walls. It will have an outsized impact on homeowners who live in flood zones or near over-heated superfund sites and toxic factories, and those who can’t afford to pay taxes on submerged land where they can no longer make a home.

I look at real estate listings and wonder, what if the places you love most are no longer a good investment? What if we’re so focused on denial, data and property that we fail to grasp the human side of the situation?

The moral imperative to act is not about salvaging expensive second homes on the waterfront. It is about taking responsibility for human action, helping frontline communities solve a complicated economic and cultural challenge, and doing what we can to help species whose survival is imperiled by our lack of foresight.

(Source: The Guardian)

Saturday, 23 February 2019

Did Henry VI have a sex coach in his marriage bed?

The medieval monarch and his queen, Margaret of Anjou, were not alone at night, historian Lauren Johnson reveals

For more than eight years, Henry VI and his queen struggled to produce an heir. Now it has emerged that the couple were not alone in their endeavours in the royal bedchamber.

The historian Lauren Johnson has unearthed evidence showing that when Margaret of Anjou visited her husband’s bedroom for marital relations, they were sometimes joined by trusted courtiers.

Johnson told the Observer this was very unusual, adding: “Was it because the famously chaste Henry – who was a virgin until he married – didn’t know what he was doing? I think it’s entirely possible that it had reached a certain point where it perhaps became necessary to make clear to him what he should be doing.

“That couldn’t be done in a public way at all. The king’s chamber is the most private place [where] you could be having this conversation or, indeed, checking what was going on.”
[Image: Henry VI and Margaret of Anjou as depicted in the Talbot Shrewsbury book of 1445. Photograph: World History Archive/Alamy]

Royal marriages were once consummated with “bedding ceremonies”, in which newlyweds were put into the marital bed by their guests on their marriage night. The earliest English record dates back to Henry V in the 1420s, when ceremonies involved “the wine cup and the blessing of the bed”.

Johnson said: “While royal ceremonies could involve public blessings and perhaps processions to the bedchamber on the wedding night, after that point no one was in the royal bedchamber when the king and queen were having their marital relations.” She added that what Henry VI and Margaret experienced was different: “This was not just their wedding night. It’s an ongoing thing.”

Johnson found documentary evidence in the National Archives and royal household accounts, among other sources. The Ryalle Boke of court protocol, for example, records that once the king was in bed, “the king’s chamberlain or a squire for the body [should] come for the queen, and with her two gentlewomen and an usher”. Another witness, describing when “the Kinge and the Quene lie together”, noted his chamberlain lay “in the same chamber”. Johnson suggests that it may have been the Duke of Suffolk, chamberlain of England, or Ralph Botiller, chamberlain of the household.

Johnson observes: “The Ryalle Boke does not make it clear at what point they left, leaving open the intriguing suggestion that they remained to make sure the marriage bed was being properly used.”

She recalled that, in reading the documentary evidence, her “eyes and ears pricked up”: “The evidence that there are people staying in the king’s bedroom potentially some years after he is married… is very odd.”

This is among discoveries that will feature in her book, Shadow King: The Life and Death of Henry VI, to be published by Head of Zeus next month. Johnson writes: “The reign of Henry VI is rightly remembered as a nadir in national history – and the king’s shadow fell across his realms for years after he was deposed. This was the age of England’s defeat in the hundred years war and the Wars of the Roses. Conflict was to be Henry’s principal legacy. Yet Henry himself, the child king who became a martyred holy man, was no tyrant. He loved peace before war. He treated his wife and child with affectionate respect.”

Henry VI’s enemies smeared him as weak for taking so long to produce an heir, and spread rumours that the couple’s eventual only child, Edward, was a changeling or bastard. Johnson’s research has also led her to believe that part of the reason Margaret and Henry took so long to conceive was that the queen had an eating disorder.

She points to a 1467 document that records Margaret “fasting four or five times a week” during her marriage and enduring weak health – “ironically, probably to fulfil religious vows in the hope of getting pregnant,” Johnson suggests.

She writes that lack of an heir led to concern over the succession: “As the first duty of a queen was to bear children, this had a serious impact on her popularity. Infertility was usually blamed on women, but complaints about royal sterility undermined Henry’s masculinity and his authority.”

(Source: The Guardian)

Friday, 22 February 2019

Lynne Tillman and the illusion of realism

Realism disturbs me.
For indeed fiction, if realistic, is a manufactured veil through which we train our gaze in order to obtain a pattern that organizes dots and squiggles into something legible, “an image of a pork chop which looks exactly like a pork chop,” as Terry Eagleton writes in the London Review of Books. Realism is paradoxical: a lie that reads true. We take two pet rocks, name one “Reality,” the other “My (Mimetic) Attempts to Write About It,” and smash them enthusiastically together. What survives is combed into a neat pile, carefully labeled, set out as a sort of snack.

Mimesis is imitation, and when Aristotle talks about it in his Poetics, he means for it to do one thing: Imitation isn’t a faculty poets deploy to represent the world solely for the sake of skillfully representing the world. Imitation is deployed with the specific aim of inspiring recognition—of evoking, in a somewhat distant audience, a feeling of pity. (Aristotle: “Thus the reason why men enjoy seeing a likeness is, that in contemplating it they find themselves learning or inferring, and saying perhaps, ‘Ah, that is he.’ ”) We are brought to tears when someone on stage pokes out his eyes; safe in our chairs, we’ve confused him with ourselves. We’re deceived, yet in awe. Perhaps we resolve not to kill or have sex with our parents (or, failing this, not to get married—regarding which topic, more later).

Ideas about imitating reality have spiraled up through Western civilization with different, though perhaps related, political ends. The realists of nineteenth-century France weren’t exactly Aristotelian in their outlook, but they definitely had ambitions re: mimesis. They wanted to understand the structure of society and, along with the Russians, took great pains to offer precise depictions of things and persons. Balzac may be the paradigmatic example, but I find myself unable to stop thinking about a certain bottle of oil to which a feather has become affixed in a scene in Madame Bovary: “In the corner behind the door, shining hobnailed shoes stood in a row under the slab of the washstand, near a bottle of oil with a feather stuck in its mouth.” (This old translation by Ferdinand Brunetière and Robert Arnot is interesting for the way in which it names old-fashioned things, e.g., hobnails. More recent translations tend to replace outmoded words with more familiar, if less specific, ones.) It’s less the elaboration of a world or a social system that fascinates me here than the skill in representing an item that seems purposeless, if classed. I do occasionally cling to this kind of seemingly pointless vivid materiality in prose. It produces not recognition, foremost—though that, too—but surprise. It makes me think for a moment, pace Aristotle, that it might be possible to have a world without psychology, maybe even, pace Hugo, without fate. (In The Hunchback of Notre Dame, the Greek term ananke, meaning “fate,” is, bizarrely, carved on the side of the cathedral. There seems to be no reason for this, other than that Hugo wanted to imply that fate is an indelible feature of human history. As you see, I find him to be an extremely annoying writer.)

But, of course, we don’t have that world, though Herman Melville’s head is famously turned by the enumerative ecstasy of whale facts. We have a pretty different world, despite materialist trends in certain nineteenth-century novels—and despite their resistance to wanton psychologizing. Although the behemoth twentieth-century psychological realist John Updike may have worshiped at the altar of Flaubert’s scrupulous style, he seems to have taken le réalisme’s lesson only in part, ever subordinating acts of description to the fluid angsts of his American subjects.


Lynne Tillman is a novelist who seems to me to have thought a lot about the above—and in a uniquely deliberate way. In certain of her stories, there is a character named Madame Realism who goes around living a fairly normal New York City life and who is always contemplating art and illusion everywhere she goes because, well, art and illusion are everywhere in the late twentieth and early twenty-first centuries, and in Manhattan, in particular. Madame Realism does not shrink from the scene. In the story “Madame Realism’s Imitation of Life,” spotted by fans in a women’s restroom, Madame Realism overhears one say, “I think that is Madame Realism, but do you think a fictional statement can ever be true?” Paradox abounds—for, in reading the story, one has flattered oneself that one is engaged in an intimate experience with a veiled version of Lynne Tillman, with Tillman’s very thoughts. Yet this is because of the presence of a character, a confection in the close third. Thus, there is Tillman IFF (“if and only if ”), the wry disguise of Madame Realism, at least for the purposes of this story, which in fact reads less like a story than a work of art criticism, which would seem to be part of the point. Normally, I suppose, we’d have the entailment the other way around: character IFF author. But Madame Realism doesn’t work that way. She is not here to imitate reality; she’s here to explain to us how the related fictional affordances of narrative and point of view function. That’s how real she is. (The reader is also advised that Madame Realism is playfully distinct from “Sir Realism,” a.k.a. surrealism, that twentieth-century movement in the visual arts and poetry famous for its modernist mystification of femininity.)

No matter how many paradoxes, neat rock arrangements, and feathers stuck to bottles of oil I pile into this essay, none of these phantasmal objects comes close to the unreal, gonzo vividness of Tillman’s 2006 novel, her fifth, American Genius, A Comedy. At its most insanely, maddeningly banal and delightfully paratactic moments—the novel takes place, after all, in a vaguely defined asylum, artist residency, or spa, where the style of one’s breakfast eggs and memories of deceased childhood pets become major concerns—it remains, maddeningly and delightfully, a story about the impossibility of escaping illusion, even when one is doing almost nothing.

American Genius, A Comedy is also about the extremity of Americans. It’s about the violent movement westward, which seems, in the mind of the novel’s narrator, to culminate in the Manson slayings, along with the present-day inability to pardon the Manson Family member Leslie Van Houten, who participated in the killings of Leno and Rosemary LaBianca in her last year as a teenager and who was sentenced to death in 1971. Hannah Arendt once said that she was glad that Eichmann had been hanged, because the Israelis had “pushed the thing to its only logical conclusion.” Arendt felt that so-called justice can’t have it both ways; if someone cannot be forgiven, then, well, they cannot be forgiven, and it is another form of violence to leave the charge unmet. Though this line of reasoning seems a bit neat to me, the narrator’s obsession with Van Houten, who repeatedly returns to her thoughts throughout the novel, is related. Van Houten was at one time the youngest woman condemned to die in California; a special death row had to be constructed for her, as no women’s section existed. However, the invalidation of pre-1972 death sentences in 1972’s People v. Anderson (now overruled) meant that her sentence was commuted to life in prison. Though the verdict in a second retrial stressed her eligibility for parole, and though other members of the Manson Family were successful in parole requests, Van Houten’s applications for parole in California have been, as of June 2018, repeatedly denied. Protected, in theory, by her whiteness and physical beauty, like the charismatic Manson himself, Van Houten lives out her days in prison, unforgivable if ambiguously responsible for her crime, given her age and mental state at the time of its commission, as well as her gender, this last point being a qualification that must remain unspoken, as it at once exonerates her and leaves her open to endless fantasies of blame that are beyond the scope of the law, at least on paper, to name or know.

The narrator of American Genius, A Comedy, in limbo in her institutional retreat, latches on to this other, discursive limbo, a blank in which America refuses to know itself—as it seems relevant, if ambiguously, to her own identity. While it is probably, again, too neat to say that she, like Van Houten, is doing time, it’s part of the interest of the book that it doesn’t shy away from these sorts of bad analogies. It’s American, in this respect. And this narrator is a former historian, which may contribute to her reluctance to participate in storylines unfolding in present-day reality, so-called. While she seems to allow that the present, as a distinct moment, exists, she seems none too sure that it is more than a mushy amalgam of past temporalities—the history of chair design, for example—and timeless inevitabilities—the much-touted sensitivity of her own skin—lacking any true newness or uniqueness worth, as it were, writing home about.

But our retreating narrator, though withdrawn, is not alone, and this makes all the difference to the form and tenor of her refusal of plot in the present. As in Nathaniel Hawthorne’s The Blithedale Romance and Thomas Mann’s The Magic Mountain, others (“residents”) are interned, for a brief eternity, alongside Tillman’s genius/protagonist. The hope that utopia is to be found in retreat is held out. As readers, we occasionally let ourselves think of it. Indeed, it’s here that the question of the relationship between the novel and so-called realism comes most strongly into play. “Realism,” Terry Eagleton writes in the aforementioned LRB essay, “is calculated contingency.” In other words, realism can be a style of belief in the existence of others—since you need somebody, or somebodies, to whom things are represented, and asylums, artist residencies, and spas are famous for their captive audiences.

In the first chapter of The Blithedale Romance, a fictionalized account of the Brook Farm commune (1841–47), Hawthorne worries about something he calls “the privileges of privacy.” (The narrator is speaking here: “ ‘Zenobia, by the bye, as I suppose you know, is merely her public name; a sort of mask in which she comes before the world, retaining all the privileges of privacy,—a contrivance, in short, like the white drapery of the Veiled Lady, only a little more transparent.’ ”) Hawthorne is a lover of gothic euphemism, not a realist writer, and his cloaked concepts often assume an intimacy between reader and narrator that feels forced, at least to me. So I’m not entirely sure what he means by this phrase, but his tale of intentional community is full of references to privacy, both literal and metaphorical: veils, secrets, false names, confused identity, performative utterances. There’s dissimulation and distancing—and also a fair amount of discussion of the role of women in American society, women who seem to be the origin of all social illusion, at least as far as Hawthorne is concerned. If American Genius, A Comedy (not a romance) is in some sense a rewriting of Hawthorne’s 1852 narrative, by 2006 the commune has become an institution and the narrator a woman (a “nineteenth-century woman in trousers,” as one character has it)—yet the privilege of privacy remains, along with an affection for unusual monikers (we meet the Count, Contesa, and Spike, et al., not their real names). 

Our narrator and her acquaintances take advantage of their middle-class privilege, in its collective form, to stage a hilariously god-awful dramatization of Kafka’s letters, as well as, in a seriocomic citation of the nineteenth century and its prized illusions, a séance or “ghost theater.” The narrator observes the workings of her own mind during the latter performance, as her tendency to compose speculative lists and bounce from topic to topic—from familial concerns to American history and back again—is overtaken by something more enigmatic and difficult to reconcile: a chilling realization of the possibility of the absence of thought as thought; the nullification of sentience as sentience.

I can’t halt these alien sensations. I place my hands over my eyes and press hard, scrunching my eyes closed again, so that their veins radiate bloody patterns, garishly colored shapes, pale ashes, the papers I burned this afternoon maybe, everything recognizable is ablaze, like my family’s Eames chairs. I can’t hold on to an image, so I tell myself, in a stately manner, Mark this now, fire burns complacent things, and in a flash it occurs to me why I take things apart, and I want to remember the reason but can’t. Another gust of arctic air makes me shiver, there’s nothing to think about, I open my eyes, it’s all gone, I shut them again.

Though what returns in the séance’s transformative and macabre course “with its bizarre seductions” is simple—the fact of death—the effects of this unbelievable fact on those assembled are richly varied, alarming, and enlightening. The narrator has a vision of her deceased father in his distinctive dark brown swimming trunks; others rant about sex addiction and betrayal, the shape of fate, the qualities of evil (“Let me say this about the devil: He exists,” maintains one transfixed party). All in all, it’s quite an event, as well as quite a convincing portrayal of what routinely goes unsaid, even or especially in privileged private. I think, too, that this has to be one of the great scenes of recent American fiction, on par with the unveiling of the P.G.O.A.T. in Infinite Jest, for example, speaking of metaphorically charged drapery. In it, we catch a glimpse of the structure of the contemporary social world, as well as the limitations of realist description. Because you can’t mimetically describe something that is simultaneously there and not there, which is to say, you can’t describe something unspeakable.


As I was beginning to write this essay, wanting to be thorough and a reasonably good historian, I traveled to the Fales Library & Special Collections at New York University, where Tillman’s papers are kept, and went through all the manuscript drafts of American Genius, A Comedy. Because of this adventure of my own into seclusion, I happen to know that the novel had multiple working titles, including American Skin, and that the narrator originally spoke on the first page about writing a novel. (The first sentence of the draft reads: “The food here is bad, but every day there is something I can eat and even like, and there’s a bathtub, which I don’t have at home, so I can have a bath every day if I can get from my studio, where I’m supposed to be writing a novel, to my room, before dinner, which is at 6:30pm.”) That novel, that fictional novel, has since been removed. It’s been replaced, I suppose, by our narrator’s oddball histories and catechisms, and by visits to an aesthetician who palpates her face, producing emotion. The clarity of that early title and that fake novel has been smudged out, artfully distorted. However, far from ruining the book for me, knowledge of these initial scaffolds deepens its mystery. It’s not that the novel is just better without these tropes; it’s that the novel is about the fact that such tropes are illusory. A certain truism about the reality of novels (i.e., that in their obvious artificiality or autobiography, they presuppose a world in which fact and fiction are stable, easily distinguished categories) is missing here, can’t be reclaimed. This is not a semiautobiographical novel about a novelist, written by a novelist—what we now call autofiction—nor is it purely a work of invention. It’s something else.

Tillman types, in her draft notes, that “the worst thing is that it’s not over yet—everything’s not safe yet—forest fire—desire to be safe—post 9/11.” She also quotes Freud: “One cannot overcome an enemy who is absent or not within range.” I think I’m starting to understand, more and more, what Tillman is getting at, how she is attempting to capture the complex narratological formats of her time, the interrelated and rather too-real chimeras of news, politics, and history. As you may have heard, Aristotle’s chapter on comedy has been lost; it’s mentioned in the Poetics, but no longer extant. I gesture to this fact from ancient literary history because I’d like to be able to say something definitive about the style of recognition American Genius, A Comedy sets up, being a comedy and all—what sort of mimesis Tillman is after, whether it makes sense to say she is an illusionist, an antirealist writer who has moonlighted as a fictional art writer whose last name is (funnily enough!) Realism; who, being fictional, doesn’t like to be recognized wandering at large in reality by her fans; who may be an anachronism, too, a nineteenth-century character in pants prone to fainting spells; who likes wild cats and also dogs, and also chairs, and so on; who may have put her hero, a modern woman, if of a vaguely Victorian stripe, in the awkward position of having to exist inside a postmodern novel. It may be, too, that Tillman is at once ahead of her time and living concertedly in it. She once wrote something similar of Andy Warhol, and I think that, as also for Warhol, one of her ways of being in and with her own time is to describe an imaginary future that infuses all the presents and the pasts enumerated in her fiction. 

(In this remark, from The Velvet Years, 1965–67: Warhol’s Factory, I’m inspired by Tillman’s description of Warhol’s relationship to time, both historical and not: “One of the mandates of the avant-garde, which Warhol broke from, was to be ahead of one’s time and to know in what way one was. Shifting into the postmodern, one is pressed to learn how to think, live, work, breathe the present—even if it’s inescapable, like inhaling an unrecuperable past. It’s harder to live in and think the present than be ahead of it; there’s no exit. It’s no aesthetic failing to be in time, with it. The imaginary future is always there and not there, to envision or make up, to wonder and worry about, to live into and even for.”) However, in spite of these chrononautical insights, fun though they may be, the only definition related to genre and imitation I seem to be able to muster—a deficiency entirely my own—is rather generic: At the end of a comedy, people are supposed to get married.

This (marriage) is no laughing matter, nor does Tillman’s comedy deal much in that sort of contract and/or denouement, except to note that American women are unfortunate, in that they often marry for love. Rather, American Genius, A Comedy, a sort of hypertext of recollection and ingenious displacement, a sort of postmodern nineteenth-century novel, ends on a Tuesday, with a facial.

(Source: The Paris Review)

Thursday, 21 February 2019

Journalism isn't dying, it's returning to its roots

The past few weeks have brought bad news to the hardworking scribes of the news business. Three leading digital outlets—BuzzFeed, the Huffington Post, and Vice—announced layoffs that left many accomplished journalists unemployed. The fingers of blame quickly pointed to the great bogeymen of our media age—Facebook and Google—and warned about a threat to democracy. After all, if the most savvy and avant-garde of the new digital journalists can’t make a living, what hope is there for old-school newspapers? To many, the health of our democracy is inextricably tied to the health of our journalism: If the latter begins to die, the former must immediately follow.

That’s a curious sentiment, because if you were to magically teleport the architects of our democracy—men like Ben Franklin or Samuel Adams (newspapermen, both of them)—to today, they’d find our journalistic ecosystem, with its fact-checked both-sides-ism and claims to “objectivity,” completely unrecognizable. Franklin wrote under at least a dozen pseudonyms, including such gems as Silence Dogood and Alice Addertongue, and pioneered the placement of advertising next to content. Adams (aka Vindex the Avenger, Philo Patriae, et al.) was editor of the rabidly anti-British Boston Gazette and also helped organize the Boston Tea Party, when activists dumped tea into Boston Harbor rather than pay tax on it. Adams duly covered the big event the next day with absolute aplomb. They’d have no notion of journalistic “objectivity,” and would find the entire undertaking futile (and likely unprofitable, but more on that soon).

If, however, you explained Twitter, the blogosphere, and newsy partisan outlets like Daily Kos or National Review to the Founding Fathers, they’d recognize them instantly. A resurrected Franklin wouldn’t have a news job inside The Washington Post; he’d have an anonymous Twitter account with a huge following that he’d use to routinely troll political opponents, or a partisan vehicle built around himself like Ben Shapiro’s Daily Wire, or an occasional columnist gig at a less partisan outlet like Politico, or a popular podcast where he’d shoot the political breeze with other Sons of Liberty, à la Chapo Trap House or Pod Save America. “Journalism dying, you say?” Ben Franklin v 2.0 might say. “It’s absolutely blooming, as it was in my day.”

What is dying, perhaps, is that flavor of “objective” journalism that purports to record an unbiased account of world events. We take journalistic objectivity to be as natural and immutable as the stars, but it’s a relatively short-lived artifact of 20th-century America. Even now it’s foreign to Europeans—cities such as London cultivate a rowdy passel of partisan scribblers who don’t even pretend there’s an impregnable wall between reportage and opinion. The US was much the same until the late 19th and early 20th century. Until 1900 or so, most newspapers were overtly political, and a name like The Press Democrat meant Democrat with a big D. Advertising was a minor concern, as party leaders encouraged members to subscribe to their local party organ, obviating the need for anything more than classifieds.

A National Market for Ads
The bigger switch happened as a national market for consumer goods opened after the Civil War, when purveyors like department stores wanted to reach large urban audiences. Newspapers responded by increasing the number of ads relative to content, and switched to models that went light on the political partisanship in the interest of expanding circulation. This move was driven not exclusively by lofty ideals but also by mercenary greed. And it worked. Newspapers used to make lots of money. Mountains of money. As late as the 1980s and ’90s, many papers had margins exceeding 30 percent, greater than Google’s margins now. Media might now be a sick man, but it wasn’t always so, and needn’t be so.

Jill Abramson, former executive editor of The New York Times, offers a peek into this collision between the legacy grandeur (and profitability) of journalism and the current zeitgeist in her memoir Merchants of Truth (which faces claims of errors and plagiarism). In one scene, the Times’ CEO asks Abramson to cook up new revenue ideas, to which she indignantly responds, “If that’s what you expect, you have the wrong executive editor.” Our reborn Founding Father journalist would find this disconnect between editorial and business absolutely inconceivable. Franklin knew very well on what side his journalistic toast was buttered, and would have leapt at any new monetization ideas.

Abramson also displays her old-guard credentials in her attitude toward her younger colleagues. She chides journalists at outlets like Vice and BuzzFeed for overtly taking partisan sides in their public Twitter personas, diminishing the decorum of supposedly disinterested journalists.

Well … so what?

As Abramson concedes, Trump has been a boon for digital subscriptions at outlets like the Times and The Washington Post. Last week, the Times reported a record $708 million in digital revenue for 2018, helped by a 27 percent jump in subscriptions. It’s heartwarming to think the American public rallied to support abstract principles like the free press by subscribing to the Times. In reality, they forked over their hard-earned money because they wanted to see a highly unpopular president roasted endlessly, and they got what they wanted.

Let’s face it: We live in a Rashomon reality in which every event is instantly captured from a dozen angles and given at least as many interpretations, whether it’s a Supreme Court confirmation hearing or a video of Catholic school kids at a march. The thought that one media outfit will produce what’s taken as God’s gospel truth, under the demands of today’s light-speed media cycle and subject to the vigilante fact-checking of Twitter, seems a bit quaint. By now the savvy media consumer knows to wait 24 hours before drawing any conclusions about a scoop, and to cross-check at least a handful of sources and two dozen Twitter accounts for takes across the political spectrum. “Objectivity” is an atavism from the days of studiously inoffensive, circulation-expanding reportage lavishly supported by unquestioning advertiser budgets. That’s all gone now. And it’s not clear that this studious “objectivity” ever more closely approximated the truth. Iraq and the WMDs? Madame President? Those were headlines produced under rigorously “objective” (and wrong) coverage, while those who got it right—and there were some—spoke from less regimented perches.

Journalists pining for a return to their golden age of advertising-supported journalism are disturbingly similar to aged Midwestern factory workers seeking a return to the time when high-school-educated labor could afford middle-class lives with total job security. Both golden ages resulted from a unique set of economic and political circumstances that are now gone and impossible to reproduce. Those who claim democracy requires the precise flavor of journalism we’ve known for a century or so will have to explain how our republic survived the century preceding.

While the tone of journalism might be headed back to the 19th century, clearly the business models are not. Revenue-wise, the Great 21st Century Journalism Shakeout will likely end with smaller organizations inventing new business models that those villains—the internet and social media—enabled. Technology outlets such as TechCrunch and Recode pioneered expensive (and expensable) conferences. Gimlet Media, just acquired by Spotify for a reported $200 million, produces high-quality journalistic podcasts, pitching them as shows to Netflix and Hollywood, while selling ads. Gear review sites like Wirecutter (which The New York Times acquired in 2016) make substantial revenue via affiliate marketing, taking a cut of sales they drive on ecommerce sites. Books, those antique vestiges of a preinternet age, still command large advances, and audiobook sales at most publishers are growing at a healthy pace. (As a personal anecdote, I have five times the number of reviews on Audible as I do on Amazon: I’m theoretically an author, but I have more listeners than I do readers.)

For larger, especially national, organizations, the money machine will be a portfolio of all of the above, and probably others. (The solutions for local journalism are less obvious, as services like NextDoor or Facebook Groups threaten local journalism’s claim on the neighborhood scuttlebutt.) The luckiest will be kept alive by wealthy largesse, ironically much of it from the technology world—Laurene Powell Jobs at The Atlantic or Jeff Bezos at The Washington Post. Neither democracy nor journalism will die. In fact, I suspect we’re about to have way more of both than we’ve had in a while. The path to the next golden age in American journalism isn’t nostalgia for a vanishing past but the same route that led to the previous golden age: profit. More than likely, given the new business models, this will mean some partiality from journalism as well. That’s just fine too. It’s what Ben Franklin would have done.

(Source: Wired)

Cacao Hunters knows where the best chocolate is: Colombia

“Don’t chew the chocolate,” instructs chocolate maker Mayumi Ogata. “First sniff the surface, where most of the aromatics are concentrated. Then take a small bite and let it melt over your tongue.”

I’m at Bricolage Bread & Co. in Tokyo’s Roppongi district, where Ogata is delivering a presentation about cacao — the pod-shaped fruit whose fermented beans are used to produce the main ingredient in chocolate — and Cacao Hunters Japan, the brand she helped to develop in Colombia. There, the company partners with various indigenous communities to source heirloom varieties of cacao for the Cacao Hunters line of premium chocolates.

“Different species of cacao have different characteristics. Some have more bitterness, others more sweetness. The roasted aromas, nutty notes and lime-like acidity are the flavors of fermentation,” she explains.

Getting to the good stuff: Cacao producers in Colombia pry open a cacao pod with a knife. | COURTESY OF CACAO HUNTERS

On the plate in front of me are three squares of chocolate, each containing a different percentage of cacao. I pick up a piece of 82-percent dark chocolate made with cacao from the Tumaco region of Colombia and follow Ogata’s directions. The aroma is high-toned and fruity; there’s a citrusy acidity on the palate and a touch of Earl Grey tea, with bitterness and sweetness balancing out the finish. It’s my first taste of chocolate from Tumaco, and I am impressed by its depth and range of flavors.

Although Côte d’Ivoire and Ghana are the world’s largest producers of cacao by volume, cacao trees originated in the Amazon Basin. Artifacts found in Ecuador suggest that indigenous people began cultivating the crop over 5,000 years ago. In recent years, Latin American countries such as Peru, Brazil and Ecuador have emerged as key players in the specialty chocolate market, but Colombia has remained relatively unknown.

“Colombia has suffered from ongoing conflict for many years. Warring between paramilitary forces, guerillas and drug cartels has prevented rural areas from thriving,” says Cacao Hunters co-founder Carlos Ignacio Velasco. “Our goal is to transform the cacao industry here and empower these communities.”

Velasco grew up in Popayan, a picturesque colonial town in southwestern Colombia. After graduating from university, he began working with the Colombian Coffee Growers Federation. The job brought him to Tokyo, where he lived for four years, opening his eyes to the potential of high-end agricultural products for rural economies. Products such as coffee, he explains, are usually traded at low prices as commodities; turning them into specialty items adds value and boosts income for growers.

Eventually, he hit upon the idea of working with cacao. Native species of cacao trees abound in the jungles of Colombia’s Pacific coast. However, most of the cacao harvest “goes to the domestic, low-quality market.”

For the past seven years, Velasco has been working with small-scale farmers — around 1,500 families descended from African slaves in Tumaco, and approximately 200 families from the Arhuacos tribes in the Sierra Nevada region. He helps to select heirloom varieties and set up post-harvesting infrastructure, providing vital equipment such as fermentation tanks and drying stations, in order to improve the quality of the cacao beans and resulting chocolate.

Heirloom chocolate: Although Peru, Brazil and Ecuador have emerged as key players in the specialty chocolate market, Colombia remains relatively unknown. | COURTESY OF CACAO HUNTERS

“The most important element of what we do is provide the farmers with a stable and high price for the product. Then we can create the right incentives for them to keep cultivating cacao as an alternative to coca, which contributes to the development of the country as a whole,” Velasco says. According to an impact assessment conducted by U.S.-based investment firm Acumen, the project has led to an increase in net income of 58 to 75 percent for producers.

Through contacts in Tokyo, Velasco met Mayumi Ogata, who had been working as a consultant for a major chocolate company. He invited her to Sierra Nevada to meet with the Arhuacos, and she soon became a partner in the venture.

“When I first visited Colombia, I was blown away by the biodiversity of the cacao, but the (production and distribution) systems were not organized at all. I felt that there was so much work for us to do. I thought that if we could just make certain adjustments, we would be able to make great chocolate, at such a high level,” Ogata recalls. “I see a lot of potential for Colombian chocolate going forward.”

In 2013, the two set up a factory in a corner of Velasco’s mother’s pastry shop in Popayan. Velasco made chocolate in tiny batches and Ogata brought suitcases filled with samples to Japan. In 2015, Cacao Hunters received its first gold medal at the prestigious International Chocolate Awards, and the company has expanded operations, opening a new facility to meet demand.

The Cacao Hunters brand has since caught the attention of top chefs and patissiers such as Jordi Roca, of Spain’s El Celler de Can Roca, who visited farmers in Sierra Nevada last year. In Tokyo, Shinobu Namae uses the chocolate in recipes at Bricolage and his Michelin-starred restaurant, L’Effervescence.

“Before, I was using chocolate from big international companies and thought of it as a product without uniqueness. Now I see there is so much more, and this has changed the way I use it,” Namae says.

Working with distinctive ingredients, he explains, requires more creativity and experimentation. The chocolate croissant at Bricolage, for example, is made with Cacao Hunters’ 82-percent chocolate, which necessitates a slightly saltier pastry dough to “highlight the fruity flavor of the chocolate.” The result? Delicious.

(Source: JT)