I can only sketch here the many other things that trouble me about Gatsby and its place in our culture. There is the convoluted moral logic, simultaneously Romantic and Machiavellian, by which the most epically crooked character in the book is the one we are commanded to admire. There’s the command itself: the controlling need to tell us what to think, both in and about the book. There’s the blanket embrace of that great American delusion by which wealth, poverty, and class itself stem from private virtue and vice. There’s Fitzgerald’s unthinking commitment to a gender order so archaic as to be Premodern: corrupt woman occasioning the fall of man. There is, relatedly, the travesty of his female characters—single parenthesis every one, thoughtless and thin.
I always enjoy a good rant, however wrongheaded it is.
In the course of reviewing Aldo Schiavone’s Spartacus (just published in English translation by Harvard), Mary Beard explains just how little we actually know about the gladiator-leader of a servile revolt.
In the entrance hall of a fairly ordinary house in ancient Pompeii, buried beneath layers of later paint, are the faint traces of an intriguing sketch of two men fighting on horseback. They are named in captions above their heads, written in Oscan—one of the early languages of South Italy that was eventually wiped out by the Latin of the Romans. The name of one is scarcely legible, but probably says “Felix the Pompeian” (or “Lucky from Pompeii”). The other reads clearly, in Oscan, “Spartaks,” which in Latin would be “Spartacus”—a name best known to us from the slave and gladiator who in the late 70s BC led a rebellion that, it is said, very nearly managed to defeat the power of Rome itself.
At first sight, the scene painted on the wall looks like a military battle. But the trumpeters on either side of this pair of fighters match those often found next to gladiators in ancient paintings. So this is probably meant to depict mounted gladiatorial combat. The men must be the equites, or “horsemen,” who sometimes appeared in those bloody Roman spectacles, alongside the more familiar, heavily armed characters who fought on foot.
It is, of course, possible that the painting has nothing to do with the famous Spartacus, and that it refers to some other gladiator who just happened to have the same name; that is certainly what some skeptics argue. But there are nevertheless good reasons for linking the painting to the famous rebel: it very likely dates to the lifetime of “our” Spartacus, in the early years of the first century BC (as both the archaeological setting and the use of the Oscan language suggest); and Pompeii was, in any case, less than forty miles from Capua, where Spartacus underwent training for combat and from where he is said to have launched his rebellion—the two towns were presumably on the same gladiatorial circuit. There is a fair chance that this image gives us a glimpse of the future enemy of Rome when he was still just an ordinary gladiator—and to judge from the picture, not a totally successful one. For “Felix the Pompeian” is certainly getting the better of the retreating Spartaks. In fact, we might guess that it was to celebrate the victory of the local man that the Pompeian householder put up this image in his front hall.
It has been more than 30 years since James Salter, whom I consider a quite interesting writer of the second rank, published his last novel. The publication of his new book has provoked the frequent observation that Salter is really a much better writer, producing more substantive and thematically serious novels, than certain better-known establishment writers of fiction, yet his work has, for five decades now, somehow mysteriously escaped wide attention.
The new novel is certainly not a masterpiece, but it is a very satisfactory read. It reminded me of John Marquand. The protagonist, Philip Bowman, serves as a young naval officer in WWII, attends Harvard, and then becomes an editor in a New York publishing house. He marries a young woman out of the same Virginia equestrian circles I currently frequent, and Salter delivers a quite accurate portrait of the Virginia Hunt Country and its unique ethos. The marriage fails for reasons that are not entirely clear; the problem is apparently simply that Bowman takes her away from Old Virginia and moves her to New York, where she is obliged to live without horses and hunting and her family and home society. Bowman goes on without excessive perturbation to have other relationships and affairs. He meets a woman returning on a flight from Europe. They share a taxi, and ultimately live together. But she, too, leaves him, and by means of legal chicanery opportunistically gains ownership of the house he purchased for their use. A good while later, he runs into his former lover’s daughter, whom he had known when she was a child. He is friendly, dismissive of the wrong her mother did him, and he proceeds to take advantage of the opportunities which present themselves to make love to her. He persuades her to let him take her on a trip to Paris, where he shares with her his sophisticated knowledge of the city, its restaurants, and Picasso’s paintings of Marie-Thérèse Walter, and skillfully makes love to her. When he has finally succeeded in bringing her to the peak of erotic fulfillment, he calculates that she will, before very long, come to her senses about the enormous gap between their ages and the unsuitability of their relationship. He then simply walks out, paying the hotel bill and leaving her penniless in Paris, knowing full well that she will have to seek the assistance of her mother. The reader is likely to think Philip Bowman cruel.
Like Marquand, Salter writes basically about what he knows, and tends to present fictional versions of himself: portraits of the gentleman of accomplishment, the cynical and astute observer of society and humanity, and the homme moyen sensuel, the connoisseur of sexual relationships and the female body.
“All That Is” is a novel about love, but Salter’s view of love is appreciative yet unsentimental. Philip Bowman is grateful for the female companionship that life brings his way, but he is not wildly optimistic about his own motives and capacity for enduring affection or those of any of his successive string of partners. For Salter, love is always, as the title of an earlier novel put it, A Sport and a Pastime.
There was this count, and his wife said to him one day that their son was growing up and wasn’t it time he learned about the birds and the bees? All right, the count said, so he took him for a walk. They went down to a stream and stood on a bridge looking down at peasant girls washing clothes. The count said, your mother wants me to talk to you about the birds and the bees, what they do. Yes, father, the son said. Well, you see the girls down there? Yes, father. You remember a few days ago when we came here, what we did with them? Yes, father. Well, that’s what the birds and the bees do.
Matt Kahn is undertaking an interesting challenge. He intends to read his way through a century’s worth of Publishers Weekly’s annual bestsellers, which means that he has to read (and review) 94 individual titles, since a small number of books succeeded in capturing the top spot for two years running.
This sort of enterprise will doubtless be at times laborious, but it definitely will have its rewards. When he’s done, Mr. Kahn will be a wiser man with a much better understanding of the ways the consciousness of the American reading public has changed and has not changed.
Kahn started off in 1913 with a real forgotten clunker, Winston Churchill’s The Inside of the Cup (the American novelist, not the British politician), a dated and tendentious screed attempting to prove novelistically that Christianity and Progressive politics are the same thing. (Yes, Virginia, popular culture was rife with bolshevism and anti-business agitprop, even way back then.)
The Publishers Weekly annual bestseller list turns out to be a bit odd. Hardly any canonical classics get into it (though some by Sinclair Lewis do). It seems to rise from the primitive turn-of-the-century stuff to virtuous middle-brow “important books” interspersed with big pulp, and then—with the 1960s—becomes quite erratic.
Oddly enough, it is perfectly evident that even the most purple examples of forgotten teens and twenties tripe will not constitute the roughest patches of Mr. Kahn’s literary road. When he gets to the 1990s (Gawd help him!), it’s going to be John Grisham and Dan Brown all the way, ending with a bang at Fifty Shades of Grey.
Everyone knows that the code-hero career of Ernest Hemingway ended with the great man putting a shotgun to his own forehead, after years of infidelity to a series of wives, disgraceful episodes of bullying, and embarrassing displays of drunkenness and vanity. By the time Hemingway pulled the trigger on his 12-gauge Boss, it was all gone for him: the powerful athletic physique and once superlative health, the unsurpassed ability to produce clear and elegant English prose, even the penetrating insight and cool lucidity underlying his impeccably stoical point of view.
He had destroyed his talent himself. Why should he blame this woman because she kept him well? He had destroyed his talent by not using it, by betrayals of himself and what he believed in, by drinking so much that he blunted the edge of his perceptions, by laziness, by sloth, and by snobbery, by pride and by prejudice, by hook and by crook. ...What was his talent anyway? It was a talent all right but instead of using it, he had traded on it. It was never what he had done, but always what he could do. And he had chosen to make his living with something else instead of a pen or a pencil.
Paul Hendrickson takes Hemingway’s 38-foot Wheeler cabin cruiser, the Pilar, built for him in 1934, as the center and symbol of the final 27-year, 3-month trajectory of the author’s literary career and life, and chronicles Hemingway’s whole sad endgame: the struggle of the human being to live up to his own masterfully designed and brilliantly marketed personal myth, his failure, crack-up, and decline. Yet Hendrickson sympathizes, finding in Hemingway’s process of personal self-destruction still ever so much to pity and admire. As he puts it in the title of his prologue: “Amid So Much Ruin, Still the Beauty.”
Few great writers have ever received such an extraordinary tribute. Hemingway’s Boat represents the product of massive and intensely focused research. Hendrickson can lovingly describe the details of the room where Hemingway used to stay in the Ambos Mundos Hotel, as well as tell you exactly which models of Vom Hofe and Hardy saltwater reels he fished. Hendrickson even throws in some rather significant and ground-breaking criticism, arguing quite persuasively that it was Hemingway, in Green Hills of Africa (1935), who really invented the non-fiction novel, not Capote or Mailer thirty years later. Hemingway’s Boat is, in the final analysis, a passionate and deeply personal eulogy to a great man, delivered in finely crafted prose worthy of its own subject.
It has been a very long time since anyone has produced a fishing memoir as good as Luke Jennings’ Blood Knots.
Jennings, who is, I was surprised to discover, dance critic for the Observer, describes the (exotic to Americans) bildungsroman of an ordinary British angler, who starts off—like the rest of us—with cheap tackle and humble access to low-quality, near-home angling opportunities before gradually progressing to more exciting waters and nobler quarry.
In Jennings’ case, we get some astonishing accounts of just how much sporting excitement can be found in pike and carp, barbel, tench and rudd. Luke Jennings can make the encounter with a canal-bred pike lurking off a London tow-path read like Jim Corbett stalking a man-eater in the Himalayan foothills.
But Blood Knots is not only a fishing book. It is an account of the coming of age and moral education, in the modern world, of a surprisingly exotic survival: the recusant Catholic gentleman. Jennings’ family, as he puts it, was of “bookish gentry, each beggaring itself to pay for the education of the next… born of windy vicarages and dusty cantonments.”
His first powerful influence was his father, a Hussar officer awarded the Military Cross for pressing home an armored attack at Ijsselstein in September of 1944, despite two tanks being shot out from under him. The second, as the saying goes, “brewed up,” and Jennings’ father only lived because he was thrown out of the tank by the explosion. He was badly burned. The scars on his face remained highly visible, and Mrs. Jennings had to dress his burned fingers every day for the fifty years of their marriage.
Jennings attended the (Benedictine) Ampleforth College, and provides this testimony to its unmodern ethos.
Father Paul Neville, the former headmaster of Ampleforth, was once talking to a fellow principal who informed him expansively that his own establishment’s purpose was ‘to prepare boys for life.’ ‘Ah,’ said Father Paul quietly. ‘Ours is to prepare them for death.’
At Ampleforth, Jennings met his second major influence, a recent Ampleforth graduate named Robert Nairac, then serving as junior master.
It was important to know whom you were dealing with and so, on the first night of the autumn term, three of us cooked up an excuse to knock on the new master’s door. We trooped in to be greeted by a tough-looking figure with unkempt black hair and a cheerful grin. He was lying on his bed in his shirtsleeves, smoking. Around him, on the sheets, lay the constituent parts of a twelve-bore shotgun and a pair of cleaning rods. On top of the chest of drawers was a falconer’s leather gauntlet, the fingers dark with dried blood, and a battered fishing-bag in which I could see a jumble of wire traces and pike lures. With the small sash-window closed, the air was heavy with gun oil and Balkan tobacco.
Nairac proved a superb sporting mentor, immersing Jennings in “the rituals of the field sports” and “the near mystical sense of place and history that, on occasion, can accompany them.”
The same Robert Nairac, a few years later, became part of history. After Oxford, he joined the Grenadier Guards, and worked undercover against the Provisional IRA terrorists in Ireland. In May of 1977, while visiting a pub to gather intelligence, he was abducted, brutally tortured, and finally murdered by the IRA. His body was never found.
Blood Knots is the best kind of fishing memoir, the kind of book that demonstrates the necessary role of active participation in the processes of Nature in fulfilling essential needs in the cultivated human being’s spiritual life.
Suw Charman-Anderson, in Forbes, notes a watershed moment in the world of books and readers. For the first time, a book self-published by its author has broken through traditional barriers and gained the attention of important establishment book reviews.
In her New York Times review, Michiko Kakutani wrote that Alan Sepinwall’s self-published book analyzes a dozen “great millennial dramas” that have forged a new golden age in TV: bold, innovative shows that have pushed the boundaries of storytelling, mixed high and low culture, and demonstrated that the small screen could be an ideal medium for writers and directors eager to create complex, challenging narratives with “moral shades of gray.”
But the New York Times’ Michiko Kakutani wasn’t the only mainstream book critic to write about Sepinwall’s book. USA Today carried an interview with Sepinwall at the end of November, Time published a review of its own, The Huffington Post carried a review, and so did the New Yorker.
Sepinwall got the kind of coverage that most traditionally published authors can only dream of. To some extent, this might just be reviewers reviewing another reviewer, a little bit of moral support from your friends, except Sepinwall’s friends have very big megaphones. But at the same time, it illustrates that the idea of a division between ‘traditionally published’ and ‘self-published’ is becoming a ridiculous construct with no meaning whatsoever. ...
The reasons that self-published books don’t get reviewed boil down, I think, to the lack of infrastructure. A traditional publishing company can get to know different reviewers and send them the books that they think will go down best with that person. And the reviewer works on the assumption that what he or she is sent by the publisher has to be at least half-decent and thus worth opening. This whole process works because it’s mediated and because of the assumption that a third party stamp of approval for a book guarantees minimum levels of quality. ...
[R]eviewers depend on publishers acting as winnowers, sorting out the wheat from the chaff, and at least attempting to make sure that they are sent books they are actually interested in. It’s this weeding out process that’s missing in self-publishing.
This is bound to be only the first instance of what will, before very long, become the norm.
Technology has made self-publication and book distribution easy, inexpensive, and available to anyone.
Even successful and well-established popular authors like Barry Eisler have, as far back as 2011, found the economics and creative control offered by self-publishing irresistible. (Eisler was interviewed here about his at-the-time astonishing decision to dump his relatively prestigious print publisher and move off into the new frontier of electronic self-publication.)
What have you to recommend? I answer at once, Nothing. The whole current of thought and feeling, the whole stream of human affairs, is setting with irresistible force in that direction. The old ways of living, many of which were just as bad in their time as any of our devices are in ours, are breaking down all over Europe, and are floating this way and that like haycocks in a flood. Nor do I see why any wise man should expend much thought or trouble on trying to save their wrecks. The waters are out and no human force can turn them back, but I do not see why as we go with the stream we need sing Hallelujah to the river god.
—FitzJames Stephen, Liberty, Equality, Fraternity, 1873.
61 years ago, the young William F. Buckley Jr. launched what would become a splendiferous career as celebrity commentator and public intellectual by publishing, not long after his graduation from Yale, a scathing critique of his alma mater titled God and Man at Yale.
God and Man at Yale represented Buckley’s first major effort at “standing athwart history yelling ‘Stop!,’” and we may now read with a certain poignancy the report of Nathan Harden, Sex and God at Yale, compiled at a posting station considerably farther along the road to Hell in a handbasket, demonstrating just how little either History or Yale was listening.
The youthful naysayer of 1951, Buckley, was a classic version of the privileged insider. Buckley was rich, handsome, and stylish, educated at elite preparatory schools in Britain and the United States. At Yale, he was the kind of celebrity undergraduate BMOC that basically ceased to exist after coeducation: Captain of the Debating Team, Chairman of the Daily News, and—of course—member of Skull and Bones.
The contrast between Buckley and Harden could scarcely be more extreme. Nathan Harden was home-schooled, knows what manual labor is like, and grew up in a family that was short of cash, living all over the Southern United States. Harden was turned down by Yale initially, attended one of the Claremont Colleges, then got into a one-term visiting student program at Yale, tried transferring and was turned down again, and finally re-applied and was accepted. He was 22 years old and already married by the time he started college in California, so he must have been 24 (and still married) by the time he finally got to Yale as a degree candidate. Harden did his junior year abroad in Rome and, though he speaks with some familiarity of Political Union debates, he clearly never became any kind of BMOC and obviously did not get into Bones.
Nathan Harden came to Yale with the ability to appreciate the richness of her centuries of history and tradition. He speaks openly of the intense pleasure to be found in exploring Yale’s incomparably rich academic offerings served up by some of the greatest living minds while living in the midst of a community of the most spectacularly talented people of one’s own generation sharing the same Arcadian existence. He also understands exactly why Yale is superior to Harvard.
But… like any representative of ordinary America studying at one of America’s most elite universities today, Nathan Harden was also frequently shocked by his alma mater’s estrangement from, and hostility toward, the America he came from, and appalled by the strange gods of Multiculturalism and Political Correctness who have ousted the Congregationalist Jehovah from that ancient university’s temple.
For Nathan Harden, Sex Week at Yale (which, we learn from him, recently constituted an eleven-day biennial Saturnalia of Smut in which all of the university’s best-known lecture halls (!) were turned over to sex-toy demonstrators, porn stars, and dirty-film moguls to dispense technical instruction and even career advice to the Yale undergraduate community) serves as a crucial synecdoche for the moral crisis at the heart of American university education generally, and particularly at Yale.
Harden argues that “For God, For Country, and For Yale,” Yale’s motto, has become not so much a series of aspirational ends ranked in hierarchical order as an accurate historical description of Yale’s own primary locus of value.
Yale was founded as a college, intended to serve God by educating Congregationalist clergymen to fill the pulpits of the Colony of Connecticut. Over time it evolved into a national institution educating an elite group of leaders in business, the military, politics, the arts, and the sciences for the United States. Today Yale is decidedly a hotbed of infidelity to both Christianity and the United States. Secular Progressivism has thoroughly replaced Congregationalism and Christianity, and loyalty to an international elite community of fashion has supplanted any particularist sentiment in favor of the United States. The Yale Administration operates neither to serve God nor Country, but instead directs its efforts entirely toward forwarding its own goals and enhancing its own prestige.
Armed with an almost unequaled cash endowment, an equally impressive historical legacy and accumulation of multi-generational glory, and a concomitant ability to attract talent and additional funding, the Yale Administration is formidably equipped to mold, educate, and inform in any direction it wishes. But, as Nathan Harden explains, the increasingly evident problem is the University Administration’s practical inability to distinguish good from bad, right from wrong, or up from down in the complex contemporary world of conflicting claims.
Presidents Angell, Seymour, and Griswold would have had no difficulty at all in understanding why the University ought not to lend the principal lecture halls in Linsley-Chittenden, W.L. Harkness, and Sheffield-Sterling-Strathcona Halls for porn stars to demonstrate sexual techniques or heads of pornography studios to proffer career advice. Richard Levin obviously does not understand why Sex Week at Yale is inappropriate (to say the least), any more than he understands why Yale should not be devoting 10% of its undergraduate places to foreigners, or why Yale should not be renting out its name and reputation to Third World governments.
Harden understands the problem and, though he has very recently graduated, he’d be a lot more qualified to run Yale than the current administration.
Yale… enjoys a strong tradition of educating American political leaders. Over the course of its first two hundred years, as Yale’s spiritual mission faded slowly into the background, a political purpose emerged as a new defining agenda. Serving country became a proxy for serving God. A patriotic purpose replaced a spiritual one. It was assumed for a long time that the interests of America were, by extension, Yale’s interests as well. A large percentage of Yale graduates enrolled in the military immediately following graduation. And, of course, many went on to hold high political office.
The diversity that came to Yale in the sixties was a good thing. Other changes were less positive. In the late 1960s, Yale’s patriotic ethos disintegrated in the face of pressures from the radical new left. The old-guard liberals, who had long governed the university, were replaced by a new, younger set. The old-guard liberals were in the mold of Jack Kennedy—they were New Deal liberals who were sympathetic to religion and proud of their country. They were traditionalists. The new leftists, on the other hand, wanted radical social transformation. They wanted to challenge the old moral assumptions and revolutionize the economic system. Empowered by the backlash against the Vietnam War, and a sanctimonious belief in the justness of their cause, students rose up and violently took over the agenda of the American left. ... About this same time, the patriotic purpose that had defined the university for two hundred years disappeared. The faculty had voted the year before to revoke academic credit for ROTC courses. Later, Yale moved to restrict military recruiters’ access to students. With the destruction of Yale’s patriotic ethos, the last remaining sense of Yale having any higher educational purpose in service of the nation went out the door.
That isn’t to say that Yale ceased being political. But from that point onward, Yale’s political agenda was no longer tied to American interests. In fact, Yale’s political climate came to be defined more and more by anti-Americanism. Economic theories in opposition to free markets became prevalent. Identity politics and interest-group politics began to take over academic life, endangering free speech in the name of cultural sensitivity, and ushering in a new era of suffocating political correctness.
The shift happened quickly. Only a couple of decades before, during World War II, faculty sentiment had been united against America’s enemies in Nazi Germany and Fascist Italy. Now, if the topic of international affairs happened to be raised in the faculty lounge, it had become fashionable to speak of America as the bad guy. Saying nice things about America’s enemies became a mark of intellectual sophistication—of rising above mindless nationalism. Patriotism, like religion, had become a mark of low intelligence, an anachronism. ...
Yale is a place where one can find people expressing almost every imaginable viewpoint and belief system. But here is the unanswerable question: How does a secular university judge between the competing moral claims of its members when those claims breach the private sphere and enter the public realm? ...
Nihilism is, ultimately, where Yale is headed. Yale was built in order to nurture ideas that would elevate the soul and advance human understanding, but it now has no governing moral principle. As a result, the knowledge generated there is divorced from any larger human purpose. Apart from a kind of vague appreciation of certain concepts like tolerance and diversity, Yale is a moral vacuum. Therefore, almost anything goes. Yale is among a dwindling number of institutions that provide a classical liberal education, focusing on the great books of the Western canon—topped off with porn in HD. As I observed, within its walls, images of women being beaten and humiliated for no other reason than the pleasure and profit of others, I became aware that I was witnessing much more than the decline of a great university. I was witnessing nothing less than a prophetic vision of America’s descent into an abyss of moral aimlessness, at the hands of those now charged with educating its future leaders.
Rachel Cooke goes for a walk in the course of interviewing Robert Macfarlane, author of a new book (being released in October in the USA, but already in print in the UK) on Britain’s ancient tracks, holloways, drove roads, and sea paths.
Examine a large-scale map of the Essex coastline between the river Crouch and the river Thames, and you’ll see a footpath which departs the land at a place called Wakering Stairs and heads east, straight into – or so it appears – the North Sea. A few hundred yards on, it veers north, heading out across Maplin Sands until, three miles later, it turns back in the direction whence it came, finally making landfall at Fisherman’s Head, on the edge of Foulness Island.
Can this carefully traced line be for real? Certainly. You are not hallucinating. This is the Broomway, a path that is said to date from Roman times, and when Robert Macfarlane agrees to go walking with me, it’s his first idea. Am I excited about this? Yes, and no. I’m thrilled at the idea of heading out with Macfarlane; I feel like a marathon runner who’s been invited to train with Paula Radcliffe. But then I read his book, The Old Ways, and anxiety rolls in, like Essex mist. The Broomway, which can only be crossed when the tide is out, is the deadliest path in Britain; Edwardian newspapers, relishing its rapacious reputation – 66 of its dead lie in Foulness churchyard – rechristened it “the Doomway”. As he notes, even the Ordnance Survey map registers the “gothic” atmosphere of the path: “WARNING,” it reads. “Public rights of way across Maplin Sands can be dangerous. Seek local advice.” I admire Macfarlane hugely; I would love to watch him “walking on silver water” in the “mirror-world” that is the Broomway. On the other hand, I would probably prefer not to drown in the service of trying to tell you what a good writer he is.
——————————————— Wikipedia: The Broomway provided the main access to Foulness for centuries. It is an ancient track, which starts at Wakering Stairs, and runs for 6 miles (9.7 km) along the Maplin Sands, some 440 yards (400 m) from the present shoreline. The seaward side of the track is defined by bunches of twigs and sticks, shaped like upside-down besom brooms or fire-brooms, which are buried in the sands. Six headways run from the track to the shore, giving access to local farms. The track was extremely dangerous in misty weather, as the incoming tide floods across the sands at high speed, and the water forms whirlpools because of flows from the River Crouch and River Roach. Under such conditions, the direction of the shore cannot be determined, and the parish registers record the burials of many people who were drowned.
Several weeks ago, returning from shopping, as I proceeded along our driveway, I saw a skunk standing in broad daylight, right outside our fenced house compound. I slowed deliberately, intending to give the skunk a chance to scamper off, away from threatening human beings and cars. The skunk, however, failed to respond appropriately. It stood there, swaying a little from side to side, and then it began to stagger, not away toward the woods, but in the direction of a gate in the fence around the house area.
Not good, I thought. That skunk is sick, and it probably has rabies.
My dogs were outside, and if the skunk went under that gate, it could easily run into them.
I hurriedly drove around the corner, and ran into the yard. Fortunately, both our dogs came to me immediately, and I was able to lead them into the house and safety. I’d been target-shooting recently with Karen’s 9mm Walther pistol, and it was the nearest available gun, lying ready for use on a handy shelf beneath the kitchen counter. I grabbed up the Walther and went back outside.
I walked down to the corner of the fence, and found that the skunk had not moved very far. It was still swaying. It still looked terribly sick.
Skunks present a pretty impressive hazard even without rabies, and I definitely wanted to be out of range of both deliberate and terminally-reflexive spraying, so I worked the slide and took aim from a good long 20 feet. I shot the skunk in the head with a 9mm bullet, but I had no desire to try disposing of it until it was absolutely certainly dead and completely inert, so I proceeded to empty the magazine into the animal’s head and neck region. The skunk quivered in response to the first shot, and subsequent rounds knocked it over and moved it a bit. After 10 rounds, I finally felt sure that it was dead, dead, dead, and completely past any kind of retaliation.
I walked back and got a shovel. I picked up the skunk on the blade of the shovel, got into my truck, and, balancing the shovel on the window with one hand, carried the dead skunk outside the vehicle back out our long driveway. I then carefully got out and pitched the skunk far into the uninhabited woods across the road. That placed it almost a quarter of a mile from our house and much farther than that from any other homes.
Disposing of the sick skunk actually went very smoothly, but the possibilities were frightening. Our two dogs and two of our cats could have run into that skunk and been infected.
Alice Gregory’s review of a new cultural history of rabies makes it clear that that particular disease is really far more awful than we normally realize.
“Ours is a domesticated age,” write Bill Wasik and Monica Murphy in Rabid: A Cultural History of the World’s Most Diabolical Virus. Wasik is an editor at Wired and Murphy, his wife, a veterinarian. Together they have coauthored a sprawling chronicle of rabies, which, until you get the numbers, seems like a willfully anachronistic topic. I did not know, for instance, that rabies is the most fatal virus in the world (only six unvaccinated people have survived, the first in 2004). A fun party trick is forcing people to guess how many rabies fatalities there are each year. Optimists will hazard 100. Skeptics, 1,000. The real answer is 55,000, a figure so large it transforms your audience into a bunch of stoned teenagers marveling at the fact equivalent of a Big Gulp.
Wasik and Murphy’s subject might seem like a deliberately strange one, but they exercise nothing but user-friendly restraint when it comes to historical detail and medical explanation. It’s a rare pleasure to read a nonfiction book by authors who research like academics but write like journalists. They have mined centuries’ worth of primary sources and come bearing only the gems. My favorites were the archaic cures, some of which were reasonable (lancing, cauterization), while others were plain perverted. The Sushruta Samhita recommends pouring clarified butter into the infected wound and then drinking it; Pliny the Elder suggests a linen tourniquet soaked with the menstrual fluid of a dog. The virus comes up surprisingly often in literary history, too. A Baltimore-based cardiologist speculates that Edgar Allan Poe, who died in a gutter wearing somebody else’s soiled clothes, perished not of alcoholism, as has long been thought, but of rabies. In the most famous anecdote about Emily Brontë, she is bitten by a mad dog while dawdling on a moor. Terrified of infection, she rushes home and secretly cauterizes the lesion with an iron.
UPDATE: I should have mentioned that I live in Virginia these days, where vultures abound, and my property is actually infested by black vultures who try to hang out on the barn roof, nearby trees, and even occasionally the house.
They and I have reached a modus vivendi in which they know that when I say: “Get going!” they had better take off and fly somewhere else, or very soon .22 Long Rifle bullets are going to come whistling rather near them.
They commonly sit at the top of some tall locust trees at the end of our driveway. They were not there when I disposed of the dead skunk, but they had already completely cleaned up that skunk by late afternoon (when I went out to get the mail).
Mark Anthony Signorelli turns his review of Mark Steyn’s After America: Get Ready for Armageddon into an essay supplementary to Steyn’s book, arguing that the author’s view of cause and effect can be improved by reading a much earlier (1930) attack on the same forces of dissolution by the Spanish philosopher Ortega y Gasset.
Throughout his book, Steyn catalogues the demoralizing effects of unlimited government upon the American citizenry. No one can ignore the power of the case he presents. But as much as government overreach erodes the character of a people, the debased character of a people manifests itself in arbitrary government. Bad institutions make bad people, but bad people also make bad institutions. Our ugly politics is every bit as much a reflection of our cultural failings as are our worthless schools. Steyn is not unaware of these facts; one of the passages I found most compelling in his book was when he argues that the truly horrifying thing about the rise of Obama was the fact that the majority of the American people had been duped by such an evident buffoon. Our folly created his administration, and all of its works. So Steyn clearly understands the way a people’s faults can manifest themselves in inept government. Still, the obvious emphasis of his book is on the causal relationship which runs opposite, on the way that inept government debases the character of a people. I think that emphasis is misplaced; I think the effects of a people’s character on the character of their government are more fundamental, more decisive to their happiness, and more subject to reform than the effects which flow from a corrupted government upon the citizenry. Or, to put the point in a different way, I believe that culture is far more consequential for the maintenance of a well-ordered community than politics. Steyn himself advises that, “changing the culture is more important than changing the politics,” but since the emphasis of his book is on the way that bad politics has changed our culture for the worse, he actually seems to undermine this bit of advice.
The book that most effectively delineates the ruinous social mechanisms of liberal democracy is The Revolt of the Masses, by the early twentieth-century philosopher Jose Ortega y Gasset. For Ortega, modern western society was marked by the rise to power of the “mass-man,” the unqualified or uncultivated man, who, lacking all necessary intellectual and moral training in the duties of civic life, had nonetheless asserted his immutable right to impose his own mediocrity of spirit upon society: “The characteristic of the hour is that the commonplace mind, knowing itself to be commonplace, has the assurance to proclaim the rights of the commonplace and to impose them wherever it will.” The mass-man is not bound by any traditions or maxims of prudence; he cares only about having his own way in the world. And when he is taught (as all modern political theory teaches him) that the state is a manifestation of his own will, he freely grants it an unlimited scope of action, just as he (theoretically) grants himself a perfect freedom of action: “This is the gravest danger that today threatens civilization: State intervention, the absorption of all spontaneous social effort by the State…when the mass suffers any ill-fortune, or simply feels some strong appetite, its great temptation is that permanent, sure possibility of obtaining everything – merely by touching a button and setting the mighty machine in motion.” The consequences of this trend are catastrophic:
The result of this tendency will be fatal. Spontaneous social action will be broken up over and over again by State intervention; no new seed will be able to fructify. Society will have to live for the State, man for the governmental machine. And as, after all, it is only a machine whose existence and maintenance depend on the vital supports around it, the State, after sucking out the very marrow of society, will be left bloodless, a skeleton, dead with that rusty death of machinery, more gruesome than the death of a living organism.
Exactly as Steyn describes it in his book, some eighty years later. But what Ortega makes us see is that “big government” results from the prior moral corruption of the people, in particular from their unbounded self-love and self-assurance. It destroys them in the end, but at the first, it was their creature.
Dick Cheney is clearly a better memoirist than Donald Rumsfeld, his onetime boss and both his predecessor and successor at the Defense Department. I still have not finished Rumsfeld’s Known and Unknown, which came out last February. I think Cheney seems somehow more forthcoming, direct, and personally present in his recounting of his life and career in government service.
Most people, I’m sure, have seen reviews elsewhere noting that Dick Cheney did make a point of settling certain scores. He notes the disloyalty of Colin Powell and his associates at the State Department toward the president and toward administration policy when the going got tough in Iraq. He also highlights the failure of Powell and his subordinate Richard Armitage to deflect the barrage of accusations of having outed Valerie Plame that was directed at innocent members of the administration, when Powell knew perfectly well that Armitage himself was the source of the leak. Speaking up would have avoided a large-scale investigation, the appointment of a special prosecutor, and ultimately the conviction on a secondary-level charge of Cheney’s own chief of staff, Scooter Libby. Cheney describes Powell’s silence in response to press inquiries after a 2003 cabinet meeting with contempt that is never openly phrased but is nonetheless withering.
He is perhaps even harsher in describing at length Condoleezza Rice’s dishonest and ill-advised efforts to obtain some chimerical version of a non-proliferation deal with North Korea, and her discreditably enthusiastic willingness to participate in sham agreements with that nefarious regime at the expense of the safety of the United States and other nations.
Beyond those best-known portions of the Cheney memoir, I found a few other interesting details.
On 9/11, Dick Cheney found himself being forcibly propelled out of his office by the Secret Service, whose agents hastily led him to the safer location of the underground Presidential Emergency Operations Center (PEOC), deep beneath the White House. Cheney provides inadvertent testimony to the general competence with which the government spends its billions and trillions when he describes the subsequent scene.
While we were managing things from the PEOC, another meeting was under way in the White House Situation Room. The PEOC staff attempted to set up a videoconference to connect the two rooms, and we managed to get images of the Situation Room meeting up on one of our screens, but we couldn’t get any audio of the meeting. We were getting better real-time information from the news reports on TV, but because of a technical glitch, I couldn’t hear those reports when the video of the Sit Room meeting was on display. I told Eric [Edelman, Cheney’s deputy national security advisor] to get on the phone and try to listen to the Sit Room meeting, but after a few minutes he described the audio quality as ‘worse than listening to Alvin and the Chipmunks at the bottom of a swimming pool.’ I told him to hang up. If something important was happening upstairs, they could send someone down or call us direct.
Visions of the gazillions of dollars spent on custom-built high tech communications equipment and infrastructure for the Presidential Emergency Operations Center and the White House Situation Room swam before my eyes. Clearly, they could have just gone out to Radio Shack and done better.
In describing his early career as congressman from Wyoming and a member of the House Intelligence Committee, Dick Cheney serves up one very provocative little nugget.
In May 1987 I received a call from legendary CIA counterintelligence director James Jesus Angleton. He said that he had something of vital importance to tell me and that it could be conveyed only in person. ...
I called Henry Hyde, the Intel Committee’s ranking Republican and invited him to sit in on the meeting. A few days later, before our scheduled meeting, Jim Angleton died. I never learned what it was he wanted to tell me.
There is the plot of a great spy thriller right there in the story of the unconveyed Angleton secret.
Joseph Epstein finds in the recently published Cambridge History of the American Novel a perfect demonstration of exactly what has happened to university English departments in recent decades, and he thinks all this probably has something to do with the fact that the percentage of students majoring in English has been roughly cut in half over the same period.
Only 40 or 50 years ago, English departments attracted men and women who wrote books of general intellectual interest and had names known outside the academy—Perry Miller, Aileen Ward, Walter Jackson Bate, Marjorie Hope Nicolson, Joseph Wood Krutch, Lionel Trilling, one could name a dozen or so others—but no longer. Literature, as taught in the current-day university, is strictly an intramural game.
This may come as news to the contributors to “The Cambridge History of the American Novel,” who pride themselves on possessing much wider, much more relevant, interests and a deeper engagement with the world than their predecessors among literary academics. Biographical notes on contributors speak of their concern with “forms of moral personhood in the US novels,” “the poetics of foreign policy,” and “ecocriticism and theories of modernization, postmodernization, and globalization.”
Yet, through the magic of dull and faulty prose, the contributors to “The Cambridge History of the American Novel” have been able to make these presumably worldly subjects seem parochial in the extreme—of concern only to one another, which is certainly one derogatory definition of the academic. These scholars may teach English, but they do not always write it, at least not quite. A novelist, we are told, “tasks himself” with this or that; things tend to get “problematized”; the adjectives “global” and “post”-this-or-that receive a good workout; “alterity” and “intertextuality” pop up their homely heads; the “poetics of ineffability” come into play; and “agency” is used in ways one hadn’t hitherto noticed, so that “readers in groups demonstrate agency.” About the term “non-heteronormativity” let us not speak.
These dopey words and others like them are inserted into stiffly mechanical sentences of dubious meaning. “Attention to the performativity of straight sex characterizes . . . ‘The Great Gatsby’ (1925), where Nick Carraway’s homoerotic obsession with the theatrical Gatsby offers a more authentic passion precisely through flamboyant display.” Betcha didn’t know that Nick Carraway was hot for Jay Gatsby? We sleep tonight; contemporary literary scholarship stands guard.
“The Cambridge History of the American Novel” is perhaps best read as a sign of what has happened to English studies in recent decades. Along with American Studies programs, which are often their subsidiaries, English departments have tended to become intellectual nursing homes where old ideas go to die. If one is still looking for that living relic, the fully subscribed Marxist, one is today less likely to find him in an Economics or History Department than in an English Department, where he will still be taken seriously. He finds a home there because English departments are less concerned with the consideration of literature per se than with what novels, poems, plays and essays—after being properly X-rayed, frisked, patted down, like so many suspicious-looking air travelers—might yield on the subjects of race, class and gender. “How would [this volume] be organized,” one of its contributors asks, “if race, gender, disability, and sexuality were not available?”
Think of any customer experience that has made you wince or kick the cat. What jumps to mind? Waiting in multiple lines at the Department of Motor Vehicles. Observing the bureaucratic sloth and lowest-common-denominator performance of public schools, especially in big cities. Getting ritually humiliated going through airport security. Trying desperately to understand your doctor bills. Navigating the permitting process at your local city hall. Wasting a day at home while the gas man fails to show up. Whatever you come up with, chances are good that the culprit is either a direct government monopoly (as in the providers of K-12 education) or a heavily regulated industry or utility where the government is the largest player (as in health care).
Will thinks these authors are really on to something.
A generation that has grown up with the Internet “has essentially been raised libertarian,” swimming in markets, which are choices among competing alternatives.
And the left weeps. Preaching what has been called nostalgianomics, liberals mourn the passing of the days when there was one phone company, three car companies, three television networks, and an airline cartel, and big labor and big business were cozy with big government.
The America of one universally known list of Top 40 records is as gone as records. When the Census offered people the choice of checking the “multiracial” category, Maxine Waters, then chairing the Congressional Black Caucus, was indignant: “Letting individuals opt out of the current categories just blurs everything.” This is the voice of reactionary liberalism: No blurring, no changes, no escape from old categories, spin the world back to the 1950s.
“Declaration of Independents” is suitable reading for this summer of debt-ceiling debate, which has been a proxy for a bigger debate, which is about nothing less than this: What should be the nature of the American regime? America is moving in the libertarians’ direction not because they have won an argument but because government and the sectors it dominates have made themselves ludicrous. This has, however, opened minds to the libertarians’ argument.
The essence of which is the common-sensical principle that before government interferes with the freedom of the individual and of individuals making consensual transactions in markets, it ought to have a defensible reason for doing so. It usually does not.
The late Susan Sontag’s hyperintellectual perspective was formed as part of the post-WWII Beat, Queer, Żydokomuna (a Polish term for the well-known Jewish cultural penchant for Marxism) international left-wing counter-cultural intelligentsia. Sontag actually broke with the left in the early 1980s, after the news of what had happened in Cambodia came out, but inevitably over the course of her long literary career, Susan Sontag was normally to be found in the mainstream of contemporary political fashion, and she several times went on the record saying very foolish things.
In Saturday’s Wall Street Journal, the sharp-tongued Joseph Epstein took the occasion of the publication of a new memoir of life with Sontag by one of her former minions, Sempre Susan: A Memoir of Susan Sontag, to deliver some just criticism of some of Sontag’s worst statements and behavior and to put her in her place in cultural history once and for all.
In Epstein’s view, Susan Sontag was just a pretty girl with a remarkable gift for self-promotion.
A single essay, “Notes on ‘Camp,’” published in Partisan Review in 1964, launched Susan Sontag’s career, at the age of 31, and put her instantly on the Big Board of literary reputations. People speak of ideas whose time has not yet come; hers was a talent for promoting ideas that arrived precisely on time. “Notes on ‘Camp,’” along with a companion essay called “Against Interpretation,” vaunted style over content: “The idea of content,” Ms. Sontag wrote, “is today merely a hindrance, a subtle or not so subtle philistinism.” She also held interpretation to be “the enemy of art.” She argued that Camp, a style marked by extravagance, epicene in character, expressed a new sensibility that would “dethrone the serious.” In its place she would put, with nearly equal standing, such cultural items as comic books, wretched movies, pornography watched ironically, and other trivia.
These essays arrived as the 1960s were about to come to their tumultuous fruition and provided an aesthetic justification for a retreat from the moral judgment of artistic works and an opening to hedonism, at least in aesthetic matters. “In place of a hermeneutics,” Sontag’s “Against Interpretation” ended, “we need an erotics of art.” She also argued that the old division between highbrow and lowbrow culture was a waste not so much of time as of the prospects for enjoyment. Toward this end she lauded the movies—”cinema is the active, the most exciting, the most important of all the art forms right now”—as well as science fiction and popular music.
These cultural pronunciamentos, authoritative and richly allusive, were delivered in a mandarin manner. They read as if they were a translation, probably, if one had to guess, from the French. They would have been more impressive, of course, if their author were herself a first-class artist. This, Lord knows, Susan Sontag strained to be. She wrote experimental fiction that never came off; later in her career she wrote more traditional fiction, but it, too, arrived dead on the page.
The problem is that Sontag wasn’t sufficiently interested in real-life details, the lifeblood of fiction, but only in ideas. She also wrote and directed films, which were not well-reviewed: I have not seen these myself, but there is time enough to do so, for I have long assumed that they are playing as a permanent double feature in the only movie theater in hell.
Good abuse, but not entirely just. True, Susan Sontag yearned to write important novels, to score a breakthrough with some plus nouveau nouveau roman and also to rise to the level of auteur in the most challenging regions of the cinema where she felt herself most at home as a critic and a fan. And it is true that she was not particularly successful as a novelist. Her earlier novels The Benefactor and Death Kit were formalist experiments whose only excellence lay in inducing sleep with certainty. Her later novels seemed to me even less interesting.
Her films were clearly not successful. I cannot defend or criticize her four films, as I too am waiting to see them repeatedly in the hereafter with mild alarm. But Sontag does deserve better on the basis of her essays and her criticism.
It is easy to mock the manifesto calling for criticism as an erotics of art, rather than a hermeneutics. Susan Sontag’s rhetoric and critical aspirations were bold and uninhibited and a trifle prone to overreach, but her critical essays were also a breath of fresh and exotic air blowing into middlebrow American culture from the heights of Montparnasse.
Countless Americans found their way to the accessible cinema of Bergman, Fellini, and Truffaut beckoned by the beacon of Sontag’s travelogues from the remote and inaccessible regions of Antonioni, Bresson, and Ozu. Sontag made the concept of the avant-garde into the art cinema’s equivalent of “the banner with a strange device.”
It was not enough, this passionate young woman persuaded readers, to appreciate the familiar and the beautiful, it was necessary to press on, to leap beyond present artistic and cultural forms of understanding and expression, to conquer strange new heights and plumb unprecedented depths. Susan Sontag seemed, back then, a cultural Joan of Arc, leading the literary and cinematic audience forward in a headlong assault on possibility and the existing state of literature and the arts in a brave and determined effort to break through the barriers and liberate new forms of cultural expression and understanding.
Today, when I watch Last Year at Marienbad or L’Avventura, when I look into a novel by Nathalie Sarraute, I feel rather the way a veteran of a lost, romantic cause, like some aged grenadier of the wars of Napoleon, must feel thinking back and remembering Austerlitz or Marengo. I smile ruefully at the memory of being young and naive enough to believe that this sort of thing would come to anything, but I also remember the aspirations and the hopes we entertained back then.
Susan Sontag is extremely vulnerable to all the criticisms to which mainstream Western high culture in the second half of the last century is vulnerable. She was naively romantic, prone to left-wing postures and insanity, and not above following the community-of-fashion herd into disgraceful positions. But she was still a heroine who, at times, at least, brought great honor to that same high culture and the same civilization her entire class was usually busy trying to destroy.
I knew her a little, and when I lived in New York, I would exchange greetings with her at the kind of key cultural events at which we would both invariably be present. I would also run into her sometimes at the revival houses, and we occasionally sat together and watched Mizoguchi or Renoir at Bleecker Street. Perhaps someday at the cinema in Tartarus mentioned by Mr. Epstein, I can sit beside her and discuss Duet for Cannibals and Brother Carl.
The authors set out to solve the problems of a modern secular culture. The greatest problem, as they see it, is a certain anxiety of choosing. In the Middle Ages, everyone shared the same frame of values. One could offend against that frame by sinning, but the sins were clear, their place in the overall scheme of things ratified by consensus. Now that we do not share such a frame of reference, each person must forge his or her own view of the universe in order to make choices that accord with it. But few people have the will or ability to think the universe through from scratch.
So how can one make intelligent choices? Hubert Dreyfus and Sean Dorrance Kelly call modern nihilism “the idea that there is no reason to prefer any answer to any other.” They propose what they think is a wise and accepting superficiality. By not trying to get to the bottom of things, one can get glimpses of the sacred from the surface of what they call “whoosh” moments—from the presence of charismatic persons to the shared excitement of a sports event. This last elation is sacred and unifying:
There is no essential difference, really, in how it feels to rise as one in joy to sing the praises of the Lord, or to rise as one in joy to sing the praises of the Hail Mary pass, the Immaculate Reception, the Angels, the Saints, the Friars, or the Demon Deacons.
I had a number of courses at Yale from the late John N. Findlay, whose normally lofty and Olympian demeanor could actually be ruffled by any reference to Heidegger (whose thought is the foundation of the nihilism of Messrs. Dreyfus & Kelly).
Findlay’s customarily serene blue eyes would flash fire at the mention of the odious Swabian sexton’s son. I remember Findlay once pausing to explain, in Oxonian tones dripping with bitterness and contempt, that Heidegger was guilty of systematically confusing emotional states with metaphysical objects. As Dreyfus and Kelly demonstrate, that kind of thing leads, if not to murderous totalitarianism, at least to incontinent puerility.