Suw Charman-Anderson, in Forbes, notes a watershed moment in the world of books and readers. For the first time, a book self-published by its author has broken through traditional barriers and gained the attention of important establishment book reviews.
[T]his week, the New York Times, one of the most important sources of book reviews, published a long and enthusiastic review of a self-published book, Alan Sepinwall’s The Revolution Was Televised. Based on his TV criticism blog, What’s Alan Watching, Sepinwall’s book:
analyzes a dozen “great millennial dramas” that have forged a new golden age in TV: bold, innovative shows that have pushed the boundaries of storytelling, mixed high and low culture, and demonstrated that the small screen could be an ideal medium for writers and directors eager to create complex, challenging narratives with “moral shades of gray.”
But the New York Times’ Michiko Kakutani wasn’t the only mainstream book critic to write about Sepinwall’s book. USA Today carried an interview with Sepinwall at the end of November, Time published a review of its own, The Huffington Post carried a review, and so did the New Yorker.
Sepinwall got the kind of coverage that most traditionally published authors can only dream of. To some extent, this might just be reviewers reviewing another reviewer, a little bit of moral support from your friends, except Sepinwall’s friends have very big megaphones. But at the same time, it illustrates that the idea of a division between ‘traditionally published’ and ‘self-published’ is becoming a ridiculous construct with no meaning whatsoever. …
The reasons that self-published books don’t get reviewed boil down, I think, to the lack of infrastructure. A traditional publishing company can get to know different reviewers and send them the books that they think will go down best with that person. And the reviewer works on the assumption that what he or she is sent by the publisher has to be at least half-decent and thus worth opening. This whole process works because it’s mediated and because of the assumption that a third party stamp of approval for a book guarantees minimum levels of quality. …
[R]eviewers depend on publishers acting as winnowers, sorting out the wheat from the chaff, and at least attempting to make sure that they are sent books they are actually interested in. It’s this weeding out process that’s missing in self-publishing.
This is bound to be only the first instance of what will, before very long, become normal.
Technology has made self-publication and book distribution easy, inexpensive, and available to anyone.
Even successful and well-established popular authors like Barry Eisler have, as far back as 2011, found the economics and creative control offered by self-publishing irresistible. (Eisler was interviewed here about his at-the-time astonishing decision to dump his relatively prestigious print publisher and move off into the new frontier of electronic self-publication.)
What have you to recommend? I answer at once, Nothing. The whole current of thought and feeling, the whole stream of human affairs, is setting with irresistible force in that direction. The old ways of living, many of which were just as bad in their time as any of our devices are in ours, are breaking down all over Europe, and are floating this way and that like haycocks in a flood. Nor do I see why any wise man should expend much thought or trouble on trying to save their wrecks. The waters are out and no human force can turn them back, but I do not see why as we go with the stream we need sing Hallelujah to the river god.
61 years ago, the young William F. Buckley Jr. launched what would become a splendiferous career as celebrity commentator and public intellectual by publishing, not long after his graduation from Yale, a scathing critique of his alma mater, titled God and Man at Yale.
God and Man at Yale represented Buckley’s first major effort at “standing athwart history yelling ‘Stop!,'” and we may now read with a certain poignancy the report of Nathan Harden, Sex and God at Yale, compiled at a posting station considerably farther along the road to Hell in a handbasket, demonstrating just how little either History or Yale was listening.
The youthful naysayer of 1951, Buckley, was a classic version of the privileged insider. Buckley was rich, handsome, and stylish, educated at elite preparatory schools in Britain and the United States. At Yale, he was the kind of celebrity undergraduate BMOC that basically ceased to exist after coeducation: Captain of the Debating Team, Chairman of the Daily News, and, of course, member of Skull and Bones.
The contrast between Buckley and Harden could scarcely be more extreme. Nathan Harden was home-schooled, knows what manual labor is like, and grew up in a family that was short of cash and lived all over the Southern United States. Harden was turned down by Yale initially, attended one of the Claremont Colleges, then got into a one-term visiting-student program at Yale, tried transferring and was turned down again, and finally re-applied and was accepted. He was 22 years old and already married by the time he started college in California, so he must have been 24 (and still married) by the time he finally got to Yale as a degree candidate. Harden did his junior year abroad in Rome and, though he speaks with some familiarity of Political Union debates, he clearly never became any kind of BMOC and obviously did not get into Bones.
Nathan Harden came to Yale with the ability to appreciate the richness of her centuries of history and tradition. He speaks openly of the intense pleasure to be found in exploring Yale’s incomparably rich academic offerings served up by some of the greatest living minds while living in the midst of a community of the most spectacularly talented people of one’s own generation sharing the same Arcadian existence. He also understands exactly why Yale is superior to Harvard.
But… like any representative of ordinary America studying at one of America’s most elite universities today, Nathan Harden was also frequently shocked by his alma mater’s estrangement from, and hostility toward, the America he came from, and appalled by the strange gods of Multiculturalism and Political Correctness who have ousted the Congregationalist Jehovah from that ancient university’s temple.
For Nathan Harden, Sex Week at Yale (which, we learn from him, recently constituted an eleven-day biennial Saturnalia of Smut in which all of the university’s best-known lecture halls (!) were turned over to demonstrators of sex toys, porn stars, and dirty film moguls to dispense technical instruction and even career advice to the Yale undergraduate community) serves as a crucial synecdoche for the moral crisis at the heart of American university education generally and particularly at Yale.
Harden argues that “For God, For Country, and For Yale,” Yale’s motto, has become not so much a series of aspirational ends ranked in hierarchical order as an accurate historical description of Yale’s own primary locus of value.
Yale was founded as a college, intended to serve God by educating Congregationalist clergymen to fill the pulpits of the Colony of Connecticut. Over time it evolved into a national institution educating an elite group of leaders in business, the military, politics, the arts, and the sciences for the United States. Today Yale is decidedly a hotbed of infidelity to both Christianity and the United States. Secular Progressivism has thoroughly replaced Congregationalism and Christianity, and loyalty to an international elite community of fashion has supplanted any particularist sentiment in favor of the United States. The Yale Administration operates neither to serve God nor Country, but instead directs its efforts entirely toward forwarding its own goals and enhancing its own prestige.
Armed with an almost unequaled cash endowment, an equally impressive historical legacy and accumulation of multi-generational glory, and a concomitant ability to attract talent and additional funding, the Yale Administration is formidably equipped to mold, educate, and inform in any direction it wishes. But, as Nathan Harden explains, the problem that is increasingly evident is the University Administration’s practical inability to distinguish good from bad, right from wrong, or up from down in the complex contemporary world of conflicting claims.
Presidents Angell, Seymour, and Griswold would have had no difficulty at all in understanding why the University ought not to lend the principal lecture halls in Linsley-Chittenden, W.L. Harkness, and Sheffield-Sterling-Strathcona Halls to porn stars to demonstrate sexual techniques or to heads of pornography studios to proffer career advice. Richard Levin obviously does not understand why Sex Week at Yale is inappropriate (to say the least), any more than he understands why Yale should not be devoting 10% of its undergraduate places to foreigners, or why Yale should not be renting out its name and reputation to Third World governments.
Harden understands the problem and, though he has very recently graduated, he’d be a lot more qualified to run Yale than the current administration.
Yale… enjoys a strong tradition of educating American political leaders. Over the course of its first two hundred years, as Yale’s spiritual mission faded slowly into the background, a political purpose emerged as a new defining agenda. Serving country became a proxy for serving God. A patriotic purpose replaced a spiritual one. It was assumed for a long time that the interests of America were, by extension, Yale’s interests as well. A large percentage of Yale graduates enrolled in the military immediately following graduation. And, of course, many went on to hold high political office.
The diversity that came to Yale in the sixties was a good thing. Other changes were less positive. In the late 1960s, Yale’s patriotic ethos disintegrated in the face of pressures from the radical new left. The old-guard liberals, who had long governed the university, were replaced by a new, younger set. The old-guard liberals were in the mold of Jack Kennedy—they were New Deal liberals who were sympathetic to religion and proud of their country. They were traditionalists. The new leftists, on the other hand, wanted radical social transformation. They wanted to challenge the old moral assumptions and revolutionize the economic system. Empowered by the backlash against the Vietnam War, and a sanctimonious belief in the justness of their cause, students rose up and violently took over the agenda of the American left. … About this same time, the patriotic purpose that had defined the university for two hundred years disappeared. The faculty had voted the year before to revoke academic credit for ROTC courses. Later, Yale moved to restrict military recruiters’ access to students. With the destruction of Yale’s patriotic ethos, the last remaining sense of Yale having any higher educational purpose in service of the nation went out the door.
That isn’t to say that Yale ceased being political. But from that point onward, Yale’s political agenda was no longer tied to American interests. In fact, Yale’s political climate came to be defined more and more by anti-Americanism. Economic theories in opposition to free markets became prevalent. Identity politics and interest-group politics began to take over academic life, endangering free speech in the name of cultural sensitivity, and ushering in a new era of suffocating political correctness.
The shift happened quickly. Only a couple of decades before, during World War II, faculty sentiment had been united against America’s enemies in Nazi Germany and Fascist Italy. Now, if the topic of international affairs happened to be raised in the faculty lounge, it had become fashionable to speak of America as the bad guy. Saying nice things about America’s enemies became a mark of intellectual sophistication—of rising above mindless nationalism. Patriotism, like religion, had become a mark of low intelligence, an anachronism. …
Yale is a place where one can find people expressing almost every imaginable viewpoint and belief system. But here is the unanswerable question: How does a secular university judge between the competing moral claims of its members when those claims breach the private sphere and enter the public realm? …
Nihilism is, ultimately, where Yale is headed. Yale was built in order to nurture ideas that would elevate the soul and advance human understanding, but it now has no governing moral principle. As a result, the knowledge generated there is divorced from any larger human purpose. Apart from a kind of vague appreciation of certain concepts like tolerance and diversity, Yale is a moral vacuum. Therefore, almost anything goes. Yale is among a dwindling number of institutions that provide a classical liberal education, focusing on the great books of the Western canon—topped off with porn in HD. As I observed, within its walls, images of women being beaten and humiliated for no other reason than the pleasure and profit of others, I became aware that I was witnessing much more than the decline of a great university. I was witnessing nothing less than a prophetic vision of America’s descent into an abyss of moral aimlessness, at the hands of those now charged with educating its future leaders.
Rachel Cooke goes for a walk in the course of interviewing Robert Macfarlane, author of a new book (being released in October in the USA, but already in print in the UK) on Britain’s ancient tracks, holloways, drove roads, and sea paths.
Examine a large-scale map of the Essex coastline between the river Crouch and the river Thames, and you’ll see a footpath which departs the land at a place called Wakering Stairs and heads east, straight into – or so it appears – the North Sea. A few hundred yards on, it veers north, heading out across Maplin Sands until, three miles later, it turns back in the direction whence it came, finally making landfall at Fisherman’s Head, on the edge of Foulness Island.
Can this carefully traced line be for real? Certainly. You are not hallucinating. This is the Broomway, a path that is said to date from Roman times, and when Robert Macfarlane agrees to go walking with me, it’s his first idea. Am I excited about this? Yes, and no. I’m thrilled at the idea of heading out with Macfarlane; I feel like a marathon runner who’s been invited to train with Paula Radcliffe. But then I read his book, The Old Ways, and anxiety rolls in, like Essex mist. The Broomway, which can only be crossed when the tide is out, is the deadliest path in Britain; Edwardian newspapers, relishing its rapacious reputation – 66 of its dead lie in Foulness churchyard – rechristened it “the Doomway”. As he notes, even the Ordnance Survey map registers the “gothic” atmosphere of the path: “WARNING,” it reads. “Public rights of way across Maplin Sands can be dangerous. Seek local advice.” I admire Macfarlane hugely; I would love to watch him “walking on silver water” in the “mirror-world” that is the Broomway. On the other hand, I would probably prefer not to drown in the service of trying to tell you what a good writer he is.
——————————
Wikipedia: The Broomway provided the main access to Foulness for centuries. It is an ancient track, which starts at Wakering Stairs, and runs for 6 miles (9.7 km) along the Maplin Sands, some 440 yards (400 m) from the present shoreline. The seaward side of the track is defined by bunches of twigs and sticks, shaped like upside-down besom brooms or fire-brooms, which are buried in the sands. Six headways run from the track to the shore, giving access to local farms. The track was extremely dangerous in misty weather, as the incoming tide floods across the sands at high speed, and the water forms whirlpools because of flows from the River Crouch and River Roach. Under such conditions, the direction of the shore cannot be determined, and the parish registers record the burials of many people who were drowned.
Several weeks ago, returning from shopping, I was proceeding along our driveway when I saw a skunk standing in broad daylight, right outside our fenced house compound. I slowed deliberately, intending to give the skunk a chance to scamper off, away from threatening human beings and cars. The skunk, however, failed to respond appropriately. It stood there, swaying a little from side to side, and then it began to stagger, not away toward the woods, but in the direction of a gate in the fence around the house area.
Not good, I thought. That skunk is sick, and it probably has rabies.
My dogs were outside, and if the skunk went under that gate, it could easily run into them.
I hurriedly drove around the corner, and ran into the yard. Fortunately, both our dogs came to me immediately, and I was able to lead them into the house and safety. I’d been target-shooting recently with Karen’s 9mm Walther pistol, and it was the nearest available gun, lying ready for use on a handy shelf beneath the kitchen counter. I grabbed up the Walther and went back outside.
I walked down to the corner of the fence, and found that the skunk had not moved very far. It was still swaying. It still looked terribly sick.
Skunks present a pretty impressive hazard even without rabies, and I definitely wanted to be out of range of both deliberate and terminally-reflexive spraying, so I worked the slide and took aim from a good long 20 feet. I shot the skunk in the head with a 9mm bullet, but I had no desire to try disposing of it until it was absolutely certainly dead and completely inert, so I proceeded to empty the magazine into the animal’s head and neck region. The skunk quivered in response to the first shot, and subsequent rounds knocked it over and moved it a bit. After 10 rounds, I finally felt sure that it was dead, dead, dead, and completely past any kind of retaliation.
I walked back and got a shovel. I picked up the skunk on the blade of the shovel, got into my truck, and, balancing the shovel on the window with one hand so that the dead skunk rode outside the vehicle, drove back out our long driveway. I then carefully got out and pitched the skunk far into the uninhabited woods across the road. That placed it almost a quarter of a mile from our house and much farther than that from any other homes.
Disposing of the sick skunk actually went very smoothly, but the possibilities were frightening. Our two dogs and two of our cats could have run into that skunk and been infected.
Alice Gregory‘s review of a new cultural history of rabies makes it clear that that particular disease is really far more awful than we normally realize.
“Ours is a domesticated age,” write Bill Wasik and Monica Murphy in Rabid: A Cultural History of the World’s Most Diabolical Virus. Wasik is an editor at Wired and Murphy, his wife, a veterinarian. Together they have coauthored a sprawling chronicle of rabies, which, until you get the numbers, seems like a willfully anachronistic topic. I did not know, for instance, that rabies is the most fatal virus in the world (only six unvaccinated people have survived, the first in 2004). A fun party trick is forcing people to guess how many rabies fatalities there are each year. Optimists will hazard 100. Skeptics, 1,000. The real answer is 55,000, a figure so large it transforms your audience into a bunch of stoned teenagers marveling at the fact equivalent of a Big Gulp.
Wasik and Murphy’s subject might seem like a deliberately strange one, but they exercise nothing but user-friendly restraint when it comes to historical detail and medical explanation. It’s a rare pleasure to read a nonfiction book by authors who research like academics but write like journalists. They have mined centuries’ worth of primary sources and come bearing only the gems. My favorites were the archaic cures, some of which were reasonable (lancing, cauterization), while others were plain perverted. The Sushruta Samhita recommends pouring clarified butter into the infected wound and then drinking it; Pliny the Elder suggests a linen tourniquet soaked with the menstrual fluid of a dog. The virus comes up surprisingly often in literary history, too. A Baltimore-based cardiologist speculates that Edgar Allan Poe, who died in a gutter wearing somebody else’s soiled clothes, perished not of alcoholism, as has long been thought, but of rabies. In the most famous anecdote about Emily Brontë, she is bit by a mad dog while dawdling in a moor. Terrified of infection, she rushes home and secretly cauterizes the lesion with an iron.
UPDATE: I should have mentioned that I live in Virginia these days, where vultures abound, and my property is actually infested by black vultures who try to hang out on the barn roof, nearby trees, and even occasionally the house.
They and I have reached a modus vivendi in which they know that when I say: “Get going!” they had better take off and fly somewhere else, or very soon .22 Long Rifle bullets are going to come whistling rather near them.
They commonly sit at the top of some tall Locust trees at the end of our driveway. They were not there when I disposed of the dead skunk, but they had already completely cleaned up that skunk by late afternoon (when I went out to get the mail).
Mark Anthony Signorelli turns his review of Mark Steyn’s After America: Get Ready for Armageddon into an essay supplementary to Steyn’s book, arguing that the author’s view of cause and effect can be improved by reading a much earlier (1930) attack on the same forces of dissolution by the Spanish philosopher Ortega y Gasset.
Throughout his book, Steyn catalogues the demoralizing effects of unlimited government upon the American citizenry. No one can ignore the power of the case he presents. But as much as government overreach erodes the character of a people, the debased character of a people manifests itself in arbitrary government. Bad institutions make bad people, but bad people also make bad institutions. Our ugly politics is every bit as much a reflection of our cultural failings as are our worthless schools. Steyn is not unaware of these facts; one of the passages I found most compelling in his book is the one in which he argues that the truly horrifying thing about the rise of Obama was the fact that the majority of the American people had been duped by such an evident buffoon. Our folly created his administration, and all of its works. So Steyn clearly understands the way a people’s faults can manifest themselves in inept government. Still, the obvious emphasis of his book is on the causal relationship which runs opposite, on the way that inept government debases the character of a people. I think that emphasis is misplaced; I think the effects of a people’s character on the character of their government are more fundamental, more decisive to their happiness, and more subject to reform than the effects which flow from a corrupted government upon the citizenry. Or, to put the point in a different way, I believe that culture is far more consequential for the maintenance of a well-ordered community than politics. Steyn himself advises that “changing the culture is more important than changing the politics,” but since the emphasis of his book is on the way that bad politics has changed our culture for the worse, he actually seems to undermine this bit of advice.
The book that most effectively delineates the ruinous social mechanisms of liberal democracy is The Revolt of the Masses, by the early twentieth-century philosopher Jose Ortega y Gasset. For Ortega, modern western society was marked by the rise to power of the “mass-man,” the unqualified or uncultivated man, who, lacking all necessary intellectual and moral training in the duties of civic life, had nonetheless asserted his immutable right to impose his own mediocrity of spirit upon society: “The characteristic of the hour is that the commonplace mind, knowing itself to be commonplace, has the assurance to proclaim the rights of the commonplace and to impose them wherever it will.” The mass-man is not bound by any traditions or maxims of prudence; he cares only about having his own way in the world. And when he is taught (as all modern political theory teaches him) that the state is a manifestation of his own will, he freely grants it an unlimited scope of action, just as he (theoretically) grants himself a perfect freedom of action: “This is the gravest danger that today threatens civilization: State intervention, the absorption of all spontaneous social effort by the State…when the mass suffers any ill-fortune, or simply feels some strong appetite, its great temptation is that permanent, sure possibility of obtaining everything – merely by touching a button and setting the mighty machine in motion.” The consequences of this trend are catastrophic:
The result of this tendency will be fatal. Spontaneous social action will be broken up over and over again by State intervention; no new seed will be able to fructify. Society will have to live for the State, man for the governmental machine. And as, after all, it is only a machine whose existence and maintenance depend on the vital supports around it, the State, after sucking out the very marrow of society, will be left bloodless, a skeleton, dead with that rusty death of machinery, more gruesome than the death of a living organism.
Exactly as Steyn describes it in his book, some eighty years later. But what Ortega makes us see is that “big government” results from the prior moral corruption of the people, in particular from their unbounded self-love and self-assurance. It destroys them in the end, but at the first, it was their creature.
Dick Cheney is clearly a better memoirist than Donald Rumsfeld, his one-time boss and both his predecessor and his successor at the Defense Department. I still have not finished Rumsfeld’s Known and Unknown, which came out last February. I think that Cheney seems somehow more forthcoming, direct, and personally present in his recounting of his life and career in government service.
Most people, I’m sure, have seen reviews elsewhere noting that Dick Cheney did make a point of settling certain scores. He notes the disloyalty of Colin Powell and his associates at the State Department toward the president and toward administration policy when the going got tough in Iraq. He also highlights the failure of Powell and his subordinate Richard Armitage to deflect the barrage of accusations of having outed Valerie Plame that was directed at innocent members of the administration, a disclosure that would have avoided a large-scale investigation, the appointment of a special prosecutor, and ultimately the conviction on a secondary-level charge of Dick Cheney’s own chief of staff, Scooter Libby, when Powell knew perfectly well that Armitage himself was the source of the leak. Cheney describes Powell’s silence in response to press inquiries after a 2003 cabinet meeting with contempt that is never openly phrased but is nonetheless withering.
He is perhaps even harsher in describing at length Condoleezza Rice’s dishonest and ill-advised efforts to obtain some chimerical version of a non-proliferation deal with North Korea, and her discreditably enthusiastic willingness to participate in sham agreements with that nefarious regime at the expense of the safety of the United States and other nations.
Beyond those best known portions of the Cheney memoir, I found a few other interesting details.
On 9/11, Dick Cheney found himself being forcibly propelled out of his office by the Secret Service, which led him hastily to the safer location of the underground Presidential Emergency Operations Center (PEOC), deep beneath the White House. Dick Cheney provides inadvertent testimony to the general competence with which the government spends its billions and trillions when he describes the subsequent scene.
While we were managing things from the PEOC, another meeting was under way in the White House Situation Room. The PEOC staff attempted to set up a videoconference to connect the two rooms, and we managed to get images of the Situation Room meeting up on one of our screens, but we couldn’t get any audio of the meeting. We were getting better real-time information from the news reports on TV, but because of a technical glitch, I couldn’t hear those reports when the video of the Sit Room meeting was on display. I told Eric [Feldman, Cheney’s deputy national security advisor] to get on the phone and try to listen to the Sit Room meeting, but after a few minutes he described the audio quality as ‘worse than listening to Alvin and the Chipmunks at the bottom of a swimming pool.’ I told him to hang up. If something important was happening upstairs, they could send someone down or call us direct.
Visions of the gazillions of dollars spent on custom-built high tech communications equipment and infrastructure for the Presidential Emergency Operations Center and the White House Situation Room swam before my eyes. Clearly, they could have just gone out to Radio Shack and done better.
In describing his early career as congressman from Wyoming and a member of the House Intelligence Committee, Dick Cheney serves up one very provocative little nugget.
In May 1987 I received a call from legendary CIA counterintelligence director James Jesus Angleton. He said that he had something of vital importance to tell me and that it could be conveyed only in person. …
I called Henry Hyde, the Intel Committee’s ranking Republican, and invited him to sit in on the meeting. A few days later, before our scheduled meeting, Jim Angleton died. I never learned what it was he wanted to tell me.
There is the plot of a great spy thriller right there in the story of the unconveyed Angleton secret.
Joseph Epstein finds in the recently published Cambridge History of the American Novel a perfect demonstration of exactly what has happened to university English departments in recent decades and thinks all this probably has something to do with the percentage of students majoring in English having been roughly cut in half over the same period.
Only 40 or 50 years ago, English departments attracted men and women who wrote books of general intellectual interest and had names known outside the academy—Perry Miller, Aileen Ward, Walter Jackson Bate, Marjorie Hope Nicolson, Joseph Wood Krutch, Lionel Trilling, one could name a dozen or so others—but no longer. Literature, as taught in the current-day university, is strictly an intramural game.
This may come as news to the contributors to “The Cambridge History of the American Novel,” who pride themselves on possessing much wider, much more relevant, interests and a deeper engagement with the world than their predecessors among literary academics. Biographical notes on contributors speak of their concern with “forms of moral personhood in the US novels,” “the poetics of foreign policy,” and “ecocriticism and theories of modernization, postmodernization, and globalization.”
Yet, through the magic of dull and faulty prose, the contributors to “The Cambridge History of the American Novel” have been able to make these presumably worldly subjects seem parochial in the extreme—of concern only to one another, which is certainly one derogatory definition of the academic. These scholars may teach English, but they do not always write it, at least not quite. A novelist, we are told, “tasks himself” with this or that; things tend to get “problematized”; the adjectives “global” and “post”-this-or-that receive a good workout; “alterity” and “intertextuality” pop up their homely heads; the “poetics of ineffability” come into play; and “agency” is used in ways one hadn’t hitherto noticed, so that “readers in groups demonstrate agency.” About the term “non-heteronormativity” let us not speak.
These dopey words and others like them are inserted into stiffly mechanical sentences of dubious meaning. “Attention to the performativity of straight sex characterizes . . . ‘The Great Gatsby’ (1925), where Nick Carraway’s homoerotic obsession with the theatrical Gatsby offers a more authentic passion precisely through flamboyant display.” Betcha didn’t know that Nick Carraway was hot for Jay Gatsby? We sleep tonight; contemporary literary scholarship stands guard.
“The Cambridge History of the American Novel” is perhaps best read as a sign of what has happened to English studies in recent decades. Along with American Studies programs, which are often their subsidiaries, English departments have tended to become intellectual nursing homes where old ideas go to die. If one is still looking for that living relic, the fully subscribed Marxist, one is today less likely to find him in an Economics or History Department than in an English Department, where he will still be taken seriously. He finds a home there because English departments are less concerned with the consideration of literature per se than with what novels, poems, plays and essays—after being properly X-rayed, frisked, patted down, like so many suspicious-looking air travelers—might yield on the subjects of race, class and gender. “How would [this volume] be organized,” one of its contributors asks, “if race, gender, disability, and sexuality were not available?”
Think of any customer experience that has made you wince or kick the cat. What jumps to mind? Waiting in multiple lines at the Department of Motor Vehicles. Observing the bureaucratic sloth and lowest-common-denominator performance of public schools, especially in big cities. Getting ritually humiliated going through airport security. Trying desperately to understand your doctor bills. Navigating the permitting process at your local city hall. Wasting a day at home while the gas man fails to show up. Whatever you come up with, chances are good that the culprit is either a direct government monopoly (as in the providers of K-12 education) or a heavily regulated industry or utility where the government is the largest player (as in health care).
Will thinks these authors are really on to something.
A generation that has grown up with the Internet “has essentially been raised libertarian,” swimming in markets, which are choices among competing alternatives.
And the left weeps. Preaching what has been called nostalgianomics, liberals mourn the passing of the days when there was one phone company, three car companies, three television networks, and an airline cartel, and big labor and big business were cozy with big government.
The America of one universally known list of Top 40 records is as gone as records. When the Census offered people the choice of checking the “multiracial” category, Maxine Waters, then chairing the Congressional Black Caucus, was indignant: “Letting individuals opt out of the current categories just blurs everything.” This is the voice of reactionary liberalism: No blurring, no changes, no escape from old categories, spin the world back to the 1950s.
“Declaration of Independents” is suitable reading for this summer of debt-ceiling debate, which has been a proxy for a bigger debate, which is about nothing less than this: What should be the nature of the American regime? America is moving in the libertarians’ direction not because they have won an argument but because government and the sectors it dominates have made themselves ludicrous. This has, however, opened minds to the libertarians’ argument.
The essence of which is the common-sensical principle that before government interferes with the freedom of the individual and of individuals making consensual transactions in markets, it ought to have a defensible reason for doing so. It usually does not.
The late Susan Sontag’s hyperintellectual perspective was formed as part of the post-WWII Beat, Queer, Żydokomuna (a Polish term for the well-known Jewish cultural penchant for Marxism) international left-wing counter-cultural intelligentsia. Sontag actually broke with the left in the early 1980s, after the news of what had happened in Cambodia came out, but inevitably over the course of her long literary career, Susan Sontag was normally to be found in the mainstream of contemporary political fashion, and she several times went on the record saying very foolish things.
In Saturday’s Wall Street Journal, the sharp-tongued Joseph Epstein took the occasion of the publication of a new memoir of life with Sontag by one of her former minions, Sempre Susan: A Memoir of Susan Sontag, to deliver some just criticism of some of Sontag’s worst statements and behavior and to put her in her place in cultural history once and for all.
In Epstein’s view, Susan Sontag was just a pretty girl with a remarkable gift for self-promotion.
A single essay, “Notes on ‘Camp,'” published in Partisan Review in 1964, launched Susan Sontag’s career, at the age of 31, and put her instantly on the Big Board of literary reputations. People speak of ideas whose time has not yet come; hers was a talent for promoting ideas that arrived precisely on time. “Notes on ‘Camp,'” along with a companion essay called “Against Interpretation,” vaunted style over content: “The idea of content,” Ms. Sontag wrote, “is today merely a hindrance, a subtle or not so subtle philistinism.” She also held interpretation to be “the enemy of art.” She argued that Camp, a style marked by extravagance, epicene in character, expressed a new sensibility that would “dethrone the serious.” In its place she would put, with nearly equal standing, such cultural items as comic books, wretched movies, pornography watched ironically, and other trivia.
These essays arrived as the 1960s were about to come to their tumultuous fruition and provided an aesthetic justification for a retreat from the moral judgment of artistic works and an opening to hedonism, at least in aesthetic matters. “In place of a hermeneutics,” Sontag’s “Against Interpretation” ended, “we need an erotics of art.” She also argued that the old division between highbrow and lowbrow culture was a waste not so much of time as of the prospects for enjoyment. Toward this end she lauded the movies (“cinema is the most alive, the most exciting, the most important of all the art forms right now”) as well as science fiction and popular music.
These cultural pronunciamentos, authoritative and richly allusive, were delivered in a mandarin manner. They read as if they were a translation, probably, if one had to guess, from the French. They would have been more impressive, of course, if their author were herself a first-class artist. This, Lord knows, Susan Sontag strained to be. She wrote experimental fiction that never came off; later in her career she wrote more traditional fiction, but it, too, arrived dead on the page.
The problem is that Sontag wasn’t sufficiently interested in real-life details, the lifeblood of fiction, but only in ideas. She also wrote and directed films, which were not well-reviewed: I have not seen these myself, but there is time enough to do so, for I have long assumed that they are playing as a permanent double feature in the only movie theater in hell.
Ouch!
Good abuse, but not entirely just. True, Susan Sontag yearned to write important novels, to score a breakthrough with some plus nouveau nouveau roman and also to rise to the level of auteur in the most challenging regions of the cinema where she felt herself most at home as a critic and a fan. And it is true that she was not particularly successful as a novelist. Her earlier novels The Benefactor and Death Kit were formalist experiments whose only excellence lay in inducing sleep with certainty. Her later novels seemed to me even less interesting.
Her films were clearly not successful. I cannot defend or criticize her four films, as I too am waiting to see them repeatedly in the hereafter with mild alarm. But Sontag does deserve better on the basis of her essays and her criticism.
It is easy to mock the manifesto calling for criticism as an erotics of art, rather than a hermeneutics. Susan Sontag’s rhetoric and critical aspirations were bold and uninhibited and a trifle prone to overreach, but her critical essays were also a breath of fresh and exotic air blowing into middlebrow American culture from the heights of Montparnasse.
Countless Americans found their way to the accessible cinema of Bergman, Fellini, and Truffaut beckoned by the beacon of Sontag’s travelogues from the remote and inaccessible regions of Antonioni, Bresson, and Ozu. Sontag made the concept of the avant-garde into the art cinema’s equivalent of “the banner with a strange device.”
It was not enough, this passionate young woman persuaded readers, to appreciate the familiar and the beautiful, it was necessary to press on, to leap beyond present artistic and cultural forms of understanding and expression, to conquer strange new heights and plumb unprecedented depths. Susan Sontag seemed, back then, a cultural Joan of Arc, leading the literary and cinematic audience forward in a headlong assault on possibility and the existing state of literature and the arts in a brave and determined effort to break through the barriers and liberate new forms of cultural expression and understanding.
Today, when I watch Last Year at Marienbad or L’Avventura, when I look into a novel by Nathalie Sarraute, I feel rather the way a veteran of a lost, romantic cause, like some aged grenadier of the wars of Napoleon, must have felt thinking back and remembering Austerlitz or Marengo. I smile ruefully at the memory of being young and naive enough to believe that this sort of thing would come to anything, but I also remember the aspirations and the hopes we entertained back then.
Susan Sontag is extremely vulnerable to all the criticisms to which mainstream Western high culture in the second half of the last century is vulnerable. She was naively romantic, prone to left-wing postures and insanity, and not above following the community of fashion herd into disgraceful positions. But she was still a heroine who, at times, at least, brought great honor to that same high culture and the same civilization her entire class was usually busy trying to destroy.
I knew her a little, and when I lived in New York, I would exchange greetings with her at the kind of key cultural events at which we would both invariably be present. I would also run into her sometimes at the revival houses, and we occasionally sat together and watched Mizoguchi or Renoir at Bleecker Street. Perhaps someday at the cinema in Tartarus mentioned by Mr. Epstein, I can sit beside her and discuss Duet for Cannibals and Brother Carl.
Garry Wills reviews, with well-deserved derision, Hubert Dreyfus and Sean Dorrance Kelly’s All Things Shining: Reading the Western Classics to Find Meaning in a Secular Age, a recent effort by two prominent academic philosophers (Mr. Dreyfus is a professor of Philosophy at Berkeley, Mr. Kelly is chairman of the Philosophy Department at Harvard) to find an authentic basis for values compatible with postmodern Continental Nihilism.
The authors set about to solve the problems of a modern secular culture. The greatest problem, as they see it, is a certain anxiety of choosing. In the Middle Ages, everyone shared the same frame of values. One could offend against that frame by sinning, but the sins were clear, their place in the overall scheme of things ratified by consensus. Now that we do not share such a frame of reference, each person must forge his or her own view of the universe in order to make choices that accord with it. But few people have the will or ability to think the universe through from scratch.
So how can one make intelligent choices? Hubert Dreyfus and Sean Dorrance Kelly call modern nihilism “the idea that there is no reason to prefer any answer to any other.” They propose what they think is a wise and accepting superficiality. By not trying to get to the bottom of things, one can get glimpses of the sacred from the surface of what they call “whoosh” moments—from the presence of charismatic persons to the shared excitement of a sports event. This last elation is sacred and unifying:
There is no essential difference, really, in how it feels to rise as one in joy to sing the praises of the Lord, or to rise as one in joy to sing the praises of the Hail Mary pass, the Immaculate Reception, the Angels, the Saints, the Friars, or the Demon Deacons.
I had a number of courses at Yale from the late John N. Findlay, whose normally lofty and Olympian demeanor could actually be ruffled by any reference to Heidegger (whose thought is the foundation of the Nihilism of Messrs. Dreyfus & Kelly).
Findlay’s customarily serene blue eyes would flash fire at the mention of the odious Swabian sexton’s son. I remember Findlay once pausing to explain, in Oxonian tones dripping with bitterness and contempt, that Heidegger was guilty of systematically confusing emotional states with metaphysical objects. As Dreyfus and Kelly demonstrate, that kind of thing leads, if not to murderous totalitarianism, at least to incontinent puerility.
If you want to go to naked parties, first you have to be admitted to the appropriate elite college, and even if you don’t want to go to naked parties, you are going to need to get your ticket stamped in our credential-obsessed society in order to get any kind of serious job.
In my day, places like Yale, in the aftermath of Sputnik, were scouring the country in search of anybody with good standardized test scores. All you had to do was ace the 9th grade Stanford-Binet IQ test, then do well on the SATs and alumni representatives of Yale would come and plead with you to accept a full scholarship. Things are a bit more complicated today.
The most darkly humorous aspect of this often hilarious book is its depiction of an admissions process that corrupts everything it touches.
It’s a process that discourages reticence by requiring students to write revealing and disingenuous personal essays; discourages thrift by regarding parental savings as fair game in the financial-aid evaluation; discourages intellectual curiosity by encouraging students to pursue grades rather than knowledge; and discourages honesty by transforming adolescence into a period of cynical calculation.
“At its most intense,” Mr. Ferguson writes, “the admissions process didn’t force kids to be Lisa Simpson; it turned them into Eddie Haskell. . . . It guaranteed that teenagers would pursue life with a single ulterior motive, while pretending they weren’t. It coated their every undertaking in a thin lacquer of insincerity. Befriending people in hopes of a good rec letter; serving the community to advertise your big heart; studying hard just to puff up the GPA and climb the greasy pole of class rank—nothing was done for its own sake.”
This stressful process practically demands cynicism from all parties. To “climb the page” in the closely watched U.S. News & World Report rankings, schools solicit applications so that they can increase the numbers they reject, thereby appearing more selective. Elite institutions claim to be open to all but devote wide swaths of their entering classes to athletes, the offspring of donating alumni, members of minority groups and others with “hooks” that give them an edge.
Matters have been complicated in recent years by the success of girls, who persist in outperforming boys academically in high school and outnumbering them in college. But a university may admit so many girls that a tipping point is reached, making boys even less likely to apply or, as Mr. Ferguson notes, “attracting the wrong kind of boys for the wrong reasons.”
Admissions officers have tried to rectify this problem by making schools more appealing to male applicants, expanding math and science departments, adding sports—and lowering admission standards for males, most of whom are white. Asian boys generally don’t need any such help. “After several generations of vicious racism,” Mr. Ferguson says, “followed by protest marches, civil rights lawsuits, accusations of bigotry, appeals to color-blindness, feminism, and eloquent invocations of the meritocratic ideal, the latest admissions trend in American higher education is affirmative action for white men. Just like the old days.”
Have you ever heard of anyone who drank while he worked? You’re thinking of Faulkner. He does sometimes — and I can tell right in the middle of a page when he’s had his first one.
E.M. Forster’s Howards End, according to Katherine Mansfield (1915)
Putting my weakest books to the wall last night I came across a copy of ‘Howards End’ and had a look into it. Not good enough. E.M. Forster never gets any further than warming the teapot. He’s a rare fine hand at that. Feel this teapot. Is it not beautifully warm? Yes, but there ain’t going to be no tea.
And I can never be perfectly certain whether Helen was got with child by Leonard Bast or by his fatal forgotten umbrella. All things considered, I think it must have been the umbrella.