Category Archive 'Nihilism'
20 Jul 2018
05 Jun 2018
Julie Mehretu, The Mural, 2010, Goldman Sachs, New York.
“Great nations write their autobiographies in three manuscripts—the book of their deeds, the book of their words, and the book of their art. Not one of these books can be understood unless we read the two others; but of the three, the only quite trustworthy one is the last.”
— John Ruskin, St. Mark’s Rest: The History of Venice (1877).
James McElroy leafs through “the book of their art” of today’s community of fashion elite and shudders.
In 2010, Goldman Sachs paid $5 million for a custom-made Julie Mehretu mural for their New York headquarters. Expectations are low for corporate lobby art, yet Mehretu’s giant painting is remarkably ugly—so ugly that it helps us sift through a decade of Goldman criticisms and get to the heart of what is wrong with the elites of our country.
Julie Mehretu’s “The Mural” is an abstract series of layered collages the size of a tennis court. Some layers are colorful swirls, others are quick black dash marks. At first glance one is struck by the chaos of the various shapes and colors. No pattern or structure reveals itself. Yet a longer look reveals a sublayer depicting architectural drawings of famous financial facades, including the New York Stock Exchange, The New Orleans Cotton Exchange, and even a market gate from the ancient Greek city of Miletus.
What are we to make of this? Mehretu herself confirms our suspicion that there is no overarching structure to the piece. “From the way the whole painting was structured from the beginning there was no part that was completely determined ever. It was always like the beginning lines and the next shapes. So it was always this additive process,” she said in an Art21 episode. …
Scottish philosopher Alasdair MacIntyre gave a lecture to Notre Dame’s Center for Ethics and Culture in 2000 about the compartmentalization of our ethical lives. He argued that in modern Western culture the different areas of our lives are governed by different ethical norms and standards. The example he gives is how a waiter at a restaurant acts differently in the kitchen than in front of the customer. In the kitchen it is normal to yell, curse, and touch the food with his bare hands; none of this would be appropriate in front of the customer. And when the waiter goes home, his personal life is dictated by yet a third set of norms. Or consider how the ethics of lying are treated differently during a job interview versus at home or at a law office. Like the painting in the Goldman Sachs lobby, our ethical lives seem to be made of different layers that don’t connect. Our culture no longer shares a single ethical narrative, and so our choices are not weighed against a consistent standard. Rather, people ask that their choices be accepted simply because they were made. When the bankers over-leveraged prior to 2008, they made a series of compartmentalized choices without considering the larger societal implications. They and the art in their lobby are the same.
I do not think the bankers at Goldman spend each morning scrutinizing their lobbies for larger ethical implications. Nihilistic art does not create nihilistic bankers. Yet both the elites of art and the elites of finance come out of the same culture. Both are indicative of where we are as a society. The Occupy Wall Street crowd may call Goldman a vampire squid wrapped around the face of humanity, but they never apply the same harsh rhetoric to our cultural institutions. A decade after the recession, our contemporary high art is more nihilistic than ever. This informs all areas of our culture. When powerful institutions are discussed we often critique in terms of isms: capitalism, liberalism, managerialism. We forget to mention that our institutions are made up of individuals who share the same culture that we all do. IRS auditors listen to Katy Perry. Federal judges watch comic book movies. The spies at the CIA read Zadie Smith novels. Our morality is informed in part by the art, both high and popular, that surrounds us.
18 Sep 2016
Jesse Singal explains who they are and how they operate.
“Why did he post a suicide note on livejournal before killing himself?” “I hear he did it for the lulz.”
The Chanterculture (as in 4chan –JDZ) predates the rise of Trump by years (Gamergate was obviously a big moment for it), but suffice it to say that the emergence of Trump, a larger-than-life walking middle finger to political correctness, hit this subculture like a mainlined bottle of Mountain Dew — Trump is their hero, and like so much else in their online world they have rendered him in cartoonish, superhero hues.
Part of what makes the Chanterculture confusing and difficult for outsiders to penetrate is that, as Bernstein puts it, “It unites two equally irrepressible camps behind an ironclad belief in the duty to say hideous things: the threatened white men of the internet and the ‘I have no soul’ lulzsters.” That is, some proportion of Chanterculture warriors actually believe the things they say — some dedicated real-life internet Nazis like Andrew Auernheimer, a.k.a. weev, came up in chan culture — while others are just in it for the outrage. (Many channers find the idea of having an actual ideology — or expressing it online, at least — rather distasteful, with the only exception being instances in which cloaking one’s online persona in an offensive ideology can elicit lulz.) …
Underlying chan culture is a fundamental hostility to earnestness and offense that plays out in how its members interact with each other and with outsiders. To wit: If you, a channer, post a meme in which Homer and Lisa Simpson are concentration camp guards about to execute Jewish prisoners, and I respond by pointing out that that’s fucked up, I’m the chump for getting upset. Nothing really matters to the average channer, at least not online. Feeling like stuff matters, in fact, is one of the original sins of “normies,” the people who use the internet but don’t really understand what it’s for (chaos and lulz) the way channers do. Normies, unlike channers — or the identity channers like to embrace — have normal lives and jobs and girlfriends and so on. They’re the boring mainstream. Normies don’t get it, and that’s why they’re so easily upset all the time. Triggering normies is a fundamental good in the chanverse.
And when channer and normie culture collide, normie culture indeed tends to spasm with offense. From the point of view of a normie, why would you post Holocaust imagery unless you actually hate Jews or want them to die? To which the channer responds internally, For the lulz. That is, for the sake of watching normies get outraged, and for recognition from their online buddies.
Read the whole thing.
04 Oct 2015
Nihilistic Password Security Questions.
– – – –
What is the name of your least favorite child?
In what year did you abandon your dreams?
What is the maiden name of your father’s mistress?
At what age did your childhood pet run away?
What was the name of your favorite unpaid internship?
In what city did you first experience ennui?
What is your ex-wife’s newest last name?
What sports team do you fetishize to avoid meaningful discussion with others?
What is the name of your favorite canceled TV show?
What was the middle name of your first rebound?
On what street did you lose your childlike sense of wonder?
When did you stop trying?
01 Jul 2015
Kit Wilson identifies the leading cultural disease of modern times.
We seek to make society blinkered, mindless and immature. Look at the way today’s businesses choose to market themselves. They invent names that imitate the nonsense words of babies: Zoopla, Giffgaff, Google, Trivago. They deliberately botch grammar in their slogans to sound naïve and cutesy: “Find your happy”, “Be differenter”, “The joy of done”. They make their advertisements and logos twee and ironic — a twirly moustache here, a talking dog there — just to show how carefree and fun they are.
Those in our society who actually still have children have them later and in smaller numbers than ever. Many simply choose to forego the responsibilities of parenthood altogether. Marriage is an optional extra â€” one from which we can opt out at any point, regardless of the consequences for the children.
Students expect to be treated like five-year-olds: one conference recently prohibited applause for fear it would, somehow, trigger a spate of breakdowns. Many of my fellow twentysomethings reach adulthood believing they can recreate in their everyday lives the woolly comforts of social media. They discover, with some surprise, that they cannot simply click away real confrontation, and — having never developed the psychological mechanisms to cope with it — instead seek simply to ban it.
The effects of social media don’t end there. A Pew Research Centre study last year found that regular social media users are far more likely than non-users to censor themselves, even offline. We learn to ignore, rather than engage with, genuine disagreement, and so ultimately dismantle the most important distinction between civil society and the playground — the ability to live respectfully alongside those with whom we disagree.
Social media assures us that the large civilisational questions have already been settled, that undemocratic nations will — just as soon as they’re able to tweet a little more — burst into glorious liberty, and that politics is, thus, merely a series of gestures to make us feel a bit better. Hence the bewildering range of global issues we seem to think can be somehow resolved with a sober mugshot and a meaningful hashtag.
In reality, our good fortune is an anomaly. We’ll face again genuine, terrifying confrontations of a kind we can scarcely imagine today. And we’ll need something a little more robust than an e-petition and a cat video.
Sadly, our philosophical approach seems to have been to paper over Nietzsche’s terrifying abyss with “Keep calm . . .” posters. If one were to characterise the West’s broad philosophical outlook today, it would be this: sentimental nihilism. We accept, as “risen apes”, that it’s all meaningless. But hey, we’re having a good time, right?
This is gleefully expressed by our society’s favourite spokespeople — comedians, glorifying the saccharine naivety of a culture stuck in the present. When the New York Times columnist Ross Douthat asked the comedian Bill Maher to locate the source of human rights, he simply shrugged his shoulders and said, “It’s in the laws of common sense.”
Read the whole thing.
20 May 2013
Gay Marriage Equality symbol used on social media
Mark Rothko, Black on Maroon, 1958, Tate Gallery
Nihilism as a psychological state will have to be reached, first, when we have sought a “meaning” in all events that is not there: so the seeker eventually becomes discouraged. Nihilism, then, is the recognition of the long waste of strength, the agony of the “in vain,” insecurity, the lack of any opportunity to recover and to regain composure—being ashamed in front of oneself, as if one had deceived oneself all too long.—This meaning could have been: the “fulfillment” of some highest ethical canon in all events, the moral world order; or the growth of love and harmony in the intercourse of beings; or the gradual approximation of a state of universal happiness; or even the development toward a state of universal annihilation—any goal at least constitutes some meaning. What all these notions have in common is that something is to be achieved through the process—and now one realizes that becoming aims at nothing and achieves nothing.—Thus, disappointment regarding an alleged aim of becoming as a cause of nihilism: whether regarding a specific aim or, universalized, the realization that all previous hypotheses about aims that concern the whole “evolution” are inadequate (man no longer the collaborator, let alone the center, of becoming).
Nihilism as a psychological state is reached, secondly, when one has posited a totality, a systematization, indeed any organization in all events, and underneath all events, and a soul that longs to admire and revere has wallowed in the idea of some supreme form of domination and administration (—if the soul be that of a logician, complete consistency and real dialectic are quite sufficient to reconcile it to everything). Some sort of unity, some form of “monism”: this faith suffices to give man a deep feeling of standing in the context of, and being dependent on, some whole that is infinitely superior to him, and he sees himself as a mode of the deity.—“The well-being of the universal demands the devotion of the individual”—but behold, there is no such universal! At bottom, man has lost the faith in his own value when no infinitely valuable whole works through him; i.e., he conceived such a whole in order to be able to believe in his own value.
Nihilism as a psychological state has yet a third and last form.
Given these two insights, that becoming has no goal and that underneath all becoming there is no grand unity in which the individual could immerse himself completely as in an element of supreme value, an escape remains: to pass sentence on this whole world of becoming as a deception and to invent a world beyond it, a true world. But as soon as man finds out how that world is fabricated solely from psychological needs, and how he has absolutely no right to it, the last form of nihilism comes into being: it includes disbelief in any metaphysical world and forbids itself any belief in a true world. Having reached this standpoint, one grants the reality of becoming as the only reality, forbids oneself every kind of clandestine access to afterworlds and false divinities—but cannot endure this world though one does not want to deny it.
What has happened, at bottom? The feeling of valuelessness was reached with the realization that the overall character of existence may not be interpreted by means of the concept of “aim,” the concept of “unity,” or the concept of “truth.” Existence has no goal or end; any comprehensive unity in the plurality of events is lacking: the character of existence is not “true,” is false. One simply lacks any reason for convincing oneself that there is a true world. Briefly: the categories “aim,” “unity,” “being” which we used to project some value into the world—we pull out again; so the world looks valueless.
— Friedrich Nietzsche, The Will to Power
Quoted directly from Fred Lapides.
27 Aug 2012
What have you to recommend? I answer at once, Nothing. The whole current of thought and feeling, the whole stream of human affairs, is setting with irresistible force in that direction. The old ways of living, many of which were just as bad in their time as any of our devices are in ours, are breaking down all over Europe, and are floating this way and that like haycocks in a flood. Nor do I see why any wise man should expend much thought or trouble on trying to save their wrecks. The waters are out and no human force can turn them back, but I do not see why as we go with the stream we need sing Hallelujah to the river god.
— James Fitzjames Stephen, Liberty, Equality, Fraternity, 1873.
61 years ago, the young William F. Buckley Jr. launched what would become a splendiferous career as celebrity commentator and public intellectual by publishing, not long after his graduation from Yale, a scathing critique of his alma mater titled God and Man at Yale.
God and Man at Yale represented Buckley’s first major effort at “standing athwart history yelling ‘Stop!,'” and we may now read with a certain poignancy the report of Nathan Harden, Sex and God at Yale, compiled at a posting station considerably farther along the road to Hell in a handbasket, demonstrating just how little either History or Yale was listening.
The youthful naysayer of 1951 was a classic version of the privileged insider: rich, handsome, and stylish, educated at elite preparatory schools in Britain and the United States. At Yale, Buckley was the kind of celebrity undergraduate BMOC that basically ceased to exist after coeducation: Captain of the Debating Team, Chairman of the Daily News, and, of course, member of Skull and Bones.
The contrast between Buckley and Harden could scarcely be more extreme. Nathan Harden was home-schooled, knows what manual labor is like, and grew up in a family that was short of cash, living all over the Southern United States. Harden was turned down by Yale initially, attended one of the Claremont Colleges, then got into a one-term visiting-student program at Yale, tried transferring and was turned down again, and finally re-applied and was accepted. He was 22 years old and already married by the time he started college in California, so he must have been 24 (and still married) by the time he finally got to Yale as a degree candidate. Harden did his junior year abroad in Rome and, though he speaks with some familiarity of Political Union debates, he clearly never became any kind of BMOC and obviously did not get into Bones.
Nathan Harden came to Yale with the ability to appreciate the richness of her centuries of history and tradition. He speaks openly of the intense pleasure to be found in exploring Yale’s incomparably rich academic offerings served up by some of the greatest living minds while living in the midst of a community of the most spectacularly talented people of one’s own generation sharing the same Arcadian existence. He also understands exactly why Yale is superior to Harvard.
But… like any representative of ordinary America studying at one of America’s most elite universities today, Nathan Harden was also frequently shocked by his alma mater’s estrangement from, and hostility toward, the America he came from, and appalled by the strange gods of Multiculturalism and Political Correctness who have ousted the Congregationalist Jehovah from that ancient university’s temple.
For Nathan Harden, Sex Week at Yale (which, we learn from him, recently constituted an eleven-day biennial Saturnalia of Smut in which all of the university’s best-known lecture halls (!) were turned over to demonstrators of sex toys, porn stars, and dirty-film moguls to dispense technical instruction and even career advice to the Yale undergraduate community) serves as a crucial synecdoche for the moral crisis at the heart of American university education generally, and particularly at Yale.
Harden argues that “For God, For Country, and For Yale,” Yale’s motto, has become not so much a series of aspirational ends ranked in hierarchical order as an accurate historical description of Yale’s own primary locus of value.
Yale was founded as a college, intended to serve God by educating Congregationalist clergymen to fill the pulpits of the Colony of Connecticut. Over time it evolved into a national institution educating an elite group of leaders in business, the military, politics, the arts, and the sciences for the United States. Today Yale is decidedly a hotbed of infidelity to both Christianity and the United States. Secular Progressivism has thoroughly replaced Congregationalism and Christianity, and loyalty to an international elite community of fashion has supplanted any particularist sentiment in favor of the United States. The Yale Administration operates neither to serve God nor Country, but instead directs its efforts entirely toward forwarding its own goals and enhancing its own prestige.
Armed with an almost unequaled cash endowment, an equally impressive historical legacy and accumulation of multi-generational glory, and a concomitant ability to attract talent and additional funding, the Yale Administration is formidably equipped to mold, educate, and inform in any direction it wishes. But as Nathan Harden explains, the problem that is increasingly evident is the University Administration’s practical inability to distinguish good from bad, right from wrong, or up from down in the complex contemporary world of conflicting claims.
Presidents Angell, Seymour, and Griswold would have had no difficulty at all in understanding why the University ought not to lend the principal lecture halls in Linsley-Chittenden, W.L. Harkness, and Sheffield-Sterling-Strathcona Halls for porn stars to demonstrate sexual techniques or heads of pornography studios to proffer career advice. Richard Levin obviously does not understand why Sex Week at Yale is inappropriate (to say the least), any more than he understands why Yale should not be devoting 10% of its undergraduate places to foreigners, or why Yale should not be renting out its name and reputation to Third World governments.
Harden understands the problem and, though he has very recently graduated, he’d be a lot more qualified to run Yale than the current administration.
Yale… enjoys a strong tradition of educating American political leaders. Over the course of its first two hundred years, as Yale’s spiritual mission faded slowly into the background, a political purpose emerged as a new defining agenda. Serving country became a proxy for serving God. A patriotic purpose replaced a spiritual one. It was assumed for a long time that the interests of America were, by extension, Yale’s interests as well. A large percentage of Yale graduates enrolled in the military immediately following graduation. And, of course, many went on to hold high political office.
The diversity that came to Yale in the sixties was a good thing. Other changes were less positive. In the late 1960s, Yale’s patriotic ethos disintegrated in the face of pressures from the radical new left. The old-guard liberals, who had long governed the university, were replaced by a new, younger set. The old-guard liberals were in the mold of Jack Kennedy—they were New Deal liberals who were sympathetic to religion and proud of their country. They were traditionalists. The new leftists, on the other hand, wanted radical social transformation. They wanted to challenge the old moral assumptions and revolutionize the economic system. Empowered by the backlash against the Vietnam War, and a sanctimonious belief in the justness of their cause, students rose up and violently took over the agenda of the American left. … About this same time, the patriotic purpose that had defined the university for two hundred years disappeared. The faculty had voted the year before to revoke academic credit for ROTC courses. Later, Yale moved to restrict military recruiters’ access to students. With the destruction of Yale’s patriotic ethos, the last remaining sense of Yale having any higher educational purpose in service of the nation went out the door.
That isn’t to say that Yale ceased being political. But from that point onward, Yale’s political agenda was no longer tied to American interests. In fact, Yale’s political climate came to be defined more and more by anti-Americanism. Economic theories in opposition to free markets became prevalent. Identity politics and interest-group politics began to take over academic life, endangering free speech in the name of cultural sensitivity, and ushering in a new era of suffocating political correctness.
The shift happened quickly. Only a couple of decades before, during World War II, faculty sentiment had been united against America’s enemies in Nazi Germany and Fascist Italy. Now, if the topic of international affairs happened to be raised in the faculty lounge, it had become fashionable to speak of America as the bad guy. Saying nice things about America’s enemies became a mark of intellectual sophistication—of rising above mindless nationalism. Patriotism, like religion, had become a mark of low intelligence, an anachronism. …
Yale is a place where one can find people expressing almost every imaginable viewpoint and belief system. But here is the unanswerable question: How does a secular university judge between the competing moral claims of its members when those claims breach the private sphere and enter the public realm? …
Nihilism is, ultimately, where Yale is headed. Yale was built in order to nurture ideas that would elevate the soul and advance human understanding, but it now has no governing moral principle. As a result, the knowledge generated there is divorced from any larger human purpose. Apart from a kind of vague appreciation of certain concepts like tolerance and diversity, Yale is a moral vacuum. Therefore, almost anything goes. Yale is among a dwindling number of institutions that provide a classical liberal education, focusing on the great books of the Western canon—topped off with porn in HD. As I observed within its walls images of women being beaten and humiliated for no other reason than the pleasure and profit of others, I became aware that I was witnessing much more than the decline of a great university. I was witnessing nothing less than a prophetic vision of America’s descent into an abyss of moral aimlessness, at the hands of those now charged with educating its future leaders.
25 Mar 2011
Garry Wills reviews, with well-deserved derision, Hubert Dreyfus and Sean Dorrance Kelly’s All Things Shining: Reading the Western Classics to Find Meaning in a Secular Age, a recent effort by two prominent academic philosophers (Mr. Dreyfus is a professor of Philosophy at Berkeley, Mr. Kelly is chairman of the Philosophy Department at Harvard) to find an authentic basis for values compatible with postmodern Continental Nihilism.
The authors set about to solve the problems of a modern secular culture. The greatest problem, as they see it, is a certain anxiety of choosing. In the Middle Ages, everyone shared the same frame of values. One could offend against that frame by sinning, but the sins were clear, their place in the overall scheme of things ratified by consensus. Now that we do not share such a frame of reference, each person must forge his or her own view of the universe in order to make choices that accord with it. But few people have the will or ability to think the universe through from scratch.
So how can one make intelligent choices? Hubert Dreyfus and Sean Dorrance Kelly call modern nihilism “the idea that there is no reason to prefer any answer to any other.” They propose what they think is a wise and accepting superficiality. By not trying to get to the bottom of things, one can get glimpses of the sacred from the surface of what they call “whoosh” moments—from the presence of charismatic persons to the shared excitement of a sports event. This last elation is sacred and unifying:
There is no essential difference, really, in how it feels to rise as one in joy to sing the praises of the Lord, or to rise as one in joy to sing the praises of the Hail Mary pass, the Immaculate Reception, the Angels, the Saints, the Friars, or the Demon Deacons.
How proud Harvard must be.
Read the whole thing.
I had a number of courses at Yale from the late John N. Findlay, whose normally lofty and Olympian demeanor could actually be ruffled by any reference to Heidegger (whose thought is the foundation of the Nihilism of Messrs. Dreyfus & Kelly).
Findlay’s customarily serene blue eyes would flash fire at the mention of the odious Swabian sexton’s son. I remember Findlay once pausing to explain, in Oxonian tones dripping with bitterness and contempt, that Heidegger was guilty of systematically confusing emotional states with metaphysical objects. As Dreyfus and Kelly demonstrate, that kind of thing leads, if not to murderous totalitarianism, at least to incontinent puerility.
Hat tip to Karen L. Myers.
25 Sep 2010
In October of 1903, Otto Weininger, a 23-year-old prodigy who had recently finished his first book and was widely regarded as a genius, rented a room in the house in Vienna where Ludwig van Beethoven had died 76 years earlier, and shot himself in the heart.
Weininger, who had received his doctorate at an unusually young age, wrote a book titled Geschlecht und Charakter (Sex and Character), arriving at extremely troubling conclusions. Weininger believed that human beings and human culture and society inevitably contain a mixture of positive, active, productive, moral, and logical (male, Christian) traits and impulses as well as their passive, unproductive, amoral, and sensual (female and Jewish) opposites.
Weininger was of Jewish descent, afflicted with homosexual inclinations, and in despair over the decline of modern Western civilization due to the ascendancy of the female/Jewish impulses he deplored; so, acting in consistency with his philosophical conclusions, Weininger took his own life.
Last Saturday, Mitchell Heisman, a 35-year-old psychology graduate of the University at Albany, shot himself in the head in front of Memorial Church in Harvard Yard, within sight of a campus tour. Heisman had been residing nearby in Somerville, Massachusetts, supporting himself on a legacy from his father and by working in some Boston-area bookshops, while pursuing his own studies and working on a (so far unpublished) book.
Mitchell Heisman published on the Internet a 1,905-page suicide note in which he explains his actions as an experiment in nihilism undertaken in search of objectivity. Heisman, who like Weininger was of Jewish descent, is critical of liberal democracy, egalitarianism, materialism, modernism, and Jewish ethical opposition to “biological realism and the eugenic evolution of biological life.”
The suicide note PDF is a fascinating document, displaying considerable learning and evidencing a sharp sense of humor and originality of thought.
The most rigorous objectivity implies indifference to the consequence of objectivity, i.e. whether the consequences of objectivity yield life or death for the observer. In other words, the elimination of subjectivity demands indifference to self-preservation when self-preservation conflicts with objectivity. The attempt at rigorous objectivity could potentially counter the interests of self-preservation or even amount to rational self-destruction. The most total objectivity appears to lead to the most total self-negation. Objectivity towards biological factors is objectivity towards life factors. Indifference to life factors leads to indifference between the choices of life and death. To approach objectivity with respect to self-interest ultimately leads to indifference to whether one is alive or dead.
The dead are most indifferent; the least interested; the least biased; the least prejudiced one way or the other. What is closest to total indifference is to be dead. If an observer hypothesizes death then, from that perspective, the observer has no vested interests in life and thus possible grounds for the most objective view. The more an observer is reduced to nothing, the more the observer is no longer a factor, the more the observer might set the conditions for the most rigorous objectivity.
It is likely that most people will not even consider the veracity of this correlation between death and objectivity even if they understand it intellectually because most will consciously or unconsciously choose to place the interests of self-preservation over the interests of objectivity. In other words, to even consider the validity of this view assumes that one is willing and able to even consider prioritizing objectivity over one’s own self-preservation. Since it is not safe to simply assume this on an individual level, let alone a social level, relatively few are willing and able to seriously address this issue (and majority consensus can be expected to dismiss the issue). In short, for most people, including most “scientists”, overcoming self-preservation is not ultimately a subject for rational debate and objective discussion.
Maximizing objectivity can be incompatible with maximizing subjective interests. In some situations, anything less than death is compromise. The choice between objectivity and self-preservation may lead one to a Stoic’s choice between life and death.
Whereas the humanities cannot be what they are without human subjectivities, the inhumanities, or hard sciences, require the subjective element be removed as much as possible as sources of error. Objectivity leads towards the elimination of subjectivity, i.e. the elimination of one’s “humanity”. A value free science has no basis on which to value human things over non-human things and thus no basis to value life over death or vice versa. Social science will become equal to the standards of physical science when social scientists overcome the subjective preference for the life of humanity over the death of humanity.
To attempt to resolve the contradiction of myself as a scientist and a human being on the side of science leads towards viewing myself as a material object. While this contradiction may be impossible to resolve, the closest approximation of reconciliation may consist of the state of death. In death, the teleologically-inclining biases of human subjectivity that hinder one from viewing oneâ€™s self as a material object are eliminated.
I cannot fully reconcile my understanding of the world with my existence in it. There is a conflict between the value of objectivity and the facts of my life. This experiment is designed to demonstrate a point of incompatibility between “truth” and “life”. In this experiment I hypothesize that the private separation of facts and values, when disclosed to the wider social world, creates a conflict of interest between the value of sociobiological objectivity and the “facts” of my sociobiological existence such that it leads to a voluntary and rational completion of this work in an act of self-destruction. …
How far would one be willing to go in pursuit of scientific objectivity? Objectivity and survival are least compatible when objectivity becomes a means of life, subordinate to life as opposed to life subordinated to objectivity. If the greatest objectivity implicates confronting the most subjective biases, this implicates confronting those truths that most conflict with the subjective will to live. By simply changing my values from life values to death values, and setting my trajectory for rational biological self-destruction, I am able to liberate myself from many of the biases that dominate the horizons of most people’s lives. By valuing certain scientific observations because they are destructive to my life, I am removing self-preservation factors that hinder objectivity. This is how I am in a position to hypothesize my own death.
So if objectivity is not justified as end, then objectivity can be a means of rational self-destruction through the overcoming of the bias towards life. Rational self-destruction through the overcoming of the bias towards life, in turn, can be a means of achieving objectivity. And this means: To will death as a means of willing truth and to will truth as a means of willing death. …
Why am I doing this? Ah, yes, now I remember the punchline: I’ll try anything once!
There is nothing to take seriously!
I have not had time yet to read the whole thing, so I’m not completely sure just what I think of all of the late Mr. Heisman’s opinions, but I am intrigued enough to have resolved to read all of it. I’ve even downloaded and saved a copy.
My guess, at this point, is that his book is probably well worth publishing.
New York Post
07 Jun 2010
…for many a time
I have been half in love with easeful Death,
Call’d him soft names in many a mused rhyme,
To take into the air my quiet breath;
Now more than ever seems it rich to die,
To cease upon the midnight with no pain.
In the New York Times, Princeton’s professional ethicist supreme Peter Singer admires the work of antinatalist South African philosopher David Benatar:
Schopenhauer’s pessimism has had few defenders over the past two centuries, but one has recently emerged, in the South African philosopher David Benatar, author of a fine book with an arresting title: “Better Never to Have Been: The Harm of Coming into Existence.” One of Benatar’s arguments trades on something like the asymmetry noted earlier. To bring into existence someone who will suffer is, Benatar argues, to harm that person, but to bring into existence someone who will have a good life is not to benefit him or her. Few of us would think it right to inflict severe suffering on an innocent child, even if that were the only way in which we could bring many other children into the world. Yet everyone will suffer to some extent, and if our species continues to reproduce, we can be sure that some future children will suffer severely. Hence continued reproduction will harm some children severely, and benefit none.
Benatar also argues that human lives are, in general, much less good than we think they are. We spend most of our lives with unfulfilled desires, and the occasional satisfactions that are all most of us can achieve are insufficient to outweigh these prolonged negative states. If we think that this is a tolerable state of affairs it is because we are, in Benatar’s view, victims of the illusion of pollyannaism. This illusion may have evolved because it helped our ancestors survive, but it is an illusion nonetheless. If we could see our lives objectively, we would see that they are not something we should inflict on anyone.
Here is a thought experiment to test our attitudes to this view. Most thoughtful people are extremely concerned about climate change. Some stop eating meat, or flying abroad on vacation, in order to reduce their carbon footprint. But the people who will be most severely harmed by climate change have not yet been conceived. If there were to be no future generations, there would be much less for us to feel guilty about.
So why don’t we make ourselves the Last Generation on Earth? If we would all agree to have ourselves sterilized then no sacrifices would be required — we could party our way into extinction! …
Is a world with people in it better than one without? Put aside what we do to other species — that’s a different issue. Let’s assume that the choice is between a world like ours and one with no sentient beings in it at all. And assume, too — here we have to get fictitious, as philosophers often do — that if we choose to bring about the world with no sentient beings at all, everyone will agree to do that. No one’s rights will be violated — at least, not the rights of any existing people. Can non-existent people have a right to come into existence?
I do think it would be wrong to choose the non-sentient universe. In my judgment, for most people, life is worth living. Even if that is not yet the case, I am enough of an optimist to believe that, should humans survive for another century or two, we will learn from our past mistakes and bring about a world in which there is far less suffering than there is now. But justifying that choice forces us to reconsider the deep issues with which I began. Is life worth living? Are the interests of a future child a reason for bringing that child into existence? And is the continuance of our species justifiable in the face of our knowledge that it will certainly bring suffering to innocent future human beings?
Friedrich Nietzsche (Thus Spake Zarathustra, 1883–1885, prologue, §5) predicted with complete accuracy that the result of nihilism would be Benatars and Singers.
Alas! There cometh the time when man will no longer give birth to any star. Alas! There cometh the time of the most despicable man, who can no longer despise himself.
Lo! I show you THE LAST MAN.
“What is love? What is creation? What is longing? What is a star?”–so asketh the last man and blinketh.
The earth hath then become small, and on it there hoppeth the last man who maketh everything small. His species is ineradicable like that of the ground-flea; the last man liveth longest.
“We have discovered happiness”–say the last men, and blink thereby.
They have left the regions where it is hard to live; for they need warmth. One still loveth one’s neighbour and rubbeth against him; for one needeth warmth.
Turning ill and being distrustful, they consider sinful: they walk warily. He is a fool who still stumbleth over stones or men!
A little poison now and then: that maketh pleasant dreams. And much poison at last for a pleasant death.
One still worketh, for work is a pastime. But one is careful lest the pastime should hurt one.
One no longer becometh poor or rich; both are too burdensome. Who still wanteth to rule? Who still wanteth to obey? Both are too burdensome.
No shepherd, and one herd! Every one wanteth the same; every one is equal: he who hath other sentiments goeth voluntarily into the madhouse.
“Formerly all the world was insane,”–say the subtlest of them, and blink thereby.
They are clever and know all that hath happened: so there is no end to their raillery. People still fall out, but are soon reconciled–otherwise it spoileth their stomachs.
They have their little pleasures for the day, and their little pleasures for the night, but they have a regard for health.
“We have discovered happiness,”–say the last men, and blink thereby.