Tuesday, March 27, 2012

Promising the Moon


It is impossible for any human being to describe an event (“Just the facts, ma’am”) completely devoid of “spin.” However much we try to be neutral, we inevitably bring our own presuppositions and values into the story, omitting this and emphasizing that according to what we consider to be important, what to be unimportant, and what already should be understood by the listener. There is always an element of advocacy, however unintended, in whatever we say. This is true (in a more constrained way than elsewhere) even in scientific literature; the actual collected data may be value-free, but the choice of what information to collect in the first place is not – nor, often, is what to conclude from it.

Most of us are aware of this and allow for it when we listen to others speak. In an election year such as this one, however, the airwaves (and optical fibers) are filled not just with spin but out-and-out propaganda. Arguably, the utter lack of pretense of objectivity in campaign ads is a kind of honesty, but the half-truths and glaring omissions typically found in them are so close to lies as to be indistinguishable from them. It’s always easiest to gasp at the nonsense in the ads of the opposing party, of course, while we dismiss the dubious rhetoric of our own favored candidate as inconsequential exaggerations born of campaign enthusiasm. Meantime we worry that those “still undecideds” who so often determine the outcome of elections will be so gullible as to be swayed by the other side’s misrepresentations rather than by our own.

This raises the question of just how gullible we are. Here is an old rough-and-ready test. Answer this question: Do you consider yourself to be a gullible person? If you answered “no,” you probably are. We all get taken in by nonsense every day, falling more easily for lies than for spin – for, while we expect a person to have a distinct perspective, we don’t always expect him to be untruthful by his own lights. The less we acknowledge our own capacity to be fooled, the more readily we will be.

Kids, as psychologists have shown repeatedly, believe just about anything they are told. Skepticism is something we learn the hard way; it is not the philosophy with which we start out. Adult gullibility is more circumscribed: adults generally believe new information if it does not conflict with what they already believe – they just have more pre-existing beliefs than kids and so appear less gullible. Furthermore, once having adopted some new belief, adults will tend to maintain it in the face of contrary evidence (“belief perseverance”).

For years there has been an internet prank about dihydrogen monoxide, a deadly chemical that can kill you if you breathe it. (See http://www.dhmo.org/facts.html.) Dihydrogen monoxide is, of course, water, but it is easy to get people to sign a petition to ban it. The presentation seems credible to them.

Tabloid newspapers thrive on readers’ willingness to believe the most amazing (sometimes libelous) things. The name, by the way, was borrowed from condensed pills, once called tabloids rather than tablets; the tabloid papers initially were condensed news. In the US, the rise of the tabloid is usually traced to The Great Moon Hoax of 1835. (The tabloid newspapers in the UK have a different provenance.) That year, The New York Sun announced that astronomer Sir John Herschel "by means of a telescope of vast dimensions and an entirely new principle" had discovered intelligent life on the moon and observed it in great detail. In a series of six illustrated articles, supposedly written by Herschel’s assistant “Dr. Andrew Grant,” the Sun described strange animals such as biped beavers and, most shockingly, winged moon people who pursued rather hedonistic lifestyles. Unlike the savage earth, the moon displayed a "universal state of amity among all classes of lunar creatures." The author added verisimilitude by describing an observatory accident in which the powerful telescope’s lens caught the sun’s rays and started a fire.

The New York Sun scored a huge hit. All the NY papers reprinted the series, which soon went international. The story was not only wildly popular but widely believed. The real Sir John was in South Africa at the time and didn’t learn of the hoax until afterward. If he was upset by it, he didn’t say so.

Lest we think that we are less easily fooled today, consider all the varied conspiracy theory advocates out there – I mean advocates for the silly theories: the ones we ourselves buy are obviously true. I guarantee that if a major media outlet ran a hoax story, presented as a “leak,” that SETI was in contact with aliens but was sitting on the information, the story would have no shortage of believers even after the hoax was revealed. “That’s just part of the cover-up!”

So, how do we fool those independents into voting our way? Apparently, the propagandists will have to tune their message to lie in a way that doesn’t conflict with those voters’ pre-existing beliefs. The side that does this the best should win.

1835 Lithograph of Life on the Moon




When We Were Gullible Enough to Believe that Full Scale Development of the Moon Was Only a Few Years Away: Intro to Groovy Movie Moon Zero 2 (1969)

 

Thursday, March 22, 2012

Dionysus Shrugged


This morning a neighbor, who is a freshman at a nearby community college, found herself wheelless and asked if I could give her a ride to the campus – she’d hitch a ride with a classmate for the return trip, she said. I rarely decline doing simple favors for pretty young women, even though these days I neither ask, nor expect, nor receive anything other than a cursory wave of thanks in return, so she got the ride. She wore a Nirvana tee shirt, which she values for its vintage/nostalgia aspects.

Nirvana was founded 25 years ago in 1987; “Smells Like Teen Spirit,” the band’s signature hit from the Nevermind album, was released in 1991. Both of those events were before my neighbor was born. Last September I blogged that nostalgia fads typically peak at around 25 years, specifically mentioning Nirvana as ripe for revival (see http://richardbellush.blogspot.com/2011/09/when-classy-tomatoes-had-gams.html); if only my stock market picks had been as good.

Unlike my neighbor, I do remember 25 years ago. In 1987, I was reasonably young and trouble-free. My immediate family was alive and well. I had a satisfyingly close-enough/far-enough relationship with a young lady on Long Island. (She ceased to find the close/far balance satisfactory in 1989.) Yes, 1987 was a fine year in my book. It was a fine year for lots of people until the stock market and housing market melted down in a spectacular fashion unseen since 1929; on October 19, the Dow dropped over 22% in a single trading session and, in the days afterward, went on to lose 36.7% overall before hitting bottom. A similar crash wasn’t seen again until 2008 when the market lost 18% over five days and 44% overall, though the Dow then recovered enough to end 2008 with a 34% loss off the year’s high. In 1987, however, I didn’t own any stocks and didn’t plan to sell my house, so I escaped all but a little personal damage from the economic events. Had I been asked on December 31, 1987 if anything substantive had changed in my lifestyle since January 1 of that year, I’d have said no.

Yet, something had changed, and it is brought to mind by my DVD selection last night, Song of the Thin Man (1947), a movie which preceded my birth about as much as Nirvana’s founding preceded my neighbor’s tee shirt, and which (along with the other 5 movies in the series) I sometimes watch for the vintage/nostalgia aspects. The movie features the bibulous husband-wife investigative team Nick and Nora Charles (William Powell and Myrna Loy); this was the last film of the series, so Nick and Nora never again overindulged in dry martinis on screen.

As it happens, 1987 almost surely was the last time my own blood alcohol level ever exceeded the legal definition of intoxication. Oh, I might have skirted close once or twice; I’m not a complete teetotaler, and the limit is lower than most people imagine. By US standards, “one drink” is 1.5 ounces (44.36 ml) of 80-proof (40%) liquor, which is equal to a 5-oz. glass of wine or a 12-oz. bottle of beer at the usual alcohol concentrations; a few drinks in quick succession for a 200-pound (91 kg) person, or as few as two for a 100-pounder, can be enough to exceed the limit, and alcohol wears off at a rate of only about one drink per hour. Still, I very much doubt I’ve passed it since then; a dedicated binge drinker would have little trouble exceeding my usual annual intake in a single night.
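For anyone who wants to check the arithmetic, here is a rough sketch using the Widmark approximation, which is my own choice of formula for illustration rather than anything official; actual values swing with sex, food, and metabolism, so treat the output strictly as a ballpark.

# Ballpark blood-alcohol arithmetic via the Widmark approximation. This is my
# own illustrative sketch, not legal or medical advice: one US standard drink
# is roughly 14 g of ethanol, and alcohol clears at about 0.015 percentage
# points per hour.
def estimate_bac(drinks, weight_lb, hours_since_start, r=0.68):
    """Rough blood alcohol concentration in percent.

    r is the Widmark body-water ratio: about 0.68 for men, 0.55 for women.
    """
    grams_alcohol = drinks * 14.0
    body_grams = weight_lb * 453.6
    peak = grams_alcohol / (body_grams * r) * 100.0
    return max(peak - 0.015 * hours_since_start, 0.0)

if __name__ == "__main__":
    # e.g., a 100-pound woman two drinks in, an hour after starting
    print(round(estimate_bac(drinks=2, weight_lb=100, hours_since_start=1, r=0.55), 3))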

This tilt to sobriety was not some big choice; it simply followed of its own accord from a lot of little choices made without any sense of a plan. I had been fond of spirits for more than a decade previously (Southern Comfort was my brand of choice), and still liked the flavor and initial kick; I wasn’t ever fond of the aftereffects, however. By the mid 80s the morning-after often loomed as too much to pay for a night-before. I don’t just refer to an out-and-out hangover, but also to the sensation of being off – of not being altogether sharp – that always lingered into the next evening after even relatively modest amounts of alcohol. (I concede that my constitutional robustness with regard to booze might not be as great as that of many – perhaps most – other people.) So, increasingly I ordered club soda, and soon grew to enjoy being the sober (and undrugged) guy at rock concerts, raves, and afterparties – these have grown less frequent for me, too, actually, but for rather different reasons. The un-stoned, un-buzzed condition at such events provides a different perspective – almost a unique one in the 80s – and ever since 1987 it’s the perspective I’ve maintained.

If that sounds like advocacy, it isn’t meant to be. Drunkenness is a bad habit (as is any artificial high), but it actually is of value in some contexts, and I’d never go all Eliot Ness and argue against it per se. Some people find alcohol clinically addictive, of course, and these should handle the stuff gingerly if at all, but the ancient Persians were onto something when, according to Herodotus, they considered important questions both sober and drunk to see if they reached the same conclusions both ways. The Greeks were onto something, too, with their revelries in honor of Dionysus. Nietzsche argued that the greatest art arises from a tension between the primally ecstatic Dionysian in human nature and the rational, ordered Apollonian. More recently, Camille Paglia (Sexual Personae) went further by saying that civilization itself is built on the tension, and that there is a biological basis for the division: the higher cortex that overlies the more primitive limbic brain. We need to be in touch with both to be complete people, and a few substances (alcohol the oldest and most venerable among them) can help to break the barrier between them – even without full-blown ancient-style Bacchanalia to attend.

I broke down those barriers quite enough earlier in my life by that method, so I don’t feel a nostalgic need to do so again. I’ll attend that Bacchanalian festival, though, if anyone cares to throw one, and I bet I still could get in touch with my Dionysian side even with a serum level below 0.08.


Tickling the Limbic Brain: Dragnet (1987) 


Friday, March 16, 2012

“Guns Aren't Lawful; Nooses Give; Gas Smells Awful; You Might As Well Live” – Dorothy Parker


Suffering the fate of so many near-future scifi novels, Gore Vidal’s 1978 Kalki has been overtaken by the actual future. The darkly funny apocalyptic tale is still an enjoyable read, however – nowadays as an alternative history. The central character Teddy Ottinger is an aviatrix (despite Teddy’s feminist principles, she likes the word too much to give it up) who narrates the novel in first person. The character is heavily influenced by Amelia Earhart, who in fact was a Vidal family friend. The plot: James K. Kelly, a former soldier at a secret U.S. bio-weapons facility in Saigon, declares himself to be Kalki, the final incarnation of Vishnu. He starts a cult and announces the end of the world. He pulls it off, killing every human on the planet except for five whom he inoculated: himself, his wife Lakshmi, and three “Perfect Masters” (including Ottinger), whom he has chosen partly for their useful skills and partly for their sterility. He and Lakshmi intend to repopulate the world themselves. The 1978 hardcover is still on my shelf, and is due for a dust-off and re-read.

This is a special year for apocalyptic tales, what with the world ending on December 21 and all. (Mine, by the way, is titled Slog and is posted at my Richard’s Novel Ideas blogsite.) Such stories are popular in any year though. It is an old and persistent genre. The oldest stories tend to be religious, and plenty of this type still crop up, but fictional doomsdays appeal to secularists, too, e.g. Terminator, The Road, Dr. Strangelove, War of the Worlds, etc.

Why is the genre so enduringly popular? Perhaps in part it is because so many of our everyday interactions with other people are quite frankly unpleasant. Auto drivers cut us off, sellers demand payment, customers complain, bosses yell, co-workers gossip, governments harass us with taxes and petty regulations, kids scream, lovers cheat, neighbors are rude, and so forth. It’s no wonder we sometimes fantasize about making them all go away and leave us alone. Well, nearly all of them: in most of these fantasies we allow ourselves a companion or two, often of a romantic nature.

Some folks don’t stop at the fantasy. They eagerly anticipate the end. Harold Camping and his followers, who predicted the end of the world in 1988, 1994, and 2011, are among the latest. History is full of such cults. A few cults actively try to jumpstart the end, such as Aum Shinrikyo, members of which carried out sarin gas attacks on the Tokyo subway in 1995. Still others are suicidal, such as the Heaven’s Gate sect members in their sneakers (what was with the Nikes?). The Shakers didn’t kill themselves, but their belief in lifelong celibacy nevertheless doomed the cult for obvious reasons.

One of the most idiosyncratic doomsday cults is the strongly pro-environment Church of Euthanasia. The founder, Reverend Chris Korda, explains that “the four pillars of the Church of Euthanasia are suicide, abortion, cannibalism and sodomy. The Church only has one commandment: Thou Shall Not Procreate. All four pillars help reduce the population.” Korda adds, “We're only tangentially interested in the fate of the human species, but we're most interested in the fate of the planet we happen to inhabit and dominate... so our support of those pillars is both symbolic and actual.” The church’s website used to list painless methods of suicide, but, because of civil litigation concerns, these have been removed.

When asked why the church members don’t follow their own advice and commit suicide, Korda answers with some logic that somebody has to stick around to spread the word.

I frequently sympathize with the “leave me alone” feeling, but I don’t really wish for the end of civilization. As it happens, I do think global population is too high, but lower birthrates are the best way to deal with that. I don’t begrudge anyone already here. As for that “leave me alone” sensation, a weekend of solitude is usually enough to dispel it. If one day that’s not enough, I’m thinking beachside in Panama. According to US News and World Report, a comfortable home in Las Tablas rents for under $300 per month. That should be a pleasant and affordable place to be a hermit. The Church of Euthanasia will have to do without me.


Church of Euthanasia Rally


Rude but Funny SNL faux Commercial for Keds Sneakers after the Heaven's Gate Event

 


Saturday, March 10, 2012

Mediocracy


We humans, individually, are lousy judges of ourselves, and we usually err on the side of self-flattery. Commentators on the human condition have noted as much for millennia, and researchers David Dunning and Justin Kruger (Cornell and NYU) have reconfirmed this experimentally. They find that people typically and routinely overestimate their relative performances on cognitive tests, whether the subject matter is grammar, spatial skills, deductive logic, or anything else; a large majority of test-takers – even when told their absolute scores – believe their scores to be above average in the group. Dunning and Kruger explain, “Because people choose what they think is the most reasonable and optimal option, the failure to recognize that one has performed poorly will instead lead one to assume that one has performed well. As a result, the incompetent will tend to grossly overestimate their skills and abilities.” (Kruger/Dunning, Unskilled and Unaware of It.) The best performers tend to be the most realistic about their relative scores, but this is a statistical phenomenon; someone in the top quartile, for example, can’t possibly rate himself or herself in any higher quartile, so he or she can only be accurate or misjudge on the low side, and few people do the latter.
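To see how much of that last point is sheer arithmetic, here is a toy illustration with made-up numbers (not Dunning and Kruger's data): hand every test-taker the identical, mildly flattering self-estimate, and the familiar pattern appears on its own. The bottom quartile overestimates wildly, while the top quartile has no room to overestimate at all.

# Toy illustration of the bounded-scale effect described above, with made-up
# numbers rather than Dunning and Kruger's data: everyone gives the same
# self-estimate ("a bit above average"), yet the bottom quartile ends up
# looking wildly overconfident and the top quartile cannot overestimate at all.
import statistics

n = 10_000
actual_pct = [100 * i / (n - 1) for i in range(n)]   # true percentile of each ranked test-taker
estimate = 66                                        # everyone's self-estimate: "a bit above average"

def avg_overestimate(q):
    """Average (self-estimate minus true percentile) within quartile q (0 = bottom)."""
    lo, hi = q * n // 4, (q + 1) * n // 4
    return statistics.mean(estimate - actual_pct[i] for i in range(lo, hi))

for q, label in enumerate(["bottom", "second", "third", "top"]):
    print(f"{label:>6} quartile: average self-overestimate = {avg_overestimate(q):+6.1f} points")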

Such self-delusion might be dismissed as an endearing human foible were it not for related misjudgments. People are just as bad at judging the performances of others in a group as they are at judging their own performances. Their own incompetency makes it difficult for them to recognize competency or the lack of it in others. Unsurprisingly, the worst performers are also the worst judges.

Some see a serious problem for democracy in results such as these. Dunning, quoted in Life’s Little Mysteries, remarks, "Very smart ideas are going to be hard for people to adopt, because most people don’t have the sophistication to recognize how good an idea is." Lacking the skills to judge, say, rival reform plans for financial industry regulation, average folks also lack the skills to judge rival planners; worse, they remain unaware of their own limitations in this regard. German sociologist Mato Nagel, inspired by Kruger/Dunning, modeled elections using a bell curve of voter leadership skills and the assumption that voters cannot recognize competency greater than their own. In his computer simulations, mediocre candidates always win elections – results that arguably mirror reality pretty well. While the best possible candidates don’t win in his model, on the plus side at least the worst possible candidates don’t win either.
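For the curious, here is a toy version of the kind of simulation Nagel describes. The mechanics are my own guesswork rather than his published model: I park the candidates at evenly spaced competence levels, draw the voters from a bell curve, and let each voter back the most competent candidate she can actually recognize as competent, meaning no more than a shade above her own level. Run it repeatedly and it captures the gist of the result: the middling candidate wins every time, while the best and the worst never come close.

# Toy election in the spirit of the Nagel simulation described above; the
# specific mechanics are my own assumptions, not his published model.
# Voters are bell-curved; candidates sit at fixed, evenly spaced competence
# levels; a voter votes for the most competent candidate no more than a
# small margin above her own level, because she cannot recognize greater
# competence than that. Voters who can't recognize anyone guess at random.
import random

def run_election(n_voters=20_000, margin=0.25):
    candidates = [x / 2 for x in range(-4, 5)]     # competence levels -2.0 .. +2.0
    votes = [0] * len(candidates)
    for _ in range(n_voters):
        v = random.gauss(0, 1)                     # this voter's own competence
        recognizable = [i for i, c in enumerate(candidates) if c <= v + margin]
        if recognizable:
            choice = max(recognizable, key=lambda i: candidates[i])
        else:
            choice = random.randrange(len(candidates))
        votes[choice] += 1
    return candidates[max(range(len(candidates)), key=votes.__getitem__)]

if __name__ == "__main__":
    winners = [run_election() for _ in range(20)]
    # Expected outcome: the winner lands on the middling candidate (competence
    # near zero) essentially every time; the extremes never win.
    print("winning competence levels over 20 elections:", sorted(set(winners)))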

Suspicion of democracy is nothing new in intellectual circles. Witness Aristotle, who tells us that there are three “true” or good forms of government: monarchy, aristocracy, and a constitutional republic. He then says there are three “perversions” of these three: tyranny, oligarchy, and democracy. The true forms, he says, are perverted into their evil twins whenever those in power pursue primarily their own interests instead of the common interest. The U.S. Founders were avid classicists all (as well as an elite potentially at risk from the majority), and they took Aristotle very much to heart. They accordingly tried to craft a constitutional republic that shackled the power of the majority by limiting government without handing power to an aristocracy or to a monarch. They regarded “democracy” as something to be feared, as quotations long attributed to them (a few of doubtful authenticity) attest:

Thomas Jefferson: “A democracy is nothing more than mob rule, where 51% of the people may take away the rights of the other 49%.”
Alexander Hamilton: “Real liberty is never found in despotism or in the extremes of democracy.”
John Adams: “There never was a democracy yet that did not commit suicide.”
Benjamin Franklin: “Democracy is two wolves and a lamb voting on what to have for lunch.”
etc.

The word “democracy” didn’t really lose its negative Aristotelian connotation on these shores until at least the era of Andrew Jackson.

Nowadays, few people pay much attention to Aristotle (or to Jefferson for that matter). “Democracy” and “good” are virtual synonyms in common political discourse and no serious national politician (at least in the Western democracies) would argue against either publicly. Yet, there is a lingering doubt among various elites that the majority has the wisdom to govern, or even the wisdom to know when it doesn’t have the wisdom.

This doubt surely is justified. Kruger/Dunning merely give numerical values to what already was obvious. But what is the alternative? Somebody has to have the last say. (Constitutional restrictions on the power of government are fine and I’m all in favor of them, but the judges who interpret those restrictions – often out of any useful existence – are still chosen directly or indirectly by the voters.) Should “somebody” be the few (which few?) or the many? Churchill’s hoary old comment still rings true: "Democracy is the worst form of government, except for all those other forms that have been tried from time to time." So, is a choice among mediocrities really the best we can do? Yeah, sadly enough, it looks like it.

What about the future? Well, if reports of the ongoing “dumbing down” of society have any truth, Mato Nagel’s results indicate our leaders will dumb down right along with it. Now there is an interesting prospect.

 

Monday, March 5, 2012

All the Breaks


Spring is just around the corner, and with it the enthusiastically anticipated Spring Break. Even in college, it seems, the best part about school is getting away from it.

This is not surprising. School and pain always have been entwined – for most of history quite literally. An ancient Egyptian school exercise survives in which a student scratched this praise for his teacher: "You beat me and knowledge entered my head." One suspects that he was told to write this rather than that he felt a sudden urge to record his gratitude for posterity. Corporal punishment remained a normal educational technique through the 19th century and it persisted in diminishing quantities well into the 20th. My mother spoke of public school teachers whacking her hands with rulers for petty infractions of school rules. Even the kindly schoolmaster in the sentimental Goodbye Mr. Chips (1939) whips a schoolboy while uttering the classic disclaimer that it gives him no pleasure.

Nowadays, beyond the tedium of all-too-frequently joyless classes, the greatest pain is financial, in the form of property taxes, tuition fees, and student loans. The rise in the cost of education has outpaced general inflation consistently for decades. There at last may be light on the horizon, however. Increasingly, education is moving online; even traditional brick-and-mortar schools often require students to take an online class or two in order to accustom them to the practice. Yet, these very classes expose the truth that live teachers and physical classrooms are no longer entirely necessary. A syllabus and an internet connection work as well most of the time. Online colleges, vastly cheaper than the conventional kind, already have a strong and expanding foothold.

This development was foreseen long ago by sci-fi author and futurist guru Isaac Asimov. In 1951, almost 30 years before the first home computers were marketed, Asimov imagined the devices serving as home teaching machines for children in his short story The Fun They Had. In this tale, two children in the 22nd century discover an old paper and ink school book in the attic. They are astonished to learn that large groups of children once attended “classes” together and that these classes were led by live human teachers. The story concludes, “Margie was thinking about how the kids must have loved it in the old days. She was thinking about the fun they had.”

This story was and still is widely anthologized in school books. I remember reading it in grade school. My teacher, missing the point of the story, told us how lucky we were to be living in those “old days.” She wasn’t alone. Educators almost always miss Asimov’s point. Asimov was saying the opposite. Don’t take my word for it. In his eclectic book The Roving Mind, Asimov specifically complains that school anthologies “together with certain letters I get, often make it clear that the story is interpreted non-ironically as a boost for contemporary education.” He then compares opponents of computer and online learning to people who once believed no automobile ever could replace a sensitive living horse. (Yes, I know I said something like that in a recent blog, but I do drive a car nonetheless.) His fictional future kids on their machines, able to proceed at their own pace and to break for play on their own schedules, are learning better and are having a better time doing it. As for the social aspects of school (many of them awful, really), they can be duplicated much more cheaply and simply without the big educational apparatus attached to them.

But what about Spring Break? If colleges cease to be a weary grind, will there be such enthusiasm about getting away from them on the beaches of Daytona or Panama City, Florida? I don’t think the state’s tourist industry need worry. Twenty-year-olds are pretty sure to show up for a party regardless. If you throw it, they will come.