Friday, August 31, 2012

Pressing Question

Yesterday I retrieved a couple of pieces of trim from above my garage for a minor repair task that didn’t require matching pieces. My dad was a builder, so he stashed numerous leftovers up there: mismatched doorknobs, window screens, paneling boards, and so forth, some now more than 50 years old. While rummaging about, I also noticed the two wine presses stored up there, one from each set of grandparents.

All my grandparents experienced Prohibition while in their 20s and early 30s, and both sets lived on farms, so it is not surprising that they bypassed the law by fermenting their own wine, using not only grapes but also apples, elderberries, blackberries, dandelions, and whatever else was handy. (Making wine at home requires patience, but it isn’t really difficult: juice, water, sugar, and yeast are the basic ingredients; different recipes use different proportions, and additives such as honey are always an option.) Left to ferment naturally, wine maxes out at about 12% alcohol; beyond that concentration it kills off its own yeast, so the process is self-limiting. Nevertheless, so far as I know, my grandparents didn’t supplement the table wine with moonshine, which is just as well. My dad remembered from his childhood repeatedly seeing neighbors’ barns and houses burn in the distance with the characteristic blue flames of distillery accidents.

I’ll choose to assume the wine was just for medicinal purposes. Studies as early as 1900 revealed the value of moderate alcohol consumption in reducing heart disease – studies dismissed by Prohibitionists as hokum but recently reconfirmed. In fact, moderate consumption is presently tied to a multitude of health benefits. One health site poses and answers the following question:

“Why drink to reduce the risk of heart disease? Wouldn't eating a good diet, exercising, and losing weight do the same thing? No, it wouldn't. The moderate consumption of alcohol appears to be more effective than most other lifestyle changes that are used to lower the risk of heart and other diseases.”

Of course, a good diet and exercise are beneficial too, but, apparently, if you are going to pick only one intervention, booze is best, provided you keep the amount down to one or two drinks per day. (1 drink = 1 beer = 1 glass of wine = 1 shot of 80-proof liquor.)

Though scientific studies to this effect are fairly recent, even the ancients were well aware of the health benefits of wine. They knew it made water safe to drink and that it reduced infections in open wounds and surgeries. It also was a common ingredient in medicines – sometimes the only ingredient.

The foremost medical authority of the ancient world was Galen (129 AD – c.200). Today he is known mostly for what he got wrong, but, given that Roman law hampered him by forbidding the dissection of human cadavers, he got a surprising amount right, including his analysis of voluntary muscle control via a brain-centered nervous system. (Aristotle, by contrast, thought the brain merely cooled the blood; outside the Capital Beltway, this is untrue.) In addition to being an author of medical texts, he was a skilled hands-on surgeon noted for his successful cataract operations. Galen also was the personal physician of the emperor Marcus Aurelius. Nearly all the medicines he prescribed were based on wine. Just to be sure the emperor got the very best medicine, Galen “in the execution of my duty” thoroughly sampled the wines in Marcus Aurelius’ huge wine cellar: “I kept on until I found a wine without a trace of bitterness.” It was a 20-year-old Falernian white aged to a golden yellow; you still can buy Falerno bianco, by the way, if you wish. Unfortunately, it wasn’t enough. Or maybe it was too much. When Marcus fell ill with the plague (probably smallpox) that badly weakened the legions and the empire at a critical moment in Rome’s history, Galen had him consume nothing but bowls of his medicine (mostly that 20-year-old Falernian) for a week. Marcus Aurelius did what he was told and died. Maybe he would have died then anyway, but one can’t help wondering if being bombed for a week really was helpful to his natural defenses.

And there we have the problem: that word “moderate.” I’m sure there are people in the world who drink one or two glasses every day, but never more than two. I’ve never met any, but they probably exist. (Perhaps it is more common outside the United States and northern Europe.) Everyone I’ve ever known has drunk substantially less or substantially more, and at a far less even pace. Let’s be clear: a good average won’t do. Seven to fourteen drinks on Saturday do not confer the same benefits as one or two drinks daily throughout the week. Binge drinking is defined as 5 or more drinks per occasion, and its effects are deleterious, not healthful. Those of my acquaintance who drink every day also binge drink frequently, typically every weekend. Those who don’t drink daily either binge drink occasionally, scarcely drink at all, or are actual teetotalers. (Disclosure: in my 20s sporadic over-indulgence was something of a hobby; since age 30, I’ve been close enough to being a teetotaler to share all the health risks.)

Don’t get me wrong. I have no quarrel with the vine. Alcohol is deeply embedded in human culture, and has been since before the dawn of civilization. I don’t want Prohibition back, and I think the reset of the drinking age to 21 in the US was misguided and counterproductive. (Supporters claim it has reduced accidents, but accidents have dropped by the same amount in countries such as the UK and Australia where the legal age is still 18.) However, I think studies and websites that tout the benefits of moderate alcohol consumption are misleading when they don’t mention what percentage of the population actually succeeds in drinking this way.

To anyone who can keep to the prescribed daily regimen, you have my respect. However, I know myself not to be one of you, so I think I’ll leave the wine presses where they are – or, if I remove them, it will be only to display them as antiques next to my two kerosene heaters that haven’t been lit since 1934.

Friday, August 24, 2012

All in All, I’d Rather Be Errol

A podcast on the Scientific American site caught my ear the other day. It was by James Flynn, after whom the Flynn Effect is named. I had read here and there about this effect before, but it was interesting to hear from the fellow himself. The Flynn Effect is the apparent and puzzling rise in IQs since IQ tests were invented a century ago. He documented the rise back in the 80s and has kept up with his research since then.

The rise hadn’t received attention prior to the 80s because of the way IQ tests are written and scored. The tests – first devised prior to World War 1 – never were intended to measure IQ changes over time. They were and are designed to measure relative intelligence at a particular moment in time. Assume for the moment that the tests remain unchanged for two years. (Actually, new questions are added and old ones dropped frequently, but a test might remain the same for a few years running.) In any given year, the average number of correct answers by test-takers is arbitrarily given a score of 100, which, by definition, is “average IQ.” The most common tests have a standard deviation of 15, meaning about 68% of the population falls in the “normal” range of 85-115, with roughly 16% above and 16% below. About 95% score between 70 and 130, with only some 2.5% above 130 and 2.5% below 70. But what if the average number of correct answers is higher among test-takers this year than last year? That higher number is called 100 this year. So, the identical answers to the identical test can produce two different IQ scores depending on when the test was taken.

This is exactly what has happened, and the effect over decades is not a small one. If you don’t “renormalize” the scores (i.e. reset the average to 100), the raw data indicate that IQs consistently have risen a full 3 points per decade in the US since the tests started to be given. Most of the world has seen a similar rise, though there is variation nation by nation. So, if you give students in 2012 the exact same intelligence test given to students in 1952 and score the test exactly the same way it was scored in 1952, today’s students have an average score of 118. Another way of saying this is that the average student in 1952 was of below normal intelligence by 2012 standards. The average person of 1910 would have been below 70 IQ by 2012 standards, and so not legally competent. Einstein by 2012 scoring was a fairly ordinary guy. (There were no IQ tests in the time of Isaac Newton, but, if we extrapolate back to then, the average person would have had no brains at all.)
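The scoring mechanics above are easy to make concrete. Here is a minimal sketch of the arithmetic, assuming the usual mean-100, standard-deviation-15 scaling and the roughly 3-points-per-decade drift the raw data show (the function name is mine, purely for illustration):

```python
from statistics import NormalDist

# Modern IQ scaling: scores are forced onto a normal distribution
# with mean 100 and standard deviation 15.
iq = NormalDist(mu=100, sigma=15)

# Share of the population within one and two standard deviations:
within_85_115 = iq.cdf(115) - iq.cdf(85)   # about 68%
within_70_130 = iq.cdf(130) - iq.cdf(70)   # about 95%

def score_on_old_norms(iq_today, years_elapsed, drift_per_decade=3.0):
    """IQ of a present-day average scorer measured against norms
    set `years_elapsed` years earlier, i.e. before renormalization."""
    return iq_today + drift_per_decade * (years_elapsed / 10.0)

print(round(within_85_115, 3))               # 0.683
print(score_on_old_norms(100, 2012 - 1952))  # 118.0
```

Six decades at 3 points per decade is exactly the 18-point gap between the 1952 and 2012 averages; run the extrapolation back far enough and you get the absurd results for 1910 and earlier.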

Clearly, something is wrong here, and Flynn himself expresses doubt about what the numbers mean. “Why,” he asks, “did teachers of 30 years’ experience not express amazement at finding their classes filling up with gifted students?” They don’t express any such thing, of course. Quite the contrary, at least in the middle and upper grades. Thanks mostly to pre-school (almost nonexistent a half century ago), very young kids actually do outperform their peers of 50 years ago in reading and arithmetic. Kids often enter kindergarten already reading; no one did in my kindergarten, and that was the norm. The head start of today’s young kids doesn’t confer any lasting advantage, however. By 5th grade it has faded away. Despite heavier homework loads and vastly more expensive schools, kids in high school perform worse than their older peers – so much so that the SATs had to be made easier to keep the nominal scores from tanking too far. If you took the SATs in the 1970s, you should add 70 points to your verbal score and 30 points to your math in order to adjust them to 2012 standards. (Note that SATs test general knowledge, not abstract reasoning.) 12th graders have a smaller active vocabulary than 12th graders of 50 years ago. They have no better understanding of algebra or geometry, and, if you take away their calculators (which, admittedly, no one does anymore), their basic math skills are worse.

Some analysts go so far as to argue that modern high school graduates are intellectually impaired. See The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future (Or, Don't Trust Anyone Under 30) by Mark Bauerlein. I think his concerns are overblown, but not all of Bauerlein’s arguments are fanciful.

There is another point. I’m old enough to remember average people who were born in the 19th century, including two of my grandparents. (The other two were born in 1900, which technically was also the 19th century, but, by convention, we call it the first year of the 20th.) None of them was slow. Anything but. In practical matters (mechanics, construction, animal husbandry, etc.), all of them were cleverer and more competent than I am; if you thought you could outsmart them, you probably were making a mistake. However, none went beyond the 8th grade (only 5% of Americans did in 1900), and I can see how they might have been baffled by abstract questions on IQ tests – e.g. this very typical example: (picture of squiggly line) is most similar to A. (picture of cube), B. (picture of square), C. (picture of straight line), or D. (picture of circle). I know all of my grandparents would have considered this a damn fool question: “They aren’t similar!” If forced to answer, they might have said circle because of the curved line. The answer, of course, is C, because it is one-dimensional (other than the thickness of the line itself) while the other images display area or volume. My grandparents understood dimensions. The charming novel Flatland: A Romance of Many Dimensions was written in 1884, after all. They would have “gotten” why C was the right answer when you pointed it out to them. But they just weren’t used to thinking this way on written exams, and still would have thought it a damn fool question. Modern students would pick the correct answer right away, because this is exactly the sort of question they encounter regularly in school and on aptitude tests. They are taught, and sometimes privately tutored besides, to answer questions like this. Abstract questions, unsurprisingly, are the ones on which modern test-takers have gotten better; they haven’t improved on the basic-knowledge side of the IQ tests.

Beyond this enhanced performance on certain types of standardized tests, is there any indication that people have gotten smarter? Not in school, apparently. Elsewhere? If so, it’s not obvious. There is no noticeable increase, for example, in “critical thinking,” the ability to evaluate hypotheses skeptically. Outlandish conspiracy theories are as popular as ever, and belief in the paranormal actually increases with education. According to a 2006 study by Bryan Farha and Gary Steward Jr. (source: LiveScience), 23% of college freshmen believe in the general gamut of the paranormal (including astrology, clairvoyance, and ghosts), while 31% of seniors and 34% of graduate students do. The numbers are far higher for individual beliefs: among college freshmen, for example, 40% believe in haunted houses, with 25% unsure.

Has the intellectual quality of political discourse risen? Flynn thinks so, pointing to some boneheaded comments in the House of Representatives a hundred years ago. I think he is overly generous to the current crop of politicos. I can link to plenty of boneheaded comments and speeches today, including a video of one sitting member of Congress expressing concern that the island of Guam might tip over. On balance I think the trend is the other way. Consider these very characteristic remarks before a crowd of farmers and local merchants by the barely schooled Abraham Lincoln in 1838. He praises the Founders of the nation and then warns about this:

“The field of glory is already harvested and the crop is already appropriated. But new reapers will arise and they too will seek a field. It is to deny what the history of the world tells us is true to suppose that men of ambitions and talents will not continue to spring up among us. And when they do they will as naturally seek the gratification of their ruling passions as others have done before them. The question, then, is can that gratification be found in supporting and maintaining the edifice that has been erected by others? Most certainly it cannot.”

No politician talks to a crowd this way today, and few would be capable of it. None would offer a theory of history or philosophize about the nature of man. Modern politicians assume (probably correctly) that such things are way over the heads of most voters, and they dumb down their messages accordingly. I also guarantee that folks in 1865 would have seen the irony in these earlier remarks, even if many in 2012 do not.

Yet, the rise in raw scores on IQ tests is real. The tests are measuring a change in something, but what? People today of all ages face abstract questions and puzzles more often than in years past. Many of these puzzles are encountered in non-school environments, such as when navigating the artificial realities of video games or when figuring out the arbitrary but (usually) self-consistent rules of some computer program. We develop a knack for solving them. Flynn uses the example of crossword puzzles. When he first became interested in them, he was bad at them. He had to learn to think in terms of unusual uses of words and to anticipate puns in the clues. Eventually, he became good at them. He thinks something analogous might be going on in IQ tests, but believes this to be only a partial answer for the rise in scores. While acknowledging that he is the expert, I nonetheless suspect it is the whole answer, or at least something very close to the whole. It would explain why these higher scores haven’t translated into better academic performance. I, too, have developed a modest skill at crossword puzzles in the past few years, but learning that “wapiti” is the word for the North American elk (among other trivia) hasn’t enhanced my performance at anything other than more crossword puzzles. It is a niche skill. So, it seems, is whatever is being measured in IQ tests.

Perhaps there is something valuable in these ever-more-finely honed test-taking skills, even if it is just a greater facility at playing Call of Duty on the latest Xbox. Let us hope so. Even if the only transference is to an aptitude for being foolish in a more clever fashion, that at least would have entertainment value, to the observers anyway.

Sam Was Ahead of His Time Academically

Friday, August 17, 2012

Fandom at Random

My guest bedroom also serves as my computer room and as a catch-all place for things such as my telescope (nothing fancy or expensive), microscope (also nothing fancy), and odd bits of memorabilia. On the walls are autographed 8 x 10 photographs of celebs I have met over the years at places such as Chiller Theater Conventions. I usually acquire only two or three new photos per year, but, since I’ve been doing this quite a while, the walls are getting pretty crowded.

A visitor at my house the other day asked to use the computer to check her email. This happens less and less as internet-linked mobile phones become the rule rather than the exception (though I presently don’t have one), but it still happens.

“Wow, you know all these people?” she asked, looking at the walls.

“No, not really. I just snare the occasional pic at conventions.”

Being decades younger than I (she wasn’t there to visit me, alas), she added, “I don’t recognize any of them.”

“I probably wouldn’t recognize anyone posted on your wall either,” I responded, almost surely correctly.

“Oh, I know that one. He was in something years ago.” She was pointing at Edward James Olmos as Admiral Adama, which he played 2004-2009.

“Battlestar Galactica,” I said.

“Yeah, I didn’t really watch that.”


I let her get on with her online activity.

Fame is not actually fleeting, though it does diminish. After all, there are sizable crowds at those conventions seeking autographs of people my guest didn’t recognize, and long-dead actors have Facebook pages with thousands of friends. Fashionable heartthrobs come and go, however, since youth (relative youth, anyway) is always a factor in that particular status. I’m a fan of many of the occupants of my computer room walls, but of all of them I had a true schoolboy crush (yes, at the appropriate age) on only one: Britt Ekland. I was kind enough not to tell her that while she was signing her photo; it’s not something one wishes to hear from someone graying around the temples.

In the last decade there has been an increase in scholarly papers and books about fandom. It is a broad and rich topic, in part because fans vary so much. Sports fans, Harry Potter fans, and band groupies seem to have little in common other than focused dedication. The degree of dedication varies a lot among fans, too, but among many it can be intense, even violent. More than a few riots have followed sports matches. There is nothing new about this. In 532 AD a riot broke out at the hippodrome in Constantinople among supporters of the Green and Blue chariot teams. (Red and White supporters evidently were a more sedate bunch.) The rioters ran amok for days, burned much of the city, and turned their uprising into a full-scale revolt against the government. The Emperor Justinian (a Blue team fan) sent in the troops. According to the historian Procopius, 35,000 people were killed, making this the deadliest sports riot ever. Fortunately, fans usually are satisfied just to cheer or boo.

For fans of individuals, the whole relationship is much more personal, even romantic. Psychologists call this sort of fandom parasocial interaction, which seems to me a misnomer since frequently there isn’t any interaction. The relationship is entirely one-sided. This sort of fandom can occur without modern media, though I’ll nonetheless refer to a movie for an example. There is a scene in City Slickers (1991) in which Curly (Jack Palance) tells Mitch (Billy Crystal) about being smitten by a woman he once espied from a distance.

Mitch: What happened?
Curly: I just turned around and rode away.
Mitch: Why?
Curly: I figured it wasn't gonna get any better than that.
Mitch: But you could have been, you know...with her.
Curly: Been with lots of women.
Mitch: Yeah, but you know, she could have been the love of your life.
Curly: She is.

It is true, though, that modern media are what make the phenomenon a mass effect. When silent film star Rudolph Valentino died in 1926 there was a rash of suicides across the nation. 100,000 people showed up at his funeral and rioted. Nothing like it had been seen before. We’ve seen plenty like it since. Now we are accustomed to millions grieving over James Dean or Elvis Presley or Michael Jackson as though they had lost personal friends. Sociologists often point out that we spend as much or more time in the “company” of favorite stars we see on the screen as we do with actual friends and family. From well-publicized crimes, we all know that some disturbed individuals are unable to distinguish fantasy from reality; a few of them become troublesome or dangerous stalkers. Most people, though, are perfectly aware of the difference between real friends and Friends; they can feel familiar with the cast of the show while knowing they really aren’t.

“Engagement in a devotee world isn't inherently harmful,” according to Jeff Rudski, a psychologist specializing in fandom, Harry Potter fandom in particular. “But for some, the object of devotion begins to substitute for other rewards in life.” Perhaps, but for most folks fandom is not only harmless but life-enhancing. If you wish to see a lot of happy playful people enjoying each other’s company, go to a Star Trek Convention or to Comic-Con. More often than not, we are richer for such enthusiasms. As for our adolescent “parasocial” attachments, long after we have taken down the posters from our bedroom walls they continue to influence our tastes and values in subtle and often unconscious ways. As a possible example, it may be pure coincidence that my first serious romantic attachment was to someone (hi Angela) who bore a distinct resemblance to Britt. Then again, maybe not.

Sunday, August 12, 2012

Galactic Grandstanders

A plurality of Americans, according to a National Geographic poll, believes alien spaceships are buzzing the earth: 36% believe, 17% don’t, and the rest are unsure. I’ll not argue with democracy. However, unlike those discreet critters in their stealthy ships, who leave no definitive evidence even when they crash, and who preferentially reserve face-to-face meetings for drunken fishermen and troubled attention-seekers who will never be believed, earthlings scatter signs of their existence hither and yon while shouting out to the heavens on deep space radio transmitters, “Here we are!”

We haven’t gone very far, of course, but where we have gone we have strewn our junk all around. No fewer than four probes have crashed on Mars, for example: one Russian, one British, and two American. The fate of a fifth, the Russian lander Mars 6, launched in 1973, is unknown; it stopped transmitting while landing, but a radio failure doesn’t necessarily mean it crashed. Eight more craft have landed successfully, including three rovers, two of which are currently operational. Far from being coy about who sent all this stuff, we stuck on the Phoenix lander a DVD (next to the little flag in the picture below) containing the names of a quarter million people along with science fiction and artwork. I’m not sure if Martian DVD players are in a compatible format, though.

Four robotic spacecraft are currently nearing the heliopause, the boundary between the solar system and interstellar space: Pioneers 10 and 11 and Voyagers 1 and 2. Voyager 1, faster than the earlier Pioneers, will get there first. Beyond the heliopause the solar wind and magnetic field no longer dominate the spatial environment. The two Pioneers stopped transmitting in 1995 and 2002. The two Voyagers remain in touch; their nuclear power sources should be good until 2020. All four of them contain information about ourselves for the edification of whomever might find them. Each Pioneer carries a plaque with etched info, while each Voyager carries a gold-plated copper phonograph record full of sounds and images of earth, plus musical selections. (The joke at the time of launch was that we would get a message back: “Send more Chuck Berry.”) It will be a long time before anybody finds them, if anyone ever does. Voyager 1 won’t pass another star (AC+79 3888 in the constellation Camelopardalis) for 40,000 years, and 1.6 light years isn’t a very close pass.

Not willing to wait 40,000 years, we send intentional signals into space along with all the ones (e.g. defense radar and UHF TV) that we emit carelessly. I don’t know what the LGM (little green men) will make of them. Consider the message pages beamed out from the Evpatoria Deep Space Antenna in Ukraine. Be honest, what can you make of those pages? And we both think like the humans who designed them. If the LGM can (or would think to) reconstruct the pages in the first place from the binary signal, and if they somehow can translate them (which makes them way smarter than I), will they not think we are a planet of crazed naked math tutors? There is an English translation for when you have given up trying on your own.

In a way, there is something charmingly child-like about all this: “Hey everyone! Look at me! Look at me!” Whether the LGM of the galaxy (if there are any) will agree is anyone’s guess.

DVD on Phoenix Lander

Voyager Recording: the etchings explain how to play it

Monday, August 6, 2012

A Curious Thing

When I was born there was life on Mars. Well, not really – unless it was (is) of a hardy microbial kind, which, though not impossible, is unlikely. However, the general consensus back then was that the shifts in the light and dark patches fuzzily visible in earth telescopes were best explained by seasonal variations in vegetation. Vegetation implied animal life to eat the plants. Where there were animals, might there not be civilizations? Percival Lowell in the late 19th and early 20th centuries famously convinced himself he saw engineered canals on Mars. His observation was proof of intelligent life; unfortunately, it was just his own. The only line on his hand-drawn Martian maps that proved not to be imaginary was the Valles Marineris canyon. Seasonal color variations are now attributed to dust storms and frozen carbon dioxide (dry ice).

Martians abounded in popular culture, and I did nothing to try to escape them. The second full-length novel I ever read, children’s literature aside, was H.G. Wells’ The War of the Worlds (1898) – the very first was Arthur Conan Doyle’s The Lost World, which was lost right here on earth. Edgar Rice Burroughs in 1912 began his John Carter series set on Mars; Disney’s effects-heavy movie version this past spring shows there still is life in that franchise, if not actually on the planet. I don’t remember when I read Burroughs’ A Princess of Mars (the first in the series), but it was sometime before Grade 7. I discovered Bradbury and Heinlein a few years later. Invaders From Mars (1953) was one of my favorite movies as a kid. As late as 1963, My Favorite Martian could be the name of a TV series – one which I watched whenever possible.

On my birthday in 1964, I watched the launch of Mariner 4 from Cape Kennedy, as Cape Canaveral was called between 1963 and 1973. The probe would make the first ever close fly-by of the planet. The craft neared Mars in July 1965. Given the lingering public perception of the plausibility of Martian life – though scientific opinion had grown skeptical by the 1960s – it is not surprising that millions of viewers were glued to their TVs on July 14 (7:18 PM EST) as the first images from Mariner 4 assembled line-by-line on our screens. To a young sci-fi fan such as me, the images were dismaying. Mariner 4 took close-up pics of only 1% of Mars’ surface, and, by luck, that swath happened to be a region atypically dense with craters. There were no forests and no remains of ruined cities. Mars looked like the moon. It’s hard to quash the spirits of young space cadets for long, however. “Terraforming” already was a familiar concept in sci-fi, as in Heinlein’s Farmer in the Sky (1950), set on the Jovian moon Ganymede. If Mars wasn’t habitable, we could make it so; few of us were yet ready to give up on the red planet. Mars remained the next frontier, with a surface area equal to that of all the terrestrial continents.

Nearly 50 years later, I still buy science fiction, though nowadays it is a minority (though not a tiny minority) of my recreational reading selections. Among recent novelists, Kim Stanley Robinson probably has the most detailed descriptions of Martian terraformation – to the point, sometimes, of overwhelming the underlying stories. Actual manned flight to Mars, however, remains “20 years away,” which is exactly the same distance it supposedly has been my entire life. At least we are holding our own.

Mars is back in the news this week, of course, thanks to the successful deposit of Curiosity on the planet’s surface. The landing sequence looked like something designed by Rube Goldberg, but it worked – then again, so did most of Rube’s improbable machines. It is the most capable robot yet sent to Mars. It joins the much smaller Opportunity rover, which remains active. (The mission of the rover Spirit terminated last year.)

So, at last there are Martians. They are our own robot children.

P.S. Perhaps oddly, only one of my own short stories is set on Mars.

Curiosity Landing Sequence (double-click for full-screen)

Purdue University Rube Goldberg Machine

Thursday, August 2, 2012

Missing Gore

Gore Vidal’s passing this week at the age of 86 has not gone unnoticed. Nearly every US news outlet of consequence along with many overseas at least mentioned it. A few did so at length. Most, however, treated the story as they would the passing of any minor celebrity. More than a few ran clips from the 1968 Democratic Convention in which he and William F. Buckley sparred angrily in what was for both of them uncharacteristically sophomoric fashion – “good television” in today’s reality TV terms, but not really fair to either. None of us should be judged by the occasional off moment. (Gore came off better than Bill, but both were out of line.)

I doubt Gore Vidal expected anything different. In an interview several years ago, Gore remarked that he was once a famous novelist. When assured he still was, Vidal argued that the adjective no longer fit the noun. He might well be famous as a TV personality, but not as a novelist, for novels no longer occupy a central place in the culture. Only a tiny minority of adults read novels; the movies and other media dominate instead. He was right, but amid that minority he stood tall.

Gore grew up well connected. His grandfather was Senator Thomas Gore (yes, related to Al Gore); his father was FDR’s aviation expert and a personal friend of Amelia Earhart; and Gore himself was friends with the Kennedys (he and Jackie Kennedy were “related through divorce,” having a stepfather in common) and socialized with many of the postwar literary lights, including Tennessee Williams and Truman Capote.

Favorite authors are like old friends. Their voices are on hand whenever we need them, and they inform our thoughts as much as anyone we know in person. In this sense Gore is a very old friend, even though we never met and perhaps wouldn’t have liked each other if we had. Our one-sided introduction came when I was 13 years old and rarely read recreationally anything more challenging than science fiction novels aimed at boys my age. For some reason I picked off a shelf at home Dark Green Bright Red, Vidal’s novel of revolution in a Central American country; my mom must have bought it. Even then, something about his style caught my mind’s ear in a way only Mark Twain had done previously. Gore had won a new reader.

A quick look at his bibliography reveals that I’ve read 21 of his 26 novels and short story collections – the missing ones are mostly his early work, though I have read his first book Williwaw as well as The City and the Pillar, the 1948 novel that earned him so much flak at the time for its homosexual themes. I’ve read 2 of his 8 stage plays, a collection of his 1950s teleplays, 13 of his 26 collections of essays, and 2 of his 5 pseudonymous novels. I’ve seen 8 of the 14 movies for which he wrote the screenplays. Not an exhaustive exposure, but a plentiful one.

Vidal’s fiction is varied, to say the least, though the writing is uniformly well-crafted. The historical novels on what he liked to call the American Empire are the finest of their kind, and anyone who is bored by textbook histories would do well to pick up these engrossing books instead: Burr, Lincoln, 1876, Empire, Hollywood, Washington, D.C., The Golden Age. His novels on classical times, Creation and Julian, are on a par with anything by Robert Graves. His off-beat novels, e.g. Myra Breckinridge (don’t judge by the awful movie), are not only fun but have something to say about culture, human nature, politics, and sex.

As an essayist, Gore was unparalleled. He wrote literary criticism and quite a lot about politics and culture. I often found myself on the other side of the political fence from him, but he invariably knew where the fence was and described it with sardonic wit. On purely social issues I almost always agreed with him, and on foreign policy matters I usually did, but we parted company elsewhere. It didn’t matter. He was instructive – perhaps the most instructive – when we disagreed.

Gore was very much a man of the Left and grew more radical as he grew older. He had a visceral distaste for private wealth, especially inherited wealth. Yet, while always antagonistic toward the traditional Right, social conservatives, and neo-cons, he was never doctrinaire. His commitment to civil liberties so outweighed other considerations that in the ’90s, when asked about party politics, he remarked, “I’m partial to the Libertarians.” In 1980 he said that the best choice was between the Citizens Party (Barry Commoner’s far left party) and the Libertarians. He favored the former, but respected the consistency of the latter; respect for philosophical opponents is something that has grown rare in recent years.

Gore often ignored politically correct niceties. For example, though a fierce proponent of sexual freedom, he disliked the word gay, and argued that there was no such thing as a homosexual or heterosexual. “I’ve always said it was just an adjective. It is not a noun.” He said the words describe particular acts; they do not indicate states of being. He argued, sometimes mischievously citing Freud, that individual tastes do indeed vary along a continuum, but that everyone is bisexual. This position exasperated Larry Kramer in a 1992 interview:

LK: But Gore you are gay. You’ve lived with a man for 40 years or something, and everyone who knows you personally knows you are gay. And I think you think of yourself as gay.
GV: I assure you I do not think of myself in these categories. It is like saying I’m a carnivore.

He meant, of course, that he preferred meat dishes but was capable of eating veggies, and that he didn’t form a sense of identity around his food preference.

Whatever one thinks of Gore’s politics, opinions, and fiction, the man by the time of his death was America’s foremost “man of letters,” a term one scarcely hears applied to anyone anymore. And while, once again, we never met, he is also an old friend. I’ll miss him.
