Wednesday, December 23, 2009
I’m not normally greatly impressed by fx. Give me a good script and cheesy fx over the reverse any day -- any day but yesterday, when I caught Avatar.
Avatar’s achievement in CGI animation (in 3D) is so over-the-top that it deserves to be seen just as a spectacle. See it in the theater no matter how big your plasma or LCD screen may be. This needs a really big screen. The script? Well, there is a certain irony to a film that relies so heavily on advanced industrial technology carrying an anti-high-tech-industry message. The whole Noble Savage thing may be a bit overdone (in the way that a Caterpillar D10 is a trifle large for a tractor). For all that, it is well-plotted, well-written, and well-paced – so much so that the 162 minutes passed quickly. In this it is unlike Cameron’s earlier film Titanic, which seemed to me interminable – I know it was a hit, but by an hour into it I was rooting heartily for the iceberg. The villains in Avatar are villainous, the heroes heroic, and their clash makes for a rousing drama. This kind of thing could give blockbusters a good name.
Now if I only could get my hands on some Un-obtainium. I'm guessing that’s the stuff that makes the mountains float in the air on the moon Pandora.
Thursday, December 10, 2009
Rebel without a Van
I was stopped by a patrol car a couple weeks ago, ostensibly for not using my turn signal. (I did signal: I had spotted the police vehicle while approaching the traffic light and made a point of signaling.) The real reason was that it was Saturday night and there are lots of bars along the road off of which I just had turned. It was obvious to the officer that I had not been drinking (I haven’t been buzzed, much less drunk, in about 25 years), so after the customary license and registration check, I was let off with a warning. That’s OK. I’d put the odds at about 25% that a random driver turning off that road at that hour on that night had good reason to fear a breathalyzer, so I don’t mind the dubious basis for the stop. Hazardous drivers really were out and about, and I suspect the officer got to meet one or more of them before the night ended.
Draconian laws against drinking and driving haven’t ended the practice of mixing the two, though they influence the rates. This is a violation, however, which most people understand puts others at serious risk. Hence, even people who might get caught themselves (what percentage of drinkers – who are more than 80% of the total population – at some time or other drive when they shouldn’t?) by and large favor tough laws and enforcement.
The situation is different with laws and regulations that simply seem bothersome – at least for us. (Hey, other people may not be able to talk on a cell phone and drive at the same time, for example, but you and I can do it safely, right?) There is a sometimes dangerous but very human tendency for people to ignore such rules that are difficult or irksome to obey. So, they leave safety doors jammed open in factories rather than fumble with locks twenty times a day. They remove batteries from smoke detectors that go off from shower steam. They mix their 2 and 4 plastics together. They drive 9 mph over the speed limit hoping police really live by the rhyme, "9 you're fine, 10 you're mine." (Don't count on this one.) What these kinds of rules have in common is that they seem arbitrary. Thus, they are not really a matter of ethics, in the way that theft and assault are ethical violations. A sign that says "one hour parking" is an arbitrary regulation that just as easily could have said “two hour parking” if some municipal bureaucrat had felt so inclined, so few people feel they have done anything inherently wrong if they leave a car parked under it for 80 minutes.
Oddly, it seems there is one class of people especially likely to violate petty and arbitrary rules of all sorts: female drivers of soccer vans. Huh? Yes, really. Dr. Wiseman in his book Quirkology refers to various studies by Professor John Trinkhaus of CUNY, who stumbled on the pattern by accident. Trinkhaus and his student aides studied express checkout counters in supermarkets in 1993 and again in 2002. For demographic information, they checked the cars in the parking lot of those customers who took more than the prescribed 10 items through the express lane. They were surprised to discover fully 80% were female van drivers. Intrigued, they counted cars that parked illegally in the supermarket fire zones; 35% were put there by female van drivers, far more than their proportion of drivers in the lot. They went on to record violators of the speed limit in school zones. All groups of drivers were bad at this – 86% of men violated it, for example – but the female van drivers were the worst: 96% violated the limit. Next, Trinkhaus’ team spent 32 hours at boxed intersections recording which vehicles failed to keep clear of the box during red lights. 40% of all violators, a very disproportionate amount, were female van drivers. Stop signs yielded similar results. The least compliant drivers were female van drivers, 99% of whom failed to stop at stop signs.
I mentioned this curiosity to my friend Ken and said I could see no reason why female van drivers should be such a particular menace, at least with regard to arbitrary rules. Unlike myself, Ken has raised a family. He said, "I understand it. Drive around with a van full of kids for a week, and you’ll understand it, too. They drive you crazy." I'll take his word for it.
Monday, November 30, 2009
The Zen of Kitchen Remodeling
A friend is considering selling a property “when the market recovers,” whenever that may be. I don’t expect to list it anytime soon. For most of the past year we have been treated to weekly news stories telling us the real estate market has “bottomed out.” True enough. So has the Titanic. It is still a long way back to the surface.
Anyway, she asked advice about installing upgrades to the house in the meantime. Many real estate brokers encourage this, of course, since we all prefer to market sparkly homes with all the bells and whistles. They frequently point to enthusiastic articles in magazines aimed at homeowners. For example, "The Cost vs. Value Report" article in a recent Remodeling Magazine issue gushes that a seller commonly can recover “86% of the cost” of sensible remodels of kitchens and baths through a higher sale price. I’m an odd broker who rarely recommends anything more than cosmetic spit and polish. The reason should be obvious: 86% is not 100%. I may not be a math wizard but I can do simple sums. If a higher sale price doesn’t cover the cost of the upgrade, there isn’t much point in it.
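To put hypothetical numbers on that simple sum (the remodel cost below is my own invention, not a figure from the magazine):

    # A rough, made-up illustration of the 86% cost-recovery arithmetic
    remodel_cost = 20_000              # assumed cost of a kitchen remodel
    recovery_rate = 0.86               # "86% of the cost" recovered at sale
    added_sale_price = remodel_cost * recovery_rate
    net_to_seller = added_sale_price - remodel_cost
    print(f"Sale price rises by about ${added_sale_price:,.0f}")   # roughly $17,200
    print(f"Net to the seller: ${net_to_seller:,.0f}")             # about -$2,800

Spend $20,000, get back roughly $17,200 at closing, and you have paid about $2,800 for the privilege of renovating someone else's future kitchen.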
Things are a bit different for one’s own home than for an investment property. At home, a remodel may still be a luxury expense, but we all treat ourselves to a few luxuries. It makes some sense to enjoy new cabinets, ovens, and tubs yourself, rather than just handing them unused to the next owner. If you eventually recover 86% of the cost, so much the better.
The folks I find fascinating, though, are the serial remodelers. You know the ones. The construction never quite stops because the house is never quite right. They seem to believe they’ll finally find happiness if they just can find and install the right granite countertops and oven ranges. It seems unlikely to me, but who knows? Perhaps they are right.
Sunday, November 15, 2009
The Half-Baked and the Dead
Gore Vidal remarked to an interviewer that he was once a famous novelist. When assured he still was, Vidal argued that the term had lost meaning. The adjective doesn’t fit the noun. He himself might well be famous simply as a celebrity, but not as a novelist, for novels no longer occupy a central place in the culture. They are not “discussed in the agora” as they once were. Books have been replaced there by movies and other media. Speak of books in public, and the conversation will be one-sided, or, at best, limited to a small splinter set. Speak of the movies and ears perk up all around while opinions fly. As a screenwriter as well as a novelist, Gore was complaining less than observing.
His eyesight was keen enough. Yet, novels retain value, even when discussed in the alleyway instead of the marketplace.
Vidal’s occasional nemesis Norman Mailer is another formerly famous novelist. He has a better excuse for the “formerly” than Gore. He died two years ago. However, even when alive, he underscored Vidal’s point by being more known than read despite repeated appearances on the New York Times best seller list. Outside of schools and colleges, where students are forced to read, readers of novels (Harry Potter and Twilight notwithstanding) are a minority of the population. Readers of (dreaded word) “serious” fiction are a minority within a minority. Accordingly, a typical “bestseller” is read only by a tiny fraction of the public. The more literary the novel, the tinier the fraction will be. Yet, pretty much everyone has heard of Norman Mailer.
Mailer’s celebrity is unsurprising. He was a tabloid writer’s dream: public feuds, outrageous comments, six marriages, a knife assault on one of the wives, and that disastrous business with Jack Abbott, author of In the Belly of the Beast, who committed murder a month after a parole which Mailer championed.
Put all that aside for a moment. Mailer’s work alone is worth a few words. I won’t even mention the (apparently apocryphal) Tallulah Bankhead story.
Students of 20th century American literature cannot afford to ignore Norman Mailer. Starting with the war novel The Naked and the Dead, his books were too significant a part of the times in which he lived. He is not one of my favorites of the period, though. His novels mix brilliance with carelessness in a way that is downright annoying. His prose is off-putting and captivating on the same page -- sometimes in the same sentence. A good example is Ancient Evenings, a historical novel set in Egypt. The research is solid, the descriptions are superb, and the battle scenes are breathtakingly presented (these are hard to write – try it). The wooden characters and hack dialogue are all the more jarring on this account. “Norman,” I recall thinking while reading the book 25 years ago, “you don’t have the right to write tripe. You can do better.” That thought (from someone who has earned fame neither as a novelist nor as a celebrity) was exceedingly ungenerous, but read the book for yourself and see if you don’t think the same thing.
Mailer’s non-fiction is excellent, but, being largely journalistic, it suffers the fate of all old news. Historians may delight, but other readers may no longer be interested except, perhaps, in the quirky stuff, such as his infatuated prose about Marilyn Monroe.
With a nod to the contemporary agora, I recommend the movie Tough Guys Don’t Dance, written and directed by Mailer. Based on his 1984 novel of the same name, it is a tale of intrigue and murder on Cape Cod. It shows everything that is right and wrong about the man and his works. The script shows flashes of brilliance and the dark humor is enormously funny, but the movie is mindlessly hung-up sexually – bizarrely homophobic – and much of the dialogue sounds like a bad soap opera. The characters are drawn with mixed success. The nouveau-riche trash Patty Lareine is on target (I’ve met her first cousin), yet the Southern WASP patrician Wardley Meeks III is a wild miss. The movie is worth seeing, yet it could have – should have – been better.
All the same, an intermittent brilliance is to be preferred to a constant dim light, so I miss Norman. I would have liked the chance to be annoyed one more time.
Friday, November 6, 2009
No Shiitake
There is something humbling about learning that the world’s largest single organism is a fungus. Located in the Blue Mountains of Oregon, the Armillaria ostoyae is a blob infiltrating 2,384 acres of soil (nearly 10 square kilometers). It is estimated to be 2,400 years old, a number derived from its growth rate, but it could be much older if it had periods of stalled growth or die-back.
Sunday, October 25, 2009
“We Have Nothing to Fear but Fear Itself”
Like all the most effective political soundbites, this one from FDR (who lifted it from Francis Bacon) is memorable, inspiring, and untrue.
The quote comes to mind because this is Halloween week. The holiday is an old one with pan-European roots, but the symbols from Ireland and Scotland are the ones that have stuck. Yet, it was in the United States – at least in modern times -- that the holiday really took off. Despite the kid-friendly aspects of the holiday as currently celebrated, the central feature, a blurring of life and death, has never been lost. We welcome vampires, ghosts, and ax murderers at our doors, and give them candy. The holiday turns the fear of death into something fun. The American way of celebrating the day is spreading, having become popular even in such a very un-Celtic place as Japan. The spread is not always welcome. This, from Reuters:
MOSCOW (Reuters) - Moscow schools have been ordered to ban students from celebrating the cult of the dead, better known as Halloween, despite the widespread popularity of the imported festival to Russia.
“This is destructive for the minds and the spiritual and moral health of pupils,” said Gavrilov, saying the ban had been recommended by psychiatrists.
It is a mockery of death rather than a cult, but I understand the Russian concerns. I think they miss the point, but I understand them.
Some graveyard humor can be unsettling to onlookers, as when Bridget Marquardt (one of Hugh Hefner's former girlfriends) on an episode of The Girls Next Door posed on her back for a photo at the spot where the Black Dahlia's body was discovered. Slasher-films and other such holiday fare may seem sadistic to some. Perhaps they are, but they are also more.
Dexter, the popular fictional serial killer invented by author Jeff Lindsay (and adapted for SHO), often refers to his Dark Passenger, the part of him that revels in sanguineous pastimes. Everyone has a Dark Passenger; it is part of being human. It is not unhealthy (occasionally less than tasteful, but not unhealthy) to face this with humor. That doesn't mean we give the Passenger free use of the chain saw. It annoys me when other people pompously quote Nietzsche, but I'm going to do it anyway: "I laugh at those who think themselves good because they have no claws." To have claws, even to celebrate them, and yet choose not to maul with them is what is admirable. To borrow from another seasonally appropriate tale, Dr. Jekyll's mistake was in trying to excise Mr. Hyde; that just set the fellow loose. The Doctor should have let his Hyde side chuckle freely at macabre stories by Poe now and then, and he could have gone on being the decent person he was.
There are real monsters in the world, who don't have so much a Dark Passenger as a Dark Driver. They cause untold damage and grief. There is nothing funny about this, and the monsters themselves rarely show much of a sense of humor, at least of the self-mocking kind. The rest of us deal with them as we must with the seriousness the task requires. At least one day a year, though, we also can celebrate not being one of them by dressing up as their ilk, just as we celebrate being alive with jokes about death.
A man dies and his wife calls up the obituary column of the local newspaper.
Caller: I want to place the most inexpensive notice possible. Just say "Bernie is dead."
Obit Operator: You can have up to six words for the same price.
Caller: OK. Say, "Bernie is dead. Toyota for sale."
Sunday, September 27, 2009
The Road from Morocco
Successful generals are rarely specialists. More often they are well-read, well-rounded polymaths. More than a few are good writers of history. Julius Caesar set the pattern with Conquest of Gaul and Civil War. American generals have been fond of the pen, too. Sherman’s memoirs are the best of this type for my money. Though he is still reviled in parts of the country for his methods in the Civil War, his writings are succinct, intelligent, and insightful. Sherman also knew that historical writing is a battlefield as bitter, if not so bloody, as Chattanooga, so he preempted critics in his introduction by reminding the reader that three people watching the same brawl in a tavern will give three separate versions of events afterward.
Recently I picked out Crusade in Europe by Dwight D. Eisenhower, which had been sitting on my shelf for quite some time. It covers his experiences from the invasion of Morocco and Algeria in 1942 until the German surrender to the Allies three years later. The tone, like that of Ike’s public personality (which, somewhat scarily, I remember), is uncomplicated, unassuming, and competent. Yet, the reader (like a careful listener of his speeches) senses somehow that this is a cover for a clever and ambitious mind. Such, by the way, was Nixon’s sense of the man, too, and Nixon surely knew a complex, crafty, ambitious person when he met one. An example is the way Eisenhower handled George Patton, who was full of bluster but actually (Ike’s term) “soft-hearted.” Patton at one meeting demanded that dozens of senior officers be fired for cowardice. Eisenhower calmly agreed if Patton would submit the list of names in writing; Patton, who hadn’t expected Eisenhower to agree, sheepishly withdrew the demand.
Published in 1948, Crusade was written with one eye on the White House, and it shows. As a demonstration of the author’s fitness for command, it is fine campaign material.
The memoir tradition is alive and well. Norman Schwarzkopf and Tommy Franks both wrote about their campaigns in Iraq. These also are worth a read, but both lack significant commentary on the aftermath, omissions which are, under the circumstances, painful.
The U.S. has elected a fair number of Presidents who first made their name in the military. Yet, Eisenhower was the last, and he left office half a century ago. There have been medal-earning vets to be sure, including JFK and Bush senior, but there has been no one first known primarily as a military leader. Perhaps this is because decisive victories have been a bit sparse since 1945. There have been many successful operations, but the larger results have been murkier or remain yet unsettled.
Wednesday, September 16, 2009
Economic Ape
In the past year, so many things went wrong with the economy at once that economic theory itself has been shaken. After all, most leading economists were caught completely off-guard by the scale of the melt-down. True, there were a few doomsayers here and there who had predicted a collapse of asset values, but nearly all of them were habitual doomsayers from way back. They were bound to be right eventually. The mainstream theorists were astonished.
Are the very foundations of modern economic theory flawed? Well, yes, in spots. One of the weak spots is the notion of Economic Man, the economist’s ideal person who always seeks out maximum economic gain. As an approximation of human behavior, it is close enough to the truth to make economic models possible that are broadly valid most of the time, but “close enough” isn’t terribly close. For the purpose of predicting the behavior of individuals rather than groups, it isn’t close at all. Many individuals forgo profits and accept losses because of non-economic values. A person may turn down a better-paying job, for example, for a more enjoyable one, or may make the extremely non-economic decision to raise children. Another deviation from Economic Man – one with obvious political consequences – derives from the human sense of fairness. What is "fair" is open to interpretation, of course.
A study in the Proceedings of the National Academy of Sciences USA reported results on a variation of what is sometimes called an Ultimatum Game. The game allowed one player to choose how to split $15 with a second player. The second person was free to reject the offer, but in that case neither party got anything – no further negotiations allowed. The economically rational act for the second person in this case is always to accept any offer, even $1, since something is always better than nothing. Instead, in this study as in others, lopsided offers typically were rejected. Offers under $3 always were rejected, despite the personal cost, in order to punish the first player’s greed.
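The payoff logic is simple enough to sketch in a few lines (a toy model only, with the $3 cutoff standing in for the behavior the study reported, not for anyone's actual decision rule):

    # Toy sketch of the Ultimatum Game described above
    POT = 15  # dollars to be divided

    def responder_accepts(offer, minimum=3):
        # A fairness-minded responder rejects lopsided offers, even at a personal cost.
        return offer >= minimum

    def play(offer, minimum=3):
        if responder_accepts(offer, minimum):
            return POT - offer, offer   # (proposer's share, responder's share)
        return 0, 0                     # greed punished: neither side gets anything

    print(play(1))   # (0, 0) -- a $1 offer is spurned
    print(play(7))   # (8, 7) -- a near-even split goes through

Economic Man would set that minimum to a penny and pocket whatever was offered; the human subjects plainly would not.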
Is a “fairness” sense just a human thing? Apparently not. Other animals sometimes are visibly annoyed by lopsided rewards. However, they don’t necessarily resort to the same sort of self-harmful retribution. Dr. Hauser, in a study published in Current Biology, describes an Ultimatum Game played by chimpanzees. Hauser devised an elaborate mechanism with trays, ropes, and treats. A pair of chimpanzees had to co-operate to work the mechanism in order to retrieve treats, but the first chimp could choose how to split the loot. If the second didn’t like the split, he could stop co-operating, in which case the first chimp wouldn’t get anything. Oddly, this never happened. The first chimp always took the biggest share of treats for himself that the mechanism allowed, and the second always co-operated rather than punish the other ape’s greed at the cost of forgoing a meager treat for himself. In other words, the apes behaved with pure economic rationality, perfectly in synch with the Economic Man model. Something about this is unsettling.
Tuesday, September 8, 2009
Why I Love Morticia Addams
I refer to the Carolyn Jones version rather than the Anjelica Huston. I could say simply that she is beautiful, feminine, unconventional, passionate, and open-minded -- she once urged tolerance of the neighbors despite their ghastly taste for petunias. She possesses that special way with carnivorous plants. What’s not to love? But there is more.
TV in the 50s and early 60s was more innovative than we commonly remember – perhaps because few of us are old enough to remember. Playhouse 90 and similar programs offered new original screenplays with no recurring characters every week. Alfred Hitchcock Presents, One Step Beyond, and the Twilight Zone still hold up if seen today. The comedy show of Ernie Kovacs was truly off-the-wall. However, the sitcom genre was another matter. It was family-friendly with a vengeance: Donna Reed, Leave It to Beaver, Make Room for Daddy, Ozzie and Harriet, and the like. There was and is nothing wrong with shows of this type, of course, except for the fact that these were the only images of domesticity presented in that format prior to 1964.
The Addams Family, based on the macabre cartoons of Charles Addams, was something different. It was marvelously subversive. The show very much reflected the 60s social revolution in a way that its obvious competition, The Munsters, did not. (Beneath the make-up and décor, the Munsters had solidly Ozzie and Harriet values.)
I loved the very first episode of The Addams Family, which aired on September 18, 1964, and at age eleven I also was smitten by the seductive lady in black. The Addams Family stood ordinary conventions on their heads, as in one early episode when Morticia finds a baseball glove in her son’s closet and holds it up at arm’s length by two fingers, as appalled as another mother might be by drug paraphernalia. The characters are not merely oddballs; they are seriously dangerous. They serve their guests henbane tea. They casually contemplate murder and suicide. Their children literally play with dynamite. The sensual interaction of Morticia and Gomez was then, and remains today, something unseen among other sitcom couples; modern shows have much randier jokes and more frequent direct references to sex, but that is very far from the same thing. The Addams household defies (along with all social norms) property maintenance regulations, zoning regulations, weapons regulations, safety regulations, and just about every other kind of busybody reg to which the rest of us long have been resigned.
The family remains likable for all that, even if it is advisable to decline the offer of tea when you visit. Here were some folks who definitely did not live by the standards of the Cleavers, and yet they were deeply appealing. They still are, and the point is still a valuable one.
I purchased all four seasons of the series on DVD. I enjoy each episode as much as when I was eleven, and my crush on the long-haired beauty with the lovely gray pallor remains unbroken.
Friday, August 28, 2009
Matrimonially Challenged
Another month lurks just the other side of this weekend. September has its share of annual events: Labor Day, the new school year, the autumnal equinox, and the start of the new auto model year, to name a few. One of the less well-known, but one that appeals to singular me, is National Unmarried and Single Americans Week, this year September 20-26 (see http://www.unmarriedamerica.org/usaweek/intro.htm), "celebrating the lives and contributions of unmarried and single Americans."
You see, we are an oppressed minority. Sort of. At 92 million adults, singles head a majority of households. So, by that method of counting, we are a majority. Well, we feel oppressed anyway. According to Bella DePaulo, author of Singled Out, there are 1,138 federal provisions in which marital status affects benefits and privileges, always in favor of marrieds – such as social security benefits to a surviving spouse. Singles, having paid just as much into the system, cannot leave them to anyone. State-level rules are on top of that, plus private benefits such as health insurance with lower rates for spouses. Then there are simple social presumptions. For example, how many “happy endings” in movies consist of the leading characters getting married? How many consist of them getting or staying single? The prickly movie Love Stinks comes to mind, but few others.
Despite our numbers, we are still waiting for a single President. I know some of the historians out there are shouting “James Buchanan!” Our 15th president (by most reckonings the worst – no mean achievement considering the competition – because he could have prevented the Civil War but didn't) never married, so haven’t we been there and done that? Well, I don’t think Jim really counts. Given his fifteen-year live-in relationship with Senator William Rufus King, which prompted Andrew Jackson nastily to refer to them as “Miss Nancy” and “Aunt Fancy,” and Aaron Brown (Postmaster General in Buchanan’s Administration) to call them “Buchanan and his wife,” his “first” seems to belong to another category. True, he did court an heiress to an exceptionally large fortune in his youth, but it was a courtship notable for his inattention to it, and the young lady died of a laudanum (alcohol and opium) overdose before anything came of it. He then attained financial security on his own in short order and never pursued another woman, whether or not those two facts are connected.
Anyway, the role of true singles (of whatever orientation) in politics no doubt will increase. As our share of the population continues to grow, so will our political clout. As mentioned, we are already a majority of households, so traditional families should be regarded as the ones pursuing an “alternate lifestyle.” The rest of us should strive to be tolerant of them.
So, in late September, spend a week giving thought to single people, or celebrate being one. Actually, we prefer to be called “matrimonially challenged.” Just kidding.
Tuesday, August 18, 2009
It's a Date
While channel surfing last night, I noticed an ad for bootycall.com. What caught my eye was not the ad content but rather its presence on a cartoon channel. This invites any number of easy wisecracks, but I’ll let that opportunity slip. Network sites on which I have pages sport similar ads. "Skip that annoying dinner-and-movie with the uncertain outcome,” they say in essence. “Here is a sure thing."
Many folks always have preferred to short circuit the whole courting business, of course. This is nothing new, even leaving out of consideration the professional services always procurable from entrepreneurs. I'm old enough to remember the weekly meet at a neighbor's house for what then was still called wife swapping. I was too young to participate, but personally knew many of the people who did. Local suburban couples showed up, and, so some veterans of the events told me, the men literally tossed keys on the coffee table. Each woman picked a key at random and went home with whomever the owner of the key happened to be. For an hour once per week, it was a busy driveway. I'm not suggesting this was typical married behavior in the 1960s. It wasn’t. Yet, it was not as rare as one might think either, especially in the age group older than the boomers but younger than the boomers' parents; a lot of these folks feared they were missing out on the decade's social revolution, which was intensely youth-oriented, and they rushed to pluck what fruits of it they could. Bob & Carol & Ted & Alice was not just some scriptwriter's fantasy.
Nowadays, clubs and network groups offering quasi-anonymous sex tend to be geared more toward singles, presumably because there are so many more of them. This, rather than concern over sexism, likely explains the word change to “swinging.” The participants typically have no spouse to swap.
It is hard to come by reliable numbers, but anecdotally (and credibly), online sites offering such arrangements have far fewer customers than dating sites for people seeking something less edgy – something in which last names might be mentioned. Most people (eventually anyway) apparently are still interested in stable, more mainstream relationships. All the same, the members of "adult singles" sites are still pretty numerous.
I certainly have no moral judgments to make. Just yesterday a good friend chided me on my own dating history -- I initially wrote habits instead of history, but the time is past when I made a habit of dating. I answered her, "Men are like swimming pools. Even the deepest ones are shallow on one end."
Nevertheless, key-exchanges, or their modern counterparts, are much too catch-as-catch-can for my taste. On my rare forays, I prefer, as I always did, to chase someone utterly inappropriate for me after due and careful consideration.
Friday, August 7, 2009
Dumpster Diving
While slipping a DVD of "I Married a Monster from Outer Space" (1958) into my player the other night, the thought struck me (along with the inevitable one, "Who doesn't?") that this movie, considered throwaway trash in its day, is now "classic 1950s sci-fi."
The line between high and low culture always has been fuzzier than most self-styled connoisseurs admit. Regardless of how they style themselves though, few people would deny there is some distinction. It is not just a question of money: baseball cards can be more valuable than paintings. It is not just a question of difficulty: catching a greased pig is as hard as scoring a goal in polo. It has nothing to do with prudery: there is plenty of nudity in an average art gallery but none at a roller derby. It is something more ethereal. It is the difference between a wine tasting and a keg party, even though both are just get-togethers of folks who like to drink.
As mentioned, the line never has been sharp, but there is little doubt that in the past few decades it not only has further blurred but has lowered. This is no bad thing. It has allowed, for example, talented director John Waters to shift his residence from one to the other without changing his style, though his budgets have gone up. Waters often comments that American culture is trash culture. He doesn't mean it as an insult. Arguably we are witnessing a return to classical tradition: Aristophanes certainly had no trouble being both ribald and erudite.
To be sure, crack dens never will be afternoon teas and burping contests never will be reviewed on the same pages as the latest revival of "Richard III." Nor should they be equated. Nonetheless, perhaps we should less often (not always, but less often) employ the class-loaded words “high and low” and simply judge with the terms “good and bad” instead. After all, there are good beers and lousy wines, as well as the reverse.
Tuesday, July 28, 2009
Whose Prerogative?
Musical taste is immensely affected -- not quite, but almost, determined -- by one's generation. I'm referring to popular music here, not to Bach and Brahms. So, the pop sounds which dominate so much of the airwaves today seem repetitive and uninteresting to me, while the R&B-based music which occupies so much of my shelf (Burdon, King, Joplin, et al.) is a bore to many younger listeners. At least the pop genre isn't downright aggravating to me, which probably means it isn't doing its intended job. (There are other types of contemporary sound which succeed, but no one has tried to inflict much of it on me lately.) I think there is something else at work in this generational divide, though, that wasn't there in previous ones.
Stagecraft always has been part of show biz, but until recently it wasn't dominant in popular music. Sinatra had few stage accessories other than a microphone and a spotlight. ZZ Top's distinctive appearance makes little difference at a show other than to help us identify them as the real thing. It's the sound that matters. For most current performers, however, the staging matters a lot, sometimes far more than the sound. Fans can be unforgiving when a singer or group stumbles in a live performance, as when Britney was so famously unrehearsed and out of shape on one occasion a couple of years ago. This puts harsh demands on current acts to be well choreographed and sexy at all times. Music alone doesn't sell albums. Consequently, the odds of modern pop stars having careers to match that of, say, Peggy Lee, who in her 70s sang "Fever" to full audiences from a wheelchair, are, to put it gently, low.
Sunday, July 19, 2009
The Call of the Riled
The "do not call list" has thinned out, though not entirely eliminated, the sales calls to my home, but several a day still arrive at my business phone. I am polite to callers from respectable companies who offer their wares in a respectable way -- for example, the caller from Pitney Bowes who just now tried to sign me up for a low volume postage meter. It was a No Sale, as such calls to me always are. On unshakable principle born of early hard experiences, I never buy anything offered to me by an unsolicited caller. I'll repeat that. I never buy anything offered to me by an unsolicited caller, no matter how fabulous the value: no business equipment, no lines of credit, no stocks and bonds, no personalized key chains, no anything. I don't accept anything offered for "free" either. If I need something, I know about it without anyone calling me up to tell me, and I determine where and how to get it. Nevertheless, I say my "no" politely to these folks, once anyway. Regrettably, my polite “no” is seldom the last word spoken. I would like to know what training manual for sales reps says to ignore the "no" and to go on rudely pitching; most sales callers have read it. That chapter of the manual needs to be removed. It doesn't work. It just makes me hang up on the caller.
In another category entirely are the callers from scumbag companies who call up and say something like, "I just need the model number of your copier" or "I'm just updating your listing information; you are still at such-and-such address, correct?" as though the call were from your regular office supplier or from some publication with which you have a "listing." If s/he gets an inexperienced office worker on the line who co-operates, you will get an unwanted product or listing in some obscure ad book -- and, of course, an outrageous bill. The "no" these people get from me is not polite even once.
Friday, July 10, 2009
The Marquis and We
Most people are familiar with the Milgram Experiment of 1961, in which subjects were told by researchers to deliver painful electric shocks to "students" (actually fellow researchers) as part of a supposed experiment on learning. All but a handful of subjects zapped the "students" precisely as they were told to do. The experiment was re-created and aired a couple of times in the UK in the past few years, with results virtually identical to the original. Somehow, the British researchers found subjects who hadn't heard of the Milgram Experiment for their re-creations.
Somewhat less famous is a 1971 experiment by Philip Zimbardo, though a few references were made to it in the popular press at the time of the Abu Ghraib to-do. One reason it is less famous is that it produced such disturbing behavior from the subjects that it was aborted midway.
In the Zimbardo experiment, university student subjects were randomly divided into "guards" and "prisoners." The former were given real keys and real authority over the latter. Normal respect and civility broke down between the two groups almost immediately as the subjects adopted the mindsets of their assigned roles. Guards behaved sadistically and the prisoners became emotional wrecks. Zimbardo felt he couldn't allow the experiment to continue and terminated it in only six days. None of the "guard" students had any criminal records or any known predilection for abusive behavior. None of the “prisoner” students had any known serious psychological issues prior to the experiment. Zimbardo concluded that about a third of the guards were genuinely sadistic – they seized an unwonted opportunity to taunt and persecute others – while the others were corrupted by peer pressure.
This is why we need guards on guards. Authority should be limited whenever possible and placed under supervision when it isn’t possible. (The roots of the two words "supervision" and "oversight" literally mean the same thing, but I generally prefer the former since the latter also means a kind of error. However, I love the term "Congressional Oversight Committee.")
All of us have a capacity for cruelty. It is part of being human. We all understand the Marquis de Sade, which is why his books are unsettling. Still, there is a distinction between people like the Marquis and the rest of us. The Marquis deliberately sought out opportunities to be brutal. He needed no peer pressure or encouragement. His pleasure in the pain of others was central to who he was, not some extra capacity he normally stored away in a closet. Let out of prison in the wake of the French Revolution, he was returned there by the Revolutionaries who soon realized their mistake in having set him free.
The descent to sadism is easy. That doesn't let anyone off the hook, of course. Six-year-olds quickly learn that "But everyone was doing it" doesn't work as an excuse; it doesn't work for adults either. Yet it is worth bearing in mind how easily otherwise upstanding folks can fall into what is normally called evil behavior when it is encouraged. The good news is that decency can be encouraged successfully by peer pressure too – so long as those doing the encouraging don't get sadistically abusive about it.
Tuesday, June 30, 2009
Family Feud
The authors of science zines must spend much time reading one another, because they often carry similar stories even when nothing special in the news has prompted it. If one zine carries an article on, say, border clashes between tribes of chimps in Angola, simply because the subject interests one of the writers, other zines are sure to follow in the next few days with similar articles.
This must explain a recent spate of articles in science zines about how closely all humans are related. None of the zines cite any significant new paper in any major journal. Nevertheless, the point is an interesting one, though it is nothing new. It long has been the consensus that it is mathematically necessary, due to the doubling of direct ancestors with each generation, for every person now living to be descended from every person who was alive no further back than 7,000 years ago and who left a still-intact line of descendants. This is so even if very conservative assumptions about human movements are applied. The number is closer to 5,000 years if more liberal and more likely assumptions about migration are used. Even a tiny rate of infiltration by travelers over the steppes, the deserts, and the seas ensures this universal relativity – and migration was often anything but tiny. Also, all living people, no matter how remotely connected, share at least one direct common ancestor who lived no more than 3,000 years ago (more likely 2,000).
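For anyone who wants to see why the doubling forces this, here is a quick back-of-the-envelope sketch in Python. The 25-year generation length and 300-million world population are my own rough assumptions, not figures from the zines:

    # Rough sketch of the ancestor-doubling arithmetic (assumed figures, not from the articles)
    YEARS_PER_GENERATION = 25          # assumed average length of a generation
    WORLD_POPULATION = 300_000_000     # rough ballpark for much of the historical past

    generations = 0
    ancestor_slots = 1
    while ancestor_slots < WORLD_POPULATION:
        generations += 1
        ancestor_slots = 2 ** generations   # each step back doubles the slots in your family tree

    print(f"{generations} generations back (about {generations * YEARS_PER_GENERATION} years),")
    print(f"your tree has {ancestor_slots:,} slots -- more than the people then alive,")
    print("so the lines must overlap, and everyone's trees eventually share the same ancestors.")

Of course, the 700-odd years the sketch spits out is not itself the answer; real populations were walled off by geography, which is exactly why the consensus estimates run to several thousand years rather than a few hundred.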
There is something neat about this. A person may not think of himself as hailing from the banks of the Chang Jiang River or the grasslands of West Africa if his great-great-grandparents sailed to the US from Ireland in 1849, but it seems that he does. His ancestors hauled stones to Giza too. They quite likely besieged (and defended) Troy. It is worth an occasional thought, while claiming this rather than that heritage, that those other folks are cousins too. Cousins, of course, don't always get along, but it is still worth a thought.
Friday, June 19, 2009
Karma Curmudgeon
85 years ago, in separate incidents, Beulah Annan and Belva Gaertner slew their boyfriends. Allegedly. Both were acquitted. Beulah didn’t actually deny shooting hers; she simply said she acted in self-defense and not at all because of his intention to dump her. After the shooting she played a foxtrot record while waiting for the fellow to die, which he did 4 hours later. Then she called the police. Belva never admitted doing anything. Sure she was in the front seat of her boyfriend’s car where he was killed by a gun found on the seat next to him. Other people had seen her there. Sure she owned guns. Other people had seen her with those too; a girl had to be careful of robbers, she said. Sure she was still covered in her boyfriend’s blood when the police arrived at her apartment. But she had been drinking, you see, and for the life of her couldn’t remember what happened in the car. She sincerely doubted she had anything to do with her boyfriend’s death though.
These became celebrity cases. The juries bought the defendants’ stories and both walked free. Beulah’s husband stood by her through the trial, but after it was over she dumped him and married a boxer.
If all this sounds familiar, it should. Maurine Dallas Watkins wrote a play based on the events, called Chicago, which ran on Broadway. It was made into a movie in 1927. The movie was remade in 1942 as Roxie Hart with Ginger Rogers in the title role. It returned to the stage in 1975, again as Chicago, this time as a musical. The best-known version is probably the 2002 movie adaptation starring Renee Zellweger and Catherine Zeta-Jones.
I’ve never seen the straight original play, but I’m fond of the 1927 version (the scene where Roxie is being coached by her lawyer is here: http://www.youtube.com/watch?v=9-zYAd5FdBE ), the 1942 version, and the Broadway musical. My fondness for those reinforced a resistance I already had to seeing the 2002 movie. It long has been my opinion that stage musicals rarely translate well to the screen; they never turn out better on screen, despite all the superior stagecraft and fx possible, and sometimes achieved, with film – it’s just not the right medium. On a recent sleepless night, though, I finally watched it. It actually isn’t bad. I much prefer Broadway, but it isn’t bad.
I’m not sure what about this kind of story catches our attention so much in real life and in the movies. It may be wonderment that karma really doesn’t balance things out. What goes around doesn’t come around – unless we decide to make it do so. Even then, we often get it wrong.
Friday, June 12, 2009
Clockwatching
Author Ray Bradbury, who will be 89 this August, once remarked that when he was a boy, a visit to the relatives meant a visit to the graveyard. It was a pre-birth-control era of large families, and it was unlikely that all of one's siblings would make it to adulthood. Modern medicine in 1920 was just beginning to make serious headway against common diseases and infections, and something curable a quarter-century later with simple penicillin could very well be fatal. He said the greatest changes in social attitudes in his lifetime have come from the increasing insulation of modern life from everyday connection to death. Violence in video games doesn’t count, since we know that is as much a fantasy as the aliens and zombies shooting back at our avatars in the games; if anything, it adds to the sense of detachment. We no longer expect to lose friends and relatives in real life before they are old. It happens, of course, for any number of reasons, but we no longer expect it. We often forget about our own mortality altogether, until blindsided by some event that forces our awareness.
This surely has much to do with the abundance of Peter Pans and Wendys running about with graying hair. If life is never-ending, there is not much need to look at the clock to see what time it is. Perhaps paying attention to time is a definition of maturity. I know I didn't begin to grow up until I started to lose those close to me, and I haven't finished the job even at this late stage. Fortunately, having avoided taking on such adult responsibilities as were avoidable (fatherhood being the big one), I've been able to dodge most of the dire consequences of a Never-never-land existence so far, though no doubt I've missed some benefits too.
This is not entirely a bad thing. Perhaps the old saw "Live each day as if it were your last" should be modified to "Live each day as if it were your first." After all, it probably won't be your last and you'll be stuck paying for the party. Nevertheless, perhaps we also should remember, at least occasionally, that there really is a clock. None of us knows to what time the alarm is set, but we often can make an educated guess.
In another week or so on the summer solstice, nature's clock, I’ll sit out back and toast time – if I remember to look at the calendar. How about you?
Wednesday, June 3, 2009
Something Wilde
I haven't learned very much in my years on this planet, but I have learned (the hard way, of course) enough to know there is a land mine inside this excerpt from an Economist article:
"Americans expect a lot from marriage. Whereas most Italians say the main purpose of marriage is to have children, 70% of Americans think it is something else. They want their spouse to make them happy. Some go further and assume that if they are not happy, it must be because they picked the wrong person."
Another person cannot make you happy. Let me repeat that. Another person cannot make you happy. It is trite to say (annoyingly, most truths are), but you have to find happiness inside yourself.
On the other hand, another person can make you miserable.
In my observation, there are two basic sets of people: 1) those who are naturally happy unless especially bad things are happening to them, and 2) those who are naturally unhappy unless especially good things are happening to them. If you're one of the former, you don't need to look for someone else to cheer you up. You’ll be just fine whittling on the porch by yourself. If you're one of the latter, no one else ever will supply you with enough good times to keep you smiling. Attempts to extract enough out of your companion merely will add to your own disappointments while making him or her miserable.
Type 1s almost always marry type 2s. Type 2s almost always marry type 1s.
Said Oscar Wilde, "I always give away good advice. It never is of the slightest use to myself." So, here goes. Just try to find someone who doesn't make you miserable. At least he or she won't get in the way.
"Americans expect a lot from marriage. Whereas most Italians say the main purpose of marriage is to have children, 70% of Americans think it is something else. They want their spouse to make them happy. Some go further and assume that if they are not happy, it must be because they picked the wrong person."
Another person cannot make you happy. Let me repeat that. Another person cannot make you happy. It is trite to say (annoyingly, most truths are), but you have to find happiness inside yourself.
On the other hand, another person can make you miserable.
In my observation, there are two basic sets of people: 1) those who are naturally happy unless especially bad things are happening to them, and 2) those who are naturally unhappy unless especially good things are happening to them. If you're one of the former, you don't need to look for someone else to cheer you up. You’ll be just fine whittling on the porch by yourself. If you're one of the latter, no one else ever will supply you with enough good times to keep you smiling. Attempts to extract enough out of your companion merely will add to you own disappointments while making him or her miserable.
Type 1s almost always marry type 2s. Type 2s almost always marry type 1s.
Said Oscar Wilde, "I always give away good advice. It never is of the slightest use to myself." So, here goes. Just try to find someone who doesn't make you miserable. At least he or she won't get in the way.
Saturday, May 16, 2009
Pulp Friction
I have four paper-and-ink books available for sale and have one more online for free, but Stephen King has no cause to look worriedly over his shoulder. I am not gaining on him.
That is OK (sort of). I’ll never refuse a royalty, but my fiction would exist if there were no readers at all. I write to scratch an itch, as nearly all writers do. Some writers have more remunerative itches than others, of course. Many successful authors write with grace, depth, and quality (even many potboiler authors: Koontz and Ketchum, for example, write not just commercially but well), but others are not so handicapped by talent.
A friend of mine, who regularly and cheerily reminds me of my absence from the bestseller list, recently (for no particular reason that I could see) pondered aloud what I was "doing wrong." Using his fingers, he counted off well-known titles by authors of dubious talent, and then asked me what I thought were the elements of a commercially successful novel. I countered that I obviously didn't know, and changed the subject. Yet, the question stirred something in my memory. It seemed to me that a hypothesis about this had once been proposed and put to the test. I did a little research and soon stumbled upon the experiment that I recalled from my schooldays.
One day in 1966, Newsday columnist Mike McGrady picked up The New York Times book review section and looked at the bestseller list. He was irritated by what he saw, and concluded "inept writing held the key to success." He was convinced he could write as badly as any author on the list, and that many of his colleagues could, too. Inspired by the thought, he playfully wrote a memo and sent it to 24 members of the Newsday staff.
"You are hereby invited to become the co-author of a best selling novel," he wrote. "There will be an unremitting emphasis on sex and true excellence in writing will be quickly blue-penciled into oblivion."
The staff responded enthusiastically. They devised a simple plot: a woman discovers her husband is cheating on her, so she takes revenge by indulging in one lurid sexual escapade after another. That is the whole plot. Her lovers are sophisticates and ruffians from all walks of life. The staff divvied up the writing in appropriate fashion: Newsday's crime writer wrote the chapter about her fling with a gangster, the sports writer the one about an affair with a boxer, and so on. They invented an "author" and named her Penelope Ashe. McGrady's sister-in-law agreed to play Penelope on the talk show circuit. Released in 1969, the novel was a stunning success. Naked Came the Stranger soared to No. 4 on The New York Times bestseller list, elbowing aside Jacqueline Susann's The Love Machine. Hollywood came knocking and bought the movie rights for a large sum (yes, the movie was made, and it, too, was successful by the standards of low-budget X-rated films). The revelation of the hoax by McGrady did nothing to diminish sales.
I don't think tastes have elevated in the 40 years since, so McGrady's formula probably still has legs. So, all my scrawling friends out there, perhaps together we too can tap out something truly awful.
One of my books, Trash & Other Litter, by the way, actually has a few elements of the formula. Sales nonetheless are modest. I prefer to suppose this is because Mike would have taken his blue pencil to it.
Friday, May 1, 2009
The Quiet Riot
As a new month of May quietly arrives, my thoughts turn to an earlier, more riotous May. As riots go, it was a pleasant one.
In the spring of 1971, ARVN troops launched an ill-fated offensive against the Ho Chi Minh Trail in Laos. The sudden re-escalation of the war put new life into the anti-war movement in the US. 200,000 protesters jammed into Washington, DC, for a May Day march. The initial march was peaceful, but leaflets spread all over the city announced a plan for May 3 to shut down the government by occupying key bridges and intersections, thereby preventing government workers from reaching their offices. The Metro was still under construction, so commuters relied entirely on cars and buses.
This was my freshman year at GWU and I lived in Mitchell Hall, an 8-story dorm at 514 19th St NW in DC, three blocks from the White House and a short walk from the Mall. The rooms and hallways of the dorm were crammed with out-of-town students who were there for the demonstrations. I'm afraid my primary interest was unpolitical: it was whether any of the visitors were pretty.
On the night of May 2 my friend Don and I walked to the Washington Monument grounds as we had done the day before. Loud rock music was coming from a stage on the far side of the grounds. I saw only one police officer. He stood in Constitution Avenue trying to direct traffic. A young blonde woman in jeans and buckskin, and high on something more than the thick haze of marijuana in the air, stood next to him and "assisted" with grand sweeping gestures in the direction of walls and lampposts. Finally he became exasperated and gently shooed her away.
"You're a good cop!" she shouted far too loudly into his face.
"Yeah, I know," he said, shaking his head as she staggered toward the Monument.
Trying to navigate across the crowded Monument lawn, I stepped over and around human beings, who sat and lay about in various states of consciousness. The spaces between them were filled largely by blankets, coolers, and knapsacks. It was a surreal scene, so Don and I mingled and lingered. Eventually, we moseyed back to our dorm.
Our timing was fortuitous. Just before dawn on May 3, thousands of police and National Guardsmen surrounded the grounds and the Mall. Most of the protesters were fleet-footed enough to evade immediate arrest. The assault on the roads and bridges went forward. It was no small event. Over the next three days 12,000 demonstrators were arrested. The practice field by RFK Stadium was used to hold them.
I opted simply to observe, and Mitchell Hall proved a good vantage point to watch some of the action. I kept to the rooftop and upper floors, not only for the better view, but to get above the street-level tear gas. The tang of the stuff cleared my sinuses even on the roof. Below, protesters drifted from intersection to intersection, while convoys of police cars and squads of CDU police on foot chased them away. One rag-tag troupe surrounded a Metro bus in the street below; they opened the engine cover and tried to disable the bus, but police cars arrived before they could finish. They scattered. This was a typical scene all over town. The protesters definitely interfered with traffic, but they didn't stop it. Government commuters got to work, though they could not have enjoyed driving through tear gas. Some street action sputtered on for a couple days, but by May 6 it was all over. A fellow student passed by me in the dorm hallway with books under his left arm. "Revolution's over," he said to me. So it was.
What struck me most about the whole affair was the remarkable lack of rancor in the tone of the event. Police and protesters seemed, if anything, to be having fun. I heard more than a little laughter from both sides. Yes, protesters were violent and police did get rough, but there was no overall sense that the violence would turn deadly. None of the dorm residents who had gotten arrested had so much as a small bruise to show off for it.
Many civil disturbances in this era and since were both mean and deadly. Why was this one such a soft riot? In large part it was because the police were professional and restrained. Also, it was because the protesters were overwhelmingly middle-class college kids. They came from supportive families where they had been raised on Disney and Dr. Spock. They weren't angry enough to be truly dangerous.
So, my first large-scale riot was pretty tame, by the standards of such things. I'd just as soon avoid being present at a second, since odds are it would be different.
Saturday, April 25, 2009
String Theory
I heard a tune the other day which sounded familiar. It was recorded by the Andrews Sisters in 1940, but I couldn't place where I had heard it before. (Why I was listening to the Andrews Sisters probably requires another explanation, but we’ll leave that for another time.)
I've got no strings
To hold me down
To make me fret, or make me frown
I had strings
But now I'm free
There are no strings on me
A Google search quickly provided the answer, which seems obvious in retrospect. It is from the Disney cartoon Pinocchio (1940). I hadn’t interpreted the lyrics literally when I heard them, and perhaps the lyricist Ned Washington didn’t when he wrote them, which means he may have had some explaining to do at home about them (assuming he had strings).
I’ve taken Disney to task in the past for suckering the young (and not just the young) with false expectations of castles and fairy tale romances and happily-ever-afters. When instead they encounter mortgage payments, working-on-our-relationships, and daily stress, many naturally feel victims of a bait-and-switch. Yet, there is nothing really wrong with escapist fairy tales provided we don’t dupe ourselves into thinking they are anything else. Besides, reviewing these and other Disney lyrics, I think I may have overlooked a subtle subversive streak in Walt and his minions after all.
Sunday, April 19, 2009
A Pin in the Neck
For as long as I remember, new men's shirts have come out of the package booby-trapped. There were pins in the sleeves, pins in the collars, pins in the pockets and pins in the cuffs. No matter how many you plucked out, there was bound to be one you missed. When you put the shirt on the first time, "Gotcha!"
A couple of shirts arrived by UPS from Sheplers the other day. I was pleased to notice the fasteners were plastic clips. "Whoopie! No pins!" I shouted (I take my pleasures where I can get them these days). Last night I plucked off the clips and put on one of the shirts. "Youch!" Just for old times' sake, apparently, Sheplers had left a single pin in place at the top buttonhole by the throat. I had missed it.
I should have known the past doesn't get left behind as easily as all that. Just when you put it out of your mind, some remnant of it jabs you in the throat.
Sunday, April 12, 2009
Sex, Drugs, and Rocky Road
As reported in Scientific American, a National Center for Health Statistics survey of more than 6,000 people (a largish sample for such purposes) found that ninety-six percent of U.S. residents have engaged in sex by the age of 20. The report notes various patterns based on ethnicity, education, income, and so on.
I'm not quite sure why this was considered a health issue per se, since sex is dangerous, so the old line goes, only if you do it right. With reasonable precautions one usually can escape injury, but the study didn't focus on precautions.
The study moved on to substance abuse.
"More than 19 percent of those aged 20 to 29 said they had tried cocaine, crack or another street drug, excluding marijuana. This rose to 27 percent for people aged 30 to 39 and nearly 26 percent for those in their 40s."
These figures probably are on the light side as people notoriously are reluctant to answer truthfully about misbehavior. It is hard to say how much they lie, but it is likely to be at least as much as they lie about tobacco and alcohol. We have some indication of this from a review of the Substance Abuse and Mental Health Services Administration:
"According to a White House briefing paper analyzing SAMHSA's figures regarding Americans' alcohol and tobacco use, respondents have historically underreported their usage of these two legal substances by as much as 30 to 50 percent. (Revenues from alcohol and tobacco taxes allow researchers to cross check respondents admitted usage patterns with actual annual consumption rates…)" – Paul Armentano, Federal Drug Use Surveys and Fuzzy Math
Nevertheless, even taking the figures at face value, it means about a quarter of adults commit rather serious drug offenses, even when excluding from consideration marijuana, the most commonly used illegal substance. By comparison, according to the 2003 National Health Interview Survey (NHIS), 21.6 percent of U.S. adults smoke cigarettes.
In this case I agree with the NCHS (it seems like there is some duplication of alphabet soup agencies, doesn't it?). Drug abuse is a health issue. So are the laws against it.
Drug warriors by and large are well-meaning people who honestly think they are doing good. It even is likely their efforts save a few people from themselves by scaring them with the threat of punishment. At the same time, the panache of illegality actually tempts others, just as illegal booze did in the old speakeasy days. The side effects have been massive. The drug warriors inadvertently have enriched gangs and dealers, undermined civil liberties, turned neighborhoods into war zones, and made a huge percentage (by some counts a majority) of our citizens criminals. The US has the highest prison population – both per capita and in absolute terms – of any nation on earth, and two-thirds of the inmates are there for drug-related crimes.
Though drug prohibition dates back to World War One (and gathered steam in the 30s), Richard Nixon declared a full scale War on Drugs 38 years ago in 1971. It is obvious drugs won. It is time for another approach. Legalization combined with an offer of treatment may not reduce the number of abusers. Reducing drug abuse simply may not be possible – you can toss lifelines to people, but ultimately you can't make them grab on – but legalization would be a kinder alternative for users and for the rest of us, and it would be cheaper too. Nor is the approach entirely untried. The Swiss and Portuguese have decriminalized heroin (Swiss hospitals actually supply it to registered addicts), and both countries have reduced street crime and new AIDS cases without any noticeable uptick in use. When the Bourbons regained the French throne after Napoleon, it was said “they have learned nothing and forgotten nothing.” The results were unfortunate. When Kentucky bourbon was restored to American taverns in 1933 after Prohibition, we failed to learn or forget. We immediately reinforced prohibition on drugs other than alcohol with results just as unfortunate.
My personal addictions are more in the nature of various flavors of ice cream, which, fortunately, are not yet illegal and need not be bought on the street, though the NIH has much to say about substances such as these too. One day at a time.
Sunday, April 5, 2009
My Favorite Marxist
Like so many folks these days, I’ve been single for by far the majority of my adult life – meaning neither married (though I was married once for a little over 3 years) nor in a “committed relationship.” Romantic relations were not absent, but were, for the most part, fleeting. So, perhaps I have no claim to be an expert on why some relationships last. On the other hand, I do have quite a lot of experience on why some don’t.
My longest-term romantic relationship was back in the 80s. Anyone who knows both of us might spot a character much like her in my short story collection Scum and Other Tales, but I’ll stop short of identifying which one. I think the reason this one lasted longer than most was the intellectual challenge she offered – yes, really. The other stuff was good too, but the good-natured adversarial discussions were fun, and she gave me a run for my money on almost any subject. We traded books regularly and always had different opinions about those too. She was a Marxist former SDS activist and I was a libertarian, so we always had something political to talk about – perhaps the fact each of us was radical, albeit in near-opposite ways, was an odd sort of commonality. She is the only one of my former partners, to my knowledge, to have left me for a woman. Still, I have fond memories.
I don’t pretend to be less shallow than the next guy. I’m as taken in by superficial prettiness as anyone, which at my stage of life is likely to be considered creepy, so I’m careful about expressing the reaction. Any long-term pairing, though, requires three things from both parties: loads of patience, fundamental good will, and some sort of mental connection. Lack of any of those by either partner is a deal-killer sooner or later, more likely sooner.
What of passion? Well, in the idiom of my generation, passion is a gas. A person is lucky to experience it once – and I consider myself very lucky to have done so more than once. Unfortunately, it is a volatile and explosive gas that sometimes stinks (methane?). OK enough of the metaphors. Well, maybe one more. Passion definitely should be on your life resume somewhere. If it’s what you feel for a suitable life partner, it is even better (and rare). However, if passion is your only reason for entering a committed relationship, my advice is to pass gas.
Sunday, March 29, 2009
Ancient Deadbeats and the Rise of Man
Snail shells drilled with holes and coated in ochre have turned up in South Africa, Israel, Algeria, and, most recently, Morocco. They all date to about 82,000 years ago, give or take several thousand years. Despite, or perhaps because of, short brutish lifespans among early peoples and the rapid changeover of generations, physical culture changed slowly back then. A few millennia scarcely make a difference.
The shells, all of similar type, obviously were strung into beads. They are the earliest known art and, so the anthropologists who found the latest batch argue, the earliest known currency. It seems the entire prehistoric world was on the snail standard, at least among modern humans. This was a curious time when modern humans co-existed with Neanderthals in Europe and Homo erectus in parts of Asia, neither of whom left evidence of being in the least bit artistic or money-grubbing. They made a few practical tools, but that is about it.
The appearance of modern consciousness is usually considered to be evidenced by the first art. Apparently it also is evidenced by the first money, which was one and the same thing. Art and monetary value are both abstractions which are beyond minds simpler than the ones belonging to these cave dwelling snail-beaders.
One wonders if some hoarded beads while others spent them profligately. Did they borrow them and charge each other interest? Did clan leaders tax them? How different were these people from us? How different are we from them?
Here is an alternate hypothesis for why modern people spread out over the globe, replacing other types of human. The usual one is that they slowly spread their hunting ranges and, due to superior skills and higher reproduction, they simply displaced the others over time. Perhaps their spread into new lands had nothing to do with hunting. Maybe they were fleeing creditors.
Sunday, March 22, 2009
Aping France
I first saw Pontecorvo’s documentary-style movie The Battle of Algiers shortly after its US release in 1967. I watched it again last night, and it remains crushingly relevant.
The Algerian uprising against France, which began a few years after World War 2, taught a couple of unfortunate lessons: 1) terror can be effective, and 2) military ruthlessness can be effective. The film covers the turbulent year 1957, in which rebel FLN activity swelled. Insurgents bombed Europeans (roughly a fifth of Algeria’s population at the time) in soda shops, airports, clubs, race tracks, and other ordinary places; European vigilantes carried out bloody and carelessly targeted reprisals against Arabs and Berbers. The French military, smarting from its recent reverses in Indochina, intervened with force and skill. French forces killed or captured key FLN organizers and quelled the revolt with surprising speed, but they achieved this success only by suspending due process, scrapping civil liberties, employing wholesale arrests, and using aggressive interrogation methods (read torture). Though the tactics succeeded, the French literally were de-moralized in the process. In the movie, Colonel Mathieu shrugs off press criticism; he says that these are the methods required to win. If the French want to keep Algeria, he says, “you must accept the consequences.” The victory proved fleeting. The FLN resumed the insurgency in a couple of years and the French didn’t have the heart to fight it out again. France quit Algeria in 1962.
France and the US are much more alike than the citizens of either like to admit. Both still take the ideals of their respective 18th century Revolutions seriously (at least in words); both have a sense of exceptionalism in the world; both mix cultural defensiveness with multicultural reality; both try to export their values which they consider universal; both confuse national interests with international ones; both are regarded as hopelessly arrogant by outsiders (and by each other); and each is reluctant to learn from the mistakes or successes of the other.
Despite the French experience in Indochina, Americans decided to repeat it and were defeated for much the same reasons. (Dien Bien Phu, though a setback, was no more crippling to the French military than Tet was to the Americans; what it shattered was political will.) Despite the Anglo-French boondoggle at Suez, the US repeatedly has intervened in the region since with no better results. The French experience in Algeria was almost mindlessly repeated by the US in Iraq.
The French haven’t brought any big disasters upon themselves lately (though they do have some minority assimilation challenges). Perhaps we’ll ape that too, for a while.
Monday, March 16, 2009
In Defense of Trash
The cult classic B movie Faster, Pussycat! Kill! Kill! (1965) was unavailable on DVD for years, but my copy finally has arrived. It is superb trash. No, it is trash transcending itself. Though there is not a scene in it that cannot be aired on primetime broadcast TV, the movie never is shown there because, collectively, the scenes make something definitely not for kids. There are busty killer babes, a threatened innocent, and (four years before Manson) a twisted family on an isolated desert ranch. Russ Meyer, with a pocket-change budget, directed his quirky cast to make something special.
There is an old controversy about this sort of film, with an odd coalition of social conservatives and PC-liberals arguing that productions of this ilk encourage violence and should be restricted. They cite studies showing that exposure to violent images desensitizes people and makes them more aggressive. An equally odd mix of right and left disputes this and opposes restrictions. Gordon Dahl at UC San Diego and Stefano DellaVigna at UC Berkeley, from the latter group, recently concluded in their study that violent films reduce violence:
“We find that violent crime decreases on days with higher theater audiences for violent movies. The effect is mostly driven by incapacitation: between 6PM and 12AM, an increase of one million in the audience for violent movies reduces violent crime by 1.5 to 2 percent. After the exposure to the movie, between 12AM and 6AM, crime is still reduced but the effect is smaller and less robust. We obtain similar, but noisier, results using data on DVD and VHS rentals. Overall, we find no evidence of a temporary surge in violent crime due to exposure to movie violence. Rather, our estimates suggest that in the short-run violent movies deter over 200 assaults daily.”
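As a back-of-the-envelope exercise (mine, not the authors’), it is easy to see how an effect of that size could add up to a couple hundred deterred assaults a day. The numbers below are invented purely for illustration; only the 1.5 percent figure comes from the quote above:

    # Rough sketch in Python; every input is an assumption except the 1.5% taken from the quote.
    baseline_evening_assaults = 1300  # assumed nationwide assaults between 6PM and 12AM on a typical day
    audience_millions = 10            # assumed attendance at violent movies that day, in millions
    reduction_per_million = 0.015     # low end of the quoted "1.5 to 2 percent" per million viewers

    deterred = baseline_evening_assaults * reduction_per_million * audience_millions
    print(round(deterred))            # roughly 195 with these made-up inputs

Nudge those guesses upward a little and the “over 200 assaults daily” headline falls out, which is all the sketch is meant to show.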
Both groups, in my opinion, miss the point. A normal individual does not go out and commit assaults because he or she watched a shoot-‘em-up movie. Artists and viewers should not be shackled and censored according to the lowest common denominator of human being, i.e. someone who cannot distinguish fiction from reality and who takes his cues from the former. The problem with this person is not the movies. There always are people who can’t handle freedom, no matter what variety; this is no argument against it. Shackles belong only on people who commit crimes, not on people who depict them.
This still leaves open the question of whether such productions have positive value. This too is an old dispute. More than two millennia ago, Aristotle disagreed with his mentor Plato, who thought literature and the theater should be censored; Ari said violent and emotionally rending Greek tragedies were cathartic. They allowed viewers to experience and discharge deeper and darker aspects of themselves in a harmless way. The experiences thereby were healthy and promoted self-knowledge. I think the old boy was onto something.
Faster, Pussycat! Kill! Kill! may not be The Bacchae and Russ Meyer was no Euripides, but the latter was considered by many to be a producer of trash in his day. I’m glad his works survived the censors.
Sunday, March 8, 2009
A Kinder Gentler Terminator
An Economist article on machine intelligence and the shift toward robotic warfare reports the following:
"Dr. Arkin believes there is another reason for putting robots into battle, which is that they have the potential to act more humanely than people. Stress does not affect a robot's judgment in the way it affects a soldier's."
How curious – and disturbingly credible. Perhaps Sarah Connor is safe after all, except from her fellow humans.
"Dr. Arkin believes there is another reason for putting robots into battle, which is that they have the potential to act more humanely than people. Stress does not affect a robot's judgment in the way it affects a soldier's."
How curious – and disturbingly credible. Perhaps Sarah Connor is safe after all, except from her fellow humans.
Tuesday, March 3, 2009
Critters
Over the years I've owned 9 cats, 2 dogs, a horse, and a skunk. There were about a dozen other pets which were co-owned with family. Every one of them had qualities I would admire in a human being, and, of course, many I wouldn't. Every one of them was instructive. Let us take the skunk, for instance.
Stinky (what else?) appeared one day at the back porch as a tiny little thing that apparently had lost his mother. It was necessary to de-scent him in order to share a home with him. He grew up healthy and strong, but though he consented to human contact he never really liked it. Instead, he formed a friendship with the Great Dane. He didn't much like the indoors at all. It seemed best to let him have his way as soon as he was big enough, so he moved out of the house into the back yard where he was much happier. He dug out a nest for himself under a leaf pile next to the dog house and ate out of the same dish as the dog. He lived there for years. People and other animals, not knowing his clip was empty, always gave him a wide berth. The bluff was enough. All the same, he never wandered far from the Great Dane just in case.
Stinky gave me some perspective on events in Iraq. Why, people ask, did Saddam Hussein allow the world to believe he had weapons of mass destruction for so many years? The simple answer: a credible bluff has deterrent value all by itself. It worked for Stinky. However, Stinky had some sense of limits to bluffing, so he hung out with the big dog just in case. Had Saddam sidled up to the big dog, he'd still be in power.
Wednesday, February 25, 2009
Convenient Untruth
All -isms are crazy. This is because each emphasizes self-supporting truths while ignoring or dismissing contrary ones. They therefore are systematic self-delusions.
This doesn't mean we can do without them. Craziness may be a condition of existence for human society. Besides, a concerted effort to do without them becomes an -ism too, to wit, pragmatism, which also emphasizes and ignores realities with abandon (in fact, formal pragmatists readily argue that truth is contextual and tentative). The world and society are just too complex to be considered at all times in all their detail; if we tried, we'd never get through our analysis of any situation in time actually to do anything about it. Our -isms provide us with simplified models of life that let us make rough-and-ready political and personal choices with reasonable speed. Still, some -isms are crazier than others.
What brings all this to mind is an autobiography I've been meaning to read for years and have finally started, Living My Life by the remarkable anarchist activist Emma Goldman. It is a fascinating book; early on, she describes her childhood and early sexual experiences in such a way that I was forced to ask, "What would Sigmund Freud have made of this?" It turns out she knew the man, so she probably knew the answer, and may have structured the book with it in mind.
As someone whose preferred -ism is of the classical liberal variety, I have sympathy for her distrust of government, for her zeal for human rights, and even for her ideal of free love. That we actually can do without government altogether, however, strikes me as particularly crazy. The idea seems especially so in the context of her egalitarian socialist economics. Anarchists of this ilk argue that government force defends property and inequality. Well, yes. In the absence of it, non-governmental force defends property and inequality. Above the level of small-scale communes, socialism is achieved by heavy-handed statist force: the more socialist the economy, the heavier the hand. It is no wonder that, after early enthusiasm, Emma was disillusioned by the 1917 Russian Revolution and turned strongly anti-communist (though still not pro-capitalist). To me, discussions of a total end to the state merely bring visions of Mogadishu: neighborhoods run by gangs and warlords, effectively little governments. I prefer states shackled but there.
The book, no doubt contrary to Emma's intent, is a reminder that our simple models (and not just the politically ideological ones) distort reality, sometimes dangerously. It behooves us to notice occasionally that even the most odious -isms are based on some truths; if they weren't, they wouldn't survive at all. Even the most congenial ones conceal truths and tell lies. A reality check now and then doesn't hurt – actually, it may hurt, but it is worth it.
Wednesday, February 18, 2009
The Day That Doesn't Exist
Once upon a time there was a federal holiday called Washington’s Birthday and, in some states, a state holiday called Lincoln’s Birthday. I could see the point of both. I also could see the points made by some that they were very close together and that they were one holiday too many. So, combining them into a single federal Presidents’ Day made a certain amount of sense.
Although one never would know it from all the Presidents’ Day sales at malls and auto dealerships, the combination never officially happened. The bill to create Presidents’ Day died in a Congressional committee in 1968. It is still Washington’s Birthday, though we do move it around a little now to make long weekends. We owe the widespread notion that there is a generic Presidents’ Day to the advertising companies hired by those mall owners and car dealers.
There is nothing wrong with unofficial holidays per se. The fact that something didn’t come from Congress often is a solid argument in its favor – and the Presidents’ Day bill died in Congress for all the wrong reasons (that meddlesome Lincoln fellow still had enemies, mostly in the old Confederacy). However, in this case, I think Congress accidentally made the right choice. Do I really want to celebrate all the presidents? Franklin Pierce? James Buchanan? Rutherford B. Hayes who shamelessly stole an election by means of an even more shameless deal to end Reconstruction? I don’t think so.
It probably is too late to give the day back to George alone. If I have to toast some other White House resident, though, I think it will be Chester Arthur even though he wasn’t one of the greats. A beneficiary of party bosses and the spoils system and regarded as a political hack when chosen for Vice President in a backroom deal, he became President when James Garfield was assassinated. Against all expectations, he attacked corruption and cronyism (about which, after all, he had firsthand experience) and instituted serious civil service reform. Chester rose above himself. I wish for nothing more from any president.
Sunday, February 15, 2009
Trash and Treasure
While a ten-yard dumpster sits in my driveway because of some repairs, I am taking advantage of it to dispense with clutter as well. The question is how to define clutter. Some choices are simple: a torn-up carpet goes in the dumpster, a commendation letter to my dad (WW2 Merchant Marine) signed by Harry Truman stays. Some are not.
I am not really a pack rat by nature. However, after more than five decades of life, and as the sole surviving member of my immediate family living in my family home, my garage, basement, attic and closets inevitably have filled with my own and inherited items, nearly all of uncertain value, even the sentimental kind. What to do with the perfectly good (old but not antique) tables and chairs which I remember from my childhood but for which there is simply no room in the present living space? What to do with the childhood toys of my sister (d.1995)? We're not talking about a vintage Barbie here or something else with any value for a collector, nor are there complete sets of anything suitable for re-gifting to some other young girl. Yet the remaining hodge-podge is still hard to throw away. What of boxes of photographs of people I barely know? What of ribbons from long forgotten horse shows and vinyl albums from long forgotten bands?
I've seen garages more overstuffed than mine. I have an easier task than some because my mom was not a pack rat either. In fact, her advice to me always was, "When in doubt, throw it out." She felt that people tend to drown in their own accumulated clutter and that it was better to keep life (and moving, should that be necessary) simple. Sometimes her decisions were, in retrospect, ruthless, as with the disposal of my first edition Marvel comics back before they were worth more than the cover price. Other times they were simply surprising. For example, she kept her 1947 wedding dress until my dad died in 2000; though she had made no comment about it, it was not in the drawer where she always had kept it when the time came to deal with her effects. Still, even she couldn't stop the slow material build-up which is now mine to re-assess belatedly.
I think my mom's philosophy was, on balance, the correct one. There are mementos which I want to keep on hand, but sentiment doesn't really reside in objects, especially unused or barely used ones. The dumpster will be full and there will be furniture at the curb for passersby to pick through before pick-up day this week. Throwing out when in doubt is a memento of sorts too.
Now if I only can fend off the friends with overflowing garages and eyes on my emerging free space.
Sunday, February 8, 2009
So Very High School
Every now and then I am inspired to see if I still have skills I sweated to acquire in high school (never mind college). I’ll break out the old Latin or Chemistry text and take a quiz at the back of some random chapter. The results usually are disappointing, even though I was nerdy enough to have been voted Best Student in the school yearbook. Voters rarely can be trusted, of course: I was neither valedictorian nor salutatorian. However, I was a pretty good student all the same.
After an unsettling result on a quiz, I then try to rebuild at least a modicum of competence in whatever subject I have chosen for my embarrassment. I’ve had to relearn Latin verb forms an exasperating number of times over the years, and (except for the simplest ones) would have to do so once more were I to work up the temerity to open that particular textbook again. This past week I plucked Second Year Algebra off the shelf; I lacked the courage to try trig or calc. Any reader who earns a living in a math-intensive field will chuckle to know I was baffled even by basic quadratic equations -- and what is f(x) anyway? Swallowing my pride, I re-opened the book to Chapter One and began to read.
Clearly I haven’t had much general use for this old information or I would not have become so rusty. My itch to revisit it is certainly idiosyncratic, and it is not something I suggest anyone else need do. If anything, it raises the question students ask in every classroom in every generation: “Why should I learn this when I’m never going to use it?”
They should. There is a reason, though it is not easy to explain in simple earning-a-living terms. It is true that most people are served well enough by basic skills in arithmetic and grammar, plus whatever specialized career training they might have after high school. Yet, there is more to life than a weekly paycheck and a weekend barbeque. There is something to trying to be a well-rounded person with some understanding of how the world works and how we got to be where we are. I never have met someone on whom a liberal arts education was a waste, even if he or she thought so. (I should point out that not everyone with a diploma or a degree has an education and not everyone without one of these pieces of paper lacks one; ultimately, we are self-taught or not at all.) Even if one never again scans a poem or calculates a tangent, it is important to know these things can be done, and that they require no special magic.
One does not want to be a contestant on the Howard Stern radio show, answering (not made up examples) that Paul Revere rode in World War One and that Columbus sailed on the Mayfair. We serve ourselves and our fellows better with a world view broader, richer, and more integrated than that of an australopithecine chipping flint.
In one of his grumpier moods, science fiction author Robert Heinlein once suggested that voting booths should be rigged not to accept the votes of anyone who can’t answer three simple questions: one of math, one of civics, and one of grammar. He further suggested that the booths of those who failed the test should ring and light up as a further discouragement to their coming back. I understand the likely harm of doing this for real, but it is hard not to sympathize with the sentiment behind the proposal.
Sunday, February 1, 2009
Posies for Pia
Remember Pia Zadora? Maybe a little? Not at all? After some experience as a child actor (see Santa Claus Conquers the Martians – no, on second thought, don’t), she met, at age 17, a Wealthy (the capital “W” is not a typo) businessman more than 30 years her senior. They married in 1977, divorcing amicably in 1993. While married, the undeniably cute Pia did ads for Dubonnet; her husband was a big stockholder in the producer of the aperitif. He also helped generate roles for her in several movies, starting with Butterfly (1982), which not even the presence of Stacy Keach and Orson Welles could salvage. The critics were beyond unkind. She was the first person to win two RAZZIE awards in a row for worst actress, though in truth she wasn’t as bad as all that. Her movies were bad but she wasn’t.
The popular response was far less rude. Her pop albums sold well in the 80s, she opened for Sinatra in Las Vegas, and she even won a Golden Globe. She posed for Penthouse, and that issue sold out. By the end of the decade even the critics mellowed. The reviews of her stage performances in the 90s were good. She retired comfortably a dozen years ago and recently sold her Beverly Hills home reportedly for more than $17,000,000.
Why is she so largely forgotten? She was no megastar to be sure, but she wasn’t obscure either. Other entertainers from the era with lesser careers are remembered better. I think it is because she skillfully exploited the opportunities provided by her marriage (her complete openness about this seems part of what irked the critics) and lived a responsible life. There were no drunken appearances on the Tonight Show, no break-ins of ex-boyfriends’ apartments, no sex tapes, no shots fired, no cocaine busts, no paternity suits, and no forgotten underwear in front of paparazzi. How boring.
It will be two years next week since Anna Nicole Smith died, and the tabloids already are full of retrospectives on her life and death. A true post-modern celebrity, she was famously famous for being famous. She dropped out of high school, stripped at a Houston club called Gigi’s, posed for Playboy, married a Wealthy businessman over 60 years her senior, and briefly hosted a bad reality show. She is not the first pole dancer to marry a large bank account, yet somehow she became a focus of national attention even though she didn’t even pretend to be talented in the usual sense.
Orson and Pia
Don’t get me wrong, I respect and like ecdysiasts. I also respect the year of happiness Anna gave to an 89-year-old man. I don’t condescend because she didn’t cultivate other professional skills. She didn’t need to. No one likes to see a personable and harmless person die young. Yet, there is something off-putting about the reports of how much pain there was in the life of someone with beauty, health, wealth, love, and fame. (The loss of her son a few months before she died was a real tragedy, but that explains nothing about the years prior.) Under the circumstances, overdosing was more exasperating to this observer than sad.
Miss Smith left too early, and this is truly unfortunate. Yet, here is to Pia who did it better – and is still doing it better.
Fake-Out (1982) - Opening Number
Sunday, January 25, 2009
After Hours
John Maynard Keynes, after years in the woodshed, is back in fashion. As most of us struggle this year to keep our heads above water, the government is following his advice about aggregate demand by spending astonishing trillions of dollars.
John Maynard Keynes should be on anyone's short list of great 20th century economists and thinkers. Yet he did make several predictions that proved false. One of them was that, as incomes rose (as by and large they have, though not this year), people would choose leisure over additional income and so would work less. This certainly had been the case previously, which is not surprising: in the 19th century 12-hour days and 6-day weeks were the norm. Who wouldn't want shorter hours than those? Yet it turned out that there was a point at which substitution stopped. Average hours worked leveled off by the 1950s and actually have risen slightly since, while two-income households became the norm.
Material desires are part of the reason. Old luxuries are now viewed as necessities. Completely new necessities such as satellite and cell services have arisen. Meanwhile, some of the old basics -- notably housing, health care, and education -- have risen sharply in cost relative to incomes and other prices. So, we struggle to keep up with higher expenses by logging more overtime.
There is more to it than simple materialism though. Even those who can afford to cut their work hours without scrimping typically don't, largely because, by and large, we don't use additional leisure time in very satisfying ways. For most people, more free hours do not translate into more art, literature, or travel. (Travel, of course, is expensive and requires more work to pay for it anyway.) Though it is the rare person who does not describe himself or herself as "creative," creative output is remarkably unrelated to the amount of time available for it. Most people with more time on their hands simply watch more television. Apparently, at least in the absence of truly sizable wealth that can buy constant entertainment, if there is anything more boring than a regular work week, it is the absence of one.
All that may seem a long and oddly irrelevant introduction to the real topic, but there is a connection. The topic is vice.
Thanks to a puritanical heritage, we still tend to classify almost all pleasurable indulgences as vices, even if we revel in them. In the proper measure, many of them are integral to the enjoyment of a full life; others may not be integral, but aren't always harmful in moderation either. Paracelsus: "Everything is a medicine. Everything is a poison. It is all a matter of dose."
Some people consider excessive work a vice. The TV viewing mentioned above has greater cultural value than it usually is credited as having, but beyond a certain point it is stultifying, and being a couch potato is actually dangerous to health. Sex, even among consenting adults, can be destructive in the absence of basic precautions -- or in the presence of some complication such as an unwitting spouse back home. (In my observation a spouse rarely stays unwitting.) All the same, it is one of the basic joys of life. Consuming alcohol in modest amounts apparently is physically healthy, and in moderate amounts can be socially enhancing, though we all know the hazards of excess – and we all know people who think a six-pack every night is moderate. Legalities aside, a similar argument can be made about recreational drugs. (For the record, I am sober when it comes to alcohol and other recreational chemicals; my vices are of another sort.) In short, all so-called vices can be self-destructive – though too strenuous avoidance of some of them may be so too.
The moderate folks are not much trouble to themselves or others. Unfortunately, excess – not just as an occasional celebration but as a way of life -- is far from rare, and I wonder if leisure time is a large part of the problem. There are, of course, people who live desperate lives and who try to numb their awareness as a way of escape; however, in first world regions at least, there are far more whose lives are not (or need not be) noticeably terrible, yet who still strive to escape through drugs and dangerous behavior. I think many of these folks just don't know what to do with themselves, and seek out excess as a way of feeling alive; lazily whittling on the front porch just doesn't do the job for them.
Who would have thought it? Leisure apparently is a challenge to human happiness. Not all of us can cope with it.
No, I offer no grand solution, and wouldn't put any trust in anyone else's. I certainly propose no laws. Each person has to work this out for him/herself. I merely note the unexpected risk. Perhaps there is an upside to the widespread failure to save for retirement. Assuming one survives a life of excess to reach old age, there will be work to do.
Sunday, January 11, 2009
On Diamond Mining
When I spotted a 7-volume hardback set in good condition of the complete novels of Mark Twain offered on Amazon for $9, the decision to order it took less than a second. Twain already occupied some of my shelf space with novels, short stories, letters, and essays. Yet, there was material in the set that I didn't already own and hadn't read, including Pudd'nhead Wilson, one of his true gems; I’m glad I’ve at last read it. It rarely is assigned in school because, I suspect, educators face so much trouble over racial issues in Huckleberry Finn that they simply haven't the heart for another round over the far more egregious Pudd'nhead, even though in both cases Twain's head and heart are very much in the right place. Despite the dry folksy humor, the latter novel, set in antebellum Missouri, is one of the most brutally cynical portrayals of human nature in general, and of race relations in particular, to be found in literature.
Sam Clemens himself is not at much risk of any more serious censorship at this point than exclusion from some required reading lists. Even then, recreational readers are free to find him on their own. Nevertheless, that there is any controversy over him at all underlines the importance of the First Amendment injunction that Congress shall make “no law” that abridges “the freedom of speech, or of the press." This protection receives more lip service than anything else in the Bill of Rights, yet it also is the one most consistently and relentlessly under attack. The attackers, by and large, are people of good will trying to protect the innocence of children or to avoid offense to some class or group. Defenders argue that the way to protect children is not to deny them the benefits of growing up in a free society, and that "offense" is not a good enough reason to shred the Bill of Rights either. After all, there are those who will take offense at almost anything, from Huck Finn to rap music. (Do we no longer learn the "sticks and stones" line?) As it stands, the Amendment does not make exceptions for offensive language – or, for that matter, for sexual content. It says "no law." I see no reason to interpret it differently. (I'm of course aware that the Supreme Court, notoriously in Roth v. United States [1957], decided that "no law" really means "some laws," something only 6 out of 9 lawyers possibly could conclude.)
One always has to dig through tons of dirt to find gems, as any diamond miner knows. That is all the defense the dirt needs.