Monday, December 29, 2014

The Art of Doing Nothing

Last night, as is common for me, I sat on the couch in an unlit living room while letting my mind wander to this and that. A few friends are currently staying at the house, and one happened to notice me in there. “Richard, it’s dark in there! Are you feeling okay? Is something wrong?” she asked, genuinely concerned. I explained, apparently unconvincingly, that everything was fine and this was just something I do. I didn’t add that I had assumed it was something everybody did. Once upon a time it was – by nearly everybody anyway. Nowadays, though, we are so focused on our laptops, iPhones, Playstations, satellite TVs, tablets, and (if we absolutely must divert our eyes from a screen) earphones that sitting alone in the dark without any electronics probably is a bit weird.
Back in 1928 economist John Maynard Keynes wrote Economic Possibilities for Our Grandchildren, prognosticating a world in 2028 with an economy grown sevenfold; after the 1929 Crash, he said that the Depression, bad as it was, was temporary and that his prediction stood. On the production numbers, he has been proven right. 2014 US per capita GDP is up sixfold in real terms over 1928 and may well reach sevenfold by 2028. Keynes goofed, though, in another prediction. He assumed that workers, as incomes rose, would continue to trade some money for leisure time as they had done for the previous century; he figured that by 2028 the workweek would be only 15 hours. They didn’t and it won’t be. The growth in leisure stalled 40 years ago and shows little sign of expanding again anytime soon. The reasons for this are manifold, but best left for a separate essay.
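As a back-of-the-envelope check, those multiples imply modest annual growth rates; a quick sketch in Python (the multiples and dates are the ones in the paragraph above; the function name is my own):

```python
def implied_annual_growth(multiple, years):
    """Constant annual growth rate that yields `multiple` over `years`."""
    return multiple ** (1 / years) - 1

# Sixfold real per-capita growth 1928 -> 2014 (86 years)
past = implied_annual_growth(6, 2014 - 1928)

# Getting from sixfold in 2014 to sevenfold by 2028 needs only:
ahead = implied_annual_growth(7 / 6, 2028 - 2014)

print(f"1928-2014: ~{past:.1%}/yr; 2014-2028 needed: ~{ahead:.1%}/yr")
# -> 1928-2014: ~2.1%/yr; 2014-2028 needed: ~1.1%/yr
```

In other words, hitting Keynes’ sevenfold mark by 2028 would take barely half the historical growth rate, which is why the production half of his forecast looks safe.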
Nonetheless, leisure time is much longer than in 1928 (if not 1978), and some of Keynes’ worries are still interesting. The idle rich, he noted, “failed disastrously” at occupying themselves satisfyingly, and he was concerned that the rest of the population would be no better at it. Only those who could practice and enjoy “the art of life itself” could avoid a loss of purpose akin to a nervous breakdown. In practice most of us have avoided both the artistry of life and nervous breakdowns. We have consumed our time instead with mindless busyness in the real world and virtual activities in the cyber world – frequently at the same time.  If all else fails, we can work longer; our bosses won’t mind, even when we’re self-employed.
If this is enough to keep one happy, so be it. (The evidence is that it isn’t, by and large, but that too is best left to a separate essay.) The real world with real people (“meatspace”) has much to recommend it, but let me recommend one weird old-fashioned form of solitary virtual activity, too. Turn off the smart phone, sit in the dark and let the mind wander. It’s a good way to become conscious of one’s own thoughts.  They’re there. Really.

Tuesday, December 23, 2014

General Admission


Americans have a reputation for lacking the stomach for long wars. Whether it is deserved (without meaning to sound Clintonesque) depends on how long “long” is. The Revolution (1775-1783) was 8 years. The Civil War was an especially sanguineous 4. So was the US participation in World War 2. While we did tire of it eventually, we stuck out Vietnam for 10, counting from JFK’s infusion of 16,000 troops in 1963. Afghanistan is 13. Those seem pretty long to me – but perhaps I’m proving the point.  It’s not just the length but the nature of the combat that matters. We’re pretty good at taking on conventional forces, but insurgencies that drag on and on wear on our patience. We even lost patience in our own country, ending Reconstruction in 1877 before its objectives were secure.

The difference between conventional and counterinsurgency warfare looms large in Lt. General Daniel P. Bolger’s book, provocatively titled Why We Lost: A General’s Inside Account of the Iraq and Afghanistan Wars. The word “lost” also is open to interpretation. Elected governments continue to hold sway in Baghdad and Kabul after all. However, ongoing civil war was never the plan; to the extent this is the unwanted reality, “lost” is a fair enough description. Bolger was a general officer in both Iraq and Afghanistan. The very first sentence of the book is, “I am a United States Army general and I lost the Global War on Terrorism.” He doesn’t mean all by himself, but he does mean himself personally along with others. He was never in overall command in either conflict: “You’d find me lower down on the food chain, but high enough.” High enough for his own decisions to have wide repercussions.

Bolger notes that US forces were (and fundamentally still are) configured for “short decisive conventional conflicts…Employed thusly, American airpower and SOF [Special Operations Forces] in 2001, and airpower and armor in Iraq in 2003, worked as advertised.” But, he writes, “Counterinsurgency works if the intervening country demonstrates the will to remain forever. Once it becomes clear that the external forces won’t stay past a certain date, the guerrillas simply back off and wait it out.” Yet the US and its allies didn’t pocket the victories in Kabul and Baghdad and then leave, as they could have done. They stayed. In Afghanistan this posed special geographical challenges in a landlocked mountainous country accessible only at the sufferance of frenemies. In Iraq the demographics weighed heavier.

Where was Bolger’s insight in these matters back in 2001 and 2003 when communicating it to the civilian leadership and public might have done some good? Lacking, to hear him tell it. “We faltered due to a distinct lack of humility.” He blames himself and other Pentagon brass for overconfidence after the early successes, and for failing to recognize and warn that “nation-building” was a task outside the expertise and resources of the military.

Yet this assignment of blame lets the civilians off the hook too easily: the Bush Administration to be sure, but also key Democrats. Feinstein, Dodd, Biden, Kerry, Reid, Edwards, Schumer, and Clinton, among others, all voted for the Iraq War which passed Congress overwhelmingly in a bipartisan vote. The public, too, was solidly behind it. Opponents were such fringy odd couples as Ron Paul and Dennis Kucinich. [Also, as it happens, me. I claim no special foresight: it was just consistent with a general instinct to oppose interventions of choice, whether in Panama, 1991 Iraq, Somalia, Kosovo, Libya, or elsewhere. Sometimes I’m probably wrong. Maybe time will prove me wrong in this instance, too.]

Besides, there were warnings from top brass from Colin Powell on down. (Powell was Secretary of State in 2003, but had been Chairman of the Joint Chiefs of Staff in the first Iraq War.) I remember full well US Army Chief of Staff General Eric Shinseki testifying to the Senate early in 2003 that controlling Iraq after the success of an initial invasion would take “something on the order of several hundred thousand soldiers.” No force of that size was available from the US and its partners for any lengthy presence. The scaled-down post-Cold War military simply didn’t have the troops and equipment to do this and still meet its other commitments. It would have been possible to rebuild troop strength sufficiently, but that was an unsavory political choice that no one in Congress or the White House wanted to make. Instead, warnings were dismissed and replaced by wishful thinking that demanded too much of too few. In Bolger’s words, “The key thing was to blow Saddam off the map. The rest might well take care of itself.” In 2004 US troops totaled 138,000 and Coalition forces from 37 countries contributed 23,000, but nothing took care of itself. Bolger adds that wishful thinking didn’t end with the last Administration, but has been a continuing feature of the current one too.

Bolger’s snappily written book is a worthwhile history of the two campaigns. It doesn’t lose sight of individuals, whether in command or carrying rifles on foot. He reports what they did right and isn’t squeamish about reporting who did what wrong.  As the evaluation of an insider with views at variance with many other high ranking officers, Bolger’s book is both a complement and a counterbalance to accounts such as Tommy Franks’ American Soldier.

General Shinseki in 2003 Senate hearing

Thursday, December 18, 2014

Fernwood Revisited

Contemporary society arrived in the 1970s. Don’t let the big hair, bell bottoms, and disco music fool you into thinking otherwise. While much of the modern consumer tech hadn’t yet spread out of company labs and nerds’ garages, the social revolution of the 1960s had basically succeeded by 1970, and this was far more significant than the state of consumer electronics. If anything, the 2010s have more in common with the 1970s than with the 1980s & 90s when a socially conservative reaction to the 70s flexed some muscle. You can see it even in 70s TV sitcoms, particularly those of Norman Lear: All in the Family, Good Times, Maude, et al. That is not to say there are no differences between then and now. There are plenty. The revolution was still fresh four decades ago, and many folks were dazed by it. This, too, shows up in film, music, and TV from the era.

One night in 1976 I was channel-surfing, which was a much smaller wave to ride than it is today. I beached upon a new Norman Lear program. Unusually, this one didn’t air on a network in prime time. It was syndicated and in the New York market it played at 11:30 PM on WNEW. The show was Mary Hartman, Mary Hartman. It sported a solid cast including Louise Lasser and Mary Kay Place. To call this show a spoof of soap operas is almost right – yet not quite right. The show didn't really aim for comedy though it often was funny. It had no laugh track. It stuck firmly to the soap opera conventions in its sets, camera shots, delivery, music, and 5 day per week schedule. But the scripts and dialogue were just…off. This confused station managers in some markets enough to air it in the daytime with the other soaps. Many viewers never really got it and I’m sure many wouldn’t today, but enough did to make the show a surprise hit.

Nowadays we are accustomed to broad comedies with over-the-top characters, bizarre plots, and often raunchy dialogue. Mary Hartman, Mary Hartman is much more low-key than that, and never more so than when the plot goes in some weird direction. In the town of Fernwood, Ohio, Mary (Louise Lasser) discovers her grandfather is the Fernwood Flasher; her younger sister Cathy takes promiscuity to a new level; her daughter Heather is a witness in the slaying of five people (plus “two goats and eight chickens”); Mary herself becomes a hostage in a stand-off with police; meanwhile her aspiring country singer friend Loretta (Mary Kay Place) writes songs about the murders and the hostage ordeal. The actors play it completely deadpan. The characters are plainly confused about their places in a changed world. Mary’s husband Tom still wears his high school varsity jacket, since high school was the last time he knew his place and had high hopes. Mary is a housewife aware that the job description is increasingly out of step with the times. She speaks about the murders one moment and about the waxy yellow buildup on her kitchen floor the next. By not playing the situations for easy laughs, the show instills in viewers the creepy sensation that this is not just a soap opera world but our world. It isn’t all that rare to have an errant family member, a murder down the street, a local hostage situation, an identity crisis, or a waxy yellow build-up. We deal with such things with the same alternation of bewilderment and retreat into the mundane as Mary.

Last week, for the first time in decades, I watched a couple dozen episodes of the show which now is available on DVD. Before pressing “play,” I wasn’t sure if, after all these years, the soap would come across solely as a relic of its time or whether it would still be relevant in 2014. It is not just a relic. In oh-so-many ways, today is still 1976. I’m glad to have revisited Fernwood.


Sunday, December 14, 2014

Recap: Corporal Punishers vs Roller Vixens

In the final bout of the season the Corporal Punishers of the Jerzey Derby Brigade (JDB) faced the Red Bank Roller Vixens on the JDB’s home rink in Morristown NJ.

The last time I saw the Vixens skate was in November 2013. New faces have been added since then, but there were a few returning veterans including Infra Red and Pushy Galore. The team needed them. In the very first jam Punisher CaliforniKate blew through Red Bank defenses and put 15 points on the board. In the second jam Punisher ApocElyse lapped the pack twice. It was a harbinger of things to come. Infra Red put the first few points on the board for Red Bank. Additional points were gained by Lady Speedstick and Pink Wrecker, but the Corporal Punishers dominated the first half, ending with a 156-46 half-time lead.

The Vixens redoubled their efforts in the second half with noticeable effect. Blocking became more aggressive with Purple Crush and C the Fury delivering solid hits. More than a few pile-ups were part of the action. Red Bank’s jammers were more successful, with points following a star pass maneuver putting the team over the 100 mark. Yet, it was still the Punishers’ game. Morristown has rebuilt a depth of experienced jammers after having lost key skaters in 2013 and it shows: Brass Muscles, LL Kill J, and Porcelain Brawl all proved adept at evading or pushing through Red Bank blocking. Vixen blockers are not lacking in energy or aggression, but the Punishers’ tighter defensive coordination gave them an advantage. The Punishers’ recent improvement shows that a single season can make a world of difference, and a 2015 rematch might see fortunes reversed. The Corporal Punishers took the win with a score of 264-116.   



Friday, December 12, 2014

And That’s the Truth

For no particular reason, here is a list of a dozen random facts that caught my attention this past week. Calling anything a fact will stir challenges just on principle in these cynical (yet, oddly, not “skeptical” in the proper sense) days, but I deem the sources, ranging from a chemistry text to The New York Times, to be reliable enough to regard the items as the truth until proved otherwise.

1. Neptunium, element 93 (wedged between uranium and plutonium), doesn’t have significant commercial applications, but you probably have some of the stuff in your hallway anyway. Smoke detectors commonly use small amounts of americium (element 95) isotope 241. Alpha particles from the decay of americium interact with smoke particles in a detectable way. Am241 decays into Np237. Am241 has a half-life of 432 years while Np237 has a half-life of 2,145,500 years. So, in several thousand years nearly all the americium will have been converted to neptunium. You’ll probably need to change the batteries in the smoke detector before then.
2. Among the scams practiced by our nation’s enterprising criminals (Mark Twain’s remark on the criminal class notwithstanding, I don’t refer in this instance to Congress) is the one of filing a fake deed in a county courthouse, and then borrowing money against the property, usually as an equity loan. It’s the kind of borrowing that doesn’t get paid back. The real owner suddenly discovers a lien against his property. To show how disturbingly easy this can be, in 2008 the New York Daily News gave it a try. They successfully filed a deed in the New York City Hall recording the transfer of property from Empire State Land Associates to Nelots Property. If anyone at the city hall noticed that the property address was the Empire State Building, the datum apparently wasn’t considered worthy of comment. Nor was the fact that the witness on the deed was Fay Wray and the notary Willie Sutton. (Perhaps those names really mean nothing to those below a certain age.) Nelots was fictitious and the Daily News revealed the hoax without making an effort actually to defraud the owners, so no one did jail time, but they made their point.
3. Blue whales’ tongues typically weigh 2700 kilograms. I do not know who weighed them.
4. 69% of cell phone owners experience “phantom vibration,” the sense that the phone is vibrating when it isn’t. While there might be physical explanations in some cases – perhaps there are instances of resonance or feedback or something – the prevailing opinion is that it is generally a psychological phenomenon.
5. Robert Louis Stevenson wrote the first draft of Strange Case of Dr Jekyll and Mr Hyde inspired by cocaine. The second draft was inspired by his wife Fanny Osbourne. She read the first and made suggestions, so he burned it. He found it easier to rewrite it in line with her suggestions that way.
6. On September 11, 2001, Ben Sliney had a tough day. He was the FAA National Operations Manager. He took charge in the emergency and ordered the immediate suspension of all civilian flights, a radical but much praised decision. It was Ben’s first day on the job.
7. June 26, 1944, saw the only (so far) three-way major league baseball game: Yankees vs Giants vs Dodgers. The teams rotated through the tops and bottoms of nine innings. Dodgers won: Dodgers 5, Yankees 1, Giants 0. It was a novelty game intended to sell war bonds to the stadium audience; attendees bought $6,500,000 worth.
8. “D’oh,” Homer Simpson’s characteristic exclamation, is in the Oxford English Dictionary: “Expressing frustration at the realization that things have turned out badly or not as planned, or that one has just said or done something foolish.”
9. It long has been known that tasters (even professionals) are remarkably poor at ranking cheap and expensive wines in blind tests, and that tasters’ opinions about the wines do not match what they say when they can read the labels. Expectations count for a lot, it seems. In 2009 researchers from the American Association of Wine Economists tested whether people were just as unreliable judging food such as pâté. They had eighteen subjects taste Spam, pork liver pâté, liverwurst, duck liver mousse, and Newman’s Own Dog Food. They were asked to pick out the dog food. Three of the eighteen got it right, no better than chance.
10. The record for going without sleep according to Scientific American is 264 hours. It was set by a 17-year-old at a science fair in 1965. I think I’ll let him keep the record.
11. A diamond five times the size of earth orbits the pulsar PSR J1719-1438. I’ve read why astronomers have reached this conclusion, but I’d still like to see it for myself. That might be difficult. If true, there’s no need fussing over the F. Scott Fitzgerald tale The Diamond as Big as the Ritz.
12. The little metal casing that holds an eraser to a pencil is called a ferrule. Maybe you knew that. Until this week I didn’t.
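The americium-to-neptunium conversion in fact 1 is just exponential decay, so the “several thousand years” claim is easy to check; a quick sketch in Python (the 432-year half-life is the one quoted above; the function name is my own):

```python
def remaining_fraction(years, half_life_years):
    """Fraction of a radioactive sample left after `years`."""
    return 0.5 ** (years / half_life_years)

AM241_HALF_LIFE = 432  # years, per fact 1

# Ten half-lives of Am-241:
years = 10 * AM241_HALF_LIFE
frac = remaining_fraction(years, AM241_HALF_LIFE)
print(f"After {years} years, {frac:.4%} of the Am-241 remains")
# ten half-lives leave (1/2)**10, i.e. under 0.1% of the original americium
```

So after roughly 4,300 years the detector’s americium really is nearly all neptunium, just as the fact says.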


Saturday, December 6, 2014

Movies after Midnight

I probably drink too much caffeine. There are advantages, though. When midnight rolls around and eyes refuse to close, there are DVDs to spin. Mini-reviews of ten follow. They were viewed as five double-features: a lately acquired habit has been pairing a newly viewed film with an older one which the first brought to mind.

****     ****     ****     ****

Sin City: A Dame to Kill For (2014) – It has been 9 years since Sin City (2005). The sequel was a long time coming, but it doesn’t disappoint. Like the original, this surreal noir tribute film is based on the graphic novels of Frank Miller. Perhaps taking a page not only from the comics but from Kill Bill!, the director appears never to have uttered the words “too much” with regard to violence. This is seldom the right decision, but in this case it was. Several plots intertwine. Jessica Alba returns as Nancy, but jaded and damaged by her thirst for revenge. Senator Roark is as evil as ever. Joseph Gordon-Levitt learns that luck carries you only so far. Mickey Rourke still enjoys busting up scumbags, even when the fight isn’t really his. There are solid supporting roles for Juno Temple and Christopher Lloyd. Eva Green is impressively sociopathic as a dame to kill for. If you liked the first film, you can’t go wrong with this one.

Side note: while the location of (Ba)sin City is as unspecific  as that of Clark Kent’s Metropolis, the I-287 sign indicates it can’t be much more than an hour from my house.

Out of the Past (1947) – Whenever I try to hook someone on classic noir, Out of the Past is my go-to movie. It’s got it all: a hard-bitten detective who is a secret romantic underneath the cynical persona, an unsavory client as likely to pay off with bullets as with money, and femmes fatales wielding gats and gams.  Whit Sterling (Kirk Douglas) hires Jeff (Robert Mitchum) to find Kathie (Jane Greer), the gorgeous dame who shot him and stole his money. He just wants her back. He tells a puzzled Jeff that he’ll understand when he meets her. He understands all right when he meets her, and so do we. The betrayals come thick and fast. The final betrayal deserves a thumbs up. Superb.

****     ****     ****     ****

The Hunger Games: Mockingjay Part 1 (2014) – Who would see this movie without first seeing the previous two Hunger Games? I suppose some folks would, but if you’re among them, stop. Go back and see the other two first. Or, at the very least, see the first one. If you do that you will know what to expect here, and the film delivers on expectations more than adequately. Katniss is now in District 13 where the rebels survive in their underground bases. She joins their fight against the Capitol. However, we get the sense that the rebels and their president (Julianne Moore) might not be such an improvement over President Snow were they to gain power. Peeta certainly thinks so, but he has been brainwashed so his conclusions are suspect. There is plenty of action and a plot that is satisfyingly more than a simplistic “good guys vs bad guys.” My one big complaint is right in the title: “Part 1.” The decision to split the adaptation of the final book of the trilogy into two parts was a business decision, not a directorial one, and it shows.

Sleeper (1973) – Awakened from cryogenic preservation into an authoritarian future, Miles (Woody Allen) flees to find the Resistance and save his brain (“my second favorite organ”) in this scifi comedy. Along the way he falls for Luna (Diane Keaton). Luna doesn’t fail to notice the handsome Erno who heads the Resistance. Discovering that only the nose of the state’s dictator has survived an assassination attempt, Miles and Luna try to steal the Leader’s nose to prevent it from being cloned. This will leave the state Leaderless and give the Resistance the opportunity it needs. Nonetheless, Miles makes clear that he does it to please Luna, not because he believes in any political system or in political solutions: “In six months, we’ll be stealing Erno’s nose,” he says.

This early Woody Allen comedy holds up remarkably well, and is still funny on several levels.

****     ****     ****     ****

Venus in Fur (2013) – I’m not the biggest Roman Polanski fan, but this little film, which he wrote and directed, is a gem. Thomas is a playwright who has adapted for stage the 1870 Austrian novel Venus in Fur by Leopold von Sacher-Masoch (the fellow who lent his name to “masochism”). He can’t find the right actress to play Vanda, the dominatrix to whom Masoch’s hero Severin submits supposedly as an act of love. An unscheduled actress enters the theater, which is empty except for the two of them. She says her name just happens to be Vanda. She and Thomas read lines. She proves perfect in the part even though Thomas is convinced she doesn’t understand the material. Thomas sees the play at bottom as a love story while Vanda says it’s about a decent girl corrupted by a pervert. They read lines, argue, and exchange thoughts; it is not always clear where the dialogue of the play ends and their own discussions begin. Vanda is intrigued by his play but hates it, telling him (in reference to the title) that it’s a good thing there are no goddesses or “you would be fucked.” Vanda knows too much, however, and one begins to wonder if Thomas actually does have Venus on his hands. While that question isn’t answered in the film, and while nothing occurs that is indisputably supernatural, it is the most straightforward explanation for numerous very odd circumstances and for what happens to Thomas.

It can’t be a pure coincidence that Mathieu Amalric (cast as Thomas) looks very much like Polanski. He nails the part. Emmanuelle Seigner (Vanda) also hits the right notes. Recommended.

Gilda (1946) – One is hard pressed to find a more perversely sadomasochistic couple than Glenn Ford and Rita Hayworth in Gilda (1946). Johnny Farrell (Ford) goes to work in Buenos Aires for a casino owner named Mundson only to discover that Mundson’s wife is Johnny’s old flame Gilda (Hayworth). Part of Johnny’s job is to be her bodyguard. They don’t tell Mundson about their history but that is the only point on which they agree. They relentlessly taunt each other for past wrongs. The wrongs are unspecified though there is a hint of a prior infidelity by Gilda and more than a hint of Johnny’s excessively stubborn refusal to forgive. Yet, they continue the mutual clawing precisely because the spark between them is still there. The hurt they have done each other (and continue to do to each other) is so much a part of their identities that they are almost fond of it. Mundson, meanwhile, is involved in a criminal scheme which will test where the loyalties of Johnny and Gilda lie.

This is a fine film about two flawed people with a passionate but cruel relationship. Rita never looked or sounded better.

****     ****     ****     ****

What If (2014) – Daniel Radcliffe has had a varied post-Potter career. Here he is paired with the charming actress Zoe Kazan.  Wallace and Chantry (Daniel and Zoe) are “just friends” despite an obvious chemistry between them. The reason is that Chantry has a committed relationship with a good guy, and doesn’t want to mess it up. Wallace, who knows he wants more, is caught between the options of being a jerk (coming on to her) or being pathetic (hiding his feelings from her). When Chantry’s boyfriend leaves Toronto for an extended stay in Dublin because of a career opportunity, however, the strain on the relationship with him grows. So does the strain on the one with Wallace.

This is a pleasant love story of the sort not seen much in movies made on this side of the border in the past decade or two. It isn’t particularly original (When Harry Met Sally is the obvious comparison), but it works thanks almost entirely to Radcliffe and Kazan. The absence of fashionable cynicism is refreshing.

The April Fools (1969) – Howard (Jack Lemmon) is unhappily married to a woman who cares little for him, but at least he gets a promotion. At a party his playboy boss Ted (Peter Lawford) tells him to play around. Ted gives him tips for picking up women and tells him to try them on someone at the party. Howard does just that with Catherine (Catherine Deneuve). Wow, the tips do work. He and Catherine leave the party together. Uh-oh, Catherine is Ted’s wife. She is unhappily married, though, and announces her plan to leave Ted and fly to Paris. Howard wants to join her. Will they follow their hearts or will they be overwhelmed by practical considerations of money, responsibilities, and commitments?

This film came out when old moral standards were breaking down but a 60s vibe of “all you need is love” was still in the air. It was also a time when happy endings in romantic comedies no longer were de rigueur. This isn’t a particularly profound or insightful film, but it has some charms, asks a still relevant question, and captures a moment in time.

****     ****     ****     ****

Byzantium (2012) – Tired of vampires yet? Most of us are. Nonetheless, this flick is stylish enough to be worth a look. Clara (Gemma Arterton) and Eleanor (Saoirse [pronounced SER-sha] Ronan) are a mother-daughter pair of vampires living at the Byzantium, a sea-side hotel that has seen better days. We learn their two hundred year-old backstory in flashbacks when Eleanor writes it in a supposed work of fiction that she lets her boyfriend read. This is indiscreet, but she still is 16 after a fashion – and always will be. Clara and Eleanor already are on the run from the Brotherhood, a chauvinistic trade association with anticompetitive practices, for violating the code of vampirism. So, indiscretion can be lethal. Sure enough the story gets around, and the Brotherhood shows up to kill them. Who else is tasked by the Brotherhood to behead Clara but Darvell, the fellow ultimately responsible for Clara having become a vampire 200 years earlier? He isn’t keen on the idea though. He still feels bad about his behavior a couple centuries back. Our sympathies are so much with Clara and Eleanor that we tend to forget they, too, are predators.

Dracula’s Daughter (1936) – In this official Universal Studios sequel to Dracula (1931), Contessa Marya Zeleska is Dracula's daughter, and, yes, she is a vampire. The story begins just after the death of Dracula at the hands of von Helsing. The Contessa steals Dracula’s body and burns it in the hope of breaking the curse of vampirism, but this doesn’t help. She is still thirsty. Marya then seeks out the help of a psychiatrist, Dr. Jeffrey Garth, hoping to break her sanguineous habit through will and therapy. This doesn’t help either. Her thirst continues. Her assaults, like Dracula’s, are more than a little erotic, particularly against the painter’s model Lili. In a case of classic transference, however, she develops a thing for Garth. She eschews therapy but kidnaps the woman Garth loves in order to lure him to Transylvania. She plans to bite Garth so they can live together forever as vampires. He, of course, has other plans.

This movie had pretty good reviews back in the day – at least in comparison to other horror films. If you like the 1930s-40s Universal horror movies (I do), this one shouldn’t be overlooked. Viewers new to the genre, however, should be forewarned that, unlike their modern counterparts, these flicks are about atmospherics, not about action and gore.

****     ****     ****     ****

If I were to recommend only one, the winner this week would be Venus in Fur. Runner-up: Gilda.


Tuesday, December 2, 2014

Jam To-morrow and Jam Yesterday

Library shelves, both physical and digital, are well-stocked with works of futurists. I don’t mean science fiction, though more than a few futurists are also science fiction writers. I mean non-fiction efforts to forecast the future in light of evolving technologies; it’s a genre that cropped up in modern form about a century ago. Some authors are dystopian and lament the world we already have lost. Some are utopian to a degree that would shame Pollyanna. Most, though, are a curious mix of both. Examples: Alvin Toffler discusses the social upheavals associated with accelerating technological change in Future Shock (1970); Eric Drexler extols the promise of nanotechnology in Engines of Creation (1986); Vernor Vinge while in non-scifi mode wrote the influential The Coming Technological Singularity: How to Survive in the Post-Human Era in 1993 about the era when artificial intelligence outstrips human intelligence; Michio Kaku has written more than a dozen such books so far, including Physics of the Future: How Science will Shape Human Destiny and our Daily Lives by the Year 2100 (2011). Possibly the most influential was the early entry The World, the Flesh & the Devil (1929) by British molecular biologist J.D. Bernal. I finally got around to opening it last week.

Bernal chose this title precisely because of the phrase’s baggage, though he himself was atheist and Marxist. The subtitle helps illuminate his subject: An Enquiry into the Future of the Three Enemies of the Rational Soul. By “world” he means the physical environment including the cosmos beyond earth which we engineer to suit ourselves. Bernal discusses methods of propulsion to escape earth’s gravity and to move beyond it; he predicts the eventual construction of permanent habitats in space and describes them in some detail. In the chapter on “flesh” he addresses biology and anticipates genetic engineering: “It is quite conceivable that the mechanism of evolution, as we know it up to the present, may well be superseded at this point.”  He also discusses mechanical biological hybrids (cyborgs) and group minds intermediated by machines (think the Borg of Star Trek). By “devil” he means the dark side of human nature and our animal heritage which so readily turn technologies deadly. “The devil,” he writes, “is the most difficult of all to deal with: he is inside ourselves.” It was a natural thought just 11 years after WW1. Bernal is not entirely sure how well our attempts to transcend ourselves in this regard will turn out.

What is striking about The World, the Flesh & the Devil is how contemporary it seems. Current books by current futurists still raise many of the same points and make many of the same predictions, even though we now are brushing the edges of technologies that were merely a distant notion in 1929. Bernal’s vision holds up pretty well. So do his reservations about how things will turn out given what he calls our mammalian natures.

Why do we enjoy writing and reading about a future we personally will not live to see? Perhaps it is just a way to divert ourselves from a present we find unsatisfactory; futurism thus can be the flip side to nostalgia. Perhaps, also, it is a way of including ourselves in that future – “a way of cheating death,” to quote Bernal in his “flesh” chapter. Does this work? Maybe a little. Bernal died in 1971, but I met him last week after a fashion. I’m sure he would have preferred to meet me (or anyone) in person though. Woody Allen: “I don’t want to live on in my work. I want to live on in my apartment.”

The Offspring – The Future is Now

Tuesday, November 25, 2014

Bird Brains

A local news report informed me earlier this evening that there are 23,000 wild turkeys in NJ according to the state DEP. They mean the fowl, not the whiskey. It is not entirely clear to me who counted or how, but I’ll take the DEP’s word for it. At least a few of the birds live in the woods around my house. They don’t walk through my yard very often, but when they do they do so nonchalantly without apparent concern about ending up on the menu.

For anyone accustomed only to domestic turkeys, the wild ones, though not technically a separate species, hold some surprises beyond the color of their feathers. For one thing, they fly. They don’t fly particularly well, and most of the time they prefer not to, but they can. They sometimes swoop over the lawn or get up into the trees. For another thing, they are fearless. Domestic turkeys are fearless too, in a way, but one gets the sense that domestic ones just don’t know any better. Wild turkeys are fearless out of confidence rather than ignorance. On occasion one can be aggressive. An aggressive turkey is annoying rather than truly dangerous, of course, except maybe to a toddler. Finally, they are smarter than their farm-bred kinfolk.

Domestic turkeys are stupid. There is no kinder way to say it and still deliver the facts. My grandfather, a farmer, lost some of his birds because they looked up at the sky with their beaks open when it was raining and drowned themselves. In their scholarly article Some Remarks on Bird's Brain and Behavior under the Constraints of Domestication, Julia Mehlhorn and Gerd Rehkämper remark, “domestic turkeys show the highest degree of brain reduction measured in any of the domesticated birds so far.” No surprise there. So, while these birds face bad odds at Thanksgiving time, they’ll be safe if that Zombie Apocalypse depicted in so many movies ever happens. Unless it rains.

It long has been known that domestication reduces the brain size of animals. From the same Mehlhorn and Rehkämper article: “Empirical data on brain sizes which show smaller brains in dogs than in wolves or in domestic ducks in comparison to mallards seem to support this point of view.” Before we admire too much the skills of our ancestors who successfully dumbed down animals to make them more manageable, it’s worth noting that they did the same to themselves. To us.

Brain size peaked in humans 20,000 years ago and since then has dropped substantially, both as a percentage of body mass and in absolute terms – from an average 1500 cc to 1350 cc. (Homo erectus was 1100 cc – we’re getting there.) David Geary, a cognitive scientist at the University of Missouri, speculates, “I think the best explanation for the decline in our brain size is the Idiocracy theory.” He and his colleague Drew Bailey were able to show a correlation in the archaeological record between population density and brain size decline, whether in early farming regions such as the Middle East or hunter-gatherer regions such as aboriginal Australia. “We saw that trend in Europe, China, Africa, Malaysia—everywhere we looked.” The hypothesis is that when societies grow large enough to carry the dim bulbs who would not be clever enough to survive without a social safety net, there is no particular advantage to large brains or high intelligence. The dim bulbs live to reproduce. There is, on the other hand, an advantage to characteristics that suit living peacefully among others. In other words, there is an advantage to being tame and domesticated. Domestication, once again, shrinks the noggin. Hence the reference to the Idiocracy movie.

So long as the turkey is on our menu and not we on its, we’re probably not in too much trouble. But if a Cro-Magnon shows up for dinner, don’t try to beat him at chess afterward.


Friday, November 21, 2014

Desperately Seeking Wonka

My tastes are uninterestingly conventional in most matters, whether in durable goods, consumables, affections, or activities: utterly bourgeois without redeeming depraved fetishes. I like to think my taste in movies and literature is good, but it’s not ultra-refined by any means: I’m seldom in art movie houses, and science fiction occupies a disproportionate share of my bookshelf space. I’m not a wine, beer, or coffee connoisseur. Nonetheless, I try to avoid lapsing into reverse snobbery, which is always a temptation for those who don’t fully appreciate some of the finer things. We all know this sort of snob: the emphatic man-of-the-people proud to disdain wine and art collectors (among others) as effete elitists with too much money and too little sense of proportion. This snobbery is as unfair as the elitist kind. Great art, science, and craft really do go into the production of wines, and it is admirable to be an enthusiast who has learned enough to appreciate even tiny nuances and distinctions. I just don’t happen to be one. I use wine as an example precisely because wine appreciation, while still not entirely exoteric, has gone more mainstream in the US in recent decades – certainly more so than in the 1970s, when cheap sangria counted as a chic beverage at most parties.

The mainstreaming of upscale products is not confined to wine, of course; numerous products have followed the same path, thereby mightily aiding the profits of suppliers. Mineral waters, for instance, have been around pretty much forever, but during the 1970s they were very much a minority taste; the tap was fine for the vast majority. Bottled water more expensive than gasoline caught on with a larger public afterward. In the same way, there always have been connoisseurs of beer, but specialty beers and microbreweries didn’t really start denting Anheuser-Busch sales until a couple of decades ago. Need what happened to coffee shops even be mentioned?

What brings this to mind is, of all things, the jar of leftover Halloween candies I raided a few minutes ago. I rarely get trick-or-treaters but want to be prepared just in case, so there are always leftover Halloween candies. While plucking out a Hershey chocolate bar from the jar, it occurred to me that chocolate might be poised to be the next big thing in the mainstreaming of the upscale. It has all the proper hallmarks. It is a widely enjoyed flavor in its common forms; it is in fact the most popular single dessert and snack flavor. There already is – and always has been – a minority of aficionados willing to pay extraordinary sums for the finest chocolate products. There are and always have been specialty makers and shops. Regional soils and climate affect taste. Chocolate differs according to bean type, cocoa butter content, sugar content, dairy content, weather, and even the local water. Heating and cooling methods profoundly affect texture. So, there is much for a specialist to know.

I’ve tried and liked expensive craft chocolates from several countries, but, as in so many other things, I am a barbarian; I’m satisfied enough with a mass-produced Nestlé or Hershey bar to be reluctant to pay the high cost of craft products. However, I can appreciate that others do appreciate the difference enough to shell out for it. Perhaps we are only awaiting the right chain of upscale chocolate shops (Ishmael’s?) to bring a larger public on board.

Have you ever tried a cocoa bean (aka cacao) in its raw state? Don’t. It’s like a foul, bitter chunk of stringy wood, which isn’t surprising since the pods grow right out of the bark of the trunk and branches rather than on the tips of twigs as one might expect. Forastero, Trinitario, Criollo, and Nacional are the four major bean types. Forastero is the one most commonly used in mass-produced chocolate because of its high yield. When harvested, the beans are thrown on the ground and left to rot – on purpose. They ferment there, which brings out the first hints of chocolate flavor. Weather strongly affects how the fermentation happens, so there are vintage equivalents, with some years preferred over others for each region. After fermentation, if you roast the beans, crush them, and mix them in hot water, you have the original chocolatl of the Olmecs, Mayans, and Mexica (Aztecs). This is an interesting but not very pleasant drink. Chocolatl means bitter water, and so it is; moreover, the cocoa butter (fat) tends to separate, thereby giving the drink an oily/gravelly texture. When introduced to Europe it became a novelty drink and remained one for a few hundred years.

The breakthrough came in 1828 when Coenraad van Houten in the Netherlands used a special press to crush the beans into a fine powder called Dutch cocoa; the press also squeezed out most of the cocoa butter in a much smoother state. Experimenters then mixed the smooth cocoa butter back with the powder, added sugar, and created modern chocolate. Fry & Sons in England began producing chocolate bars in 1847. In Switzerland in 1875 Daniel Peter produced milk chocolate by adding Nestlé powdered milk. Dark and milk chocolates have competed ever since. Local differences in milk and milk-fat content affect flavor. Some producers still use powdered milk and others (especially in the UK and US) liquid milk.

There is more to chocolate than flavor. Chocolate contains caffeine, cannabinoids, and theobromine, all of which are psychoactive and may account for some of the pleasure we take in it. Health benefits are claimed for chocolate; if they exist, they are probably due to the antioxidants in the beans.


So, all the complexity and craftsmanship that go into chocolate-making should be enough to induce a broader base of connoisseurs (Wonkateers?) to bid up prices and debate star ratings. As for me, though, the Hershey’s is gone and I’m going back to the jar for a Nestlé’s Crunch.


Cadbury ad

Sunday, November 16, 2014

Non-Generic: Lucinda Williams

Back in Paleo-circuit times (pre-internet: Neo-circuit would be dial-up internet), electronic media by necessity were mass media. There was a fairly small number of broadcast radio and TV stations and…well…that was all. To be sure, there were niche music radio stations even in the early days: country, classical, jazz, etc. In nearly every market, however, there was a dominant top 40 radio station to which most home, car, and portable radios were tuned at least part of the day. In the NYC area, for at least two decades, this was 77 WABC AM. While I appreciate and make use of the massively greater array of media choices available today, there was one peculiar advantage to the more limited options of the Paleo-circuit. The top 40 stations were literally that. There was no division by genre; if the top singles one week were by Frank Sinatra, The Rolling Stones, and Tammy Wynette, those were the singles that aired. Recording artists who appeared on television were the same ones who turned up on the top 40 stations. Here is a playlist from 1974:

1. The Way We Were, Barbra Streisand
2. Seasons In the Sun, Terry Jacks
3. Love's Theme, Love Unlimited Orchestra
4. Come and Get Your Love, Redbone
5. Dancing Machine, The Jackson 5
6. The Loco-Motion, Grand Funk Railroad
7. T.S.O.P. (The Sound of Philadelphia), MFSB
8. The Streak, Ray Stevens
9. Bennie and the Jets, Elton John
10. One Hell of a Woman, Mac Davis
11. Until You Come Back to Me (That's What I'm Gonna Do), Aretha Franklin
12. Jungle Boogie, Kool and The Gang
13. Midnight At the Oasis, Maria Muldaur
14. You Make Me Feel Brand New, The Stylistics
15. Show and Tell, Al Wilson
16. Spiders and Snakes, Jim Stafford
17. Rock On, David Essex
18. Sunshine On My Shoulders, John Denver
19. Sideshow, Blue Magic
20. Hooked On a Feeling, Blue Swede
21. Billy, Don't Be a Hero, Bo Donaldson and The Heywoods
22. Band On the Run, Paul McCartney and Wings
23. The Most Beautiful Girl, Charlie Rich
24. Time In a Bottle, Jim Croce
25. Annie's Song, John Denver
26. Let Me Be There, Olivia Newton-John
27. Sundown, Gordon Lightfoot
28. (You're) Having My Baby, Paul Anka
29. Rock Me Gently, Andy Kim
30. Boogie Down, Eddie Kendricks
31. You're Sixteen You're Beautiful (And You're Mine), Ringo Starr
32. If You Love Me (Let Me Know), Olivia Newton-John
33. Dark Lady, Cher
34. Best Thing That Ever Happened to Me, Gladys Knight and The Pips
35. Feel Like Makin' Love, Roberta Flack
36. Just Don't Want to Be Lonely, The Main Ingredient
37. Nothing from Nothing, Billy Preston
38. Rock Your Baby, George McCrae
39. Top of the World, The Carpenters
40. The Joker, The Steve Miller Band

Not much consistency there. Accordingly, audiences had more cross-genre exposure than today. We heard a lot of stuff we wouldn’t have chosen to hear if we had programmed the music. Nowadays, of course, we do effectively program our own music, selectively storing and playing our preferred tunes and videos in a multitude of formats. A curious consequence of all these choices is that we tend to be less eclectic. We focus on our preferred brands of music and visual entertainment, while even the old-tech radio and TV stations have grown ever more niche-oriented in order to grab some piece of the fractured audience. (This is also true of opinion and politics, but that is a subject for another blog.)

Some artists are hard to pigeonhole, of course. They fall between the niches. This is certainly true of the blues/rock/country/folk fusion of Lucinda Williams, who is one of the best songwriters working today, as she has been for more than 30 years. Her indeterminate style hasn’t stopped her from winning awards and selling recordings, but it does make her less well-known to a broad audience than she would have been years ago. Niche stations are never quite sure she fits. In ’99 she won a Grammy in a Contemporary Folk category even though that description of her album was more than a little dubious. Paul Rice, in a Slant review of Lucinda’s new double album Down Where the Spirit Meets the Bone, comments, “In other words, should Williams be nominated next year, expect the Grammys to once again have no idea what to do with her.”

I’ll be surprised if the Grammys do not face that head-scratcher, because Down Where the Spirit Meets the Bone is a fine album. At 61, Lucinda has worldly cynicism without the bitterness of youth. Songwriters at this stage of life (e.g. Joan Jett on last year’s Unvarnished album) often turn retrospective and contemplative as mortality grows harder to ignore. The perspective often enriches their work, and it does here. It’s an impressive collection of songs, all delivered in her distinctive gravelly voice. There are dark songs such as Something Wicked This Way Comes, tough songs such as Cold Day in Hell, sad songs such as This Old Heartache, and songs that smell of the bayou (Lucinda is from Lake Charles, LA) such as Stand Right by Each Other. Some are mellow and some rock. A few, such as Walk On, come very close to modern country without quite arriving; she doesn’t want to go there. In a Rolling Stone interview, she was dismissive of modern country and quoted bassist John Ciambotti: “Country music today is like Seventies rock without the cocaine.”

If you’re thinking this isn’t really your kind of music, you’re probably right. It isn’t mine either. I’m not sure it is anybody’s. Nevertheless, I’m glad I bought the album. Both CDs from the set are currently in my stereo’s CD tray – with the Offspring, Eric Burdon, and Theory of a Deadman, which makes for odd company. They’re likely to stay there for a while.



West Memphis (from Down Where the Spirit Meets the Bone)

Philly Blocks Morristown: Roller Derby Recap

I’ve seen the Philly Block Party skate numerous times against teams from both of Morristown’s derby leagues, and I’ve seen them defeated only once. Not only do they typically win, they win big. Only two months ago the Philly Block Party defeated Morristown’s Corporal Punishers on the latter’s home rink 232-107. In last night’s rematch the early jams threatened a replay of the September bout. JK Trolling scored for Philly on the first jam with multiple passes through the pack, while Philly blockers displayed a solidity they would maintain throughout the bout. For several jams Philly built on its lead. The Corporal Punishers soon recovered their footing, though. At first the Punishers chipped away at their opponents’ lead, and then overtook them with a major point haul by Brass Knuckles. With 5 minutes remaining in the first half, the Block Party recaptured a razor-thin lead.


The second half remained competitive while blocking on both sides grew fiercer, resulting in knock-downs and pile-ups. With 15 minutes remaining in the bout, Philly led 168-152, a margin that in derby is anything but safe. The Block Party was able to stretch its advantage in power jams, however. In the final jam JK Trolling broke through the Punisher defenses into lead jammer position and literally sashayed around the track as the clock ran out. Final score: 238-182 in favor of Philly.


Thursday, November 13, 2014

Mr. Spock, Sneak Drinker?

I got one of those late night drunken phone calls. You know the ones. Or perhaps you don’t. Nowadays there is a plethora of other ways to communicate inappropriately while drunk: texts, tweets, facebook comments, etc. Voice communication might be considered quaint in some generational quarters. There remain some, however, who keep the tradition alive. Truth be told, in my more bibulous 20s I made a few such calls myself. Those days, both fortunately and unfortunately, are a long time ago. Nonetheless I received one last night. The caller had some life decisions to make and wanted a sounding board – not advice, so I didn’t offer any. I probably wouldn’t have offered any if asked. If the caller came to a decision, one hopes it was considered also in the sober (and probably painful) morning.

Holding off on implementing a drunken idea until sober is a wise precaution, but is there also sense in holding off on a sober idea until one has considered it drunk? There may well be. It has historical tradition behind it. An oft-referenced passage in Herodotus from the 5th century BC tells us that ancient Persians would not make an important decision until considering it both drunk and sober. There are much more recent anecdotal instances when this method proved useful. The SALT treaty negotiated by Henry Kissinger and Leonid Brezhnev in 1972, for example, was a pretty good deal all around, and the Persians would have approved of the way alcohol infused its negotiation, as related in Kissinger’s White House Years. But is there any study of the question that is a bit more scientific? Yes, a bit.

At a bar in Grenoble, France, researchers approached drunken patrons and asked them to fill out a questionnaire on matters of philosophy. Perhaps this sort of occurrence is less strange in Grenoble than in New York or Chicago, because more than one hundred patrons did it. (See http://www.sciencedirect.com/science/article/pii/S0010027714001875) The questions included ethical dilemmas such as the classic trolley and bridge problems which, simplified, ask if you would kill one innocent person to save five. (Assume no negative consequences to you, legal or otherwise.) Drunks are far more likely to be OK with offing the one in utilitarian fashion. In fact, they are much more coldly logical all around in ethical matters than sober folks. This is curious, since we commonly associate drunkenness with emotionality. Apparently all that emoting is self-centered: the outpourings are about the drunk’s own hates, loves, desires, and fears. With regard to other people, the inebriated are surprisingly hardheaded in all sorts of ways. So, the Persians might have been onto something.

Drunks may be logical, which is a counterintuitive result, but one may ask if there is any verity to “In vino veritas,” an intuitive supposition already ancient when Pliny the Elder referenced it in Naturalis Historia. Are drunks really honest? Yes and no. They are less inhibited. So, they blurt out what otherwise they might keep to themselves. To that extent they are more honest. However, when it is in their interest to lie – as when stopped by police – there are few liars as inspired as drunks. Many of the greatest novelists have been heavy drinkers, and fiction, after all, is a kind of lying. The degree to which they are more honest probably shouldn’t be counted in their favor anyway. Honesty, despite its reputation, is not always a virtue; the trouble with truths, especially when spoken by the intoxicated, is that they frequently are unkind, inappropriate, or both.

Of course, one could simply stay sober and retain the ability to cushion the sharp edges of one’s commentary with paddings of prevarication. I, for one, stumble into inappropriateness often enough as it is, so this is a better choice for me. If that means being less logical, well, every choice comes at a cost.

Hesher (2010): Inappropriate but honest eulogy

Friday, November 7, 2014

Double Doses

Mini-reviews of six midnight home double-features follow. I’ve continued my lately acquired pattern of pairing a newly viewed film with an older one of which the first reminded me.

** ** ** ** **

Factotum (2005) – Charles Bukowski is one of those writers who is very good at what he does, but who leaves a disagreeable aftertaste. Ultimately, the problem is snobbery. Snobbery is not confined to the upper classes. There is a working class version. There also is the snobbery of the bad boy who looks down his nose at anyone less hard-drinking and hell-raising. Bukowski is no stranger to either kind. For all that, he remains a good read. The movie Factotum, based on Bukowski’s book of the same title, shares many of the book’s faults and virtues. Hank Chinaski, played by Matt Dillon, is a thinly veiled version of Bukowski himself. Hank is a commercially unsuccessful writer who spends his life drinking, gambling, and womanizing, inevitably with women who also have drinking problems. Just to earn the bare minimum amount to live, Hank works at a series of meaningless menial jobs, though he always gets fired for drinking or slacking. About his writing, though, he has artistic integrity. In short he is a drunk with literary pretensions. The movie leaves a disagreeable aftertaste. For all that, it is a good watch.

The Philadelphia Story (1940) – When asked to name my all-time favorite movie, I’ll give one of several answers according to my mood, but The Philadelphia Story always is in the running. Tracy Lord (Katharine Hepburn) has divorced Dexter (Cary Grant) after a short tempestuous marriage, and now plans to marry the self-made nouveau riche George (John Howard), who lacks not only the easy grace of old money but also its relaxed morality. In fact, George is steeped in out-of-place bourgeois values. A tabloid newspaper sends reporter Macaulay and photographer Elizabeth (Jimmy Stewart and Ruth Hussey) to cover the high society marriage.

How could a film centered on the romances of the uber-rich possibly have been called to mind by Factotum? It was Jimmy Stewart’s role. The sometimes drunken writer Macaulay is so self-satisfied in his disdain for the privileged class that Tracy calls him out for being a snob: “You're the worst kind there is. An intellectual snob. You made up your mind awfully young, it seems to me.” Throw in a precocious younger sister to Tracy, an ebrious old uncle with a taste for chorus girls, and dialogue that is intelligent, sophisticated, and funny, and you have a movie classic.

** ** ** ** **

Neighbors (2014) – Mac and Kelly Radner (Seth Rogen and Rose Byrne) have a new baby and new neighbors in their suburban neighborhood. The neighbors are a college fraternity with a reputation for legendary parties. The frat’s leader arranges a modus vivendi with the couple; the Radners agree to voice complaints directly to him instead of to the police, and he agrees to take any complaints seriously. When no one answers the frat’s phone during one noisy party-night, however, Mac calls the police anyway. Feeling betrayed, the frat boys retaliate. The feud escalates. Given the actors and the premise, you probably have a pretty good idea of what sort of comedy this is. You’re right. The pervasive potty humor is tiresome rather than offensive, but I have to assume it resonates with much of the intended audience. There are some genuinely funny moments. Nonetheless, on balance I’m not the right viewer for this movie. You might not be either.

Neighbors (1981) – Starring Dan Aykroyd and John Belushi in their heyday, this film was not very well received at the time of its release. Yet it has aged well. Off-beat, bizarre, and very 1980s, the whole thing has an undertow of appeal. Belushi is a quiet suburbanite living with his family at the isolated end of a cul-de-sac that backs up to a swamp. The new neighbors (Aykroyd and Cathy Moriarty) add a new dimension to the word eccentric. Their weird and seemingly dangerous behavior evokes paranoia in Belushi. They also apparently have an open relationship which offers a challenge and temptation to Belushi and his lifestyle. If you like your films a little bit odd, this qualifies. I definitely like it more now than on first viewing in 1981.

** ** ** ** **

Visioneers (2008) – In a surreal dystopia, George Washington Winsterhammerman (Zach Galifianakis) is a “tunt” (drone) in the conglomerate Jeffers Corporation, which has risen to economic dominance through the shallow philosophy of Mr. Jeffers. Even drones in this productivity-minded future live in large beautiful homes, drive nice cars, and have no shortage of material goods. Nonetheless, their jobs and lifestyles are so dehumanizing that many literally explode. Efforts by the corporation and the government to combat the epidemic of explosions only worsen matters.

The film has its moments, but even as a comedic premise the notion that prosperity itself is dehumanizing is a little specious. Visioneers was made before the Crash of 2008 after which many folks would have risked explosion to be secure in prosperity. At any level of wealth, life is as shallow as one chooses to make it. However, the work environment depicted in Visioneers truly is dreadful and vision is precisely what Mr. Jeffers lacks.

Daisies (1966) – Directed by Věra Chytilová, this surreal Czech film was banned in its own country until 1975. Two young women, both named Marie, apparently decide that the only reasonable way to live in a corrupt world is to revel in the corruption. Whether this is shallow, deep, or somehow both is hard to say. They indulge their appetites and play pranks. They destroy a room where a sumptuous feast is laid out. They survive a dunking but shouldn’t have played with the chandelier. Strange, but intriguing.

** ** ** ** **

Divergent (2014) – Yes, it’s another dystopia based on another YA novel series in which another teenage girl is the hope for the future. Chicago is walled off from the outside world and is run by five factions, the members of each cultivating one of five virtues. 16-year-olds are tested to reveal their biological predisposition toward one of the five. Those who exhibit multiple predispositions are called “divergent” and are outcasts. In the movie it is not clear whether the wall is to keep the Chicagoans in or others out. Is Chicago a safe place or a prison? From the books by Veronica Roth (and presumably from the upcoming movie sequels) we learn that a number of cities have been sealed off with the plan of undoing a genetic engineering program that went wrong. Undoing the program requires creating divergents, i.e. normal human beings, not eliminating them as the factions are doing. Beatrice (Shailene Woodley) is a divergent; this fact shows up in her test, but her examiner is a rebel who falsifies the result to protect her. Beatrice joins the Dauntless faction, which positions her to resist a power grab by the Erudite faction, who would (among other evils) hunt down the divergents.

The movie is not actually terrible, but if you’re going to pick just one dystopia with a rebellious teen, stick with The Hunger Games.

Untamed Youth (1957) – There is no doubt who is a prisoner in this teen exploitation flick. In a rural area, Judge Cecelia Steele is secretly married to agricultural magnate Russ Tropp. She ensures he gets cheap agricultural labor by convicting teens and passing travelers of minor offenses and then sentencing them to work on Tropp’s farm. Penny (Mamie Van Doren) and Jane (Lori Nelson) are convicted of skinny-dipping and hitchhiking, which puts a crimp in their plan to enter show business. They are sent to the farm. The judge’s son opts to work at the farm as a supervisor subordinate to Tropp, but doesn’t like what he sees – except for Jane. He likes her. He and Jane hope to overthrow the corrupt system, but will the judge side with sonny or hubby? Penny meanwhile sings songs and looks busty. Untamed Youth is trash, and I enjoyed it thoroughly.

** ** ** ** **

Horns (2013) – Though listed on IMDB as 2013, Horns was released both to theaters and pay-per-view only last month. Ig (Daniel Radcliffe) is falsely accused of having murdered his girlfriend Merrin (Juno Temple) whom he has loved since childhood. We see their relationship in flashbacks. Nearly the whole town thinks he did it despite the insufficiency of evidence to charge him with the crime. To his own dismay, Ig starts to grow horns. They have the effect of causing people to tell him their darkest thoughts; people also do what Ig tells them. They somehow forget the horns when they look away and forget what they said and did while under their influence. He uses this ability to discover what really happened that night. I don’t normally like movies with supernatural elements, but this one was odd enough to be interesting.

D.O.A. (1988) – This is a remake of a 1950 noir starring Edmond O’Brien. The original isn’t bad, but in this case I like the remake better. Dexter (Dennis Quaid) is an English professor whose wife is divorcing him. After a night of far too much alcohol, he wakes up in the dorm room of a young co-ed (Meg Ryan). He sneaks out but feels worse than merely hung over, so he stops by the hospital and discovers he has been irreversibly poisoned. On top of this, he is falsely accused of having murdered his wife. He is then accused of other murders. He has little time to discover the truth, so he snares Meg Ryan and retraces his steps from the night he was poisoned. Surprisingly good.

** ** ** ** **

He Was a Quiet Man (2007) – The title comes from the litany of comments we always seem to hear about a multiple killer. You know them from journalists’ interviews with neighbors and co-workers. We all do. “He was a quiet man. Very polite. He seemed so nice. He always said ‘good morning’ to me. A bit of a loner.” And so on. Those words describe Bob Maconel, perfectly played by Christian Slater. Bob is an office worker with a dreary job and every reason to hate his co-workers and immediate superiors. He is also schizophrenic and has conversations with his goldfish – they answer back. Day after day he loads and unloads his gun at his cubicle, waiting for the moment and the courage to kill his co-workers and himself; he exempts Vanessa from his intended targets because she has a nice smile. At the end of one day exceptionally full of degrading treatment, he appears ready to follow through as he loads his revolver. He drops a bullet and, as he reaches down for it, shots are fired. Bodies drop to the floor. Another worker has gone postal first. Vanessa is among the shot, but is still alive. Bob empathizes with the shooter, of course, and intervenes only because the fellow is about to finish off Vanessa. Bob kills the shooter and finds that he is a hero instead of the dead villain he expected to be. When he visits Vanessa in the hospital, though, he finds that she has been left quadriplegic. She asks him to end her life. Bob has to decide how to handle her request and his new notoriety. This is a twisted tale and all too credible. Thumbs up.

Heathers (1988) – He Was a Quiet Man inevitably reminded me of Heathers, a dark teen comedy starring a much younger Christian Slater and Winona Ryder. Slater plays J.D., unsubtle initials even though this slang for juvenile delinquent was 20 years out of date by 1988. J.D. espouses a nihilistic might-is-right philosophy and assists the rise of Veronica (Ryder) in high school society by killing off the cooler kids. He makes the murders look like suicides. J.D.’s inclinations run in the family: his father owns a demolition company, and it is strongly implied that years earlier he arranged a fatal accident on a job site for J.D.’s mother. Veronica eventually has second thoughts about murder for social advancement and breaks with J.D., but this just puts her in his sights as he plans to blow up the high school. Also thumbs up: wicked, funny, and classic 80s.

** ** ** ** **

Daisies

Monday, November 3, 2014

Kinks in the Jinx

The magazines and periodicals in a waiting area instantly tell you something about the person you’re waiting to see. The waiting room of my previous dentist (he retired) was heavy on news magazines: The Economist, Newsweek, US News & World Report, and so on. The current one has a mix of lifestyle and science magazines: Esquire, Travel, Scientific American, et al. My lawyer’s waiting area tilts toward aviation, e.g. Plane & Pilot. The barber shop I’ve patronized for decades offers local newspapers, Car & Driver, and Sports Illustrated.

I normally choose a newspaper while waiting for the barber, but while I awaited my turn a couple of days ago I couldn’t help noticing the Sports Illustrated cover. No, it wasn’t the issue of which you are thinking. It was the recent one with relief pitcher Greg Holland of the Kansas City Royals. The Royals lost the World Series after that issue. Though it is well known to dedicated sports fans, I became aware of the so-called Sports Illustrated Cover Jinx only a few months ago, when it was mentioned in a brief news report about the magazine’s 60th anniversary. In the very first issue, dated August 16, 1954, Milwaukee Braves third baseman Eddie Mathews appeared on the cover. The Braves’ winning streak immediately ended and Mathews broke his hand.

While many people and teams since 1954 have suffered no ill effects after appearing on the cover – at least not within a reasonable time period – the list of those who have is long and eerie. Appearances have preceded losing streaks by golfers, skiers, quarterbacks, boxers, tennis players, and others; there have also been fatalities soon after appearances, such as Laurence Owen, who died with the rest of the US figure skating team in a 1961 plane crash, and several race car drivers, Dale Earnhardt in 2000 among them. Is this putative jinx just a matter of readers cherry-picking bad events while ignoring good ones after cover appearances? Probably there is some of that. Yet there is a pattern of lower performance by objective standards (fewer points, slower times, or whatever) after appearances that is statistically significant. It’s not a reliable predictor in any one case, but it’s enough to alter the bets of professional sports gamblers. There is an explanation for this that has nothing to do with jinxes.

Back in 1886 Sir Francis Galton, a cousin to Charles Darwin, published the dryly titled article Regression Towards Mediocrity in Hereditary Stature. He demonstrated that the children of tall parents are likely to be shorter than their parents – still taller than the general population, but shorter than their parents. This is not an intuitive result, but it makes sense if one thinks about it. Height and other physical traits typically fall on a bell curve, with most people clustered around average. Numerous genetic and environmental factors determine height, so the mix that a child gets from his parents just by the odds should be closer to the mean than toward the tails of the curve. Galton’s “regression toward the mean” occurs not just in height but in activities including investing, baking, and sports. Take a hypothetical golfer who averages a score of 70: her performances also fall on a bell curve. On most days she is within a few points of 70, but occasionally she’ll have a bad day of 80 or a really good day of 63. In either case, odds are her next day will be closer to 70. But when will she appear on the cover of Sports Illustrated? After the 63 – more likely after a streak of low 60s, which is also a statistically expected occurrence. Naturally she is likely to do worse after the appearance – to regress toward the mean. It makes sense to bet accordingly.
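Regression toward the mean is easy to see in a quick simulation. The sketch below is my own illustration of the golfer example: the mean of 70, the spread of a few strokes, and the “65 or better earns a cover” cutoff are all assumptions for the sake of the demonstration, not real data.

```python
import random

random.seed(42)

# Hypothetical golfer whose scores fall on a bell curve around 70.
# The mean, the spread, and the "65 or better earns a cover" cutoff
# are all illustrative assumptions.
MEAN, SD, ROUNDS = 70.0, 3.0, 100_000

scores = [random.gauss(MEAN, SD) for _ in range(ROUNDS)]

# Pair each cover-worthy round with the round that follows it.
pairs = [(scores[i], scores[i + 1])
         for i in range(ROUNDS - 1) if scores[i] <= 65]

avg_cover = sum(s for s, _ in pairs) / len(pairs)
avg_next = sum(n for _, n in pairs) / len(pairs)

print(f"average cover-worthy round: {avg_cover:.1f}")  # well below 70
print(f"average following round:    {avg_next:.1f}")   # back near 70
```

No jinx anywhere in the code, yet the round after a cover-worthy performance is reliably worse: the great round was an outlier, and the next draw simply comes from the same old bell curve.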

Maybe there is hope for the New York Jets this season yet.

Rory Gallagher – Jinxed

Thursday, October 30, 2014

Cold Comfort

Ebola is far from the first disease to be politicized. Venereal diseases in particular are tailor-made for political grandstanding, which at least since the middle of the 19th century has been one of the infections’ nastier side effects. My degree is in history and classical humanities, not medicine, so I have no expertise and therefore no opinion in the current debate about what quarantine standards should or should not exist for those in contact with Ebola patients – other than the opinion that many of those who do have opposing opinions on the subject are, as in other politicized matters, ungenerous in describing each other.  While I’m aware that those who have expertise can and do make mistakes, at least their mistakes will be better informed than mine. One hopes they’ll be fewer, too.

However, the mentions of quarantine touched something in a dusty corner of my memory, so I rummaged around there until I found it. Some readers might be familiar with former NASA scientist Randall Munroe. On his website What If Munroe gives serious scientific answers to even the most absurd questions. Examples: How long could I swim in a pool of spent uranium fuel rods without it being fatal? (Answer: As long as you can swim without tiring and drowning. Water is great at absorbing radiation, which is why spent fuel rods are stored in deep pools of water. So, unless you dive down right next to the rods, you’ll be fine; near the surface you’ll actually be partially protected from normal background radiation and so will receive a lower dose than if standing in open air. Munroe notes that a bigger risk is getting shot by a security guard.) What if everyone in the world aimed a hand-held laser pointer at the moon? (Answer: nothing visible.) He backs up the answers with appropriate stats. One question, dating back before the present controversy, involved quarantines.

One reader asked, “If everybody on the planet stayed away from each other for a couple of weeks, wouldn’t the common cold be wiped out?” Maybe. And maybe a lot of other diseases too. Then again, Munroe tells us, maybe not. Diseases rely on the chain of infection. If, on average, each person with a cold does not infect at least one other person, that particular rhinovirus will fade away and vanish from the earth. Unlike some disease germs (notoriously chickenpox) which linger in a recovered person, rhinoviruses are completely eliminated by healthy immune systems within two weeks. Ah, you caught that qualifier. That’s where the “maybe not” comes in. We don’t all have healthy immune systems. It is almost certain that cold viruses would survive two weeks (or much longer) in some people; from that small pool there is every likelihood that colds would spread out again. Then there are the practical problems of a universal quarantine. Some people absolutely must go to work – and interact with others while there – in order to keep modern civilization from collapse. How long would the electric power grid, for example, last without intervention? I suppose essential workers in principle could isolate themselves by wearing hazmat suits, but I wouldn’t count on compliance. The lines at the grocery store checkout counters at the end of the two weeks are not pretty to contemplate either.
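The chain-of-infection logic can be sketched as a simple branching process. The toy model below is my own illustration (not Munroe’s math): every number in it – the contacts per case, the initial caseload, the cutoffs – is an arbitrary assumption chosen just to show the threshold at work.

```python
import random

random.seed(7)

def chain_breaks(r, initial_cases=100, contacts=10, max_gen=200):
    """Toy branching-process model of the chain of infection.

    Each case meets `contacts` people and infects each one with
    probability r/contacts, so the average case passes the cold
    along to r others. All the numbers here are illustrative.
    """
    cases = initial_cases
    for _ in range(max_gen):
        if cases == 0:
            return True       # the chain of infection is broken
        if cases > 100_000:
            return False      # clearly spreading; stop early
        cases = sum(1 for _ in range(cases * contacts)
                    if random.random() < r / contacts)
    return cases == 0

print(chain_breaks(0.8))  # under one new case per case: fades away
print(chain_breaks(2.5))  # over one new case per case: takes off
```

The simulation mirrors the argument in the paragraph: when the average case infects fewer than one other person, the virus vanishes on its own; push the average above one – say, because a pool of people harbors it longer than two weeks – and it spreads right back out.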

On the other hand, for reasons having nothing to do with disease control, the proposal to avoid other people for two weeks has a decidedly pleasant sound to it. Maybe we’d all be a little less cranky with each other after such a two week vacation.


Thursday, October 23, 2014

This Halloween I’m Dressing as a Stockbroker


In a typical October, the trees shed their leaves and stocks shed their value. This one is no exception. During Halloween month, the scariest place is not the cemetery but Wall Street – not so much for the traders, who (unless trading on their own accounts) make money regardless of what the market does, but for the rest of us.

Very few “rules of thumb” for investors stand the test of time. There is a random element to price movements that baffles even expert analysts. For two decades Alan Greenspan was the most respected central banker in the world, yet in his latest book he admits he totally misread the movements of asset prices while in office. What hope is there for the rest of us? Strangely enough, one idiotically simple rule of thumb has been a winner for centuries: “Sell in May, go away.” The months November through April really have had better returns than May through October; October in particular repeatedly has been nasty indeed. A hypothetical investor who put $10,000 in stocks reflecting the S&P in 1958 (a recession year) and then followed the sell-May-buy-November strategy now would have a portfolio of $544,323. The reverse strategy (buy-May-sell-November) would leave the investor in 2014 with $9,728, a $272 loss. Professional investment advisers tend to pooh-pooh the sell-May strategy because it seems to make no sense. To them it has the smell of superstition: an unreliable interpretation of a statistical oddity. Yet it is hard to dismiss the persistence of the pattern. Besides, it does make sense in human psychological terms, if not in terms of the underlying economic realities. Octobers are bad for the market because Octobers historically are bad for the market. It is self-reinforcing. Investors, aka humans, are jittery creatures when they suspect bears are about, whether in the woods or in the market; they know October is an especially ursine-infested month, and they flee on hearing the first scary rustle or grunt.
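For the curious, the dollar figures above imply a compound annual growth rate that is easy to back out; the figures are from the passage, and the only thing added here is the arithmetic.

```python
# Back out the compound annual growth rate implied by the figures in
# the passage: $10,000 in 1958 growing to $544,323 by 2014.
start, end = 10_000, 544_323
years = 2014 - 1958  # 56 years

cagr = (end / start) ** (1 / years) - 1
print(f"implied annual return: {cagr:.1%}")  # roughly 7.4% per year
```

Not bad at all for a strategy that has you sitting in cash half of every year.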

Beyond this simple calendar trick, investment strategies by individuals and professionals alike are surprisingly useless – often worse than useless. Several years ago Terry Odean, a professor of finance at Berkeley, analyzed 163,000 trades in 10,000 individual brokerage accounts. Clearly the account-holders expected to benefit from the trades, i.e. to do better than if they simply had held the stocks they sold. Yet, overall, the stocks they sold did better than the stocks they bought by an average of 3.2 percentage points. (Why? The human tendency to sell stocks that are up from their purchase price and to keep those that are down until their prices recover – “loss aversion” – meant they dumped their strongest stocks and ended with a weaker portfolio.) Odean and his colleague Brad Barber published an oft-quoted paper called Trading Is Hazardous to Your Wealth, which demonstrated that active traders on average do worse than less active ones. Individuals who simply hold a diversity of stocks can expect to track the market. Beating the market rate of return is a matter of luck, and luck, as we know, is fickle.

“Experts” are scarcely better with their stock picks. When Nobel-winning economist Daniel Kahneman was preparing a talk for investment advisers, he received a wealth of data on their performance from the (well-known) firm that employed them. The advisers’ bonuses were based on the performance of their investment picks, so they had every incentive to choose well. His analysis of the data: “The results resembled what you would expect in a dice-rolling contest … the firm was rewarding luck as though it was skill.” To be sure, there really is a substantial amount of education and skill required to count as a financial expert. Few people understand how some of the more arcane derivatives really work. There is a huge amount of information (product lines, industry trends, balance sheets, corporate culture, etc.) to be evaluated when trying to make an informed judgment about a particular company. Yet the informed judgments prove to be as hit-and-miss as the uninformed, for there are always more factors than what you see. As for whether the company is overvalued or undervalued (whether the stock will fall or rise), “Traders apparently lack the skill to answer this crucial question, but they appear to be ignorant of their ignorance.”

Well, as Socrates noted some time ago, this is not an uncommon human condition. I don’t pretend to be immune to it. I’ve never followed the sell-in-May strategy, for one thing, though I’d be far, far better off if I had. My informed picks (not only in finances, alas) usually have been worse than my random ones. Yet it’s hard not to overthink the next decision anyway, even knowing that it won’t help. That, I suppose, is human, too.


1929, the archetypical Crash. Mr. Forbes' advice to buy was off by a few years: 1932 was market bottom. 1933 was the best year ever for stocks (yet to be surpassed) even though the Depression would last until 1940.