Saturday, July 30, 2011

A Sade Day

As noted in the last post, many modern protagonists in film and literature display, at best, a dubious moral sense. These characters are not even anti-heroes in the 60s/70s fashion. Dirty Harry had a firm moral code, after all; it just wasn’t consistent with the Bill of Rights. So did Lee Marvin as Walker in Point Blank (1967); his was just one consistent with being a hit man. Bronson as Kersey in Death Wish was on a mission of public good by his lights. Contrast them with the Dude in The Big Lebowski, Dexter Morgan in Dexter, or Richard B. Riddick from The Chronicles of Riddick.

But what about villains? Have they evolved, too? Not so much. We have grown cynical about heroes, but have we kept a simple belief in villains? It seems so. True enough, we frequently get a back-story nowadays to explain the villainy, or to put it in context, but this was not actually unusual in the past either, e.g. Cagney in The Public Enemy (1931). Even in Frankenstein (1931), we learn enough to sympathize with the alienation of the monster.

In general the same types of villains recur as ever. Serleena from Men in Black II is no less evil and ambitious than Ming the Merciless from Flash Gordon. Other common types: the Monster, sometimes with a mind (Dracula), sometimes without much of one (Jason); the Sociopath (Kalifornia); the Psychotic (Psycho); the seeker of unwarranted revenge (Rebecca De Mornay in The Hand That Rocks the Cradle); the megalomaniac (Stewie in Family Guy); and, perhaps the most interesting, the amoral philosopher, such as the hunter of human game in The Most Dangerous Game (1932), or the nihilist anarchist Joker portrayed chillingly by Heath Ledger in The Dark Knight (2008).

I venture that all the philosopher villains of fiction owe something to the Marquis de Sade, the crazy nephew of the Enlightenment. I recommend de Sade’s Justine and Juliette, by the way, but not his “masterpiece” The 120 Days of Sodom unless you truly want to test your commitment to free expression. The Marquis deserves a discussion all his own, however, so we’ll leave him for another time.

I’ve always found credible all-too-human villains to be the scariest, and therefore they are my favorite. A prime example is Frank Booth played by Dennis Hopper in David Lynch’s off-beat 1986 movie Blue Velvet. Frank is a dangerous and volatile (and believable) character with such severe psycho-sexual issues that he has to drive them out of his mind with drugs, violence, and sadistic sex. It is not enough for him to dominate his victims; he must utterly degrade them, too. By his abuse, he so warps one victim (played by Isabella Rossellini) that she orders a young man at knifepoint to hit her, apparently because at least this time it will be her choice. Afterward, the young man (Kyle MacLachlan) is shaken to have found a part of himself that was OK with it.

It’s always satisfying to watch villains get their comeuppance in the end. It is wicked fun when they don’t, too. Perhaps that explains why old-fashioned villainy has survived on page and screen better than heroism; either ending works for us. Another reason may be that we like to contrast ourselves with these characters, as in, “Hey, I may not be perfect, but at least I don’t hunt humans for sport.”



Sunday, July 24, 2011

Limited

Captain America is battling Harry Potter at the box office this weekend, and (surprisingly, to my mind) is holding his own. It is surprising because the Cap is a morally certain fellow in a morally uncertain time. After all, Superman has abandoned his US citizenship and, presumably, the American way; Batman’s obsessive behavior qualifies as a mental illness; last year’s most notable new screen hero was eleven-year-old Hit-Girl, who, if you noticed, remorselessly killed not just outright thugs but also people (like the trampy chick at Rasul’s apartment) who merely hung out with them; and both Supergirl and Spider-Man are set to be rebooted as much darker characters than on their last screen outings.

If there is a film “hero” with super abilities who seems most suited to 2011, it is Eddie Morra (played by Bradley Cooper) in the movie Limitless. Morra, suffering from writer’s block, swallows a new street drug that vastly enhances mental functioning, so that he not only completes his book in four days, but dominates Wall Street, fights off muggers with moves simply remembered from kung-fu movies, and snappily wins back the girl who previously had dumped him. There are side effects to the drug, however, including blackouts. During one blackout, he might have murdered a young woman in her hotel room – he can’t quite remember. Morra’s only real concern over this is whether or not he gets caught (he doesn’t). On the surface, Morra shows all the modern moral niceties. He never once says anything racist, sexist, or homophobic, and there is every reason to believe he files his taxes (whether or not he reports the cash he stole from his dealer’s apartment). Yet, there is no reason to trust this man. His only discernible fundamental personal objectives are of the most shallow and self-serving kind, even though he can quote you the works of every moral philosopher he has ever read.

How different from the 1940s, when ugly social attitudes were rampant in the US and were on display in the movies; the unthinking casual racism in The Palm Beach Story is just one example. Yet, whenever 1940s scriptwriters slowed down to think about something and then actually tried to make a moral point, the point they made almost always remains unexceptionable today. Their moral compasses were functioning and they didn’t second-guess them. Captain America certainly doesn’t. He knows what is right and what is wrong and what to do about it. There is every reason to trust this man. Apparently, this moral clarity has an appeal to modern audiences even if the setting has to be 70 years old for the story to work. To be sure, there are conflicted characters in 40s films, such as Rick in Casablanca, but Rick does the right thing in the end, doesn’t he? As for the real-life ugly side of the 1940s, it’s hard not to conclude from the film and literature of the era (and, of course, from those who experienced it) that most folks really knew better, but found it hard to break acquired habits.

We know better today, too. Whenever we slow down and think about it, we know that old-fashioned personal honor is a more important matter than it is our current habit of mind to acknowledge. We know that surface niceties are not enough. The Cap reminds us of this. Nonetheless, I fear Eddie still reminds us more of ourselves.

Tuesday, July 19, 2011

Calefied Canines

In case you didn’t know (and, unless you’re a member of the National Sausage and Hot Dog Council, why would you?), July is National Hot Dog Month. Yeah, the whole month. According to whom? Well, I suppose the National Sausage and Hot Dog Council. July 23 is a double whammy by being National Hot Dog Day.

In the year I was born, had you asked anyone to name the definitive American fast food, almost certainly the answer would have been “hot dog.” By the late 60s, thanks largely to McDonald’s (though White Castle, founded in 1921, preceded it), the answer would have been “hamburger.” Perhaps it still is, but the globalization of fast foods has diminished the American stamp, while, within the US, pizza, chicken, and tacos give the burger a tough race – even sushi sales are nothing at which to sneeze. Yet, while hot dog stands no longer are as ubiquitous as they were in the 50s, the hot dog hangs on.

I like hot dogs plain. I like them with mustard, with relish, with chili, with jalapeños, with onions, with bacon, with cheddar, and with just about anything else you can put on them. I like them grilled, baked, corned, deep fried, microwaved, or boiled in beer. I like them made of pork, beef, turkey, or buffalo meat.

Like so many “American” foods, the hot dog was not invented in the United States. Pork sausages on buns (they sound like hot dogs to me) were given out to the folk at the coronation of Maximilian II in 1564, and the treat was not invented for the occasion. We don’t know when they first turned up in the US, but by 1870 Charles Feltman was selling them in Coney Island, NY.

We don’t know when the term “hot dog” originated either. The common tale of the term originating at a Giants baseball game in 1900 as a shortened form of “Dachshund sandwich” is apocryphal. “Hot dog” appears in newspapers a decade earlier than that. An article about them in an 1892 edition of the Paterson Daily Press, for example, notes, “The ‘hot dog’ was quickly inserted in a gash in a roll, a dash of mustard also splashed on to the ‘dog’ with a piece of flat whittled stick, and the order was fulfilled.” Even earlier in the 19th century, “dog” was slang for a sausage, though we only can guess at the reason for that; it was reasonable enough to call a hot one a hot dog.

Perhaps this is a good place to debunk the frequently heard myth of the “Grade D but Edible” hot dogs supposedly served at college cafeterias around the nation. The USDA does not give letter grades to meat, so no such product can exist. Letter grades are for milk.

The food nannies are quick to tell us that hot dogs are not healthy. I don’t think this is a big shocker. However, unless you’re really going to replace that dog with a carrot and alfalfa sprouts, it probably isn’t much worse than what you’d eat instead. So, enjoy your hot dog. Perhaps it’s best, though, not to challenge Sonya Thomas (aka the Black Widow) for the record.


Wednesday, July 13, 2011

Lunar Blues

Next week is yet another anniversary of the 1969 moon landing. Since 42 is not a particularly noteworthy number (pace Douglas Adams), it’s likely to pass with little more than a brief “on this day in 1969” mention at the end of a few nightly newscasts.

I remember the 1960s enthusiasm for all things space. Even data from Mariner 4 were broadcast live on primetime TV: the pictures of Mars slowly assembled line by line and we actually watched this. We were excited most by the manned flights. We (the general public) knew the names of the astronauts and the flight plan of every mission. It all culminated with Apollo 11. I watched the Eagle land live on television, as did a billion other people. Then there was an air of “OK, been there, done that.” Interest in manned flight ebbed and never really returned. The Apollo moon program was cut short so suddenly and casually that there were pieces left over.

Since then, robotic craft have continued to conduct interesting explorations, but humans have confined their ambitions to earth orbit. Every now and then, politicians in some spacefaring nation announce an intent to return to the moon, or to go to Mars, or to visit an asteroid. A little (by government standards) money is spent on the preliminary stages of the project, but then the whole thing is quietly shelved as the costs escalate. A recent leader in The Economist took note of the lack of interest: "It is quite conceivable that 36,000km [geosynchronous orbit] will prove to be the limit of human ambition... It is likely that the Space Age is over." By "Space Age" the author meant human flight beyond earth orbit.

This may be wrong in the very long run (Buck Rogers does his thing in the 25th century after all), but at the moment it is true that taxpayers nowhere are in a mood to finance Flash or Buck. They aren’t likely to change their minds anytime soon. The dream never quite dies though.

My own flight to the moon was in 1958. OK, it was a simulated flight in the Tomorrowland section of Disneyland in Anaheim, California. A quarter century later I took the very same rocket ship to Mars. At least I didn’t buy one of the tickets Pan Am sold (seriously) in the 1960s for passenger moon flights, redeemable as soon as seats were available. Pan Am went bankrupt in 1991. I don’t think the cost of its moon rockets was a factor.

Disney’s illusion ride to the moon was not the first by a long shot. An elaborate one was at the Pan American Exposition in Buffalo in 1901, two years before the Wright brothers flew 120 feet (36.5 m). There were seats for 30 passengers aboard the spaceship Luna. Luna rocked, shuddered, flashed and roared. Buffalo and then the whole earth appeared to recede below. A papier-mâché moon loomed overhead. After more light and sound effects, the ship landed on the moon. The passengers disembarked and walked through tunnels and grottoes as midgets in moonmen costumes romped around them. They were greeted by the Man in the Moon himself, who sat on his throne surrounded by luscious Moon Maidens. The passengers didn’t then fly back to earth on Luna though. They exited into a gift shop, which is another way of returning to earth. Frankly, it sounds much more fun than the Disney ride. Nowadays the fair is usually mentioned, when at all, for a grimmer reason: it was where President McKinley was shot.

Come to think of it, maybe we should return to the moon sooner rather than later – just in case there are any luscious Moon Maidens lurking in lunar tunnels and grottoes. Beats bringing back moon rocks.


Thursday, July 7, 2011

Don't Stick Beans Up Your Nose

Below is an abstract of a study by Jessop & Wade at the Department of Psychology, University of Sussex, Falmer, UK. Translated from professor-ese, it says that people drink more after seeing ads that warn them of the dangers of drinking:

OBJECTIVES: The aim of the current research was to test the terror management theory-derived hypotheses that exposure to information about the mortality-related risks of binge drinking would make mortality salient (Study 1) and, hence, exacerbate willingness to binge drink amongst those who perceive this behaviour to benefit self-esteem (Study 2). STUDY 1: Participants (N=97) were allocated to one of five experimental conditions. Results confirmed that exposure to information about the mortality-related risks of binge drinking made mortality salient. STUDY 2: Participants (N=296) were allocated to one of three experimental conditions. Exposure to mortality-related information about the risks of binge drinking was found to result in greater willingness to binge drink among (i) binge drinkers and (ii) non-binge drinkers who perceived this behaviour to benefit self-esteem. There was no evidence, however, that exposure to such information influenced binge drinking over the following week. CONCLUSIONS: Research findings suggest that mortality-related health promotion campaigns might inadvertently make mortality salient, and hence precipitate the very behaviours which they aim to deter among some recipients.

The effect is not limited to booze. The popular school program DARE, for example, which warns kids about alcohol and drugs, appears to be counterproductive when it has any effect at all (see http://www.alcoholfacts.org/DARE.html). So too with cigarettes. Despite ever more graphic anti-tobacco ads, draconian taxes on cigarettes, and severe social ostracism of smokers, the percentage of smokers in the US actually has been increasing since it bottomed out a decade ago.

So why do we badger each other beyond the point where it is any use? If it’s not about the effect on the badgered, it must be about the effect on the badgerers. We feel better about ourselves when we think we are doing something to “help” those poor people. (And doesn’t it feel great not to be one of them?) We keep waging a hugely destructive and obviously hopeless drug war for the same reason. We feel better doing the wrong thing than nothing. Maybe it shouldn't be about us.

Next in importance to knowing when to act is knowing when to stop.

Saturday, July 2, 2011

Counter Punch

I never have seen worse reviews for a movie from professional critics than for Sucker Punch (2011): "Two hours of solitary confinement, which feels more like dog hours" (Michael Phillips, Chicago Tribune); "Movie lives up to its name" (A.O. Scott, New York Times); "What happens when a studio gives carte blanche to a filmmaker who has absolutely nothing original or even coherent to say" (Lou Lumenick, New York Post); and so on. Many of the reviewers were downright angry. The reviews were so bad that they intrigued me.

Sliding the DVD into the player last night, I was expecting unparalleled awfulness. I didn’t much like director Zack Snyder’s earlier major films (300 and Watchmen) even though these got much better reviews, so Sucker Punch, I feared, might send me screaming from the room in agony because fumbling for the remote would take too long. My only reservation was from the smattering of critics who liked it, including Betsy Sharkey of The Los Angeles Times, who called it a "wonderfully wild provocation — an imperfect, overlong, intemperate and utterly absorbing romp through the id that I wouldn't have missed for the world." There are always a few eccentric responses to anything, however, so I dismissed this minority view.

I was… well… sucker punched. Apparently, I didn’t see the same Sucker Punch as the majority of the critics. I saw the same movie Betsy did.

Yes, as most critics complain, this surreal and beautifully shot film shamelessly exploits adolescent sexual fantasies. True, the use of video game metaphors is far from original (e.g. Scott Pilgrim). Yet, lots of highly regarded movies exploit adolescent sexual fantasies – maybe most. Despite the strange neo-Victorianism that is slowly emerging in the current century, that is not in and of itself a bad thing. As for originality, how many truly new ideas on stage or in film are there? What matters is the way an idea is handled, whether new or old. Some critics also complained that the plot doesn’t make sense. But it does. The film very much has its own logic.

Basic plot: wrongfully committed to a high security asylum by a perverse, evil, and corrupt stepfather, the young "Baby Doll" seeks her freedom while indulging in two layers of fantasy. Minor **SPOILER** in response to critics who complained they saw the ending coming: the sucker punch of the title is not what happens to Baby Doll. It is exactly what the movie tells you it will be at the beginning, which is the question of whose story it is.

The user reviews of this film on Rotten Tomatoes show that only 22% like this movie. User reviews on Amazon tilt the other way, but not overwhelmingly. Accordingly, I won’t give a simple positive recommendation to Sucker Punch. Clearly, a lot of viewers – maybe a majority – will hate it. Truly hate it. There is a sizable minority, though, who won’t. Think of this movie as Shutter Island (2010) meets the exploitation flick Teenage Doll (1957) meets the Nintendo game The Legend of Zelda: Ocarina of Time. If you like all three, and if blending them together doesn’t seem too outlandish, you might join Betsy and me on the “thumbs up” team.