Thursday, July 20, 2017

Not to Reason Why

Action movies are not about character development or about reflecting the human condition. They are about chases and crashes and fists and flashing weapons and narrow escapes and razzle dazzle. A handful of exceptional films manage to combine the physical elements with the deeper stuff, but audiences neither demand nor expect it. Action movies are escapist fare. A sketchily drawn but likable character or two and some bare excuse for all the swashes and bucklers to follow are enough. In the past week I’ve sampled three of this year’s action hits – one in the theater and two on DVD. One can’t fault the action in any of them, but the excuses are bare indeed.

Baby Driver
The fantasy lives of adolescent boys and young men are intimately connected with popular music. Remarkable feats of derring-do go on in their heads during the guitar and drum solos. (I wouldn’t presume to guess if or how what goes on in young women’s heads differs.) Filmmakers know this, largely from firsthand experience, which accounts not just for a lot of soundtrack choices but also for the quirks of many film characters – the old Iron Eagle movies and the recent Guardians of the Galaxy flicks come to mind. It’s a simple way to connect with the audience. In Baby Driver, Baby (Ansel Elgort) has tinnitus (ringing in the ears) due to a childhood accident, and he drowns it out with music (leaning heavily to rock) pretty much constantly – always when driving.

Baby is a wheel man. He became one as a consequence of the youthful mistake of stealing Doc’s car. Doc (Kevin Spacey) turned out to be a broodingly ruthless crime boss who saw Baby’s potential; as payback, Doc set him to work as an expert getaway driver in elaborate heists. Aside from being a criminal, Baby is a pleasant enough sort who looks after his aged disabled friend Joseph (C.J. Jones). Baby meets the waitress Debora (Lily James), who is pretty and sweet and…well… that’s about it. For no discernible reason she agrees to leave town with Baby, about whom she knows nothing, for the open road. Baby is cute, I suppose, but surely he is not the first cute guy Debora ever met. So why? Because the script says so. Besides, it fits the adolescent fantasy. The secondary love story between the crooks Buddy (Jon Hamm) and Darling (Eiza González) actually is more comprehensible if even less wise: the two simply enjoy the thrill of sharing danger and violence. The course of true love never did run smooth, however, and the plans of Baby and Debora are put in jeopardy when Baby’s last job goes terribly wrong.

Taken purely as the escapist fare that it is, the movie is fun. It is well shot and the stunt driving is excellent. Don’t expect anything more from it, though.

** **

John Wick: Chapter 2
For those who thought John Wick might have been a good movie if only there had been more violence (the eponymous character kills a mere 84 people), John Wick: Chapter 2 sets out to rectify that.

The reader may recall that in the first movie retired hitman John Wick (Keanu Reeves) is upset when the son of a Russian mobster kills his dog and steals his car. So, he singlehandedly wipes out the mob. In John Wick: Chapter 2, which picks up immediately after the ending of the first movie, the Italian mobster Santino D'Antonio (Riccardo Scamarcio) calls in a marker and demands that John Wick assassinate his sister, the head of the family. When Wick refuses, Santino blows up his house with an RPG. Knowing what Wick did to the last mobster who annoyed him, why would Santino do such an amazingly stupid thing? Because the script calls for it.

Anyway, Wick first does the job for Santino because honor (!) requires it, but, as we all know, it is then bad news for Santino D'Antonio and for all the mercenaries seeking the seven million dollars Santino puts on John Wick’s head.

I’m not unaware of the tongue-in-cheek nature of this movie, but nonetheless to my taste it was a 30-round clip too far: numbing rather than escapist. My reaction is probably idiosyncratic, though, since my companions (both sexes represented) loved it.

** **

Kong: Skull Island
This is a movie I would have loved as a kid: monsters and more monsters with no irritating romantic subplot to distract from the (did I mention them?) monsters. There is not much waiting for them either. They show up in the first half hour.

The time is 1973. Landsat images reveal the existence of an island in the eye of a permanent storm that previously had shrouded it from the outside world. A scientific team headed by Bill Randa (John Goodman) investigates. Transportation is provided by a heavily-armed helicopter squadron withdrawn from Vietnam and commanded by Lieutenant Colonel Preston Packard (Samuel L. Jackson). They drop bombs all over the island in order to get seismic readings, which seriously angers the protector of the island and its indigenous people. You got it: the protector is King Kong. He rises up and swats every last chopper out of the sky.

Survivors of the crashes encounter the locals and a WW2 pilot (John C. Reilly) who was stranded on the island during the war. Conveniently, he can explain Kong’s role as protector against the really terrible monsters who live below the surface. Packard, however, is determined to kill Kong. Why? Because the script calls for it. One gathers he is angry that Vietnam ended without a victory for his side and now he at least wants to kill a big gorilla. Um… yeah.

Most of the cast is there to get eaten by monsters, but a few should be mentioned. The photographer Mason (Brie Larson) shows that, unlike in previous iterations, a beautiful blonde woman can be on hand without anybody at all being attracted to her – not even Kong. Jing Tian’s most significant scene is in the after-credits (yes, there is a not-so-secret ending) when she reveals that there are other monsters in the world. James Conrad (Tom Hiddleston) gets to play the competent mercenary. But it’s really not about the people. They are just there to run from (or foolishly try to kill) the monsters who are the real stars.

The movie is a fun romp and the fx are superb. If you are looking for anything other than an effects-packed action film, you won’t find it in the characters. There might be a metaphor or two, however, such as the imprudence of removing a monster who is keeping in check something worse. But primarily it’s about the chills and thrills, and it delivers enough of those.


Trailer: Baby Driver (2017)


Sunday, July 16, 2017

That’s All There Is

In my tween and teen years (1962-72) a regular guest on TV talk shows and variety shows was Peggy Lee. For most of that decade she was not a particularly welcome presence from my perspective on the youthful side of the Generation Gap. Born in 1920, Peggy was several years older than either of my parents. Her sound was very much my parents’ music and therefore something toward which I felt obligated to be (at best) indifferent. It wasn’t rock and roll. I knew nothing of her early work with the big bands of the 40s and scarcely anything of her career’s high-water mark in the 50s. Nor did I care to. The extraordinary deference with which she was introduced (always as Miss Peggy Lee) mystified me.

This changed in 1969 when Peggy recorded a haunting version of the Leiber and Stoller song Is That All There Is? She had called on a young Randy Newman, of all people, to rework the original arrangement into something more to her liking. It was to be her last hit single and her biggest since Fever in 1958. I was one of the many who loved the record, and I grudgingly allowed at the time that maybe I had been a little closed-minded about her other work, though I wasn’t yet ready to go out of my way to listen to any of it. (I had no idea I already was familiar with some of it from the soundtrack of the 1955 Disney movie Lady and the Tramp.)

It wasn’t until after college that I recognized – let myself recognize – just how good much of my parents’ popular music was. To be sure, I still enjoyed the usual Boomer fare of folk and rock from Dylan to Clapton, but against all expectations I also liked 40s big bands from Glenn Miller to Duke Ellington. Who’d have thought it? Not an earlier I. What caught my fancy in particular was the mix of big bands with female vocalists such as Helen Forrest, Kitty Kallen, Ella Mae Morse, and Peggy Lee. Vocals had changed over the previous decade thanks to good microphones and sound systems. Through the 1920s and into the 30s, it was important to belt out a song (à la Al Jolson and Sophie Tucker) so someone beyond the first row could hear you. With electronic amplification, this ceased to be a factor. By 1940, accordingly, much more subtlety and sophistication had entered popular recordings – more so than in most popular recordings of the 1950s.

Due to her straightforward early style, Peggy Lee is not at the top of my personal list of favorite 1940s-era songbirds, though she did numerous iconic numbers with Benny Goodman including Why Don’t You Do Right? (the Jessica Rabbit version is probably better known today) and a politically incorrect version of Let’s Do It. But she was definitely on the list. For a window into that era I picked up the biography Is That All There Is?: The Strange Life of Peggy Lee by James Gavin. As celebrity biographies go, this one is pretty well researched and written; it even comes with copious footnotes and an index.


The story of Norma Egstrom (aka Peggy Lee), like most success stories, is a combination of hard work and serendipity. Jamestown, North Dakota, is not the most likely place to start a showbiz career, but she made use of what was available and then traveled to find opportunities. Her break came in 1941 when she landed a job singing at the Buttery lounge in the Ambassador West hotel in Chicago. One night Benny Goodman was at a table. Helen Forrest had just quit on him, and he needed a female vocalist to fill in temporarily until he found a permanent replacement. Peggy’s temporary employment with Benny lasted until 1943.

The bio details her personal life, which was messy in the way we expect of celebrities: a string of marriages, affairs, and break-ups amid financial meltdowns and substance abuse. On top of all that were serious health problems including pneumonia that scarred her lungs. Yet, unlike most of her fellow 1940s songstresses, her career not only continued but flourished in the 1950s and included turns at acting, notably in Pete Kelly’s Blues (1955). She was always hands-on with musical arrangements. Peggy persisted when others didn’t. She sold out shows in Las Vegas in the 1970s, tried Broadway in the 80s, and sang from a wheelchair in the Waldorf’s Empire Room in the 90s – something unlikely to be emulated in the future by today’s pop divas. Peggy died in 2002.

Though I had bought the book mostly for insight into the big band years, the rest of it proved to be more instructive. Is that all there is? Yes. But maybe that’s enough.


Peggy Lee – Is That All There Is? (1969)
 

Tuesday, July 11, 2017

The Undiscovered Country

What is the sine qua non of being human rather than just another primate? Is it language? Art? Abstract thought? In the 1960s and 70s the cultural anthropologist Ernest Becker offered another answer, one that accompanies (and perhaps inspires) the cognitive ability to talk, sculpt, and contemplate. So far as we know, humans are the only earthly creatures aware of the inevitability of their own deaths. There is nothing new about this answer, but Becker believed we give it insufficient prominence, which itself is a revealing act of denial. Becker, whose mind was focused by his own terminal illness, told us that we spend most of our energies denying that terrible knowledge; in the process, we develop civilization, art, religion, and neuroses. His book The Denial of Death, written in 1973 as his own demise loomed at age 49, won a posthumous Pulitzer Prize in 1974.

I read Becker’s book several years ago. Last week I followed it up with The Worm at the Core: On the Role of Death in Life by Sheldon Solomon, Jeff Greenberg, and Tom Pyszczynski. The trio of Becker enthusiasts are experimental psychologists who since the mid-1980s have devised numerous tests of Becker’s assumptions and conclusions. The results strongly back Becker. Judges in Tucson, for example, typically set bail for prostitutes at $50; when reminded by a questionnaire of their own mortality, however, the average bail was $450. (The cases, unknown to the judges, were fake, so no ladies were over-penalized in the tests.) People become much more protective of group norms and values when reminded of death because identifying with a larger entity (country, ideology, legal system, sect, party, ethnicity, etc.) makes us feel part of something that needn’t perish, so we are harsher toward violators; judges are not immune to the tendency. Being protective of one’s own group typically means being less tolerant of others, so those reminded of death are more hostile to “outsiders” of any kind. It works in reverse, too. Canadian and Australian test participants who were assigned to read highly negative commentary on Canada and Australia afterward used many more death-related words on a word association test than did the control group; those who read positive commentary used fewer. People reminded of death smoke and drink more to get their minds off it – even when the reminder is a public service warning about the lethality of smoking and drinking. On the upside, people reminded of death also get more creative in hopes of leaving some legacy that will survive in some sense.

The legacy gambit doesn’t always succeed at cheering the creative artist. Woody Allen: “I don’t want to live on in my work. I want to live on in my apartment.” John Keats, whose poetry was not well appreciated during his lifetime, despairingly left instructions for his tombstone not to bear his name, but to read, “Here lies One whose Name was writ in Water.” Edgar Allan Poe at least achieved some recognition in his own time though one would be hard pressed to write something more expressive of mortality than The Conqueror Worm. Needless to say, both writers have me outclassed, but I can relate in principle. My efforts at fiction over the years have been desultory at best, but my most productive phase (two novellas and a couple dozen short stories) was in the two years following the loss of the last of my immediate family. It wasn’t a conscious attempt to leave something of myself behind, but the timing is hard to miss.

Solomon, Greenberg, and Pyszczynski acknowledge, of course, that other animals fear death from an immediate threat. “All mammals, including humans, experience terror. When an impala sees a lion about to pounce on her, the amygdala in her brain passes signals to her limbic system, triggering a fight, flight, or freezing response…And here’s the really tragic part of our condition: only we humans, due to our enlarged and sophisticated neocortex, can experience this terror in the absence of looming danger.” They designed their experiments to demonstrate just how many of our creative and destructive (including self-destructive) impulses derive from – or at least are heavily influenced by – an often unconscious fear of death.

Dealing with death has been a staple of human lore from the beginning. The oldest literature (as opposed to business contracts and tax lists) that still survives is the Epic of Gilgamesh, which is about Gilgamesh coming to terms with the death of his friend Enkidu. The ancients approached the matter of death in the same various ways we do today: some with religion, some through their children, some through their work, and some by repressing the whole subject while trying to think of something else. The ever-practical Epicureans argued that the experience of death is literally nothing and it is silly to worry about nothing. This is logical, but there are some subjects about which humans have a hard time being logical, and most people are not satisfied by this argument. Solomon, Greenberg, and Pyszczynski list the standard ways most people strove and still strive to transcend death: biosocial (having children or identifying with some nationality or ancestral line), theological (belief in a soul), creative (art or science that survives the artist/scientist), natural (identifying with all life), and experiential. I’ll let them explain themselves on that last one: “experiential transcendence is characterized by a sense of timelessness accompanied by a heightened sense of awe and wonder.” Some of my acid-head friends in college used to talk like that. I think the authors left out “acceptance with a cynical humor” such as we see in Poe, Camus, and modern-day celebrations of Halloween.

The authors wrap up by asking the reader to assess whether he or she handles thoughts of death in ways that are beneficial or harmful. “By asking and answering these questions, we can perhaps enhance our own enjoyment of life,” they say.

So is the book worth a read? Yes. Their experiments are interesting, though there is something of a “hammer in search of a nail” quality to them. If they had reminded those judges about sex before setting bail, would that have affected the outcome? Would it have affected subjects who afterward took word association tests? They didn’t run those experiments, so we don’t know, but my suspicion is yes. In short, I think the old Freudian Eros vs. Thanatos (love and death) dichotomy is closer to the whole truth. Nonetheless, I agree that we all too often try to banish the Thanatos side of that duality from our conscious thoughts, with results that are often unhealthy. We’re better off if we can learn to deal. So, on balance, Thumbs Up.



The Rolling Stones – Dancing with Mr. D

Tuesday, July 4, 2017

Something to Say

All mammals communicate by sound in some basic way; some of them – definitely including all primates – communicate in very complex ways. Nonetheless, language is different. It involves more than pointing and squealing with the meaning “danger over there!” It entails a level of abstraction and a contemplation of the nonfactual, e.g. “Go peek around that rock and let us know if there is a predator, prey, or something else on the other side.” We don’t know when humans first spoke a full-blown syntactic language, defined as words with discrete meanings strung together with a grammar to form a larger thought. It is certain, though, that no later than 60,000 years ago (maybe much earlier), they were bragging and gossiping and insulting each other as much as we do today.

Did they speak a single shared language at that time – or at least closely related ones? There is no way to know but there are reasons for supposing so. The entire population of modern humans (based on genetic studies indicating past bottlenecks) just before they radiated out from and across Africa was a few tens of thousands at most. Merely two or three thousand left Africa to populate the rest of the world. It seems likely that the members of such a small ancestral population could communicate with each other. Radical unifiers such as linguists Joseph Greenberg and Merritt Ruhlen make a compelling case that firmly established language families belong to somewhat less obvious superfamilies that ultimately spring from a common source. They point to spooky similarities in languages as apparently unrelated as Khoisan, Navaho, and Latin. (See The Origin of Language: Tracing the Evolution of the Mother Tongue by Merritt Ruhlen.) More conservative linguists object that language monogenesis can never be proven, and they are right. However, that doesn’t mean we should refuse to note tantalizing clues pointing in that direction even if they never will be enough to seal the case definitively.

The most thoroughly studied language family is Indo-European. Since different language groups and subgroups evolve and diverge in self-consistent ways, it has been possible for linguists to reconstruct a proto-Indo-European language spoken 8000 years ago in Neolithic times that is ancestral to an array of modern languages from English to Hindi. An entertaining book on the proto-Indo-European roots of words we commonly use in modern English is Written in Stone: A Journey through the Stone Age and the Origins of Modern Language by Christopher Stevens. This is not an academic book full of footnotes. When, for example, he says “dok” is proto-Indo-European for “to learn” and that it turns up in “doctrine,” “docent,” and “heterodox” via intermediary languages, the reader is left to take his word for it. However, there is enough of a bibliography for a reader to double-check the sources, if so inclined. Stevens is not, in fact, just making assertions; there is extensive scholarly research on the subject to back him up, even though he doesn’t refer to it at every turn. This makes Written in Stone far more readable and breezy than it otherwise would have been. It is a fun book, and at the end of it the reader will have 100 or so words to exchange with a Stone Age fellow should he or she encounter one, and none of the words will be altogether foreign.

Important as the spoken word has been and remains, human culture needed written language to really take off. The spoken word vanishes as soon as it is uttered. There is only so much knowledge, lore, and cultural information that can be transmitted orally, and untimely deaths of knowledge-keepers can cause much of it to be lost forever. Writing changes all that. The origins of writing in Sumeria (and soon thereafter in Egypt) are fairly well understood and documented. Writing apparently was independently invented in China and Mesoamerica as well. Sumerian writing started out as graphic representations of trade goods; the first writings were mercantile contracts. It developed fairly quickly (by ancient measures) into something complex enough to record anything that could be spoken.

But even before the very first scrawlings that count as “writing” existed, abstract symbols existed. In a South African cave 100,000 years ago people were grinding ochre, a red pigment. We don’t know for what, but it probably was for symbolic body decoration of some kind; that is how the stuff commonly was used later in prehistory. One chunk of ochre in the cave from that time period has three notches on it and another has a chevron. Again, we don’t know why, but archaeologist Genevieve von Petzinger speculates they are ownership marks: some artist was indicating “these are mine.” It is still common for people to mark their tools. I do myself. But whatever was intended, they were abstract symbols.

Although she does go farther afield, von Petzinger’s specialty is the cave paintings of Ice Age Europe between 40,000 and 10,000 years ago, mostly because they are well enough documented to allow statistical treatment. (Her own explorations have revealed that many of the records of cave images are incorrect, however.) Her particular interest is abstract symbols rather than the representational images of animals with which most of us are familiar. Her book The First Signs: Unlocking the Mysteries of the World’s Oldest Symbols is not limited to those symbols; she also discusses representational Ice Age art including figurines, but the abstract symbols have her attention. She identifies 32 (asterisks, crosshatches, cordiforms, spirals, etc.) that recur with high frequency over tens of thousands of years in caves hundreds of miles distant from each other. They must have meant something. She won’t call the symbols “writing” for numerous reasons, but she does think they tell us something about how writing started: “Rather than assume that writing appeared out of nowhere 5000 to 6000 years ago, can we trace its origins back to those artists working 20,000 years earlier? I believe we can.”

Both books are a pleasant way to spend some time communicating with our Stone Age ancestors. Perhaps what those folks had to say was more edifying than many social media posts today. One always can hope anyway.


The Lovin’ Spoonful – Words

Tuesday, June 27, 2017

Bandstand Grandstand

Few experiences can make you either forget your age or remember it as effectually as listening to popular music. Our youthful selves are so thoroughly imprinted by the songs current during our teen years that we remember their lyrics for the rest of our lives. Hearing them immediately takes us back. The first sign of having exited “the younger generation” is thinking that music on contemporary popular radio stations is terrible by comparison. Perhaps that is the second sign; maybe the first is hearing the songs on the radio instead of some other platform.

With all that in mind I picked up Your Favorite Band Is Killing Me: What Pop Music Rivalries Reveal About the Meaning of Life by music critic Steven Hyden. He explores various sorts of rivalries within and between bands and also among listeners. There is the age-old rivalry between generations. That often fades, but in one direction only: another sign of aging is noticing that our parents’ music (in my case Glenn Miller, Benny Goodman, Harry James, et al.) isn’t bad. But the most intense rivalries are among coeval listeners. The classic example for my generation was the common question, “Do you prefer the Beatles or the Rolling Stones?” On the surface this seems like a simple matter of taste akin to asking what toppings you like on a pizza. It was understood to be a bigger question than that. An entire worldview and a statement about oneself were inherent in the answer. (I tended to sidestep the question by answering “the Animals,” which come to think of it also was telling.)

Hyden is Generation X so he doesn’t get around to Beatles/Stones until chapter 6, and then only reluctantly as “dad rock.” Mostly he speaks of what had emotional import for him, e.g. Oasis vs. Blur, Cyndi Lauper vs. Madonna, Nirvana vs. Pearl Jam, Biggie vs. Tupac, White Stripes vs. Black Keys, etc. I wasn’t even aware rivalry was a thing for most of his opposing pairs, but I get it. Whether or not it is accurate or fair to regard, for example, Nirvana as outlaw and Pearl Jam as corporate (in the 90s I just lumped them both together as grunge), I can understand what a youthful listener might be trying to project by favoring one over the other – often passionately. It’s all about self-image really, and we are inclined to get passionate when protecting that. Hyden gives fair warning of what can happen if you play Metallica’s Black Album in “a room full of borderline psychopaths waiting for Megadeth to come on stage.” I’ll take his word for it. “Musical rivalries don’t matter,” he says, “until they matter to you personally.”

Some of the more interesting rivalries (touched upon by Hyden only lightly) are over alternate interpretations of songs by fans of the same band, but these are intellectual disputes and less likely to be quite so intense. Not always. As a non-pop example (not mentioned by Hyden) Friedrich Nietzsche developed key elements of his philosophy by arguing with himself passionately over Richard Wagner, first as an advocate and then as his fiercest critic. Even when the emotional volume is dialed down, such arguments can be more revealing than other kinds. For obvious reasons I won’t give a name, but in the late 90s a woman insisted to me in all sincerity that Cher’s Believe single was about addiction. Do you believe in life after love of drugs? For her (though I doubt very much for Cher) it was.

This brings to mind an old high school assignment about which I haven’t thought in decades. Every single school day, in addition to other class assignments, my senior English teacher required a 500-word essay. “On my desk by 5 PM. That does NOT mean 5:01!” To this day I feel I’ve forgotten something as 5 PM approaches. He usually let students pick their own topics but sometimes he would assign one. On one occasion we were told to interpret the lyrics of some popular song of our choice. My first inclination was to pick something truly weird such as MacArthur Park, Windmills of Your Mind, or Some Velvet Morning. I just about had settled on the last of those when on reconsideration I decided it was too much work for only 24 hours. (This was pre-internet, remember, so you couldn’t just look up interpretations online; you probably couldn’t even get the lyrics in 24 hours unless you owned the record and copied them yourself.) Instead I just went with the Beatles’ Nowhere Man, which really needs no interpretation at all. It means what it says, so that’s what I said in prose. I felt I was just skating by on minimum effort and was surprised (and oddly discomfited) by a good grade. Perhaps my punctuation was good or something. Then again, perhaps the rest of the class had been just as lazy as I in their choices. Be that as it may, I now realize Some Velvet Morning would have been a mistake. I hadn’t yet read Hippolytus by Euripides. (In case the reader has forgotten, it is about an ascetic young man who refuses to revere Aphrodite; Aphrodite punishes him in torturous fashion by making his stepmother Phaedra fall in love with him, with tragic consequences.) No one on this continent would write lyrics with the name Phaedra in them without intending the reference. I would have missed it. My well-read English teacher would not have. He would have given me an argument and won. I was better off taking the easy route.


Lee Hazlewood and Nancy Sinatra – Some Velvet Morning

Tuesday, June 20, 2017

Saluting Summer

Summer is the one season to which we insist on giving an unofficial start and finish. Memorial Day and Labor Day are fine holidays in their own right (the former rather somber), but defining summer by them is fundamentally a marketing scheme. I have nothing against marketing schemes per se: they may prod economic activity to the general benefit. FDR tweaked Thanksgiving, for example, to extend the holiday shopping season a few days; formerly it sometimes fell on the last day of the month. But while I don’t object to marketing schemes, I don’t feel bound by them either. Summer starts officially on the solstice (June 21 this year, at 4:24 a.m. GMT [12:24 a.m. EDT] to be precise) and ends on the equinox (September 22). These are orbital phenomena not subject to the desire for auto, carpet, and beach furniture sales. I’ll stick with the official dates. Stonehenge is a bit far from my house, so I have yet to greet the sunrise there with the Druids, but I take note of the day in my own way.

Richard (not me, another Richard) and Gill bringing some sunshine to a cloudy day get-together. No virgins were sacrificed in the proceedings.
In ancient times the summer solstice was a major holiday. In much of the modern world it still is. This is not the case in the U.S., but I find it a convenient time for a party anyway. Roughly midway between Memorial Day and July 4, it doesn’t compete with other parties and barbecues, and in this part of the country the weather has a good chance of being favorable for anything outside. Despite my remarks above, I’m not overly dogmatic about the date for the celebration, for the calendar doesn’t always cooperate neatly. As a practical matter, weekdays are not ideal celebratory days for anyone with a job or classes. Accordingly, when (as this year) the solstice falls on a weekday, I’ll pick the weekend before for a get-together so that more of the usual guests can attend. At the autumnal equinox I’ll pick the weekend after if need be, though this year I see it falls conveniently on a Friday.

A plurality (29%) of Americans list autumn as their favorite season. To me this seems odd. Autumn has its attractions, but I always am mindful of the slide toward winter. There are geographical differences in the answers, of course: summer can be punishing in some of the southern states, making it predictably less popular there. Nonetheless summer overall still gets its fair 25% national share, and I’m squarely in that camp. As a kid I used to claim I liked winter best. To be sure, there was fun to be had in snow, but mostly I said it just to be contrarian to the grown-ups who asked the question. In truth I recall far more fun in the summer back then, and I had the usual schoolboy’s affection for summer vacation. Since I became an adult (a questionable move, by the way), I’ve had to shovel my own walks, repair ice damage on my own property, and pay my own heating bills. So, I’ve given up any pretense. I’ll openly declare summer to be my season. Given an either-or choice, I’ll opt for sweltering heat over bone-chilling frost every time.

A good reason why became evident minutes after I wrote the above paragraph yesterday: the first significant local power failure of 2017 turned out my lights (and computer) for 12 hours. The storm did some damage regionally, but I was fortunate and merely had the outage at my place. Simply sitting contemplatively on the porch in the dark, with no distractions other than the sound of rain, actually was rather pleasant. I often do that anyway (yes, sober), though admittedly seldom for hours at a stretch. Compare that to my post from November 7, 2012 following Hurricane Sandy:

“It’s another evening hunkered at my office. Power is still out at my home, which means there has been no light, heat, or water (I’m on a well) there since the 29th of October. Snow is falling tonight as is the temperature. This poses a threat to my pipes in which some water no doubt lingers.”

I’ll take watching rain on a warm evening, thank you. Since I jumped the gun by a few days with the party, I’ll also toast the sun (even though it will be below the horizon) 24 minutes past midnight local time tonight.


Sam Cooke – Summertime

Friday, June 16, 2017

On Trees and Apes

From Hell It Came (1957)
In my pre-teen childhood I loved monster movies, as do most kids. Slasher films were not a thing back then and I honestly don’t know how I would have reacted to those, but I loved Wolfman, Dracula, Rodan, Creature from the Black Lagoon, The Beast from 20,000 Fathoms, and so on. I enjoyed the outpouring of low-budget productions from studios in the 50s and early 60s, some of which I saw in the theater but most of which I watched on Saturday TV; they included such monsters as a giant spider, giant snails, a giant bird, a giant lobster (yes, really), a giant octopus, disembodied brains, aliens of all kinds, and a 50-foot woman. One of the most ludicrous was a vengeful murdering tree. TCM, of all channels, played this on Wednesday. I hadn’t seen it in decades, and I couldn’t pass up the nostalgic silliness.

The wooden-hearted fellow means to toss her in the quicksand.
The opening crawl sets up the plot: “Our story occurs on a savage island where a Prince is killed unjustly. The victim was buried upright in a hollow tree trunk. The legend says that ‘the tree walked to avenge its wrongs!’” The legend proves not to have been a one-off event. As is common in un-PC 1950s B-movies, the island witch doctor is a scheming murderer; he frames and executes Kimo, the island prince, for a crime. An American scientific research team on the South Sea island soon finds a tree growing in radioactive soil where the prince was planted. The tree has characteristics of both plant and animal; it even has a heartbeat. (It also has a knife sticking in it that was used to kill the prince.) The researchers dig up the tree and take it back to their lab. It seems to be dying, but Dr. Terry Mason (Tina Carver) insists on using her experimental formula for countering the effects of radiation. She injects the tree, and then the researchers inexplicably all go to bed, figuring they’ll check on it in the morning. Of course the formula works during the night, and the ligneous beastie lumbers off to avenge himself on the villagers.

This is a 1950s movie, so spoilers are hardly possible. You know pretty much the fate of the monster, but he doesn’t meet it until evildoers get their comeuppance. The whole thing is so ridiculous that I couldn’t help but enjoy it…but I don’t think I need to see it again.

** **
King Kong (1933)
After From Hell It Came I did feel the need to revisit the archetype of all monster movies. It wasn’t the first monster movie by any means. The 1925 The Lost World showed what was possible with stop-motion animation, but we first see the full panoply of what would become standard plot elements for the genre in King Kong. Besides, while I didn’t see Kong: Skull Island (2017) in the theater, it will be on DVD in a month or two, so a revisit to the original was in order anyway as a proper precursor. As always, it was rewarding good fun even though there are ways in which the movie doesn’t rise above its time.

I don’t think the 1933 King Kong needs a plot description. Though I have met a surprisingly large number of Millennials and GenZs who haven’t seen it, I haven’t met one unfamiliar with the plot.

There is a hypothesis widely bandied about on the net that the theme of King Kong is racist. I don’t buy it. The movie is immensely racist beyond all possibility of argument, but not thematically. (The hypothesizers might be on firmer ground with the remakes.) The racism in the original King Kong is overt, unselfconscious, blatant, and simple-minded – not uncommon in a 1933 movie – which are the opposite of subtle, reflective, cryptic, and thoughtful. The minds of Cooper and Schoedsack were thinking more broadly when it came to the underlying theme.

A few words are in order about Merian C. Cooper and Ernest B. Schoedsack, the creators and directors of King Kong. They were adventurers of a type uncommon in their own day and extraordinarily rare today. Cooper flew for the US Army Air Service in World War 1 and then for the Poles against the Soviets. Shot down in 1920, he escaped from a Soviet POW camp. In the 1920s he met and struck up a lifelong friendship with Schoedsack. They traveled the world together by tramp steamer, acquired cameras, and filmed remarkable documentaries from Iran to Thailand. Cooper is much like the Carl Denham character in King Kong, and much of Jack Driscoll’s awkward dialogue with Ann (Fay Wray) in the movie reportedly was lifted from Schoedsack’s own utterances. Moving on to Hollywood, they made three iconic films in succession, all of which shared sets: King Kong, The Most Dangerous Game (also starring Fay Wray), and She. The inspiration for King Kong in particular was a World War 1 propaganda poster that was on Cooper’s office wall. Cooper and Schoedsack appear in the movie: they are the pilot and gunner who take out Kong at the end.


What is the theme? That transcending the inner beast is not about the superficial trappings of civilization. Kong, the villagers, and Americans all behave in fundamentally the same (violent) way and for the same reasons despite the surface differences in technology and civilization: at bottom they all act as beasts. When she hears about Kong, a woman in a New York scene even makes a remark about gorillas, “Gee, ain't we got enough of them in New York?” It is only in the pursuit of beauty that any of them transcend themselves. Beauty kills the beast. It’s why we feel bad for Kong, unlike, say, the critter in The Beast from 20,000 Fathoms that would have chomped Ann without a thought. It’s why Kong is still the king, and why he keeps turning up in popular culture.


Messer Chups – Curse of Stephen Kong