Thursday, July 27, 2017

Broaching Poaching

In 1994 evolutionary psychologist David M. Buss published The Evolution of Desire: Strategies of Human Mating, which analyzes the topic in evolutionary terms. “Evolutionary psychology” is just the latest moniker for the longstanding argument that human behavioral predilections are pre-bent by prehistory – that they are a feature of the way the human brain and its affective subsystems are structured. Cf. Carl Jung regarding a newborn: “He is not born as a tabula rasa, he is merely born unconscious. But he brings with him systems that are organized and ready to function in a specifically human way, and these he owes to millions of years of human development.” This seems obvious, and it is clearly the case in all other species. Yet there always are those who argue against it except when it is inconvenient (e.g. regarding sexual preferences), and until recently they were dominant in academia. In my own estimation evolutionary psychology is a powerful tool for understanding human nature, but it’s not the whole story. (In fairness, few evolutionary psychologists say it is.) The tabula rasa folks are wrong, but they are not crazy. Included in that evolved heritage is a mental capacity to choose to act against our predilections. Freud and his successors tell us we do so at our cost (though the payoff might be worth it), but we can do it. The slate never can be wiped clean, but with effort it can be overwritten. Individual decisions and socialization do matter. Most of us don’t overwrite it most of the time, however, and even those who do find what lies beneath bleeding through to the top from time to time.

The book was controversial when first published but, in the decades since, cross-cultural studies involving thousands of people have reconfirmed most of its findings. Last year Buss released an updated version, which includes the results of studies from the past 20 years. It was my reading material yesterday. The title is plural because each sex employs a variety of strategies depending on circumstances such as the sex ratio and economic conditions. There are, of course, wide individual variations in romantic matters, but there are bell curves of behavior for each sex that overlap but have distinctly different centerlines. We all are descended from ancestors who were reproductively successful, so it is hardly surprising that their predilections are (by and large) ours. Most often, strategies for obtaining (and dumping) mates are employed without conscious forethought. The strategies are frequently anything but nice. Buss: “I would prefer that the competitive, conflictual, and manipulative aspects of human mating did not exist. But a scientist cannot wish away unpleasant findings.”

One small chapter in the book discusses mate poaching. For some reason it particularly struck a chord with readers. Articles about it (which ignore the rest of the book) have turned up regularly in popular magazines and periodicals ever since ’94. Why this particular topic attracted so much interest probably has to do with our own experiences as real or potential poachers or poachees – or as the Significant Other of one. Desirable mates are always in short supply, so this tactic persists, abetted by the quirkily human tendency to believe that “the grass is always greener on the other side of the fence.” (The clichĂ© is from Ars Amatoria, Ovid’s first-century handbook on seduction: “Fertilior seges est alienis semper in agris.”) 60% of men and 53% of women admit “to having attempted to lure someone else’s mate into a committed relationship.” 93% of men and 82% of women have been the targets of such a poaching attempt. (The percentages are reversed when the offer is just for short-term sex.) The most time-honored method is presenting oneself as more desirable than a rival while derogating the rival. Hardly anyone is thinking of reproductive success when engaging in or defending against this behavior. Often that’s the last thing they want. They are boosting self-esteem, playing a game, exercising control, “following their hearts,” or any of a multitude of motivations, but there is something more primal beneath all that. Contraception allows contemporary humans (unlike our ancestors) to separate sex and reproduction, but we still are apt to act and react as though they are linked.

So, the odds are someone at some time will make a play for your sweetie. The odds are you’ll make a play for someone at some point. The good news (or bad news, depending on your perspective) is that the attempts succeed only occasionally. When they do, from the standpoint of the one left behind it’s probably best to let them. Anyone that ready to wander off with a poacher is preferably somebody else’s problem.


Samantha Fish – Somebody’s Always Trying to Take My Baby Away
[My silhouette is not on camera, but I was there.]

Thursday, July 20, 2017

Not to Reason Why

Action movies are not about character development or about reflecting the human condition. They are about chases and crashes and fists and flashing weapons and narrow escapes and razzle dazzle. A handful of exceptional films manage to combine the physical elements with the deeper stuff, but audiences neither demand nor expect it. Action movies are escapist fare. A sketchily drawn but likable character or two and some bare excuse for all the swashes and bucklers to follow are enough. In the past week I’ve sampled three of this year’s action hits – one in the theater and two on DVD. One can’t fault the action in any of them, but the excuses are bare indeed.

Baby Driver
The fantasy lives of adolescent boys and young men are intimately connected with popular music. Remarkable feats of derring-do go on in their heads during the guitar and drum solos. (I wouldn’t presume to guess if or how what goes on in young women’s heads differs.) Filmmakers know this, largely from firsthand experience, which accounts not just for a lot of soundtrack choices but also for the quirks of many film characters – the old Iron Eagle movies and the recent Guardians of the Galaxy flicks come to mind. It’s a simple way to connect with the audience. In Baby Driver, Baby (Ansel Elgort) has tinnitus (ringing in the ears) due to a childhood accident, and he drowns it out with music (leaning heavily to rock) pretty much constantly – always when driving.

Baby is a wheel man. He became one as a consequence of the youthful mistake of stealing Doc’s car. Doc (Kevin Spacey) turned out to be a broodingly ruthless crime boss who saw Baby’s potential; as payback, Doc set him to work as an expert getaway driver in elaborate heists. Aside from being a criminal, Baby is a pleasant enough sort who looks after his aged, disabled friend Joseph (C.J. Jones). Baby meets the waitress Debora (Lily James), who is pretty and sweet and…well… that’s about it. For no discernible reason she agrees to leave town with Baby, about whom she knows nothing, for the open road. Baby is cute, I suppose, but surely he is not the first cute guy Debora ever met. So why? Because the script says so. Besides, it fits the adolescent fantasy. The secondary love story between the crooks Buddy (Jon Hamm) and Darling (Eiza González) actually is more comprehensible if even less wise: the two simply enjoy the thrill of sharing danger and violence. The course of true love never did run smooth, however, and the plans of Baby and Debora are put in jeopardy when Baby’s last job goes terribly wrong.

Taken purely as the escapist fare that it is, the movie is fun. It is well shot and the stunt driving is excellent. Don’t expect anything more from it though.

** **

John Wick: Chapter 2
For those who thought John Wick might have been a good movie if only there had been more violence (the eponymous character kills a mere 84 people), John Wick: Chapter 2 sets out to rectify that.

The reader may recall that retired hitman John Wick (Keanu Reeves) in the first movie is upset when the son of a Russian mobster kills his dog and steals his car. So, he singlehandedly wipes out the mob. In John Wick: Chapter 2, which picks up immediately after the ending of the first movie, an Italian mobster, Santino D'Antonio (Riccardo Scamarcio), calls in a marker and demands that John Wick assassinate his sister, the head of the family. When Wick refuses, Santino blows up Wick’s house with an RPG. Knowing what Wick did to the last mobster who annoyed him, why would Santino do such an amazingly stupid thing? Because the script calls for it.

Anyway, Wick first does the job for Santino because honor (!) requires it, but as we all know it is then bad news for Santino D'Antonio and for all of the mercenaries seeking the seven million dollars Santino puts on John Wick’s head.

I’m not unaware of the tongue-in-cheek nature of this movie, but nonetheless to my taste it was a 30-round clip too far: numbing rather than escapist. My reaction is probably idiosyncratic, though, since my companions (both sexes represented) loved it.

** **

Kong: Skull Island
This is a movie I would have loved as a kid: monsters and more monsters with no irritating romantic subplot to distract from the (did I mention them?) monsters. There is not much waiting for them either. They show up in the first half hour.

The time is 1973. Landsat images reveal the existence of an island in the eye of a permanent storm that previously had shrouded it from the outside world. A scientific team headed by Bill Randa (John Goodman) investigates. Transportation is provided by a heavily-armed helicopter squadron withdrawn from Vietnam and commanded by Lieutenant Colonel Preston Packard (Samuel L. Jackson). They drop bombs all over the island in order to get seismic readings, which seriously angers the protector of the island and its indigenous people. You got it: the protector is King Kong. He rises up and swats every last chopper out of the sky.

Survivors of the crashes encounter the locals and a WW2 pilot (John C. Reilly) who was stranded on the island during the war. Conveniently, he can explain Kong’s role as protector against the really terrible monsters who live below the surface. Packard, however, is determined to kill Kong. Why? Because the script calls for it. One gathers he is angry that Vietnam ended without a victory for his side, and now he at least wants to kill a big gorilla. Um… yeah.

Most of the cast is there to get eaten by monsters, but a few should be mentioned. The photographer Mason (Brie Larson) shows that, unlike in previous iterations, a beautiful blonde woman can be on hand without anybody at all being attracted to her – not even Kong. Jing Tian’s most significant scene is in the after-credits scene (yes, there is a not-so-secret ending) when she reveals that there are other monsters in the world. Tom Hiddleston, as James Conrad, gets to play the competent mercenary. But it’s really not about the people. They are just there to run from (or foolishly try to kill) the monsters who are the real stars.

The movie is a fun romp and the fx are superb. If you are looking for anything other than an effects-packed action film, you won’t find it in the characters. There might be a metaphor or two, however, such as the imprudence of removing a monster who is keeping in check something worse. But primarily it’s about the chills and thrills, and it delivers enough of those.


Trailer: Baby Driver (2017)


Sunday, July 16, 2017

That’s All There Is

In my tween and teen years (1962-72) a regular guest on TV talk shows and variety shows was Peggy Lee. For most of that decade she was not a particularly welcome presence from my perspective on the youthful side of the Generation Gap. Born in 1920, Peggy was several years older than either of my parents. Her sound was very much my parents’ music and therefore something toward which I felt obligated to be (at best) indifferent. It wasn’t rock and roll. I knew nothing of her early work with the big bands of the 40s and scarcely anything of her career’s high-water mark in the 50s. Nor did I care to. The extraordinary deference with which she was introduced (always as Miss Peggy Lee) mystified me.

This changed in 1969 when Peggy recorded a haunting version of the Leiber and Stoller song Is That All There Is? She had called on a young Randy Newman, of all people, to rework the original arrangement into something more to her liking. It was to be her last hit single and her biggest since Fever in 1958. I was one of the many who loved the record, and I grudgingly allowed at the time that maybe I had been a little closed-minded about her other work, though I wasn’t yet ready to go out of my way to listen to any of it. (I had no idea I already was familiar with some of it from the soundtrack of the 1955 Disney movie Lady and the Tramp.)

It wasn’t until after college that I recognized – let myself recognize – just how good much of my parents’ popular music was. To be sure, I still enjoyed the usual Boomer fare of folk and rock from Dylan to Clapton, but against all expectations I also liked 40s big bands from Glenn Miller to Duke Ellington. Who’d have thought it? Not an earlier I. What caught my fancy in particular was the mix of big bands with female vocalists such as Helen Forrest, Kitty Kallen, Ella Mae Morse, and Peggy Lee. Vocals had changed over the previous decade thanks to good microphones and sound systems. Through the 1920s and into the 30s, it was important to belt out a song (Ă  la Al Jolson and Sophie Tucker) so someone beyond the first row could hear you. With electronic amplification, this ceased to be a factor. Accordingly, by 1940 much more subtlety and sophistication had entered popular recordings – more so than in most popular recordings of the 1950s.

Due to her straightforward early style, Peggy Lee is not at the top of my personal list of favorite 1940s-era songbirds, though she did numerous iconic numbers with Benny Goodman, including Why Don’t You Do Right? (the Jessica Rabbit version is probably better known today) and a politically incorrect version of Let’s Do It. But she was definitely on the list. For a window into that era I picked up the biography Is That All There Is?: The Strange Life of Peggy Lee by James Gavin. As celebrity biographies go, this one is pretty well researched and written; it even comes with copious footnotes and an index.


The story of Norma Egstrom (aka Peggy Lee), like most success stories, is a combination of hard work and serendipity. Jamestown, North Dakota, is not the most likely place to start a showbiz career, but she made use of what was available and then traveled to find opportunities. Her break came in 1941 when she landed a job singing at the Buttery lounge in the Ambassador West Hotel in Chicago. One night Benny Goodman was at a table. Helen Forrest had just quit on him, and he needed a female vocalist to fill in for her temporarily until he found a permanent replacement. Peggy’s temporary employment with Benny lasted until 1943.

The bio details her personal life, which was messy in the way we expect of celebrities: a string of marriages, affairs, and break-ups amid financial meltdowns and substance abuse. On top of all that were serious health problems, including pneumonia that scarred her lungs. Yet, unlike most of her fellow 1940s songstresses, her career not only continued but flourished in the 1950s and included turns at acting, notably in Pete Kelly’s Blues (1955). She was always hands-on with musical arrangements. Peggy persisted when others didn’t. She sold out shows in Las Vegas in the 1970s, tried Broadway in the 80s, and sang from a wheelchair in the Waldorf’s Empire Room in the 90s – something unlikely to be emulated in the future by today’s pop divas. Peggy died in 2002.

Though I had bought the book mostly for insight into the big band years, the rest of it proved to be more instructive. Is that all there is? Yes. But maybe that’s enough.


Peggy Lee – Is That All There Is? (1969)
 

Tuesday, July 11, 2017

The Undiscovered Country

What is the sine qua non of being human rather than just another primate? Is it language? Art? Abstract thought? In the 1960s and 70s psychologist Ernest Becker offered another answer, one that accompanies (and perhaps inspires) the cognitive ability to talk, sculpt, and contemplate. So far as we know, humans are the only earthly creatures aware of the inevitability of their own deaths. There is nothing new about this answer, but Becker believed we give it insufficient prominence, which itself is a revealing act of denial. Becker, whose mind was focused by his own terminal illness, told us that we spend most of our energies denying that terrible knowledge; in the process, we develop civilization, art, religion, and neuroses. His book The Denial of Death, written in 1973 as his own demise loomed at age 49, won a posthumous Pulitzer Prize in 1974.

I read Becker’s book several years ago. Last week I followed it up with The Worm at the Core: On the Role of Death in Life by Sheldon Solomon, Jeff Greenberg, and Tom Pyszczynski. The trio of Becker enthusiasts are experimental psychologists who since the mid-1980s have devised numerous tests of Becker’s assumptions and conclusions. The results strongly back Becker. Judges in Tucson, for example, typically set bail for prostitutes at $50; when reminded by a questionnaire of their own mortality, however, they set bail at an average of $450. (The cases, unknown to the judges, were fake, so no ladies were over-penalized in the tests.) People become much more protective of group norms and values when reminded of death because identifying with a larger entity (country, ideology, legal system, sect, party, ethnicity, etc.) makes us feel part of something that needn’t perish, so we are harsher toward violators; judges are not immune to the tendency. Being protective of one’s own group typically means being less tolerant of others, so those reminded of death are more hostile to “outsiders” of any kind. It works in reverse, too. Canadian and Australian test participants who were assigned to read highly negative commentary on Canada and Australia used many more death-related words afterward on a word association test than did the control group; those who read positive commentary used fewer. People reminded of death smoke and drink more to get their minds off it – even when the reminder is a public service warning about the lethality of smoking and drinking. On the upside, people reminded of death also get more creative in hopes of leaving some legacy that will survive in some sense.

The legacy gambit doesn’t always succeed at cheering the creative artist. Woody Allen: “I don’t want to live on in my work. I want to live on in my apartment.” John Keats, whose poetry was not well appreciated during his lifetime, despairingly left instructions for his tombstone not to bear his name, but to read, “Here lies One whose Name was writ in Water.” Edgar Allan Poe at least achieved some recognition in his own time, though one would be hard-pressed to write something more expressive of mortality than The Conqueror Worm. Needless to say, both writers have me outclassed, but I can relate in principle. My efforts at fiction over the years have been desultory at best, but my most productive phase (two novellas and a couple dozen short stories) was in the two years following the loss of the last of my immediate family. It wasn’t a conscious attempt to leave something of myself behind, but the timing is hard to miss.

Solomon, Greenberg, and Pyszczynski acknowledge, of course, that other animals fear death from an immediate threat. “All mammals, including humans, experience terror. When an impala sees a lion about to pounce on her, the amygdala in her brain passes signals to her limbic system, triggering a fight, flight, or freezing response…And here’s the really tragic part of our condition: only we humans, due to our enlarged and sophisticated neocortex, can experience this terror in the absence of looming danger.” They designed their experiments to demonstrate just how many of our creative and destructive (including self-destructive) impulses derive from – or at least are heavily influenced by – an often unconscious fear of death.

Dealing with death has been a staple of human lore from the beginning. The oldest literature (as opposed to business contracts and tax lists) that still survives is the Epic of Gilgamesh, which is about Gilgamesh coming to terms with the death of his friend Enkidu. The ancients approached the matter of death in the same various ways we do today: some with religion, some through their children, some through their work, and some by repressing the whole subject while trying to think of something else. The ever-practical Epicureans argued that the experience of death is literally nothing, and that it is silly to worry about nothing. This is logical, but there are some subjects about which humans have a hard time being logical, and most are not satisfied by this argument. Solomon, Greenberg, and Pyszczynski list the standard ways most people strove and still strive to transcend death: biosocial (having children or identifying with some nationality or ancestral line), theological (belief in a soul), creative (art or science that survives the artist/scientist), natural (identifying with all life), and experiential. I’ll let them explain themselves on that last one: “experiential transcendence is characterized by a sense of timelessness accompanied by a heightened sense of awe and wonder.” Some of my acid-head friends in college used to talk like that. I think the authors left out “acceptance with a cynical humor” such as we see in Poe, Camus, and modern-day celebrations of Halloween.

The authors wrap up by asking the reader to assess whether he or she handles thoughts of death in ways that are beneficial or harmful. “By asking and answering these questions, we can perhaps enhance our own enjoyment of life,” they say.

So is the book worth a read? Yes. Their experiments are interesting, though there is something of a “hammer in search of a nail” quality to them. If they had reminded those judges about sex before setting bail, would that have affected the outcome? Would it have affected subjects who afterward took word association tests? They didn’t run those experiments, so we don’t know, but my suspicion is yes. In short, I think the old Freudian Eros vs. Thanatos (love and death) dichotomy is closer to the whole truth. Nonetheless, I agree that we all too often try to banish the Thanatos side of that from our conscious thoughts, with results that are often unhealthy. We’re better off if we can learn to deal. So, on balance, Thumbs Up.



The Rolling Stones: Dancing with Mr. D

Tuesday, July 4, 2017

Something to Say

All mammals communicate by sound in some basic way; some of them – definitely including all primates – communicate in very complex ways. Nonetheless, language is different. It involves more than pointing and squealing with the meaning “danger over there!” It entails a level of abstraction and a contemplation of the nonfactual, e.g. “Go peek around that rock and let us know if there is a predator, prey, or something else on the other side.” We don’t know when humans first spoke a full-blown syntactic language, defined as words with discrete meanings strung together with a grammar to form a larger thought. It is certain, though, that no later than 60,000 years ago (maybe much earlier), they were bragging and gossiping and insulting each other as much as we do today.

Did they speak a single shared language at that time – or at least closely related ones? There is no way to know, but there are reasons for supposing so. Genetic studies indicating past bottlenecks suggest that the entire population of modern humans, just before they radiated out from and across Africa, was a few tens of thousands at most. Merely two or three thousand left Africa to populate the rest of the world. It seems likely that the members of such a small ancestral population could communicate with each other. Radical unifiers such as linguists Joseph Greenberg and Merritt Ruhlen make a compelling case that firmly established language families belong to somewhat less obvious superfamilies that ultimately spring from a common source. They point to spooky similarities in languages as apparently unrelated as Khoisan, Navaho, and Latin. (See The Origin of Language: Tracing the Evolution of the Mother Tongue by Merritt Ruhlen.) More conservative linguists object that language monogenesis can never be proven, and they are right. However, that doesn’t mean we should refuse to note tantalizing clues pointing in that direction, even if they never will be enough to seal the case definitively.

The most thoroughly studied language family is Indo-European. Since different language groups and subgroups evolve and diverge in self-consistent ways, it has been possible for linguists to reconstruct a proto-Indo-European language spoken 8000 years ago in Neolithic times that is ancestral to an array of modern languages from English to Hindi. An entertaining book on the proto-Indo-European roots of words we commonly use today in modern English is Written in Stone: A Journey through the Stone Age and the Origins of Modern Language by Christopher Stevens. This is not an academic book full of footnotes. When, for example, he says “dok” is proto-Indo-European for “to learn” and that it turns up in “doctrine,” “docent,” and “heterodox” via intermediary languages, the reader is left to take his word for it. However, there is enough of a bibliography for a reader to double-check the sources, if so inclined. Stevens is not, in fact, just making assertions; there is extensive scholarly research on the subject to back him up, even though he doesn’t refer to it at every turn. This makes Written in Stone far more readable and breezy than it otherwise would have been. It is a fun book, and at the end of it the reader will have 100 or so words to exchange with a Stone Age fellow, should he or she encounter one, and none of the words will be altogether foreign.

Important as the spoken word has been and remains, human culture needed written language to really take off. The spoken word vanishes as soon as it is uttered. There is only so much knowledge, lore, and cultural information that can be transmitted orally, and untimely deaths of knowledge-keepers can cause much of it to be lost forever. Writing changes all that. The origins of writing in Sumeria (and soon thereafter in Egypt) are fairly well understood and documented. It apparently was independently invented in China and Mesoamerica. Sumerian writing started out as graphic representations of trade goods; the first writings were mercantile contracts. It developed fairly quickly (by ancient measures) into something complex enough to record anything that could be spoken.

But even before the very first scrawlings that count as “writing” existed, abstract symbols existed. In a South African cave 100,000 years ago people were grinding ochre, a red pigment. We don’t know for what, but it probably was for symbolic body decoration of some kind; that is how the stuff commonly was used later in prehistory. One chunk of ochre in the cave from that time period has three notches on it and another has a chevron. Again, we don’t know why, but archaeologist Genevieve von Petzinger speculates they are ownership marks: some artist was indicating “these are mine.” It is still common for people to mark their tools. I do myself. But whatever was intended, they were abstract symbols.

Although she does go farther afield, von Petzinger’s specialty is the cave paintings of Ice Age Europe between 40,000 and 10,000 years ago, mostly because they are well enough documented to allow statistical treatment. (Her own explorations have revealed that many of the records of cave images are incorrect, however.) Her particular interest is abstract symbols rather than the representational images of animals with which most of us are familiar. Her book The First Signs: Unlocking the Mysteries of the World’s Oldest Symbols is not limited to these. She also discusses representational Ice Age art, including figurines, but the abstract symbols have her attention. She identifies 32 signs (asterisks, crosshatches, cordiforms, spirals, etc.) that recur with high frequency over tens of thousands of years in caves hundreds of miles distant from each other. They must have meant something. She won’t call the symbols “writing” for numerous reasons, but she does think they tell us something about how writing started: “Rather than assume that writing appeared out of nowhere 5000 to 6000 years ago, can we trace its origins back to those artists working 20,000 years earlier? I believe we can.”

Both books are a pleasant way to spend some time communicating with our Stone Age ancestors. Perhaps what those folks had to say was more edifying than many social media posts today. One always can hope anyway.


The Lovin’ Spoonful – Words