Thursday, December 24, 2020

Pic Picks for a Socially Distanced Eve

It’s not my habit to watch holiday movies except by accident: they tend to be on TV during the holidays. When I was much younger I went through a phase of deliberately ignoring holidays altogether on the theory that all of them are arbitrary and each of us is capable of choosing and observing his or her own. The trouble with this – aside from the general problem of anti-dogma being a dogma of its own – is that other people are not often available for your own arbitrary festivities; were they to schedule their own as well, hardly any would coincide. So, as a practical matter, aided by twinges of nostalgia, I reverted to celebrating Thanksgiving on Thanksgiving and Halloween on Halloween and so forth. I’m still inclined to host equinox and solstice parties, but while those are not official holidays (in this country anyway), they are not entirely novel ideas either. In fact, they long predate civilization.

2020 is not like other years, however. The absence of the usual houseful of guests prompted a bit more nostalgia this season than usual, so my movie picks on Monday, Tuesday, and Wednesday were Christmas movies – sort of. In keeping with the spirit of this year, the holidays are incidental rather than integral to the plots, which are all rather dark. I can recommend all three, albeit to different degrees, when settling into an easy chair tonight or tomorrow after Skyping with friends and family.

****


The Thin Man
(1934)

This is a film I revisit from time to time in any event, but it is set at Christmas. The very good private detective Nick Charles (William Powell), despite the unsavory company he usually keeps, has recently married the wealthy and charming Nora (Myrna Loy) in California and regards himself as retired. They are in NYC for the holidays where Nick, against his inclinations but with Nora’s prompting, is pulled into a missing person case that is somehow tied to two murders. The bibulous Nick and Nora make a marvelous investigative couple and actually seem to like each other even when they fight. This is unlike the way couples are so commonly portrayed today: even when supposedly in love, their fights seem to reveal genuine hostility. (Perhaps screenwriters in any time write from experience.) It is a classic detective story with heart and cleverly funny dialogue. Though originally intended as a standalone movie, it was well enough received in its day to spawn five sequels. If only there were more.

****


Better Watch Out
(2017)

OK, this gets a recommendation only as a guilty pleasure, but sometimes that is enough. For the first 15 minutes this appears to be a standard teen-oriented horror film with the tropes of a tween boy with a crush on his babysitter and both facing off against a home intruder. It then takes a different direction. There isn’t much more I can say without spoilers, but if you sometimes like darkly comedic (but not satirical) horror, this may be for you: not highbrow but fun. Yes, it is set at Christmas.

 

****

Eyes Wide Shut
(1999)

Stanley Kubrick’s final movie has more than one theme. The primary one is the nature of betrayal, particularly within the context of marriage. When does cheating begin? Is the thought enough? Dr. William Harford (Tom Cruise) and Alice Harford (Nicole Kidman) verbally spar after some jealous moments at a party. William is disturbed when Alice tells him that once she would have run off with a naval officer had he only asked. Is that betrayal even though he didn’t ask and she didn’t go? William, still brooding, later picks up a hooker and only an emergency medical phone call prevents anything from happening. Is this betrayal? William’s night gets stranger as he manages to crash an erotic party at a private estate. Has betrayal happened yet?

The second theme is the existence of an ultra-elite who don’t live by the rules that apply to the rest of us, and don’t have to. William and Alice, by normal standards, are elite: their real assets in NYC alone make them one-percenters. Yet, they are still part of the hoi polloi by the standards of the ultra-elite. William proudly makes a point of identifying himself as a doctor wherever he goes, but to the super-rich, however friendly they might behave toward him, this makes him a tradesman no different from a plumber and most definitely an outsider. The Christmas setting likely was chosen just as a contrast to the fundamentally pagan attitudes and rites at the estate where William doesn’t belong.

By the way, the movie was deliberately filmed in 4:3 ratio (as most movies were until the 1950s) in order to narrow the viewer’s field of view. It is the way it should be watched.

Tom Cruise and Nicole Kidman broke up not long after making this movie. Neither has been forthcoming about whether the film had anything to do with it.

****

So, those are three options to consider, but if the reader would rather watch It’s a Wonderful Life yet again, I understand.

 

Trailer: Eyes Wide Shut



Friday, December 18, 2020

The Flurry Query

The first notable snowfall of the season always makes me question why I live in New Jersey. Yesterday’s snowfall was a fairly modest one in my location (parts of the state were affected much more) but it still evoked the question. I’m hardly alone in asking the question and many residents answer it with their feet – or rather with U-Haul trailers. Of interstate moves across NJ’s border, 69% are outbound and only 31% inbound according to Bloomberg. NJ leads the nation in interstate emigres in percentage terms; the percentage is higher even than that of New York, where people presently are fleeing New York City in droves under the current Covid restrictions. There are many reasons for the exodus including NJ’s property taxes (the nation’s highest) and the least friendly business climate of the 50 states according to the Small Business and Entrepreneurship Council. These and other (completely self-inflicted) detriments increasingly outweigh a number of very real geographical attractions and advantages to living in NJ. One negative that is natural, however, is the weather. It is wet in the spring, humid in summer, windy in autumn, and bone-chilling in winter – the winter weather being the most off-putting of the four.
A...um...few years ago

Yesterday: still shoveling

Homo sapiens is not by design a cold-weather animal. When our ancestors spread out of Africa some 70,000 years ago they wisely hugged the southern shoreline of Eurasia as they spread east, infiltrating northward only after occupying southern lands. They occupied Australia before they occupied Europe. Of course, Europe and northern central Asia already were occupied at the time by Neanderthals and their cousins the Denisovans who were, in fact, adapted to the cold. Sapiens previously had little trouble brushing them aside in the southern areas, however. Interbreeding appears to have occurred primarily at the early stage of expansion, especially in the Middle East, when intruding modern humans were heavily outnumbered, rather than later when they were better established; about 2% of the DNA of present-day humans outside of sub-Saharan Africa is Neanderthal and/or Denisovan. (It is surprising the percentage isn’t higher.) It wasn’t enough to change their body types to better handle a chill. Modern humans turned north, it seems, only reluctantly. They delayed not so much because of the existing inhabitants as because it was…well…cold.

Why did they go north at all? People, it seems, don’t like each other very much, especially the ones we know best – hence the special ferocity of civil wars (and Twitter). So, as their numbers rose they spread out, split (in groups of 30 to 150), and spread out some more in order to get away from those awful others with their irksome quirks and offensive ideas. Since maintaining sufficient calories as hunter-gatherers requires 1000 acres or more per person in non-ideal climes, even a small rate of population growth meant humans quite quickly spread into vast new regions. The north may have been cold, but at least it was away from those other people. Besides, it turned out the north was rich. This is why a large percentage of the world’s remaining hunter-gatherers are in subarctic regions such as the Tozhus and Nenets in north Asian Russia. It isn’t a great way of life for a vegan, but reindeer, elk, and fish are plentiful. Calories are not much of a problem and mobile skin-covered yurts are surprisingly cozy.

This still has relevance in NJ – not that I hunt and gather except in a distantly analogous sense. The northern states (and northern countries generally) are rich in resources and opportunities – pre-tax anyway. The benefits initially were worth seasonally numb fingers and icy winds. That explains why people moved here in the first place, but not why we stay. In my case the reason primarily is inertia, which probably plays a bigger role in human life than any other single factor. It plays a large role in my life certainly. What remains of my friends and family is mostly here, my personal history is mostly here (I live 10 miles from where I was born), my house (which had been my parents’) is here, and my other physical belongings are here. It would be troublesome in the extreme to pull up stakes now. It would have been easy in my 20s when I didn’t have anything that couldn’t fit in the trunk of a car, but not now. The day may come when remaining here becomes so unaffordable as to overcome inertia (as it already has for so many others) but the time isn’t yet. However much it might prompt the question, the deciding factor in the end won’t be a snowfall: not even the next full-blown blizzard.

Sirenia – A Blizzard is Storming


Friday, December 11, 2020

Many a man hath more hair than wit

He called it Macaroni

A recent visit to the barber was my last for this year. Haircuts this year, like everything else in 2020, are weird. I don’t mean the styles. Long-and-shaggy is a more common look than last year to be sure, though that is not by preference. I mean the actual process of getting a haircut. The shops are open in NJ at present, but at limited capacity, by appointment, and with a variety of plastic barriers. Fancier salons and haircutters always have required appointments, of course, but I go to a traditional barbershop, striped pole and all. It’s the same one I’ve used for more than four decades, though the most recent cut was by the grandson of the fellow with the scissors the first time I sat in one of the shop’s chairs. 

The traditional red-striped barber pole, by the way, is a holdover from the days of barber-surgeons. The first guild of barber-surgeons was formed in 1462 by King Edward IV, who gave the members a monopoly on surgery and barbering in London. I guess the reasoning was that both activities involved cutting so they were more or less the same. The two professions weren’t formally separated until 1745 with the formation of the Company of Surgeons (later renamed the Royal College of Surgeons) and the Company of Barbers. Anyway, bloodletting was a common surgical technique from medieval times until the early 19th century in order to get rid of “bad blood.” (I’m not sure why bad rather than good blood was expected to flow out.) The patient commonly stood and gripped a pole while he bled. So, a pole (blood red) with a spiral wrap of a white linen bandage became an early symbol for a practitioner that potential patients (many of them illiterate) could recognize. The original symbolism soon was widely forgotten, and so the sign (oddly) stuck with the barbers rather than the surgeons after the two went separate ways.

Fortunately, I’ve never experienced bloodletting, intentional or otherwise, at the barbershop I patronize. Yet, as snippets of hair dropped to the floor while I was last there, the peculiarity of human hair came to mind, abetted by the previous night’s reading, which happened to be Hair by Kurt Stenn, a leading follicle expert. Hair isn’t primarily about fashion, despite some nods to the importance of hair styles for social signaling, but mostly about the biology of hair. Under the general term “hair” Stenn refers to all pileous growth whether tresses, beards, or fur.

Humans have the strangest hair of all primates. At first glance, the absence of it on most of the body is the most striking visual difference between us and the other apes – hence Desmond Morris’ classic 1967 work The Naked Ape. Just as weird, we have a mane, which no other primate does. If uncut, a head hair will likely grow anywhere from one to three feet (30 cm to 1 m) long, depending on the person. At that point it falls out and the follicle generates a new one in its place. In a sense it is not quite right to say we are mostly hairless on our bodies. Except for the palms of our hands and the soles of our feet we have hair follicles everywhere. We have, in fact, just as many as do chimpanzees: see Comparative Evidence for the Independent Evolution of Hair and Sweat Gland Traits in Primates in the Journal of Human Evolution. Unlike the thick body hairs on a chimp, however, most of the body hairs growing from humans are so delicate, short, and wispy that they are all but invisible; one needs a microscope to see many of them. On the other hand, we have 10 times more sweat glands than chimpanzees and other apes have.

There may be a common cause for the evolution of increased numbers of sweat glands and reduced hairiness. Both traits are great for shedding heat, which was a big advantage on a hot savanna where our ancestors were highly active, unlike most mammals, which laze around most of the time in between short bursts of energy. (Humans may not be particularly fast, but when in shape we can run or travel at pace nonstop for extraordinary distances by the standards of most mammals.) For this reason, the leading conjecture is that humans lost their thick body hair some 1,800,000 years ago, when Homo erectus pursued a lifestyle (evident in butchered bones at archeological sites) of long distance tracking and hunting. Since clothes apparently date back only 70,000 years for anatomically modern humans (determined by DNA analysis of lice that inhabit clothes), for nearly 2,000,000 years we really were the naked ape. There are other proposed timelines and reasons for reduced hairiness, however. (Darwin suggested sexual selection as the proximate cause, which need not favor a useful trait so long as the trait isn’t seriously harmful – it can even be a little harmful as famously in the case of the peacock’s tail.) DNA studies currently underway are intended to date the changes in hairiness, so in the coming decade we may get a firmer answer from these analyses.

There is another possible cause that I’ve never seen mentioned in the literature (though it might have been) but which I’ll just toss out there. Our hairless bodies and shaggy manes might have made us look scary to other animals. Consider our own first reactions to, say, a mostly hairless mangy bear. It isn’t, “Aw that’s cute.” It’s “What the hell is that and what is wrong with it?” If other predators had a similar (if nonverbal) negative first reaction to us, they might have been more inclined to leave us alone – a definite survival advantage. Few of these conjectures are mutually exclusive, of course. 

Mangy bear

Humans are not keen on their own body hair; modern folk often attack it ferociously. We are, however, obsessively in love with our head hair. Life being what it is, this is the hair that gives us the most trouble. It grays, thins, mats, and falls out while our body hairs get, if anything, thicker, longer, and richer. Shelves of books can be, and have been, written on the social aspects of hair styles, but that is outside the scope of either Stenn’s book or this blog. I’m just happy to still have something left for the barber to cut. The day may yet come when I don’t.

Crosby, Stills, Nash & Young - Almost Cut My Hair



Friday, December 4, 2020

Two for the Read

As the year winds down and the covid-inspired lockdowns, if anything, intensify, there at least is time to catch up on reading. This past week’s selections included some factive fiction and fictive fact. 

Harlan Coben is one of my go-to writers of contemporary mystery/suspense novels, second in my preference only to South African author Deon Meyer. In Coben’s 2019 Run Away, Simon Greene and his physician wife Ingrid try to find their missing adult (college-age) junkie daughter Paige. All too many of us (whatever our own histories of substance abuse might or might not be) have faced the problem of how to deal with an addicted family member or loved one who doesn’t want help if it means giving up drugs or alcohol. Not all of us are as persistent as the dedicated family man Simon. When Paige’s known dealer/enabler Aaron is murdered, Simon becomes a suspect. It is difficult to comment much on a mystery novel without spoiling the mystery. Among the questions raised, however, are how Paige fits in with a string of superficially unconnected murders, another missing person case, a professional hit couple, a rural cult, and deep family secrets. As usual with Harlan, the book is hard to put down once started. Suspense is high and the characters are, despite (or because of) their flaws, relatable. Also as usual with Harlan, he distinguishes without hesitation between ethics and the law.



The second book was a biography of someone whose voice frequently reached out from my stereo speakers in my youth. 

In August 1968, I was 15 and far, far from cool, but at least I had a considerably hipper older sister who introduced into the house trending music that I might not have found on my own until it was passé. Her freshman year at Boston University would begin in less than a month, however, and she would be taking her favorite record albums with her. So I went shopping with a friend at a local record store in Morristown to beef up what would remain on the stereo shelf at home, relying in part on my sister’s advice and in part on that of the store clerk. Once back home, I slipped one of the new vinyl albums out of its jacket, which was covered with the artwork of Robert Crumb. The album was Cheap Thrills by Big Brother and the Holding Company. I still have it 52 (!) years later. The lead singer was Janis Joplin. I sat through the album from start to finish: the first of numerous occasions and a very 1960s thing to do. I bought her previous album on my next trip to the record shop and then picked up the next two (including the posthumous Pearl in ’71) within a few weeks of their release.

Popular music in the 21st century typically is polished, highly produced, visual, and choreographed. This is so unlike the 1960s, when a frowzy Janis simply walked on stage and sang her heart out. There is something about the romantic excess in her songs that appeals especially to adolescents – and for once I don’t mean that as an insult. It appeals to them because they are not cynical yet (much as they might think otherwise) and all of that emotion looms fresh and huge; if we have a little teenager left in us – again not meant as an insult – we still can recapture a bit of that from singers like Janis.

A dark side to Janis – not unique to the 1960s – was substance abuse: notably Southern Comfort onstage and heroin backstage. Southern Comfort is also something one tends to grow out of after adolescence btw: too sweet and syrupy straight-up, but OK as a mixer with Coke or something. Like all liqueurs, it also gives mean hangovers – or so, ahem, I’m told. Hangovers by themselves would have been survivable, however. Janis died in October 1970 from an overdose of unexpectedly pure (only 50% adulterated) heroin.

Holly George-Warren’s book (Janis: Her Life and Music) covers the ground one expects a bio to cover. All biographers going back to Plutarch insert their own perspectives into their narratives, and George-Warren does, too, but not overly distractingly. (Autobiographers on the other hand, while their perspectives can’t help but be their subject’s own, are more apt to outright lie, especially by omission.) The book is well researched and details the wrinkles of Janis’ lifeline. Yet, despite the much briefer account contained in it, I prefer the memoir Laid Bare by actor/author John Gilmore who knew the woman personally. Though he writes of Janis in only one chapter, and intensely from his own perspective, we put the book down feeling we’ve met the person – and have mixed feelings about having done so. A third option is simply to let Janis speak for herself in her music. That works, too.

Janis Joplin – Get It While You Can (1970)




Friday, November 27, 2020

Casting a Shadow

Yesterday the weather was suitable to eating outside, so I enjoyed my Thanksgiving turkey at my picnic table. In lieu of my usual one or two dozen guests, I shared the meal with two stray cats who were attracted by the aromas. They instantly made themselves my best buddies. They returned for leftovers today but will be aloof again when the turkey runs out.

 

That overfull feeling

Afterward, I spent much of the day with a book and movie, which (along with sleeping off a turkey coma) is a common way of dealing with the aftermath of the meal. There was, of course, plenty on TV that was holiday related: not just Thanksgiving fare (A Charlie Brown Thanksgiving; Planes, Trains and Automobiles; et al.) but the first batch of Christmas fare (e.g. Miracle on 34th Street) as well. Special holiday TV programming seems a bit unnecessary in an age when you can stream (or, if you’re old school, play on DVD) pretty much anything anytime. Stations offer it anyway. Since 1956, The Wizard of Oz has been aired on American TV on Thanksgiving, though it really has nothing to do with the holiday. Yesterday it ran continuously on TBS. I didn’t watch it. It’s not that I dislike it or the other offerings (well, some of them I do). I’ve just seen them enough. For the same reason, my book of choice was not A Christmas Carol.

2020 demands something darker. The times in general do. This may seem an off-topic digression, but I’ll tie it in: Freud developed his theories in the Victorian era of strict moral codes. The codes didn’t merely restrict behavior (not altogether successfully) but dictated proper thoughts as well. To the extent people internalized these dictates, they felt guilty about their own (humanly bestial) thoughts and developed neuroses of the type that so fascinated Sig. The thoughts, desires, and motives were perfectly normal, of course. Freud spoke of motivations from the moral-free id. Carl Jung talked of the shadow self: the hidden side of one’s nature including dark elements that we don’t display in civilized society (unless we are sociopaths). Rather than deny the shadow’s existence or repress it, Jung emphasized the importance to psychic health of accepting it and integrating it into one’s whole personality. Not to accept it is to deny one’s own humanity; the shadow will express itself anyway – perhaps by cruelly moralizing to others. Don’t let the shadow drive the car, he tells us in essence, but recognize without guilt that it will always be a passenger. Existentialist philosophers also emphasized the primacy of action over mind; what matters is what you do, not what you think. All of these shrinks and thinkers aimed to increase human happiness by freeing people from the whole notion of (to lift a term from Orwell) thoughtcrime. In the 21st century strict moral codes – this time political in origin – are returning; for those who internalize the dictates, thoughtcrime has returned as a notion as well. Art that challenges this (e.g. Dexter) has one mark in its favor for that reason alone. 

So, my book choice was one to tickle the shadow self. The author who goes by the nom de plume Delicious Tacos (by his own account to avoid losing his day job over something some boss or co-worker reader might find offensive) apparently has a substantial online following, but I never heard of him until a couple months ago when I picked up Finally Some Good News, his post-apocalyptic novel. Yesterday I tried Savage Spear of the Unicorn. This is a collection of short stories, many of them barely disguised autobiography. “The mass of men lead lives of quiet desperation,” said Thoreau. Tacos may live in desperation, but in his writings he is anything but quiet. He hands his Jungian shadow self the keyboard and lets it scorch pages with the scabrous brutality of the way people actually think – even (maybe especially) the ones who speak and act politely. He spews utter frustration at a soul-crushing world in which we buy a car to work boring hours at an awful job just so we can pay for the car; he tells why he will die alone in a time when the sexes no longer even know how to talk to each other. Even in his tamer stories his characters are all too human – and not in an admirable way. For example there is a tale (*SPOILER* follows) of a young woman rescued from poverty by a mysterious benefactor who gave her $1,000,000. Having achieved her dreams, she dedicates herself to finding the anonymous benefactor who changed her life. At last she tracks down the aging wealthy recluse philanthropist and asks him, “Can I have more money?” 


The author is rude, crude, and harsh, and most definitely not for anyone who believes in thoughtcrime. Yet, beneath it all he writes surprisingly well: a Henry Miller for our time (who, of course, was banned in his). 

The movie of choice was one I saw mentioned on a YouTube list of underrated movies from the past decade. I’m not normally a big fan of comic horror films (unless written by Joss Whedon) but based on this list I decided to give it a chance. It seemed suitably dark. Made in 2014, Cooties is about a pandemic started in a chicken-nuggets factory and spread around in school lunches. The virus affects only children, who are turned into murderous savages. The film is rated “R” (as it should be for scenes such as kids using a teacher’s head as a soccer ball) so the kids who act in it can’t legally watch it. It is a low budget campy gore fest unapologetically playing with the usual horror tropes. The humor is silly rather than clever, yet somehow the result is surprisingly genial. The film stars Elijah Wood, Rainn Wilson, and Alison Pill. I don’t think Cooties is remotely underrated. That said, watching it was (this year at least) preferable to watching Dorothy skip down the Yellow Brick Road yet one more time. Somehow it seemed right for Thanksgiving 2020.

Now we are in the next holiday season and are hounded to buy, buy, buy on all media platforms. I haven’t yet selected any books or movies for the next few weeks, but this year It’s a Wonderful Life probably won’t be among them.




Friday, November 20, 2020

The Bird Is the Word

A bird on the table is worth four in the air

My usual Thanksgiving dinner of 12-to-20 guests is canceled this year due to you-know-what and the state directive to restrict dinner companions to those “with whom you’ve been with throughout this pandemic.” That would be myself and two vegans. Um…yeah. My dis-invitations to the usual suspects went out a few days ago. I still plan on consuming the usual poultry and accompaniments by myself largely for reasons of sensory nostalgia, but the turkey will be a small one.

As a kid, I counted Thanksgiving as my favorite holiday. Oh, the playfulness surrounding Halloween made October 31 a close second, but back then (as now) the license to gorge on a full-blown feast tipped my scale of good things toward the 4th Thursday in November – despite how it subsequently tipped the bathroom scale. Besides, the holiday always falls within a few days of my birthday (sometimes on it), and there always has been enough Narcissus in me to regard the holiday as a sort of extension of that.

Thanksgiving feasts were common in the American colonies as harvest festivals – first by the Spanish and French in the 16th century and later in the English colonies. A Thanksgiving (by that name) held in Virginia in 1607 preceded the “first” one in Plymouth colony by 14 years, though the latter eventually caught on as the mythic origin story. The holiday was celebrated sporadically in the 17th and 18th centuries. It was first declared a US national holiday by George Washington as a one-off event in 1789. This was the first year of his Administration under the new Constitution. He made no mention of the Pilgrim story. Rather, the holiday was for Americans to give thanks for “an opportunity peaceably to establish a form of government for their safety and happiness.” Thanksgivings were declared intermittently in subsequent years by presidents and state governors. In 1863, in the midst of the Civil War, Abraham Lincoln made the holiday permanent. FDR gave it one final tweak: he fixed the holiday on the 4th Thursday of the month instead of, as previously, the last (some Novembers, of course, have five Thursdays) as an economic measure in order to extend the Christmas shopping season.

As a kid I cared nothing about any origin myth. As an adult I don’t care more. It was always about food, friends, and family. It was convivial. Unsurprisingly, the ancient Roman term for such an event was convivium. One festival called Mercatus Romani was held in late November and was accompanied by convivia; “mercatus” means “commerce” and accordingly the festival involved street markets and seasonal shopping, which has a modern ring to it too. 

Yet, the roots of Thanksgiving-like late-autumn banquets are far older than the Romans. There is something deeply atavistic about a ritual blood sacrifice at this time of year – in the modern version, commonly a turkey. One may point out that only the bird is sacrificing anything since the rest of us get to eat the critter, yet even this aspect of sacrifices has deep origins. Hesiod in Theogony (c. 700 BCE) tells how the already age-old practice came about of people placing bones on altar fires for the gods while keeping the meat for themselves. When humans’ relationship with the gods was still unsettled, Hesiod tells us, an ox was slaughtered and divided into two portions in order to decide how a sacrifice should be split between people and the gods. Prometheus (who had a soft spot for humans – go figure) made the division. For one portion he stuffed meat and fat in the ox’s gross, slimy, ugly stomach; for the other he sewed up the ox’s bare bones prettily inside the skin. He then let Zeus pick between the two. Zeus picked the pretty pile. He was irate at having been tricked into choosing bones but he didn’t go back on the deal. (There is more to the story involving fire and the punishment of Prometheus, but let’s not get sidetracked.) I won’t be offering up leftover turkey bones on an altar. They’ll be tossed in the woods for the delectation of the local wildlife (the remains are always gone by morning), but that is an offering of sorts to Nature.

In any event, while the upcoming November 26 may be shy on conviviality, the date is arbitrary anyway. My current thought is to do a “Thanksgiving in February” (or whenever) for the usual crowd when vaccines and growing herd immunity make that feasible… and not likely to prompt a visit from the police. Turkeys beware. 

A Vegetarian Option



Friday, November 13, 2020

A Clash of Verses

Istanbul #2461

Pursuant to some comments the other day on an online book chat group to which I belong, I revisited some of Aleister Crowley’s poetry at Poemhunter. Playing on the stereo in the background was Bob Dylan’s 2020 album Rough and Rowdy Ways. Normally I can ignore background sounds when concentrating on some other task, but music and poetry are so closely related that on this occasion they clashed in my head. I turned off the stereo. Yet, that very act also distracted me from Crowley, whose verses I also clicked off and will revisit some other time. The question of how the two forms are similar yet different nagged at me instead. It is the sort of question that once required a library trip and an immersion in card catalogs and microfiche files to explore. It’s a trifle easier now.
 

The facile answer, of course, is that songs are sung and are (usually) accompanied by music while poetry (usually) is not. Poems might not be an auditory experience at all but just read on a page. But while that distinction is both obvious and key, there is more to it than that. When reading lyrics, it becomes clear pretty fast that Carole King did something different from Robert Frost.

To be sure, there is overlap, most obviously in rap (Kanye West’s All Falls Down in original form was a poem he recited at a poetry jam in 2003) and the once-popular recitation songs in C&W (e.g. Ringo by Lorne Greene). Songs and poems alike are verse, and both can impact us with an economy of words that prose can’t match. (The impact is visible on brain scans: see the 2017 NIH paper The Emotional Power of Poetry: Neural Circuitry, Psychophysiology and Compositional Principles.) Indeed, songs are a part of every culture for this reason. The very first literature (not business contracts and tax records, which came earlier, but literature) is poetry. Yet there is a difference. In his acceptance speech for the Nobel Prize in Literature, Bob Dylan (though he took the prize money) pretty much told the committee that it had made a mistake:

“But songs are unlike literature. They’re meant to be sung, not read. The words in Shakespeare’s plays were meant to be acted on the stage. Just as lyrics in songs are meant to be sung, not read on a page. And I hope some of you get the chance to listen to these lyrics the way they were intended to be heard: in concert or on record or however people are listening to songs these days.”

Matthew Zapruder in The Boston Review said something similar: “Words in a poem take place against the context of silence (or maybe an espresso maker, depending on the reading series), whereas, as musicians like Will Oldham and David Byrne have recently pointed out, lyrics take place in the context of a lot of deliberate musical information.”

There is a history of snobby professors (most often without music skills) valuing poetry over lyrics, which is why one is taught in school while the other generally isn’t. Yet the latter in toto are, if anything, more complex, requiring musical sensibility as well as a facility with words. So, while one can sing a poem or recite a song, something typically feels off about it, as though shoe-horning one art form into another. It is why Steve Allen’s recitation of rock lyrics on The Tonight Show in 1957 still holds up as comedy. What matters is whether the music is integral to the effect – whether you lose something essential by removing it. 

Bob Dylan referenced Shakespeare, which provides a good excuse to use him for illustration. Some of Hamlet's lines were sung in the musical Hair (Will’s iambs make that easy: I once saw a musical Macbeth with a blues score), yet are plainly still poetry. Compare them with Sigh No More, which Shakespeare intended from the beginning to be a song in Much Ado about Nothing. The difference is unmistakable. 

In my case these musings are purely academic, since my own attempts at verses (whether lyrics or poems) always have had clunky results, to put it kindly. (I prefer to talk about myself kindly: one can’t always count on others to do so.) My sister was the poet of the family. I posted some of her verses at Echoes of the Boom. If she ever wrote lyrics, I’m not aware of them. 

In any event, the question now has ceased to nag. It’s time to return to the stereo and Poemhunter – but not both at the same time.

Steve Allen (1957)



Gene Vincent & The Blue Caps - Be Bop a Lula




Friday, November 6, 2020

Serenity

Decades of research link major stressful life events to increased risk of sickness and death during the events themselves and in the ensuing year; the risk extends to everything from cardiovascular disease to common infections. It is not established that stressful events increase the chances of a person catching or developing a disease in the first place; rather, they make the expression and progression of it worse. That is to say, they depress a body’s immune responses and general resiliency. Among the major personal events associated with increased risk are divorce, death of a loved one (spouse, parent, sibling, et al.), a move to a new place of residence, a financial crisis whether from the loss of a job or some other source, and even (sometimes) the loss of a pet. Most people, it is important to note, do not get sick in the wake of these events. Once again, the events probably don’t cause disease; they just exacerbate it. Whether we get ill or not, we soldier on as best we can. 

I had my own annus horribilis in the year straddling parts of 2000 and 2001. I checked every one of the boxes for the risky events mentioned above: got divorced, moved, struggled with finances, and lost both parents – and a pet. By luck I avoided getting sick on top of it. The coronaviruses going around back then were of the common cold variety and they happened to pass me by. It is a year that sticks with me though, informing a larger part of my later-adult identity than any year since. It comes to mind today because I passed Hilltop Cemetery on the way to the post office this morning. This is not unusual. The cemetery is three miles (5 km) from my house and is on a route I travel a few times per week. It is not my common practice to stop there even though my parents and sister are buried there. (Yes, there is room for one more; my parents were kind enough – if that is the right expression – to purchase a spot for me.) My mom didn’t see any value in cemetery visits. “Give your flowers to people when they are alive,” she always said. I’ve taken her advice to heart, though on rare occasion (maybe once or twice per year) I stop briefly only to see all the familiar names. I grew up in this town and know more of the local people six-feet-under in that place than I do of those walking above ground – most of the latter being relative newcomers. Anyway, I took more notice than usual while passing there this morning for two reasons. First, “an ongoing narration” about Hilltop Cemetery written by an old friend of my parents (he introduced them in high school) came into my possession via my aunt a few days ago. Second, I noticed on my car’s display that the date is November 6. That is the date my mom died 19 years ago.

Last photo taken of my mom

We always remember the date a parent dies. (No, it doesn’t seem that long ago.) Everything internal changes at that point even if the outer trappings of our lives (where we live, where we work, how we live) do not. While a parent lives, a part of oneself (no matter what age) is always someone’s child. Afterward, that identity disappears. We are forever the adult in the room, whether we choose to act like one or not. There is also no escaping a greater sense of mortality.
 

My grieving days are long past, so I doubt I’m at any elevated risk from it should I happen to wrestle with this year’s coronavirus, a bug that has brought so much more disruption into our daily lives than the ones current in 2001. To be sure, moments of nostalgia do and always will recur from time to time (such as today). Even though Civil War doctors often listed “nostalgia” as a contributing (sometimes only) cause of death, I doubt I’ll be dying of that either. That doesn’t mean I’ll forget. I won’t…and for all this year’s unpleasantness, I’m grateful that 2020 is (for me, at least) more serene than two decades ago.

 

Just a classic Glenn Miller number to which I remember my parents dancing:

Sunday, November 1, 2020

Artificial Reckoning

One of the books on rotation on my bedtable this past week was Artificial Intelligence: The Quest for the Ultimate Thinking Machine by Richard Urwin. This is by no means a technical manual but (unlike most general audience books on the subject) neither is it entirely simplistic. Urwin describes, with examples, the various approaches to Artificial Intelligence: fuzzy logic, subsumption architecture, swarm intelligence, evolutionary computing, etc. He explains how each is suited to particular contexts and how they could work in concert in general intelligence. The book provides some brief and basic insight into the minds of our future robot overlords. Just kidding…maybe. Networked AIs are everywhere in our lives today. They recommend books, adjust our furnaces, and trade our stocks. Even modern toasters often have remarkable computing power, not because they need it to toast bread but because chips are cheap and can provide extra features. (Back in 2016 hackers exploited this by hiding computer malware in household appliances with wireless capability.) It is helpful to understand a little about them and how they operate.


The three types of AI are Pragmatic, Weak, and Strong. The first is purely task oriented: as simple as a Roomba or as complex as a self-driving Tesla. At the upper end these AIs might approach the intelligence of an insect; they don’t need more than that to do their jobs. The second type, requiring hefty computational speed and power, simulates general intelligence, e.g. IBM’s Jeopardy champion Watson, some of the more sophisticated medical diagnostic programs, and the conversational robot Sophia by Hanson Robotics. The key word is “simulates.” They do not think in the way people do. Using (sometimes evolving) algorithms they plow through millions of possible responses to a query and find the best match with machine efficiency – and machine cluelessness. There is nothing aware about them. Strong AI would think like a person. Strong AI is aspirational rather than something that yet exists, but many researchers are working with artificial neural nets, genetic algorithms, and other technologies with this as an ultimate goal. It is an open question whether such AI ever could be conscious, defined as that meta-state of not only knowing but knowing that one knows.
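For the programming-inclined, a toy sketch in Python (my own illustration, not anything from Urwin’s book) makes that machine cluelessness concrete: score some canned responses against a query and return the best match. The canned responses and the word-overlap scoring here are invented for the example; real systems use vastly larger candidate pools and subtler scoring, but the principle – matching without understanding – is the same.

# A toy "Weak AI": pick the canned response that shares the most words
# with the query. Efficient, deterministic, and entirely clueless.

CANNED_RESPONSES = [
    "Watson won Jeopardy by ranking candidate answers.",
    "A Roomba gets by on simple obstacle-avoidance rules.",
    "Sophia picks conversational replies from scored templates.",
]

def overlap_score(query, candidate):
    """Count the words the query and a candidate response share."""
    return len(set(query.lower().split()) & set(candidate.lower().split()))

def best_match(query):
    """Return the highest-scoring canned response to the query."""
    return max(CANNED_RESPONSES, key=lambda c: overlap_score(query, c))

print(best_match("who won jeopardy"))  # prints the Watson line

Nothing in those dozen lines knows what Jeopardy is; it merely counts matching words and takes the maximum.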

To a user, a sufficiently sophisticated Weak AI (one that acts as though it is conscious) would be hard to distinguish from a Strong AI, but there would be a difference from the AI’s perspective. It would feel like something to be a Strong AI; not so Weak AI, which doesn’t feel anything. Weak AI doesn’t have a perspective any more than your desktop calculator has one. More than a few sci-fi plots (e.g. Ex Machina) center on the difference.

In science fiction, machine consciousness usually ends badly for people, whether it is HAL deciding to off Dave, Skynet deciding to off the human species, or Cylons being the ultimate rebellious offspring. It was the plot of the very first drama to use the word “robot”: R.U.R. (1920). Despite the semiautonomous robots proliferating on the battlefield – the next generation fighter aircraft probably will be unmanned – this doesn’t worry me much. The people giving the machines their objectives and directions always have been and always will be more dangerous. AIs are very unlikely to develop animus toward people of their own accord; they don’t, so to speak, have skin in the game. Plenty of humans, though, have animus toward their own kind. Some want to destroy people with different beliefs or characteristics while others want to destroy everybody. Witness the apocalyptic cult Aum Shinrikyo, whose members sarin-gassed the Tokyo subway back in the 90s as an intended step toward bringing about the end of the world. Grumpy AI is pretty far down the list of potential risks.

Charles Stross described another possible future in his novel Saturn’s Children in which the characters are humaniform robots. In Stross’ book the robots inadvertently had long ago killed off the human race through love, not war. Humans so preferred their AI love-bots to other humans that the species simply died out. This has a strange credibility. The fantasy of robot love recurs repeatedly in books, songs, and movies over the past century. A few examples from among countless: the movie Metropolis in 1927, Connie Francis’ hit single Robot Man 60 years ago, the movies Cherry 2000 and Making Mr. Right back in the 80s, and t.A.T.u.’s Robot in the 2000s. Today, simulated lovers can be created on computer games such as New Love Plus+ and phone apps such as Metro PD: Close To You. Many gamesters already prefer them to the real thing. Combining these creations with life-size physical robots can’t be far away.

If humans are to disappear, I suppose there are worse ways to go. Meantime, I’m reminded of Robert Frost’s poem (A Considerable Speck) in which he notices a mite on the page in front of him.

 

“I have a mind myself and recognize

Mind when I meet with it in any guise

No one can know how glad I am to find

On any sheet the least display of mind.”


My take on AI is much the same.

 

Hanson Robotics’ Sophia




Sunday, October 25, 2020

Can You Keep a Secret?

People vary in their definitions of success in life, but one classic vision still resonates with many: an upscale income to finance an upscale suburban home in an upscale neighborhood shared with a desirable spouse and two kids whom you drop off at a desirable school in one of the two Audis you park in the garage. It’s called “having made it” or “living the dream.” At least it’s called that by enough people to keep real estate prices in those upscale neighborhoods frothy. By most definitions the upper middle class consists of about 20% of the population, but in a populous country (such as the US with 332,000,000) that’s a lot of people and a lot of neighborhoods. (I live in one of them even though I don’t check those other boxes and have no desire to do so.) The risk of defining success and self-worth in this manner is that the way of life is fragile. Finances, health, relationship circumstances and other factors can alter radically and quickly. How far might some people go to defend against threats to their lifestyle – or just to their chance of achieving it in the first place? Sometimes the threat can be a simple secret. 


Harlan Coben, native of Livingston NJ (very much one of those upscale neighborhoods), writes consistently good page-turning thrillers and mysteries. The Stranger (made into a 2020 Netflix series that I haven’t seen) is a little different from most of his books, but is still intriguing. The central protagonist is Adam Price who checks all of the “having made it” boxes in the upscale (fictional) suburb of Cedarfield NJ; a dad of another player on his son’s high school lacrosse team actually says to him, “We’re living the dream.” His comfortable life is then threatened by a secret – or rather by the revelation of one. A stranger walks up to him at a bar and tells him his wife Corrine had faked a pregnancy and miscarriage; he tells him exactly which old credit card charges to check online to verify the story. The stranger then raises with him the question of paternity since one big lie makes another more plausible. Adam is not alone in hearing life-upturning secrets from the stranger. One woman gets the news that her daughter, away at college, is an escort. Others hear secrets ranging from sexual indiscretions to financial improprieties. It seems that a few young computer-savvy entrepreneurs (including the stranger) are using the resources of modern tech to discover dark secrets and make money from them. This proves more dangerous than they anticipated.

The yarn is a good one in its own right, but it raises unsettling questions about privacy in the modern world. Everyone has secrets. Some are tame and some are not. Some are not even ours, but secrets we keep on behalf of friends. They might include financial secrets (e.g. hiding a bank account from a spouse), theft, infidelity, deception, romantic desires (even if unpursued), or outright criminal behavior. Sometimes secrets are kept out of a sense of shame, and these can be burdensome to the secret-holders. Even those who shamelessly reject conventional morality and are perfectly comfortable with their secrets may still fear the practical personal, social, employment, or legal consequences if the secrets are revealed. Such fears are often justified. 

Yet, as many have learned to their cost, secrets are difficult to keep against a determined investigator in an ever more monitored world. The tools for tracking people have multiplied over the past two decades. The location of cell phones can be tracked by a simple app. EZ Pass registers the times and places of road and bridge tolls. Police cars automatically scan license plates of passing vehicles. Security cameras proliferate. Where were you last April 17th at 10:20 a.m.? It’s very likely this can be determined. Nothing posted or stored online (or even on an individual computer) is ever really secure. Nothing is ever safely deleted – someone may have a copy somewhere. It is entirely possible for a 30-year-old to be suddenly confronted with something stupid he or she posted or emailed in high school. (There are certain advantages to being old enough for one’s teen years to have predated the World Wide Web.) Then there is DNA testing, which is not just a forensic tool anymore. DNA test kits that sell for under $100 can be bought at any pharmacy. They come with warnings that the tests may reveal uncomfortable family secrets. About 30% do: not just the obvious paternity question but grandparental surprises, sibling surprises, and even some maternity surprises.

So secrets aren’t what they used to be. Nowadays, most people keep their secrets only because they aren't worth anyone else’s bother to dig them out. What if they were? That eventuality didn’t work out well for Adam Price and Corrine. It probably wouldn’t work out well for us either.


The Pierces – Secret




Sunday, October 18, 2020

When Pumpkins Grin

It’s that month again. Like all major holidays since I was a kid, Halloween has expanded its domain. It now rules the entire month of October and stretches Cthulhu-like tendrils back before the equinox all the way to Labor Day when Halloween candy suddenly appears on supermarket shelves. (For non-American readers, Labor Day is the first Monday in September.) Though Halloween did not originate in the US, the American way of celebrating it (well over a century old) has become popular even in places (such as Japan) completely removed culturally from the holiday’s Celtic origins. Perhaps this is not surprising. Around the world there are parallel notions of a time of the year when the boundary between the living and the dead is permeable and ghosts walk the earth more freely than usual, such as the Day of the Dead in Mexico or the Hungry Ghost Festival in East and Southeast Asia. Some celebrations of those parallel holidays are hardcore, such as ritual disinterment of the remains of family members for home visits in Indonesia. (See From Here to Eternity by Caitlin Doughty for a description of this and other death-related rituals.) I can see how carving pumpkins, dressing up spookily for fun, and cadging candy could make inroads as alternative activities – or at least as additional ones. Many Halloween costumes seem far removed from the influence of the graveyard, but cavorting with ghosts and tweaking the nose of Death still remain at the core of the holiday.

Ghosts are part of the mythic heritage of every people on every continent. What surprises those of a skeptical bent is how much they remain a part of modern belief systems. In a 2019 US study 45% of adults admitted to believing in ghosts, defined as spirits of the dead who can manifest to living people. (A solid majority believe in spirits if you count spirits who have moved on to someplace else.) Another 20% are unsure. This is pretty typical of advanced countries including ones that are highly secular in the usual sense. This may be an undercount since some people are embarrassed to admit to believing paranormal things. Counterintuitively, belief in ghosts rises with education. In a 2006 study by Bryan Farha at Oklahoma City University and Gary Steward Jr. of the University of Central Oklahoma (reported in the Skeptical Inquirer) 23% of college freshmen believed in the general gamut of the paranormal including astrology, clairvoyance, and ghosts (40% believed in haunted houses specifically, with another 25% unsure), while 31% of college seniors did, and 34% of graduate students did. Science majors were no more skeptical than other students. From Science Education is No Guarantee of Skepticism (2012) by Richard Walker, Steven J. Hoekstra, and Rodney J. Vogl: “Across all three samples, the correlation between test scores and beliefs was non-significant... In other words, there was no relationship between the level of science knowledge and skepticism regarding paranormal claims.”

Choosing a seat at my kitchen table

Ghosts have never been a part of my personal belief system. When I was a child, my parents told me ghosts didn’t exist, and I believed them as uncritically as I believed their story that Santa Claus did exist. By the time I was able to think more critically for myself I saw no reason to change my mind – about the ghosts, that is. Yet, around half of my friends (including highly capable professionals) to this day are believers or at least are unwilling to say they are disbelievers. About a quarter tell me my house is haunted with two claiming to have seen apparitions. My house in the woods makes a lot of creaking, knocking, and groaning sounds as it heats and cools; the floorplan and lighting result in odd shadows – some of them cast through windows from trees that move when the wind blows. I don’t think about the sounds and umbrations except when teasing guests, e.g. “Don’t worry, the troll downstairs gets restless but he is securely chained.” Not all of them are amused. There is at least one grown man who doesn’t like to go to my basement alone.

I’m not immune to the creepiness of shadowy houses at night. The scariest Halloweens of my life have been in them. My dad was a builder and construction sites have a special appeal to marauding teens on Halloween (and the night previous). They didn’t always confine themselves to spreading toilet paper and soaping windows. They sometimes did substantial damage including breaking windows, spray-painting obscenities, and slashing tires on construction vehicles. So, by my later teens I had been drafted into guard duty at unfinished houses on the last two nights of October. If you want to experience a spooky Halloween, spend it alone (with no cell phone) in a dark half-built house on a wooded lot at night. It wasn’t ghosts that worried me, of course, but the possibility of confronting beings with pulses. They are always the greatest hazard at any place or time.

In my own home, no ghost ever has done me harm, so if I’m wrong about their nonexistence, I figure at least they are friendly. Perhaps they’ll even do me the favor of scaring off some marauders with pulses.

 

Bessie Smith – Haunted House Blues (1924)



Sunday, October 11, 2020

Good Enough

I grew up with films of the 30s through 50s: what we now consider classic movies. At some hours there wasn’t much else on TV. Until the end of the 1960s TV stations, even though there were just a handful in each market, were scrounging for content. Some stations (even major ones) simply went off the air at night because they had nothing to broadcast. Others, however, during non-prime hours of the day and late at night played classic films, the rights to which they had acquired for a song. Nowadays a youngster always can find something more suitable to a kid’s taste any time of the day or night on one of 200 other TV channels – or online. Back then we settled. 


These old films varied a lot in quality. The bulk of them were B-movies in every sense, but I wasn’t a particularly precocious youngster and so wasn’t a very good judge. It is interesting to revisit them now with an adult eye. Even some of the bad ones still have a nostalgia value for me because I remember having watched them at age 10 when I should have been in bed. I re-encountered one such film on TCM the other day that I barely remembered from youth. Lured (1947) is a noir mystery in which a pre-I Love Lucy Lucille Ball plays a taxi dancer who cooperates with Scotland Yard by acting as bait to catch a serial killer. In truth, the movie is contrived and not very good, but it isn’t altogether bad either. It is good enough to enjoy for 102 minutes.

As I’ve grown older I’ve become a fan of the good enough. To be sure, I appreciate excellence as much as anyone, but that is a rare commodity and not always worth the cost. Voltaire warned that “the best is the enemy of the good.” He meant, of course, that demanding nothing but the best (including from oneself) may mean you don’t get or achieve anything at all. It usually means that. We are often discouraged from writing or painting or building something by our own concern that we won’t be great. The truth is, we’re probably right – especially by contemporary standards. Kurt Vonnegut once remarked that in a Neolithic village the totem carver was Michelangelo as far as the villagers were concerned. The rock painter was Picasso and the campfire singer was Elvis. In the modern global village we are up against world class performers, not just a hundred locals. It accordingly is easy to become discouraged by comparing ourselves to them. Excelling on a global scale is tough. 

Even by Roman times this was an issue. Vergil was so unhappy with the Aeneid that on his deathbed he directed the manuscript be destroyed. Fortunately, his dying wish was ignored and this classic of Western literature survives. In 1908 Monet destroyed a number of his paintings: “I know that if they are exhibited, they’ll be a great success, but I couldn’t be more indifferent to it since I know they are bad.” So, sometimes we are wrong about our own work. However, even if our judgment is right (as is more likely), it’s worth giving whatever we like to do a go anyway. Maybe it won’t be great. It probably won’t be. Yet it might be good enough to bring pleasure to oneself and perhaps to others. Once again, there is something to be said for that. Just yesterday I did a good enough job of mowing the lawn. The result won’t win any landscaping awards but it doesn’t look bad either – and it was cheaper than hiring professional lawn care. 

I’ve had occasion to contemplate what a life filled with good enough can be like: a good enough job, a good enough car, good enough home, and, yes, good enough relationships. It sounds like a pretty happy one to me. I’ve experienced stretches of life filled with extreme bests and extreme worsts, and I’d be willing to trade. 

Maybe I’ll check what is playing this evening on TCM. If it’s something like The Philadelphia Story (definitely a best), that’s wonderful. But if it’s something good enough like The Falcon’s Alibi, that is OK, too.

Tom Petty & The Heartbreakers – Good Enough