Friday, November 27, 2020

Casting a Shadow

Yesterday the weather was suitable for eating outside, so I enjoyed my Thanksgiving turkey at my picnic table. In lieu of my usual one or two dozen guests, I shared the meal with two stray cats who were attracted by the aromas. They instantly made themselves my best buddies. They returned for leftovers today but will be aloof again when the turkey runs out.

 

That overfull feeling

Afterward, I spent much of the day with a book and a movie, which (along with sleeping off a turkey coma) is a common way of dealing with the aftermath of the meal. There was, of course, plenty on TV that was holiday-related: not just Thanksgiving fare (A Charlie Brown Thanksgiving; Planes, Trains and Automobiles; et al.) but the first batch of Christmas fare (e.g. Miracle on 34th Street) as well. Special holiday TV programming seems a bit unnecessary in an age when you can stream (or, if you're old school, play on DVD) pretty much anything anytime. Stations offer it anyway. Since 1956, The Wizard of Oz has aired on American TV on Thanksgiving, though it really has nothing to do with the holiday. Yesterday it ran continuously on TBS. I didn't watch it. It's not that I dislike it or the other offerings (well, some of them I do). I've just seen them enough. For the same reason, my book of choice was not A Christmas Carol.

2020 demands something darker. The times in general do. This may seem a digression, but I'll tie it in: Freud developed his theories in the Victorian era of strict moral codes. The codes didn't merely restrict behavior (not altogether successfully) but dictated proper thoughts as well. To the extent people internalized these dictates, they felt guilty about their own (humanly bestial) thoughts and developed neuroses of the type that so fascinated Sig. The thoughts, desires, and motives were perfectly normal, of course. Freud spoke of motivations from the moral-free id. Carl Jung talked of the shadow self: the hidden side of one's nature, including dark elements that we don't display in civilized society (unless we are sociopaths). Rather than deny the shadow's existence or repress it, Jung emphasized the importance to psychic health of accepting it and integrating it into one's whole personality. Not to accept it is to deny one's own humanity; the shadow will express itself anyway – perhaps by cruelly moralizing to others. Don't let the shadow drive the car, he tells us in essence, but recognize without guilt that it will always be a passenger. Existentialist philosophers also emphasized the primacy of action over mind: what matters is what you do, not what you think. All of these shrinks and thinkers aimed to increase human happiness by freeing people from the whole notion of (to lift a term from Orwell) thoughtcrime. In the 21st century, strict moral codes – this time political in origin – are returning; for those who internalize the dictates, thoughtcrime has returned as a notion as well. Art that challenges this (e.g. Dexter) has one mark in its favor for that reason alone.

So, my book choice was one to tickle the shadow self. The author who goes by the nom de plume Delicious Tacos (by his own account to avoid losing his day job over something a boss or co-worker might find offensive) apparently has a substantial online following, but I had never heard of him until a couple months ago when I picked up Finally Some Good News, his post-apocalyptic novel. Yesterday I tried Savage Spear of the Unicorn, a collection of short stories, many of them barely disguised autobiography. "The mass of men lead lives of quiet desperation," said Thoreau. Tacos may live in desperation, but in his writings he is anything but quiet. He hands his Jungian shadow self the keyboard and lets it scorch pages with the scabrous brutality of the way people actually think – even (maybe especially) the ones who speak and act politely. He spews utter frustration at a soul-crushing world in which we buy a car to work boring hours at an awful job just so we can pay for the car; he tells why he will die alone in a time when the sexes no longer even know how to talk to each other. Even in his tamer stories his characters are all too human – and not in an admirable way. For example, there is a tale (*SPOILER* follows) of a young woman rescued from poverty by a mysterious benefactor who gave her $1,000,000. Having achieved her dreams, she dedicates herself to finding the anonymous benefactor who changed her life. At last she tracks down the aging, wealthy, reclusive philanthropist and asks him, "Can I have more money?"


The author is rude, crude, and harsh, and most definitely not for anyone who believes in thoughtcrime. Yet, beneath it all he writes surprisingly well: a Henry Miller for our time (who, of course, was banned in his). 

The movie of choice was one I saw mentioned on a YouTube list of underrated movies from the past decade. I'm not normally a big fan of comic horror films (unless written by Joss Whedon), but based on this list I decided to give it a chance. It seemed suitably dark. Made in 2014, Cooties is about a pandemic started in a chicken-nuggets factory and spread through school lunches. The virus affects only children, turning them into murderous savages. The film is rated "R" (as it should be for scenes such as kids using a teacher's head as a soccer ball), so the kids who act in it can't legally watch it. It is a low-budget, campy gore fest unapologetically playing with the usual horror tropes. The humor is silly rather than clever, yet somehow the result is surprisingly genial. The film stars Elijah Wood, Rainn Wilson, and Alison Pill. I don't think Cooties is remotely underrated. That said, watching it was (this year at least) preferable to watching Dorothy skip down the Yellow Brick Road yet one more time. Somehow it seemed right for Thanksgiving 2020.

Now we are in the next holiday season and are hounded to buy, buy, buy on all media platforms. I haven’t yet selected any books or movies for the next few weeks, but this year It’s a Wonderful Life probably won’t be among them.




Friday, November 20, 2020

The Bird Is the Word

A bird on the table is worth four in the air

My usual Thanksgiving dinner of 12-to-20 guests is canceled this year due to you-know-what and the state directive to restrict dinner companions to those "with whom you've been with throughout this pandemic." That would be myself and two vegans. Um…yeah. My dis-invitations to the usual suspects went out a few days ago. I still plan on consuming the usual poultry and accompaniments by myself, largely for reasons of sensory nostalgia, but the turkey will be a small one.

As a kid, I counted Thanksgiving as my favorite holiday. Oh, the playfulness surrounding Halloween made October 31 a close second, but back then (as now) the license to gorge on a full-blown feast tipped my scale of good things toward the 4th Thursday in November – despite how it subsequently tipped the bathroom scale. Besides, the holiday always falls within a few days of my birthday (sometimes on it), and there always has been enough Narcissus in me to regard the holiday as a sort of extension of that.

Thanksgiving feasts were common in the American colonies as harvest festivals – first among the Spanish and French in the 16th century and later in the English colonies. A Thanksgiving (by that name) held in Virginia in 1607 preceded the "first" one in Plymouth colony by 14 years, though the latter eventually caught on as the mythic origin story. The holiday was celebrated sporadically in the 17th and 18th centuries. It was first declared a US national holiday by George Washington as a one-off event in 1789, the first year of his Administration under the new Constitution. He made no mention of the Pilgrim story. Rather, the holiday was for Americans to give thanks for "an opportunity peaceably to establish a form of government for their safety and happiness." Thanksgivings were declared intermittently in subsequent years by presidents and state governors. In 1863, in the midst of the Civil War, Abraham Lincoln made the holiday permanent. FDR gave it one final tweak: he fixed the holiday on the 4th Thursday of the month instead of, as previously, the last (some Novembers, of course, have five Thursdays) as an economic measure to extend the Christmas shopping season.

As a kid I cared nothing about any origin myth. As an adult I don’t care more. It was always about food, friends, and family. It was convivial. Unsurprisingly, the ancient Roman term for such an event was convivium. One festival called Mercatus Romani was held in late November and was accompanied by convivia; “mercatus” means “commerce” and accordingly the festival involved street markets and seasonal shopping, which has a modern ring to it too. 

Yet the roots of Thanksgiving-like late-autumn banquets are far older than the Romans. There is something deeply atavistic about a ritual blood sacrifice at this time of year – in the modern version, commonly a turkey. One may point out that only the bird is sacrificing anything, since the rest of us get to eat the critter, yet even this aspect of sacrifices has deep origins. Hesiod in Theogony (c. 700 BCE) tells how the already age-old practice of placing bones on altar fires for the gods, while keeping the meat for oneself, came about. When humans' relationship with the gods was still unsettled, Hesiod tells us, an ox was slaughtered and divided into two portions in order to decide how a sacrifice should be split between people and the gods. Prometheus (who had a soft spot for humans – go figure) made the division. For one portion he stuffed the meat and fat into the ox's gross, slimy, ugly stomach; for the other he dressed up the ox's bare bones prettily in glistening fat. He then let Zeus pick between the two. Zeus picked the pretty pile. He was irate at having been tricked into choosing bones, but he didn't go back on the deal. (There is more to the story involving fire and the punishment of Prometheus, but let's not get sidetracked.) I won't be offering up leftover turkey bones on an altar. They'll be tossed in the woods for the delectation of the local wildlife (the remains are always gone by morning), but that is an offering of sorts to Nature.

In any event, while the upcoming November 26 may be shy on conviviality, the date is arbitrary anyway. My current thought is to do a “Thanksgiving in February” (or whenever) for the usual crowd when vaccines and growing herd immunity make that feasible… and not likely to prompt a visit from the police. Turkeys beware. 

A Vegetarian Option



Friday, November 13, 2020

A Clash of Verses

Istanbul #2461

Prompted by some comments the other day in an online book chat group to which I belong, I revisited some of Aleister Crowley's poetry at Poemhunter. Playing on the stereo in the background was Bob Dylan's 2020 album Rough and Rowdy Ways. Normally I can ignore background sounds when concentrating on some other task, but music and poetry are so closely related that on this occasion they clashed in my head. I turned off the stereo. Yet that very act also distracted me from Crowley, whose verses I likewise clicked off and will revisit some other time. The question of how the two forms are similar yet different nagged at me instead. It is the sort of question that once required a library trip and an immersion in card catalogs and microfiche to explore. It's a trifle easier now.
 

The facile answer, of course, is that songs are sung and are (usually) accompanied by music while poetry (usually) is not. Poems might not be an auditory experience at all but just read on a page. But while that distinction is both obvious and key, there is more to it than that. When reading lyrics, it becomes clear pretty fast that Carole King did something different from Robert Frost.

To be sure, there is overlap, most obviously in rap (Kanye West's All Falls Down in original form was a poem he recited at a poetry jam in 2003) and in the once-popular recitation songs in C&W (e.g. Ringo by Lorne Greene). Songs and poems alike are verse, and both can impact us with an economy of words that prose can't match. (The impact is visible on brain scans: see the 2017 NIH paper The Emotional Power of Poetry: Neural Circuitry, Psychophysiology and Compositional Principles.) Indeed, songs are a part of every culture for this reason. The very first literature (not business contracts and tax records, which came earlier, but literature) is poetry. Yet there is a difference. In his acceptance speech for the Nobel Prize in Literature, Bob Dylan (though he took the prize money) pretty much told the committee that it had made a mistake:

“But songs are unlike literature. They’re meant to be sung, not read. The words in Shakespeare’s plays were meant to be acted on the stage. Just as lyrics in songs are meant to be sung, not read on a page. And I hope some of you get the chance to listen to these lyrics the way they were intended to be heard: in concert or on record or however people are listening to songs these days.”
Matthew Zapruder in The Boston Review said something similar: “Words in a poem take place against the context of silence (or maybe an espresso maker, depending on the reading series), whereas, as musicians like Will Oldham and David Byrne have recently pointed out, lyrics take place in the context of a lot of deliberate musical information.”

There is a history of snobby professors (most often without music skills) valuing poetry over lyrics, which is why one is taught in school while the other generally isn’t. Yet the latter in toto are, if anything, more complex, requiring musical sensibility as well as a facility with words. So, while one can sing a poem or recite a song, something typically feels off about it, as though shoe-horning one art form into another. It is why Steve Allen’s recitation of rock lyrics on The Tonight Show in 1957 still holds up as comedy. What matters is whether the music is integral to the effect – whether you lose something essential by removing it. 

Bob Dylan referenced Shakespeare, which provides a good excuse to use him for illustration. Some of Hamlet's lines were sung in the musical Hair (Will's iambs make that easy: I once saw a musical Macbeth with a blues score), yet they are plainly still poetry. Compare them with Sigh No More, which Shakespeare intended from the beginning to be a song in Much Ado About Nothing. The difference is unmistakable.

In my case these musings are purely academic, since my own attempts at verses (whether lyrics or poems) always have had clunky results, to put it kindly. (I prefer to talk about myself kindly: one can’t always count on others to do so.) My sister was the poet of the family. I posted some of her verses at Echoes of the Boom. If she ever wrote lyrics, I’m not aware of them. 

In any event, the question now has ceased to nag. It’s time to return to the stereo and Poemhunter – but not both at the same time.

Steve Allen (1957)



Gene Vincent & The Blue Caps - Be Bop a Lula




Friday, November 6, 2020

Serenity

Decades of research link major stressful life events to increased risk of sickness and death, both during the events themselves and in the ensuing year; the risk extends to everything from cardiovascular disease to common infections. It is not established that stressful events increase the chances of a person catching or developing a disease in the first place; rather, they make its expression and progression worse. That is to say, they depress the body's immune responses and general resiliency. Among the major personal events associated with increased risk are divorce, death of a loved one (spouse, parent, sibling, et al.), a move to a new place of residence, a financial crisis whether from the loss of a job or some other source, and even (sometimes) the loss of a pet. Most people, it is important to note, do not get sick in the wake of these events. Once again, the events probably don't cause disease; they just exacerbate it. Whether we get ill or not, we soldier on as best we can.

I had my own annus horribilis in the year straddling parts of 2000 and 2001. I checked every one of the boxes for the risky events mentioned above: got divorced, moved, struggled with finances, and lost both parents – and a pet. By luck I avoided getting sick on top of it. The coronaviruses going around back then were of the common-cold variety, and they happened to pass me by. It is a year that sticks with me though, informing a larger part of my later-adult identity than any year since. It comes to mind today because I passed Hilltop Cemetery on the way to the post office this morning. This is not unusual. The cemetery is three miles (5 km) from my house and is on a route I travel a few times per week. It is not my common practice to stop there even though my parents and sister are buried there. (Yes, there is room for one more; my parents were kind enough – if that is the right expression – to purchase a spot for me.) My mom didn't see any value in cemetery visits. "Give your flowers to people when they are alive," she always said. I've taken her advice to heart, though on rare occasion (maybe once or twice per year) I stop briefly only to see all the familiar names. I grew up in this town and know more of the local people six feet under in that place than I do of those walking above ground – most of the latter being relative newcomers. Anyway, I took more notice than usual while passing there this morning for two reasons. First, "an ongoing narration" about Hilltop Cemetery written by an old friend of my parents (he introduced them in high school) came into my possession via my aunt a few days ago. Second, I noticed on my car's display that the date is November 6. That is the date my mom died 19 years ago.

Last photo taken of my mom

We always remember the date a parent dies. (No, it doesn’t seem that long ago.) Everything internal changes at that point even if the outer trappings of our lives (where we live, where we work, how we live) do not. While a parent lives, a part of oneself (no matter what age) is always someone’s child. Afterward, that identity disappears. We are forever the adult in the room, whether we choose to act like one or not. There is also no escaping a greater sense of mortality.
 

My grieving days are long past, so I doubt I'm at any elevated risk from grief should I happen to wrestle with this year's coronavirus, a bug that has brought so much more disruption into our daily lives than the ones circulating in 2001. To be sure, moments of nostalgia do and always will recur from time to time (such as today). Even though Civil War doctors often listed "nostalgia" as a contributing (sometimes the only) cause of death, I doubt I'll be dying of that either. That doesn't mean I'll forget. I won't… and for all this year's unpleasantness, I'm grateful that 2020 is (for me, at least) more serene than two decades ago.

 

Just a classic Glenn Miller number to which I remember my parents dancing:

Sunday, November 1, 2020

Artificial Reckoning

One of the books in rotation on my bedside table this past week was Artificial Intelligence: The Quest for the Ultimate Thinking Machine by Richard Urwin. This is by no means a technical manual, but (unlike most general-audience books on the subject) neither is it entirely simplistic. Urwin describes, with examples, the various approaches to Artificial Intelligence: fuzzy logic, subsumption architecture, swarm intelligence, evolutionary computing, etc. He explains how each is suited to particular contexts and how they could work in concert in a general intelligence. The book provides some brief and basic insight into the minds of our future robot overlords. Just kidding…maybe. Networked AIs are everywhere in our lives today. They recommend books, adjust our furnaces, and trade our stocks. Even modern toasters often have remarkable computing power, not because they need it to toast bread but because chips are cheap and can provide extra features. (Back in 2016 hackers exploited this by hiding computer malware in household appliances with wireless capability.) It is helpful to understand a little about them and how they operate.
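For a concrete taste of one item on that list, here is a minimal fuzzy-logic sketch in Python (my own toy illustration, not an example from Urwin's book). Where an ordinary thermostat flips the furnace on or off at a hard threshold, a fuzzy controller treats "cold" and "warm" as matters of degree and blends its rules into a smooth output:

```python
# A toy fuzzy-logic furnace controller (illustration only).
# "cold" and "warm" are degrees of membership between 0.0 and 1.0,
# not yes/no states; the heater output blends the rules smoothly.

def cold(temp_f: float) -> float:
    """Membership in 'cold': 1.0 at or below 55F, fading to 0.0 at 70F."""
    return max(0.0, min(1.0, (70.0 - temp_f) / 15.0))

def warm(temp_f: float) -> float:
    """Membership in 'warm': 0.0 at or below 65F, rising to 1.0 at 80F."""
    return max(0.0, min(1.0, (temp_f - 65.0) / 15.0))

def heater_power(temp_f: float) -> float:
    """Two rules: if cold, run at full power (1.0); if warm, shut off (0.0).
    Defuzzify by taking the membership-weighted average of the rule outputs."""
    c, w = cold(temp_f), warm(temp_f)
    return (c * 1.0 + w * 0.0) / (c + w)  # c + w > 0 for any temp with these curves

for t in (50, 60, 67, 69, 75):
    print(f"{t}F -> heater at {heater_power(t):.0%}")
```

The point of the toy: between 65F and 70F the furnace runs at partial power instead of snapping on and off, because "sort of cold" is a perfectly legal state.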


The three types of AI are Pragmatic, Weak, and Strong. The first is purely task oriented: as simple as a Roomba or as complex as a self-driving Tesla. At the upper end these AIs might approach the intelligence of an insect; they don't need more than that to do their jobs. The second type, requiring hefty computational speed and power, simulates general intelligence, e.g. IBM's Jeopardy champion Watson, some of the more sophisticated medical diagnostic programs, and the conversational robot Sophia by Hanson Robotics. The key word is "simulates." They do not think in the way people do. Using (sometimes evolving) algorithms they plow through millions of possible responses to a query and find the best match with machine efficiency – and machine cluelessness. There is nothing aware about them. Strong AI would think like a person. Strong AI is aspirational rather than something that yet exists, but many researchers are working with artificial neural nets, genetic algorithms, and other technologies with this as an ultimate goal. It is an open question whether such AI ever could be conscious, defined as that meta-state of not only knowing but knowing that one knows.
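To make the "best match" idea concrete, here is a minimal sketch of retrieval-style matching (again my own toy, not anything from Urwin or IBM). It scores a handful of canned replies against the words in a query and returns the top scorer, with no understanding of any of it:

```python
# Toy retrieval-style "Weak AI": pick the canned reply whose stored
# phrase shares the most words with the query. It matches patterns;
# it understands nothing.

CANNED = {
    "turn on the lights": "Lights on.",
    "what is the weather today": "Sunny, high of 72.",
    "play some music": "Playing your favorites.",
}

def best_match(query: str) -> str:
    q = set(query.lower().split())
    # Score each stored phrase by word overlap with the query.
    scored = [(len(q & set(phrase.split())), reply) for phrase, reply in CANNED.items()]
    score, reply = max(scored)
    return reply if score > 0 else "Sorry, I don't understand."

print(best_match("what's the weather like"))  # -> Sunny, high of 72.
```

Scale the table to millions of entries and sharpen the scoring, and the replies start to feel conversational; the cluelessness stays.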

To a user, a sufficiently sophisticated Weak AI (one that acts as though it is conscious) would be hard to distinguish from a Strong AI, but there would be a difference from the AI’s perspective. It would feel like something to be a Strong AI; not so Weak AI, which doesn’t feel anything. Weak AI doesn’t have a perspective any more than your desktop calculator has one. More than a few sci-fi plots (e.g. Ex Machina) center on the difference.

In science fiction, machine consciousness usually ends badly for people, whether it is HAL deciding to off Dave, Skynet deciding to off the human species, or Cylons being the ultimate rebellious offspring. It was the plot of the very first drama to use the word "robot": R.U.R. (1920). Despite the semiautonomous robots proliferating on the battlefield – the next generation of fighter aircraft probably will be unmanned – this doesn't worry me much. The people giving the machines their objectives and directions always have been and always will be more dangerous. AIs are very unlikely to develop animus toward people of their own accord; they don't, so to speak, have skin in the game. Plenty of humans, though, have animus toward their own kind. Some want to destroy people with different beliefs or characteristics while others want to destroy everybody. Witness the apocalyptic cult Aum Shinrikyo, whose members sarin-gassed the Tokyo subway back in the 90s as an intended step toward bringing about the end of the world. Grumpy AI is pretty far down the list of potential risks.

Charles Stross described another possible future in his novel Saturn's Children, in which the characters are humaniform robots. In Stross' book the robots had, long ago and inadvertently, killed off the human race through love, not war: humans so preferred their AI love-bots to other humans that the species simply died out. This has a strange credibility. The fantasy of robot love has recurred in books, songs, and movies over the past century. A few examples from among countless: the movie Metropolis in 1927, Connie Francis' hit single Robot Man 60 years ago, the movies Cherry 2000 and Making Mr. Right back in the 80s, and t.A.T.u.'s Robot in the 2000s. Today, simulated lovers can be created in computer games such as New Love Plus+ and phone apps such as Metro PD: Close To You. Many gamers already prefer them to the real thing. Combining these creations with life-size physical robots can't be far away.

If humans are to disappear, I suppose there are worse ways to go. Meantime, I’m reminded of Robert Frost’s poem (A Considerable Speck) in which he notices a mite on the page in front of him.

 

“I have a mind myself and recognize

Mind when I meet with it in any guise

No one can know how glad I am to find

On any sheet the least display of mind.”


My take on AI is much the same.

 

Hanson Robotics’ Sophia