Thursday, September 28, 2023

One Way or the Other

The political weather in the contemporary USA is always stormy, but the intensity increases when candidates position themselves seriously for the primaries as they are doing now. I have no intention of commenting directly on them. The airwaves, podcast channels, and blogosphere are crowded with people doing that obsessively. There is no need to add yet one more voice to the din. Besides, I generally vote third-party, which tends to annoy my friends from both mainstream parties. Folks on both sides tell me I’m helping the greater evil win by not supporting the lesser (meaning their candidate). I don’t intend to comment on that argument here either, though in conversation I sometimes do. But it illuminates something on which I do wish to comment, which is the human tendency toward binary thinking. This is by no means confined to politics. It is just more stark in that context.


Numerous books and articles on the subject exist. Most argue, as do Jack Denfield Wood and Gianpiero Petriglieri in their scholarly paper "Transcending Polarization: Beyond Binary Thinking," published in the Transactional Analysis Journal, that “individuals, groups, and larger collectivities instinctively frame their predicaments in a binary way – as a polarity encompassing a dimension of choice with two mutually exclusive alternatives” as a consequence of human (or, for that matter, mammal) evolutionary history. In a primordial environment rife with predators (including other humans), rapidly identifying something or someone as “good” or “bad” was a life-or-death matter: “Evolution has selected and conserved the neural machinery that supports instinctive ‘good or bad’ binary thinking, largely because of its survival value.”
 
We are capable of transcending this natural tendency and viewing the world in a more nuanced way: an oversize cortex would be a bit of a waste if we couldn’t. Not all of our judgments need be split-second ones, and if we take the time we can see shades of gray. We can even step back and see colors entirely outside the black-white spectrum. It takes effort, however. More often than not it is worth that effort.
 
Though by now we should know better, it is still easy to be seduced by a binary formulation such as “You’re either with us or against us.” Well, no. Not really. Moral questions cannot always be simplified into Good vs. Evil. (Nietzsche dedicated a whole book to that one, Beyond Good and Evil, though I’m afraid I raised a few eyebrows reading it while waiting in the jury pool at the county courthouse.) A proposal is not necessarily smart or stupid – it can be a bit of both. The same can be said of a person.
 
True, like our ancestors, we still encounter circumstances when binary thought is good for us – fight or flight, for example. Binary judgment is a quick, efficient, rough-and-ready way to categorize the events, people, and things in our lives. After all, we don’t always have the luxury of time to give every matter deeper thought. We might never get around to reacting at all if we did. But it behooves us to remain aware that we are oversimplifying – and polarizing. If we have the time and the patience we can do better: assuming, that is, that we want to. There is a certain fun in believing that there is a clear right side and a wrong side and that we are on the right one. In some particular case we might even be correct. But probably not.

 
Benny Goodman – Gotta Be This Or That (1945)


Thursday, September 21, 2023

Getting Past the Past

Of those still above ground, one of my go-to mystery/suspense authors for recreational reading is fellow New Jersey native Harlan Coben. (South African author Deon Meyer is another.) Coben has a formula for most of his novels: the comfy upper middle class suburban life of a character (usually 35-45) is thrown into crisis by some grave threat – often something reemerging from the protagonist’s past. It’s a formula that works more often than not. I most recently read Stay Close, which was adapted for a Netflix miniseries that I haven’t seen. The usual elements are there, but with more nuance than Coben typically delivers, and the book is better for it. Suburban soccer mom Megan doesn’t want to give up her American Dream lifestyle or her husband. Yet, she has some humanly mixed feelings about both. She still feels the draw of her old pre-marriage party life as a stripper in Atlantic City (to the point of visiting the club incognito), and she still has feelings for a paparazzo named Ray whom she knew at the time. Both she and Ray are haunted by a secret from those days. A murder in Atlantic City with an old familiar pattern threatens their secret and their lives.


Megan’s mixed feelings are what make her more relatable than most Coben protagonists, who tend to be single-minded defenders of their homes and families. Megan is a defender but is more complex than that. She is suffering from a midlife crisis. Most of us above a certain age know what that is like. (I certainly do.) Every stage of life has its own characteristic challenges, but the midlife crisis (usually setting in near 40 and possibly lasting to as late as 60) accompanies the nadir of the well-known happiness U-curve; on average, self-reported subjective feelings of happiness and contentment are high at 20, decline to a low around 40, and then recover after 60. The reader may have encountered articles purporting to debunk the U-curve, but they tend either deliberately to ignore the term “subjective” or to ignore the bell curve. Of course we are talking about the centerline of the bell curve; there are always going to be cheery people and miserable people on the tails in every age group, but the average still counts for something. The self-reports remain and they still fit a U.
 
The reasons for the midlife crisis are numerous, but start with the recognition that youth has passed. This involves not just the effects of physical aging but the understanding that doors are closing: time is running out to make major changes in one’s life and career. Thoughts of mortality come more to the fore, not least because many of us lose our parents at this time – or at any rate witness them getting truly old. Kids grow up and move out. Job changes (common these days) are more difficult. Responsibilities (financial and otherwise) reach their peak. While not everyone has existential crises (the “what’s it all about?” questions), many people do.
 
Responses to all this are well-known and the butt of some mockery. We may try (unsuccessfully) to look and act much younger than we are. Single people may impulsively marry, likely inappropriately and to someone younger. Married people may divorce “while there is still time.” Clothes and cars get suddenly flashy. Substance abuse may worsen. You know the drill.
 
The good news is that the bottom of the U-curve leaves nowhere to go but up. Those of us who get through this phase without too much damage to ourselves probably will get happier, even if objectively there doesn’t seem to be much reason. Health and strength may continue to decline, but after a point we just don’t give a damn – or at least we give less of one. Acceptance has much to recommend it.
 
Jerry Lee Lewis – Middle Age Crazy


Thursday, September 14, 2023

Comforting Clichés

I’ve been soft-shoeing around the edges of some of life’s vicissitudes this week. I won’t explain that further because it is not really my story to tell, but I find myself spouting a lot of clichés in consequence. That’s OK. Clichés are clichés because most of them are truisms, and truisms tend to be… well… true. I remember back in high school one of my youngish teachers (perhaps 27) commenting that one of his most annoying life lessons was discovering that all the trite old sayings from his parents, the ones at which he used to roll his eyes, were true.
 
He wasn’t entirely right about that. Take Nietzsche’s “What does not kill me makes me stronger.” That is only half-true. Sometimes what does not kill permanently maims. Assuming a harm is fully recoverable, however, Fred was onto something.
 
My teacher was largely right, however. One really shouldn’t judge a book by its cover. Life really is too short to sweat the small stuff. Actions do speak louder than words. We do all share a common fate. OK, the grass might not really be greener on the other side of the fence, but it sure looks that way. (Horace’s version actually translates as “the crops are riper in the neighbor’s field,” which I like a bit better.)
 
What about clichés that are not homilies but just hackneyed turns of phrase, such as eat one’s words, cruel to be kind, wild goose chase, be-all and end-all, heart of gold, and too much of a good thing (all Shakespeare)? In everyday conversation I don’t think they’re so bad. They convey the point in a way our listeners readily understand. Not everything we express need be creative original oratory. We can save that for our acceptance speech for... um… whatever we’re accepting.


 
Don’t get me wrong (a cliché admonition), I value creativity in expression, too, but perhaps it is more important in one’s fiction, essays, poetry, and reportage. Otherwise, it is no big offense to use clichés all the livelong day (Shakespeare again).
 
Biff Rose – Ballad of Clichés (1969)


Thursday, September 7, 2023

OK Boomer

A dismissal of my musical taste by a young person yesterday prompts me to expand a bit on my last blog. (I smiled at the dismissal.) 
 
Though the verbal equivalent of an eye-roll, the “OK Boomer” mantra from exasperated Zoomers is not really an insult. It may be intended as such but it isn’t, for Boomers are generally pretty secure (rightly or wrongly) in their opinions on everything from music to lifestyles. I think part of that has to do with history.
 
Contrasting generations is a pastime in all eras. The conclusions are always somewhat misleading since every age group contains people across a wide spectrum of attitudes and behaviors. But if we acknowledge that we are talking about the centerlines of bell curves with tails that overlap, generational comparisons still can have merit. Some generations really are (on average) more industrious, more technically adept, or more strait-laced than others. The technological environment accounts for more of the differences than we usually recognize, argues Jean Twenge in her book Generations, which is full of arcane data and graphs. Silents (b. 1926-1945), for example, grew up with radio, vinyl records, and movies. Boomers (b. 1946-1964) were the first television generation. Xers (b. 1965-1980) experienced cable TV and early personal computers in their youths. Millennials (b. 1981-1996) grew up with the internet. Zoomers (b. 1997-2012) grew up with smartphones. The oldest Zoomer was 10 when the iPhone appeared and the youngest doesn’t remember a time without it.


 
I don’t disagree with Twenge’s basic point. I think those technologies and numerous others have had a profound impact on the youth of each era. But I think that Boomers also have a peculiar relationship with broader history purely by the accident of their birth years. To some degree this is true of Silents, too, though their numbers are diminishing rapidly while the GI Generation is gone but for a few centenarians. Attrition is taking its toll on Baby Boomers as well, but there are a lot of us so we’ll remain a cultural force for a while.
 
The future arrived in the 20th century. To be sure, its mechanical, scientific, and social foundations were laid centuries earlier and the above-ground framing of it accelerated in the 19th, but the structure took real recognizable shape in the 20th. Technology changed traditional lifestyles forever. The experience of my paternal grandfather (b. 1896) illustrates the point. He left Austria-Hungary shortly before World War 1 in a horse-drawn hay wagon. He returned to Budapest for a visit five decades later in a Boeing 707. Manned heavier-than-air flight was just a fantasy when he was born. He lived to see Gemini spacecraft orbit the earth (though he missed seeing the moon landing by a few years). My other grandparents were born in 1899, 1900, and 1900. All lived on farms for at least part of their adult lives.
 
Though Boomers experienced only the second half of the century first-hand, we personally knew those who had lived through the first half. I know the story of my maternal grandfather’s army physical for World War 1. (His induction was canceled when the war ended in 1918.) I heard about the transition from literal horse power to mechanical power; as late as the 1930s my paternal grandfather dug cellars with a draft horse and scoop. I heard all about the hardship and angst of the Depression, about my dad’s experiences in World War 2, and (from my mom) what high school was like in the 1940s. My parents played big band music on the stereo when I was a kid. I heard about rationing from those who experienced it and about the genuine horrors of war from immigrant friends and neighbors who had survived them. All of this came from the mouths of those who experienced it directly – not third-hand from literary sources – which makes it feel very real. I won’t mention the events Boomers experienced first-hand, since we spend plenty of time talking about those (often nostalgically); there is no need to repeat it all here. The point is that none of the 20th century seems very distant to me. I knew people who were around at the beginning of it. (I incorporated some of their recollections into my own short stories, as in How to Avoid Work and Flirt with the Butcher.) The arrival of the future between 1900 and 1999 feels personal.
 
I think this gives Boomers a sense of place in history that adds to their already high regard for themselves. It probably also accentuates in us the natural tendency of those over a certain age to be resistant to further change. We have become dinosaurs. That’s OK. Dinosaurs lasted 180 million years. They must have done something right. Theory of a Deadman needn’t worry so much.
 
Theory of a Deadman – Dinosaur