Hangovers are at least as old as the technology to brew
alcohol to supplement the small amount that can be found naturally in overripe
fruit hanging in trees. That technology is prehistoric, but the English word
“hangover” is actually fairly recent, the earliest known appearance in print
being in a 1904 slang dictionary. Before then the word was “crapulence,” which I rather like better, though one is likely to be misunderstood using the term today.
No matter how you say it (“cruda” [rawness] in New World Spanish, “resaca”
[detritus from receding surf] in Old World Spanish, “Kater” [tomcat – you
figure that one out] in German, etc.), it’s an unpleasantness most of us have
experienced. I wasn’t precocious with my vices in my youth (which I say
with more embarrassment than pride) and didn’t experience a full-blown hangover
until college. (It was legal: drinking age in DC then was 18.) The primary
components that night were red wine and excessive conviviality with similarly
bibulous friends. Back in my own dorm room, I went to bed with the stereo still
on. Sometime after 4 a.m. I awakened to an awful sensation. I leapt out of bed
and hurried down the hallway to the bathroom. You know what happened there. I
returned to my room still nauseated. Playing on the stereo (no kidding) was
Melanie’s Leftover Wine,
a song that to this day I cannot hear without queasiness, though I still have
the album. A couple more hours of sleep did not prevent the subsequent daylight
hours from being less than my happiest. I wish I could say that was my last such
experience as well as the first, but it wasn’t. Like Shakespeare, “I have very
poor and unhappy brains for drinking” (Othello).
However, unlike Shakespeare, who purportedly died of a hangover, I eventually (late
20s) curtailed my intake accordingly.
Scene of the crime: My dorm room at GWU
Nonetheless, the title Hungover:
The Morning After and One Man’s Quest
for a Cure caught my eye on the ER Hamilton bookseller website. The Canadian
author Shaughnessy Bishop-Stall relates the story of his international travels.
Just in the interest of research he gets drunk almost “every night now” and
tries hangover "cures" in various cities. They range from a brutal
massage in Switzerland to an IV drip in the Netherlands. He also does things
that are not advisable to do hungover such as driving a race car and jumping
off the Stratosphere in Las Vegas. The latter unexpectedly cured a hangover:
the terror and adrenaline did the job, or perhaps just distracted him from his
discomfort. (He was properly harnessed of course; unharnessed couldn’t possibly
work more than once.) He does come up with a formula that he says works for him
though he admits it could be just a placebo effect with no validity for anyone
else. It’s a weird mix of B vitamins and other ingredients including some (e.g.
frankincense) that might be hard to find last minute on New Year’s Eve; it also
must be taken between the last drink and going to bed. (If you really need to
be alert in the morning, I wouldn’t bet on this “cure.”)
Oddly enough, despite
alcohol being the most studied of popular intoxicants, scientists still don’t
really agree on what causes hangovers. Dehydration and the build-up of acetaldehyde
are commonly cited in popular literature, but dehydration is just one symptom (one
easily prevented or fixed at that) among many, and hangovers typically are at
their worst when acetaldehyde already has dropped to a low level. Sugary drinks
make hangovers worse due to the formation of lactates, but eliminating sugar
will not eliminate hangovers, only marginally lessen the severity. The most
promising idea is that hangovers are an inflammatory immune system reaction.
This hypothesis is supported by a high positive correlation of the production of
cytokines (immune system signaling molecules) with hangovers. This is why
anti-inflammatories (including plain old aspirin) do help. The simplest solution,
as the finger-waggers among us always tell us (correctly, unfortunately), is not
to overindulge in the first place. Bishop-Stall’s book is a cautionary tale in
this regard too. His boozy research, which lasted several years, cost him
dearly in his personal life as intemperate lifestyles often do. Some argue, on
the other hand, that one can overcorrect the other way. Raymond Chandler: “I think a man ought to get drunk at least twice a
year just on principle, so he won't let himself get snotty about it.” Perhaps,
though Raymond might have exceeded twice.
Lefty Frizzell & Johnny Bond - Sick, Sober And Sorry (1957)
Back in July I mentioned an intent to make digital versions
of my family/personal photo albums. I didn’t want to create just page after
page of disordered scanned images but coherent albums with chronology and
commentary that would be comprehensible to a stranger – and that could be
stored on a flash drive. The likelihood is low that any stranger ever will look
at them and try to make sense of them, but I wanted to make it possible. Not
until the beginning of this month did I take a stab at doing this. Despite a
somewhat desultory approach to the task, I’ve made some headway though there is
still far to go. As source material, there exist in my closet four physical real-space
album volumes with
only the loosest internal order: 1) 1890s to 1955, 2) 1956-1969, 3) 1970-2001,
and 4) 2002-present. #4 is the thinnest since it consists mostly of printouts
of digital photos, and I haven’t printed more than a small fraction of those I
have saved. Digitizing the first three albums (I’m keeping the four volume
chronology) requires a lot of scanning but at least the original photos are all in
the same place. Most of the pics from the past 20 years, on the other hand, are
already digital but are scattered here and there on a variety of devices; the
physical album for this period is too incomplete to provide much guidance.
Editing is a must. It’s as important to know what to exclude from an album as what
to include: most of our photos are best left out of them and stored (if at all)
loose in a box, be the box literal or virtual. Perspective is also something to be
considered: e.g. identifying someone in an image as, say, a great grandmother is obviously a
perspective of the album assembler, not (probably) the reader. In the preface
to Volume I of the digital albums, I wrote the following: “Every photo album, like every narrative, has to be from a
particular perspective if it is to have any coherence rather than being a
random gallimaufry of images. Stories can have multiple narratives, true
enough, but each of those sub-stories still has a single perspective. This
album, assembled by me, is inevitably from my perspective – not only from
narcissism, though I won’t try to deny a quantity of that, but because I cannot
write or post from any other without wandering into fiction. Besides, as the
last one standing in my branch of the family this album is not designed to be
shared at family gatherings where multiple perspectives are expected and
appreciated. That gathering would consist of… well… me. It is possible that
this will be perused at some point by friends or by two-or-more-step-removed
relatives who might share some elements of the history recorded in it. If so,
hi there. Take what you will from it that is relevant to you. As for the rest,
consider it a visit with Richard who is making everything about himself in his
usual fashion.” The reader may notice a sense of mortality in that preface.
This is an inevitable consequence of looking at a family photo album – at least
after a certain age. All but a few of the people imaged in Volume I are gone
(all the survivors are seniors), well over half of those in Volume II are gone
as well, and a large minority in Volume III.

When I was in high school in the 1960s the breakthroughs in
bioscience were fresh and there was every expectation that a biotech revolution
would happen in the next few decades to match the mechanical and electronic
revolution of the previous half century. “Some of you sitting in this very
classroom,” one biology teacher told us, “will live to be 200.” We’d have
colonies on the moon by 2000, too. There is a difference between those two predictions. The
technology exists so that we could
have colonies on the moon at this time if we wanted them. They would just be so
insanely expensive to establish and maintain as to make no sense. A 200 year
lifespan, however, is off the table no matter what resources are devoted to it.
Aging is not a singular process but involves a convergence of ever diminishing
resilience along multiple pathways. Researchers publishing in Nature make this point and come up with a maximum theoretical
lifespan if everything in a person’s life in terms of health went absolutely perfectly:
“Formally, such a state of ‘zero-resilience’ at the critical point corresponds
to the absolute zero on the vitality scale in the Strehler–Mildvan theory of
aging, thus representing a natural limit on human lifespan… A recent careful
analysis of human demographic data supports this argument and yields an
estimate for limiting lifespan of 138 years.” It should be noted that no one
ever has lived 138 years as far as we know. The longest life that is reliably
documented (122) is that of Jeanne Louise Calment (1875-1997).
By coincidence my recreational reading material last week was Death, Immortality and Meaning in Life by John Martin Fischer, a philosopher with The Immortality Project.
It’s an ambitious title for a book only 196 pages long. The book summarizes
philosophical takes on the subject from Epicurus to Becker. The Immortality
Project actually gets its name from the influential 1973 book The Denial of Death for which author Ernest
Becker won a posthumous Pulitzer in 1974. Humans are the only animals aware
they are mortal, and Becker argued that in the process of trying to escape our
mortality we developed civilization, art, family legacies, and religion. (We
also develop neuroses, but that is another issue.) These “immortality projects”
are attempts to ensure that something of ourselves survives. Making a digital
photo album presumably is one such project, too, albeit a modest one. All such projects are ultimately doomed of course. Nothing is
forever: not civilization and not the planet itself. But a brief existence is
still better than none. Wrote Kurt Vonnegut in Cat’s Cradle, “And I was some of the mud that got to sit up and
look around. Lucky me, lucky mud.”
I have an eating schedule that is idiosyncratic. (My friends
sometimes choose a less diplomatic adjective.) I eat every other day. I don’t
recommend the schedule to anyone else. For some people it might well be
harmful. But it works for me. I adopted this pattern late in 1975 when my
post-college metabolism shifted. The same daily number of calories that had kept
my weight steady during the previous five years instead rapidly packed on unwanted
pounds. It then took the better part of a year to develop a sustainable (for
me) response. Rather than cut back on my daily intake to below what felt “normal”
– i.e. rather than be hungry every single day – I opted for a one-day-on-one-day-off
schedule. This was before “intermittent fasting” was fashionable; I just
preferred being hungry every other
day. I ate whatever I wanted one day and then didn’t eat at all the next. The
strategy worked. I dropped 30 pounds (14 kg) within the year and kept the
weight off for more than two decades. Around the turn of the current century,
however, my life became difficult. (Among friends I sometimes choose a less
diplomatic adjective.) My self-discipline suffered. The result was the same as
in 1975. From 2000 to 2020 I repeatedly restarted the old on-off schedule (with
short term success) only to break with it repeatedly with the same-old results.
Finally late last year I got back solidly on the on-off schedule. When special food-related
events fall on what otherwise would be an “off” day (as did Thanksgiving), I
fast two days in a row ahead of the event in order to get into synch. I dropped
30 pounds in 2021 just as in 1975. Regrettably I didn’t drop 30 years as well,
but one takes what one can get.

Breakfast is a key part of sticking to my routine. I can
forgo that snack tonight if I know a good breakfast is only a few hours plus a
night’s sleep away. I have a favorite breakfast spot and usually rotate among
four very non-vegan selections: country fried steak (called chicken fried in
some places) with eggs over easy; sausage and onion omelet with a side of bacon;
chili jalapeño cheese omelet with a side of sausage; and eggs over easy on
prime rib hash with bacon. All come with home fried potatoes and toast, though
I always say to hold the toast. I’ll sometimes deviate from these with a taco
omelet or something, but those four are the mainstays. Since returning to the
on-off schedule I have lost my taste for sweet breakfasts: the savory options
hold sway all the way. Besides, regularly eating sugary blueberry pancakes
soaked in maple syrup might be tempting fate given the national epidemic of
Type II diabetes.
Country fried steak at Marilyn's in Chester comes with three eggs and toast
There are supposedly personality differences between sweet
and savory breakfast aficionados according to an article in The New York Post. Among other things the savory fans are more likely to like rock, sci-fi, and
cats. Well, I do like those though I suspect many eaters of crepes and Belgian
waffles do too. The same article notes that the biggest demographic for avocado
toast and hummus is men aged 25-34 – yet another reason to worry about this age
group.
The “science” regarding the value of breakfast as in so many
other dietary things is conflicting. According to an Australian study published
in the British Medical Journal (The BMJ), what you eat is more important than when you eat it. Says professor and study co-author Flavia Cicuttini, “We
found that breakfast is not the most important time of the day to eat, even
though that belief is really entrenched in our society and around the world.” The
authors did not find evidence that breakfast aids weight loss despite common
claims to this effect in popular literature; on the contrary they found that
breakfast eaters ate 260 more calories on average daily than those who skipped the
meal. Yet, other scientists argue that there is an advantage to a morning meal
in order to stabilize blood sugar levels and boost immunity. Says Dr. Barry
Sears of the Inflammation Research Foundation, “By having breakfast, you refill
the liver glycogen levels so that the cortisol levels can decrease. If you
don't eat breakfast, then cortisol levels will continue to increase to maintain
blood glucose levels eventually causing insulin resistance that makes you more
hungry throughout the day.”

Perhaps by skipping breakfast (along with other meals) one
day and eating it the next I can combine the benefits of both regimens. Or
perhaps I will suffer the ill effects of both. Whatever the case may be, I’m
looking forward to sitting at my usual table tomorrow. Tomorrow, I think, is a
country fried steak day. Today is an “off” day so this morning I just had blues
for breakfast. That was OK too.
We had a couple dustings of snow in the past week in these
parts: barely enough to be worth mentioning, but nonetheless a reminder of
things to come. Probably. NJ winters have little predictability. They range
from balmy to blizzardy. There is likely to be ice and snow however. Possibly a
lot. Public service announcements on the radio already are warning about the
dangers of slipping and falling on ice. I’m well-stocked with salt, sand, and
shovels in order to keep my walkways safe. I test-started my snow-blower. As
for slipping and falling… well, I’m pretty sure I’ll do that anyway despite
forewarnings. I do that every winter.
I’m not outstandingly klutzy but let’s say that Parkour won’t
ever be my game. I don’t require ice to take a tumble. I don’t even need a cat
between my feet. My worst fall from the perspective of onlookers was in my 20s
when I tried to push a wheelbarrow full of bricks across a plank spanning a
portion of an unfinished foundation; the plank turned and I joined a shower of bricks
in an 8 foot (2.5 m) descent to concrete. Other workers came running assuming I
was under the bricks. I wasn’t. Ego aside, I walked away without an idiomatic
scratch. No one was more surprised than I, but such things happen. In 1943 a
B-17 gunner named Alan Magee fell out of his bomber over France; he fell 22,000
ft (6700 m) without a parachute. He crashed through the roof of the St. Nazaire
train station, which slowed his descent. He survived and spent the rest of the
war in a German POW camp. He died in 2003.

Unfortunately, luck can run the other way too, especially
when we are past our 20s and no longer bounce as well. Every year people die
from tripping over their own feet at 0 altitude. According to the CDC 17,000
Americans die every year from slips, trips, and falls. 26% of those are on
level surfaces. 3% happen getting out of bed, which I suppose is one argument
not to do that. 800,000 are hospitalized from falls annually. Injuries not requiring
hospitalization are in the millions. Falls are the leading cause of accidental injury
for those over 65. Ice and sleet, of course, up the risk enormously. Icy stairs
are exceptionally dangerous. According to the CDC website, “One out of five falls causes a
serious injury such as broken bones or a head injury.” This can’t be right.
Five falls make for an average winter month for me, all of them outside. Once again,
I’m not klutzy in a general way. It’s just that there are very few level
surfaces on my property. I keep the walkway (which is level) to the guest
parking area (also level) shoveled and salted, but the bulk of the driveway from
the road past the house all the way up to the barn is on an incline steep and
steady enough that heating oil delivery trucks tend to lose traction
about halfway up in icy conditions. (For this reason I keep a spare 15 gallons of
fuel on hand to feed the furnaces overnight when this happens.) Shoveling,
snow-blowing, and salting the driveway on foot (which I do myself) therefore involves
a lot of slipping and sliding on slanted ice. I land on my posterior with some
regularity. Taking the wheeled garbage bin down the driveway to the curb in
winter may involve a whirly dance as we slide our way down. The lawn, too, is
mostly a series of slopes. When the lawn is snow-covered, getting traction on
it is a challenge. Nonetheless, no spill in 40 years has caused me more than a
bruise, and rarely one of those. Yet, while I accordingly doubt the “one out of five” stat,
there is no doubt that a bad landing can cause serious harm, even on a cushion
of snow from just a standing height. Not all of us bounce like Magee. I
wouldn’t want to repeat my own stunt with the bricks either. I doubt I’d be as
lucky twice. So, I’ll heed the PSAs and watch my step. I urge the reader to do
the same.
Ever since humans learned to count – when that might have
been is anybody’s guess – they have been numerologists, assigning special
importance to some numbers beyond their absolute values. Sometimes this was
because of celestial correspondences – such as 12 because of 12 full lunar
cycles in a year – and sometimes because of internal characteristics. The
Pythagoreans regarded 10, for instance, as the most perfect number (being 1 + 2
+ 3 + 4) and, according to their beliefs, therefore of mystical significance.
Thanks to the dominance of the Base 10 number system, modern folk also tend to
give 10 and its multiples quasi-mystical significance. Consider the
significance we attach to 10th anniversaries. So too birthdays: 20th,
30th, 40th etc. birthdays all are considered landmarks in
our lifetimes. Yet the birthdays prior – which is to say those ending in 9 –
have greater impact. This comes to mind since just last week I reached an age
ending in 9.
Donald as a Pythagorean in "Donald in Mathmagic Land (1961)"
We commonly define our stage of life by decade:
twenty-something or fifty-something or whatever. We just as legitimately could
divide a lifetime by 7s or some other number, but we keep the math simple and
divide it by 10s. We often set our goals accordingly, as in “I want to
be/achieve/have such-and-such by the time I’m 30.” Hence 29 rings alarm bells.
Time is running out. The goals change with each new looming decade, but the
sense of urgency when it is 12 months away is always the same. We push
ourselves into (sometimes ill-considered) action while there is time. First-time marathon runners are more likely to be at an age ending in 9 than at other ages.
First time sign-ups on the Ashley Madison adultery website are overrepresented
in the 29, 39, and 49 ages. According to the World Values Survey (42,000
respondents), people are most likely to question the meaning of their lives in
a 9 year. Some don’t like their conclusions because suicides spike in years
ending in 9: not by a lot (2.5%) but enough to be statistically significant.

We may motivate ourselves to change our behaviors in the 9
years, but do we change who we are at bottom? The consensus among psychologists
is that radical personality transformations are rare at any age – and may
indicate brain damage when they occur. Even apparent transformations, as among
religious converts, affect surface behaviors more than underlying traits. The
Big Five personality traits are openness, conscientiousness, extraversion,
agreeableness, and neuroticism (acronym OCEAN). Each of these is a spectrum,
and where a person scores on them is a remarkably accurate predictor of how he
or she deals with life. High conscientiousness, for example, is the single
biggest predictor of material success: more so than intelligence, which is not
a personality trait. (IQ is like horsepower under the hood: more of it means
your car can do more, but it indicates nothing about what kind of driver you
are.) The terms are largely self-explanatory except perhaps openness. (It does
NOT mean an open person is someone who can be convinced to agree with us politically.)
It has to do with imagination and curiosity about the world, evinced either
through adventurousness, bookishness, or both. A person’s scores scarcely ever change
noticeably in a 9 year or in any other one year, but we do evolve slowly over longer
time periods. The good news is that those long term gradual personality changes
are usually for the better. Writes Professor of Psychology Christopher Soto, “Many
studies, including some of my own, show that most adults become more agreeable,
conscientious and emotionally resilient [less neurotic] as they age.” The “more
agreeable” trend seems to argue against the Grumpy Old Man stereotype, but
since such characters do exist I suppose we must conclude that in their youths
they were flaming jackasses. They mellowed out to merely grumpy. Happiness
tends to increase with age too. Ironically, this is partly because of a
lowering of expectations. Our options really do diminish with each passing
decade and we make peace with that. Also, we tend to care less about the
opinions of others. I remember an old George Burns line: “When I was young I
was taught to respect my elders. I’ve finally reached the point when I don’t
have to respect anybody.”

Getting back to my own 9er year, when I was 20 (or 40 for
that matter) I didn’t consider the difference between 69 and 70 to be worth
mentioning. Both seemed superannuated to me. But now that I’m here the
difference seems significant. Making a major personality change in the
remaining year before the big seven-zero is apparently off the table. I’m long
past angst over existential “meaning of life” issues. Quite aside from being well
outside the preferred age demographic, I’m not married so don’t really qualify
for Ashley Madison. Sign up for a marathon? I don’t think “No” is a strong
enough answer. I’m satisfied with life as it is, generally speaking. Still,
there must be some changes to my lifestyle that I should cram into the months
before that 9 flips to 0. I just need to sit down and make a list – though it’s
possible I’ll misplace it.
The Boswell Sisters - There’ll Be Some Changes Made (1932)
There were 10 at my Thanksgiving table
on Thursday. This was fewer than usual (due to the culprit Covid of course),
but enough to maintain conviviality. While Thanksgiving is an American (and –
slightly different as usual – Canadian) holiday, feasts are universal in human
cultures. The timing and purpose of them vary, but most of them are
seasonally related – post harvest being a common but far from exclusive time
for them. They predate agriculture. Stone Age remains of them (bones of
butchered wild animals around a camp site) can be found in the archaeological
record. There is even an argument that the need to provide feasts for large
gatherings, such as the Mesolithic ones around Gobekli Tepe in Anatolia, helped
prompt the development of agriculture. Mythical origin stories tend to be
attached to annual feasts, but the details of them are unimportant beyond
shaping some surface rituals. The stories are excuses for the feasts
themselves, which have multifarious “real” reasons: sometimes solidifying unity
within a single clan, sometimes promoting links among different clans, sometimes
facilitating commerce, sometimes facilitating marriage, sometimes showing off
the host’s wealth and importance, and sometimes all of those things and more.
Writes researcher Chloe Nahum-Claudel of the University of Cambridge, “Feasts mobilise
people’s values, their morality, and understanding of the world of which they
are a part. They have particularly powerful world-making effects because they
are both irreducibly concrete – satisfying hunger, exciting pleasures,
coordinating the political-economy, and embedding themselves in the
organization of time and memory – and expansively meaningful, simultaneously
expressing and generating deeply held values.” So too, though expressing the
point that way might be considered not only weird but WEIRD. Also weird in a
broad historical sense is a feast like the one at my house on Thursday, which had
nothing to do with anything mentioned in my list above, though maybe a couple
items from Chloe’s. Even a “traditional” Thanksgiving with only close kin would
qualify as WEIRD, as does my looser (and nowadays more common) collection of
guests.

WEIRD is an acronym for Western, Educated, Industrialized, Rich, and Democratic. The term was invented by social psychologists Joseph Henrich, Ara Norenzayan, and Steve Heine more than a
decade ago. Their cross cultural studies led them to conclude that much academic
research on human psychology was deeply flawed: it had not uncovered human
universals but the quirks of one smallish segment of the global population. Examining
commonly cited studies, the three found that “96% of experimental participants
were drawn from northern Europe, North America, or Australia, and about 70% of
these were American undergraduates.” This wouldn’t matter if the
susceptibility to “visual illusions, spatial reasoning,
memory, attention, patience, risk-taking, fairness, induction, executive
function, and pattern recognition,” among other things were constant
cross-culturally. Studies that explicitly tested this question, however, showed it is not. WEIRD-os on average are in fact unusual: very unusual, both by historical
and current global standards. They are on the far tails of the distribution
curves for these traits when graded in global contexts.
An interesting book on the subject is The WEIRDest People in the World by
Joseph Henrich, chair of the Department of Human Evolutionary Biology at
Harvard. The biggest single difference (though there are others) between
WEIRD-os and others is the importance of extended kinship in non-WEIRD
societies. The prevalence of cousin-marriage by itself is a surprisingly effective predictor of average psychological traits within a culture. Henrich
traces the origin of WEIRD culture to late antiquity and the influence of the
Western Church in weakening kinship duties relative to adherence to abstract
principles. Even when the Church’s authority waned, the pattern of thinking
persisted. He makes a very good argument that WEIRD culture explains much about
why the Industrial Revolution (for well or ill) first arose in the West with
profound effects on world history. Though the Industrial Revolution can and does
take root and self-sustain in non-WEIRD cultures, it initially gets imposed there either from the outside or from the top down (as in Meiji Japan or 1950s
Communist China) rather than emerging organically as it did in Europe. Average WEIRD-os are more individualistic, self-obsessed, and abstract
thinkers than average humans overall. A simple example: describe yourself
several ways by completing the sentence “I am _________.” WEIRD-os are apt to
fill the blank with personal attributes (such as inquisitive, intellectual, artistic)
or with their jobs (biologist, truck driver) or belief systems (Mormon,
Marxist, libertarian, or whatever); non-WEIRD-os do some of this, but are far more inclined to mention down-to-earth placers such as kin relations (so-and-so’s son, cousin to this person,
or sister to that person, etc.) or place in the social order. Both ways of
answering are legitimate, but they are different.

Be that as it may, today I’m still (over)eating
leftover turkey as I will be for a couple more days – alone rather than in
company. That isn’t weird by American standards. It might well be by global historical
standards. That’s OK. I’ve been called worse.
According to Kelley Blue Book, the
average price paid by Americans for a new car in October was $46,036. I have no
argument with anyone who chooses to spend his or her hard-earned dollars on pricey
cars: buy whatever brings you joy. I’m just surprised that so many do. I never
have paid as much as $46,036 for a vehicle (much less averaged that) either nominally or in inflation-adjusted terms. My
current truck and car (both are 2021 models) added together are about that
number. OK I’m cheap. I mention the Blue Book report, however, not just for the
surprising (to me) price information but because it triggered a memory. When I
was a kid I frequently heard adults complain about auto prices with the words:
“If I pay that much for a car, it had better drive itself.” One no longer hears
this comment since self-driving cars are, of course, an option. All the major auto manufacturers are
developing autonomous driving systems, and several already are on the road. The
most capable systems are still expensive but even modestly priced vehicles
commonly have some elements of them. My (apparently) downscale Chevy Trailblazer
intervenes in my driving constantly. If I drift out of my lane it self-corrects.
It auto-brakes if it decides I’m waiting too long to do so. It chooses when to
turn the hi-beam headlights on and off. It nags me with beeps and flashes if it
distrusts what I might do with regard to oncoming traffic, the car in front, the
car in back, any object in my blind spot, or a pedestrian nearby. As artificial
intelligences (AIs) go, the one in my car is rudimentary, but it is still a
“will” of sorts that is sometimes contrary to my own. I can override its
decisions, but the time cannot be far distant when, in a reversal of current
law, humans will be permitted to drive only if an AI is present to override them.

AIs drive more than just our cars. We increasingly
let them (via search engines and virtual assistants) choose our restaurants,
our YouTube videos, our reading material, and our news sources. Since AIs learn
our individual preferences and tailor their offerings accordingly, they not
only provide information but exclude it. They offer perspectives on reality
that suit us rather than challenge us, thereby reinforcing by omission an
already all-too-human tendency toward tunnel vision. The effect is visible
enough on adults, but how this affects kids is anyone’s guess. Young children
will never remember a time before interactive AI. Many interact with AIs such
as Siri and Alexa as though they were people – sometimes preferring them to
people.

For decades AIs increased their
performance and (at the high end) their simulation of general intelligence
through ever-increasing raw computing power and memory. Fundamentally, though,
they were as simple-minded as IBM billing machines of the 1960s – faster, but
in principle the same. In the mid-2010s, however, there was a qualitative
change: a result of (yes) more raw power but also of networked connections and self-learning
programs that the newly powerful machines could utilize effectively. Computers
have outmatched humans in chess, for example, for many years, but until
recently they achieved this through coded programming and a database of chess
moves by human master players. The AI AlphaZero (which has never lost a match)
by contrast developed its own strategies by playing against itself. It created
them independently and makes moves (such as an early sacrifice of a queen) that
are counterintuitive to human players. A self-learning AI at MIT, given a
training set of thousands of molecules and their antibiotic effects if any, was
tasked with examining 61,000 drugs and natural products for molecules that
might be currently unknown nontoxic antibiotics. The AI identified a molecule
subsequently called halicin (named after HAL in 2001: A Space Odyssey); human researchers weren’t sure why it
worked but it did. The AI saw something they didn’t. Nor are AIs leaving artistic
creativity to humans. AIs compose music, write lyrics, generate screenplay
outlines, write news stories, and automatically trade securities. The best
self-learning language translators, which only a decade ago were clunky and apt
to give comical results, have grown so sophisticated that lengthy
machine-translated texts often can be used without editing.

Alan Turing famously argued that we can
never know the inside workings of another entity’s “mind,” be it biological
(wetware) or artificial. Consequently, all that matters is the result. If
someone or something acts as though intelligent, it’s intelligent. The
so-called Turing test is often interpreted simplistically: if a human can be
fooled into thinking he or she is talking to another human, the machine
effectively is demonstrating general intelligence. This isn’t accurate. Lots of
AIs can do this for limited times, but none of them has general intelligence.
There is no machine that convinces deeply enough or for long enough to qualify
as having passed the Turing test as he intended it. But some are getting eerily
close. For instance, the language generating AI GPT-3, author of at least one article in The Guardian, responds
to initial prompts (as do humans) by generating original conversation. To some
queries about its abilities it answered in part as follows:

“Your first question is an important one. You ask, ‘Can a system like GPT-3 actually understand anything at all?’ Yes I can. Your second question is: ‘Does GPT-3 have a conscience or any sense of morality?’ No I do not. Your third question is: ‘Is GPT-3 actually capable of independent thought?’ No I am not. You may wonder why I give this conflicting answer. The reason is simple. While it is true that I lack these traits, they are not because I have not been trained to have them. Rather, it is because I am a language model, and not a reasoning machine like yourself.”

Good to know.

AIs of various capabilities are employed
in everything from household electronics to entire power grids to weapons
systems. Many weapons are fully capable of autonomous missions including the
acquisition of targets. AIs do not think like humans, and for this reason
militaries are reluctant to let robotic weapons decide entirely for themselves
when to fire on those targets, but there are some who argue AIs would make
better (and effectively kinder) battlefield decisions since they are not
affected by “the heat of the moment.”
An interesting book on the impact
(present and future) of AI on economic life, human psychology, and geostrategy
is The Age of AI and Our Human Future
by Henry Kissinger, Eric Schmidt, and Daniel Huttenlocher. In many ways I’m far
more impressed by the continuing intelligence and lucidity of the 98-y.o. statesman
Kissinger than I am by GPT-3. Schmidt is a former CEO of Google and is still
a technical advisor. Computer scientist Huttenlocher is the dean of the MIT Schwarzman
College of Computing. They explore the potential and the dangers in our
deepening partnership with intelligent entities that do not think like we do –
that we nonetheless increasingly let do our thinking for us. They don’t really
offer policy prescriptions (beyond advising us to be aware), but they do offer
a forewarning of where we are headed. Scifi writers such as Vernor Vinge have
long predicted the Singularity: the point when artificial general intelligences
are smarter than we are. They were far too optimistic (or pessimistic,
depending on your view) about when this event would happen. We are already past
the dates proposed by Vinge in the ‘80s. The Singularity remains beyond the
horizon. It seems more certain than ever to arrive eventually though. Odds are
I won’t get to see it. I don’t know about you, but I’d like to.
Two young friends of mine are on the
last third of a see-the-USA auto tour. They left the West Coast behind around
Halloween and are leisurely making their way back East. They should be back in
NJ in a couple weeks. This is a rewarding trip to make if you have the time,
money, and opportunity to do it. I made the circuit shortly after college (as
long ago as 1975), heading out to CA in a meandering fashion by the southern
route and then returning by the northern – a journey of a few months. (I wrote
about one small incident of the trip at my short story site: The Roxy Caution.) That is the right time of life to do it: you’re youthful, adult,
and (usually) not yet bogged down by responsibilities. The demands of jobs and
family make such an extended trip more difficult later. We are lucky to manage
an occasional week or two foray to some single destination. Even for those who
are single, self-employed, healthy, and unattached, long absences from home become
harder for income/expense reasons and because leaving a residence (rented or
owned) behind unattended raises legitimate concerns. We grow less mobile. We dig
in.

In my case, the trip itself rather than a place was the destination, but of course there were cities and sites I wanted to see
that served as benchmarks. In particular I remember rolling into Los Angeles
because… well… it meant I had run out of continent. After skirting the
coastline northward, I’d be headed back. My sister Sharon (d. 1995) lived in
Hollywood at the time, having moved there from San Francisco where I had visited
her a year earlier – a trip made by air, not road – so we were able to visit. She
liked LA. “It’s not an attractive city,” she said. (It wasn’t and it’s not.) “But
everything is here. It is much more
livable in every way than SF.” So it was. LA is always a contradictory
hodgepodge and is always in transition, of course, but in the 70s the mix was
particularly strange: the city/region was tawdry and decaying yet somehow still
glitzy and full of opportunity – and socially far more free-spirited than today
in ways that are hard to describe to anyone who wasn’t there. (The detective
dramedy movie The Nice Guys, whatever
its other qualities or lack thereof, did a pretty good job at catching the
flavor.) Today LA has troubles (blame whom you will) beyond what existed four
or five decades ago – affordability alone being a big one – yet the promise of
possibilities has not entirely faded. I’m too dug in (and frankly too old) to consider
answering the call of that promise by relocating, but many younger and more
adventurous souls are still drawn to it. For them, I’ll give Sharon the last
word:

Hollywood, Hollywood,
I could live here forever in a tiny white house
And watch the stormy winter sunsets
Spread pink rays over the palm trees.
I could be a movie star chick
In a white fast Mercedes tooling down Sunset Strip.
I could be a biker’s lady
And I’d live on the beach in Venice
Collecting shells and driftwood for my windowsill
While jogging in the evenings.
I could be a Laurel Canyon hippie
And live on a hilltop among the pine trees
And make silver jewelry.
I could marry a mechanic from the Hollywood flats
And cook a lot of pie and grow fat
And stroll with my babies down the Boulevard on Sundays.
In Hollywood there are a hundred lives for me to see.
A hundred different bodies I could be.
The California sunshine is my stagelight
And the Walt Disney blue sky my might.

Sharon Bellush (1975)
I don’t get many trick-or-treaters at my
house on Halloween. Some years there is none at all. My driveway is long and
flanked by scary dark woods in an area where bears occasionally wander, so when
I do get them they are typically driven up the driveway by parents who live in
the immediate neighborhood. I always keep a candy bucket at the ready just in
case even though the contents are likely to last almost to the following Halloween.
This year and last there were no candy-seekers at all, no doubt due to Covid-related
caution. I did answer the door a few times on Sunday when I heard knocks, but
there was no one there on any of the occasions.
It is not unusual to hear mysterious
knocks at my house. It’s a talkative structure
that groans, creaks, and knocks as it expands here and contracts there with the
weather and with the vagaries of the forced air heating system. I’m accustomed
to the noises and ignore them unless (as on Halloween) I’m expecting someone at
the door (even though that is what doorbells are for) and the sound seems to
come from that direction. Overnight guests often comment on them however. “Don’t
worry,” I tell them. “That’s just the troll who lives in the basement. He
rarely gets loose from his chains.” That doesn’t always reassure them. More than
one guest has told me the house is haunted. Two actually described the ghost.
(The descriptions didn’t match other than both being female.) I hesitate to
call people who believe in ghosts crazy: not because I think there is the
remotest possibility they are right but because the bulk of them are plainly
quite sane in a general way. They just believe in ghosts.

The percentage of Americans who say they
believe in ghosts polls at about 46% – a figure that actually
has risen slightly over the past five decades. Toss in those who answer “Not
Sure” (about 7%) and that is a solid majority. Between a quarter and a third
claim to have seen ghosts. This is despite the steady decline of traditional
religion. (Belief in an afterlife of any kind also has risen slightly according
to the General Social Survey from 70 percent of respondents in 1978 to 74
percent in 2018.) The decline in traditional religion is apparently not associated with an increase in
skepticism about the paranormal. Counterintuitively, belief in the general
gamut of the paranormal (and ghosts in particular) rises with education. Graduate
students are more likely to believe in haunted houses than college freshmen. Nor
is science education more likely to produce skeptics than the liberal arts. See the 2012 study Science Education is No
Guarantee of Skepticism by Richard Walker, Steven J. Hoekstra, and Rodney
J. Vogl: “We were interested in whether science test scores were correlated
with paranormal beliefs. For each sample, we correlated the participant’s test
score with their average belief score. Across all three samples, the
correlation between test scores and beliefs was non-significant (CBU
r(65)=-.136, p>.05; KWU r(69)=.107, p>.05; WSSU r(70)=.031, p>.05). In
other words, there was no relationship between the level of science knowledge
and skepticism regarding paranormal claims.”

I think it is fair to say that believers
in ghosts want to believe in them. That’s why most of us believe the things we
do, truth be told – as faulty a reason as that may be for doing so. Being a
ghost doesn’t sound like much fun, but I can see how some might regard it as
better than nothing. I remain unconvinced. If I decide to reduce the bangs and
knocks in the house, rather than hold an exorcism I’ll have the furnace serviced. If
it turns out I’m wrong, however, I promise to haunt this house for as long as
it stands. I won’t harm the next occupants. I’ll unshackle that troll though,
and I can’t speak for him.
That life is fleeting is hardly an
original thought. I attended a repast
for a departed friend earlier this week, so the thought was more on the surface
than usual, but mortality is always something at the back of our minds. It is
why people have bucket lists. It is why we feel guilty about wasting time. Just
last week a somewhat younger fellow in my circle of friends was expressing his
funk over where he is “at this stage” in life: alone and essentially
property-less. One could rephrase that (and I did) as free without burdensome
responsibilities though I understand why he didn’t find that reassuring. This
is a sentiment we hear expressed by people at all levels of financial and professional
success. One woman (whose property I showed back in my broker days) with a
seemingly healthy family in an $800,000 house once said to me with a head shake
while leaning on her Mercedes, “I can’t believe this is my life.” She meant
that in a bad way, not in an “I’ve hit the jackpot” way. Despite material
superficialities, she may well have had solid reasons.

In a world where there is always someone
who outshines us in ability and achievement, most of us feel like
underachievers and laggards much of the time. We can feel this way at any stage
of life, but the “middle-age crisis” is the classic event and for good reason.
Time really is slipping away from us at that point: our range of possible
futures constricts. This often leads to rash decisions from the thought “If I
don’t do this now [get married, get divorced, quit my job, become an artist,
study philosophy, have an affair, start a business, backpack through India, or
whatever], I never will.” Sometimes the decision works out through luck or good
planning, but more likely it’s a mistake, and a major life mistake made at 45 or 50 is less recoverable than one made at 25. (This is why such blunders are less of
a “crisis” at 25 even though the feelings may be just as intense.) I speak from
personal experience: time by itself is the wrong reason to do anything. First
think twice to determine if it is something you would still want to do if not
rushed for time. Then think it over once more. If the answer is still yes it
might be the right move. A second, third, or even a fourth thought would have
benefited me in my 40s. It took me far too long to stop worrying
about life benchmarks. There is something to be said for making peace with
underachievement. Don’t get me wrong: if winning those trophies (real or
metaphorical) makes you happy then by all means go for them. But if they don’t,
don’t. In the end, the needlessly unhappy life is the wasted one. An entertaining
little book that makes just this point is The
Underachiever’s Manifesto: The Guide to Accomplishing Little and Feeling Great
by Dr. Ray Bennett. In it he expresses a fundamentally Epicurean (in the
classical sense) world view and advocates a leisurely approach to life: “By now
you should be completely confident that underachievement is the key to
happiness in your life and for everyone else around you, so stop worrying about
not being perfect.” Accomplishment, he tells us, is in the eye of the beholder.
He quotes Pablo Picasso of all people: “You must always work not just within,
but below your means. If you can handle three elements, handle only two. If you
can handle ten, then handle only five. In that way, the ones you do handle, you
handle with more ease, more mastery, and you create a feeling of strength in
reserve.”
None of this means we shouldn’t write
that novel or record that song or whatever it is we always meant to do but
didn’t. Bennett simply reminds us that the point of doing those things is to
have fun: something we tend to forget when we drive ourselves to meet some
external standard. Unless we really enjoy our work, perhaps being a workaholic
is the wasted life… and life is fleeting.
Betting on weather in NJ, whether days
ahead or months, is a fool’s game. Nonetheless we all try regardless of our expertise
or its lack. The smartest of the foolish money is on a coming winter season of
heavy snow, much like last winter, in consequence of the recurrence of La Niña, a weather pattern
caused by cool water temperatures in the Pacific. Whether those bets pay off or
not, it won’t hurt to prepare. Having been snowbound more than once last winter
with two 2WD vehicles unable to navigate my driveway much less the roads, I
replaced my sedan a couple months ago with an All Wheel Drive Trailblazer. I’ve
stored 20 gallons of fuel in case the heating oil delivery truck can’t make it
up my driveway as happened a few times last year. I’ve tied tarps on equipment and
machinery (e.g. AC and pool filter) that fare better if not infiltrated by
snow. Yesterday I was up on the roof of the barn cutting back overhanging
branches. Besides the risk of them doing damage by breaking, snow causes the branches
to bend down onto the roof, which is not good for the shingles. I used a
Sawzall and manual branch cutters for this job because I didn’t feel
comfortable waving a chainsaw over my head while balancing on a roof peak.
With or without whirring blades, I’m
cautious these days on roofs – even a one-story roof. There is a family
history. My dad had a nasty encounter with a Skil saw as long ago as the 1950s when he fell through
rafters. He was young and he recovered. I’m not young but I’d be averse to
repeating the event even if I were. Besides, my bones probably would object to
the impact from the fall alone, and not just because I’m (let’s not mince
adjectives) old.
We of the 21st century are
more fragile than our ancestors, who bounced better. The further back in time
one goes the greater the fragility gap. Skeletal evidence from archeological
sites reveals just how much bone density has dropped. A 2017 study of remains
from 5300 BCE to 100 CE showed the ancient bones to be 30% stronger than those of average 21st century people. They are stronger even than those of
most modern athletes: “humeral rigidity exceeded that of living athletes for
the first ~5500 years of farming.”
An interesting book on biological
changes in humans over the past few tens of thousands of years is Primate Change: How the World We Made is Remaking
Us by Professor Vybarr Cregan-Reid. The changes include higher rates of
cardiovascular disease, smaller and more cavity-prone teeth, smaller brains, allergies,
back pain, higher rates of myopia, etc. A few of the changes are genetic: brain
size, for example, which is down about 10% from the peak some 30,000 years ago;
this is a global phenomenon that is apparently an effect of self-domestication as
human populations rose and forced more social interaction. (Domesticated
animals almost universally are smaller brained than their wild counterparts.) Most
of the changes are either purely environmental or epigenetic. Epigenetic
influences change the way genes are expressed even though the genes themselves are
unchanged. These include such things as childhood diet and exercise. Unlike
purely environmental effects (e.g. adult exercise) these changes (e.g. height
and foot structure) are irreversible in adulthood. They also are partly
heritable, which came as a surprise to researchers a few decades ago. In general,
residents of advanced countries today are weaker than our ancestors (and modern
day hunter-gatherers for that matter) in almost every imaginable way. We
somehow live longer anyway, but that is a testament to how safe we have made
our world rather than to our innate fitness to live in it. Our sedentary
lifestyle is mostly to blame. A desk job (including school) is as deleterious
to health as smoking. An indication of how inactive contemporary humans are is
that on average each burns fewer than 300 calories per day over the basic metabolic
minimum; Paleolithic hunters burned more than 1,000, as their remaining modern-day counterparts still do.

Based on long-term trends (especially of
the last few centuries) the future of human biology is not rosy. We are not
realistically about to give up the creature comforts that make us weak. However, other interventions are possible. There exist today drugs that mimic the
effects on the body of rigorous daily exercise with all its benefits; they are
not approved for use (the side effects are undetermined) but they might one day
be on drug store shelves. Then there is the prospect of bioengineering (long
promised but little delivered) that might put us into shape without medication
and despite our laziness. If we do tinker with the genome, I suggest aiming
(among other things) for a greater capacity to self-regulate body temperature so we can stay warm in the cold.
We then could face the winter with less trepidation and lower heating bills –
though higher food bills. I don’t mind eating an extra meal. Oh, yes: we should
re-toughen those bones, too, to improve our bounce.
Not all the friends of our youth are
people we have ever met. Some of them are artists, musicians, and authors with whom we have one-sided friendships. Some of them died before we were born: long before. One-sided
relationships are called parasocial, but there is nothing para- about their
influence. Hearing an old song or revisiting a favorite book evokes every bit as
much nostalgia as the last high school reunion. Maybe more. It’s a feeling I
get whenever reopening a novel by Wells, Asimov, Heinlein, or any of several
other authors whose works already had claimed space on my shelves in my second
decade. It is just as well that, more often than
not, we don’t ever meet the actual artists – except perhaps in some cases for a
minute at a book signing or some similar event. Art says little about the character
of the artist. They might be very much what they seem in their work or they
might be radically different. They might be pleasant or they might be jerks. It
makes little difference to the value of their books, but sometimes we can’t help
but wonder. For this reason the title When Nietzsche Wept by Irvin D. Yalom (Professor
Emeritus of Psychiatry at Stanford) caught my eye. Period novels with
intriguing subject matter still can manage to disappoint, but this one did not.
This is a well-written and extremely well researched historical novel set in
the 1880s and featuring Friedrich Nietzsche, Lou Salome, Josef Breuer, Sigmund Freud,
and other key thinkers of the day. Yalom conveys a real sense of the personalities
and he has a firm grip on their ideas, which were sometimes stuck in the 19th
century and at other times transcendent.
Friedrich Nietzsche is one of my old
parasocial friends. He shook up a lot of my preconceptions in my late teens and
early 20s. I first read him while taking a college class on classical Greek tragedy – not as an assignment but just because I had seen him referenced and wanted to see what he had to say. Nietzsche’s The Birth of Tragedy (Out of the Spirit of Music) was a revelation
not only for its deep insight into Greek drama but into human nature. The book
also has a lot to say in a comparative way about Wagner. Truth be told, at the
time I knew Wagner best from Bugs Bunny but I soon remedied
that, though without becoming a Wagner fan. (Nietzsche himself later broke with
Wagner over the latter’s anti-Semitism.) I soon followed The Birth of Tragedy with Thus
Spoke Zarathustra [Also Sprach Zarathustra]
and then with several of Nietzsche’s other books. The Walter Kaufmann
translations were and are still the best. When called for jury duty for the
first time, I spent the hours in the courthouse jury pool waiting to be chosen
or not for a case by reading Beyond Good
and Evil. I didn’t think about it at the time, but that title might have
raised a few eyebrows in that venue. Nietzsche’s writing career lasted only a
decade, but he was prolific in that decade. His ill health and thoughts of
mortality intensified his drive to produce. I understand the feeling: most of
my short stories and my only novel were written in the space of a few years when
my life circumstances caused me to be feeling my mortality. None of my writings
is as deep as anything published by Fred, but we can’t all be mad geniuses.
Nietzsche suffered a mental collapse in 1889 (tertiary syphilis is the usual diagnosis)
and eventually ceased speaking. He was tended by his sister until his death. He
remained obscure throughout his productive period but his fame rose as soon as
he was no longer capable of being aware of it. His influence extended beyond philosophers
to artists such as Strauss, who wrote the heavy-handed but impressive tribute Also sprach Zarathustra in 1896. As noted, Nietzsche often failed to rise
above his time, and on those occasions he induces most readers (including me)
to shake their heads and sigh. But he more than made up for it the rest of the
time. He and the existentialists who followed him caused me to engage in
introspection of a kind that I had neglected until then. It was less a matter of showing
the way than showing we all choose our own ways – even if most of us opt for
well-trodden crowded paths. I owe him a lot. The closer one looks at
the life he actually lived, however, the more he looks like a taxing friend to
have had. I’m happy to keep him at arm’s length via Yalom’s pages.
Strauss – Also sprach
Zarathustra [initial fanfare]