I go out for breakfast about three times
per week at my favorite local hole-in-the-wall diner. Naturally, they know me
pretty well there. As soon as I walk in the door a server (usually Dawn) sets
down a mug of coffee (black) and a bottle of Tabasco sauce at an available
booth (isolated by plastic barriers these days). I haven’t had to ask for the hot
sauce in years. My breakfast choice varies (or more accurately cycles) among
chili jalapeno omelet, country fried steak and eggs (over easy), eggs on prime
rib hash, breakfast burrito, blueberry pancakes with a side of sausage, and a
few others. Hot sauce goes on all of them – OK, not on the pancakes, but on the
side of sausage. It doesn’t have to be Tabasco. Pretty much any brand will do,
but of the two available at the diner that’s my preference. (Another
[pre-covid] customer once asked me if I was a Roswell alien, an obscure pop
culture reference I actually got.)
A peek in my fridge
People differ in their taste for spice.
Mine probably originated in childhood from my mom’s theory that no meal
preparation can go so far wrong that it can’t be rescued by enough black
pepper. My dad didn’t agree (he didn’t complain but he didn’t agree) but to me a
crust of peppery seasoning on just about anything seemed perfectly normal. Soon
after I started making or buying my own meals I upped the ante with chili
peppers and hot sauces. They are an acquired taste rather than an innate one.
People raised with blander food traditions usually have to be eased into an
appreciation for hot spices, if they ever acquire one at all. (Curries are a good
way to start, since they can be stepped up gradually from mild to fiery.) The
reason is biological: hot spices trigger pain receptors. Piperine (in black
pepper) and capsaicin (in chili peppers) both fire TRPV1 receptors, mimicking
the sensation of heat. Why is that pleasurable? Like the bite of high proof
bourbon or the bitterness of strong coffee, the attraction is hard to explain
to newbies. For someone acclimated to them, however, they enhance flavors in
ways that are worth having sweat break out on one’s forehead.
Spices have a long (pre)history in human
recipes. Coriander seeds have been found at a 23,000-year-old site in Israel.
Traces of garlic mustard have been found in Danish pots dating back 6100 years.
Romans poured garum on food as freely as Americans pour ketchup. Garum is a spicy
fermented fish sauce, which is a bit much even for my taste though such sauces
are still common in some Asian cuisines. The European Age of Discovery was
prompted by a search for spices – or rather by the profits from spices. The jackpot
was found in the New World. In 1400 chili peppers existed only in a swath from
Mexico (where they were first cultivated) to western South America. Today they
are grown and eaten all over the world, though they are still most strongly
identified with Mexican cooking. Nevertheless, spice preferences are
literally a matter of taste. I’d never argue with anyone who prefers mild to
hot salsa. The right amount of pepper at any given degree of hotness on the
Scoville scale is the amount you like, whether zero or a fistful. Besides,
people differ in their gastric acid responses: the same chili pepper that
settles one person’s stomach will give another heartburn. However, if you are
gastronomically tolerant of piperine and capsaicin, there are a number of health
benefits from liberal doses of them. According to the Penn Medicine website,
red chili peppers have been shown to reduce LDL cholesterol and to lower the
risk of stroke and heart disease. By notching up one’s metabolism they promote
weight loss. Once it gets past the stomach (where, once again, people differ in
gastric acid response) capsaicin aids in digestion and gut health by attaching
to receptors that release anandamide, an anti-inflammatory. Capsaicin even has
value as a topical treatment for pain relief by first firing and then numbing
pain receptors.
Today I’m skipping breakfast, but a
steak-and-cheese quesadilla is calling out to me for lunch. I have just the
right ghost pepper salsa for it.
The films and series of the Star Trek universe
include both swans and turkeys. Overall Star
Trek: Generations (1994) is a turkey, but it does contain some
poignant dialogue about time, as when Dr. Soran remarks, “It's like a predator;
it's stalking you. Oh, you can try and outrun it with doctors, medicines, new
technologies. But in the end, time is going to hunt you down... and make the
kill.” Other animals run from predators by instinct. Humans do too, but because
we are the only creatures fully aware of our own mortality, our reasons for running have
extra depth. Our awareness has motivated our obsession with time, which is
always running out. We live by the clock and the calendar.
In a way we always have, as the giant calendar that is Stonehenge attests, but
the ancients nonetheless viewed and experienced time in different ways than
modern folks. That applies to daily activities, yearly activities, and life
courses, but also to their notions of deep time, such as whether it is linear
or cyclical and whether the world has a beginning and an end (their etiology
and eschatology).
Time and Temporality in
the Ancient World, edited by Ralph M. Rosen, is a title that caught my eye in
E.R. Hamilton’s book catalogue in part because of my general interest in the
subject and in part because for various reasons I’ve been feeling a bit ancient
myself lately.
The book contains 9 scholarly articles
about the experience and consideration of time in ancient Western and Eastern cultures.
My primary caveat is that much of it is written in professor-ese. In the first
entry, for example, the author defines “event” and “process” at length: “We can
make the distinction between event and process because we are content to use
time as the axis on which events are mapped and through which processes flow;
time acts as our independent axis of measurement and definition.” He goes on
like that for pages. I’m aware of the utility of defining one’s terms, but the
meat of his article is perfectly well served by the normal Webster’s definition
of those two words, which he only managed to obfuscate. Beneath the unfortunate
prose, however, there are insights into the historical sense of the ancients in
the Middle East, India, China, and Greece and the ways they tried to reconcile
the clearly cyclical aspects of nature (seasons, astronomy) with an apparently
uni-directional arrow of time from birth to death.
The Sumerians, credited with the first
written language some 5000 years ago, had a sexagesimal (base 60) number
system, which helps explain some aspects of their system of daily time that we continue
to use today: 24 hours to a day, 60 minutes to an hour, 60 seconds to a minute.
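Just to spell out the arithmetic we still inherit from that scheme, a clock reading is really a number written in a mixed base of 24, 60, and 60; here is a tiny Python sketch of my own (nothing from the book):

```python
# Read a count of seconds since midnight as "digits" in the mixed
# base-24/60/60 system described above.
def hms(seconds_since_midnight: int) -> tuple[int, int, int]:
    minutes, seconds = divmod(seconds_since_midnight, 60)
    hours, minutes = divmod(minutes, 60)
    return hours, minutes, seconds

print(hms(45_296))      # (12, 34, 56), i.e. 12:34:56
print(24 * 60 * 60)     # 86400 seconds in a day
# Base 60 is convenient because it divides evenly by 2, 3, 4, 5, 6, 10, 12, 15, 20, and 30.
```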
The curious thing about the last of those is that Sumerians did not have the
ability to measure a second, but they defined the time interval anyway. They
could only approximate a minute. To supplement naked-eye observations of the
sun, moon, and stars, they had no more accurate mechanical timepiece than a water
clock. In simplest form a water clock is just a leaky bowl that drips water at
a constant rate. (Sand hourglasses didn’t come along until the 8th
century CE.) The Romans and Chinese later (and independently) developed far more
complex water clocks with gears and escapements. They were notoriously
inaccurate, but they still were useful on cloudy days and nights. They were recalibrated
regularly to sundials, which themselves are inconstant due to the seasons.
The 7-day week also dates back to the
Sumerians. It very nearly matches a quarter of the lunar cycle (the span between the moon’s principal phases) and was
associated with the seven known planets (the ancients regarded the sun and moon
as planets) whose names (in English mostly via their Norse equivalents) still
attach to the days today. Used extensively by astrologers, the 7-day week
co-existed with 8-day market weeks and other divisions of months (e.g. ides and
nones) well into the Roman era, eventually almost entirely prevailing in the 4th
century CE due to growing Biblical influence and the directives of Emperor
Constantine. By this time the 7-day week was current in China as well (possibly
via India where it was also current), again largely
for astrological convenience.
Enumerating time was helpful in a host
of practical ways, but practicality may not have been the initial impetus to do
it. It might well have been a sense of mortality and the spiritual questions
associated with it. The persistence of astrology alongside astronomy and
timekeeping seems to argue for this. Grand cycles of time beyond normal human
scales, but paralleling astronomical cycles, are common in Eastern, Western,
and New World mythologies. So are notions of personal rebirths whether
literally through reincarnation or into some afterlife. Ancient timekeepers, I
suspect, felt the predator stalking them and sought some hope of escape hidden somewhere
in the secrets of the celestial clock and in the large and small cycles of
which they themselves were a part.
The personal experience of time has an
odd duality in modern as in ancient times. We tend to live each day as though
we have unlimited numbers of them even though each year seems to pass faster
than the last. The effect shows up early. I remember back at the beginning of
my junior year at George Washington University my friend Donald (hi Don, if you’re
reading this) commenting, “I feel I’ve always been here [at college], and
always will be here, but that I got here yesterday.” That isn’t the sort of
time that can be measured by an atomic clock, but I knew what he meant.
Unspoken but understood was that he (and I) would leave there tomorrow… and so,
in a sense, we did. It reminds me of an old Henny Youngman joke: “Live each day
as though it’s your last. One day you’ll be right.”
No one
over 30 would ever write a song with this title.
That
everyone in this clip (if still alive) is a senior citizen is explanation
enough why.
2419 years ago Socrates was sentenced to
death, officially for impiety and corrupting youth (with philosophy), but really
because he annoyed people by proving they were talking nonsense when they
thought they were smart. According to his own account, as related by Plato, the
priestess at Delphi had called Socrates the wisest man in Athens. Socrates held
no such opinion of himself so he interviewed statesmen, craftsmen, poets, and
other persons of skill and intelligence in order to prove the oracle wrong.
Instead, he found that everyone he interviewed, though unquestionably competent
in their particular fields, believed their competence extended far beyond their
fields including into matters of ethics, meaning, and political policy. He
found it easy to get them to contradict themselves and show they didn’t know
what they were talking about. He decided the meaning of the oracle was that he
was wise for at least recognizing the limits of his own knowledge and wisdom.
Most of us know that brains and wisdom are
not well correlated. There is no fool like a smart fool. In his book The Intelligence Trap: Why Smart People Make
Dumb Mistakes, David Robson uses an automotive simile. Higher intelligence
is like having more horsepower under the hood of your car; it doesn’t make you
a better driver. If you’re a bad driver, more horsepower just makes you more
dangerous on the road. The trouble is that few people self-identify as bad
drivers, metaphorically or otherwise. (In the US 99% of drivers think they are
above average!) In order to avoid tweaking current ideological noses, for a
social policy example, let’s return to Plato and revisit The Republic. Only someone as brilliant
as Plato could imagine and argue persuasively in favor of such a proto-fascist
hellhole.
Though I.Q., a measure of abstract and
analytical thinking, is broadly associated with academic and career success,
the correlation isn’t anywhere near as tight as one might expect. Psychologist
Robert Sternberg has had better predictive success by measuring three
abilities: analytical (bare I.Q.), creative, and practical. Yet people who
score well on some mix of these, however successful career-wise, are as apt to
make terrible life choices in other ways as the rest of us. They are as apt as
well to hold bizarre beliefs and to think non-analytically outside of their narrow
specialties (sometimes even there). The percentage of people who believe in the
general gamut of the paranormal (from ghosts to astrology to
clairvoyance), for example, actually rises with education. Consider Nobel prizewinning chemist
Kary Mullis and his… let’s call them nonmainstream views on alien abduction and
on the (according to him) non-HIV cause of AIDS. As Michael Shermer also has
written, intelligent people are better at rationalizing evidence that
contradicts their views, and so cling all the harder to beliefs in fairies or
spiritualism or transdimensional lizards or the Illuminati conspiracy or what-have-you.
Most intelligence traps are more mundane
than those, but all the more consequential for that. They come in many forms.
One is to see things excessively in terms of your own specialty. The German for
someone like this is Fachidiot, which
is a concise way of conveying the longer but expressive English adage, “To a
man with a hammer everything looks like a nail.” Another is for genuinely
competent people to be overconfident in their competence. The results can be
tragic. Some 80,000 people per year in the US die due to misdiagnosis by
generally capable physicians. In 2004 self-confidence by FBI experts in their
own fingerprint analysis capabilities led to the arrest and trial of Brandon
Mayfield for the Madrid bombings. The FBI identified his fingerprint on a bag
found at the scene, and FBI fingerprint experts couldn’t imagine being wrong.
They were wrong. Charges were dropped when the Spanish National Police (whose
doubts about the FBI’s identification had been ignored) arrested the real
perpetrator whose fingerprint actually was a match. (Mayfield won $2 million in
damages.)
Robson gives a lengthy list of other
failings that are at least as common among the gifted as among average folk.
They include blind spots, anchoring, confirmation bias, sunk cost bias,
groupthink, motivated reasoning, mindlessness, meta-forgetfulness, and more.
Thinking analytically requires attention
and effort, and most of us prefer to rely on intuition when we think we can get
away with it. Most people, including Ivy League students, get this one wrong, for
example: Jack is looking at Anne, but Anne is looking at George. Jack is married
but George is not. Is a married person looking at an unmarried person? Yes, No,
or Cannot Be Determined? Most people say Cannot Be Determined, which is the
intuitive answer because Anne’s status is unknown, but the correct answer is
Yes. (Anne is either married or not; if not, Jack is looking at her; if so, she
is looking at George.)
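For what it’s worth, the answer can also be checked mechanically by enumerating Anne’s two possible states; here is a minimal Python sketch of my own (not something from Robson’s book):

```python
# Brute-force check of the puzzle: Jack is married, George is not,
# and Anne's status is unknown, so try both possibilities for Anne.
jack, george = True, False  # True means "married"

for anne in (True, False):
    gazes = [(jack, anne), (anne, george)]   # (person looking, person looked at)
    hit = any(looker and not target for looker, target in gazes)
    print(f"Anne married = {anne}: married person looking at unmarried person? {hit}")

# Both cases print True, so the answer is Yes no matter what Anne's status is.
```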
The good news is that we can train ourselves to think better (including with regard to propaganda and
misinformation), and Robson discusses the methods. The hard part is having the
intellectual humility to do it. As in AA, one must first admit to having the
problem.
I have a small property I’m readying for
sale. The repairs and fix-ups required a dumpster, which I placed not there but
on my primary property (my home), since my home is in the woods; the other property
is in plain view on a well-trafficked road where everyone in the neighborhood would be
tempted to use the dumpster. Besides, as mentioned in the earlier blog post Junque, my barn and garage attic teem
with mismatched stuff left behind by my builder father, so I took the
opportunity of the dumpster’s presence to reduce (though far from eliminate) the
excess from them as well. While removing some items from the attic I uncovered
a small surprise.
I should preface this by mentioning that
my mom was very unsentimental about material things, photos and documents (such
as journals) excepted. She was personally warm, but artifacts didn’t mean that much
to her and she was wary of clutter. “When in doubt throw it out” was her motto
in such things. It’s not a bad motto really, and I’ve generally tried to follow
it though I make more sentimental exceptions to it than she did. My dad (b.1926)
by contrast was not sentimental on the surface but he did save keepsakes (which
I now own) such as a 6” (15 cm) combination safe from the 1920s, a cast iron
toy tractor from his childhood, some shelving he made in shop class in school, and
his uniforms from WW2. (My mom, on the other hand, didn’t save her wedding dress
past the date when it was clear my sister wouldn’t use it.) Anyway, in the
attic between wall studs I encountered his baseball bat from the 1930s, which
he gave to me in 1960. The last time I remember using it was 1970 before I attended
college. I really haven’t thought about it since then: out of sight out of mind.
Yet, there it was. I’m pretty sure my dad put it there (who else would have?) out
of view to avoid tempting my mom to make one of her “Oh, he’ll never use this
again” decisions to toss it.
The bat still feels the same, with a very
familiar balance, but I wouldn’t hit a fastball (assuming I still can hit a
fastball) with it anymore. Ash is a pretty tough wood, but after nearly 90
years of rough use and semi-exposure to the elements, I doubt it would survive an
impact with an 80 mph ball. Maybe it would, but I wouldn’t risk it. It is still a suitable
self-defense weapon, however, so maybe it has value as more than just a
keepsake presently occupying a corner in my bedroom.
All of my keepsakes are for me alone
since I don’t have kids whose closets can be cluttered in turn. Odds are if I
did they wouldn’t be interested anyway. Of course, ultimately no one has
biological legacies, for humanity is temporary, just as individual humans are.
We are doomed as a species. One might hope the end comes later rather than
earlier, but the end will come. How will the world end? Back in 2008 the Future of Humanity Institute at Oxford took a stab at calculating
odds of human-caused catastrophic global events prior to 2100. The researchers
divided risks into three categories: events causing at least 1 million deaths,
events causing at least 1 billion deaths, and extinction events. I’ll mention
only extinction events except for the report’s prediction of a 30% chance of a
pandemic by a genetically engineered virus that kills more than 1 million
people. (Whatever the origins of Covid – crossover from bats is still the
official explanation – it has killed 2.8 million at this writing.) The chance
of actual extinction from an engineered pandemic was given at 2% by 2100, which
is small but non-negligible; the extinction risk from a natural pandemic by
contrast is only 0.05%. The risk from Artificial Intelligence was estimated at
5%. The extinction risk from war (all types of war, which might or might not involve
nuclear, chemical, and biological weapons) came in at 4%. The total risk of
extinction from anthropogenic causes of all types combined by 2100 came in at 19%.
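Just to illustrate the arithmetic (a toy calculation of my own, not the FHI’s method, and it unrealistically treats the risks as independent), individual risks don’t simply add; they combine as one minus the product of the chances of escaping each one:

```python
# Toy illustration only: combine the quoted extinction risks as if they were
# independent events. (The FHI figures are survey medians, not computed this way.)
risks = {
    "engineered pandemic": 0.02,
    "artificial intelligence": 0.05,
    "war (all types)": 0.04,
}

p_escape_all = 1.0
for p in risks.values():
    p_escape_all *= 1.0 - p      # chance of escaping this particular risk

print(f"Combined risk if independent: {1.0 - p_escape_all:.1%}")   # about 10.6%
# The report's 19% total for anthropogenic extinction covers additional
# categories beyond the three listed here.
```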
In 2008 the decline in fertility that is currently much in the news, though
noted, was not yet alarming many researchers, so the risk that humans would
just stop reproducing was not considered; in any event, 2100 is much too early to
be a global doomsday for that reason. FHI focused on human-caused disasters,
but the biggest threats to the species are out of humans’ hands. Natural
climate change can be truly dramatic. Where I am currently sitting there was
once a mile of ice overhead. There will be again one day. Episodes of mass vulcanism
have occurred before and will occur again. Whether or not they would drive us
to extinction, they’d make civilization pretty hard to maintain. And, of course,
there is the punch from outer space, which is not just a plot device for cheesy
scifi movies. In 2029 the large asteroid Apophis will pass inside the orbit of geosynchronous
communications satellites. It will be visible to the naked eye from the ground.
The odds of it striking earth are vanishingly small on that pass, but when it
returns in 2068 the odds, while still small, are not vanishingly so. Sooner or
later we will be hit by a big one as earth has been hit multiple times before.
Even if we somehow escape such celestial catastrophes as lethal asteroids, mass
solar ejections, gamma ray bursts from nearby supernovas, and so on, the sun
itself will kill us. It is growing hotter as it evolves and will make earth uninhabitable
in 500,000,000 years – some say an order of magnitude sooner than that.
So, I’m not much worried about my bat as
a legacy, though it will remain a personal keepsake for as long as I’m here.
Hmmm, perhaps tonight is a good night
for an apocalyptic movie: maybe Gregg Araki’s 2010 Kaboom about a hedonistic college student who becomes aware of a
doomsday cult seeking the end of the world – an end of the world for non-cultists
at least. There are real cults of this stripe, such as Aum Shinrikyo, members
of which were responsible for the Tokyo Sarin gas attack of 1995. Warning: Araki
has his (weird) moments but he is not for viewers triggered by sexual content (albeit R
rather than NC-17).