Yesterday
I walked into the kitchen pantry while my mind wandered as it often does. I
suddenly realized I had forgotten what I had gone in there for. I’m pretty sure
this wasn’t a senior moment (something one always questions after a certain
point) since I commonly did this very thing as a teenager – in fact more
frequently then. In those days my mom (overestimating my academic achievement)
called me the “absent-minded professor.” Then, as now, my mind tended to wander
onto events of the day, future plans, recent conversations, and various other matters
so that I would forget the task at hand. I remembered what I was after
yesterday immediately after exiting the pantry by the way: it was toothpaste,
which in fairness is not something one usually retrieves from a pantry. Earlier
in the morning I had noticed the tube in the bathroom was empty. This
sort of forgetfulness has been understood in broad outline for decades. Short-term
and long-term memory are separate mechanisms. The former normally holds a thought
for only about 30 seconds – short enough to forget why one walked into a pantry. To
kick a memory into long-term storage in the cortex (a process of consolidation
mediated by the hippocampus) you either have to deliberately concentrate on it
or by happenstance associate it with some stimulus such as pain, fright, pleasure, intrigue, and
so on. This is why soon after an hour’s commute we can’t remember most of it –
only the parts where we had to concentrate because of some stressful road
conditions. By the next day we might forget even these. In essence, the default
setting of the brain is to forget; to remember requires effort, whether evoked
spontaneously or by intent.
[Photo: forget-me-nots (Myosotis sylvatica)]
Storing
something into long-term memory changes the wiring and chemistry of neurons in
the cortex so that some connections carry signals more easily. These engrams are our
memories, but even these frequently fade. (They are also subject to
manipulation, which is why “recovered memories” in particular are notoriously
unreliable.) A memory can be made not just long-term but permanent either by
repeatedly calling up the memory (thereby reinforcing the wiring changes) or by
associating it with some major stimulus. Traumatic stimuli are especially
effective at locking in a memory, but more pleasant stimuli work, too. A very
imperfect but useful analogy is a flexible green tree branch. If you flex it a
little and quickly let it go, it will snap back to its original shape. It will,
in essence, forget the flex: an analog of forgetting a short-term memory. If you
flex it repeatedly, however, or give it a sharp extreme bend, it will not snap
back all the way, but will retain a partial “memory” of the flex. Recent
studies indicate that forgetting is an essential part of the process of making new
memories long-term. Encoding all the
details of an experience would clutter our minds and overload our capacity to
retrieve anything useful from the memory quickly. It’s important to remember at what
corner to make a turn to get to a destination; it is not important to remember
the rust patterns on every storm drain we pass on the way to the corner, so we
don’t. It’s important to remember that a dog growled before it bit, not that
there was a grass stain on its back left leg. Our brains tend to filter out the
unimportant data (though odd details sometimes stick), so we are very good at
remembering the gist of an event (the Darwinian advantages of which are
obvious) but not the surrounding fluff. This is why eyewitness testimony is so
often poor. Was the getaway car green or black? Were there two shots or three?
Eyewitnesses often (in fact, usually) conflict with each other about the details
and they are very open to influence; for example, if told a car was green they
are likely to remember it as green whether it was or not. Recalling
just the gist aids survival because it helps us generalize in useful ways. In
the dog example above, it’s good to generalize that a growling dog might bite,
not that a growling dog with a grass stain on its back left leg might bite. That
is too much detail; we could underestimate the risk of a stain-free growling
dog. This is a commonplace issue in artificial intelligence as well. The
problem is called “overfit,” which is when a self-learning AI attaches too much
significance to random coincidences, thereby diminishing its ability to
generalize and make useful predictions. An effective AI needs to be able to
“forget” the unimportant stuff.
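(For the technically inclined, here is a minimal sketch of my own of what overfit looks like; the polynomial fit, the noise level, and the numbers are all illustrative assumptions, not anything taken from the memory research above.)

```python
# A toy illustration of "overfit": a high-degree polynomial memorizes the random
# noise in a small training set and then generalizes worse than a simple line.
import numpy as np

rng = np.random.default_rng(0)
x_train = np.linspace(0, 1, 15)
y_train = 2 * x_train + rng.normal(0, 0.2, x_train.size)  # true relation is linear plus noise
x_test = np.linspace(0, 1, 100)
y_test = 2 * x_test                                        # the noise-free "reality"

for degree in (1, 12):
    coeffs = np.polyfit(x_train, y_train, degree)          # fit a polynomial of this degree
    test_error = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: test error = {test_error:.4f}")

# The degree-12 fit hugs every noisy training point (the grass stain on the back
# left leg, so to speak) and typically shows the larger test error; the degree-1
# fit keeps only the gist and predicts better.
```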
There are people with extraordinary long-term memories. The condition is called hyperthymesia,
but it is a relative thing. People with hyperthymesia forget, too; they just forget
less than the rest of us. This sounds like a great advantage, and in some ways
it can be. The actress Marilu Henner famously has this type of memory; it
doesn’t seem to cause her any trouble and it certainly has helped her remember
her lines. Overall, though, people with hyperthymesia do not do any better in
life than other people; on the contrary, they are at higher risk of PTSD and
other obsessive disorders. Experiments
with rats have demonstrated various ways that neurotransmitters (dopamine in
particular) affect memory retention. Understanding the process better might
help with the treatment of PTSD (not enough forgetting) on one end and
Alzheimer’s (too much forgetting) on the other. The sci-fi trope of excising or
implanting specific memories does not appear to be in the cards for the
foreseeable future, however, which is probably just as well. I’ll
end here since there is something I need to retrieve from the pantry. There was
something else I wanted to say, but I forget what it was.
I
don’t write blogs for money, which is fortunate because, like two-thirds of
bloggers, I don’t make any. OK, that’s not entirely true, but so close to
entirely as to make no difference. All the money passed along from Google for all the ads on my site since the first one in January 2009 (a few cents
here, a few there) might add up to enough to cover the cost of today’s lunch (a
Philly cheesesteak from Marilyn’s) but I doubt it. Nonetheless, this is my 800th
post (a nice round number that inspired this bit of retrospection), which is
somewhat more than one post per week over the past 12 years. The number of
readers varies one week to the next, but there are a few hundred in an average
week, which is not enough to be remunerative but enough to be gratifying. It is
possible to make money blogging, and there is plenty of advice on the net and
elsewhere telling you how to do that. The key is specializing in a particular
niche such as dietary advice, financial advice, relationship advice,
or…well…advice on how to make money blogging. And then there is politics, the
more dehumanizing of one’s opponents the better. While this always generates a
lot of hateful feedback, regardless of what the blog’s ideological content might
be, it also generates traffic. (I don’t do this: not because I lack a political
viewpoint [I’ve been active in a third party] but because there is more than enough
of all that on the net. Besides, political “discussions” tend to drive out all
others in a sort of Gresham’s Law of discourse, and there is so much else to write
about.) So, even though 90% of bloggers earn nothing or a pittance, 2% earn
six figures or more. Many of these successful bloggers are associated with related
commercial operations, such as travel blogs on sites that sell travel packages
or health blogs on sites that sell dietary supplements. Some of the statistics
are puzzling. For example, blogs over 2000 words attract the most readers (2500
is the sweet spot) yet the average reader spends only 37 seconds on the site.
According to the Wikipedia article on speed reading, a typical reader reads at a rate
of 250 words per minute while “proficient readers are able to read 280–310 wpm
without compromising comprehension.” So, I have to assume those 2500-word blogs
aren’t getting a very thorough examination. But if the goal of the poster is
just ad-clicks, I guess that doesn’t matter.
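(For the curious, here is the back-of-the-envelope arithmetic behind that assumption, plugging in the figures quoted above; this is my own quick sketch, not part of the cited statistics.)

```python
# Rough reading-time arithmetic for the figures quoted above.
post_words = 2500          # the supposed "sweet spot" post length
reading_speed = 250        # words per minute for a typical reader
visit_seconds = 37         # average time reportedly spent on the site

minutes_needed = post_words / reading_speed               # 10 minutes to read the whole post
words_actually_read = reading_speed / 60 * visit_seconds  # ~154 words in 37 seconds
fraction_read = words_actually_read / post_words          # ~6% of the post

print(f"{minutes_needed:.0f} min needed; about {words_actually_read:.0f} words "
      f"({fraction_read:.0%}) read in {visit_seconds} seconds")
```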
For
those who want to try to make money this way, have at it and best of luck. Most
of us have other reasons for doing it. I just kind of like it. I could expand
on that: writing about a variety of topics prompts me to look into them more
deeply, which is fun in itself and provides fodder for dinner conversations
(remember when there were dinner conversations?); it’s a way of keeping one’s
writing skills from decaying too rapidly; and it serves as a sort of online
scrapbook, a reminder of where one’s head was at 5, 10, or more years ago. All
of which means, I just kind of like it. I recommend it to anyone with even a
slight literary itch. It is,
of course, a self-indulgent thing to do, but it has advantages over the other
two similar self-indulgences: autobiography and journals, both of which are
unabashedly about oneself. Blogging at least purports to be about things of
more general interest. Besides, autobiographies are books of lies: some of
commission but especially of omission. Anyone who wouldn’t leave significant things
out of an autobiography has led a singularly uninteresting life. Journals
(unless intended from the start to be for publication), on the other hand, are more
truthful by design, but for that very reason they are unwise in today’s world.
Psychologists, educators, and therapists often urge keeping a journal as a
mental health measure; it is a place where you can express privately all those
things on your mind that you would leave out of an autobiography. There always
has been a risk of forgetfully leaving it lying out where some visitor can sneak a
peek at it, with unfortunate consequences, but prior to the 1990s the damage was
likely to be limited. There also generally was plausible deniability if the
sneak-reader repeated anything. Nowadays with cell phone photos and scanners the
pages can be uploaded in seconds. Privacy never can be assumed anymore, either
for the spoken or written word. It is unwise to commit to writing, anywhere, anything you would not want
publicly exposed for all to see. We write blogs with that in mind from the
start, so it is not so much a problem. Of course,
discretion is not everyone’s strongest attribute. Most people who post unwisely
get away with it most of the time because most of the time nobody cares. But one cannot
always count on disinterest and ignoration. A prime example of misplaced
confidence is the so-called Bling Ring, the teenagers who about a dozen years ago
burglarized celebrities’ homes in the LA area while posting about it on social media.
(The surprising thing is that they got away with it for a year; Sofia Coppola’s
movie The Bling Ring about the events
is worth a look.) But even a minor faux
pas sometimes can go viral. So, with appropriate caution, the reader (if
not already doing so) might consider joining the 600,000,000 bloggers that
already exist worldwide. You could be among the 2% that make a good living at
it. At the very least, in about a dozen years it could pay for your lunch.
“Parasocial” is the term for one-sided relationships of the
sort we might have with celebrities: we know them (to some extent) but they
don’t know us at all. These one-sided relationships existed even in ancient
times when the only media were handwritten scrolls, statues, and word-of-mouth
rumors about the elite and famous, but they got an enormous boost with the
arrival in the 20th century of movies and recordings that seem to
bring us face to face or within earshot. Fans are known to become emotionally
attached enough to these well-known strangers to experience genuine grief when
they die. The phenomenon was examined back in 1985 by Richard Schickel in Intimate Strangers: The Culture of Celebrity
in America, a book that now seems rather quaint in today’s age of virtual
interactions on social media, where Twitter posts can masquerade as actual
conversations. Most of us understand the limitations of parasocial
relationships. Those who don’t understand them form the pool from which stalkers
emerge.
[Photo: Dawn Wells (1938-2020), "Mary Ann"]
A more curious phenomenon, also regarded as parasociality, is
a relationship with fictional characters. This, too, has ancient precedents.
Long before Homer’s epics were written down around 700 BCE (prior to which they
were an oral tradition) people sat around the hearth fires and listened to
bards recite stories of Troy and tales of brave Ulysses. The listeners surely
felt they knew the characters as well as they knew their own neighbors. Again,
modern media make the illusion all the more convincing. In the 1960s there was
a common debate among high school boys over Mary Ann vs. Ginger on the TV show Gilligan’s Island. Being a smart-aleck
teenager at the time, I tended to respond dismissively: “Like, you know they’re
fictional, right?” But in truth I knew exactly why the debaters debated, and
had an unvoiced opinion. Beyond the superficial level of schoolboy crushes,
people often become deeply invested in their favorite shows. They care about
the characters and what happens to them. On some level viewers regard the
characters on Friends as friends. The
prevailing opinion among psychologists is that up to a certain point this is
perfectly healthy: a normal expression of empathy. Below the most surface level
of consciousness our minds aren’t good at distinguishing between fictional
characters and real ones – assuming the scriptwriters and actors are halfway
competent. Dr. Danielle Forshee on her website says “your brain recognizes the
human emotion they are portraying and starts to feel connected to those
characters.” Also, the characters can be avatars of ourselves, and so it can be
cathartic and helpful to see them work through difficulties we’ve faced
ourselves. It can be disturbing to watch them get into avoidable trouble (by
marrying the obviously wrong person, for example) just as it is disturbing to
watch a real friend do it – or, worse, do it ourselves.
This is also why fans get so upset when book series or TV
shows don’t end the “right” way. Many shows get canceled abruptly and so have
no proper endings at all. (It’s a bit like being ghosted in real life.) This
frustrates fans but apparently not as much as deliberately crafted endings that
they don’t like. Google something like “beloved TV shows with hated finales”
and you’ll get a long list including Game
of Thrones, Dexter, Lost, Enterprise, Seinfeld, The Sopranos, and How I Met
Your Mother, among many others. Sometimes the finales are truly slapdash
and unsatisfying, but even well-crafted ones can annoy fans.
I actually like one of the most hated finales of all time: How I Met Your Mother. *Spoilers*
follow, so in the unlikely event the reader hasn’t ever seen the show, skip
this next part. The majority of fans are upset because 1) there isn’t a
“happily ever after” ending and 2) Ted reconnects with Robin after nine seasons
of the audience being told that they weren’t right for each other. Well, life
doesn’t have “happily ever after” endings. Mark Twain commented that love
affairs end either badly or tragically, and the tragic ending in HIMYM is the
way things go – earlier in this case than one might hope, but common enough at
that age even so. Secondly, Robin and Ted weren’t
right for each other when Ted was looking for “the One” with whom to build a
home and family while Robin wanted a childless globetrotting career. Years
later, however, Ted is a widower father of grown children and Robin is successful
at her job and divorced. Their goals no longer clash but they still have a long
shared personal history. They might get on just fine. Times and circumstances
have changed. That doesn’t diminish Ted and Tracy; they were right for the kind
of life they had at the time they had it. (Besides, the point is often missed
that Ted is Tracy’s second pick; in one episode she talks to her deceased
partner, whom she had regarded as “the One,” about moving on.) Maybe my response is a matter of
age: I’m looking at things from the perspective of old done-that Ted, not young
aspirational Ted, but to me the ending makes perfect sense.
That I gave this matter any thought at all is a prime example
of parasociality. So, anyone at whom I snickered in high school for debating
Mary Ann vs. Ginger is fully within his rights to say to me about the
characters in HIMYM, “Like, you know they’re fictional, right?”
Among the books recently occupying my bedside table (they serve as
my sleeping pills) was First Principles:
What America's Founders Learned from the Greeks and Romans and How That Shaped
Our Country by Thomas Ricks. Many of the political disputes that exercise
passions today have eerie parallels in the early years of the Republic, and
there is value in revisiting that history.
Today the Greco-Roman classics for most students are an
educational side note, often with scarcely more attention paid to them than an
assignment of a (totally out of context) bad translation of Aristophanes’ The Frogs in high school. In the 18th
century they were the core of education above the elementary level. All of the
Founders were deeply familiar with them – more so than with the literature of
their own time. The classical authors strongly influenced their philosophical
and political thought both directly from the original sources and indirectly
via Locke, Montesquieu et al. Hence there was an oft-voiced disdain for
democracy (a dirty word until the 1820s) because Aristotle regarded it as a
“perversion” of a constitutional republic just as oligarchy was a perversion of
aristocracy and tyranny of monarchy. The favorable description of the Roman
Republic (Consulship, Senate, Assembly) by Polybius was commonly referenced in
debates over state constitutions as well as the federal one. Also influential
was Cicero: “Statuo esse optime constitutam
rempublicam quae, ex tribus generibus illis regali, optimo, et populari modice
confusa” [I maintain that the best constitution for a State is that which, out
of the three general types, is a balanced mix of monarchy, aristocracy, and democracy].
18th century politicians weren’t subtle about it either: they regularly
referenced classical authors and ancient precedents in their speeches. Jefferson
admired Epicurus, which helps explain the “pursuit of happiness” line. Other authors
who figured prominently in the debates were Plutarch, Demosthenes, Thucydides,
Sallust, Xenophon, Tacitus, and Livy. Conspicuous by his absence was Plato.
Elbridge Gerry, who sufficiently mastered republicanism to devise the
gerrymander, voiced the general opinion by stating, "Plato was not a
republican." Ricks explores how this pervasively classical way of looking
at things (so often overlooked by modern historians) affected the Revolution
and the Early Republic.
The book caught my eye in no small part because I wrote about
the same thing – far, far less ambitiously at only 14 pages – as long ago as
college. Yes, I still have some of those old papers including such page-turners
as The Impact of a Vulnerable Grain
Supply on the Imperialism of Fifth Century Athens, The Historical Writings of Procopius of Caesarea, and A History of Land Use in the Township of
Mendham though I can't imagine who'd ever want to read them.
Ricks has done his research and presents it well. He is not
oblivious to the glaring blind spots in the vision of the Founders (the failure
to address slavery being the blindest of all), but nonetheless credits them for
putting into practice a theory of rights and governance that made their
hypocrisy ultimately untenable – an unmatched achievement in the 18th
century. He rightly notes the importance of the 1800 election. There have been
other fateful elections (1860 most obviously) but without the precedent of 1800
the others wouldn’t even have taken place. With Jefferson’s victory, for the
first time there was a peaceful transition of power to an opposition party that
the losing Federalists truly believed was a threat to the nation. The event
showed such a transition was possible and that the result, while consequential,
needn’t be the end of the world.
The author closes with an epilogue of ten present-day
recommendations that he says are informed by lessons from the Founders and,
through them, from the Greeks and Romans they chose as their mentors. Some of
the points are unexceptionable but others simply don’t follow from his history
of founding principles – or at least one could use similar sophistry to argue just
the reverse. That is not to say his modern-day views on health care funding and
the like are wrong, just that they are not derivable from Constitutional first
principles. They are perhaps not incompatible with them (well, a couple
actually might be) but then neither are opposite positions (well, a couple
actually might be). There is a tendency (also dating to the Early Republic) to
try to achieve by Constitutional interpretation what cannot be achieved by
legislation; this often requires reading words into the text that simply aren’t
there. It’s a political maneuver used by any side that thinks it has a shot at
winning a court decision, but it should be recognized as such rather than as a dispassionate
analysis of Constitutional principles, classically derived or otherwise.
What Ricks doesn’t mention is another takeaway from classical
history, particularly of the Roman Republic, which many of the Founders were so
keen to emulate. Extreme factionalism in the Republic, fueled by ambitious
politicians exploiting class fears and grievances, led to civil war: not once
but repeatedly. An end was put to this only by Augustus Caesar, and then only to
be replaced by non-ideological internal wars among generals. (Rome under the Principate
never did adequately solve the peaceful succession problem.) Americans followed
such factionalism down the path to civil war once. It would be best not to do
so again.
Eleven
days after the solstice is a strange day to start the New Year. The designation
of a New Year’s Day in any calendar system is ultimately arbitrary, of course,
but calendars had their origins in Neolithic times from the tracking of the
phases of the moon and the (apparent) motion of the sun; accordingly, ancient calendars
almost universally begin the year (or try to) at either the winter solstice or the spring
equinox. Julius Caesar, ever unconventional, saw no reason why it couldn’t
start on some other day, however, and it is he whom we have to thank for our
calendar doing exactly that. All early
calendars ran into trouble by trying to include lunar months in a solar year.
The two cycles don’t synch, of course, so some scheme for reconciling them had
to be employed; usually this required throwing in an intercalary month every
few years. The traditional Roman calendar was a worse hash than most (even the
far more ancient Sumerian was better), thereby creating huge problems in everything
from shipping schedules to the calculation of interest payments. Wrote
Plutarch, “Festivals and days of sacrifice gradually got out of place and
finally came to be celebrated at the very opposite seasons to what was
originally intended.” Then Caesar went to Egypt during the Roman civil war
where he was impressed not only by Cleopatra but by Cleopatra’s astronomer
Sosigenes. Sosigenes had his own
proposals for timekeeping, so Caesar put him in charge of revamping the Roman
calendar. Sosigenes developed the Julian calendar, which Caesar decreed in 46 BCE
and which took effect on 1 January 45 BCE. It dispensed with lunar phases and intercalary months.
“He linked the year to the course of the sun, abolishing the short extra month
and adding an entire day every fourth year” (Suetonius, Julius Caesar). Caesar himself chose the day on which the new
calendar would begin, and he deliberately chose one other than the solstice. The
new calendar was a vast improvement for everyday life and it simplified
astronomical calculations as well. When Cicero (no fan of Caesar) was told what
date the constellation Lyra would rise, he grumbled, “No doubt it has been
ordered to do so.” With
only one minor tweak, this is the calendar we still use today. Sosigenes knew
very well that the solar year is 365.242 days, not 365.25, but the Julian
calendar would drift only 3 days every 4 centuries, and in 46 BCE that no doubt
seemed too far away to be a concern. Centuries do pass, however. Accumulated
excess days were chopped off in 1582 (not until 1752 in Britain and its
colonies) when the Gregorian calendar went into effect. It is the same as the
Julian except that it eliminates the leap day from any year ending in 00 that
is not also evenly divisible by 400. So, 2000 was a leap year but 1900 wasn’t
and 2100 won’t be; this makes the calendar accurate to within one day per 3300
years.
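(For the technically inclined, the Gregorian rule and the Julian drift both fit in a few lines; this is a minimal sketch of my own, using only the figures given above.)

```python
# The Gregorian refinement described above: drop the leap day in any year ending
# in 00 unless the year is also divisible by 400.
def is_leap_year(year: int) -> bool:
    if year % 400 == 0:
        return True          # 2000 was a leap year
    if year % 100 == 0:
        return False         # 1900 wasn't, and 2100 won't be
    return year % 4 == 0     # the original Julian rule for every other year

print([y for y in (1900, 2000, 2020, 2100) if is_leap_year(y)])  # [2000, 2020]

# The drift Sosigenes shrugged off: a Julian year of 365.25 days overshoots the
# solar year of about 365.242 days by roughly 0.008 days per year, which is
# about 3 days every 4 centuries.
print(f"Julian drift: {(365.25 - 365.242) * 400:.1f} days per 400 years")
```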
So why was January 1 not set on the solstice in 46 BCE? Caesar wanted it otherwise,
and that is all there is to it. We still live with the consequences of that
choice, which, it must be said, are surely less onerous than the consequences
of most of our own life choices. It is commonplace for us to try to put our bad
choices behind us on New Year’s Eve and start fresh on New Year’s Day. That is
what we intend anyway. We
need not be entirely captive to the whims of Julius Caesar when celebrating a
new year. There is nothing stopping each of us from choosing a personal New
Year’s Day (or a whole private calendar for that matter): “I consider my new
year to start on July 22, because why not?” That is perfectly legitimate, but it
means celebrating your New Year’s Eve alone (even sans Covid); this would be
the least of the drawbacks to scheduling by one’s own idiosyncratic calendar.
So, most of us are content to ring out the old year by the generally accepted
calculation on December 31 and ring in the new year on January 1. The
New Year’s Blues typically kick in on January 2, by which time we’ve broken at
least a few of our new year’s resolutions and are grudgingly acknowledging that
our lives and challenges are not really different than they were a few days
earlier. Yet, the blues pass, too, since “not really different” at least means
things are no worse. Besides, there is always Orthodox New Year to celebrate.
This is calculated by the untweaked Julian calendar, which by now has diverged
13 days from the Gregorian. That leaves plenty of time to recover from last
night’s merriments, so be prepared to raise yet another glass or two when
January 14th arrives. Maybe a second stab at keeping new year’s
resolutions will work out better, and January 15 will be blues-free – unless
we’re referring to music genres in which case it definitely should be
blues-full.