Thanks to freezing rain during the previous night, the streets
in NJ were skating rinks yesterday morning and the trees were encased in sparkling
crystal. Unsurprisingly, the weight of the ice brought down tree limbs that
severed electrical wires around the state. My power went out at 9 a.m. (I have
a well and an oil furnace, so my heat and water go off with it.) I cozied up by
the fireplace to wait it out. The outage was minor by the standards of such
things. After two hours, JCP&L crews working in the ice
reconnected a line somewhere and my lights popped back on. When they did I
experienced a momentary sense of déjà vu. This is remarkable only because it
has been years since my last sense of it. In my youth it happened with some
frequency. This turns out to be normal. In most ways normal human brains hiccup
more often as they age, but not in this one; in this one way they grow more
stable. The decade of life in which healthy people most frequently experience
déjà vu is 15-25. Occurrences decline thereafter and all but disappear in
seniors in the absence of medical conditions (e.g. epilepsy) or psychoactive
drug therapy.
The Britannica
Dictionary defines déjà vu as "the feeling that you have already
experienced something that is actually happening." (The reverse experience
– the feeling that something familiar is strange and alien – is called jamais
vu.) There are several non-mutually exclusive explanations for the phenomenon. I
choose to discount the paranormal ones though some people do not: an MD in Psychology Today, for one, writes, “There are situations that are glitches in time when the rules bend and
the mystery takes hold.” I think it rather more likely the glitches are in us,
not in time. Sometimes we really have a trace memory of something very
similar to what we are currently experiencing. This was likely the case
yesterday. I’ve sat before many fires in my life, and the woodsy area where I
live typically loses power several times per year – usually briefly but
sometimes for days. In other cases “dual processing” may be involved. The human
brain processes sensory data on multiple levels at once and most of those
levels are not conscious. (Hence we usually can navigate across a room
successfully even when we are mentally preoccupied and not paying conscious attention.)
A time lag can occur between non-conscious apprehension of one’s surroundings
and the upload of the same information to our conscious awareness (processed in
separate parts of the brain), in which case we may seem to experience an event
twice – in a sense we do. The lag is minuscule but apparently noticeable. This
can be more pronounced in cases of extended déjà vu called déjà vécu (“already
lived”). One elderly British gentleman with the beginnings of dementia has a
chronic case of it; he gave up watching television because he said he always already
knew what would happen next. He, of course, could not describe the show before
turning it on, so this looks like a processing error rather than paranormal
precognition. Another possibility (again, not mutually exclusive with others)
has to do with the way we retrieve memories. Our memories are not stored all in
one place but are scattered here and there. When we recognize some item as
familiar, a mechanism in the hippocampus assembles those other bits and pieces of
the memory into a sort of holographic whole. It may be possible, however, for
one element to trigger that sense of familiarity even in an otherwise novel
environment. Speaking of déjà vu in epileptics in particular, Dr. Alan Brown of
Duke University’s Department of Psychology & Neuroscience commented to an
interviewer, “My belief is that a pre-seizure déjà vu experience is triggered
by spontaneous activity in that area of the brain that handles familiarity
evaluations.” This may be the case among neurological normies too.

Whatever the real explanation may be, the term “déjà vu” is
not appropriately used in the context in which I’ve heard it twice this morning: with
regard to current events. This is a flawed use on many levels, as politically
infused terminology usually is. The speakers, of course, meant that history was
repeating itself. It isn’t. As Max Beerbohm quipped: “History,” it has been said, “does
not repeat itself. The historians repeat one another.” Analogies made by armchair
historians always should be treated cautiously, as should “lessons” supposedly
drawn from them. As examples, arguably the two World Wars taught opposite
lessons. The first was to exercise restraint lest a limited regional dispute spiral
out of control: Austria’s problems with Serbia were not worth a world war. The second
was to nip aggression in the bud before it could grow into something worse. In truth the
two cases were different. The cases are always different.

As for current world events, I won’t add to the din of “analyses,”
most of which range from bad to very bad. I just know that calling it déjà vu is
calling it something it is not. I’ve heard it all before. I’ve heard it all
before.
A young friend of mine who regards me as an uncle sometimes
has thematic parties. Once she hosted a formal tea party, not in earnest
but ironically. On that occasion she asked to use my mom’s old teacup
collection and some of my CDs of ‘40s-era music. The most recent was a “dad
party” in which everyone dressed as older men and told stale jokes. For that
one she asked to borrow my red suspenders. I wear suspenders (of various
colors) entirely un-ironically. Suspenders (in some places called braces) may be antiquated
in fashion terms but they are surprisingly recent historically. One can’t very
well speak of suspenders without first talking about what they suspend (or brace).
Pants/trousers predate suspenders by millennia. (Terminology note: in colloquial
US English all trousers are pants but not all pants are trousers, for “pants”
also include such things as pajama bottoms and some undergarments.) Yet, even
pants, while ancient, are not quite as old as one might think. Prehistoric
people in colder climates did cover their lower bodies, but they tended to do
it in pieces. Ötzi the Iceman,
for example, had leggings and a waist vestment. Pants as a unified garment seem
to have been invented at the same time as the domestication of horses, most
likely on the Eurasian steppes. They are eminently suited for horseback riding.
Ancient Greek artwork depicts Scythians and other horse warriors (including
Amazons) in pants. The oldest surviving pair of trousers is more than 3,000 years old. It was
found in western China; the style and workmanship indicate a long-established sartorial
tradition. Pants soon caught on among ancient non-equestrian northern barbarian
tribes, notably the Germanic peoples, since they were more comfortable in cold weather
than anything… well… breezier. Mediterranean cultures rejected the garments for
a long time. When the Visigoths sacked Rome in 410 CE, the Roman Emperor
Honorius took firm action from his home in Ravenna by banning the wearing of
pants – pants being Germanic attire. In China, however, trousers were in
continuous use from ancient times by both sexes, especially but not exclusively
among the working class, and usually were worn with a robe. After the fall of
Rome, pants in the West as male attire increasingly replaced other garments even
in southern Europe, particularly among peasants and laborers. By the 15th
century they were nearly universal in European middle and upper class circles
as well. Thereafter, the styles and lengths varied widely over time as fashions
came and went. Breeches that ended just below the knee held sway in the 18th
century, but after 1800 full length trousers became fashionable. Middle class
women in Europe and North America started wearing them (some working class
women always did) in the late 19th century as sport and casual wear and
did so with increasing frequency in the 20th century. (I have photos
of my grandmother in riding breeches in the 1920s and my mom in
jeans in the 1940s.) They became common business attire for
women in the 1970s.

Pants, for all their practicality, have one drawback – at
least for those with waist measurements similar to (or larger than) hip
measurements. They can fall down. Belts are the oldest answer but are not a
perfect one. I speak from experience as someone with a body shape that, in
order to keep pants in place, requires 1) cinching a belt so tight as to hurt
the kidneys, 2) constantly hitching up pants by hand, which is awkward or
impossible when both hands are full as when moving a sofa, or 3) wearing
suspenders.

Suspenders seem an obvious idea to supplement (or replace)
belts, yet we don’t have any evidence of them prior to the 1700s. That isn’t
proof they didn’t exist, but if they did, they weren’t mentioned in literature or
depicted in art. In the 18th century they were simple cloth strips tied
to buttonholes in pants. Intended to be no more visible than garters, they were
worn under shirts. (Suspenders are still sometimes worn under shirts that don’t
tuck in, such as sweatshirts or polo shirts.) These originals weren’t very
comfortable. In 1820 Albert Thurston came up with an improved design made of
tightly woven wool that attached with leather loops. (You still can buy these.)
They could be worn over shirts (they pretty much had to be if the shirts were
tucked in) but still were expected to be under vests, jackets, or coats. (As
always, these fashion considerations were a middle and upper class concern;
workers wore whatever was practical, however it was practical.) Suspender design continued
to improve throughout the 19th century; the products became more
elastic and adjustable. In 1894 suspenders with metal clasps were invented, so
they could be used on any pants – not just those tailored for them with buttons
or buttonholes. In the 1920s and 1930s suspenders worn openly over shirts
developed a sort of cachet. Gangsters and private detectives are often depicted
with them in films of the ‘30s and ‘40s. After World War 2 their popularity
faded but they never went away entirely. For some body types they are, as
mentioned above, practical – and, yes, my young friend is right that it’s a
body type common among older men. Some wearers do not have this body shape challenge
but simply like the look. Suspenders sometimes turn up in women’s fashions too,
e.g. the Annie Hall phase in the ‘70s.

As for me, I just like keeping my pants up as a favor to
myself and to the world at large – and yes I wear suspenders with a belt. I drive with a spare tire
in the trunk, too. Redundancy is safety.
Last week I stumbled upon a podcast with an interview of
Richard Gideon, author of Power of
Regret: An in-depth look into the psychology of regret, its benefits, and how
to break free from guilt. The discussion was interesting and tangentially related
to Carl Jung, some of whose writings I was revisiting (see previous blog), but
I had other things to do. So, I logged off – but not before ordering the “book”
off Amazon. When it arrived it proved to be little more than a pamphlet that,
at 22 pages, uses the term “in-depth” generously. However, if an author can say
what he wants to say in that number of pages, I suppose there is no reason to
yammer on for more.
We all have regrets. Some people claim they don’t, but they
do. Yet, I understand what they mean when they say this. They mean they’ve made
peace with their regrets. Some have simply forgiven themselves. Others, more
philosophically, recognize that in order to accept who we are right now we must
accept everything good and bad that got us to this point; everything in life
and nature is so interconnected, after all, that one element cannot be changed without
changing the whole. That doesn’t mean we wouldn’t do things differently if real
life, like video games, had do-overs from various save points. But even this
would require remembering having done it wrong the first time. Those things we
would do over are still fairly called regrets. There is nothing to be gained by
wallowing in them, however. Making peace with them is the focus of Gideon’s publication.

Regrets come in three main flavors. The first are simple
misjudgments that ended in embarrassment, harm, or financial cost. They don’t
have a moral aspect. (I regret not having bought Intel stock in 1971 and
Bitcoin in 2009.) The second involves a failure of character in our own eyes
though not necessarily a violation of ethics.
An example would be rudeness to a friend or parent just before he or she (for
unrelated reasons) dies. While unpleasant, mere rudeness is not unethical per se, but a person still might feel
regretful and guilty about it in the case mentioned. The third are actual moral
transgressions that may or may not be criminal: betrayal, theft, assault, or
worse.

We start accumulating regrets early, but by middle age we
usually have a substantial stockpile. Back in 2016 the insurance company
Allianz conducted a poll on life regrets. There were as many regrets of
omission as commission in the answers. Some answers: failure to quit abusive relationships sooner;
failure to prepare children for independence, which is very much a first world
problem; working too much; saving too little; clinging to grudges; not
achieving more; spending too little time with children and with parents; and
failure to be more daring. Men and women give similar answers with two notable
exceptions. Women are more likely than men to express romantic regrets (44% vs
19%) while women are less likely than men specifically to regret divorce (27%
vs 39%). Those two may seem to be in contradiction, but they are quite
different: the former are mostly about what-might-have-been while the latter are
about what-actually-was.

Gideon’s tips on letting go of the past are obvious, but are not
wrong for being so. Put things in context, he advises, such as your age and
level of experience at the time of the regrettable incident. Find the positive,
e.g. the lessons and personal growth from the event. Remember that there is a
reason that even under the law most crimes have statutes of limitations for
prosecutions. We all are fools, jerks, and violators sometimes, and none more
than those who insist they aren’t. There comes a time to give yourself a break
over the past. “Stop being cruel to yourself,” Gideon says. “Accept you’ve
served your sentence and let go of the guilt.” So, remember but move on. If we don’t, we’ll regret that too.
Edith Piaf – Non Je Ne Regrette Rien (No, I Regret
Nothing)
Prompted by a discussion about Carl Jung on an online book
chat group to which I belong, I retrieved The Portable Jung from one of my
bookshelves this week for a revisit. The content is thoughtful, erudite,
intelligent, and insightful. I can see, however, why Freud grew frustrated with
Jung and broke with him. (Freud tended to take philosophical differences
personally.) In the section on Synchronicity, for example, Jung shows himself far
too ready to credit claims regarding ESP, astrology, and other paranormal
phenomena without questioning whether anything might be flawed about the researchers’
methods or data. It’s hard not to visualize the more skeptical and
materialistic Sigmund rolling his eyes.
Jung’s notion of the collective unconscious is less flaky in
his own writings than is often portrayed by others. It is little more than an
acknowledgement that human beings are not born as blank slates (as was asserted
by many academics of the time – and our own time for that matter) but that
there is such a thing as evolved human nature. Evolutionary psychologists say
the same thing in a more straightforward way. Jung’s exploration of myth and
mythic symbolism impressed Joseph Campbell (deservedly so) and inspired him to
write the intro to this anthology. It is likely however that Jung’s writings
about the “shadow self” have had the most lasting cultural impact.

The shadow is the part of our personality that we don’t like
to acknowledge. It consists largely of those thoughts, lusts, motivations, and
desires that violate social norms and (more importantly) our personal ethos.
It also consists of positive traits that we have trained ourselves to keep
hidden; for example, if our assertiveness was punished when we were children,
we may habitually suppress it later in life. To the extent that certain personality
traits harm one’s self-image, they may be repressed or (worse) projected onto
others. This is not identical to Freud’s description of the unconscious, since
some of this is in fact conscious: that’s why the person feels uncomfortable about
these traits and tries to deny them. Yet, Jung argued that we all have cruelty
and destructiveness (and worse) in our natures. To become psychically whole and
healthy we must acknowledge the shadow and integrate it into who we are. This
doesn’t mean we should give in to destructive impulses; it just means we do
ourselves no favor by denying their existence. Unlike the groundhog, seeing one’s
own shadow clearly can lead, so to speak, to an early spring.

Jung writes, “By not being aware of having a shadow, you declare a part of
your personality to be non-existent… If you get rid of qualities you don’t like
by denying them, you become more and more unaware of what you are, you declare
yourself more and more non-existent, and your devils will grow fatter and
fatter.”

Besides, there is nothing praiseworthy or good about any
action or inaction if it isn’t a choice – if you have no other option. Jung was
a fan of Friedrich Nietzsche in his youth, and one of Nietzsche’s lines was, “I
laugh at those who think themselves good because they have no claws.” If you don’t
have the capacity to do harm, then it’s not a big achievement to refrain from
causing it. If you do have the capacity (including the capacity to enjoy it)
then ethical decisions become possible. All that matters in the realm of ethics
is what you do when you have a choice, not what you think or feel.

The notion of the shadow has entered pop culture. Jeff
Lindsay literally refers to the “shadow” of the main character in his Dexter
novels (which were adapted to a long-running TV series), but that example is
too easy: too on the money. Let’s look instead at a slightly less obvious
example: the old cult TV series Buffy the
Vampire Slayer. A character named Angel in the show is a vampire who has had
his soul returned to him: a soul in the context of the show is a moral compass.
Vampires in the show are normally soulless and are therefore psychopaths. They
do what they like without guilt or moral twinge. But, as in real life, there
are psychopaths and psychopaths. Not all psychopaths are serial killers. Most
are law-abiding, not because they care at all about the law or about ethics or
about anyone else’s wellbeing, but as a rational choice: they personally just
don’t get enough jollies from crime to balance the likely penalties. One can be
a psychopath and not a sadist, which is to say one can be nonchalant about
causing pain without actually enjoying it. Angel berates himself for all the
twisted evil sadistic things he did while soulless. Why? Because he knows the
sadism was there in his own human character before he ever became a vampire. Losing
his soul just removed his inhibitions against expressing it, but it was there.
His whole character arc in this show (and then his own spin-off) consists of re-integrating
his shadow self without turning to evil deeds again.

I don’t think I’m at much risk of becoming a vampire despite
having encountered my fair share of bloodsuckers in life. Also, I gave up most
shadow boxing some years ago, though it was a hobby for a while. But then again
my claws are not as sharp as they used to be. I still recommend in a general
way taking Carl’s advice to befriend the shadow side of oneself. But don’t let
it operate the chain saw.