Hangovers are at least as old as the technology to brew
alcohol to supplement the small amount that can be found naturally in overripe
fruit hanging in trees. That technology is prehistoric, but the English word
“hangover” is actually fairly recent, the earliest known appearance in print
being in a 1904 slang dictionary. Before then the word was “crapulence,” which I rather like better, though one is likely to be misunderstood using the term today.
No matter how you say it (“cruda” [rawness] in New World Spanish, “resaca”
[detritus from receding surf] in Old World Spanish, “Kater” [tomcat – you
figure that one out] in German, etc.), it’s an unpleasantness most of us have
experienced. I wasn’t precocious with my vices in my youth (which I say
with more embarrassment than pride) and didn’t experience a full-blown hangover
until college. (It was legal: drinking age in DC then was 18.) The primary
components that night were red wine and excessive conviviality with similarly
bibulous friends. Back in my own dorm room, I went to bed with the stereo still
on. Sometime after 4 a.m. I awakened to an awful sensation. I leapt out of bed
and hurried down the hallway to the bathroom. You know what happened there. I
returned to my room still nauseated. Playing on the stereo (no kidding) was
Melanie’s Leftover Wine,
a song that to this day I cannot hear without queasiness, though I still have
the album. A couple more hours of sleep did not prevent the subsequent daylight
hours from being less than my happiest. I wish I could say that was my last such
experience as well as the first, but it wasn’t. Like Shakespeare, “I have very poor and unhappy brains for drinking” (Othello).
However, unlike Shakespeare, who purportedly died of a hangover, I eventually (late
20s) curtailed my intake accordingly.
Scene of the crime: My dorm room at GWU
Nonetheless, the title Hungover: The Morning After and One Man’s Quest for a Cure caught my eye on the ER Hamilton bookseller website. The Canadian
author Shaughnessy Bishop-Stall relates the story of his international travels.
Just in the interest of research he gets drunk almost “every night now” and
tries hangover “cures” in various cities. They range from a brutal
massage in Switzerland to an IV drip in the Netherlands. He also does things
that are not advisable to do hungover such as driving a race car and jumping
off the Stratosphere in Las Vegas. The latter unexpectedly cured a hangover:
the terror and adrenaline did the job, or perhaps just distracted him from his
discomfort. (He was properly harnessed of course; unharnessed couldn’t possibly
work more than once.) He does come up with a formula that he says works for him
though he admits it could be just a placebo effect with no validity for anyone
else. It’s a weird mix of B vitamins and other ingredients including some (e.g.
frankincense) that might be hard to find last minute on New Year’s Eve; it also
must be taken between the last drink and going to bed. (If you really need to
be alert in the morning, I wouldn’t bet on this “cure.”)
Oddly enough, despite
alcohol being the most studied of popular intoxicants, scientists still don’t
really agree on what causes hangovers. Dehydration and the build-up of acetaldehyde
are commonly cited in popular literature, but dehydration is just one symptom (one
easily prevented or fixed at that) among many, and hangovers typically are at
their worst when acetaldehyde already has dropped to a low level. Sugary drinks
make hangovers worse due to the formation of lactates, but eliminating sugar
will not eliminate hangovers, only marginally lessen their severity. The most
promising idea is that hangovers are an inflammatory immune system reaction.
This hypothesis is supported by a high positive correlation between the production of cytokines (immune system signaling molecules) and hangover severity. This is why
anti-inflammatories (including plain old aspirin) do help. The simplest solution,
as the finger-waggers always tell us (correctly, unfortunately), is not
to overindulge in the first place. Bishop-Stall’s book is a cautionary tale in
this regard too. His boozy research, which lasted several years, cost him
dearly in his personal life as intemperate lifestyles often do. Some argue, on
the other hand, that one can overcorrect the other way. Raymond Chandler: “I think a man ought to get drunk at least twice a
year just on principle, so he won't let himself get snotty about it.” Perhaps,
though Chandler himself might have exceeded twice.
Lefty Frizzell & Johnny Bond - Sick, Sober And Sorry (1957)
Back in July I mentioned an intent to make digital versions
of my family/personal photo albums. I didn’t want to create just page after
page of disordered scanned images but coherent albums with chronology and
commentary that would be comprehensible to a stranger – and that could be
stored on a flash drive. The likelihood is low that any stranger ever will look
at them and try to make sense of them, but I wanted to make it possible. Not
until the beginning of this month did I take a stab at doing this. Despite a
somewhat desultory approach to the task, I’ve made some headway though there is
still far to go. As source material, there exist in my closet four physical album volumes with
only the loosest internal order: 1) 1890s to 1955, 2) 1956-1969, 3) 1970-2001,
and 4) 2002-present. #4 is the thinnest since it consists mostly of printouts
of digital photos, and I haven’t printed more than a small fraction of those I
have saved. Digitizing the first three albums (I’m keeping the four-volume chronology) requires a lot of scanning, but at least the original photos are all in
the same place. Most of the pics from the past 20 years, on the other hand, are
already digital but are scattered here and there on a variety of devices; the
physical album for this period is too incomplete to provide much guidance.
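For the scattered digital pictures, at least the first gathering pass can be automated before any real curation begins. Here is a minimal Python sketch of that pass (an illustration only, assuming photos that carry an embedded EXIF date stamp; the paths and folder scheme are made up, not any particular album software):

```python
import shutil
from datetime import datetime
from pathlib import Path

from PIL import Image  # Pillow: pip install Pillow

IMAGE_TYPES = {".jpg", ".jpeg", ".png", ".tif", ".tiff"}

def photo_year(path: Path) -> int:
    """Best-effort capture year: the EXIF DateTime tag if present, else the file date."""
    try:
        stamp = Image.open(path).getexif().get(306)  # tag 306 = DateTime, "YYYY:MM:DD HH:MM:SS"
        if stamp:
            return int(str(stamp)[:4])
    except OSError:
        pass  # unreadable image; fall through to the filesystem date
    return datetime.fromtimestamp(path.stat().st_mtime).year

def gather_by_year(source: Path, dest: Path) -> None:
    """Copy every image under `source` into dest/<year>/ for later curation."""
    for p in source.rglob("*"):
        if p.suffix.lower() in IMAGE_TYPES:
            year_dir = dest / str(photo_year(p))
            year_dir.mkdir(parents=True, exist_ok=True)
            shutil.copy2(p, year_dir / p.name)  # copy2 preserves timestamps

# e.g. gather_by_year(Path("/media/old_laptop"), Path("~/albums/volume4").expanduser())
```

A pass like that only piles the photos into year bins; the editorial work of choosing, ordering, and captioning still has to be done by hand.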
Editing is a must. It’s as important to know what to exclude from an album as what
to include: most of our photos are best left out of them and stored (if at all)
loose in a box, be the box literal or virtual. Perspective is also something to be considered: identifying someone in an image as, say, a great-grandmother obviously reflects the perspective of the album assembler, not (probably) that of the reader. In the preface
to Volume I of the digital albums, I wrote the following: “Every photo album, like every narrative, has to be from a
particular perspective if it is to have any coherence rather than being a
random gallimaufry of images. Stories can have multiple narratives, true
enough, but each of those sub-stories still has a single perspective. This
album, assembled by me, is inevitably from my perspective – not only from
narcissism, though I won’t try to deny a quantity of that, but because I cannot
write or post from any other without wandering into fiction. Besides, as I am the last one standing in my branch of the family, this album is not designed to be
shared at family gatherings where multiple perspectives are expected and
appreciated. That gathering would consist of… well… me. It is possible that
this will be perused at some point by friends or by two-or-more-step-removed
relatives who might share some elements of the history recorded in it. If so,
hi there. Take what you will from it that is relevant to you. As for the rest,
consider it a visit with Richard who is making everything about himself in his
usual fashion.” The reader may notice a sense of mortality in that preface.
This is an inevitable consequence of looking at a family photo album – at least
after a certain age. All but a few of the people imaged in Volume I are gone
(all the survivors are seniors), well over half of those in Volume II are gone
as well, and a large minority in Volume III. When I was in high school in the 1960s the breakthroughs in
bioscience were fresh and there was every expectation that a biotech revolution
would happen in the next few decades to match the mechanical and electronic
revolution of the previous half century. “Some of you sitting in this very
classroom,” one biology teacher told us, “will live to be 200.” We’d have
colonies on the moon by 2000, too. There is a difference between those two predictions. The technology to build moon colonies exists; we could have them right now if we wanted them. They would just be so
insanely expensive to establish and maintain as to make no sense. A 200-year lifespan, however, is off the table no matter what resources are devoted to it.
Aging is not a singular process but involves a convergence of ever-diminishing resilience along multiple pathways. Researchers publishing in Nature make this point and came up with a maximum theoretical lifespan for the case in which everything in a person’s life, health-wise, goes absolutely perfectly:
“Formally, such a state of ‘zero-resilience’ at the critical point corresponds
to the absolute zero on the vitality scale in the Strehler–Mildvan theory of
aging, thus representing a natural limit on human lifespan… A recent careful
analysis of human demographic data supports this argument and yields an
estimate for limiting lifespan of 138 years.” It should be noted that no one
ever has lived 138 years as far as we know. The longest life that is reliably
documented (122) is that of Jeanne Louise Calment (1875-1997).
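To give that a shape (my gloss, not the paper’s actual mathematics): in a Strehler–Mildvan-style picture, vitality declines roughly linearly with age, so extrapolating the decline to its zero point gives a hard ceiling no matter how favorably everything else goes:

\[
V(t) \approx V_0\left(1 - \frac{t}{t_{\max}}\right), \qquad V(t_{\max}) = 0, \qquad t_{\max} \approx 138 \text{ years},
\]

where \(V_0\) is vitality at maturity and \(t_{\max}\) is the limiting lifespan estimated from the demographic data.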
By coincidence my recreational reading last week was Death, Immortality and Meaning in Life by John Martin Fischer, a philosopher with The Immortality Project.
It’s an ambitious title for a book only 196 pages long. The book summarizes
philosophical takes on the subject from Epicurus to Becker. The Immortality
Project actually gets its name from the influential 1973 book The Denial of Death, for which author Ernest
Becker won a posthumous Pulitzer in 1974. Humans are the only animals aware
they are mortal, and Becker argued that in the process of trying to escape our
mortality we developed civilization, art, family legacies, and religion. (We
also develop neuroses, but that is another issue.) These “immortality projects”
are attempts to ensure that something of ourselves survives. Making a digital
photo album presumably is one such project, too, albeit a modest one. All such projects are ultimately doomed of course. Nothing is
forever: not civilization and not the planet itself. But a brief existence is
still better than none. Wrote Kurt Vonnegut in Cat’s Cradle, “And I was some of the mud that got to sit up and
look around. Lucky me, lucky mud.”
I have an eating schedule that is idiosyncratic. (My friends
sometimes choose a less diplomatic adjective.) I eat every other day. I don’t
recommend the schedule to anyone else. For some people it might well be
harmful. But it works for me. I adopted this pattern late in 1975 when my
post-college metabolism shifted. The same daily number of calories that had kept
my weight steady during the previous five years instead rapidly packed on unwanted
pounds. It then took the better part of a year to develop a sustainable (for
me) response. Rather than cut back on my daily intake to below what felt “normal”
– i.e. rather than be hungry every single day – I opted for a one-day-on-one-day-off
schedule. This was before “intermittent fasting” was fashionable; I just
preferred being hungry every other
day. I ate whatever I wanted one day and then didn’t eat at all the next. The
strategy worked. I dropped 30 pounds (14 kg) within the year and kept the
weight off for more than two decades. Around the turn of the current century,
however, my life became difficult. (Among friends I sometimes choose a less
diplomatic adjective.) My self-discipline suffered. The result was the same as
in 1975. From 2000 to 2020 I repeatedly restarted the old on-off schedule (with short-term success) only to break with it again, with the same old results.
Finally late last year I got back solidly on the on-off schedule. When special food-related
events fall on what otherwise would be an “off” day (as did Thanksgiving), I
fast two days in a row ahead of the event in order to get into synch. I dropped
30 pounds in 2021 just as in 1975. Regrettably I didn’t drop 30 years as well,
but one takes what one can get. Breakfast is a key part of sticking to my routine. I can
forgo that snack tonight if I know a good breakfast is only a few hours plus a
night’s sleep away. I have a favorite breakfast spot and usually rotate among
four very non-vegan selections: country fried steak (called chicken fried in
some places) with eggs over easy; sausage and onion omelet with a side of bacon;
chili jalapeño cheese omelet with a side of sausage; and eggs over easy on
prime rib hash with bacon. All come with home fried potatoes and toast, though
I always say to hold the toast. I’ll sometimes deviate from these with a taco
omelet or something, but those four are the mainstays. Since returning to the
on-off schedule I have lost my taste for sweet breakfasts: the savory options
hold sway all the way. Besides, regularly eating sugary blueberry pancakes
soaked in maple syrup might be tempting fate given the national epidemic of
Type II diabetes.
Country fried steak at Marilyn's in Chester comes with three eggs and toast
There are supposedly personality differences between sweet
and savory breakfast aficionados according to an article in The New York Post. Among other things the savory fans are more likely to like rock, sci-fi, and
cats. Well, I do like those, though I suspect many eaters of crepes and Belgian waffles do too. The same article notes that the biggest demographic for avocado
toast and hummus is men aged 25-34 – yet another reason to worry about this age
group.
The “science” regarding the value of breakfast, as in so many other dietary matters, is conflicting. According to an Australian study published in the British Medical Journal (BMJ), what you eat is more important than when you eat it. Says professor and study co-author Flavia Cicuttini, “We
found that breakfast is not the most important time of the day to eat, even
though that belief is really entrenched in our society and around the world.” The
authors did not find evidence that breakfast aids weight loss despite common
claims to this effect in popular literature; on the contrary they found that
breakfast eaters ate 260 more calories on average daily than those who skipped the
meal. Yet, other scientists argue that there is an advantage to a morning meal
in order to stabilize blood sugar levels and boost immunity. Says Dr. Barry
Sears of the Inflammation Research Foundation, “By having breakfast, you refill
the liver glycogen levels so that the cortisol levels can decrease. If you
don't eat breakfast, then cortisol levels will continue to increase to maintain
blood glucose levels eventually causing insulin resistance that makes you more
hungry throughout the day.” Perhaps by skipping breakfast (along with other meals) one
day and eating it the next I can combine the benefits of both regimens. Or
perhaps I will suffer the ill effects of both. Whatever the case may be, I’m
looking forward to sitting at my usual table tomorrow. Tomorrow, I think, is a
country fried steak day. Today is an “off” day so this morning I just had blues
for breakfast. That was OK too.
We had a couple dustings of snow in the past week in these
parts: barely enough to be worth mentioning, but nonetheless a reminder of
things to come. Probably. NJ winters have little predictability. They range
from balmy to blizzardy. There is likely to be ice and snow, however. Possibly a
lot. Public service announcements on the radio already are warning about the
dangers of slipping and falling on ice. I’m well-stocked with salt, sand, and
shovels in order to keep my walkways safe. I test-started my snow-blower. As
for slipping and falling… well, I’m pretty sure I’ll do that anyway despite
forewarnings. I do that every winter.
I’m not outstandingly klutzy, but let’s say that parkour won’t ever be my game. I don’t require ice to take a tumble. I don’t even need a cat
between my feet. My worst fall from the perspective of onlookers was in my 20s
when I tried to push a wheelbarrow full of bricks across a plank spanning a
portion of an unfinished foundation; the plank turned and I joined a shower of bricks
in an 8 foot (2.5 m) descent to concrete. Other workers came running assuming I
was under the bricks. I wasn’t. Ego aside, I walked away without an idiomatic
scratch. No one was more surprised than I, but such things happen. In 1943 a
B-17 gunner named Alan Magee fell out of his bomber over France; he fell 22,000
ft (6700 m) without a parachute. He crashed through the roof of the St. Nazaire
train station, which slowed his descent. He survived and spent the rest of the
war in a German POW camp. He died in 2003. Unfortunately, luck can run the other way too, especially
when we are past our 20s and no longer bounce as well. Every year people die from tripping over their own feet at zero altitude. According to the CDC, 17,000 Americans die every year from slips, trips, and falls; 26% of those occur on level surfaces, and 3% happen getting out of bed, which I suppose is one argument not to do that. Some 800,000 people are hospitalized from falls annually, and injuries not requiring hospitalization number in the millions. Falls are the leading cause of accidental injury
for those over 65. Ice and sleet, of course, up the risk enormously. Icy stairs
are exceptionally dangerous. According to the CDC website, “One out of five falls causes a
serious injury such as broken bones or a head injury.” This can’t be right.
Five falls is an average winter month for me, all of them outside. Once again,
I’m not klutzy in a general way. It’s just that there are very few level
surfaces on my property. I keep the walkway (which is level) to the guest
parking area (also level) shoveled and salted, but the bulk of the driveway from
the road past the house all the way up to the barn is on an incline steep and
steady enough that heating oil delivery trucks tend to lose traction
about halfway up in icy conditions. (For this reason I keep a spare 15 gallons of
fuel on hand to feed the furnaces overnight when this happens.) Shoveling,
snow-blowing, and salting the driveway on foot (which I do myself) therefore involve
a lot of slipping and sliding on slanted ice. I land on my posterior with some
regularity. Taking the wheeled garbage bin down the driveway to the curb in
winter may involve a whirly dance as we slide our way down. The lawn, too, is
mostly a series of slopes. When the lawn is snow-covered, getting traction on
it is a challenge. Nonetheless, no spill in 40 years has caused me more than a
bruise, and rarely one of those. Yet, while I accordingly doubt the “one out of five” stat,
there is no doubt that a bad landing can cause serious harm, even on a cushion
of snow from just a standing height. Not all of us bounce like Magee. I
wouldn’t want to repeat my own stunt with the bricks either. I doubt I’d be as
lucky twice. So, I’ll heed the PSAs and watch my step. I urge the reader to do
the same.
Ever since humans learned to count – when that might have
been is anybody’s guess – they have been numerologists, assigning special
importance to some numbers beyond their absolute values. Sometimes this was
because of celestial correspondences – such as 12 because of 12 full lunar
cycles in a year – and sometimes because of internal characteristics. The
Pythagoreans regarded 10, for instance, as the most perfect number (being 1 + 2
+ 3 + 4) and, according to their beliefs, therefore of mystical significance.
Thanks to the dominance of the Base 10 number system, modern folk also tend to
give 10 and its multiples quasi-mystical significance. Consider the
significance we attach to 10th anniversaries. So too birthdays: the 20th, 30th, 40th, etc. all are considered landmarks in our lifetimes. Yet the birthdays prior – which is to say those ending in 9 –
have greater impact. This comes to mind since just last week I reached an age
ending in 9.
Donald as a Pythagorean in "Donald in Mathmagic Land" (1959)
We commonly define our stage of life by decade:
twenty-something or fifty-something or whatever. We just as legitimately could
divide a lifetime by 7s or some other number, but we keep the math simple and
divide it by 10s. We often set our goals accordingly, as in “I want to
be/achieve/have such-and-such by the time I’m 30.” Hence 29 rings alarm bells.
Time is running out. The goals change with each new looming decade, but the
sense of urgency when it is 12 months away is always the same. We push
ourselves into (sometimes ill-considered) action while there is time. First-time marathon runners are more likely to be at an age ending in 9 than at other ages. First-time sign-ups on the Ashley Madison adultery website are overrepresented at ages 29, 39, and 49. According to the World Values Survey (42,000
respondents), people are most likely to question the meaning of their lives in
a 9 year. Some don’t like their conclusions, because suicides spike at ages ending in 9: not by a lot (2.5%) but enough to be statistically significant. We may motivate ourselves to change our behaviors in the 9
years, but do we change who we are at bottom? The consensus among psychologists
is that radical personality transformations are rare at any age – and may
indicate brain damage when they occur. Even apparent transformations, as among
religious converts, affect surface behaviors more than underlying traits. The
Big Five personality traits are openness, conscientiousness, extraversion,
agreeableness, and neuroticism (acronym OCEAN). Each of these is a spectrum,
and where a person scores on them is a remarkably accurate predictor of how he
or she deals with life. High conscientiousness, for example, is the single
biggest predictor of material success: more so than intelligence, which is not
a personality trait. (IQ is like horsepower under the hood: more of it means
your car can do more, but it indicates nothing about what kind of driver you
are.) The terms are largely self-explanatory except perhaps openness. (It does
NOT mean an open person is someone who can be convinced to agree with us politically.)
It has to do with imagination and curiosity about the world, evinced through adventurousness, bookishness, or both. A person’s scores scarcely ever change
noticeably in a 9 year or in any other one year, but we do evolve slowly over longer
time periods. The good news is that those long term gradual personality changes
are usually for the better. Writes Professor of Psychology Christopher Soto, “Many
studies, including some of my own, show that most adults become more agreeable,
conscientious and emotionally resilient [less neurotic] as they age.” The “more
agreeable” trend seems to argue against the Grumpy Old Man stereotype, but
since such characters do exist I suppose we must conclude that in their youths
they were flaming jackasses. They mellowed out to merely grumpy. Happiness
tends to increase with age too. Ironically, this is partly because of a
lowering of expectations. Our options really do diminish with each passing
decade and we make peace with that. Also, we tend to care less about the
opinions of others. I remember an old George Burns line: “When I was young I
was taught to respect my elders. I’ve finally reached the point when I don’t
have to respect anybody.” Getting back to my own 9er year, when I was 20 (or 40 for
that matter) I didn’t consider the difference between 69 and 70 to be worth
mentioning. Both seemed superannuated to me. But now that I’m here the
difference seems significant. Making a major personality change in the
remaining year before the big seven-zero is apparently off the table. I’m long
past angst over existential “meaning of life” issues. Quite aside from being well
outside the preferred age demographic, I’m not married so don’t really qualify
for Ashley Madison. Sign up for a marathon? I don’t think “No” is a strong
enough answer. I’m satisfied with life as it is, generally speaking. Still,
there must be some changes to my lifestyle that I should cram into the months
before that 9 flips to 0. I just need to sit down and make a list – though it’s
possible I’ll misplace it.
The Boswell Sisters - There’ll Be Some Changes Made (1932)