According
to the American Academy of Sleep Medicine, more people say they are sleepy on New
Year’s Day than on any other day of the year. The first day back at work in
January came in second (57% vs. 45%). I think we know the reason why, and it
hasn’t much to do with having stayed up until midnight the previous eve. So,
many of us will be on our couches tomorrow with chips on the coffee table, aspirin
in our bloodstreams, and TV remotes in our hands. (I personally plan to be
hangover-free this year, but only because I’ve experienced being otherwise in
past ones.) If the reader has seen too many Twilight
Zone marathons to wish to see another one and is furthermore no big fan of
college football, he or she might struggle to find something watchable with
that remote. This past weekend I happened to watch (in one case rewatch) three
movies, all of which I can recommend. By the end of the third, the funk should
be lifting – depending on just how much overindulgence was involved.
Ad
Astra (2019)
I
almost saw this in the theater several months ago but ultimately opted for Joker instead. That was the right
choice, but this would not have been a bad one either. (I could, of course,
have seen both, but I don’t go to the theater frequently anymore.) The title (“to
the stars”), as any first-year Latin student knows, is part of several Latin
sayings, notably ad astra per aspera
(“to the stars through difficulty”: the motto of Kansas of all places) and sic itur ad astra (“thus one goes to the
stars”: Aeneid IX 641).
Even
scifi films that make an effort to portray spacecraft and space habitats
realistically tend to make them overly
polished. Not Ad Astra: here they are
credibly worn, gritty, and lived-in. The special effects in the film are
phenomenal without overwhelming the story. Brad Pitt gives a much deeper and more contemplative performance as the astronaut Roy than I had expected from him.
Earth
is suffering damaging EMP surges that seem to come from the anti-matter power
source of a presumed-lost crewed probe beyond Neptune. The probe was designed
to image planets in other solar systems. It is commanded by Roy’s father. The
highly skilled but deeply flawed character Roy sets out to find the probe and
destroy it. There is much in the film about personal identity, generations,
morality, and whether meaning is to be found out there or within oneself. Lest
that sound like too much philosophy and not enough action, there is enough of
the latter, too. The pacing isn’t rushed, but at 2 hours it doesn’t drag.
For
those who like hard scifi (e.g. The
Martian), this is a solid entry.
**** ****
Ready
or Not (2019)
Getting
the balance right between humor and horror when mixing the two is no easy
matter. A few pull it off, including Cabin
in the Woods and Tucker and Dale vs.
Evil. Ready or Not does, too.
Grace
(Samara Weaving) marries a young scion of a super-rich family that made its
fortune in playing cards and board games. Family tradition has her pick a game
card from a very special game on her wedding night. Most of the cards are
innocuous, but, unknown to her, one of them will declare her “it” in a game
of hide-and-seek in which the family tries to kill her. Her only chance is to
survive until morning. Naturally, she draws this card.
This
is a warped but entertaining film with something to say about what people will
do for love and money – and love of
money.
**** ****
Road
House (1948)
This
has nothing to do with the better-known Patrick Swayze movie of the same title.
The 1940s were a marvelous decade stylistically and in the popular
arts. Film noir defines the decade on
the screen more than any other genre, and this noir drama is one worth seeing. Ida Lupino is superb as the world-weary
performer Lily in a rustic road house owned by the somewhat unstable Jefty
(Richard Widmark) and managed by his best friend Pete (Cornel Wilde). A love
triangle develops. Betrayals upon betrayals including attempted murder ensue.
There is suspense, fine acting, and a well-written script.
The clip from the film below is Ida singing about the reason so
many of us are on our couches with remotes.
**** ****
If, after those three flicks you’re still feeling off, have some
Alka-Seltzer and take a nap. Tomorrow all will be better… unless tomorrow is
that work day that came in second in that AASM survey.
Ida
Lupino - One for My Baby (and One More
for the Road)
’Tis
the season for get-togethers with friends and family, and this year I’ve been
to a few such gatherings, including in my own house. In today’s hyperpartisan world the old
rule to avoid religion and politics is more advisable than ever – especially regarding
the latter as ever more people make politics their religion and suffer no
heretics. The temptation to be righteously offended is high. By and large, the
rule was followed everywhere I was present. I say by and large because
absolutely everything has become politicized including the food on the table
and the wrappings of gifts, so complete avoidance is impossible... and then
there are the conspiracy theories. These I rather enjoy when they come up in
conversation, and fortunately a few did. I don’t mean I enjoy ones about
contemporary politicians, which are wearyingly predictable in how they are
advanced and received. (Fortunately, these were avoided.) I mean the ones less
influenced by the emotions of the moment, such as those about JFK, MLK, RFK,
Marilyn Monroe, faked moon landings, Pearl Harbor, the Illuminati, and so on. On
a somewhat different level, let us not forget alien abductions, Bigfoot, live-Elvis-in-hiding,
and Nessie. (In NJ, Nessie supposedly has a cousin named Hoppie in Lake
Hopatcong.)
Many
of these are not fringe beliefs, if by fringe we mean limited to a small
percentage of the population. More than half of Americans believe the JFK
assassination was the work of a conspiracy, for example. A 2016 Chapman
University survey found more than half disbelieve the official account of 9/11.
A third (33%) of respondents said they believe the government is covering up the truth about the “North Dakota crash,” a fictitious event invented entirely by the researchers for the survey. Underlying most of these is a tinge of
paranoia: the notion that there are string-pullers behind the scenes who do not
have the best interests of the rest of us at heart.
All
of us have some opinions that will strike most others as odd, but what
distinguishes someone who entertains a poorly supported hypothesis from someone
who is a full-blown conspiracy theorist? It’s the habit of the latter of
turning the burden of proof on its head: they demand that naysayers prove their
theory wrong. Of course, proving a negative fact is seldom possible. Prove
Marilyn wasn’t murdered. I can’t, but that doesn’t make it true. Can you prove
you didn’t rob a convenience store in Allentown, PA, in 2002? Assuming you were
old enough to do the deed in 2002, odds are you can’t. I can’t prove I didn’t (though
I didn’t). Put another way, it is the difference between bias and prejudice. Someone
with the former has an inclination to believe something but is persuadable by
contrary evidence; someone with the latter has made up his mind (pre-judged) and will
dismiss any contrary evidence as tainted. Everyone – absolutely everyone – has
cognitive bias (See Blindspot by
Mahzarin R. Banaji & Anthony G. Greenwald), but we still can choose not to be governed by it – not to act blindly on the basis of it.
Intelligence
is no barrier to odd beliefs and unlikely conspiracy theories. Quite the
opposite. Michael Shermer (founding publisher of the magazine Skeptic, contributing columnist to Scientific American, and author of The Believing Brain) explains that intelligent
folk are better able than duller folk to convolute, reinterpret, and interconnect data in
creative ways. Moreover,
they are every bit as motivated to do so: “our most deeply held beliefs are
immune to attack by direct educational tools, especially for those who are not
ready to hear contradictory evidence.”
Partly
for this reason, I’m inclined to allow the odd bee in a person’s bonnet without
it diminishing my opinion of him or her in a general way – provided the particular
conspiracy theory isn’t actually vicious. Another reason is that there really
are conspiracies in the world on large scales and small. On occasion the bee can be right. Edward Snowden,
whatever one thinks of him, revealed that rumors about the NSA were true. On a personal
level, if you feel you are being followed, you might just be right: stalkers do
exist out there. Even if a theory is wrong overall, it may yet contain a nugget of
truth. I doubt the Illuminati are running anything from under the Denver
Airport, for example, but there really are elites who interact with each other in
the Bilderberg Group and elsewhere and who surely would like to run things if
they could. Said Gore Vidal, “Anyone who isn’t
paranoid isn’t in full possession of the facts.”
So,
I’m happy (and entertained) to hear what people have to say about Roswell or
Area 51 or HAARP or what-have-you. Despite my cognitive biases, I’ll even
endeavor to remain open to persuasion. If persuasion fails, perhaps when I awake
in an alien spaceship the joke will be on me.
Back
in the ‘60s when I discovered JG Ballard’s off-beat scifi (so much more fun
than my school assignments), he instantly became one of my favorite authors. He
continued to be up through his final (2006) novel Kingdom Come. He writes so elegantly that had he written user
manuals for washing machines they would have been a joy to read. Fortunately he
was more creative than that.
There often is a Lord of the Flies vibe to Ballard’s
fiction, though with adults and triggered by an excess of civilization rather
than the lack of it. Ballard came to believe that modern physical and social
environments are so at variance with the natural world in which people evolved
that our ids urge us to rebellion. When animals in zoos are enclosed in spaces radically at
variance with their natural habitats they develop behavioral disorders.
Ballard’s human characters do, too; being intellectual creatures, they couch
their rebellions in philosophical terms, but there is really something much more
primal at the bottom of them. In High-Rise,
the residents cut loose in every imaginable way. In Crash the central characters find psychic release and erotic
satisfaction in auto wrecks. In Running Wild, the children in an upper-crust
gated community kill their parents. In Super-Cannes, highly educated professionals,
egged on by a psychiatrist/philosopher, form gangs of roving violent thugs at
night. In Millennium People, middle
class folks rebel violently against their own suburban lifestyles, taking up
terrorism and burning their own neighborhoods.
By
2019 I’d read the bulk of Ballard’s published work, but there were two full
length novels I’d missed. They’re missed no more.
Rushing to
Paradise
(1994) is a darkly funny novel written from the perspective of the adolescent
Neil. Neil is mesmerized by the anti-nuclear/animal-rights protests in Honolulu organized
by Dr. Barbara Rafferty, who has a controversial past involving assisted
suicide. Ostensibly to “save the albatross,” she assembles an activist group to
sail to the French South Pacific island of Saint-Esprit, which is being
prepared for a nuclear test. [For historical context: The French continued nuclear weapons tests in the South Pacific into the mid-1990s, and in 1985 French agents sank the Greenpeace ship Rainbow Warrior in harbor; her crew had intended a stunt much like that of “Dr. Barbara.”] Neil
joins her expedition. The project attracts attention from TV and movie
producers. Dr. Barbara and her followers reach the island but are forcibly
removed by French soldiers amid a media circus. Neil is wounded in the foot.
Months
later, they mount a second expedition, which Neil again joins. Arriving at the
island they find the French military has withdrawn and that nuclear tests, at
least for the time being, have been suspended. Dr. Barbara and her odd
assortment of scientists, activists, and hippies establish the island as an
animal sanctuary. Yet, charismatic leaders and their fanatical followers (of
any ilk) often become brutal and dangerous when social restraints come off, and
Saint-Esprit proves no exception. Despite all the sloganeering, Dr. Barbara
seems singularly uninterested in the albatross. When Neil questions her about
it, she explains to him that the real sanctuary is for women. The men have a
way of dying on the island in the ensuing months; Neil eventually realizes his
value to Dr. Barbara lies in not being a full grown man. Yet even the explanation
she gives about this is an intellectualization by her of more primitive
motivations.
The
second novel, The Kindness of Women,
reveals much about JG Ballard himself. Ballard is most widely known for a very
atypical book: the semi-autobiographical novel Empire of the Sun, based on his experiences as a boy interned as an enemy alien in WW2
Shanghai during the Japanese occupation. Spielberg directed the movie
adaptation in 1987. The novel The
Kindness of Women is the lesser known sequel to Empire of the Sun. “Semi-autobiographical”
understates it: it is closer to three-quarters. Maybe more. His lifelong
obsession with death, repressed sexuality, and the thinness of civilization’s
veneer becomes much more explicable in light of this tale.
The
protagonist James Graham (which is what JG stands for) leaves the Shanghai internment camp, which is suddenly unguarded, at the
end of the war. The teen’s first encounter while walking along railroad tracks
away from the camp is with Japanese soldiers (still occupying the area) who are
slowly killing a Chinese man in civilian clothes. (There might or might not have been a reason other than random cruelty.) James knows by then not to question anything Japanese soldiers do, so he puts on an air of as much nonchalance as he can
manage. He assumes his own odds of survival in this moment are 50/50 but the soldiers
just exchange a few mutually
unintelligible words with him and don’t bother him; he and they pointedly
ignore the bound civilian. He walks on. After James leaves China, he attends medical
school, serves in the RAF, marries but suffers an early loss of his wife, and unexpectedly
achieves some success with writing. All of those events parallel Ballard’s own
life, of course. It is clear that the years in Shanghai always followed him –
both the character “James” and his real self. They deeply impressed on him what
people could be like when effectively unrestrained – and when imprisoned. As
one might expect from the title, his relationships to (and with) the special women
in his life (sometimes kind, sometimes ruthless, sometimes both) from the days
in the camp up to the time of the Spielberg movie are crucial to his sense of
the world.
All
of us are haunted by our personal Shanghais. Most of us (though not all) are
fortunate enough for them to have been less extreme than Ballard’s, but his
writing slips us into his shoes easily.
Eggnog is prominent on supermarket
shelves this time of year. Homemade is vastly better, but I’m much too lazy to
make it from scratch these days, so it catches my attention. I rarely buy it,
however, because a glass contains as many calories as a lumberjack’s breakfast.
(I do little calorie-counting, but there are some items that shout out for it.)
As a kid I loved the stuff (the nonalcoholic variety, of course) and I still
like it. Ready-made nonalcoholic eggnog is a relatively recent product, appearing
in markets in the late 1940s. Add your booze of choice for the adult
version. Maybe sometime before New Year I’ll skip breakfast and have one or two
– with or without the ethanol.
As it happens, the first alcoholic
drink I ever had was eggnog. It was on a warm evening in Islamorada Florida during
a family vacation between Christmas and New Year’s Day. I was 8. A dozen adults,
my parents among them, were gathered under a beachside gazebo and had served
themselves nog. A cup was handed to me before my mom sampled it and realized it
was spiked. At this late date, I have no idea with what, but given the company
and location it probably was rum. “Oh, it won’t hurt him,” my dad said. “There
isn’t much in it.” My mom let it go. There wasn’t much in it either, as it had
no noticeable effect whatsoever, though of course I wasn’t allowed a refill.
That’s all there was to it. A decade later my relationship with alcohol became
more complicated.
Sumerian kegger with straws
Humanity itself got a much earlier
taste. Great apes today seek out fermented fruits, which can have as much as 5%
alcohol content – about the same as beer. There is every reason to suppose
hominins did the same. The natural supply always was limited. When human ingenuity
removed those limits, humanity’s relationship with ethanol became more
complicated. 8000-year-old shards of pottery from sites in the Middle East,
China, and Georgia (Transcaucasian Georgia, not the one it’s a rainy night in) contain
traces of beer, mead, and wine. Since this timing coincides with the advent of
farming, some anthropologists have argued that the reason humans adopted
farming (which is harder work than hunting and gathering while providing a less
healthy diet) was not for food but for an abundant supply of grains for beer; if
so, the credit (or, some might say, the blame) for civilization belongs to
brewers. The first written records appear in Sumer 5000 years ago, and among
them is a recipe for beer. Beer powered the overachieving masons in ancient
Egypt as well. The classical Greco-Roman world favored wine. Asian cultures
favored rice wine. In the Middle Ages playful Arab and European alchemists discovered how to distill powerful spirits.
A useful and compendious history of
humanity’s love/hate affair with alcohol, by the way, is Drink: a Cultural History of Alcohol by Iain Gately, though he has
little to say about eggnog in particular.
Eggs and alcohol don't seem to have been a popular mix in ancient times, though Pliny the Elder does mention it as a cure for alcoholism. He tells us that owl eggs aged for three days in wine "produce distaste for it." Posset, something very much like
eggnog, appears in the historical record in England in the 1300s. It was a
mixture of eggs, figs, and ale. It didn't cure alcoholism. As the years passed, experimenters swapped out ale for sherry and then hard liquor. Other additives besides figs were tried. The
word “eggnog” turns up in 18th-century dictionaries, which means it
is probably older. It apparently derives from the words egg and grog, and the
recipe by then was essentially the same as today. Then as now the choice of
liquor was a matter of personal preference. George Washington preferred a mix.
He offered the following recipe:
One quart cream, one quart milk,
one dozen tablespoons sugar, one pint brandy, pint rye whiskey, pint Jamaica
rum, pint sherry — mix liquor first, then separate yolks and whites of 12 eggs,
add sugar to beaten yolks, mix well. Add milk and cream, slowly beating. Beat
whites of eggs until stiff and fold slowly into mixture. Let set in cool place
for several days. Taste frequently.
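Out of curiosity, the strength of that mix can be estimated with a little arithmetic. Here is a minimal sketch; the ABV figures are my assumptions (roughly 40% for modern brandy, rye, and rum, 18% for sherry), since the recipe itself specifies none, and the eggs and sugar are ignored:

```python
# Back-of-the-envelope strength estimate for George Washington's
# eggnog recipe. The ABV values are assumed typical modern figures,
# not anything the recipe states.

PINT = 16.0   # US fluid ounces
QUART = 32.0

ingredients = {          # name: (volume in oz, assumed ABV)
    "cream":  (QUART, 0.00),
    "milk":   (QUART, 0.00),
    "brandy": (PINT,  0.40),
    "rye":    (PINT,  0.40),
    "rum":    (PINT,  0.40),
    "sherry": (PINT,  0.18),
}

total_volume = sum(vol for vol, _ in ingredients.values())
pure_alcohol = sum(vol * abv for vol, abv in ingredients.values())
abv = pure_alcohol / total_volume

print(f"{total_volume:.0f} oz of liquid, roughly {abv:.0%} alcohol")
```

By this rough reckoning the batch comes out around 17% alcohol before the dozen eggs and the sugar dilute it a bit – closer to fortified wine than to anything on a supermarket shelf.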
George’s eggnog packs a lot more
punch than whatever I drank at age 8, which likely contained just a splash of
Bacardi. If it was the one concocted surreptitiously at West Point, however, it
explains the Eggnog Riot of 1826 in which 90 drunk cadets broke dishes,
windows, and bannisters. All were disciplined but only 11 were expelled, which
is a much milder response than I would expect today. Perhaps the Superintendent
figured the morning after was its own punishment. I’ve never experienced one,
but I understand eggnog hangovers are brutal.
George wasn’t the only President to
share an eggnog recipe. Dwight Eisenhower was a bourbon man: one dozen egg
yolks, one pound of granulated sugar, one quart of bourbon, one quart of coffee
cream and one quart of whipping cream.
Bourbon is good, but if I ever get
ambitious enough to make eggnog from scratch at home and pour a quart [.95
liter] of it in there as per Ike’s instructions, I’ll repeat my age-8
experience of stopping after one cup.
There was a Christmas tree in the house every
December when I was growing up. I very much liked it as a kid and spent an
inordinate amount of time sitting by it. I’m not really sure why. My parents
owned three homes in my lifetime and none had any shortage of trees on its lot.
So, a tree in the house was special only by being in the house.
1954
Presents showed up underneath it Christmas
morning, of course, but I liked the tree for itself. The lights and decorations didn’t matter much
either. The tree alone had some atavistic appeal. My dad must have felt it too,
though he never outright said so. He always brought home a Christmas tree
before either my sister or I asked for one. During the rest of the year he
followed the builders’ tradition of raising an evergreen on the roof peak when topping out a newly constructed house. It always was a sad day when the tree exited
our house: usually on January 2, though a few times it dried too quickly and
departed before the New Year.
The origin of Christmas trees is open to some
debate. When the Council of Nicaea standardized the dates of Christian holidays in 325 CE, they quite sensibly chose dates that coincided with pre-Christian
holidays. The transition was easier that way. Solstice celebrations in one form
or another had existed pretty much everywhere. (The solstice fell on December
25 when the Julian calendar was adopted.) Many pagan
holiday traditions such as gift-giving for Saturnalia transferred readily to
the new holidays. A millennium later, the Puritans banned the celebration of
Christmas (Easter, too) precisely because of these pagan origins. Formalizing
the ban into law in 1659, the General Court of the Massachusetts Bay Colony
dictated: “It is therefore ordered by this court and the authority thereof that
whosoever shall be found observing any such day as Christmas or the like,
either by forbearing of labor, feasting, or any other way, upon any such
account as aforesaid, every such person so offending shall pay for every such
offence five shilling as a fine to the county.” They technically were right
about the history… and yet they were wrong. One can't help suspecting (with HL
Mencken) that the real worry the Puritans had was that someone somewhere might
have fun.
Getting back to the
trees, ancient pagans of central Europe and the Baltic celebrated the solstice in
woodsy fashion by hanging evergreen wreathes, holding sacred grove ceremonies,
and burning the Yule log. Christmas trees seemingly derive from these
practices, but before the 16th century in Germany there is no mention in print
of whole trees being brought indoors for the holiday, so this particular tradition may be no older than that. The trees became popular in the US in the mid-nineteenth century with the arrival of
large numbers of German immigrants.
Though my mom and dad
grew up Presbyterian and Catholic respectively, they were not dogmatic in
religious matters as adults, and became less so with each passing year. They
sent me to an Episcopal high school, but only because they thought it was
better than the public school. So, for the most part their seasonal decorations
were just festive. When pressed about their spiritual opinions in later years
they sounded rather New Agey. My skepticism regarding all things mystical
kicked in early, and in my 20s I shunned Christmas decorations with an almost Puritan adamancy – a youthful fault. For years I
forewent any seasonal ornamentations on some sort of secularist principle. I didn’t object
to others (including my parents) having them, of course, but in my first decade of living alone there were no Christmas trees in my dwelling space. I grew out of this unholier-than-thou attitude in time. Besides, the Neolithic (maybe Paleolithic) pagan origins of the
seasonal celebrations that so distressed the Bay Colony court were the very
things that gave me an excuse to ease up even before I mellowed with
age. Before the 1980s ended, a tree reappeared in my home each December.
2019
Evergreens still have an atavistic appeal. So, in 2019 once again, there
is a tree in my living room and egg in my nog. There will be gifts under the
tree for friends and family on Christmas, too. Further, if I decide to build a
small house for the stray cat who lives outside year round (something I’ve
considered) I’ll raise an evergreen on its roof when it is topped out.
The
usual suspects were on hand for Thanksgiving at my house, which this year also
happened to fall on my birthday. Eighteen friends and family members were there altogether, though a few left early while a few others arrived late, so 14 was the
maximum at any one moment. The standard advice to avoid politics on such
occasions is sound, and was very nearly obeyed. It was “very nearly” rather
than “entirely” only because food itself is political nowadays. I suppose it
always has been to some degree, but it is more fiercely so today. Enough
tolerance was demonstrated all around to keep things mostly congenial, however.
There were dishes on hand for vegans and carnivores alike. (I’m unabashedly one
of the latter.) It was nice to catch up with everybody in person rather than
through social media. Remarkably, no one at the table had a nose in a phone.
That was nice, too.
When
the table was cleared, the fridge filled with leftovers, the sink left for
later, and the guests departed, the question arose of how to employ the time
before the arrival of the sandman. “Turkey coma” is something of a misnomer,
since I always have been exceedingly awake after over-indulging in
poultry, sides, and desserts. It is true, though, that I’m never motivated to
do anything useful in that condition. A movie was the obvious answer. Two
20-something friends have been rewatching all the Harry Potter movies the past week because the films remind them of
what they charmingly and without a sense of irony call their youth. This brought
to my mind another flick set in a British school: The Belles of St. Trinian’s (1954).
My
first instinct was to recommend the movie to the two Millennials, but first
instincts are not always wise. If they were, enough intelligence for second
thoughts never would have evolved in hominins. That’s not to say we often make
use of it. I’ve made the mistake before of trying to introduce favorite books
and films to people for whom they were unsuited, only to bore the recipients to
tears. No snobbery is intended by “unsuited”: few of those recommendations were
highbrow, and I’ve received my share of recommendations for which I was unsuited,
too. When it comes to classic films (“classic” in the sense of older than the
1990s rather than necessarily “great” or even good), young people generally
have to find their own way to them. They can’t be pushed, or they will dislike
them on principle. So, as quiet returned to the house Thursday evening I spun
up the DVD for myself.
Ronald
Searle infused his 1940s cartoons with a dark humor much of a kind with that of
Charles Addams. The hellion schoolgirls of St. Trinian’s were a recurring subject
for him. Searle is not as well-known today on this side of the pond as he was
during what I call (with quite proper lack of irony) my youth. Back then The Belles of St. Trinian’s and its
sequels aired frequently on television. They haven’t for the past few decades,
and the 2007 St. Trinian’s sank like
a stone. Searle is worth rediscovering. He occupies a space on my shelf next to
Addams. The 1954 movie remains the definitive adaptation, and if you have found
your own way to classic films but haven’t yet seen this one, do yourself a
favor and do so.
Alastair
Sim is marvelous in a dual role as the headmistress Millicent Fritton and as her
underworld brother Clarence. Millicent describes her educational philosophy
thus: “in other schools girls are sent out quite unprepared into a merciless
world, but when our girls leave here, it is the merciless world which has to be
prepared.” The plot involves a racehorse owned by the sultan father of a new student:
he had chosen St. Trinian’s for her only because it was near his horses.
Millicent hatches a scheme to rescue the near-bankrupt school by betting on the
horse though Clarence and the Sixth Form girls have contrary plans. In a
subplot, police Sgt. Ruby Gates (Joyce Grenfell) goes undercover as a substitute
teacher to uncover illegal activities at the school. There is a thoroughly
enjoyable celebration of anarchy throughout the film. Keep in mind it is 1954,
and the flick is intended to be kid-friendly, so don’t expect Quentin Tarantino,
but it is still as much fun to watch as it was when I first saw it decades ago.
It wrapped up my Thursday nicely.
When
Paleolithic painters scrawled images on cave walls 20,000 years ago, critics viewing
them by torchlight argued about whether they enlightened or corrupted society
and about whether they should be censored. We don’t know that for a fact, of
course, but since critics have argued about art in this way since there have
been written records, it is not a big stretch to suppose they did so earlier as
well. There always has been tension between supporters of unfettered artistic freedom
and supporters of… well… fetters. Moralists see it as a choice between
decadence and decency – sometimes between outright evil and decency. Moralists
of a different stripe see the choice as prudery versus liberty. Beneath this
tension is the more basic question of the purpose of art. Does it have a
purpose? If so, should art uplift or simply reveal?
Decadent art?
The
dramatic arts, when they came along, moved to the center of the debate. In the
5th century BCE Euripides was regarded by conservative Athenian critics
as decadent – even dangerous – compared to his elders Aeschylus and Sophocles.
Sophocles himself remarked, “I depict men as they ought to be. Euripides depicts
them as they are.” Indeed, though Sophoclean characters have their tragic flaws, there is a core of nobility in them. Euripidean characters, by
contrast, at their cores are likely to be adolescently voyeuristic (Pentheus),
cruelly vengeful (Phaedra), callously opportunistic (Jason), or murderous
(Medea). Even Aristophanes, who was pretty edgy himself, satirized Euripides in
The Frogs.
When
drama moved to the movie screen the tensions remained unresolved. They are to
this day. A century ago, censors (acting sometimes through force of law and
sometimes through social pressure) typically framed their objections in
religious terms. Today the objections are more likely to be ideological, but
whether the concern is cosmic sin or secular political correctness, the effects
(and one supposes the underlying impulses) of censorship are similar. Neither
side in the debate gets the upper hand permanently. Nannies and libertines
trade off ascendency from one era to the next. One very special era in movie
history was that of the early talkies (1927-1934) when censors were largely
ignored: the pre-code era.
In
order to head off regulation by Congress the Motion Picture Association adopted
a self-regulatory production code in 1927 and updated it in 1930, but the
studios in practice didn’t pay heed to it prior to 1934. Faced in that year
with a more serious threat of legal restraints, the
Motion Picture Production Code (commonly called the Hays Code) began
to be broadly enforced by the studios. The code states, “No picture shall be
produced that will lower the moral standards of those who see it. Hence the
sympathy of the audience should never be thrown to the side of crime,
wrongdoing, evil or sin.” A long and detailed set of rules for following the code
(describing, for example, how married couples may be depicted in a bedroom and
how long a kiss can last) soon developed alongside the code itself. All but a
few of the restrictions would find support from PC censors today, albeit for
differently stated reasons. Directors found ways to push the envelope, of
course. In Notorious (1946) Hitchcock
famously got around the 3-second limit for on-screen kisses by having Grant and
Bergman kiss
repeatedly over 3 minutes, but never more than 3 seconds at a
time. Still, the code remained a real force until the mid-1960s, and many of the most interesting movies made before then were pre-codes.
The
better of the pre-code films portray people as they are, which uplifting and PC
films do not – at least not in any rounded fashion. Fundamentally well-meaning
people have dark sides: they can scheme and cheat. People who are fundamentally
villains can be kind and generous in any number of ways. Pre-code characters
have that complexity. They are human. Once again, that is in the better films;
every era generates its share of garbage, and the pre-code era is no exception.
A marvelous DVD series of films from this period is the Forbidden Hollywood Collection. I have
owned for some time the first two volumes which contain such B-classics as Baby Face and Night Nurse. Last week I added Volume 3 to my shelf and
binge-watched its six movies. All six are directed by William “Wild Bill”
Wellman, best known for Wings (1927),
The Public Enemy (1931), A Star is Born (1937) and Nothing Sacred (1937). The films in
Volume 3 are nothing so ambitious. They are small films, but are interesting nonetheless,
not least because they mostly deal with ordinary people:
Other Men's Women
(1931)
Best
friends Bill and Jack are fireman and engineer on a railroad locomotive. Bill
is single and devil-may-care while Jack is married and responsible. When Bill makes an extended visit to Jack’s home, he and Jack’s wife Lily (Mary Astor)
form a mutual attraction. Trouble ensues, but not in a simplistic way. There
are mixed motives, unintended consequences, and guilt all around.
The Purchase Price (1932)
Night
club singer Joan Gordon (Barbara Stanwyck) breaks from her underworld lifestyle
and her gangster lover Eddie by changing her name and answering a mail-order
bride ad posted by a farmer in North Dakota. As one might imagine, Joan doesn’t
adjust readily to country life. Her husband Jim (George Brent) is handsome but frequently behaves like a stubborn jerk. Their marriage accordingly gets off to a rough
start and is a long time being consummated. To complicate matters, Eddie tracks
her down and shows up at the door.
Midnight Mary (1933)
As
Mary (Loretta Young) awaits the verdict of her trial for murder, we see in
flashback Mary’s journey from falsely arrested teenager to prison inmate to cavorter
with gangsters. A wealthy lawyer falls for her and tries to change her life,
but her past catches up with her, as pasts tend to do.
Heroes for Sale
(1933)
Presumed
killed in a raid on a German position in WW1, Tom (Richard Barthelmess) is
actually severely wounded and captured. After the war he returns home addicted
to morphine (from his treatment in a POW hospital) and finds that another
soldier has taken credit for his heroics. He gets clean and tries to make a new
start in Chicago. He does well and marries Ruth (Loretta Young). Then Ruth is
killed in labor unrest and Tom is falsely arrested and convicted. Upon his
release Tom takes to the road as a hobo.
Wild Boys of the Road (1933)
In
the Depression, high school sophomores Tom and Ed hop a freight train out of
their Midwestern small town so as not to burden their unemployed parents. They
meet many kids their age who are doing the same, and they team up with a
runaway named Sally. They are traveling in search of work, but wherever they go
the kids face violence (including sexual assault) and unwelcoming police. When
they get to New York, an opportunity arises but a run-in with the law
complicates matters.
None
of these films is unforgettable, but every one is a solid argument on the side
of the artistic libertines. Thumbs Up.
Clip from Frisco Jenny: in pre-quake 1906, night club hostesses relieve customers of cash
Nothing
lasts forever. We certainly don’t. The oldest fully documented human lifespan (that
of Jeanne Louise Calment) was 122 years: 1875-1997. There have been claims of longer
lives. Tom Parr of Shropshire supposedly died at 152 in 1635 after
overindulging as a guest of Charles I. Odds are, though, he had claimed the
birth record of his grandfather as his own because he enjoyed the notoriety of
being old and hale. Record-keeping was hit-and-miss in the day, so it was an
easier deception to pull off then. Even if accurate, however, 152 is short
enough in the scheme of things.
Many
people externalize fears about our own personal deaths by contemplating the end
of humanity instead. Hence the popularity of apocalyptic literature, which in
religious and secular forms is as old as literature itself. In his book The Day It Finally Happens, Mike Pearl
writes, “But a certain breed of science nerd seems to take actual comfort in an
ultimate and inevitable apocalypse – or if not comfort, per se, then a certain
gleeful, misanthropic relish.” Indeed. Pearl doesn’t relish such thoughts, but
they do preoccupy him. Pearl describes himself as suffering from an anxiety
disorder that prompts him to be a writer: “it fills my head with ideas but I
hate the ideas.” As a “coping strategy” he writes a Vice column “How Scared Should I Be?” for which he researches the
actual risks of his various fears coming true and what the consequences would
be. He finds the process soothing somehow even when the risks turn out to be
rather high. The Day It Finally Happens
discusses a score of those hateful ideas.
Some
of his chapters truly do involve high order calamities such as nuclear war and
the next supervolcano eruption. Others do not: for example “The Day the UK
Finally Abolishes Its Monarchy.” That day, to which he gives a 5 out of 5 plausibility rating, will not herald the end of civilization in the UK or anywhere else. (I avoid the subjunctive in deference to his 5/5 rating, though whether it happens anytime soon is debatable.) It will end the name “UK,” which will be
replaced by a United Something-Else, but other peoples have survived the
transition to a republic, and so will the Brits. Also unlikely to be
world-ending is “The Day Humans Get a Confirmed Signal from Intelligent Extraterrestrials.”
Whatever one thinks of his 4/5 plausibility rating for this one, such a signal
most likely would be a stray indecipherable transmission from hundreds of light
years away (or much, much farther), thereby making any meaningful two-way
communication impossible. More Heaven’s Gate-style cults might spring up here
and there (invest in Nike?), but it is doubtful much else would change. Some
chapters discuss two-edged swords, such as “The Day Humans Become Immortal.”
This is a pretty good day from an individual standpoint, but were it to happen
(he gives it a 3/5 plausibility rating, though not in this century) even a tiny
fertility rate would overcrowd the planet in short order. Actually, even if we
somehow ended all deaths from aging and disease, we would not be immortal. Assuming
we otherwise remain human (no cyborgs or engineered invulnerabilities), we will
have fatal accidents, and sooner than one might think. Actuarial tables show
that it would be the rare human who survives much beyond a millennium. (Population
still would be a problem even so.) A thousand years is pretty good, though, Voltaire’s warning about lifespans in Micromegas notwithstanding. I’ll take it.
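The actuarial point is easy to check with a little arithmetic. As a hedged sketch (the rate is my assumption, not a figure from the book), suppose aging and disease are gone but fatal accidents persist at roughly the recent US rate of about 1 in 1,750 per year; a constant annual risk then compounds like this:

```python
import math

# Assumed constant annual probability of dying in an accident
# (~57 per 100,000, roughly the recent US accidental-death rate).
p_accident = 1 / 1750

def survival_probability(years, p=p_accident):
    """Chance of surviving `years` when each year carries death risk `p`."""
    return (1 - p) ** years

# Years until half the population has died of accidents alone.
median_lifespan = math.log(2) / -math.log(1 - p_accident)

print(f"P(survive 1,000 years) ~ {survival_probability(1000):.2f}")   # ~0.56
print(f"P(survive 5,000 years) ~ {survival_probability(5000):.3f}")   # ~0.057
print(f"Median lifespan ~ {median_lifespan:.0f} years")               # ~1213
```

Under that assumption a slim majority would still be around after the first millennium, but the odds decay exponentially from there, and almost no one would see year 5,000 – which is the sense in which only the rare human survives much beyond a millennium.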
As
mentioned, some of Pearl’s scenarios are legitimately scary such as “The Day
Antibiotics Don’t Work Anymore” and (given the dependence we already have on
it) “The Day the Entire Internet Goes Down.” Yet, Pearl is (despite, or because
of, his anxiety disorder) fundamentally an optimist. All of his scenarios would
be hard on at least some of us. A few would be widely horrific. Yet, none is an
utter extinction event. His researches show that nuclear war, climate change,
and supervolcanoes are all survivable by some. This comforts Pearl. “I feel a
very strong sense of revulsion when I imagine my entire species literally going
extinct,” he explains. “Don’t you? If you don’t, I’m not sure we can hang…”
I’m
not sure we can hang. I don’t dispute his survivability assessments for his
scenario list. I just am sure there will be worse days than the ones about
which he writes – including one that ends us all. Whatever we do or don’t do to
our climate in this century, for example, earth in the longer term has lethal
plans of its own. There was once a mile of ice piled on top of where I am
sitting right now, and there will be again one day. Civilization
will be a little tough to maintain in this spot. (No jokes, please, about
whether civilization exists in New Jersey at present.) Astronomical events have
all but wiped the slate clean on earth in the past and will again. The sun
itself has a limited life span, and the planet will become uninhabitable long
before the end of it. I don’t really worry much about it, and not just because
probably none of these things will happen in my lifetime. If there were some
way to collect the bet, I would bet our machines will outlive us. They have a
better chance of surviving off-world for the long term – though, again, not
forever. That’s OK. We accept our own ends. Why not Our own End? We’re here
now. That counts for something – maybe everything. Right now, I quite literally
smell the coffee. I’ll go pour a cup.
1965
was one of the more notably transformative years for me personally. The year
one turns 13 is for most people: one falls from the apex of childhood to the
lowliest rank of teenager, a change commonly driven home by the start of high
school. It was the year I became very self-conscious in both good and bad ways.
Much of the “feel” of the year is still very real to me. I have many strong
sense memories from the year including smells from such various sources as horse
stalls, mimeograph paper, and (permeating nearly all interior spaces) tobacco smoke. My favorite album that year was Animal Tracks. (I still like Eric Burdon and the Animals; I caught
a concert by the septuagenarian last year.) To my classmates back then I pretended
my favorite was Highway 61 Revisited
because that was a cooler answer. I did, in fact, like that album (and Dylan in
general), but not as much as more straightforward rock. (A quick look shows
that the vintage Highway 61 Revisited
vinyl is still on my shelf.) My first fumbling attempt at a flirtation was deliberately
ignored or honestly unnoticed – either is possible. Meantime the world was
turning on its head. To be sure, I was aware of the cultural milieu to the
extent someone that age ever is, but it seemed normal to me. The fish, as the
adage goes, does not notice the water in which it swims.
My
mom noticed. I remember her saying in 1968 that in the previous few years “the
world just went crazy.” This was from someone who had been a teenager during
World War 2. Still, I knew what she meant. (By then I had evolved a little
beyond a fish apparently.) The presuppositions of the very Leave It to Beaver era of my childhood (I even looked a little like
Jerry Mathers) had shredded – quickly. Anyone who lived through the 60s knows
just how distinct the two halves of the decade were. 60-64 were just the 50s
amped up a little. “The Sixties,” as we usually think of them, were the second
half of the decade, which spilled over into the early 70s. A minor example of
the shift: compare the Beatles albums Meet
the Beatles (64) and Sgt Pepper (67).
My
mom’s assessment (stated somewhat more academically) is shared by many from
across the philosophical spectrum. Nicholas Lemann, a professor at Columbia University, says that the 60s “turned as if on a
hinge” in 1965. George Will independently uses the same hinge metaphor. Charles
Murray in Coming Apart identifies the
year as the moment when the country began to…well…come apart in the ways that
are all too obvious today. Cultural critic Luc Sante (The New York Review of Books) comments that western culture reached
some sort of peak in 1965 and has been in decline since. Even crime became qualitatively
different (see my review of Evil by
Michael H. Stone and Gary Brucato) as standards
shifted. Major social changes don’t really happen without a prelude, and
the roots of The Sixties are discoverable in the subcultures of The Fifties if
you look for them. Nonetheless, politically, socially, and culturally the
country reached a tipping point in ’65, and from there the rapidity of change
was dizzying. We are still dealing with the aftermath in innumerable ways.
James
T Patterson aims to capture those twelve months in The Eve of Destruction: How 1965 Transformed America. The author, who was a
30-y.o. (as in don’t-trust-anyone-over) professor at the time, has a
perspective different from mine (not a criticism, just an observation) but does
a pretty good job covering many of the key elements. The title refers to a 1965
hit song that never would have charted just a year or two earlier. Patterson
details a busy year for national and world events. President Johnson openly
committed US combat troops to Vietnam thereby missing the last chance to
avoid Americanizing the war. The Civil Rights Act of 1964 (outlawing public and
commercial discrimination “because of such individual's race, color, religion,
sex, or national origin”) and the Voting Rights Act of 1965 took hold and promised
real improvements. Yet on the street there were racial confrontations in Selma and
all out riots in Watts. Great Society programs coupled benefits with unintended
social consequences. Patterson writes of the role of youth culture, of student
organizations such as SDS, of the generation gap, of the credibility gap, of
sexual politics, and of the environmental movement. The easy confidence about
the future that had been so much a part of American psychology for a century
fled as political divisions deepened in ways that haven’t healed since.
The
book is worth a read. If I have a reservation, it is
the short shrift he gives to the apolitical (and, some would argue, more
important) aspect of the counterculture that flowered (bad pun intended) mid-decade:
the part about personal enlightenment and alternate ways of living. Timothy
Leary: “When the individual's behavior and consciousness get hooked to a
routine sequence of external actions, he is a dead robot, and it is time for
him to die and be reborn. Time to ‘drop out,’ ‘turn on,’ and ‘tune in.’" This,
admittedly, was a Revolution that failed (regrettably) in broader social terms,
but it still has a legacy that matters on another level.
Why
care about 1965 in 2019? There is always
something to be learned from watershed moments of the past. As Professor Joseph
Wittreich (not Mark Twain despite the common misattribution) remarked, history
doesn’t repeat but it often rhymes. A little prep work helps us to sing along.
From
movie commentaries, I knew that the 1944 film To Have and Have Not bore almost no relation to the 1937 novel of
the same title other than featuring a fishing boat owner named Harry Morgan and
the prevarication “Ernest Hemingway’s” in the promotional material. The movie
is set in Martinique in 1940 when the island was still controlled by Vichy
France. Not an adaptation, it is basically Casablanca
reset in the French Caribbean though the dynamic between 19-y.o. Lauren Bacall
and 44-y.o. Humphrey Bogart is different (on and off film) from that between
Bogart and Ingrid Bergman in the earlier film – so different that Bogie and
Bacall became an item and eventually married. Even that gossipy aspect of the
film makes a better story than the novel.
Hemingway
is a towering figure in American letters, though the quality of his work varies
a lot. (Whose doesn’t, one might fairly ask.) I’ve enjoyed most of his short
fiction and a couple of his novels, but struggled to get through others despite
his well-crafted sentences. When at long last I picked up To Have and Have Not last week, it was a struggle. Nor was this
just my own reaction. After slogging through it, out of curiosity I checked the
1937 review by J. Donald Adams in The New
York Times. He writes, “The expertness of the narrative is such that one
wishes profoundly it could have been put to better use... Mr. Hemingway's
record as a creative writer would be stronger if it had never been published.”
Indeed.
Harry
Morgan, married with children, is presented as a Have-Not even though he owns a
charter fishing boat. In addition to legitimate jobs he smuggles contraband and
people between Havana and Key West. He is crude, abusive, obnoxious, and
racist, even by 1930s Florida standards. I suppose this is to reinforce his
representation as a common man, but if the intent is thereby to make him
sympathetic (could that possibly be the intent?) it backfires badly. A rich
Have recreational fisherman charters Harry’s boat but cheats him of his fee.
This leaves Harry stuck in Cuba without money, so he traffics with criminals
and revolutionaries, commits murder to keep an illegal job on track, and
undertakes to smuggle Chinese illegal immigrants into the United States.
Instead, he bilks the Chinese and strands them on a Cuban beach. Somehow we’re
supposed to feel sorry for him when things go bad at the end because he’s a
Have-Not. We don’t. (At least I hope most readers don’t.) The Haves in the book
are reprehensible, yet Harry behaves far worse than any of them. Further, he
doesn’t take responsibility for his actions because of his social position.
Hemingway was influenced at the time by the Marxism of his compadres in the
Spanish Civil War, but if the intended message was pro-working class it comes
across almost backwards.
Recommendation:
Be kind to Ernest and skip this book. Opt for A Farewell to Arms or For
Whom the Bell Tolls instead. Or watch the movie (screenplay by Jules
Furthman), which is quite good.
A
couple weeks ago I advised passing on the latest blockbuster and walking down
the multiplex hall to the indie film with three or four viewers. I neglected to
follow my own advice when Under the
Silver Lake was up against Avengers: Endgame
last spring, but I made up for it yesterday by spinning up a DVD of the flick. *SPOILERS*
of a sort follow, though more regarding the film’s subtext than text.
Anyone
seeing this movie without any prior knowledge of it is likely to think right at
the outset, “Oh, a David Lynch movie.” It’s not. The director is David Robert
Mitchell (It Follows) whose homage to
Lynch is so close as to be initially distracting; fortunately, enough transpires
on screen for that reaction to fade.
The
protagonist Sam (Andrew Garfield) is jobless, behind on his car payments, and
facing eviction. He makes no effort at all to rectify this. So, at first glance
he is a slacker loser. Yet this is not quite right. He is energetic and
diligent at pursuing his interests. Those interests just don’t include the
banalities of everyday responsibilities. He is charming enough to do very well
with the ladies (including Riki Lindhome) despite his impecunity. He has enough
boyish charm to keep viewers in their seats, too, even though he is often
creepy and sometimes villainous. He spies on a topless middle-aged neighbor even
(driving home the Freudian element) while talking to his mother on the phone. When
kids vandalize his car he punches them – hard. We see him commit homicide;
granted, the fellow had shot at him, but retreat was very much an option. There
is a dog killer stalking the neighborhood, and (though the killer is not
identified) there is reason to wonder if he is Sam.
Sam’s
real interest is a common one in our secular world: a search for meaning beyond
just drudgery and paying bills. As a friend remarks to him, “Where's the
mystery that makes everything worthwhile? We crave mystery, 'cause there's none
left.” Not everyone handles nihilism well; many find some obsession (politics is a favorite) to divert themselves from it. Sam wonders if there is
a conspiracy of in-people who run the world for their own benefit and have
access to deeper revelations that they keep to themselves. This is his
obsession. He begins to see secret codes everywhere by which the insiders
communicate with each other. Some of his hypothetical codes are crazy even in the
context of the movie (e.g. Vanna White’s eye movements), but some turn out to
be real, such as messages recorded backwards on popular music. When a neighbor
Sarah (Riley Keough) with whom he has a flirtation disappears, a symbol is left
behind on the wall of her apartment. Sam’s investigation of her disappearance gives him real leads to the conspiracy and puts him and others in danger.
The
movie makes no mention of the Illuminati, but in the real world there are
people with views similar to Sam’s who do believe in them. Suppose they exist. Suppose that
underneath their worldly machinations there is an occult purpose. What
if, after arduous effort, you discovered the secrets of the Illuminati only to
find they are as credible as those of the Nike-wearing Heaven’s Gate guru?
Depending on one’s mindset, the revelation could be shattering.
FYI,
there are a lot of self-referential hidden codes in the movie, but none of them
are important. (Animal images share first letters with the title, for example.)
Only bother with them if you enjoy puzzles of that kind for their own sake.
This
surrealistic noir is definitely not for everyone. Yet there is more to it than
will be found in the CGI battles of the spandex superheroes who dominate the
box office.