Sunday, February 24, 2019

Triple Play


An Evening with Beverly Luff Linn (2018)

Starring Craig Robinson as the title character Beverly Luff Linn and Aubrey Plaza as Lulu Danger, Jim Hosking’s An Evening with Beverly Luff Linn repeatedly made me ask, “Why was this film made?” I’ve yet to come up with a satisfactory answer.

I like Aubrey Plaza, whose usual deadpan style only strengthens the emotional moments in her screen roles. I like that she is willing to take chances with offbeat characters, which has paid off nicely in films such as Safety Not Guaranteed, Ingrid Goes West, and even (to a degree) Life After Beth. But they wouldn’t be “chances” if they didn’t sometimes turn up snake eyes. This time was craps (final “s” optional). The problem was not Aubrey or the other actors. It was the material.

I also like “quirky” when done right. Wes Anderson has a habit of doing it right, but he didn’t make this film. This film was a ham-handed attempt at quirky. A movie is not automatically artistic or amusing just because it has characters who talk oddly, behave weirdly, and dance awkwardly.

Plot: Lulu Danger is married to the total jerk (and thief) Shane (Emile Hirsch). In the company of would-be errant knight Colin (Jemaine Clement), Lulu leaves Shane. Lulu and Colin stop in a hotel where Beverly Luff Linn (a fellow with whom Lulu has an unresolved history) will be giving a performance. Beverly’s assistant (Matt Berry) is in love with Beverly. The relationships shake themselves out over a few days. That’s pretty much it, which in the right hands could be enough. The wrong hands were at work here.

In fairness to the filmmakers, I’ll mention that the positive critic and audience scores on Rotten Tomatoes are 52% and 56% respectively, though I can’t imagine what those slight majorities saw in it. Whatever it was, I missed it.

Thumbs Down.

**** ****

The Conspiracy against the Human Race: A Contrivance of Horror
by Thomas Ligotti

The plots, themes, and characters of every fiction writer are informed by that writer’s philosophy and world view, acknowledged or otherwise. Joss Whedon comes to mind for reasons that will be obvious in the next segment. He rather famously infuses his scripts with existentialist notions about choices: about how we always have them (even if none are good) and about how our choices ultimately are who we are. (If you look closely in one episode of Buffy, you may notice the character Angel reading Sartre’s La Nausée.) Ligotti is a very good writer of especially creepy horror fiction. In 2010 he decided to be explicit about his own philosophy and how it relates to his fiction in The Conspiracy against the Human Race. Last year he published an updated edition of the book. Ligotti is a philosophical pessimist; Schopenhauer is the school’s best-known proponent, though Ligotti references many others.

To the extent that Pessimism as a coherent philosophy has something to offer beyond “realism,” it is the observation that if you always expect the worst you won’t be much disappointed. There are some who find that comforting. Pessimists view life as a painful experience that inevitably ends in death. They argue that our struggle against death (even though it offers escape from pain) is merely an inherited instinct that makes no sense but is nonetheless real. Existence is accidental and without any inherent meaning. Consciousness is regarded as a catastrophe since it allows humans (unlike most creatures) to be fully aware of pain, anxiety, and mortality. People distract themselves from the awful realities with fantasies of “meaning,” with intoxicants, with physical or intellectual activities, and with made-up romantic notions. Ligotti quotes William S. Burroughs: “Love? What is it? The most natural painkiller there is.” (Woody Allen, another pessimist, said something similar in Annie Hall.)

Ligotti gives us a run-down on the evolution of various forms of horror fiction and tells us how they generate frisson by turning our eyes to the terrible while simultaneously distracting us from the terrible in real life.

Ligotti’s vision is not mine, but I do understand it. It is an interesting take on horror fiction in general and on his own in particular. Overall, though, I’d recommend his fiction itself rather than his analysis of it.

Thumbs cautiously and modestly Up.

**** ****

Buffy the Vampire Slayer No. 1: Buffy Summers (2019)
Story by Jordie Bellaire, illustrated by Dan Mora, created by Joss Whedon.

Into the life of every long-lived comic book character must come the reboot. This happens quickly and repeatedly for characters intended to be a particular age (e.g. the teenager Peter Parker), but eventually it happens to all. One can’t very well have a geriatric Batman, after all. Joss Whedon’s Buffy the Vampire Slayer had a longer first run than most. The TV show lasted for 7 seasons starting in the spring of 1997. It was followed by 5 more “seasons” (elsewhere known as volumes) of comic books that didn’t come out every year. The Season 12 grand finale (The Reckoning) was published late in 2018. The timeline of the comics is not the same as our world’s (i.e. not the same as the publication dates), but in Season 12 Buffy says she is age 30, which places the events of the comic in 2011. That date is too long ago and the protagonist is too adult for the series’ target demographic. Time for new boots.

Fans of any original series always have trepidation about a reboot. This is especially so in recent years, when our socio-political divisions have infiltrated our popular culture to a degree that can impede (or replace) good storytelling, and all too often does. Older fans can relax: Buffy No. 1 is a suitably good yarn. Buffy is back in Sunnydale High at age 16, but in the year 2019. The familiar cast of characters is back, albeit with some differences as one expects in a reboot, such as Anya already running the magic shop.

Older fans might question some of the changes. Buffy, for example, has her act more together than in her 1997 incarnation. The whole theme of the original series is about growing up and (in an unsubtle metaphor) about fighting one’s demons in the process, so 1997 Buffy is full of promise but as yet literally sophomoric. 2019 Buffy is savvier, but she is still fundamentally teen Buffy with plenty of room to grow, so the original theme presumably is not thrown entirely out the window. The biggest change is to Willow. Willow grows more over the course of the TV series than any other character, Buffy included; in 1997 she starts out painfully self-conscious, nerdy, and shy but develops (despite some lapses into bad behavior) into the most formidable of Buffy’s allies. 2019 teenage Willow, on the other hand, is already self-assured and apparently settled in her orientation, too. Be that as it may, the point of the reboot is not to satisfy old fans but to win new ones who are themselves experiencing the hellmouth that is high school. It should succeed at that. The new fans won’t be aware of the changes unless they choose to visit the original series, which finished its TV run before most of them were born.

I won’t be buying No. 2 (this reboot is definitely not aimed at me), but Thumbs Up nonetheless.


Trailer for An Evening with Beverly Luff Linn


Wednesday, February 20, 2019

Rousseau and Hobbes, Sitting in a Tree…


Rousseau thought that people by nature were peaceful unless corrupted by civilization. Hobbes thought people were violent unless civilized by society. Both were right. Evolutionary biologist and Harvard professor Richard Wrangham addresses this duality in his book The Goodness Paradox: The Strange Relationship between Virtue and Violence in Human Evolution. Wrangham, who had studied in the field with Jane Goodall, is also the author of Catching Fire: How Cooking Made Us Human and is co-director of the Kibale Chimpanzee Project. He sees the key to the paradox in the distinction between proactive and reactive aggression.

Reactive aggression is the tendency to respond with threats (such as hisses, growls, and roars) and violence when approached, whether by a member of another species or one’s own. Most animals including our cousins the chimpanzees – particularly (but not exclusively) males – typically act this way not just to strangers but to members of their own pack or social group. Humans rarely do. He notes that 300 chimpanzees never would sit quietly side by side for hours; fights would break out and (with nowhere to run) fatalities would be likely. Yet humans do that on planes and in movie houses all the time. However often bar fights and physical assaults may be on the news, the remarkable thing is how rare they are among humans. We are more tolerant even than bonobos, the mild-tempered close relatives of chimpanzees. We share this level of tolerance only with domestic animals, which have been deliberately bred for this temperament.

People domesticated animals, but who domesticated people? Wrangham argues (as others have before him) that our ancestors did it to themselves by ganging up on any overly violent, dominant, or annoying individual, who was then killed or ostracized and thereby removed from the gene pool. (Remaining hunter-gatherers still do this.) Sociality became a reproductive advantage. Bonobos, whose environmental pressures differ from those on chimps by favoring more social tolerance, did something similar to themselves. Lacking language, bonobos have a much more limited capacity to stir up conspiracies against bullies, and the results are therefore less extreme than among humans. This ganging up is, of course, proactive aggression. The extensive planning permitted by language made humans’ proactive aggression deadly on an unprecedented scale, whether against actual outsiders (e.g. World War One: see my recent review of the documentary They Shall Not Grow Old) or against undesired members of their own social groups.

Domestic animals share a large number of traits that are incidentally related to the primary one of social tolerance, including neoteny, smaller teeth, and smaller brains. This is true even of relatively smart domestic animals; dogs, for instance, have smaller brains than their wolf ancestors of comparable size. Humans, too, have those traits. After millions of years of growing larger, human brains shrank some 15% from their Stone Age peak, reaching their current size many thousands of years before farming (so the reduction was unrelated to it). Apparently, beyond a certain level of population density (still extraordinarily sparse by modern standards), being social conferred more reproductive benefits than being smart.

There remain people who are violent for the fun of it (i.e. criminals), of course, but they are few by ape standards. However, humans are unmatched (in fact, unique) in our capacity for moralistic violence: our intelligence and our language skills let us identify as “other” those with the wrong ideology, religion, accent, or whatever, and enable us to whip up moral outrage against them. The most horrific mass killings are by moralists who think that they are doing the right thing – even the obligatory thing. They are not criminals in the usual sense, and are likely to be kind and polite people in everyday life. As an example, Wrangham relates the story of anthropologist Alexander Hinton who investigated the ideologically driven Cambodian massacres of the 1970s that killed nearly 2,000,000 people. Hinton was disconcerted by a former Khmer Rouge named Lor who openly stated he had killed many men, women, and children: “I saw before me a poor farmer in his late thirties, who greeted me with the broad smile and polite manner that one so often encounters in Cambodia.” Says Wrangham, “So the definition of morality that I will follow here is not limited to altruism or cooperation. I take moral behavior to be behavior guided by a sense of right and wrong… We sometimes think that cooperation is always a worthwhile goal. But just like morality, it can be for good or bad.”
Mae West: “What is this, propaganda?”

So, our better natures have their roots in aggression. However, Wrangham does not suggest that we need to continue a social strategy just because it has evolutionary roots. He opposes capital punishment, for example, as no longer necessary even though we may well owe our peaceful natures to it. We are big-brained creatures, after all (despite the late Paleolithic shrinkage), and we can choose to be better. Most of the time we do. “The one guarantee that an evolutionary analysis can offer, however, is that it will not be easy for fairer and more peaceful societies to emerge.” Fortunately, our harsh ancestors gave us the cooperative skills to make that possible.


Bessie Smith - A Good Man is Hard to Find (1927)


Friday, February 15, 2019

Still Kickin’


Popular musicians have existed for all of history, but until recorded music came along (invented in 1877 but not commercialized until 1892), only a small portion of the population ever heard them, and there was no way (other than memory) to compare them to who came before. So, it wasn’t until the 1910s that a younger generation was really able to dismiss their parents’ music as old-fashioned tripe and for parents to decry their offspring’s preferred music as newfangled trash – and decadent to boot. Both are always right. And wrong. Every era’s popular music is a mix of wonderful and awful, with little obvious relationship between quality and sales. However, they are more right at some times than others. Some decades really do have something special.

The 40s was a decade with something special – so was the 60s. The 40s had much more than the Big Band sound, but that was the most iconic 40s music. Arguably it was killed by taxes. In 1944 the U.S. imposed a 30% cabaret tax on clubs hiring live bands – a tax not on profits but on gross receipts. This made hiring large bands uneconomical for most clubs. Perhaps the sound was on its way out anyway, but the words “nail” and “coffin” might be relevant. To be sure, Big Bands still exist, but no one goes to a Glenn Miller Orchestra concert (yes, it still tours) to hear new music; they go to hear 40s classics. New music is still written for Big Bands, but the audience for it is scarcely large enough to qualify as a niche. The sound is no longer living popular music: it is a nostalgia act.

Rock and roll (particularly in the 60s variants, which my parents hated or at least pretended to hate) fared better for longer. After decades of dominance, however, rock hasn’t cracked the top ten in the singles charts for the past several years. Hip hop and pop dominate instead. (I now know how my parents felt.) So, “rock is dead,” as it so often has been before. Or not. Nowadays most music is downloaded digitally (often for free, legally or otherwise) as singles. Rock may not be among the top singles downloads, but rock albums (new as well as classic) still sell strongly, especially as CDs and vinyl. Rock bands in toto still sell more tickets for live performances than other genres. Classic bands (those that yet totter on stage) have fans who buy tickets largely to hear them play classic numbers, it is true, yet the genre has not been relegated to nostalgia gigs. 21st century bands (e.g. The Cadillac Three, Broken Witt Rebels, Greta Van Fleet, etc.) regularly form and win audiences with new material. So, rock remains living popular music, and not just a niche in the manner of jazz.

One of the 21st century bands is Dorothy, who released their impressive debut album Rock Is Dead in 2016 and followed it up convincingly last year with 28 Days in the Valley. The band’s live shows, featuring (mostly) blues-based power rock with unapologetic infusions of psychedelia and even (in spots) country, are among the best currently on the road. The lyrics contain a full range of passions in an age that too often devalues every one of them but self-righteous anger (the emptiest). I caught the band last night at Irving Plaza, one of the better concert venues in NYC. Amid a crowd of fans overwhelmingly young enough to be my grandchildren (had I any grandchildren), I was pleased for once to be outside the demographic. I don’t even mind that 20 hours later my hearing has yet to recover fully.

It’s possible that rock truly is on its way out, but it is not dead yet. Judging by last night it does not go gentle into that good night.


One of their more mellow numbers: Pretty When You’re High

Saturday, February 9, 2019

We That Are Left Grow Old


Peter Jackson is a film producer/director whom I’ve admired more than liked. The reason for the subdued affection until now lay in his source materials. Though it costs me many nerd points to say so, I’ve never been a Tolkien enthusiast, so Jackson’s Lord of the Rings trilogy (for which he won multiple Oscars) didn’t catch my fancy any more than the books did. That doesn’t prevent me from recognizing what marvelous examples of moviemaking his productions are, and I very much understand why so many viewers like them. So, too, much of his other work including The Hobbit and (as producer) Mortal Engines. (I actually liked his King Kong remake.) Despite being a lukewarm fan, therefore, I had little doubt that the Imperial War Museum made the right choice when they approached Jackson to do something with archive footage from World War One to commemorate the 100th anniversary of the Armistice.

Jackson and his superb visual effects team in New Zealand outdid themselves in the documentary They Shall Not Grow Old. (The title is a dyslexic misquote from Laurence Binyon’s 1914 poem For the Fallen.) The scratchy faded footage with which we all are familiar has been startlingly transformed by technical wizardry. What were once grey shadows emerge as real people in bright true-to-life colors and 3D. Audio has been added, including voices where the words of soldiers can be determined by lip movement. Jackson did not attempt a chronological historical telling of the war. There is scarcely a mention of any specific battle or date. Instead, we have the day-to-day experiences of British soldiers at the Front: the tedium, the terror, the meals, the mud, the casualties, the shelling, the rats, and the trench foot. The footage is accompanied by voiceovers from World War One veterans whose interviews were recorded 50 years ago. At that time World War One was as distant (and as close) as the Vietnam War is today. The veterans speak of selflessness, loyalty, and camaraderie among fellow soldiers, and of casual brutality (but little or no animus) toward the enemy. There are offhand remarks, for example, about killing surrendering Germans (the exception rather than the rule, fortunately) because taking prisoners was too much trouble in the circumstances. For all the horrors of war and the desire to see it end, the vets almost uniformly speak of a sudden loss of purpose when the guns fell silent on 11/11/1918.

I’m old enough that there were a lot of World War One veterans still around in my youth. One of my grandfathers, who turned 18 in August 1918, escaped serving by a hair. He was called up but, because of the Armistice, not inducted. (He asked to serve in the infantry because it was the first thing that popped into his head when asked his preference.) Peter Jackson dedicated They Shall Not Grow Old to his own grandfather who did serve with the 2nd South Wales Borderers infantry regiment. I didn’t know enough to question that generation in a meaningful way about their experiences and views when I had the opportunity, but I’m glad Jackson found a way to let them speak to us still.

They Shall Not Grow Old is presently in theaters, but you might have to search for it. The closest multiplex to me that offered it is 20 miles away. Only one other guy and I were in the theater for the showing, which helps explain why it isn’t on more screens. This is a shame, for World War One is the central calamity of the modern era. Understanding it is the key to understanding the rest of the 20th century. We are still living in the war’s aftermath. There is no better way at this late date to visualize how it was for the people directly involved at the time than through Jackson’s film.

Thumbs solidly Up.


Tuesday, February 5, 2019

The Road to Utopia


Dystopian novels and movies have dominated science fiction in recent years, e.g. The Road, The Hunger Games, Blade Runner 2049, Idiocracy, etc. They haven’t entirely squeezed out other types (e.g. The Martian and the Star Trek reboot), but they are at the forefront. To be sure, dystopias always have had a place in the genre (e.g. Metropolis, 1984, the original Planet of the Apes, etc.), but at present we are surprised if a scifi book or film is set in anything else. The dystopian Hotel Artemis (reviewed on this site last week) motivated me to look for something else.

H.G. Wells has occupied space on my bookshelves since I was a boy. The War of the Worlds was the second novel I ever read that wasn’t intended by the author to be children’s literature. (The first was Arthur Conan Doyle’s The Lost World.) Wells’ best-known science fiction tales were written between 1895 (The Time Machine) and 1914 (The World Set Free, which features a war with “atomic bombs” built from radioactive materials), but he continued to write on various subjects until his death in 1946. His later books include fiction of a non-scifi nature, A Short History of the World, numerous political/philosophical tracts, and, of course, more scifi. One scifi novel from the later phase that I had missed until a few days ago is the utopian Men Like Gods, published in 1923. One must remember that this was shortly after a horrifically bloody war in which 2% of the British population died (concentrated, of course, in a single generation). It was a time of cynicism to put it mildly: the self-styled “Lost Generation.” Utopian dreams were in short supply, but Wells came up with one.

Almost every trope in contemporary scifi has some antecedent in Wells, who more often than not did it not only earlier but better. Men Like Gods posits the existence of parallel worlds in a multidimensional multiverse (yes, 1923); they are possibly infinite in number, with the “closest” alternate worlds being most nearly alike. In a world right next door to ours, beings (altogether human but socially far more advanced) experimentally open a door (another “direction”) into our world for a moment. It is open long enough for three cars on the road from Slough to Maidenhead to drive through it. The protagonist, Mr. Barnstaple, is a writer for a liberal newspaper. He needs a break from his unsatisfying job and his even more unsatisfying family. His decision to get away from both for the weekend puts him on the fateful road alone in his yellow two-seater. The occupants of the other cars include an aristocratic lady, a greedy power-seeker, a puritanical priest, a beautiful minor celeb, a very intelligent but ultimately amoral conservative politician, and a few other hangers-on. They find themselves in what Mr. Barnstaple unabashedly calls Utopia. Handily, the Utopians’ telepathic abilities are advanced enough that when they speak, listeners understand them in their own languages provided they can grasp the concepts. Of all the earthlings, only Barnstaple likes the place. All the others – the prudish priest most of all – consider the Utopians and their society degenerate.

The Utopians at first glance have an anarcho-communist post-scarcity society. While there is no central authority as such, however, we learn that there are global institutions: “Decisions in regard to any particular matter were made by the people who knew most about the matter.” For example, while there is no money per se in Utopia, there is a global “electrical” accounting system (necessarily run by technocrats) that assures everyone gets what he or she needs and gives back appropriately. The “giving back” (i.e. work) is not very burdensome or time-consuming, and its type is freely chosen: a Utopian is “credited at his birth with a sum sufficient to educate and maintain him up to four- or five-and-twenty, and then he was expected to choose some occupation to replenish his account.” When Barnstaple asks what happens if someone doesn’t do that, the answer is, “Everybody does.” Amusingly, Wells in his Utopia still allows artists (presumably including writers such as himself) to “grow rich if their work is much desired.” The Utopians are beautiful, and we learn that they practice eugenics (a progressive thing back in the 20s) and use genetic engineering of plants and animals to tame nature. They wear little clothing (being beautiful, why wear more?) and lack sexual hang-ups. They speak of an Age of Confusion in their history that was very much like 20th century Earth. The global population is a manageable 250,000,000, far lower than during their last Age of Confusion. (Earth’s population in 1923 was about 1.8 billion; it’s over 7.7 billion today.) They tell Barnstaple that the Ages of Confusion are necessary stages, but that they can be transcended. Their science is advanced and they have aims of reaching the stars. There are adventures and problems with the earthlings, including infections they brought with them. Eventually Barnstaple realizes that as much as he loves Utopia, his place is back home where he can do his bit to put Earth on the path to its own Utopia.

The novel, while somewhat preachy, is worth a read on its own merits, but also for another reason. Wells was a Fabian socialist (though he had nothing nice to say about Marxism or Bolsheviks) who very much wished to do his bit to nudge the world step by step to a higher plane. Wells’ vision of Utopia in this novel is helpful in elucidating what he ultimately has in mind in two of his nonfiction books: The Open Conspiracy (1928) and The New World Order (1939). Someone reading those titles for the first time might be forgiven for thinking Wells is scaremongering against these threats. Quite the opposite. He is all in favor of a conspiracy to effect a New World Order and tells the reader how to be a part of it.

There is a belief among some conspiracy theorists that, going back to the time of Cecil Rhodes, there has been a conspiracy of an international elite to create (as summarized by Wells) a New World Order that is a single, globalized, corporatized, fundamentally undemocratic (despite democratic window dressing) social welfare world-state run by technocrats. The theorists don’t mean a conspiracy by some fanciful secret Illuminati sect (well, all but a fringe don’t mean that) but by actual elites who meet in publicly acknowledged (but closed to the public) settings such as the meetings of the secretive Bilderberg Group or the more formal WTO Ministerial Conferences. Does such a conspiracy exist? The short answer is yes. The Bilderberg Group in particular tends to draw attention precisely because of its efforts to avoid attention. Major political, financial, and business figures attend, but you cannot buy your way into it. It is by invitation only. Just as a small sample, among the attendees have been Henry Kissinger (the author of World Order still attends at age 95), David Rockefeller, Margaret Thatcher (prior to her stint as PM), Bill Clinton (prior to being elected President), Tony Blair (prior to being elected PM), Angela Merkel (prior to being elected Chancellor), Bill Gates, David Petraeus, Queen Beatrix of the Netherlands, NATO Secretary General Stoltenberg, Hillary Clinton, Governor of the Bank of England Mark Carney, Jeff Bezos, and The Wall Street Journal columnist Peggy Noonan. The meetings are not recorded, no notes are kept, and attendees agree not to publicly identify a speaker afterward with anything he or she might say in the meeting. Said Labour MP Denis Healey who was on the Bilderberg Group's steering committee for three decades, “To say we were striving for a one-world government is exaggerated, but not wholly unfair.” With regard to invitations he said, “We make a point of getting along younger politicians who are obviously rising, to bring them together with financiers and industrialists who offer them wise words.”

So, does this shadow elite actually govern the world, making (as some theorists would have it) the politics that dominates the news a mere puppet show to distract the populace? The short answer (for well or ill) is no. They may be influential and they may be (to a degree) like-minded, but their influence is limited. They are thwarted time and again, especially by populist movements, be they mild or radical (in the latter case all too often murderous) and be they of the Left, the Right, or some other (such as religious) persuasion. No wonder “populism” was on the agenda at last June’s meeting. Perhaps, though, history flows their way in the longer run despite many sanguineous setbacks along the path. Whether it flows to Utopia the reader can judge.

As for my opinion, I’ll go along with something said by Henry Kissinger decades ago in another context: “We have always believed that every problem must have a solution and that good intentions must necessarily guarantee good results. Utopia was seen not as a dream but as our logical destination if we only traveled the right road. Ours is the first generation to discover that the road is endless, and that in traveling it we shall find not utopia but ourselves.”


Alanis Morissette – Utopia