Wednesday, December 30, 2015

Chronophobia


Is there really only a single day remaining in 2015? Is it really 15 years (minus a day) since 01/01/2001? Is it really a quarter century since the Gulf War? Is the Vietnam War as distant in time today as World War 1 was when I was in high school? (I remember how remotely ancient that conflict seemed to me.) Is the end of World War 2 really as far back in time today as the Spanish-American War was when I was in high school? How astonishing one finds the answer to each of those questions no doubt depends on one’s current age. To a 15-y.o. 2001 is very long ago: as long ago as can possibly matter except to crusty historians. To a 30-y.o. it is half a lifetime ago. To a 10-y.o. 2015 all by itself is fully 10% of his life – a hefty chunk. To those of us past a certain age, however, something seems terribly wrong with the answers. Surely it was just a handful of years ago that I walked down F St NW in DC with a new diploma in hand wondering what to do now that undergrad studies were over, wasn’t it? Only if the years are equal in number to a handful of dried rice. New Year’s Eve makes one conscious of one’s life clock as does no other day of the year, birthday included.

By coincidence (I think) the two novels I read this week both dealt explicitly with time. Since I was a boy science fiction has been part of the mix of my recreational reading, and it remains so. (Children’s lit of the Dr. Seuss sort aside, the very first novel I ever read was Doyle’s The Lost World and the second was Wells’ War of the Worlds.) Time travel is such a staple of scifi that it is well-nigh impossible to come up with a completely original take on it, but these two at least put new twists in older ideas. They couldn’t be more different from each other.

Split Second by Douglas Richards is a rousing and well-constructed action/adventure scifi tale. Physicist Nathan Wexler believes he has come up with an elegant theory with little real-world application. If his interpretation is correct, quantum effects and dark energy in particular circumstances could allow travel a mere .00004515 second into the past. This seems too short a time period to be of any practical use, but he neglects to consider that this effectively would duplicate an object; the duplicate (in a sense the same object) would be displaced by the distance traveled by Earth in .00004515 second (about 58 feet). Other people do see the implications, and they want to monopolize the technology permitted by the theory before it goes public. Mayhem ensues involving Wexler, his girlfriend Jenna, and a private investigator who finds himself in a bigger fight than he anticipated. Though nearly all scifi requires at least one dubious supposition, the science for the most part is well researched. Thumbs Up, though the philosophical questions the book raises about time are nothing unfamiliar to regular scifi readers.
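The 58-foot figure invites a quick sanity check. A minimal back-of-the-envelope sketch (my own arithmetic, not anything from the novel): working backwards from the book’s numbers gives the speed of the reference frame the premise implicitly assumes – roughly the solar system’s ~370 km/s motion relative to the cosmic microwave background, rather than Earth’s much slower ~30 km/s orbital speed.

```python
# Back-of-the-envelope check of the novel's numbers (my arithmetic, not the book's).
dt_s = 0.00004515          # time displacement into the past, in seconds
displacement_ft = 58.0     # spatial displacement claimed in the novel, in feet

FT_PER_M = 3.28084
implied_speed_m_s = (displacement_ft / FT_PER_M) / dt_s
print(f"Implied frame speed: {implied_speed_m_s / 1000:.0f} km/s")
# Roughly 390 km/s -- far faster than Earth's ~30 km/s orbit around the Sun,
# but close to the solar system's ~370 km/s drift relative to the cosmic
# microwave background, which is presumably the frame the premise has in mind.
```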

Whereas Richards writes of split seconds, Claire North (one of the pen names of Catherine Webb) writes of lifetimes. I suspect her novel The First Fifteen Lives of Harry August was inspired by the movie Groundhog Day, which is perpetually playing on some cable channel or other. (Yes, the irony is hard to miss.) Suppose that there are some few people who – rather than repeat the same day – repeat their lives. They are born over and over in the same year and in the same circumstances as always, but starting at the age of 3 their memories of the last life come back to them; each such person lives a life of normal length, dies, and then finds himself or herself at the beginning of it once again. Harry August is one such person, repeatedly born in 1915 in a train station washroom and dying less predictably, but usually between 1997 and 2003. Time and again his life restarts in 1915. We’ve all thought “if I knew then what I know now.” Harry gets to act on the thought. Naturally he becomes an uncannily sage investor, but after several lives he feels himself grow jaded. He does not seek out the loves of his previous lives. It seems to him that nothing he does can make a lasting difference, since whatever impact he has on the world in one life apparently is erased when his life starts over. Or perhaps not. Perhaps there are multiple timelines and all of them are permanent. Harry meets another “ouroboran” (Ouroboros, the reader will recall, is the serpent that swallows its own tail) whose effort to find out which is true by (in part) accelerating technological development has dire consequences. Whether or not those consequences are permanent, they affect the people who experience them and Harry, after initially helping, has a crisis of conscience.

The First Fifteen Lives of Harry August is moodier and more thoughtful than Split Second and, if you pick only one, it is the better read. The busy Ms. Webb (aka Claire North aka Kate Griffin) already has more than a dozen novels in print, but is only 29. That seems a little early to have grasped the jadedness of the quasi-immortal Harry August as she did. Then again, I was never more age-conscious than when in my 20s. Twentysomethings are fully aware of how few are the years they can consider themselves youths. Youthfulness can last much longer of course, but they are aware of the difference. So, maybe it is just the right age for a novel of this sort. I wasn’t jaded at 29 though. Despite appearances, I was a hopeless romantic then and for long afterward. It never ended well. I’m jaded now. Nonetheless I’d still be happy to give the last several decades a second try.


The Guess Who - No Time

Thursday, December 24, 2015

Four Flicks: Two Nix and Two Picks

Between Christmas and New Year’s Day there may be some time for lazing in front of a TV screen. Below are some thoughts on four of the options.

Safelight (2015)
OK, it had to happen: Juno Temple starred in a bad movie. The busy young actress has had leading or major roles in a remarkable series of indie films in the past several years plus relatively minor roles in big studio productions including The Dark Knight Rises and Sin City: A Dame to Kill For. Indies on her résumé include Little Birds, Dirty Girl, Kaboom, Killer Joe, The Brass Teapot, and Afternoon Delight, all of which have real merit. Quentin Tarantino of all people praised Afternoon Delight in particular. Other indie flicks including Jack and Diane, Magic Magic, and Horns are at least interesting. Even if in the end I didn’t really like them, I don’t regret having spent the time on them. Does no truly bad script ever come Juno’s way? Oops, one did.


“Safelight” has three meanings: it is the name of the town where the main characters live, it is a darkroom light for developing photographic film, and it evokes the lighthouses that appear in the movie. Vicki (Juno) is a stereotypical young truck stop HWAHOG (hooker with a heart of gold) in thrall to her crazy pimp. She befriends Charles, an inexperienced teen with a bad leg who works at the truck stop with his ill dad. For a school project he takes photos of lighthouses on the California coast. Vicki visits the lighthouses with him and they talk a lot about nothing very interesting. Vicki’s story of how she became a runaway is clichéd and tired. Perhaps all this worked as a novel (I haven’t read the book on which it is based) but as a movie it is soporific in the extreme. Now that Juno has had her obligatory miss, she has no need to repeat it. Thumbs down.


Playing It Cool (2014)
21st century screenwriters tasked by a studio with writing a romantic comedy have a problem. For reasons I don’t fully fathom – but which might have something to do with the current state of the gender war – audiences for more than a decade have been too cynical to give credence to old-fashioned romantic love in film, at least insofar as ordinary people are concerned. Maybe if one person is a vampire or an alien or a head of state or traveling backwards in time or some outlandish thing, they might allow the notion as being no more improbable than the rest of it. But for normal folks, forget it. Infatuation yes, but how can that end but badly? Even Disney has doubts: witness Maleficent in which the nonexistence of true love in a romantic sense is an important plot element that is never disputed.

For Playing It Cool, screenwriters Chris Shafer and Paul Vicknair tackled the problem by making their screenplay about a screenwriter (the character is actually listed as “Me”) who is having trouble finishing a romcom screenplay because he himself doesn’t believe in love and can think of no third act that isn’t clichéd, hackneyed, and unbelievable. Naturally he (Chris Evans) falls hard for a girl (“Her”: Michelle Monaghan) who is unavailable, so he tries to have a platonic relationship with her. The third act (can this really be considered a *spoiler*?) is deliberately clichéd, hackneyed, and unbelievable right down to a race against time to stop a marriage. (Harold Lloyd did this best in Girl Shy [1924], and it wasn’t new then.) We the audience are supposed to get that the screenwriters get that these screen conventions are not at all like real life. This movie is not a spoof in the usual sense, despite the protagonist phrasing his love declaration, “I'm willing to regret you for the rest of my life.” Playing It Cool is played straight. Shafer and Vicknair cynically wrote a non-cynical movie that telegraphs the irony to us. Does this meta-romcom work? Not really: too cynical. Thumbs down.

Leviathan (2014)
Some *spoilers* follow. In a bleak Russian coastal town near Murmansk, Kolya lives with his wife Lilya and his son Roma; their home and his business are on the same property. The corrupt mayor Vadim callously uses eminent domain to seize Kolya’s property for purposes that, by benefitting the Orthodox church, also will be of political benefit to himself. Kolya’s old army friend Dmitri is now an urbane lawyer from Moscow, and he tries to help. Dmitri’s legal appeals to stop the seizure, or at least to pay Kolya a fair price, go nowhere. Dmitri has a file on the mayor that he obtained from his connections in Moscow, however, and he threatens to reveal the file’s scandalous contents unless Vadim cooperates. The mayor quickly demonstrates that old-fashioned thuggery is still effective against file-waving lawyers, and Dmitri before long is on a train back to Moscow feeling lucky he is still alive. There are secondary plots involving adultery, teen rebellion, and drunkenness – a lot of drunkenness. There are no happy endings. Corruption rules.

Cronyism benefiting the politically well-connected (“special interests” is the preferred euphemism) at the expense of individuals and individual rights is no rarity in the US, of course, even if most often it is not technically regarded as corruption by most voters. The councils, zoning boards, and regulatory committees which practice this are doing exactly what they have been charged to do by popularly elected politicians. The effect, for those who have been at the losing end of it, is much the same. Nonetheless, the undisguised abuse of power depicted in this movie is grim indeed.

The Biblical and cetaceous references of the title are obvious, but it also calls to mind Hobbes, whose 17th century philosophical work Leviathan defended state authority, which, he argued, whatever its faults was superior to an anarchic state of nature. There always have been governments that cause us to call this into question.

Leviathan won best screenplay at Cannes. Thumbs up.

The Duff (2015)
By the 1930s high school was the majority experience in the US and most other industrialized countries. Since that time, the high school movie has been a recognized genre; almost everyone can relate to it. The movies have long lives: the Brat Pack movies of the ‘80s are still liked by the current crop of teens. OK, High School Confidential (1958) might be too dated to be relatable to current youth, but it is a hoot for just that reason. Adults always have been a big part of the audience for these films (at least on home screens) because the high school experience sticks with us. Most of us forged a big part of our adult identities as adolescents within high school walls.

Mean Girls has been the quintessential film of the type since 2004, but 2015 is not 2004 and each generation needs its own cinematic prime representative. I don’t think The Duff is it, but it is a better than average addition to the roster of high school films nonetheless. The film does a very good job of emphasizing the dominance of smart phones and social media in current teen life. 


Mae Whitman, a young actress reminiscent of Amanda Bynes in her teen years, is fine as Bianca, the lead character. She learns she is the DUFF, the “designated ugly fat friend” in her social circle, used by her friends for social convenience. She has a guy friend who is a jerk on the surface and she has a crush on a seemingly artistic guy who is a jerk underneath. She is tortured by the popular girls led by Madison (Bella Thorne), a character who is drawn just a bit too cartoonishly. I think the reader knows where this is going. You’ve seen high school movies before. But that’s OK, because this one is written and directed well enough. It’s not Mean Girls or 10 Things I Hate about You, but it’s not bad. Sometimes that’s all we ask. Thumbs Up.


Friday, December 18, 2015

Subtitles and Subtexts

English-speakers are famously apt to be monolingual. As English is the world’s most widely spoken second language, they can get by almost anywhere, and they tend to avoid the few circumstances where they can’t. Anglophones are not really lazier by nature than other folks; they just have a greater opportunity to be lazy in linguistic matters, so they are. The laziness extends to subtitled foreign language films which always face an extra challenge at the box office: “I don’t want to read at the movies!” is the gist of the complaint, and it is one I’ve heard many times.

This raises the question of the relative merits of dubbing and subtitles. I think the type of movie makes all the difference. If it’s a guy in a rubber monster suit stomping on a miniature Tokyo or if it’s one of the Hercules movies, choose the dub: it’s less distracting and nothing important will be lost. If it’s Kurosawa or Fellini, opt for subtitles. So too if it’s Rohmer (Pauline at the Beach) or Tykwer (Run Lola Run). There are nuances in language and delivery that no dubbed translation ever gets quite right, and these are often as meaningful as the strict dictionary denotation of the words; it is better to hear them in the original.

Besides, a willingness to “read at the movies” opens the door to marvelous films from all over, e.g. Tangerines (Estonia/Georgia), Timbuktu (Mali), Venus in Fur (France), and Leviathan (Russia) as a few recent examples. A particularly fun one (no art house credentials required) released on DVD earlier this year is the Argentine film Wild Tales [Relatos salvajes].

Quite a lot of the harm in the world is committed not by bullies aggrandizing themselves and abusing others for fun – though plenty of such people exist – but by people who regard themselves as victims. On account of their victimhood they feel completely justified in lashing out in the most disproportionate ways. Bullies don’t commit mass shootings: self-identified victims do. Listen to proselytizers of extreme and violent ideologies: their talk is all about how abused and put-upon they are. That’s not to say they haven’t been bullied: they surely have been. Who hasn’t been bullied? Some far more so than others. That never justifies more than a proportionate and properly directed response. Often it doesn’t justify any.


Damián Szifrón’s Wild Tales has six stories of people who are unquestionably mistreated, but whose reprisals are, to put it mildly, immoderate. (1) "Pasternak": All the passengers on a plane discover they know a flight crewman named Pasternak, and that he has a reason to bear each of them a grudge. (2) "The Rats": A waitress contemplates a creative use of rat poison when she recognizes a customer as the gangster who ruined her family. (3) "The Strongest": Road rage erupts between two drivers on a lonely highway. (4) "Little Bomb": A demolition professional has his life and career ruined when he fights with bureaucrats over parking fines and towing fees. (5) "The Proposal": A wealthy man’s son commits a lethal hit-and-run, which the detective in the case and the man’s lawyer both see as an opportunity for extortion. (6) "Until Death Do Us Part": During her wedding reception, a bride ascertains that her new husband had cheated on her (presumably during their engagement) with one of the guests. She retaliates.

Wild Tales is well-directed, well-constructed, well-acted, and full of graveyard humor. It also has a point, which it doesn’t need to articulate explicitly: the tales themselves say it all. Recommendation: Put on glasses (if you need them) and read the subtitles.

Trailer Wild Tales (2014)

Saturday, December 12, 2015

Getting Graphic

Overall, recreational reading continues its long decline in the US (and not only the US), but there are categories of the traditional paper-and-ink publication business that remain healthy, notably YA literature (discussed in a previous blog: Not So Young Adult), comics, and graphic novels (book-length comic books). Sales this year of comic books and graphic novels should be around 85 million units. That’s lower than the 1950s peak or the 1990s rebound, but the number doesn’t include digital sales; print and digital combined are at record levels. (There are nearly 150,000,000 more people in the US now than in the 1950s, of course.)

In pop culture comics are cool, which they were not even in the ‘50s. Comics once were guilty pleasures, but today no one (well, hardly anyone) is ashamed to flaunt them on coffee tables or to be seen in public in a Comic-Con costume. While many pop phenomena are not explicable in terms of artistic merit, in this case there actually is some to be found. In the past couple of decades there has been some outstanding work from the likes of Neil Gaiman, Alan Moore, Sydney Padua, and others. As a reader I’m no expert swimmer in this literary pool, but I do dip a toe in the waters now and then. Two recent graphic novels by authors I respect found their way into my Amazon shopping cart last week, and I don’t regret the expense.

Canadian graphic novelist Bryan Lee O’Malley is best known for the critically acclaimed Scott Pilgrim, a comic book series published between 2004 and 2010 about a young slacker and rocker in Toronto. The surreal videogame-influenced series was adapted for the screen in 2010 as Scott Pilgrim vs the World starring Michael Cera. Despite positive reviews, the movie did poorly at the box office. The reason is something of a mystery to me and no doubt to Universal Pictures. Millennials were the prime target audience but they stayed away in droves. The ones with whom I’ve viewed it liked it without exception, but it is odd that I (a Boomer) introduced it to them and not they to me.

Last year O’Malley published the equally remarkable graphic novel Seconds, a copy of which arrived at my door last week. Premise: Katie is the chef (though not owner) at the successful restaurant Seconds. She has plans for her own restaurant but faces serious obstacles. Besides her professional problems, she is not over a break-up with her ex-boyfriend Max. The restaurant Seconds, she discovers, is inhabited by a house spirit named Lis. Katie is given the chance to undo her past mistakes by writing the mistake down, ingesting a special mushroom, and going to sleep. When she awakens her reality is changed: the mistake didn’t happen. Katie can’t resist using the technique over and over to try to make her life perfect, but it turns out (mild *Spoiler Alert*) that she can’t change one aspect of reality without changing all of it: the universe is much too interconnected for it to be otherwise. A “mistake” may be undone but the new branch of reality in toto is not necessarily an improvement. Katie’s memories remain unchanged, so she becomes increasingly alienated as the history of each new reality is ever more distant from the original she remembers.


Seconds is a contemplative and wistful graphic novel. As far as I know a screen version isn’t in the works. Given the disappointing ticket sales of Scott Pilgrim vs the World (which at least was action-packed) getting a green-light from a studio might be a challenge. Nonetheless, the book is hard to put down once opened. We’ve all made mistakes we wish we could undo, but we seldom consider that we would be different people today if we had made another choice then. It’s not possible to undo the past, of course. We sometimes do get second chances in life, but we never thereby erase the first one. All our do-overs come with a history, and all we can do is make the best of it.

Scottish graphic novelist Mark Millar and American illustrator John Romita, Jr., both veterans of Marvel, made a splash during 2008-2010 with their controversial comic book series Kick-Ass. The premise: high school nobody Dave Lizewski has no special powers at all, but he is a comic book fan who dreams of being a costumed crime-fighting superhero. So, he buys a costume and becomes one (minus the “super”). He soon discovers that there are others who have done the same thing, and that, unlike himself, these other costumed vigilantes are truly lethal.

Mainstream comics in the West, though often violent, tend to pull punches when depicting brutality. The very graphic Kick-Ass doesn’t. The result is impressive, disturbing, and most definitely not-for-kids. The series is also quite funny, though the humor is very much of the graveyard sort. Millar and Romita followed up the series with the sequels Kick-Ass 2 and Hit-Girl. (Millar also had unrelated success with The Secret Service, which was adapted for the screen as Kingsman.) In 2015 Millar and Romita released the graphic novel Kick-Ass 3. This arrived at my door in the same package with Seconds. In Kick-Ass 3 the imprisoned Hit-Girl seeks the help of Kick-Ass and other costumed not-so-superheroes to get out. She still has in her sights the mob and a corrupt police department.


Millar and Romita haven’t lost their touch. If you have Kick-Ass, Kick-Ass 2, and Hit-Girl, it is definitely worth rounding out the set with Kick-Ass 3, which marks the end of the series. A prequel might be in the offing however. The movie Kick-Ass (starring Aaron Johnson, Nicolas Cage, and Chloë Grace Moretz) was more successful at the box office than Scott Pilgrim. It wasn’t a huge hit, but it was well regarded critically (with exceptions: Roger Ebert hated it) and was big enough to spawn the sequel Kick-Ass 2. Those who have seen the movies but not read the comic books might be surprised to learn that the movies are substantially tamer – really. They also differ somewhat from the comics in backstory and subplots, such as (in the first entry) the relationship of Dave and Katie. Apparently a Kick-Ass 3 movie is in the works, but it is at a very preliminary stage.

Both Seconds and Kick-Ass 3 are recommended for that coffee table, perhaps next to the copy of SPQR that I recommended in the last book review. I suspect the first two will inspire more conversation with guests.

A do-over with a history:
Do It Again by the Beach Boys

Tuesday, December 8, 2015

Blue Blazer Time

Digital photographs are marvelous: easy to access and easy to share. They also are easy to lose. They might stay on the cloud forever but that doesn’t mean we can find them. (What was grandma’s password? Have you seen the red flashdrive anywhere?) So, I’m still old fashioned enough to like hardcopies for a physical photo album. Not every single saved digital photo is print-worthy, of course. Not one in ten is. But I print just enough to document various events, the passage of years, and my own progression toward dotage.

I got around to printing a page of pics this morning, which included one of me in a uniform of sorts. (Not coincidentally, though lacking the pocket patch, it was pretty much my prep school uniform of 50 years ago.) It occurs to me that I wear precisely the same blue jacket and red tie both to weddings and to funerals, though I intend no irony or commentary by the practice. The same style, that is: the original jacket, if I still had it, wouldn’t fit. Every year since I was 25 I’ve attended at least one of those events – sometimes both. In recent years, life being what it is, funerals have outnumbered the weddings, but in 2015 I attended one of each.

Neither event is really my story to tell, so beyond this unspecific mention of them I won’t. On a strictly actuarial basis, though, it’s good to know the odds for the wedded couple are in their favor. After hovering around 50% during the 1970s-90s, divorce rates in the US in the 21st century have dropped to levels not seen since before World War 2. Why? Apparently, iffier couples are forgoing vows in the first place, so only more secure couples still walk the aisle. (Disclosure: my one and only marriage lasted 3 years, ending in 2001.)

This might be the last wedding I attend in a while (maybe ever), for the hetero marriage rate has gone off a cliff and is still in free fall. The demographics of marriage have shifted too. The median age of first marriage is higher than it ever has been for those who bother at all: the number of lifetime adult singles is the highest on record. Prior to the current century, women with college degrees were the least likely to marry; now they are the most likely. Yet, that stat by itself is misleading. The marriage rate for this group has not gone up; it has gone down. It just hasn’t gone down a lot, whereas for everybody else the rate has collapsed.

For those who seek economic explanations for social trends, the sorry financial state of men at present is a candidate. True enough, men still predominate among CEOs, corporate directorships, and governorships just as they predominate within prisons and halfway houses: more men than women inhabit both tails of the bell curve. CEOs are doing great. But most men are not CEOs, nor are they homeless. The typical male experience is quite different; the middle 80% of men are losing ground. Median male wages peaked 40 years ago in real terms and continue to decline – and not by a small amount. According to Time, men’s wages are down 20% since 1980; moreover, the male labor participation rate is lower than it ever has been. (The current ratio of employed men to employed women is 91:100.) Fewer attend college. Men make up only a third of undergraduates, and women in 2015 earned more degrees at every level up to and including doctorates. In short, modern men are, by and large, lousy prospects, for despite the ongoing increase in female earning power “a secure job” still topped the female list of requirements for a marriage partner according to a 2012 Pew study.

If one is inclined to dismiss economic explanations, however, there are other factors at work, too. Modern social media have vastly improved communication and understanding among people; the results have not been good. We understand what we hear and we don’t like it. Unexpectedly, the communications revolution has deepened partisanship in all parts of life, including the age-old gender war. It’s hard to say how much this matters: there always has been much fraternization across battle lines and there always will be, but it matters some. Perhaps more important is the retreat from real world (“meatspace”) interactions with other people into virtual ones. It’s common for people to share more with online friends whom they never meet than with those who show up in person. It’s not just marriage as a legal formality that has diminished: twosomes of the informal kind are less common too. Millennials in particular date less and opt to live together less than did Xers and Boomers at their ages. In Gallup’s own words, “Gallup's data reveal that young adults are not simply swapping marriage for living together, but rather staying single longer.” They are “less likely to be making the more serious commitment associated with moving in together – whether in marriage or not.”

Twilight Zone: "Two"

Well, I can understand that. There is something to be said for the freedom to crank up one’s stereo at 3 AM (assuming you don’t have thin walls and close neighbors) without having to justify the choice to a housemate. There are endless perks to being single, mostly in the form of not having to negotiate every aspect of life. Being single is, in my (long) experience, relaxing.

What about love? OK, if you want to get picky. La Rochefoucauld’s remarks on the subject notwithstanding, I suppose that might be a reason for some folks to turn down the stereo – or at least to use earphones.

Andrews Sisters – Apple Blossom Time (1941)

Wednesday, December 2, 2015

The Game Is Up

From the earliest days, movie producers saw the potential profit in sequels, series, and franchises: Tarzan, Nancy Drew, Frankenstein, etc. Some were designed as a series from the start while others spawned sequels only after the success of a stand-alone film (e.g. The Thin Man). Then there are the serials so popular in the 1930s and ‘40s that played before the main feature: short films with cliffhanger endings and a continuous story arc such as Flash Gordon, Batman, Green Hornet, and the late entry (1952) Commando Cody. George Lucas famously was inspired by these for his Star Wars series. Star Wars coincided with the arrival of home video players, which made discretionary home binge-watching possible for ordinary folks. The home-binge option has made serials more prevalent than ever. High among the sources to which Hollywood has looked for scripts have been comic books and Young Adult fiction.


This year three series based on YA books (Divergent, Maze Runner, and Hunger Games) with broad similarities have installments in theaters. The Hunger Games: Mockingjay Part 2 is the monster hit of the three and the final installment of its series. Though she certainly has made her mark in other films, Jennifer Lawrence became a superstar in The Hunger Games. I try to keep up with at least some pop-culture phenomena, so Monday night I went to a nearby multiplex to see The Hunger Games: Mockingjay Part 2. (I noticed only when I got home that the Millennial cashier had, unasked, given me a senior discount: sigh.)



For those who have read the books, The Hunger Games: Mockingjay Part 2 follows the plot pretty faithfully, with only such simplifications as are sensible to maintain cinematic pacing. Nonetheless, like the other installments, it is a remarkable visualization of the novels and worth seeing for that reason alone. Those who haven’t read the books but who have seen the previous three films know basically what to expect, and shouldn’t be disappointed. Anyone who hasn’t seen the first three, however (a caveat that applies to most series), will be completely lost. Recommendation for newbies: binge-watch the first three before setting foot in the theater.

On one level The Hunger Games series is a teen-oriented adventure, but it really is more. It is deeply cynical on many levels, the political being just one of them. Katniss is not a typical heroine. She owes her hero-status to media-hype and she knows it. She is not a very nice person and she knows it. She is humanly inconsistent: she is willing to sacrifice for others, yet also is willing to sacrifice others for herself. She does self-reflect enough, however, to question her own moral choices and those of her friends. Should we conveniently excuse ourselves and our allies for acting the same way as our enemies just because it is a means to an end? What are the limits of loyalty and what is betrayal? At what point is victory too costly? It is unusual for a YA-based series to ask such questions and more unusual to offer the answers this one does.

Contrary to popular opinion we do not live in cynical times. These are partisan times, and partisans are true believers. It is not cynical to believe the worst of one’s opponents. That’s just toeing the party line as a true believer. Cynicism in the good sense involves recognizing unsavory natures and motives in oneself and one’s allies while seeing the goodness in one’s opponents – yet still making an informed choice among the shades of gray. So, the success of The Hunger Games with its more complex world view is surprising and encouraging.


Thumbs up, but not as a stand-alone movie: See the others first.

Essence of The Hunger Games world view:

Wednesday, November 25, 2015

Bearding the Empire

When regarding ancient history it is common for Westerners to admire the Greeks but identify with the Romans. For all their cultural achievements (or perhaps because of them), the classic Hellenes strike us as truly ancient. Not so Rome, which by contrast is eerily familiar. Despite the passage of two millennia, Rome at the time of Augustus somehow seems hardly more alien than the 19th century of our own country. To be sure, there are elements of Roman life that are strange or that shock us, including gladiatorial games and casual brutality, but then again aspects of 19th century American life shock us too.

Despite the vast amount of Greek and Latin literature that has been lost – most simply having rotted away uncopied in the Middle Ages – quite a lot survives: history, epic poetry, fiction, epigrams, drama, rhetoric, and more. The plays of Plautus and Terence read like modern sitcoms (in fact, their plots have been stolen repeatedly for modern sitcoms), Suetonius is as gossipy as TMZ, and Cicero is as bombastic as any US Senator. Anyone interested in Roman history is well advised to visit the basic original sources: Livy, Polybius, Plutarch, Tacitus, and others. (All these authors once were standard fare in secondary school, but no longer.) However, for the modern reader, who more often than not has only a passing acquaintance with Classical civilization, a standard history textbook is a useful accompaniment if only to keep the original sources in proper context. There are plenty of texts from which to choose, and new ones are published regularly. Some are little more than simple chronologies while others are thematic, the most ambitious of the latter still being Edward Gibbon’s 18th century six-volume The Decline and Fall of the Roman Empire. Much more concise, but still tending to the thematic, is a new (2015) treatment by Mary Beard, classics professor at Cambridge: SPQR: A History of Ancient Rome.



Why do we need another one? As Beard explains in her prologue: “It is a dangerous myth that we are better historians than our predecessors. We are not. But we come to Roman history with different priorities – from gender identity to food supply – that make the ancient past speak to us in a new idiom.” Mary Beard’s book is a solid and readable addition to the literature, and one with 21st century priorities. I would not recommend the book to those whose only exposure to the classics is the movies Gladiator and Spartacus. Beard writes for a reader who has at least some prior sense of the general outline of the history of Rome, and when she mentions an author such as Plautus, Juvenal, or Pliny the Elder she assumes the reader has some idea who the person is. For readers with at least this much background, however, she offers an interesting perspective in engaging prose.

Beard’s arrangement of the material is not strictly chronological. She starts in medias res with the conspiracy of Catiline (63 BC), relating it to the modern style of politics, and then backtracks to Rome’s earliest years. Her theme is that there was nothing inevitable about the rise of the city and the Empire. Much of Rome’s success was a throw of the dice that easily could have come up snake eyes. To the extent the Romans made their own luck, however, it was by being adaptable to changing circumstances. The odd Roman mixture of ruthlessness and inclusiveness (slaughter your enemies but give the survivors citizenship) was particularly effective. For all the complexity of the late Roman Republic’s unwritten constitution, the Romans weren’t much interested in political theory other than a nod to libertas – sometimes little more than a nod. They distrusted a concentration of power but weren’t committed to democracy or aristocracy or to some particular mix; they altered their government to suit the needs of Empire. If in the end a concentration of power happened anyway, that too is relatable in the 21st century.

The Romans still matter. They are old family, and they largely inform who we still are. Thanksgiving weekend is an especially apt time to encounter old family. It’s also a time for a very Roman bout of overeating, though I trust most tables are a little less extravagant than Trimalchio’s.

The crude nouveau riche braggart Trimalchio hosts dinner in Satyricon, Fellini’s adaptation of Petronius’ 1st century novel


Wednesday, November 18, 2015

Jolly


Thanksgiving is nearly upon us and the “holiday season,” nowadays reckoned as stretching from Halloween through New Year’s Day, is here. ’Tis the season to be jolly. So why are so many publications currently offering advice on “Holiday Depression”? It seems that this is an all too common problem. Nor is it a new one.

It once was popular folklore that suicides and murders peak between Thanksgiving and Christmas. This, as it happens, is untrue: the rates for both actually drop. In recent years we have been treated to numerous seasonal news articles cheerily debunking the old myth. However, few of those articles mention that suicides spike 40% (source: healthline.com) in the days immediately after Christmas, which are still part of the season. Nor do they usually mention that, while murder may be less frequent, the risk of dying from all causes, including cardiac arrest and accidents, is in fact higher in the holiday season than during the rest of the year. Less severe crimes than homicide rise alarmingly: fraud, identity theft, burglary, and scams of all kinds among them. Being a crime victim can depress anyone. Yet this is not high on the list of reasons the season is hard on some people.

What is? Unresolved family issues loom large, and they have a way of re-emerging at family gatherings. There are worries about overspending and stresses from overscheduling. There is the recognition that our lives are not as idyllic as a Norman Rockwell painting. The end of the year also brings to mind thoughts of aging, mortality, and missed opportunities. Furthermore, some psychologists argue (no joke) that many adults have a lingering unacknowledged sense of loss from the bad news about Santa Claus. Of course, many of us (all of us who are old enough) have experienced real losses: loved ones who are absent from the table. My mother, for example, though never in a general way depressed by the season, after my sister died in 1995 found it impossible to listen to Elvis’ Blue Christmas, which turns up frequently on the radio this time of year. She always changed stations or turned the radio off.

None of that seems to offer much reason to smile, but in truth there is no more cause to be down than at any other time of the year. As in other aspects of life, so long as we are aware of the potholes ahead we are less likely to step into them. There are pleasant aspects to the season too. Remaining friends and family are likely to be present, for one thing, and presumably we like some of them. If that’s not enough, keep in mind it will all be over January 2, a day to which we can look forward. That’s the day we break our New Year’s Resolutions, and that always is fun.


Blue Christmas – Elvis Presley

Friday, November 13, 2015

The Utopia around the Bend

In previous blogs I’ve offered several possible reasons for the popularity of apocalyptic fiction, but one reason might be just a wish to escape from an unsatisfying existence, even if the total obliteration of civilization is a little hard on one’s neighbors. Of course, escape can be had less destructively by going elsewhere. In Interstellar elsewhere is another solar system in another galaxy. In Tomorrowland it’s another dimension: one which holds a lesson about self-fulfilling prophecies. Sometimes, though, the destination can be more mundane, and this is the case in the teen drama Paper Towns; like so many recent movies, it is based on a YA novel. In Self/less, however, the destination is more extraordinary. I watched both movies this week.

Paper Towns (2015)
The title refers to a copyright device used by commercial mapmakers. General information cannot be copyrighted, so to protect their work publishers commonly invent nonexistent towns. In other words, they give some spot in the middle of nowhere a name, and print the name on their maps. Fiction can be copyrighted, so if this fictional name pops up on a competitor’s map an action for copyright violation can be filed.

Plot: In Orlando, Florida, Quentin at age 9 notices the unconventional girl Margo the day she and her family move into a house across the street. Margo is restless, pensive, and adventurous. Quentin has a crush on her and so he sometimes joins her on her strange adventures, which often involve breaking and entering. She drifts away from him, however, and by high school she has a circle of popular friends that doesn’t include Quentin. Yet one thing doesn’t change: dissatisfied as ever with life as it is, she seems always to be searching for something. One night near the end of senior year she knocks on Quentin’s window, something she hasn’t done for years, and induces him to join her on a night of prankish revenge; the targets are her circle of “friends” who, she believes, betrayed her trust. The next day she doesn’t appear in school and Quentin learns that she has left home. This is something she has done before at various times, so her parents are more exasperated than worried. Besides, at age 18, she is free to go where she wants. As was always her habit, she has left obscure clues to her whereabouts. Convinced he is in love with Margo, Quentin follows the clues and concludes she has gone to a paper town in upstate New York. Quentin and four other classmates drive north to find Margo.

The bulk of the movie is the road trip with Quentin and friends. The film is reminiscent in an odd way of the classic 80s Brat Pack movie The Breakfast Club in that it is mostly teen characters verbally expressing their teen angst and desires. Does Margo find what she seeks at the end of the road? Do any of the characters? Is Quentin really in love, and if so is it requited? Is location the real issue? The answers are spoilers, so the viewer, if interested, can watch the movie to find out.

Not all teen movies transcend their target demographic, and this is not one that does. I suspect teens, by and large, will like it, but adults might find themselves looking at their watches. I did.


Self/less (2015)
One way to escape (if you can figure out the technical details) is to leave your own body behind. I suspect Tarsem Singh’s Self/less was inspired primarily by the 1966 scifi drama Seconds starring Rock Hudson, though strictly speaking it isn’t a remake (and the ’66 flick is better); the transfer of consciousness is a plot device used in many books and movies. [I employed it in a couple of my own short stories, including Graduation Day.] Self/less has a dismal rating on Rotten Tomatoes. It is not really as bad as all that, but time and again it missed opportunities to be better.

Damien (Ben Kingsley) is an aging one-percenter with mere months to live. He learns of a secret process called “shedding” developed by an eccentric scientist, and pays a quarter of a billion dollars for a new body (Ryan Reynolds) and a new identity. The transfer of his identity into the new body is a success and Damien, now going by the name Edward, still has plenty of resources to live the life of a rich playboy, which for a while is what he does. He has been told the new body was genetically engineered and grown in a lab, but when Edward fails to take his medication he gets flashbacks and realizes the body belonged (or perhaps, properly, belongs) to someone else. He clandestinely learns more about his host, and then deliberately seeks out the wife and daughter of the corpus’ previous occupant. This endangers the secrecy of the body-swap organization, which responds violently, leading to car chases and flying bullets.

Part of the problem with the script/acting/direction is that the Damien we meet at the beginning of the movie shows no inclination toward the sort of selfless heroics that he demonstrates as Edward. It is hard to regard him in any way as the same person. On the contrary, the original Damien strikes us as someone who would take his medication for suppressing flashbacks and not worry too much about the source of his new body. The movie hints at several philosophical questions about mortality, wealth, and morality, but doesn’t ever do more than hint. The scriptwriters and actors perhaps would have been better either to explore those questions or to take a more lighthearted approach as in, for example, Face/Off. As it stands, Self/less is somewhat somber for an adventure film, and, for all its potential, is no more than OK.

Damien extended his life by relocating to another body, and arguably that is reason enough to do it. As was the case with the traveling teens of Paper Towns, however, the new location is not necessarily a recipe for happiness. If you can’t be happy in your own skin, you’re not likely to be happy in someone else’s.

Frank Sinatra - Under My Skin

Sunday, November 8, 2015

Roller Recap

Regular visitors to this blog will not be surprised to read I was trackside last night in Morristown for the women’s roller derby bout between the local Jerzey Derby Brigade (JDB) and the Wilkes-Barre Scranton Roller Radicals.

It was an exciting match in which the score seesawed between the teams until the final 10 minutes. In the first half, whenever one team would build a lead it quickly would vanish in a single power jam. Apocelyse, Lil MO Peep, and CaliforniKate had particularly successful power jams for the JDB while Veronika Gettsburger and Liberty Violence did the same for the Radicals. Both old-school blocking and some of the newer formation defenses were on display by both teams. Fifteen minutes into the bout the score stood at 67-60 in favor of JDB; at halftime it was 116-129 in favor of the Radicals. In the second half the Radicals opened a 50-point lead. This owed much to Gettsburger, who had several multi-pass jams despite being taken down hard in one by Beast Witherspoon. The JDB chipped away at that lead as the clock ran down, and with 3 minutes remaining the score was 190-210. Despite redoubled efforts by both teams, the Radicals held on and took the win with a final score of 201-227.

MVPs were #22 Apocelyse (jammer) and #13 Smiley Cyrus (blocker) for JDB, and #81 Veronika Gettsburger (jammer) and #1200 Liberty Violence (blocker) for the Roller Radicals.


Friday, November 6, 2015

The Solarian Solution

At all times the peculiarities of the younger generation are cause for concern to everybody else, so currently it’s the Millennials who just can’t catch a break. Their work habits, debt levels, and living arrangements are scrutinized and disparaged in the press, on film, and in fiction. Millennials appall some commentators with their supposed hook-up culture while worrying others for not canoodling enough. Just this morning I encountered an article by Millennial author Caroline Beaton at Psychology Today titled “Why Millennials Are Failing to Shack Up: One reason Millennials are marrying later and having sex less.” It joins similarly themed articles in Rolling Stone, The Huffington Post, and even Forbes. The articles often have over-the-top titles such as “Tinder and the Dawn of the ‘Dating Apocalypse’” in Vanity Fair.

Maybe it’s not really an apocalypse. Besides, broad-stroke portraits of an entire generation are bound to misrepresent large parts of it. Nonetheless, it is apparently true that, statistically as a group, Millennials date less, have less sex, and start having it later than did Xers or Boomers at their age. The “one reason” offered by Beaton (not altogether implausibly) is an excess of apparent choice offered by online dating profiles; they prompt anyone looking through them always to think they can do better. Other articles, at least in regard to hetero dating, note the lack of datable young men: two out of three college students are female, and not all of the remaining one-in-three are “datable” due to other factors, such as that most males are effectively broke. (The top-earning 20% of men are doing better than ever, but the rest have seen steady declines in real income.) Other articles refer to a particularly sullen state of the gender war while still others claim to see an odd sort of neo-Victorianism.

I know nothing of any of this, but it is hard to miss one huge difference from when I was 20: communications technology. That in turn brings to mind a particularly prescient scifi novel by Isaac Asimov published in 1957 titled The Naked Sun. The setting is the planet Solaria. On Solaria individuals live isolated on enormous private estates, but everyone has spectacular communications. Solarians socialize with each other via holographic telepresence, a sort of Skype on steroids. Solarians are utterly immodest while communicating in this VR way. Yet, not only do they prefer keeping physical distance from each other, the very thought of in-the-flesh face-to-face contact with a human being is repulsive to them. Procreation is handled scientifically and antiseptically while robots raise the offspring elsewhere – actually that last part sounds like a good idea. As for carnal desires, robots are available for those too.

We already are halfway to being Solarian in our communications. Have I not seen Millennials in the same room text each other, preferring this to talking? All we need now are better love-bots than the underwhelming models currently on the market; then we can forget about dating altogether. Yes, it would be a form of autoeroticism, but, to steal a line from Woody Allen, that is “sex with someone I love.” Many of us don’t really like other people very much in person, it seems, so, with such a ready market, we might get there soon, and Millennials are well positioned to arrive first.


Friday, October 30, 2015

Not So Young Adult


The final installment of The Hunger Games is expected to fill movie theater seats starting a few weeks from today. While I don’t explore every pop culture phenomenon – there are too many and more than a few of them are off-putting anyway – this one to date has been relatively painless. So, in order to give it a closer look, a few days ago I picked up the trilogy of novels by Suzanne Collins on which the movies are based. More on that in a moment.

It is hard to miss just how many recent movies have been based on Young Adult (YA) fiction. There is good reason. Teens are a prime movie demographic and YA is the one part of the fiction market that remains strong; if teens buy a book they’ll probably see the movie adaptation. The rest of the book market is suffering. Per capita sales of adult-oriented books – fiction in particular – continue the decline that began half a century ago. According to a Huffington Post survey, 42% of adult Americans didn’t read a single novel last year. 28% read no books of any kind and another 25% read between 1 and 5 – including such stuff as diet books. Since people notoriously lie to pollsters to make themselves seem more praiseworthy (we know from liquor taxes collected that they understate their alcohol consumption by 50% for example) it is likely the real figures are more dismal yet. Sales of novels aimed at “young adults” (tweens and teenagers), however, are not only holding their own but rising. So are modern teens avid readers? Not exactly. They remain a healthy market, to be sure, but the increase in sales comes from older readers: 55% of YA book readers are not “young adults” but actual adults. So, have adults not only cut back on their reading but dumbed down their selections? Fortunately no, because YA novels are not what they used to be.

Up until the 1960s the category usually was called “Juvenile Fiction,” but that was when “juvenile” was a word that still had an edge to it.  “Juvenile delinquent” in the 1950s evoked a scary image of a 17-y.o. mugger or gang member. By the mid-60s the term had softened enough to evoke a 10-y.o. toilet-papering a neighbor’s bushes. Teens, accordingly, disdained “juvenile” and the publishing industry obliged by adopting the more flattering “young adult” label. Whatever you call it, it has a long pedigree that includes the Nancy Drew mysteries of the 1930s and Robert Heinlein’s scifi novels of the 1950s. Even early on, a few examples were recognized as quality-lit, e.g. Salinger’s Catcher in the Rye, Golding’s Lord of the Flies, and for that matter Twain’s Huckleberry Finn. But these were the exceptions.  In general novels in the category were widely regarded by adults as kids’ stuff, and few people past high school bothered with them. This changed in the 90s, and Harry Potter had a lot to do with it. Critics and adult readers took notice that some of the most imaginative and remarkable stories being published were YA. To be sure, much in the category, such as the Twilight series, remains unreadable for many past the age of 18, but the best material is very good indeed. Meg Rosoff’s How I Live Now, Stephen Chbosky’s The Perks of Being a Wallflower, Libba Bray’s satirical Beauty Queens, F. Paul Wilson’s Jack trilogy, among others, all have literary merit.

What distinguishes YA from adult fiction? YA, naturally enough, has teen heroes and heroines – college-age 20-somethings at a stretch. The vocabulary and grammar tend to be clear, straightforward, and simple; if there is a semicolon anywhere in The Hunger Games I missed it. (Kurt Vonnegut, who in his novels had an idiosyncratic but simple style, groused that the only reason to use a semicolon is “to show you’ve been to college,” so I imagine he would have approved.) There are plenty of hormones, but the sex is usually (in movie terms) PG-13; there are some R-rated examples, though. The hero/heroine faces some challenge and typically there is some oppressive authority to be overcome with derring-do. Each individual novel tends to be short, though a full series can be lengthy. Most importantly, the novels deal with the peculiar mindset of teens who are in the last stages of forming their adult selves. We’ve all been there. If we have any memory at all we all can relate. Besides, beginnings and endings are usually more interesting than middles, and endings tend to be depressing, so tales set in that age range retain a special appeal.

There is more to the adult appeal than just a peculiar kind of nostalgia, however. Adults always have envied teens their youth while being alarmed by their behavior. All of us still have a surly rebellious teenager inside. A tale in which that rebellion prevails remains satisfying on some level, and also a bit unsettling.

I don’t suggest that YA should dominate the reading lists of adults. I wouldn’t recommend anyone give up Fielding, Dostoevsky, Hemingway, or Nabokov. Authors who write for adults in adult prose are the heart of literature. But not all our recreational reading need be as deep as all that. One shouldn’t be embarrassed to include YA in the mix.

As for The Hunger Games trilogy, Suzanne Collins deserves her success with this series. It is well-crafted, never loses sight of its teen viewpoint, and is cynical without being hopeless. I won’t recap the plot, which the movies follow pretty closely, but the political message is anti-authoritarian, which is a natural parallel to teen rebellion against adults. Before catching the last installment of the movie series, I recommend reading the novels; they definitely will enhance the experience.

My Chemical Romance: Teenagers


Friday, October 23, 2015

“O sweet and lovely wall, Show me thy chink”

“We build too many walls and not enough bridges” – Isaac Newton
“Something there is that doesn't love a wall” – Robert Frost
“As far as death is concerned, humans live in a city without walls” – Epicurus
“Wal-mart... do they like make walls there?” – Paris Hilton



The stucco on the retaining wall in back of my house succumbed to 40 years of NJ weather – particularly winter ice – and detached itself from the blocks beneath. As the stucco was embedded in steel mesh lath, it came off in a huge sheet of alarming weight and size. Once it was down I had to break it into manageable pieces with sledgehammer and wire cutter. I finished the re-stucco job yesterday (the dark patch is still wet in the pic). The stones in the photo, plus others out of frame, are for a buttress or two like the ones already flanking the stairs; I plan to add them in order to help delay a repeat of the event.

Remains of Nineveh wall
People have been building walls for about as long as they have been building anything. I don’t mean walls that hold up a roof. I mean exterior structures intended to keep something out, keep something in, or both. A brush kraal surrounding huts, for example, keeps wild animals out and domestic animals in…or at least it reduces the number of intrusions and escapes. My retaining wall keeps dirt out from where I don’t want it. Far more often, though, walls are there to block people. People being what they are, as soon as any of us acquires something worth having, others will want to take it away. Walls are the first line of defense, whether of a private estate or a whole community. The earliest cities in Mesopotamia had them. By the time of Assyria’s ascendancy many were truly formidable. Nineveh had a 6 meter (20’) high stone wall topped and backed by a 10 meter (33’) high and 15 meter (49’) thick mud-brick wall. Much of the wall, especially the stone, survived 2700 years until it encountered modern explosives: anti-government forces currently occupying the site in Iraq intentionally blew up large parts of the wall earlier this year. [Shameless self-promotion: Dressed to the Nineveh, one of my short stories, is set in ancient Assyria.] Some walls protected whole territories, most notably China’s Great Wall and the much shorter but still impressive Hadrian’s Wall.

In an older blog Ghosts of Dwellings Past I wondered how private permanent homes affected the relationship of individuals and their immediate families to the community. How were sensibilities altered by private space? A similar question applies at a more social level to a town wall, for even as a wall keeps others out it very much keeps the insiders in – even if, in principle, they are free to walk out the gate. It causes the insiders to pile atop one another more tightly than they otherwise would, it requires them to make arrangements for basic services (above all, water and sanitation), and it requires public arrangements for settling disputes. It requires a polity. This is so even at the level of a few houses in a kraal but is emphatically so at the level of a town or city. How much did walls promote a sense of community early in history and prehistory? How did they shape civilization? Perhaps they were every bit as important as the agricultural revolution on account of how they affected the minds of the people within them.

It may seem that we no longer bother as much with walls, but we do. Modern firepower diminished the effectiveness of walls as defenses against armies, so we don’t use them as much for that purpose, but they still have a function against the less well-armed. For 28 years the Berlin Wall did the job for which it was designed: not perfectly (some 5000 made it through, over, or under) but it didn’t have to be perfect to be fundamentally effective. So too with prison walls. Walls to stem immigration at the borders exist, are planned, or are subjects for debate in various countries including the US. Do good fences make good neighbors? Perhaps. Perhaps not. But they aren’t disappearing anytime soon.

My wall, meantime, has no role in holding back outside hordes, armed or unarmed. It holds back dirt. In this it ultimately will fail. But I trust that after my repairs the failure will be long after I’m gone.

Sunday, October 18, 2015

Time Warps on Cloud Nine

The people of every age regard their own era as a hotbed of vice. They are always right. Every place and time has its own sexual taboos and requirements: a prevailing list of what one must or mustn’t do or say with whom and how. The modern Western world is rife with them, including much under the heading of PC. Nor need the “prevailing” list be the same as the traditional one: what we regard as traditional mores haven’t prevailed in the West for a long time. There always are people who voluntarily violate them: people who are unwilling or unable to be constrained, pigeon-holed, or compliant. They are the coals in the always glowing hotbed of vice. That is not to say there is no fundamental difference between time periods. Laws, openness, and flavor vary enormously; what is celebrated in one era might be hidden or outlawed in another. Sometimes in the same time and culture a few miles make a difference: age of consent laws vary state by state in the US for example, so teenagers in love might be just a happy couple in one state but criminals over the border. Nonetheless, hidden or open, the same range of behaviors exists everywhere. The word “range” is important at an individual as well as a social level. At a time when so many people seem intent on narrowly categorizing themselves, it’s worth remembering that on the classic 0-6 Kinsey scale of orientation very few people score a 0 or 6 whether or not they choose to acknowledge or act on their mixed attractions.

It is hard to find more contrasting lists than at the bookends of the century between 1879 and 1979. For those not old enough to remember the 1970s – trust me on this one – the zeitgeist was much more open than today. (I’m speaking of atmosphere: a number of laws, I acknowledge, lagged behind.) Playwright Caryl Churchill caught the transition perfectly in her play Cloud Nine, which I first saw at the Lucille Lortel Theater in Greenwich Village in 1981. [Side note: culturally “the 70s” slopped over into the chronological 80s for a couple years just as culturally “the 60s” stretched all the way to 1974 when hippies took off their headbands and put on disco shoes.] Act I is set in 1870s British Colonial Africa; Act II is in 1970s London.  Although a century has passed between acts, the characters in the first act reappear in the second aged only 25 years. This chronological disconnect is not just playwright’s whimsy: Victorian mores and ideals, while finding their fullest expression in the 19th century, really did last until the middle of the 20th. Most adults in the 1970s grew up heavily influenced by them, so the time bending is a neat device. Some of the roles were cast cross-gender, cross-age, and cross-race. Adultery, incest, pederasty, sadomasochism, bisexuality, and homosexuality are present in both ages, but guiltily covered up in the first act while openly practiced in the second. Yet, love is no easier for the characters in Act II than in Act I. They are just as confused about what they want as ever. Victoria’s feminist husband Martin in Act II for all his pandering (or rather on account of it) is every bit as annoying as Betty’s patriarchal Clive in Act I.

The Atlantic Theater Company currently has an off-Broadway revival at the Linda Gross Theater on 20th Street. I was uncertain how well the play would hold up after 34 years. The answer was it holds up beautifully, even with the circle-in-the-round bench seating that is clever but a tad uncomfortable for 2 hours and 40 minutes. Cloud Nine has not lost its edge; the times if anything have honed it. One quaint feature is refreshing: the play is not polemical in 21st century fashion. Even when in the second act Lin says she hates men it’s plainly just a personal thing and not entirely true: she finds herself in a threesome with Victoria and her brother. The script even pokes gentle fun at the bookish Victoria for theorizing that sex cannot be separated from economics.

We’re all complex in these matters and either embrace the confusion or set boundaries for ourselves in order to simplify life – even if this involves some self-oppression. All too often we don’t resist the temptation to try to set the same boundaries for others. It works about as well as Prohibition did for alcohol.

If you get a chance to see Caryl Churchill’s play, which crops up here and there, I recommend it. Be quick about it though, for it rarely stays for long. The NYC production runs through November 1 after a planned run of only two months.


Trailer for a production at the Almeida Theatre in north London

Sunday, October 11, 2015

Plei-ing One’s Trade

Buster Keaton, Margaret Leahy, and Wallace Beery in "Three Ages"

Book notes: Past and future people.

Shaman by Kim Stanley Robinson

Evolutionary psychologists rely on the Savanna Principle, which states that “our hominid ancestors spent 99.9 percent of their evolutionary history as hunter gatherers” and that “the basic functioning of the brain has not changed much in the last ten thousand years” (Alan S. Miller and Satoshi Kanazawa). Our Pleistocene brains are not always well suited to postindustrial civilization. While there is plenty of blog material in that, on this occasion I’ll employ the principle just to explain the persistent appeal of a sub-genre of sci-fi: tales set in prehistoric times, sometimes dubbed paleo-fiction or plei-fi. (I’ve dabbled in them myself a couple of times: see Neander Valley Girl and Modern Times at my short story site.) Our brains are well suited to these stories, at least when they are well written.

Nebula and Hugo award winner Kim Stanley Robinson, best known for his hard-sci-fi Mars trilogy (Red Mars, Green Mars & Blue Mars), writes very well. Some readers complained that his Mars books read too much like terraforming manuals, but most appreciated the results of his meticulous research, which didn’t completely overwhelm the human stories. In the case of Robinson’s plei-fi novel Shaman, his attention to detail is an unmitigated positive. We meet the central character Loon at age 12 on his “wander,” a rite of passage for an apprentice shaman, and soon are immersed in an ice-age world of clans, hunts, gathers, festivals, rituals, cave painting, pairings, and raids. It all feels so very much like “home.” The hopes, desires, and fears of the characters are completely relatable, as is the ultimate pathos of their human state that is with us still: the threat that death will extinguish not just us personally but our legacy.

A big Thumbs Up.

****

Hominid by John C. Boland

John C. Boland has written a taut little thriller, now available in paperback. Faced with the title Hominid, a reader might be forgiven for expecting another example of paleo-fiction. What the reader gets instead is a present-day story that is part murder mystery, part adventure, and part sci-fi. The science in the novel is pretty good. In present-day Maryland, archeologist David Isaac encounters his mentor and an old flame as he joins a dig on a remote Chesapeake Bay island on which there has been significant inbreeding for more than four centuries. They encounter evidence of speciation: a new hominid may be in town.

In truth, there isn’t much risk of a new species of human turning up unless one is deliberately engineered – something beyond our current capabilities but conceivable at some time in the future. As noted above, the past 10,000 years have not been long enough to alter the human species fundamentally. To be sure, superficial adaptations have cropped up here and there in that time span, e.g. adult lactose tolerance in Northern Europe and parts of Africa, high-altitude tolerance in the Himalayas, and somewhat smaller brains everywhere. (Domesticated people, like domesticated animals, generally are not as bright as their wild ancestors: see The Incredible Shrinking Brain.) Though bought at the expense of large past die-offs that favored those who carried them, none of these minor adaptations indicates the beginnings of a new species anywhere. With an interconnected global population of 7 billion, splitting off a new branch of humanity is less likely than ever.

What conditions would make speciation possible? In the deliberate improvement of farm animals, it is achieved by a mix of inbreeding and culling: inbreeding emphasizes specific desired traits while culling reduces the negative consequences. Inbreeding harms a stock only in the absence of a cull. The same methods (accidentally achieved if at all, one hopes) would work for humans, and no place is better for a significant mutation to spread than an isolated colony. It’s called the Founder Effect. The chances of creating a new species this way, while vanishingly small, are not zero, and Boland spins a good yarn out of the “not impossible.”

Another Thumbs Up, though personally I’m not much worried about speciation. We’ll have AI robots long before that happens. The robots can worry about it.


Jimmy Castor Bunch: “Troglodyte (Cave Man),” which charted in 1972

Monday, October 5, 2015

Say Goodnight Gracie

Popular entertainment always has been with us, sometimes as state sponsored spectacle and always as a business, whether “legitimate” theater, traveling medicine shows, river boats, carnivals, or saloon hall performances. Modern mass pop entertainment culture, however, began to take shape in the latter half of the 19th century when communications and transport technology had advanced to the point that individual performers could be known and seen by large numbers of common folk separated by vast distances. Music halls and grandly named “opera houses” popped up in towns large and small along the railroad routes. A San Franciscan and New Yorker meeting for the first time easily could find they were fans of the same stars.

Vaudeville was at the heart of the change. Benjamin Franklin Keith (1846-1914) usually is credited with molding the American version of vaudeville. He built the remarkably sumptuous Bijou Theater in Boston in 1883 and decreed that popular variety acts performed on its stage would be free of "vulgarity and suggestiveness in words, action, and costume." This Victorian stuffiness was a master stroke, as it was again half a century later for Disney. Family friendly entertainment meant a mixed gender and mixed age audience at a time when a burgeoning middle class had money to spend. Other theater and opera house owners copied his strategy and found themselves with packed houses. By the end of the 1880s a national vaudeville circuit was in place. Performing troupes traveled city to city within the US and internationally, gathering fans as they went.

Why did vaudeville go into decline in the 20th century? The simplest answer is “the movies.” That’s not the whole of it, but the largest part. By 1920 movies offered grander spectacles and bigger stars at a cheaper price; the best performers soon deserted vaudeville for Hollywood with its bigger paychecks and vastly bigger audiences. Vaudeville’s response was to attempt to retain a portion of its audience by becoming less family friendly; many of the venues added strippers in the 1920s. (The 1968 flick The Night They Raided Minsky’s about this event is complete fiction, but it is enjoyable and Britt Ekland never looked better.) This helped slow the decline of ticket sales but it changed the format so much that vaudeville was no longer distinguishable from burlesque, so arguably it hastened the demise.

Among the legacies of the era are the halls and opera houses scattered around the country in which the vaudeville acts performed. These buildings often were sizable and fairly elaborate even in relatively small towns. Many still exist and have found new life refurbished as concert halls. Built in 1882, the Mauch Chunk Opera House in Jim Thorpe, Pennsylvania, (Mauch Chunk was the name of the town before 1953) is a typical example. A century ago it hosted famed acts and performers of the day including Al Jolson, Mae West, and John Philip Sousa. In 1927 it became a movie house. The Mauch Chunk Historical Society acquired the property in the 1970s and began a restoration. Today the site hosts a range of rock, jazz, and folk artists.

I was at the Mauch Chunk Opera House last Friday with a friend (Hi, Ken) to see the Canadian contemporary folk duo Dala (Amanda Walther and Sheila Carabine) whose performance we had caught a few years ago in Morristown, NJ. They play and sing mostly their own material, but mix in a few covers, Arlo Guthrie’s Coming into Los Angeles for one. “They are charming, clever, and talented.” I’m quoting myself from 2011. If you get a chance, give them a try.

In an era when so much of our entertainment is in our own dens on our own electronic equipment, live music is still a rewarding alternative, and it seldom sounds better than within old vaudevillian walls.

Dala – Not Alone

 

Tuesday, September 29, 2015

Factive Fiction and Fictive Fact

Two weekend book looks:


A Necessary End by Sarah Pinborough and F. Paul Wilson

Yet another apocalypse? Bookstores and cineplexes are rife with them. Are authors and readers/viewers everywhere sensing something in the air? Be that as it may, this one is a little different. We’ve seen civilization brought to its knees in apocalyptic fiction by plagues, asteroids, zombies, nuclear war, alien invasion, and myriad other causes. How about flies? This one has flies. A mutated species of fly is infesting the world. It doesn’t spread disease in the usual sense. There is no bacterium or virus. Illness isn’t spread person to person. Instead, the fly’s saliva provokes a fatal autoimmune response in humans and only in humans. The fly then lays eggs in the corpse it has provided for itself: all in all a plausible life cycle. Only a handful of people are immune. The flies have spread so fast that societies are overwhelmed by the demands on health care and basic services.

Unusually for catastrophe-fiction, which tends to be action-adventure, A Necessary End is character driven. The central characters Nigel and Abby, who had marital problems even before the arrival of the flies, face their fates with very different philosophies. Nigel is a firm rationalist determined to find physical causes and scientific solutions while Abby relies on her faith. Other characters react with anger, superstition, resolve, generosity, or violence according to their nature and circumstances. Who lives or dies is less important to the story than how they do.

A collaboration between Sarah Pinborough and F. Paul Wilson, both talented authors of horror tales (among other works), A Necessary End is a quick read and is as pleasurable as any story with this premise can be. If you’re in the back seat on a modestly lengthy road trip and have had enough both of scenery and your iPhone, this should keep you occupied for the duration.

****


Metaphors We Live By by George Lakoff and Mark Johnson

There is a 1991 episode of Star Trek: The Next Generation titled “Darmok” in which the Enterprise has a rendezvous with an alien species called the Tamarians, whose speech is impenetrable. (The “universal translator” is the device that back-explains the oddity that everyone in the galaxy seems to speak English; they really don’t, but a miniature wearable device translates in real time. It evidently fails with the Tamarians.) When the Tamarians speak, the Enterprise crew can understand all of their words but none of their sentences. They say things like “Mirab, his sails unfurled” and “Sokath, his eyes uncovered.” Finally Counselor Troi perceives the blindingly obvious. “Imagery is everything to the Tamarians,” she says. “It embodies their emotional states, their very thought processes. It's how they communicate, and it's how they think.”

It is my suspicion that this episode was inspired by a book that made a splash in 1980 entitled Metaphors We Live By. It had been on my reading list for the past 35 years; I finally got around to it last week. We are Tamarians. Civilizations don’t have knees, books don’t normally splash unless you throw them in a pond, and for that matter we are not Tamarians, but I assume the reader understands those images when I use them. Metaphor is our dominant way of expressing ourselves. Most often we aren’t even aware we are using metaphors. For example, most of us would not consider the phrase “inflation is rising” to be a metaphor, but it is. Inflation is not an object that rises up or lowers down (“up” and “down” themselves being directions relative to our human experience); it is an abstraction to which we give a numerical value based on a particular set of data. Yet we understand “inflation rises.” We understand “moral fiber,” “falling in love,” “blindingly obvious,” “food for thought,” “packaging your ideas,” an “ugly side to his personality,” and “a solution to her problems.” Chemistry and math both work for that last one: take your pick. Yet, if we spoke to an alien species about the “foundations of friendship” (friendship as a building with foundations) or the “foundations of a theory,” they might well be utterly baffled.

Lakoff and Johnson argue that metaphor is the way humans experience reality. It's how we communicate, and it's how we think. The nature of our biological and social existence forms the basis of our metaphors. The biggest challenge of ever getting a computer to think like a human is precisely that computers don’t experience the world in the same way we do. Our metaphors in turn shape our views and actions. Consider the (often unspoken) metaphor that debate is a battle in which one attacks an opponent’s positions, defends one’s own, and either wins or loses. How would a debate differ if instead of a battle metaphor we viewed it as a dance? The authors also discuss the limitations of both objectivism and subjectivism as philosophical systems. They make their point that human understanding is experiential and that new ideas are built upon those experiences, which is to say they are almost inevitably metaphorical. This is fine, they say, but it is “important to realize that the way we have been brought up to perceive our world is not the only way and that it is possible to see beyond the ‘truths’ of our culture.”

You won’t finish this book in the back seat on that road trip. But if you’re inside on a rainy weekend, the book is worth the time it takes to read.



Flushed from the Bathroom of Your Heart