Wednesday, April 18, 2018

Being Flip

An essential element of hipster culture – along with the de rigueur denial that one is a hipster – is irony: the fashion and lifestyle choices from silly beards to flip phones are regarded as cool because they are knowingly uncool. My own uncool choices, for better or worse, have no irony to them: they are merely uncool. No man bun could ever make them pass for anything else. For example, my 9-year-old flip phone (which replaced a damaged one very much like it in 2009) makes no statement other than that I’m a consummate procrastinator and haven’t gotten around to exchanging it. It’s less a matter of choice than of inertia.

At least it's more than two cans and a string
I don’t need a smartphone for work, as most people do these days, and I’m seldom far from a computer screen, so my telephonic backwardness causes little practical inconvenience. True, there are times when I idly ponder such things as which astronaut flew the penultimate Mercury flight (Walter Schirra) and what Jean Harlow’s birthday was (March 3, 1911) and then have to wait 20 minutes until I get home and can look up the answers. So far that hasn’t been reason enough to say, “Today’s the day I’m going to the Verizon store.” That doesn’t mean I deliberately avoid the tech. One day I’ll damage my flip phone by dropping it on concrete or in a pond or something, and finally I’ll be motivated to join the 2010s. Meantime, while missing out on smartphones’ benefits I’ve also been missing their downsides.

There are downsides. For one, smartphones are not good for effective IQ. That’s not mere assertion. Brain Drain: The Mere Presence of One’s Own Smartphone Reduces Available Cognitive Capacity is a 2017 study by Adrian F. Ward of the University of Texas at Austin, with Kristen Duke, Ayelet Gneezy, and Maarten W. Bos. The researchers gave some 800 smartphone users tests that required concentration and cognitive effort; all of them powered down their phones, but some put their phones in another room while others kept them on their desks, in their pockets, or in their bags. The participants who put their phones in another room solidly outperformed all the others. Just having the (powered-down) phone nearby was tempting and distracting. The authors concluded “that even when people are successful at maintaining sustained attention—as when avoiding the temptation to check their phones—the mere presence of these devices reduces available cognitive capacity.” Still, this one is easily addressed: if you need to concentrate on something, put your phone in another room.

Another effect – damaging more for some folks than for others – is social and psychological. Checking texts, Facebook, Instagram, Twitter, Snapchat, and the rest is notoriously addictive. How many people habitually scroll as they stroll? Social media addiction is linked to depression in many people as they obsessively pursue “likes” and compare their lives to the virtual facades of others. The more sites we visit, the higher the risk. From a study available on ScienceDirect: “Use of multiple SM [social media] platforms is independently associated with symptoms of depression and anxiety, even when controlling for overall TSSM [time spent on social media].”

This brings us to the 2017 indie movie Ingrid Goes West, currently available on DVD. It spun in my Blu-ray player last night. The smartphone is a co-star of the film. Ingrid (Aubrey Plaza) is a mentally troubled young woman with a horrible self-image and severe difficulties making and keeping real friendships. Retreating to her phone, she becomes a follower of Instagram star Taylor Sloan (Elizabeth Olsen), who posts about her fabulous California lifestyle of sun, fun, fashion, and joy. When Ingrid inherits money ($60,000) from her mom, she uses it to move west and become part of Taylor’s life, which she does by stealing her dog and then returning the “found” animal. Ingrid judges her own life entirely by the likes and shares on her own Instagram account and by her inclusion in Taylor’s social media. Everything Ingrid pursues in real life is for the sake of the online image. Ingrid’s behavior goes beyond creepy and far into the criminal, yet she remains a sympathetic character throughout the movie, and she at least has the excuse of being troubled. It is soon clear, however, that Taylor (along with all of her friends and family) is a massive phony whose real life is anything but enviable. Not quite a spoiler: Ingrid, scrolling her phone in the final scene, has a moment of happiness, which for the viewer makes a particularly bleak ending.

Thumbs Up on the movie. Nonetheless, despite its warning, my next phone will be smart.

Wednesday, April 11, 2018


Most ancient calendars began the year at the vernal equinox, which marks the beginning of spring; under the current system that falls on March 20, give or take a day depending on the year. There were numerous exceptions that began the year at the hibernal solstice (December 21, plus or minus a day), but most began at the equinox. This is still reflected in the names of the months, the last four of which are simply wrong. September, October, November, and December mean the seventh, eighth, ninth, and tenth months: in each case off by two. The names fit the old Roman calendar, which began in March. The Romans moved the start of the civil year to January 1 back in 153 BC, and Julius Caesar’s general calendar reform of 46–45 BC kept that arrangement; inexplicably, no one ever bothered to change the names of the months to fit.

Bone lunar calendar c.25000 BCE
Calendars based on the lunar and solar cycles preceded formal writing. Among the earliest representational symbols ever found are notches on sticks and bones marking the lunar cycle. Neolithic peoples were very good astronomers, as Stonehenge famously attests. The reason the ancients commonly began a year at the vernal equinox (or at the full moon following it) is obvious: the Northern Hemisphere finally shakes loose of winter. Spring is a rebirth. Ancient mythologies are full of resurrection stories symbolizing vegetative rebirth at the start of spring. In ancient Babylon it was the return of Tammuz to Ishtar, in Greece the resurrection of Persephone, in Phrygia the return of Attis to Cybele, and in Egypt the return of Osiris to Isis. In Japan there are parallels in the tale of Izanagi and Izanami, and in Mesoamerica in the return of Quetzalcoatl. The seasonal cycle is such an obvious analogy for a human lifetime that it is a mythological universal. This led poet, classicist, and scholar of mythology Robert Graves to assert in The White Goddess (an indispensable text for Western mythology, along with Frazer’s The Golden Bough), “All true poetry is about love, death, or the changing of the seasons.” Verses about other things are just wordplay, he argued, not real poetry.

I can relate to this. It’s my atavistic practice to host solstice and equinox parties when the weather cooperates – as it did not this past snowy March 20. It snowed again yesterday (April 10) in this locale, though just a bit and plainly as a last-gasp effort of winter to stay past its time. Persephone must have missed her boat connection back across the Styx this year and had to reschedule with Charon; she likely got a scolding from Demeter for being late. It was yesterday’s light snow, however, that prodded these reflections on the season twenty-one days after its arrival. My response to spring always has been mixed. It’s the season of new beginnings, of course, but you can’t have a new beginning without an ending, and the endings tend to stand out in my mind.

An outsized proportion of the biggest endings in my own life have come between a vernal equinox and the next estival solstice: not the deaths of friends or family members – those occur randomly at any time of year – but endings involving some volition. Examples: two graduations, a contract to sell my first house, the closing of a business, and the end of all five of the most serious romantic involvements in my life. (One of those five was my idea; the others weren’t.) This isn’t a unique pattern. While January is notorious for a rise in relationship break-ups after a lull during the holidays, according to University of Washington research presented to the American Sociological Association in 2016, consistently over a 15-year period more divorce filings occurred in the equinox month of March than in any other month of the year; a second, smaller bump occurred in August. (Note that “autumn” in the US tends to be regarded popularly as starting with the school year rather than with the autumnal equinox, so that is a seasonal end-of-summer bump.) Starting afresh just seems an exceptionally good idea when surrounded by the new growth of spring. More often than not, that requires saying “goodbye to you” to what or who went before; more often than not it’s also the right thing to do, or it wouldn’t be seriously considered at all. So, whatever endings and beginnings the reader may be experiencing this particular spring, may you remember the fields you are leaving fondly and may your new fields be ever green.

That time Michelle Branch played The Bronze: Goodbye to You

Friday, April 6, 2018

Choosing Monsters

There is no sense owning DVDs of classic (classic by age if not always by quality) movies without revisiting them occasionally. So, now and then I’ll spin up one or more on a sleepless night even when I’m only lukewarm on the idea. Rarely do they fail to re-catch my interest and play through to the end. Recently over several nights I revisited the flicks in one of Universal’s The Legacy Collection boxes: Dracula (1931), Dracula's Daughter (1936), Son of Dracula (1943), and House of Dracula (1945). Somehow it seemed appropriate then to follow up the vintage vampires with The Legacy Collection box of Universal’s werewolves: The Wolf Man (1941), Frankenstein Meets the Wolf Man (1943), Werewolf of London (1935), and She-Wolf of London (1946).

These hoary movies have accumulated enough reviews over the past eight decades to be in no need of mine. I will mention, though, that Lon Chaney Jr.’s character in The Wolf Man has the most interesting psychological profile. He is so guilt-ridden about the harm he does as a wolf that he openly wishes to die; yet whenever he actually is attacked his survival instinct takes over, even while he is in fully human form, and he defends himself … and then feels guilty about that. Dracula, by contrast, is just a narcissistic psychopath with a natty wardrobe and questionable taste in beverages. These movies tweaked the centuries-old vampire and werewolf mythologies into the basic forms that still underlie most tales involving the creatures today, including the supposed mutual antagonism between the two species that still turns up in the 21st century, as in the Underworld films.

Harold Lloyd encounters vamp in "Girl Shy" (1924)
It is hard not to wonder at the persistence of these night creatures in film and fiction. Other classic monsters recur too, of course, but not to the same extent or in the same way. Much of it has to do with the well-acknowledged erotic appeal of vampires and werewolves of and to both sexes, which was recognized even before the Universal movies. “Vampire,” commonly shortened to “vamp,” was colloquial for a seductress of a certain kind (sort-of Goth, but more upscale) in the 1920s as it very occasionally still is today. Then came the movies with Bela and kin as seducers. Then came Anne. It’s not fair to blame Anne Rice for Twilight and its ilk, but it is unlikely the series and others like it could have happened without Anne’s novel Interview with the Vampire (1976) and its sequels, which by the way are better literature than one might expect. Anne in her fiction rebalanced the erotic appeal of the male vampires in particular to such an extent and so effectively that her (less capable) imitators writing Romance fiction largely replaced highwaymen with vampires. Werewolves’ appeal is more feral. The hint of fierceness inside werewolves even when in human form regularly is played as an attractive element in fiction and films, e.g. The Howling, Wolf, and Cursed. Also, calling someone a wolf – in the sense of “predator” – never sounds quite as insulting as intended; it’s often taken as a compliment.

I’m leaving Buffy out of this, for Joss Whedon’s Buffy the Vampire Slayer deserves special treatment all its own.

Vampirism and lycanthropy (werewolfism) are both recognized medical-psychiatric conditions. The sufferers (practitioners?) aren’t the real thing, of course, but the people with the conditions think they are. The “vampires” do shun the sun and (often) drink blood. The lycanthropes are convinced they shape-shift. (They don’t.) Lycanthropy isn’t common, but there are some 30 cases on record in the US in the past 15 years. One suspects they are less romantically appealing than their fictional brothers and sisters, but I lack the personal experience to state that definitively.

Werewolves and vampires of the fictional type appeal to the dark side of our natures. Humans are forever repressing that side with mixed success, but it is always there. At the time The Legacy Collection movies were first in theaters, many folks, because of the dictates of traditional morality, were inclined either self-delusionally to deny having a dark side or to be guilt-ridden about having one; today there are folks who deny or are guilt-ridden by their dark sides because of the dictates of PC morality, which is as Victorian in its own way as the traditional kind. Either will keep Dr. Freud’s successors in business. There are no thoughtcrimes. Acknowledging and accepting (but not acting upon) one’s dark impulses is an easier way to get happy. A good vampire or werewolf movie might well help.

Which creature of the night a person most identifies with – vampire or werewolf – says a lot about him or her. Which one he or she is attracted to (not necessarily the same), if either, also says a lot. Do you hear the call of the tux or the call of the wild? If I get to choose, I’m going with the wolf for both answers. However, while both critters have inspired popular songs, it must be conceded that vampires by and large have the better music.

Concrete Blonde – Bloodletting

Friday, March 30, 2018

Triple Threat: Hawking’s Admonitions

More than a few physicists become famous in their own lifetimes: Bohr, Heisenberg, Meitner, and Fermi come instantly to mind. In addition to their contributions to the field, both Edward Teller and Richard Feynman wrote bestselling books aimed at popular audiences. Yet, among non-scientists none of those examples was or is fully a household name. Oh, someone of high school age or older in an average multi-person household is likely to know them, but probably not everyone. Only three physicists to date have achieved true Elvis-style rock-star-level recognition in popular culture: Isaac Newton, Albert Einstein, and Stephen Hawking. By all accounts Isaac, for all his gifts, was personally a mean, vindictive jerk with no sense of humor or self-irony, so perhaps he would not have fared well in a modern social media environment. (Then again, with those qualities he might have fared fabulously. Who is to say?) Albert Einstein did have a sense of humor and irony. He didn’t seek out the limelight much outside of physics, however; while not antagonistic to the general public, he preferred a more private existence. Stephen Hawking not only kept his humor (“Life would be tragic if it weren't funny”) but reveled in his pop culture status, even appearing in TV shows such as The Simpsons and The Big Bang Theory. He was hard not to like, which made his passing a couple of weeks ago an event covered as heavily on entertainment news shows as in more serious media.
Homer intrigues Stephen with his hypothesis of a donut-shaped universe

Hawking throughout his life kept stirring up the scientific community with thoughts on such things as the black hole information paradox, but he stirred up the popular press more often with his warnings and predictions about the future of humanity. Three in particular received much comment recently: a warning about aliens, a warning about Artificial Intelligence, and a counsel to settle other planets or face extinction. 

Regarding the first, humans willfully and repeatedly send messages into space in hopes they one day will be intercepted by extraterrestrials. One particularly groan-worthy instance was in 2008 when NASA used its Deep Space Network to beam out the Beatles’ Across the Universe. (If we are going to send syrupy sidereal songs, why not go all in thematically with Glenn Miller’s When You Wish Upon a Star? But maybe ETs are into R&B or metal.) Hawking remarked that signaling our presence to aliens in this way might not be a good idea, and he wasn’t being a music critic: "If aliens visit us, the outcome would be much as when Columbus landed in America, which didn't turn out well for the Native Americans." He said it would be safer to avoid contact. This may well be, but it’s a little late to worry about it. Our unintentional signals overwhelm the intended ones. Defense radars, which are obviously artificial, have been signaling into deep space for 80 years, and TV broadcasts aren’t far behind them. On many, many frequencies earth flashes brighter than the sun. If there are any modestly tech-savvy critters within 80 light-years who bother to look, they’ll see us, and that bubble of space grows constantly at the speed of light. Hawking knew this well, so I think he was just being playful with this one.

Hawking’s warning last year about artificial intelligence also got media attention: "I fear that AI may replace humans altogether,” he told Wired. “This will be a new form of life that outperforms humans." This is an old concern. In science fiction it long predates Skynet and Terminators. The word “robot” was introduced to the world in 1920 in the play R.U.R. by Karel Čapek. The robots in R.U.R. are humaniform artificial beings who are a boon to human civilization until a well-meaning human programs them with a sense of justice; they conclude it is just to overthrow humanity. More recently, Charles Stross imagined our demise at the hands of robots who make love, not war. Humans in his novel Saturn’s Children preferred their machines to fellow humans to such an extent that they stopped reproducing and died out. The characters in the novel, set long after humanity’s extinction, are robots with an identity crisis. I think Hawking was semi-serious about his AI warning. He might be right, too, but I find it hard to be upset by it. AIs are our children, and children generally do bury their parents – unpleasant, but the natural course of things.

Thirdly, Hawking warned us that earth civilization has only 100 years left and that we need to settle other worlds before then. A decade ago Hawking gave us 1,000 years, but upon reconsideration in 2017 he cut it back to 100. The only way to avoid not just the end of civilization but an extinction event, he said, is to occupy more than one world: “My preference would be to pursue rigorously a space-exploration programme, with a view to eventually colonising suitable planets for human habitation. I believe we have reached the point of no return.” If only those AI children mentioned above survive us, that’s no problem; they can be built hardier than ourselves, so they can occupy worlds without terraforming. If we actually want biological humans to live off-earth, however, the task will be harder. If we plan to go interstellar it will be ridiculously hard. That doesn’t stop science fiction writers from imagining it. (I tried my hand at it some years ago in the short story The Lion’s Share.) The consensus seems to be that we’ll just recreate our same old problems in a new place. Nonetheless, while I have no way to know for sure, I suspect Hawking was fully serious about this one, and there is something charmingly non-adult about that. All too few of us bother to grow up these days, and in a general way that is not a good thing. Yet, if you lose all things childlike in your heart, you become too cynical and jaded a creature to get any value out of grown-up ways.

So long, Stephen, and thanks for the warnings. I’m sure Homer and the guys at Moe’s will raise a mug of Duff beer.

The Big Bang Theory – aliens receive deep space message

Friday, March 23, 2018

Three Ways to Ignore Snow

Thumbing its nose at the equinox, yet another nor’easter dumped snow on NJ and much of the rest of the Northeast on the first day of spring. Once again, home was the best place to be for a couple of days. Remarkably, my home didn’t lose power this time, so diversions of page, screen, and stereo were available. Three are worth a mention:

The Golden Age of Wonder Woman, Volume 1 (2017 reprint of 1941-42 originals)
The success of the Wonder Woman movie last year starring Gal Gadot and the prominence of the character in other DC movies prompted me to return to the original comics from December 1941 through December 1942. Reprints of these are collected and available in a single affordable softcover from DC.

The creator of Wonder Woman was psychologist Dr. William Moulton Marston, who previously had written about the cultural propaganda potential (using “propaganda” in its literal noncensorious sense) of comic books. Recently there has been some interest in the entertainment media in Marston’s private life, but in the 1940s vice laws restricting the behavior of consenting adults were on the books and sporadically enforced, so Marston maintained a low profile at the time in all but his professional career. A self-styled feminist, he was fond enough of women to live polyamorously with two. He devised an early lie detector based on systolic blood pressure, which puts the “lasso of truth” in perspective; he was into S&M and bondage, which also puts the lasso of truth in perspective. Marston was very much in favor of shifting society toward the political dominance of women: “Wonder Woman is psychological propaganda for the new type of woman who should, I believe, rule the world.”

The comics hold up remarkably well, making allowances for hefty doses of wartime propaganda and a far less PC culture. The plot of the 1975 TV-movie with Lynda Carter that served as a pilot for the Wonder Woman TV series is pretty close to that of the first few comic books including the side plots with the crooked theater agent (played by Red Buttons in ’75) and the Nazi spy (played by Stella Stevens in ’75); the TV-movie’s plot holes, such as how Wonder Woman becomes Diana Prince in military intelligence, are filled in the comics. The 2017 movie, though reset to World War One, also heavily borrows elements from the first year of Wonder Woman comics including the underlying conflict with Ares, usually called by the Roman name Mars in the comics. Steve Trevor, for whom Diana has a soft spot, is brave but a bit stupid in the comics; he is forever foolishly blundering into danger, getting knocked out, getting captured, and having to be rescued. The 1975 and 2017 movies both omit important recurring characters in the comics: Etta Candy and her sorority sisters at Holliday College who give Wonder Woman back-up when she needs more numbers for some undertaking. Not wanting to fall into a simplistic battle of the sexes, Marston made roughly half of the villains women, such as the brilliant but evil Baroness von Gunther and the monopolist Gloria Bullfinch.

All in all, it’s a lot of fun. Once again, make some allowances for its time, but Thumbs Up.

Wonder Wheel (2017)
Woody Allen’s 2017 drama set in the 1950s at Coney Island and starring Kate Winslet, Justin Timberlake, Juno Temple, and Jim Belushi is now available on DVD. Justin Timberlake plays Mickey, a lifeguard with ambitions to be a playwright; he narrates the movie with a cadence much like Woody Allen’s own. Ginny (Kate Winslet) is a former actress and current clam-bar waitress married to the crude and flawed (but not evil) carousel operator Humpty (Jim Belushi); each has a child from a previous relationship. Ginny’s young son loves setting fires, and she is terrified he will hurt someone doing it. Humpty’s estranged adult daughter Carolina (Juno Temple) is in danger because she gave evidence against her gangster husband, someone she foolishly married at 20 because “I wanted more.”

The Triangle
All of the characters want more, as we all do, but for various reasons they know they are unlikely to get it. All that seems within their grasp is just getting by, if that. Ginny, acutely aware of her upcoming 40th birthday, especially feels her opportunities slipping away and begins an affair with Mickey, who takes it less seriously than she does. Trouble intensifies when Carolina returns home in order to hide from the mob and reconciles after a fashion with her father. To Ginny’s distress, Carolina catches the eye of Mickey. Ginny is then faced with a moral challenge when mobsters come looking for Carolina. What are the consequences if she simply neglects to warn her?

Setting the film in the 1950s was a good choice for this type of drama. In all times people posture as they present themselves to others, but in that decade most folks postured differently than today; they hid some things that we do and say openly while openly doing and saying some things that we hide. Ginny, for example, has no reticence at all about saying such things as “I’ve become consumed with jealousy!” The characters are aware of the tawdry melodrama of their lives. Humpty even complains about the “bad drama,” but much of human life really is bad drama.

A quick look at Rotten Tomatoes shows that I am in the minority in liking this film – not a fringe minority by any means, but a minority. I think the problem many viewers and critics have with this movie is that there are no transformations of character, no changes of fortune for the better, and no moral messages – except perhaps for the message that you can’t count on karma one way or the other. It all ends on a bleak note. Yet, that is Woody Allen’s vision of the world, and it’s arguably a sound one. The film might not be cheery, but the characters are very human and very relatable.

Thumbs Up, but be aware that most viewers find it too bleak.

Dorothy – 28 Days in the Valley (2018)
More than 20 years have passed since rock and roll was king but there are still plenty of loyalists both among fans and musicians. Nor is the genre solely the domain of aging rockers from the classic to grunge eras. New bands continue to form and win over young fans – as well as old ones like me. One top-notch newcomer is Dorothy, whose debut LP Rock is Dead was my favorite rock album of 2016. Their second LP 28 Days in the Valley is available this month on CD and for download.

The band is fronted by and named for Budapest-born Dorothy Martin, whose raw, strong vocals give the band its edge. Like the first album, this one is mostly blues-based power rock, though there is more variety in style this time from one song to the next, with nods to psychedelia and to country. The closest number to pop is Flawless, and it isn’t very close; the title track, 28 Days in the Valley, is a soulful tune with a Western flavor; and We are Staars is garage band rock. Some tracks are more memorable than others, of course, but every one has at least something to recommend it.

Thumbs Up.

Dorothy – Flawless

Monday, March 19, 2018

Alice through the Windshield Glass

Most of the miles I traverse are in my Chevy Cruze; it is comfortable, economical, and easy to park. Yet, I do have a 1998 GMC Sierra 2500 pickup that this year, at age 20, technically becomes a “classic.” There are fewer than 76,000 miles (122,000 km) on it, and the only hint of rust (just a bit) is on the rear bumper, which I’ll sand out sometime soon. Though it’s not my primary transport, I like having a pickup for building supplies (I do many of my own repairs, such as re-roofing the barn) and other cargo, and the truck’s 4-wheel drive is a boon in the winter. During recent winter storms it was my vehicle of choice. My Cruze never would have made it up my driveway on some days. Because it’s my back-up vehicle most of the time, however, I tend to forget to change such things in it as the clock for Daylight Saving Time (it’s still an hour off as I write this) and whatever happens to be in the CD player. During the March snowstorms it was The Best of Alice Cooper: Mascara and Monsters, as it still is at this moment. By default I’ve been listening to a lot of Alice.

Alice Cooper’s first hit song I’m Eighteen was released in 1970, the year I turned 18, so he was very much a part of the music scene in that span from age 15 to 25 that always has an outsized influence on a person’s life. Yet at that time I wasn’t much of a fan. I didn’t dislike him. I regarded a few numbers such as School’s Out, Elected, and No More Mr. Nice Guy as catchy, but the only album of his I ever bought in the 20th century was Welcome to My Nightmare and I didn’t play it much. The theatricality of his performances put me off though it inspired other bands from Kiss to Marilyn Manson. My revisit to his music during my week of GMC jaunts convinces me I underestimated him.

Cooper (Vincent Furnier) is best known for his anarchic hardcore rock, but most of his songs really are not that. They often are thoughtful, Freudian, and full of genuine sentiment – dark and twisted sentiment perhaps, but sentiment. Sometimes they are tongue-in-cheek, sometimes not. The “Alice Cooper” persona makes sense, too. All rockers have stage personae that are largely put-ons; Vincent is just up front with it, and he can write lyrics for Alice that would be hard to pull off as himself. Nor does he sit back and live off royalties from the 70s. Welcome 2 My Nightmare, the 2011 sequel to 1975’s Welcome to My Nightmare, contains some of the best and most amusing work he’s ever done. So, too, the 2017 Paranormal, his 27th studio album that my truck rides last week prompted me to purchase. The septuagenarian is still out there playing up to 100 shows per year. OK, Alice is never going to follow in the footsteps of Bob Dylan to Stockholm to pick up a Nobel for literature – though he does play Stockholm sometimes – but give his lyrics a listen. They may surprise you. Even after decades of my presumed familiarity with Alice, they surprised me.

All the same, it’s time for me to swap out the contents of my GMC CD player for something else – perhaps another band to which I should have paid closer attention decades ago. I’m thinking maybe Concrete Blonde.

Clip from 2012 movie Dark Shadows set in 1972:  “Ugliest woman I’ve ever seen.”

Friday, March 16, 2018

Falling Down

During the lengthy power outage described in last week’s blog, my library was a refuge, at least in the daytime. Due to limited shelf space (room for some 2,500 books, assuming a 1-inch [2.5 cm] average thickness), my rule of thumb is to keep a book only if in principle I might re-read it. Books I read once and see no reason to read again, even if I were immortal, never make it to my shelves. In truth, most of the books that are on my shelves will not be re-read either – maybe not even re-opened – because, like all humans, I have a limited lifetime. However, all of them would be read again were my lifespan longer, and a substantial minority will be reread regardless. A not insignificant minority are revisited repeatedly. The fragility of the electric grid, all too much in evidence last week, led me to revisit one in particular that I hadn’t re-opened in a couple of decades: To Engineer Is Human: The Role of Failure in Successful Design by Henry Petroski, published in 1985. On re-perusal it was, as I had recalled, an informative read, and one made more relevant than expected by the bridge collapse in Florida much in the news the past few days.

Petroski was inspired to write the book by the Hyatt Regency disaster in Kansas City. Those old enough to remember 1981 may recall that the Hyatt had two-level suspended walkways, called skywalks, spanning a soaring lobby. The skywalks were hung from the ceiling by long steel rods, much like a suspension bridge. They were unusually crowded on July 17, 1981, when they collapsed, killing 114 people and injuring more than 200. It remains to this day the largest loss of life in US history from pure structural failure, which is to say without an extreme aggravating force such as earthquake or explosion.

The risk of structural failure is an old concern. It is even addressed in the Code of Hammurabi c.1772 BCE:

229. If a builder build a house for a man and do not make its construction firm, and the house which he has built collapse and cause the death of the owner of the house, that builder shall be put to death.
230. If it cause the death of a son of the owner of the house, they shall put to death a son of that builder.

I can imagine the job site of that family building business: “Hey dad, I really think we should reinforce this wall.”

Civil engineering was not a precise science then, and it is not today. Although the theoretical strength of materials can be calculated, the actual strength is never anything close. There are always imperfections and impurities in real-world structural elements. All of them are subject to fatigue, and so have a lifespan. Their actual carrying capacities are given values based on tests and real-world experience; a large margin of safety is then added, though this occasionally proves inadequate. In the famous case of the Tacoma Narrows Bridge, the unanticipated aeroelastic response of the bridge to local winds – popularly described as resonance – made a hash of all the engineers’ calculations. The structural integrity of any design is always to some degree hypothetical: “The structural soundness of the Brooklyn Bridge only proves to us that it has stood for over a hundred years; that it is standing tomorrow is a matter of probability, albeit high probability, rather than one of certainty.” There are, of course, limits to the margins of safety we build into constructions, for reasons of affordability and practicality. We don’t build cars like tanks, for instance, because no one could afford one and mileage would be calculated in gallons per mile.

Petroski describes failures of railroad bridges, buildings, and aircraft. He also describes how each failure led to better designs.

The failure of the Hyatt skywalks, by the way, was eventually traced to the steel rods: they were so long in the original design that, for practical construction reasons, they were pieced instead – a change that left one bolted connection carrying the load of both walkways, roughly double what the design intended. The bolted connections were literal weak links. The collapse in Florida has yet to be analyzed properly, but the fact that construction was not yet complete hints that incomplete structural elements might be to blame.

The good news is that pure structural failures are rare. That’s not much comfort to the families of victims, but it is reassuring to the rest of us. Your chance of death in any given year from this cause is about one in ten million – thirty-two people annually on average in the US. This compares with the tens of thousands who die in highway accidents. Even in Babylon, most builders’ sons were pretty safe.

Thumbs up on Petroski’s aging but still relevant book.

Metric – Speed the Collapse