Friday, January 31, 2014

A Head for Stairs

The building in which I spend at least half of my waking hours was built in 1850. Normally, I’m the only occupant in the office unit. (There is a residential unit with a mirror-image floor plan on the other side of the building.) After 40 years of familiarity with the structure, I instinctively take into account its peculiarities. I know that the treads on the stairs to the basement are narrow, and place my feet accordingly; I know just how low the joists are in the basement and duck the appropriate amount without looking. I know that the hinged trap door to the attic is absurdly heavy and needs to be secured firmly in the open position before risking letting go of it. I know which steam radiators to turn on and which to leave off in order to balance the heat in the winter.

None of these will trouble visitors, of course, but there is one peculiarity about which I always warn them: the headroom on the stairway to the second floor. The stairs themselves are conventional, but the point at which they pass under the second floor is not as high as in modern buildings, presenting a possible head-knocker. Going up seems not to be a problem for anyone, but coming down can be a painful surprise to people of my height or taller. A little more than half the time I can descend the stairs with head held high and get away with it, but, if there is any extra bounce to my step, thwack! It didn’t take too many repeats of being brought to my knees before cocking my head to the left or right on the stairway became completely unconscious; this buys just enough clearance to keep me headache-free.

Stairs are dangerous enough as it is, especially when descending. The second most common cause of accidental death, exceeded only by car accidents, is falling down stairs. The most stairs I ever tackled at one go were the 897 steps (with 50 landings) in the Washington Monument. I descended, not ascended. One is no longer allowed to do either. While security is cited as a reason, safety probably is a consideration too. Imagine the poor paramedics having to stretcher an injured visitor stranded halfway up the stairway.

Insurance actuaries have calculated the risks with their usual precision. You will miss a step once every 2,222 times you use stairs. This can result in anything from that jarring but harmless thud we’ve all experienced to a full tumble. You will suffer pain once every 63,000 uses, modest injury once every 734,000 uses, and hospitalization once every 3,516,667 uses. This sounds like a pretty small hazard until one considers how many stairs we encounter just on an average day. For some reason, the highest risk is on stairways with four or fewer steps. Overconfidence, perhaps. Some people are especially inept at navigating stairs. It is not clear why, but 40% of injuries on stairs are suffered by people who were injured on stairs before. Fit people are actually more at risk than unhealthy people, probably (once again) through overconfidence. Stairways steeper than 45 degrees or shallower than 27 degrees give the most trouble. US building codes for new construction require risers of no more than 7 inches (17.78 cm) and no less than 4 inches, and treads (goings) of no less than 11 inches (27.94 cm) – which works out to a maximum slope of about 32 degrees.
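That 32-degree figure is just trigonometry: a stair’s slope is the arctangent of riser height over tread depth. Here is a minimal back-of-the-envelope sketch (in Python, purely for illustration, using the 7-inch and 11-inch code limits quoted above):

```python
import math

# Steepest stair the quoted limits allow: maximum riser on minimum tread.
riser = 7.0    # riser height in inches (cited code maximum)
tread = 11.0   # tread (going) depth in inches (cited code minimum)

slope = math.degrees(math.atan(riser / tread))
print(f"Maximum code-compliant slope: {slope:.1f} degrees")  # ~32.5 degrees
```

By the same arithmetic, the 45-degree danger threshold corresponds to a riser as deep as its tread, and 27 degrees to a riser roughly half its tread.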

Stairs are such an obvious solution to reaching a different elevation comfortably that they appeared independently everywhere ancient peoples ever built on multiple levels. The oldest surviving example, thanks to the durability of stone, is the 9,000-year-old stairway at Jericho. The oldest wooden staircase still in existence, improbably enough, is in a Bronze Age salt mine in Hallstatt, Austria. It is safe to say stairs aren’t going away anytime soon. But, while navigating them, remember to look up as well as down at your feet. Why? This post was prompted because I was distracted while reading a letter as I descended my office stairs yesterday and, for the first time in many years, neglected to cock my head.

Stairs uncovered at Jericho 




Sunday, January 26, 2014

Fine Lines

Composer, author, actor, and wit Oscar Levant (1906-72) once remarked, “There's a fine line between genius and insanity. I have erased this line.” It is not at all certain that the first part of his quip is true, but there is quite a lot of anecdotal evidence to that effect. In his book The Price of Greatness: Resolving the Creativity and Madness Controversy (1995), psychologist Arnold M. Ludwig collected enough of it for various tables and graphs. Are very numerous anecdotes data? Not really, but after a certain point they become compelling anyway. Since Amazon offers the book at the steep (and curious) price of $99.41, I recommend finding it in the library. It is both an interesting and an entertaining read.

We all have our idiosyncrasies, but the great innovators in theory, technology, and the arts often carry quirkiness to a level on a par with their achievements. Isaac Newton was a social recluse in whom sexual appetite was apparently absent. (Voltaire claimed Newton was a virgin.) There was nothing lacking in the appetite of Richard Feynman, who did his best work (including his theory of quantum electrodynamics) in strip joints; this behavior, too, is a bit odd – one would think the scenery in such places would be distracting, but he found it inspirational. Thomas Watson, after co-inventing the telephone with Alexander Graham Bell, converted to Islam, enrolled at MIT to study geology just for the fun of it, became a follower of the utopian socialist author Edward Bellamy (though not to the point of selling his AT&T stock), and then successfully took to the stage as a Shakespearean actor. George Eastman (the Kodak camera fellow) loved nothing more than to bake pies in an upstairs kitchen in the mansion he shared with his mother. Thomas Edison invented and successfully marketed so many world-changing products that we tend to overlook his many silly ideas, such as his concrete furniture. He made concrete beds, concrete cupboards, concrete bureaus, concrete phonograph cases, and whole poured-concrete houses. He even made a concrete piano. He was so confident of the appeal of concrete products that he built his own cement plant in Stewartsville, NJ. As anyone else could have predicted (if anyone told him, he didn’t listen), concrete home furnishings didn’t catch on. Steve Jobs, to take a more recent example, bought a new Mercedes sports coupe every six months just so he wouldn’t have to get permanent license plates.

For some reason, successful financial managers and industrialists tend to be much more conventional, which suggests financial genius might be of a different type than that of inventors, artists, and scientists. A few (though not many) are flamboyant, but that is not the same thing as quirky; there is nothing unconventional, for example, in Donald Trump’s taste for expensive homes and younger women. Howard Hughes was flaky, true enough, but that seems to have been genuine mental illness that developed slowly due to injuries sustained in a plane crash; besides, he might qualify more as an inventor. Most of the best-known industrialists – JP Morgan, JD Rockefeller, Andrew Carnegie, et al. – were decidedly ordinary family men, however revolutionary their business models were. Olive Beech (co-founder of Beech Aircraft) and Ruth Handler (former president of Mattel, and the designer of the Barbie doll) also lacked more than the usual number of peculiarities (as far as is publicly known). Edward Stotesbury, JP Morgan’s right-hand man, was such a non-entity that historian Alva Johnson calls him “a dignified hole in the atmosphere.” Fortunately, he acquired a colorful wife, as someone with $75 million in 1912 dollars often is able to do, even with the disadvantage of a low barometric reading. Eva proceeded to spend $50 million of the sum on new houses. She entertained on a grand scale; on one occasion she hosted a half-million-dollar alligator hunt to acquire the materials for new luggage for her and her guests. When the Crash of ’29 hit, Edward unavailingly tried to scale back her entertainment budget to $50,000 per month ($1 million per month in today’s dollars). Still, I’m not sure that a talent for spending money, even one as extraordinary as this, really qualifies as a quirk in the usual sense.

Somehow, despite my own abundant share of idiosyncrasies, the next breakthrough in theoretical physics, the next world changing patent, and the next great American novel all continue to elude me. Apparently, brilliance may engender eccentricity, but the chain of causation doesn’t work in reverse. (The image of pushing on a chain comes to mind.) Maybe becoming a hole in the atmosphere is a more achievable ambition. Who wouldn’t like $75 million?
 

Randy Newman I'm Different

Monday, January 20, 2014

Ghosts of Dwellings Past

The western U.S. once was littered liberally with ghost towns. Most of these communities had sprung up quickly thanks to some local mine or industry, and then emptied out with equal rapidity when the mine or business closed shop. Over the past several decades the majority of these towns have decayed into scarcely recognizable piles of rotting lumber, but a few remain largely intact. The best-preserved ones have achieved new life as tourist attractions. The appeal of these places is hard to explain, but it probably has something to do with a sense of parallel to one’s own individual birth, life, and death.

American ghost towns offer nothing like the deep history of the ones in the Old World, though the Anasazi ruins come close. The most famous ghost town in the world is Pompeii, and for good reason. Not only does the place give us a superbly preserved look at Classical Roman life, but the cataclysmic circumstances of its death give it an eeriness that more recent sites can’t match. Besides, it’s a pleasant day’s outing from Naples, which makes it supremely accessible to tourists. Yet, accessibility isn’t everything. One site far older than Pompeii – in fact, older than Stonehenge, Sumer, or the Pyramids – is the Neolithic village of Skara Brae in the Orkneys. It is less visited precisely because it is out of the way, but is in many ways more intriguing. After centuries of habitation, the village was vacated nearly 5000 years ago for unknown reasons. There is some circumstantial evidence (e.g. beads from a broken necklace left on a floor) that it was abandoned in a hurry, but we can’t know for sure. The structures were covered over by sand and grass; they remained buried until a storm in 1850 washed away the surface and revealed what was underneath. The excavated structures consist of seven private homes and a building of indeterminate use – possibly a workshop of some kind. Others may have been washed to sea (the shoreline has shifted) and still others may yet lie buried and undiscovered.

The fascinating thing about the private homes at Skara Brae is that they are unmistakably recognizable as such. It was a hardship for the Neolithic Orkneyans but a windfall for us that there was no native wood growing on the islands. Accordingly, the walls and even the furniture were made of stone, and so have survived. (The roofs have not, which suggests they were organic; turf laid over ribs of whalebone is one proposed construction.) There are stone dressers, stone bed frames (presumably once topped with straw), stone storage alcoves, and central stone hearths. What did they burn in the hearths? Possibly driftwood, peat, dung, dried seaweed, or some mixture of them. The dwellings were not cramped: they average 36 square meters (385 square feet) with more headroom than in modern houses. They all look very familiar – Flintstone-ish, to be sure, but familiar.

The Neolithic revolution generally is considered to have been about farming. People added to what they hunted and gathered with what they grew and raised. In some locales they replaced wild foods altogether. But maybe it also was about domesticity – about replacing communal space with substantial private homes. Private space for ourselves as individuals or as small family units changes our perspective. I know that my perspective changes when I return home in the evening to my own world within my own walls. Without that degree of privacy and coziness I very likely wouldn’t be writing this blog. Neolithic peoples didn’t give up on communities but they expanded private space. Skara Brae is a village, after all, but the entrance doors to the houses could be barred from the inside. Perhaps this is when a more individualistic and modern outlook developed: when “home” became one’s own space rather than the whole village.

It’s not generally a compliment to say of someone, “He lives in a world of his own,” but maybe it should be, at least when the person in question is not actually delusional. “Groupthink” is not a compliment either, and innovative thought is necessarily idiosyncratic. Private space gives us more freedom to be idiosyncratic. It also lets us form tighter bonds with romantic partners and with members of an immediate family with whom we choose (“choose” being the key word) to share the space. The track record for human happiness on that last basis is mixed, but some folks regard the opportunity highly.

House in Skara Brae



A World of Our Own, recorded by The Seekers in 1968


Monday, January 13, 2014

Spinning History

“History is written by the victors” is a line attributed to various historical persons (including Napoleon and Churchill) but is likely older than any of them. It’s just as well that no one can lay clear claim, because it isn’t true. Official history (where such a thing exists) is written by the victors, true enough, but the losers (or, if none survive, their sympathizers) never stop scribbling, and their views sometimes become dominant in the end.

Many historians are oblivious to their own biases and spin, while others engage more or less openly in advocacy. An example of the former, which (one hopes) should be uncontroversial at this late date, is the republican disposition of Plutarch; on the other hand, Julius Caesar promoted himself in his accounts, as politicians tend to do.

Advocacy is not necessarily a bad thing. The diatribe by the Old Oligarch (aka pseudo-Xenophon) against the 5th century BC Athenian constitution is one of our best sources on how Athenian democracy worked, for the simple reason that you can’t condemn something effectively without describing it. Procopius of Caesarea in the 6th century AD gives us the full gamut of approaches. Wars is a pretty straightforward account of the military campaigns under the Emperor Justinian that restored Roman rule in much of the West. On the Buildings, beyond its core architectural subject matter, is a relentless panegyric of Justinian and Theodora, apparently intended to curry favor with the two. The Secret History, which Procopius wisely directed to be published posthumously, is one of the most entertaining hatchet jobs of all time, describing Justinian and Theodora as “fiends in human form” and blaming them for the deaths of “a myriad myriad of myriads” of people. (A myriad is 10,000, so this literally is a trillion; it’s safe to assume he just meant “a whole lot.”) The three together give us a pretty complete picture.

More recently, Charles Beard’s classic 1913 An Economic Interpretation of the Constitution of the United States clearly is influenced heavily by Marx. Present-day English historian Paul Johnson (Modern Times) is unabashedly conservative in his views and interpretations. Both authors are worth reading and neither’s work is negated by the particular perspective.

Currently I’m reading 1968: the Year that Rocked the World by Mark Kurlansky, a journalist/historian whose work Salt: a World History I previously enjoyed. Born in 1948, he was strongly sympathetic with the radical New Left during the 1960s, which he experienced firsthand, and hasn’t changed his mind since. He announces his slant at the get-go: “I am stating my prejudices at the outset because even now, more than three decades later [2004], an attempt at objectivity on the subject of 1968 would be dishonest.” I agree. The book has numerous strengths, not least of which is its global view. For those who didn’t experience the 1960s – even for those who did, but from a different spot – this is a valuable peek inside a revolutionary era and mindset. I would recommend supplementing it, though, with another viewpoint – for example, that of British historian Dominic Sandbrook, who emphasizes the strong conservative trends that underlay the same period. Then open Tom Wolfe’s The Electric Kool-Aid Acid Test (1968), which gives a vastly better sense of “being there” than either.

It seems that more than ever we are disinclined to read or listen to whatever clashes with our own views. No doubt this always has been a human tendency, but today the proliferation of information sources makes it possible for us to choose only those news outlets, pundits, bloggers, and magazines that are congenial to our ideology. (Our own preferred sources are just being honest, of course: only those favored by our opponents are biased.) It is a tendency that should be resisted. By and large, the folks on the other side of the fence are not really stupid, crazy, or evil – no more (not much more, anyway) than the folks on our side. They just look at things another way. The reading of competing histories is a good way to stay in practice. If we can’t be flexible enough to do this with regard to issues now so distant as those of 1968, it’s hard to imagine understanding (and respecting) our opponents today.

Seeing all sides doesn’t prevent us from driving our own way, but we are less likely to crash or to run over pedestrians if we don’t black out the windows on one side of the car.


Wild in the Streets (1968)


Monday, January 6, 2014

Foreverly

We all know that music can affect us emotionally. The quality of the composition or presentation is not always the key factor: a poorly whistled tune sometimes can be as evocative as a full-orchestra performance of Chopin if the former links to a thought or memory in some way. Words that are very bad poetry when straightforwardly read aloud suddenly can become moving when sung.

It is not at all clear why this should be so. A surprisingly vast amount of research – using MRIs and PET scans among other tools – has been conducted on the physiology of our responses to music. In a mechanical sense, the investigations provide some answers to how, but still leave us in the dark as to why – specifically, why should these physiological responses have evolved? Do they serve some useful function? If so, is the function biological or social? There is no shortage of learned commentary on these matters either, much of it in dense professor-ese. I’ve yet to read any that is more satisfying than Nietzsche’s speculations in The Birth of Tragedy. However, while I’m convinced he was onto something with his notions of the Apollonian and Dionysian, not even Fred satisfies completely. To his credit, Nietzsche was dissatisfied, too, and at one point was reduced to saying “we listen to music with our muscles.” This isn’t entirely true either (as he knew), but we at least understand what he meant.

These questions turn up in popular literature, too. Arthur C. Clarke in 1953 published one of his best-regarded science fiction novels, Childhood’s End. In it, Earth is visited by a benevolent race of aliens dubbed the Overlords. Despite their soubriquet, they don’t interfere by force, but merely help humans transition to a stage where they can connect and merge with a galaxy-wide consciousness. Out of curiosity, the Overlords attend a human concert. While they are capable of understanding that the arrangements of sound constitute an art form, they are baffled by the emotional responses the arrangements evoke in the audience. We later learn that the Overlords are a bridge species: they help others join the galactic mind, but are somehow excluded from joining it themselves. They are missing something intangible, though not even they know what; whatever it is prevents them from taking the step. Perhaps it is a failure to hear the music of the spheres.

Whatever the reason for the impact of music, the effect is undeniable. When coupled to nostalgia, even the simplest and otherwise forgettable popular tune can elicit strong reactions. Accordingly, all of us hold a special place in our hearts for the music that was popular from our childhood to our mid-20s, the era when our identities and strongest memories are forged. Some people never learn to like anything else. Even those who keep up open-mindedly with the new, however, are likely to keep space on the shelf (or backed up on flash drives) for the songs of their youth. They are so interconnected with our life histories that when the artists who performed them die, we often feel the loss as a personal one. One recording artist from my childhood, Phil Everly, died last week at age 74. He and his brother Donald were enormously popular in the late 50s and early 60s.

Most kids become conscious (almost obsessively) of the contemporary music scene sometime around the age of 9 or 10. Nowadays, the tween demographic (9-12, sometimes broadened in definition to 9-15) is recognized as a distinct market, but in the decade straddling 1960 it wasn’t. 10-year-olds and 20-year-olds at the time listened to precisely the same thing, though in truth all of the popular music then had more in common with present-day tween fare than with what currently is aimed at 20-somethings.

Always the youngest in my class in elementary school and in high school, I would have been perpetually a step behind my classmates in popular music and every other element of popular culture (and personal development) except for one advantage: my sister Sharon (1950-1995). Sharon was a little over two years older than I, and, throughout her life, never failed to immerse herself in the age-appropriate zeitgeist, whatever it might be. Because of her I always was introduced to the contemporary thing before I ever would have found it on my own. 45 RPM singles of the Everly Brothers were an early example. They had been playing in the house since at least 1959, and by 1962 I was playing them myself.

I can’t hear any number by the two brothers to this day without being transported to my parents’ house on Main Street in Brookside where my sister spins 45s, an aqua cabover Jeep truck sits in the driveway, the aroma of something seasoned heavily with black pepper emanates from the kitchen, and our Great Dane named Woody romps in the back yard. Thanks for the memories, Phil.


Crying in the Rain reached #6 on the US charts in 1962. Written by Carole King and Howard Greenfield, performed by the Everly Brothers.


Friday, January 3, 2014

Juan Is the Loneliest Number

We now are a sufficient number of days into 2014 for all our New Year’s Resolutions to have been broken, so we all can relax a bit. New Year’s Eve itself I spent as I usually do – cozy on the couch. I give and attend my share of parties and get-togethers during the course of a year, but in the first moments of a new year I prefer to be warm and solitary while watching the frozen-toed horde in Times Square on TV. An added bonus: a bathroom is available down the hall if I need one. I always suspect the shrieks from the crowd at midnight have more to do with relief at the event being over than with any enthusiasm for the new year.

Simply counting down the minutes is a little dull, though, so for the two hours prior to midnight on the 31st, I watched Don Jon, a film well-regarded by critics, on pay-per-view. I had missed it when it was in the theaters several months ago. It is the directorial debut of Joseph Gordon-Levitt who also wrote the screenplay. (Mild spoilers follow.)

At first blush, the film didn’t look promising. Don Jon is set in NJ and is populated by characters that the cast of Jersey Shore would regard as classless. Yet, underneath his crassness, Jon (Gordon-Levitt) has enough heart that we soon start to care what happens to him – barely enough, but enough. On the surface, the film is about one man’s porn addiction, but the target is really bigger: it is about modern narcissism, of which Jon’s pastime is a symptom. Jon prefers internet porn to real women. He likes real women, to be sure, and is an expert seducer of them: hence the nickname “Don Jon” given to him by his friends. He and his pals enjoy cruising sleazy pick-up clubs, and Jon rarely leaves without an 8, 9, or “dime” on his 1-to-10 scale. Nevertheless, the women don’t ever measure up in his mind to his private sessions in front of his computer screen; in fact, he commonly retreats to his computer after his date for the evening falls asleep. The reason is that porn sex is all about him. He need bring nothing to the relationship. There is no relationship; his indulgence is entirely one-sided.

Jon meets a dime named Barbara (Scarlett Johansson). Jon’s father (Tony Danza) leers his approval when Jon brings her home to meet the family, and compliments his son (aside, at least) on the “piece of ass.” His sister Monica (Brie Larson) just rolls her eyes while remaining silently glued to her smart phone. Though Jon falls hard for Barbara, even she doesn’t equal his porn sessions, so he continues with them. Early on, Barbara catches Jon looking at porn and makes him promise to give it up; Jon doesn’t give it up, but is careful to keep his habit out of her sight.

Barbara, we soon learn, is addicted to her own preferred film fantasies: romantic comedies. After watching one, she gushes about how romantic it is: “He gave up everything for her,” she says. This response to the movie is telling, for it expresses exactly what she wants for herself. In truth, the only thing Barbara brings to her relationship with Jon is sex; otherwise she is entirely one-sided. Everything is about her, and she is highly demanding of Jon in ways large and small. Apparently, this is why she feels so threatened by porn: she senses it is a real rival to her only asset. Barbara is appalled when she snoops on Jon’s browser history; she discovers he still visits porn sites, so she breaks up with him. When Jon reveals the break-up to his family, Monica unexpectedly looks up from her phone and with perfect accuracy tells Jon (Brie Larson’s only lines in the whole movie, despite substantial screen time) that he is better off because Barbara just wanted someone to boss around.

Largely by luck, Jon finally does get a taste of what a real two-way affair is like, thanks to an older, wiser, and open-minded woman named Esther, played by Julianne Moore. Neither intends a permanent relationship, but the experience gives him some personal insight about his own shortcomings and motivations.

Not just Jon and Barbara, but nearly all the characters in the movie (Esther excepted) are focused almost exclusively on their own wants. It’s not a rare focus among people in the real world either. Just this morning I failed to meet the expectations of a jogger. It was bitterly cold last night and a substantial snow fell. So, early this morning I drove out in my 4WD through empty snow-covered streets in order to shovel the sidewalk in front of my office. Fewer than 40% of the surrounding sidewalks had yet seen a shovel, but, while I was scooping away, I nonetheless got a nasty look and an annoyed “Tch!” from a passing jogger for not having finished clearing the path for her yet; the neighbor (though not around to hear them) got curses from her for not yet having touched his walks at all. We should have been more aware of the jogging schedule.

Is this really a particularly narcissistic era? Probably not. Folks 50 or 100 years ago probably were much the same fundamentally, but there does seem to be less reticence in 2014 about expressing our self-involvement than was once socially commonplace. Perhaps this is for the best. We enter relationships better forewarned – if we enter them at all. In 2011 the game maker Konami released Love Plus+, which allows the player, Pygmalion-style, to create his or her own simulated paramours – a rather more creative alternative than that favored by Jon. As a promotion, Konami hosted a romantic holiday weekend at the resort town of Atami for players and their virtual lovers. It was a smashing success. Would such a promotion be just as successful in Atlantic City? I have little doubt.