Tuesday, June 28, 2011

We Did It Before and We Can Do It Again

Two years into the least inspiring economic recovery in a century, we continue to see anemic job growth, an abysmal housing market, and the ongoing threat of sovereign debt default.

Yet, recall earlier events: a reeling financial system and unprecedented bank bailouts; a stock market collapse on the scale of 1929; plunging real estate prices; rising unemployment. No, I’m not talking about 2008. I mean two decades earlier, when the Savings and Loan crisis sparked the stock market crash of 1987. The recession that followed technically ended in 1991, though conditions still felt bad enough in 1994 that both houses of Congress changed parties in a sweep election.

Preceding the S&L troubles and the ‘87 crash was an asset bubble in real estate. While prices were still rising by double digits per year, I recall the arguments prevalent in my own real estate industry that the fundamentals no longer applied to housing. Americans, it was said, were a nation of homeowners who viewed houses as a "safe haven" investment; prices were therefore decoupled from rents and other traditional measures of value and so had no natural ceiling. Prices might stall in a recession, but (except in the odd local market, such as Florida with its large stock of second homes) they wouldn't drop substantially. The fundamentals chuckled at our pronouncements; in 1988 they applied with a vengeance. Homeowners and commercial property investors alike took a beating.

Lesson: whenever folks start telling you the fundamentals don't apply, head for the exits. Whenever "this time is different" for reasons X-Y-Z, reasons X-Y-Z are wrong.

This Time is Different: Eight Centuries of Financial Folly by economists Carmen Reinhart and Kenneth Rogoff, published by Princeton University Press in 2009, is one of the early contributions to the literature on the 2008 financial crisis, and still one of the best. Yet it does something more important than analyze the last crisis: it informs us about the next one. The bulk of the book is filled with charts, compilations of historical data, explanations of methodology, and lists of sources; necessarily, this is rather dry stuff, but it gives scholarly substance to the authors’ arguments. Those arguments are anything but dry.

Financial crises, Reinhart and Rogoff tell us, are recurrent "equal opportunity crises" that take place in every type of economy and regulatory environment. They happen in advanced economies and in emerging ones, in free market economies and command ones, in largely unregulated markets and in heavily regulated ones, in socialized economies and in ones with no social safety nets. Every country has had them. No magic mix of policies prevents them. There are, however, warning signs when one is imminent, and, by 2006 in the USA, every single one of them was flashing bright red. In retrospect, it seems that any informed and reasonable person should have seen them. (I didn't, despite having experienced 1987.) The refusal of investors and of government policy-makers to acknowledge the signals was itself another big signal; the widespread belief that “this time is different” and that "the old rules of valuation no longer apply" is characteristic of every bubble. The rationalizations for this belief vary. Investors may point to changes in the structure of capital markets, to shifts in global trade, to central bank policies, or to (Alan Greenspan's line) the “robustness” of modern financial institutions – which is to say to X-Y-Z. "This time" supposedly was different just before every financial crisis of the past 800 years.

(Just out of curiosity, I'd like to know how the authors' own investments fared 2007-2008, though that doesn't affect the validity of their argument one way or another.)

The authors address sovereign debts and defaults as well as private ones and, in 2009, presciently warned of the risk of full-blown default crises in Greece, Portugal, and several other countries.

Their final analysis is sobering. Financial crises will happen again, and no package of policy-fixes cobbled together by Congress will prevent them. They will happen because "a financial system can collapse under the pressure of greed, politics, and profits no matter how well regulated it seems to be." The authors don’t just throw up their hands, however. "Even if crises are inevitable, there must be at least some basic insights that we can gather from such an extensive review of financial folly." They then give some basic advice on how at least not to make things worse while picking up the pieces.

For those of us who are not policymakers but private investors, the advice is to 1) avoid the this-time-is-different mindset in the run-up to the next crisis, and 2) avoid it in the one after that.

Sunday, June 26, 2011

HG and Me

The life of pioneer science fiction author H.G. Wells (1866-1946) did not overlap mine, yet there is a sense in which he is an old friend.

My mom was a crafty sort in her own way. She wanted to encourage her kids to read, but knew their contrarian natures. Overtly pushing books on us would have been counterproductive. So, she simply brought into the house the kinds of reading material likely to appeal to my sister and me. She put them where we would find them, and left it at that. She had none of the snobbish prejudice so common at the time against comic books – she figured reading was reading, and anything that fed the habit was good. She supplemented the comics with magazines, children’s literature, and more ambitious material, including Mark Twain and Sir Walter Scott. The strategy worked. Both Sharon and I became early and lifelong recreational readers.

The first full-length legitimate novel I read cover to cover – not counting Dr. Seuss and the like – was The Lost World by Sir Arthur Conan Doyle. The 1959 hardcover is still on my shelf. The second was The War of the Worlds by H.G. Wells, in an edition that also included about a dozen of his short stories. I actively sought out more of his work afterward, and, a half century later, I’m still a fan. My appreciation only grew as I came to understand more of the wry humor and social commentary in his writings.

H.G. Wells has had a very mixed history on the screen.

The 1960 Time Machine is excellent, though its anti-war theme deviates from the capital/labor theme of the book; even the anti-war message is ambiguous, since Rod Taylor whips up the Eloi to battle the Morlocks. The 2002 version is visually stunning but has very little to do with the book.

The 1953 The War of the Worlds is superior to the 2005 Tom Cruise version (which in some ways doesn’t even make sense), though I’d still like to see one set in the 1890s when the book was written.

The Island of Lost Souls (1932) is the best version of The Island of Doctor Moreau, no doubt because HG himself had a hand in it; later versions tend to lose the central message that we are all wild beasts tortured into the (sometimes unconvincing) appearance of civilized beings.

He also wrote the screenplay for Things to Come, based on his novel The Shape of Things to Come. It predicted the Blitz with frightening precision.

Not a single version of The Food of the Gods has come off properly; every one has become a simple monster movie, sometimes with an eco moral. In the book, the giants include humans, whom HG plainly favors over the masses of “little people” trying to bring them down (a little unsubtle elitism there, though not of a conservative variety).

The Invisible Man (1933) worked well.

The First Men in the Moon (1964) was a solid adaptation, too, even though it got the details of a near-future moon landing wrong; the bulk of the film, a flashback to the turn of the century, followed the book well and did it with pretty good fx for the time.

If a miniseries counts as a movie, The Infinite Worlds of HG Wells isn’t bad. It handles several of his short stories with wit and style. It does misrepresent his relationship with his second wife, Amy Catherine (it doesn’t mention the first at all), but, hey, it’s a movie. Advocates of Free Love, HG and Amy Catherine were in fact far more unconventional than depicted; among HG’s mistresses were birth control activist Margaret Sanger, writer Amber Reeves (with whom he had a daughter), and feminist novelist Rebecca West (with whom he had a son).

We tend to forget what a lively intellectual time the turn of the century was. (I mean, of course, c.1900, not c.2000.) True, it saw the last and fullest bloom of reactionary Victorian morals, but in the same garden was a radical rebellion against them. Oscar Wilde (who helped young Wells get published), G.B. Shaw, Emma Goldman, Aleister Crowley, W.B. Yeats, Victoria Woodhull, and other socialists, anarchists, feminists, eugenicists, free love advocates, neo-pagans, and revolutionaries of all types abounded. Wells was very much a part of that milieu, and it permeates his work without (usually) overpowering the literary merits.

Unless someone invents the time machine described by Wells in The Time Machine (1895), I’ll never meet old Herbert George face to face. Nonetheless, through his books he has been a far larger part of my life than most of the people I have met in person, and I am in his debt.

Thursday, June 23, 2011

The Future Was Then

From the bleachers on Tuesday, I watched 310 West Morris Mendham seniors file to the podium to receive their high school diplomas. Afterward, in their procession to the exit, many of the new grads already were reaching for their BlackBerries as they neared the door. Noticing this, I couldn’t help thinking about what had changed in the (ridiculously numerous) years since I strode be-robed in black with my diploma in hand – and about what, more surprisingly, hadn’t.

My teenage years were a time when hopes for the future were literally sky-high. In 1968, 2001: A Space Odyssey told us that within a few decades there would be passenger flights to the moon, sentient computers, and a manned mission to Jupiter. All of this seemed a reasonable projection at the time. Since the beginning of the 20th century, after all, the world had transformed from one dominated by horses and buggies to one with televisions, jetports, IBM 370 mainframes, car phones, superhighways, consumer electronics, satellite communications, frozen foods, and space flight. Why wouldn’t the next decades see changes just as radical?

It didn’t work out quite that way. I’m not dismissing the communications revolution, which has put a phone, a wireless telegraph, an entertainment center, and an encyclopedia in everyone’s pocket. Still, someone who fell asleep a half century ago would, upon waking, need no more than a single day of instruction to handle the new IT well enough to get by – not proficiently, but adequately. The tech really isn’t very difficult to use in the most basic ways. Otherwise, our Rip Van Winkle (the additional TV channels notwithstanding) wouldn’t find daily life all that different. The social changes of the past 50 years, such as the increasing dominance of women in academics and the workforce, have outweighed the consumer tech ones; yet even the social changes were already clearly visible as trends back then. (Hugh Hefner’s girlfriends remain the same age, though.)

The pace of change may pick up again. We are warned by futurist gurus of a coming technical singularity when machines become smarter than we are, thereby changing the nature of civilization forever. The projected date for this event keeps getting shoved further and further down the road. (One wonders if it is like fusion power: always 20 years away.) Unless, against all current expectations, biotech delivers on its early promises, I don’t imagine I’ll be witnessing a high school graduation 50 years from now. If somehow I do, I suspect that, as the high school band plays Pomp and Circumstance, I’ll still be struck more by what hasn’t changed than by what has. I also suspect that the most radical differences will be social rather than electronic or photonic.

What social trend is the most obvious at the moment? I’d say it’s the growing cadre of lifelong singles and the increasing rarity of traditional two-parent families. Outside of LGB circles, marriage is viewed ever more warily by young and old alike. I’ll bet that trend continues for 50 years. Anyone want to offer me odds?

Dick Tracy's Smart Phone (1967)


Monday, June 20, 2011

The Scarlet Hoodie

Last night I slipped a DVD of Red Riding Hood (2011) into the player. Well, I figured, at least Amanda Seyfried offers some eye candy.

Perhaps my experience benefited from low expectations. Having seen various reviews of this film that clustered somewhere between “eh” and “so-so,” I was surprised to find myself enjoying it. It may be a rare movie that works better on a home screen than in a theater. (It was released opposite Battle: Los Angeles, which is definitely a big-screen movie.) Red Riding Hood is no blockbuster, nor is it a truly frightening horror film. Nonetheless, if you’re a person who unapologetically enjoys the sort of movie typically shown on Syfy or the Chiller channel, you’ll probably like this also – and the production values are 10 times better than the usual stuff found there. The small medieval village where the story is set is beautifully recreated and filmed. Amanda Seyfried is as good a pick for the title part as is possible, and the other cast members all play their parts well enough.

As for the central werewolf mystery, I’ve seen a few complaints to the effect of “I saw it coming.” Well, I didn’t exactly. To be sure, the hairy beast turns out to be one of a handful of logical suspects, but that is how a fair mystery is supposed to work. Fans of mysteries consider it cheating when an author offers no proper clues amid the red herrings; they hate it when, in the final chapter, the killer turns out to be some obscure person with no previously disclosed motive. The film works fine on this level.

All in all, this is a moody and different take on an old tale of a young woman and a big bad wolf. Don’t expect a gore-fest or action-adventure, but, if you take it on its own terms, you just might surprise yourself by enjoying it, too.


Friday, June 17, 2011

Sound Effects

One of the advantages to living alone (and to sharing no common walls with neighbors) is the liberty to do noisy things at any hour of the day or night without having to explain yourself to anybody. At 2 a.m. last night (or this morning, if you prefer), sleep, by no deliberate choice of my own, was not on the agenda. So, I let my inner adolescent out for a stretch and cranked up the stereo.

My taste in music is not particularly high-brow, and it lurches oddly from big bands to blues to hard rock to “alternative,” however one defines that last one. I can tolerate, but am not a fan of, modern pop. The less said about Eminem the better. I’m also very unlikely to play country – unless The Unknown Hinson can be said to count (http://www.amazon.com/Future-Hinson/dp/B0001KL4IQ ). Last night I opted for the soundtrack to Scott Pilgrim vs. the World, a surreal movie best summed up as

All the world’s a video game,
And all the men and women merely avatars.

I was surprised at the time of its release that this film didn’t do better with its target demographic, to which I definitely don’t belong. I wasn’t surprised to see its interestingly eclectic music released as a cd (http://www.amazon.com/Pilgrim-Original-Motion-Picture-Soundtrack/dp/B003SG810Y ). On balance, it proved a good choice with which to spend a dark hour. However, the oldest track on this mostly very contemporary cd is Under My Thumb by The Rolling Stones. Hearing it prompted me to follow up the album with a cd of Mick and the guys. The first track to play on this second cd was Time is on My Side. No it’s not. No one past his 20s ever would write those lyrics.

I turned off the stereo and went to bed. A much relieved cat settled herself on the bed, too. Sleep arrived. Then daylight.

I’m not quite sure what evolutionary role a taste for music and poetic lyrics serves; I’ve read many theories, but none has been altogether convincing. I do know that those wee-hours tunes were more rejuvenating than an extra hour of slumber would have been. As for the song that restored thoughts of sleep… well… whatever time may bring one day, today, at least, is pleasant. The radio by my desk beckons. I think some background music is in order.

Sunday, June 12, 2011

Corporal Punishment

I’m not big on spectator sports in a general way. Truth is, unless one of the teams is from New York (in which case the topic is sure to have come up in local small talk), I probably won’t be able to tell you who is playing in either the World Series or the Super Bowl until I hear radio and TV ads just prior to the broadcast. Yes, I’m that clueless.

There is an exception, as some readers may know: women’s roller derby. (For a brief description of the sport’s origins and a video explaining the terms and rules, see an earlier post Wheel Appeal: http://richardbellush.blogspot.com/2011/04/wheel-appeal.html .) Make no mistake, it is a real sport, not theater; the hits and scoring are legitimate. Yet, though not theater, it is in many ways theatrical, which is part of the attraction; the skaters are not afraid to embrace high camp, on-track personas, and aggressive anti-elitism. Nor are the skaters afraid of bruises, which none can escape in this contact sport. While competing hard, they seem to be having fun in a way that few players of other sports are. I know they’re fun to watch – and yes, of course, that they are not 275-pound men in football gear is also part of the appeal.

The local team playing closest to my door is the Corporal Punishers of the Jerzey Derby Brigade (see http://www.jerzeyderby.com/ ), and last night’s bout in Morristown, NJ, vs. the Black River Rollers from Watertown, NY, was anything but ordinary. The Punishers built a commanding lead in the first few jams and never relinquished it. Heinz Catchup, Miss USAhole, Assault Shaker, and Maggie Kyllanfall one after the other racked up grand slams (5-point scores possible only by lapping the other jammer), as did Pixie Bust. That makes it sound easy. It wasn’t. Not one of those points was uncontested, and Watertown’s defense, tough from the start, stiffened throughout the bout. In fact, there were more and harder hits and knockdowns than I’ve seen in any other match. Mea Slapaho and Bitchin N’ Rollin from Watertown deserve special mention for aggression (that’s a compliment in this context). Morristown’s Bruta Lee went down (and out) very hard, stopping play for some minutes, but she not only got up but soon returned to the track. Morristown’s defense also was strong, earning blocker and team captain Doom Hilda MVP honors. The final score of 227-38 is impressive, but misleading; the teams were not mismatched, and the numbers easily could have been very different – especially if the penalty box (rarely empty) had been occupied somewhat differently at key moments.

This may not be a sport for everyone, either as a participant or a spectator, but I’ll take it over anything on ESPN any day. If you haven’t tried it, do. There’s almost sure to be a team nearby.

Friday, June 10, 2011

High (School) Jinks at the Movies

High school largely created teenagers as a distinct subculture by forcing them together for four years, and by the 1920s, many businesses were targeting teens with specialized product lines. The movie business, if anything, was slow out of the gate – perhaps because few of the first generation of studio moguls went to high school. College movies actually preceded high school movies as a significant genre even though only a tiny percentage of the population went to college in the 20s and 30s. Some of these early college films are edgy. Today’s college movies are raunchy, but that is not at all the same thing. If you haven’t already done so, try The Plastic Age (1925), in which party girl Clara Bow corrupts a young undergrad by diverting him from study and sports, or Finishing School (1934), in which bad girl Ginger Rogers explains to new arrival Frances Dee that it’s smart to appear good, but stupid to be good. After 1934, the Hays decency code was enforced seriously, and college movies lost their edge for the next 30 years.

In the 1930s, the studios finally discovered high school, and almost from the start the movies invented as many teen cultural fashions as they reflected. There were exceptions, but, in general, the 30s and 40s high school movies were wholesome with a vengeance. Witness the first four Nancy Drew movies (pretty good), the Andy Hardy series (hopelessly corny, but some viewers love them), 1942’s The Major and the Minor (good with a dash of propaganda), and the pleasant The Bachelor and the Bobby-Soxer (1947) with little-girl-no-more Shirley Temple frightening an older Cary Grant with her crush. Somewhat surprisingly, the family values were ditched in the 1950s. For all the very real social repression and family-friendly TV of the time, 50s high school movies were decidedly less innocent. Socially conscious films appeared, such as Blackboard Jungle and Rebel Without a Cause, as did exploitation films such as Girls Town and High School Confidential.

High school movies evolved with the times over the next few decades, really hitting their stride in the 80s with a full range of sleaze (Angel), comedy (Ferris Bueller’s Day Off), drama (The Breakfast Club), and countless horror movies (Student Bodies). The genre has remained strong ever since. Commonly found on lists of “the best high school movies” of the last two decades are Cruel Intentions (wicked fun), 10 Things I Hate About You (a clever crib from Shakespeare), Mean Girls (genuinely funny), and Superbad. I must admit the inclusion of the last one mystifies me, but it regularly appears on the lists, so perhaps I’m missing something.

The genre is successful not just because teens are the best cinema customers and like seeing movies about themselves, but because these films strike a chord with anyone who ever has been to high school. (PTSD?) The best high school movies are of their time, yet timeless. You can’t get more 80s than the Brat Pack movies, for example, but it is the rare teenager in 2011 who hasn’t seen them all.

My personal all-time favorite is arguably not part of the genre at all even though it does take place in a school full of teenagers. The Prime of Miss Jean Brodie hit theaters in 1969, but it was set in the 1930s. Miss Brodie, played by Maggie Smith, better known today as Minerva McGonagall in the Harry Potter films (themselves high school movies of a sort), is a teacher at a girls’ school in Edinburgh, Scotland. She is nonconformist, romantic in the broader sense, and inspiring in every way except that…well...she’s a fascist. Literally. It’s hard not to admire her defiance of the prudish, repressive Victorian values she encounters at every turn, especially from the school’s conservative headmistress, but Miss Brodie truly is objectionable in ways the headmistress doesn’t really grasp. As Brodie’s most trusted student and “special girl” Sandy eventually (and correctly) charges, “You are dangerous and unwholesome, and children should not be exposed to you!” Yet, even in rebellion, Sandy owes to Miss Brodie’s influence her strength and independence of mind about moral choices. Sandy is perfectly played by a young Pamela Franklin.

The Prime of Miss Jean Brodie tries to explain the appeal fascism had (and has) to the most surprising people, though this is just a single part of a multilayered movie.

I hesitate to give Brodie a general recommendation, since it surely is not for everyone. But if you’re up for something set on a school campus that is stylish and thoughtful, this one is worth a try. Besides, I saw and liked it as a high school junior, and I was anything but evolved.

Monday, June 6, 2011

Tassels

The other day, I received a request (to which I agreed, of course) to attend the high school graduation of a neighbor. There is a lot of that going around this month. Actually, June has no monopoly on graduation ceremonies. We attend so many schools, classes, and supplementary training sessions these days that some of us are always graduating from something. We receive diplomas from preschool, college, vocational school classes, and rehab. (I’m not suggesting a necessary life arc there.) They are delivered throughout the year, but June is the month for the most noteworthy ones, and high school diplomas remain the most noteworthy of all.

Once upon a not-so-very-long-ago time, high school graduation marked the transition to adulthood. Biologically and legally it still pretty much does – though the law for 18-21s has grown more muddled lately in the US. Socially, it doesn’t so much anymore. We commonly call 21-year-old college students “kids.” Newlyweds at 19 shock us, and their families almost surely tried to talk them out of it. Outside of the military, few 18-21s are fully self-supporting. For all that, high school attendance and high school graduation remain fundamental rites of passage.

As a central cultural experience, high school is scarcely a century old. Before 1880, only small numbers of privileged offspring attended high school (aka secondary school), and nearly all of those schools were private. Public high schools sprang up alongside the private ones in the next two decades, but even by 1900 only 6% of American teenagers graduated from any of them. None of my grandparents, all born within a few years of 1900, went further than the 8th grade, which was normal for their generation. The big change came between 1910 and 1920. By the 1920s, high school was the majority experience of teenagers, with attendance enforced by law. Today, nine decades later, almost everyone attends high school and nearly 90% of Americans finish.

By throwing 13-19-year-olds together en masse, high school arguably created “teenagers” as a distinct subculture. High school became the time of the Vision Quest. It is the place where and when most of us define who we are – and ever after there remains a part of us that is 16 years old and waiting impatiently for the 3 o’clock bus.

For all the vast amounts of taxpayer money spent on it, what high school doesn’t do (except for a quite small minority of students) is impart much of academic value. Unorthodox private schools such as Sudbury, which has no curriculum or grades (teachers and educational resources are available for anyone who chooses to use them, but students can play baseball all day if they wish), confound traditional educators by turning out graduates who do as well on SATs and in college as graduates of public schools. The force-fed fare of traditional schools exits the minds of most students immediately after final exams. Few people retain skills after 12th grade that are one whit beyond what they had in 8th grade; in future careers, most rely on those 8th grade skills and on whatever subsequent college or vocational training relates specifically to their jobs. If you’ve ever watched adult professionals struggle on an episode of Are You Smarter Than a Fifth Grader?, you know I am being generous with “8th grade” skills. True polymaths are rare in the world, and they probably would have sought out broad knowledge on their own anyway.

This is not to say there is no value at all to the boot camp for life that is high school. The hierarchies and social stresses we encounter there really do parallel those in the adult world. Classroom politics and office politics are much the same. For this reason at least, the diploma stands for something. I will applaud when my neighbor receives hers.

Thursday, June 2, 2011

Jellybean Wisdom

When it comes to guessing answers to simple factual questions (e.g. “What is the temperature of this room?” or “How many jellybeans are in that jar?”), crowds do better than individuals. The averaged answer of a sizable crowd is almost always nearer to the truth than the typical individual answer. It is a statistical phenomenon: many of the individual misses are wild, but the extreme ones largely cancel out. Social scientists have been demonstrating this in experiments for the past century.
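
For the statistically inclined, the cancellation is easy to see in a toy simulation. The Python sketch below is mine, not anything from the literature, and the error model (mostly moderate Gaussian misses, plus the occasional wild one) is an assumption made purely for illustration:

```python
import random
import statistics

# Toy model of the jellybean experiment. The noise model is an invented
# assumption for illustration: most guessers miss moderately, a few miss
# wildly in either direction.

TRUE_COUNT = 1000   # jellybeans actually in the jar
CROWD_SIZE = 500    # number of guessers

random.seed(42)     # fixed seed so the run is reproducible

def one_guess():
    if random.random() < 0.1:
        # a wild miss, anywhere from zero to double the truth
        return random.uniform(0, 2 * TRUE_COUNT)
    # a moderate miss centered on the truth
    return max(0.0, TRUE_COUNT + random.gauss(0, 300))

guesses = [one_guess() for _ in range(CROWD_SIZE)]

crowd_error = abs(statistics.mean(guesses) - TRUE_COUNT)
typical_error = statistics.median([abs(g - TRUE_COUNT) for g in guesses])

print(f"crowd average misses by      {crowd_error:6.1f} beans")
print(f"typical individual misses by {typical_error:6.1f} beans")
```

On a typical run the crowd average lands within a couple dozen beans of the truth, while the typical individual is off by a couple hundred – not because anyone in the crowd is smart, but because the overestimates and underestimates offset one another.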

In 2004 James Surowiecki, in a popular book, The Wisdom of Crowds, went a bit further and argued that crowds produce better decisions than experts on broader questions if there are (1) diverse opinions, (2) independence of participants (so social pressures can’t impose groupthink), (3) decentralization, and (4) a proper method for averaging opinions.

It is tempting to perceive the virtues of democracy in this, and some folks do just that. Yet, even if Surowiecki’s proposal has merit, his crowd is a very artificial one. In real-world situations, crowds are seldom wise and his four conditions are rarely met. Anyone who ever has witnessed a first-class riot (as I have) has seen a crowd do things far dumber and scarier than the individuals in it ever would do alone. Anyone who ever has bought into an asset bubble (as I have) is feeling a little dumb himself.
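
Surowiecki’s independence condition is the one real crowds flunk most often. To see why it matters, here is the same toy jar with one change – again my own invented setup, not an experiment from the book: every guesser partly anchors on a single loud, wrong public guess.

```python
import random
import statistics

# Same toy jar, but guessers are no longer independent: each one partly
# anchors on a loud, wrong public guess. Both the anchor value and the
# anchoring weight are arbitrary assumptions for illustration.

TRUE_COUNT = 1000
CROWD_SIZE = 500
LOUD_WRONG_GUESS = 2500   # the confident voice everyone hears
ANCHOR_WEIGHT = 0.7       # how much each guesser defers to it

random.seed(42)

def one_guess():
    private = TRUE_COUNT + random.gauss(0, 300)   # independent private signal
    return ANCHOR_WEIGHT * LOUD_WRONG_GUESS + (1 - ANCHOR_WEIGHT) * private

guesses = [one_guess() for _ in range(CROWD_SIZE)]
print(f"crowd average: {statistics.mean(guesses):.0f} (truth: {TRUE_COUNT})")
```

The private noise still cancels, but now it cancels around the anchor: the crowd average settles near 0.7 × 2500 + 0.3 × 1000 = 2050 beans, more than double the truth. Groupthink doesn’t stop the averaging from working; it makes the averaging converge confidently on the wrong answer.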

Political democracy in particular is not analogous to counting jellybeans. For one thing, elections do not average opinions. One candidate wins and another loses. A ballot proposal succeeds or it doesn’t. Winner takes all. No averaging takes place. There is no statistical nudge toward correctness. Groupthink (or two competing groupthinks) is the norm. The riot, rather than the jellybean count, is the more apt model for crowds in politics. Shakespeare isn’t misguided when he depicts the Roman crowd swinging impetuously from support of Brutus to support of Antony.

I'm not arguing against democracy. It has its virtues. Primary among them (if I may crib from Churchill) is simply that it isn't one of the other types of government. Wisdom may be too much to expect from it though.