Sunday, July 29, 2012

Cold Blooded Blues

If I could remember the name of the rocker on some VH1 documentary who remarked, “Rock and roll is not supposed to be very good,” I would give it. I don’t, but his argument is sound (pun regretted).

Different types of art serve different purposes. There is a place, in Nietzschean terms, for the Apollonian – rational, ordered, intellectual, refined – and the Dionysian – frenzied, irrational, elemental, and not very good. Most music is a blend of the two, but rock definitely leans heavily to the latter, as much as Mozart does to the former. You can’t get much more Dionysian than a mosh pit. Any rock that is so elegantly produced that it sounds out of place in a biker bar misses the point. It needs rough edges. Ideally it is good, but not too good. Country music and the blues also tend Dionysian, which is one reason the boundaries can blur so much among the three styles. The most emotionally satisfying artists in those genres are the ones who are capable of control, but don’t always exercise it. Janis Joplin had a semi-controlled chaos that worked for her; despite a smoother sound, so did Amy Winehouse.

Yesterday morning, through whatever digital magic the site uses to concoct its “recommended for you” lists, I was reminded of yet another performer about whom I hadn’t thought in a while: Lydia Pense, lead singer for Cold Blood.

As long ago as 1969 a clerk in a local record store (remember those?), who knew I liked Janis Joplin, advised me to try the album Cold Blood by the band of the same name. It was good advice. As it happens, Janis Joplin had been the one to recommend Lydia Pense to concert promoter Bill Graham. You can hear why. The two sound very much alike, with any advantage on the side of Lydia. There is one very big difference between her and Janis (and Amy), which in part may account for her being lesser known today: Lydia is alive and well and still performing with Cold Blood in 2012.

The group continues to record new material, but the first album is still the one to own, if you buy only one – the first four if you buy only four.

Is there really anything to Nietzsche’s dichotomy? Camille Paglia thinks so. Her 1990 book Sexual Personae is a peculiar blend of erudition and pop culture. It is not an easy book, but it is very much worth the effort, though she might want to consider an update with 21st century pop references. She sees the split as rooted in biology: "The quarrel between Apollo and Dionysus is the quarrel between the higher cortex and the older limbic and reptilian brains." Like Fred, Camille sees art as a bridge between them. She makes a persuasive (if sometimes unpleasant) case.

So, if you’re in a mood to stroll toward the limbic side of the bridge, I recommend giving Lydia a listen on the electronic device of your choice or, better yet, in person. Besides, I’ve always had a special fondness for those who survive the temptations of fame and fortune. “Live fast, die young, and have a good-looking corpse” (John Derek’s line in Knock on Any Door [1949]) may well improve your odds for lasting fame, assuming there is some underlying talent, too, but it’s not worth it. I much prefer to be surprised by some Ticketmaster offer: “Wow, are they still performing?” I’ve caught some great concerts thanks to such surprises.

Monday, July 23, 2012

Hellrazors Pain Pains – NJ Women's Roller Derby

The last bout back in May between the New Jersey Hellrazors and the Major Pains in Morristown, NJ ended in a 122-122 tie as the clock ran out. In an overtime tie-breaking jam, the Hellrazors pulled off a 136-129 win. Last night’s rematch at the Hellrazors’ home rink in Kendall Park had the makings of another exciting bout, and the teams didn’t disappoint.

Proceeds from the bout benefitted a charitable trust for Ana Cru, who was born March 13 with Dandy Walker Syndrome. For more on this see .

The Hellrazors leapt ahead in the opening minutes, with Cherry Mercenary picking up 4 points in the initial jam and A-Bomb adding 5 in the third. From the start and throughout the bout, the Hellrazors executed a very effective defensive strategy, with their lead jammers calling off jams whenever there was the slightest risk of the Pains picking up points. This forfeited some opportunities for the Hellrazors to add points of their own, but it suppressed the Pains’ score much more effectively. Voldeloxx put the first points on the board for the Pains, and a power jam by Ginger Ail put them in the lead. It set the pattern for a see-saw scoreboard. Cherry Mercenary, with a triple grand slam, regained the lead for the Hellrazors, 48-41. Blocking was aggressive on both sides. Hellrazor Lethal Holloway was taken down hard toward the end of the first half, but still got up to score points. At half-time the Hellrazors had built their lead to 88-60.

In the second half, a power jam by Baked Beanz restored the lead to the Pains, 103-113. The Hellrazors took it back. Heinz Catchup consistently added points for the Pains, slipping through the pack, showing real speed in the open, and occasionally exchanging hits with opposing jammers. A-Bomb was just as effective for the Hellrazors; Cherry Mercenary and Mental Block were formidable, too. A jam by Baked Beanz tied up the score at 124-124 with 8 minutes remaining. The Hellrazors nudged ahead again, with A-Bomb expanding the lead. At the start of the final jam, the bout was still too close to call. The clock ran out with the score at 138-133, victory to the Hellrazors.

It is hard to get much more evenly matched than these two teams. I’m looking forward to the next bout.

(Theme music: I know she's singing about motorcycles, not roller skates, but I couldn't resist: .)


Saturday, July 21, 2012

On Cokes and Camelot

“Everything not forbidden is compulsory,” is a line humorously borrowed by physicist Murray Gell-Mann for what he called the Totalitarian Principle in quantum physics. Absolutely everything not forbidden to particles by a handful of conservation laws has an amplitude of probability – it must happen. He borrowed the line from T.H. White’s Arthurian fantasy The Once and Future King. There, it is a motto of Antland, where it summarizes a style of governance rejected by the young Arthur under the tutelage of Merlyn in favor of the age-old liberal principle of English law, “Whatever is not forbidden is allowed.”

Arthur, however, doesn’t reign in the Land of the Free. In the US, zoning laws, for instance, typically are “permissive,” which means the opposite of what it seems to mean; it means that whatever land use isn’t expressly permitted is forbidden. Increasingly, we seem at home with the Antland motto, too, untroubled by the tag it has acquired in physics. In every aspect of our lives – commercial, public, and private – Americans are forever voting to require this and proscribe that while restricting (through taxes, licenses, permits, and so on) some other thing. The courts sometimes mitigate the laws a little, but only sometimes. There is nothing particularly Left or Right wing about the American penchant for protecting every fellow citizen not only from theft, violence, and fraud (all of which involve the deprivation of the life, liberty, or property of one person by another) but from his or her own self. Each major political wing does it in a characteristic way, but each does it. Right-wingers are more likely to ban drug use and regulate sexual practices. Under the bizarre Mann Act still on the books, for example, it is illegal to cross a state line with lascivious intent, which makes every teenage boy everywhere a federal criminal at every single crossing. Left-wingers are more likely to ban or restrict foods (such as the soda limit in NYC), mandate behaviors (e.g. buying insurance), and tax “vices” (such as sugar and tobacco) at sky-high rates.

There is nothing new about this tendency either, alcohol Prohibition (promoted mostly by the Right) being the most blatant and blunderous historical example of it. Sodas, too, came under early fire (mostly from the Left), just as they are under fire today. Take the case of the iconic American product Coca-Cola, which was nearly put out of business in 1911 by crusading FDA scientist Harvey Washington Wiley.

A little background is in order. In 1886, pharmacist and patent-medicine tinkerer John Pemberton of Atlanta, Georgia, invented a new fizzy drink made with (among other things) coca leaves, kola nuts, and sugar – coca leaves and cocaine extract were perfectly legal at the time. The very first ad (May 29, 1886) in the Atlanta Journal reads, “Coca-Cola. Refreshing! Exhilarating! Invigorating! The new and popular soda fountain drink containing the properties of the wonderful Coca plant and the famous Cola nut!” Despite ownership battles among investors and family members, the Coca-Cola company grew rapidly. The cocaine, which was not a large amount anyway, was removed from the drink in 1903, though non-intoxicant coca extracts remain in it to this day. Given Pemberton’s background, it is not surprising that he initially would market Coca-Cola not just as a beverage but as “a valuable Brain Tonic.” The company quickly dropped the latter approach, however, and so was able to escape an 1898 tax on patent medicines.

In 1911, Wiley took aim at the company. He claimed that caffeine in Coca-Cola was a drug aimed at unsuspecting children (this same argument is still heard today), and that Coca-Cola engaged in fraud because it advertised having “pure ingredients” even though caffeine was an impurity. Wiley’s objection to Coke at bottom was a moral one, as demonstrated in The United States vs Forty Barrels and Twenty Kegs of Coca-Cola when he called community leaders to testify that caffeine promoted promiscuity in young people. Government scientists also took the stand and testified to the ill-effects of caffeine on laboratory frogs. Coca-Cola countered with its own expert witnesses, who said that caffeine was not harmful in the relevant quantities. The company also defended its claim of pure ingredients: it openly declared kola nuts to be an ingredient, and caffeine occurs naturally in the nuts; since the caffeine wasn’t added separately, it wasn’t an impurity. The trial was good theater, and newspapers took sides, railing at rapacious business or overweening government, according to their bent.

In the end, the court balked at ruling against caffeine per se, since this inevitably would mean outlawing tea and coffee as well. Instead, the trial court narrowed its focus to the impurity issue; it ruled in favor of Coca-Cola, buying the company’s argument that caffeine wasn’t an additive and therefore wasn’t an impurity. The company’s relief was short-lived. The government appealed the decision, and an upper court reversed the trial court. At this point, rather than appeal to the next higher court, Coca-Cola sought a settlement; Wiley had resigned by this time, so the government lost its enthusiasm for the case, and agreed to talk. The company agreed to cut the caffeine in its drinks in half and to pay all legal costs; the government dropped the case. The company’s survival was a close call, but it lived to defend itself in court another day.

Since 1912 the formula for Coca-Cola has stayed basically the same (except for the brief and disastrous New Coke episode in 1985). It still tastes very much like the original 1886 batch, though the effects of the modern version are considerably less stimulating.

With such a long history of attempts at direct and indirect regulation of what we eat, breathe, touch, and imbibe, there is little doubt we will see more of the same. It is, of course, possible to reject the philosophies of busybodies of both the Right and the Left, and to favor just being left alone in all private matters of drugs, sex, personal safety, diet, and so on. A couple decades ago I wrote a short story called Deep Fried, which was intended to be a satire of laws against marijuana; little did I know it was less satire than prediction: .

An Icon Drinks an Icon


Monday, July 16, 2012

Back to the Mesozoic

Among the advantages to living in or near New York City (I’m in the ‘burbs), quite aside from the well-known upscale and expensive varieties of entertainment on hand, is the abundance of quirky but very professional acts of all kinds playing in small venues. They include musicians, comedians, performance artists, off-Broadway plays, and more. (To local boosters elsewhere, I know NYC is not alone in this regard, but the point still holds.)

Some of the acts go on to greater fame. The walls of The Bitter End (a small and rather dingy place) are covered with posters of past performers who became stars. The overwhelming majority become nothing of the kind; they play a few dates and then vanish without a trace. Sometimes the evanished ones are the best ones. For instance, about a decade ago my friend Ken suggested I catch Fiona Sand at Arlene’s Grocery – that’s the first club featured in the movie Nick and Norah’s Infinite Playlist. I agreed with him she had a sound and stage presence that were (in the good sense) commercial, but apparently we were wrong; eventually Fiona moved back to Norway and on to other things. (Some of her songs are still up on Myspace: .) Other artists gain notoriety without quite being stars, such as playwright/actor Charles Busch whose off-Broadway plays in the 80s were marvelous; when he went Hollywood, however, the plays really didn’t translate as well into movies (e.g. Die Mommie Die! and Psycho Beach Party), though one might yet.

Last Friday, I scanned the offerings on Playbill and, just on a whim, bought a ticket for that night’s performance of a little off-Broadway production called Triassic Parq at the Soho Playhouse. A musical comedy parody of Jurassic Park, it opens with a raptor in a cage and the familiar shout, “Shoot her! Shoot her!” No, the actors are not all in Barney costumes. They rely on a little make-up, clothes with somewhat reptilian patterns, and … well … acting to portray dinosaurs. The plot: a T. Rex spontaneously turns into a male. As you may recall from the book or movie Jurassic Park, all the dinos on the island are female to prevent breeding outside the lab, but some of the critters change. (It’s just a plot device not meant to be examined for accuracy, of course, but, as it happens, some amphibian and reptile species can change sex due to environmental factors, though usually while still in the egg.) The event causes great consternation among the dinosaurs; it shakes the peculiar faith they have developed regarding the lab and humans – and regarding the goats that mysteriously appear out of the ground at feeding time. The roles are gender-bending, with the newly male dino played by a woman and several of the females played by men. To the extent the play is serious (or at least rises above simple farce – it’s never altogether serious) the script addresses gender perspectives, intolerance of alternate viewpoints, and the normality of hypocrisy.

The play is clever and funny, though I do not see it breaking out into larger venues in the way Little Shop of Horrors leapt out of its East Village home at the Orpheum 30 years ago. If the production has a fault, it is just a little too much earnestness, but that’s not much of a fault. The humor is a mix of high, middle, and low brow. I’m fine with all three, but I couldn’t help noticing that the low got all the loud laughs. New York audiences like to think they are more sophisticated than others, but the evidence isn’t on their side. The actors must comment among themselves, “You know, if we just dropped our pants and forgot about the play, the audience would be just as happy.” By and large, that is probably true, but I’m glad all the same that the writers, producers, and cast aimed at something a little more ambitious.

So, if any dinos show up in a theater in your neighborhood (or if you already live a reasonable distance from Soho), take a ride past the Tyrannosaur pen. You might enjoy it.

Thursday, July 12, 2012

Cream Rises to the Top

So does scum, of course, but we’ll leave a discussion of that duality for another day.

As an addendum to my previous post on Generation Y, I have a book recommendation: The Fourth Turning by William Strauss and Neil Howe. Strauss and Howe argue that generational characteristics (including attitudes toward risk, relationships, wealth, individualism, communitarianism, etc.) repeat every four generations, and always in the same order. The tendency of the young to reject the standards of their parents keeps the pattern recurring. If true, it would make Generation Y (1982-2004 according to them – a somewhat atypical definition) similar to the one born 1901-1924 (substantially later than the cohort I postulated as similar), and the newest crop of kids (Generation Z?), born since 2005 by the Strauss/Howe way of reckoning, like the folks born 1925-1945. I have doubts about their analysis, and about their dividing lines between generations. Their discussion is an interesting one nonetheless, and they do chronicle shifts and swings in the culture over the years.

I’d also like to add that my query to the two Millennials regarding a characteristic song for their generation was not entirely out-of-the-blue. It was more like payback. A year or so ago these same two young people had noticed a vinyl of the Let It Be album by the Beatles on a shelf by my stereo; they then thumbed through the rest of my records, tapes, and CDs. My collection isn’t very highbrow or especially large, but it is an eclectic mix of popular music ranging from the 1940s to the current time. One of them asked me what single song best represented the 1960s. (Let It Be, which prompted the discussion, was 1970, but let’s not quibble.) It was an intriguing question to which I had no good answer. However, if you want to maintain a reputation for being knowledgeable (false though it may be), you have to exude ready confidence, so I pretended I had an answer. I raced a few titles through my head and then quickly proclaimed White Room by Cream. I rattled off a few reasons that I made up on the spot, and then extracted myself from the conversation.

The funny thing is, with time to reflect at leisure, I still think White Room isn’t a bad choice. The song is psychedelic, haunting, unorthodox, and poetic in late ‘60s fashion. While never hitting #1 on the charts (except in Australia, if I’m not mistaken), it nonetheless was on every rock station’s playlist in ’68 and ‘69. The song lacks much traditional order: it has no rhyme, alliteration, or assonance. The lines do scan, but in an unconventional way. In formal terms, they alternate pyrrhic and trochaic feet in hexameter: /in the /WHITE room /with black /CURtains /near the /STAtion /. So, like the decade itself, the song is less anarchic than it appears at first glance, and it is less profound, too. All sorts of meanings have been read into the lyrics; some listeners believe they reference Clapton’s drug addiction or the Vietnam War, for example, but Eric Clapton didn’t write the song and the British didn’t fight in Vietnam. No, just as the lyrics say (admittedly in flowery terms), at a party a man meets a woman who is romantic and primal (silver horses and yellow tigers in her eyes), but no strings can hold her so the chick leaves him at the station. He feels desolate. That’s it. But, you know? It’s enough.

As for representative songs of other generations, whether one accepts the Strauss/Howe divisions or some other order, I'll let any members of them who might be lurking pick for themselves.

White Room (1968)

Sunday, July 8, 2012

Granfalloons and Echo Booms

In Cat’s Cradle, Kurt Vonnegut invented the term “granfalloon” to describe a collection of people who make more of an utterly factitious commonality than the commonality deserves. Examples would be Rotary Clubs, Sagittarians, alumni of the same prep school, citizens of a nation, and members of the same ethnic group. Typically, beyond that one shared datum, members have little more in common with each other than they have with random outsiders. Sometimes granfallooning is a harmless and pleasant excuse for socializing; sometimes it is downright deadly and an excuse for cruelty. Whether or not they are foolish artifacts, however, these group identities influence the behaviors of the people in them and the perceptions others have of them.

Generations arguably are granfalloons, since the members within each generation vary enormously in politics, lifestyles, preferences, and circumstances. Yet, people born at roughly the same time in roughly similar cultures (e.g. Boomers in the US and UK, but not also in China) really do experience a common history and grow up in a common popular culture. Those experiences are a framework for a sense of identity. Boomers, those born between 1946 and 1964 (the year the birth rate started a nose dive), formed the largest generation in history up to that time; they were the first born into general affluence and were smugly aware of how they differed from their parents. They still are. Their successors, the Xers, revel in not being Boomers and still like their Grunge; Smells Like Teen Spirit is almost sure to be somewhere on their iPods. Generation Y, also called Millennials and Echo-Boomers, has been in the news lately. The reason: it is the first generation to outnumber the Boomers and it has come of age.

There is no definitive agreement about where the birth years of Generation Y begin and end. I’ve seen start-dates as early as 1976 and end-dates as late as 2004, but most commonly Generation Y is taken to mean the present crop of youthful adults, 18-35, which means the birth years 1977 to 1994. Commonalities? Tech is an obvious answer. They grew up with (or encountered early) the internet, cell phones, video games, and social networks. A majority lived with a single parent at some point prior to leaving high school. They are more ethnically diverse, they have finished more years of school, and they leave school carrying more debt in real as well as nominal terms than members of any previous generation. Illegal drug use is substantially lower among them than among their parents at comparable ages, though legal drug use (e.g. Paxil, Xanax, Ritalin) is higher. Older folks often complain they are slackers, but older folks always say that about the young, so the reality is unclear. Some suggest that digital interconnectivity has damaged their education (see The Dumbest Generation by Mark Bauerlein) since they are less likely to memorize what is a click away on the Net; yet, since there is, in fact, less need to memorize what is a click away on the Net, the jury is still out on this one. Oddly (and I have no explanation for this), they get their driver’s licenses later, and that trend continues. In 1998, 65% of eligible drivers age 19 or younger had licenses; today only 46% do. When I was in college, men were a small majority of the college population (Vietnam, the draft, and the 2-S student deferment had something to do with that), but today women earn 60% of bachelor’s degrees, and a majority of master’s degrees and doctorates.

So, yes, the stats for Generation Y are different from the stats of earlier generations. But then, current stats are always different from older ones. If there is a generation to compare with Y, perhaps it is the one born 1877 to 1894: Carl Sandburg, Margaret Sanger, H. L. Mencken, Dorothy Parker, Mae West, Aldous Huxley, E. E. Cummings, et al. That cadre felt the 20th century belonged to it in much the same way Millennials feel the 21st is theirs; they were comfortable with telephones, automobiles, and electric appliances in ways their horse-and-buggy parents were not. The Philippine War for the US and the Boer War for the UK were similar to the grim experience of Iraq in the 2000s, and the Panic of 1907 was a financial crisis on the order of the one of 2008.

Let’s hope the similarity ends there. The 20th century got worse – insanely worse – before it got better.

I asked a couple Y-ers what song represents the Generation musically. They hemmed and hawed but finally agreed on this by Adele as being most “pan-niche,” which is a term I think I understand. (Party Rock Anthem by LMFAO was a contender also.) I’ll take their word for it.



Monday, July 2, 2012

John Adams: “The second day of July, 1776, will be the most memorable epoch in the history of America.”

All American history buffs know that the Continental Congress declared independence on July 2, 1776. The signing ceremony of the Declaration of Independence a couple days later was just a bit of theater for the press.

All the same, the ceremony on the 4th served some lasting purpose. Thomas Jefferson’s handiwork has some nice words in it, which still sound pretty good today, and the press-op helped make them famous. Congress modified his original wording here and there prior to the vote, and wherever it did the modification was for the worse. Congress hasn’t changed its habits much in that regard. Jefferson’s original draft of the Declaration, for example, blamed slavery (not really fairly) on King George: “He has waged cruel War against human Nature itself, violating its most sacred Rights of Life and Liberty in the Persons of a distant People who never offended him, captivating and carrying them into Slavery in another Hemisphere, or to incur miserable Death, in their Transportation thither.” The language indicated an aspiration to end the practice after victory, and that was precisely the objection to it from key slaveholders. (Tom himself was a contradiction: he repeatedly decried slavery and he tried to prevent its expansion into the West; yet, to the end of his days, he continued to own slaves, not even freeing them in his will as George Washington, another nominally anti-slavery slaveowner, had done.) Congress decided to fight one war at a time and to leave the fight over slavery for another day. The paragraph was excised, a decision unfortunate on innumerable levels.

If you trip up anyone on the question of when Congress declared independence, you probably can trip the same person by asking who was the first President of the United States. For some reason, Americans tend to forget completely that the current Constitution is the second one. Under the first, the Articles of Confederation, seven Presidents, each with a one-year term, preceded Washington: Hanson, Boudinot, Mifflin, Lee, Gorham, St. Clair, and Griffin. The first, John Hanson, was responsible for commissioning and adopting several enduring symbols, including the Great Seal of the United States and the Presidential Seal, both still in use.