Monday, August 27, 2018

Changing Minds


The title of Michael Pollan’s How to Change Your Mind sounds like a self-help book of a type much needed in an age when we are far too apt to make a virtue of being closed-minded toward opinions that differ from our own. In a sense it partly is, but not in the way one might think. The subtitle explains the change the author intends: What the New Science of Psychedelics Teaches Us about Consciousness, Dying, Addiction, Depression, and Transcendence. Pollan’s book is an informative compendium of the modern history and current state of psychedelics.

It has been 80 years since Sandoz chemist Albert Hofmann synthesized lysergic acid diethylamide (LSD), which he called his “problem child.” It is 50 years since LSD became illegal in the US. It is classified by the Drug Enforcement Administration as a Schedule I drug, which by definition is a drug that has a high potential for abuse, has no accepted medical use, and is unsafe to use even under medical supervision. Possession and sale are federal crimes, and penalties can be severe. (Marijuana is still a Schedule I drug, by the way, though the feds at present are choosing to ignore intrastate sales in states that have legalized the substance at the state level.) A thing is not so just because government officials say it is, of course. LSD does have medical uses, though the nonmedical ones are just as intriguing.

Pollan didn’t initially intend his research for this book to become personal, but, perhaps unsurprisingly, it did. His inquiries into the history, chemistry, and applications of psychedelics led to his own supervised (not officially supervised, but supervised) trips on LSD, psilocybin, and 5-MeO-DMT (toad venom). The lengthy chapter on his trips is the least interesting section since such experiences are by their nature personal and difficult to communicate verbally except in banalities. (My own short story on the subject, however, can be found at Brown Acid.) The reaction to these substances is highly context-dependent and is also dependent on the predispositions of the user; religious people are likely to interpret their experiences as religious, for example, while secularists are more likely to speak of a more generic “one with the universe” sensation. The chapter is still useful, however, for understanding the author’s own mindset and biases.

Timothy Leary was the face and voice of LSD in the 1960s, but according to Pollan he did psychedelia no favors. His antics merely scared the straight establishment into outlawing the substances. The more interesting First Wave research, some of it notoriously for the US military, was done not at Harvard by Leary in the 60s but by others in the 40s, 50s, and 60s. Civilian researchers included Humphry Osmond, Abram Hoffer, and Al Hubbard. Osmond supervised Aldous Huxley’s experiments with mescaline in 1953; Huxley wrote the influential The Doors of Perception the following year and later would experiment with LSD and befriend Leary. Studies conducted up through the early 60s showed real promise in treating alcoholism, drug addiction, and depression, mostly by changing the perspective of the subjects. LSD binds to serotonin receptors in the brain, which is what SSRIs (selective serotonin reuptake inhibitors: standard pharmaceutical treatments for depression) do, though the effects are more radical. Bill Wilson, co-founder of Alcoholics Anonymous, credited his insights to a 1934 experience with belladonna (which has hallucinogenic properties), and he experimented with LSD in the 50s. Research largely halted, however, with the Schedule I designation.

Leary was certainly antic, but there is no denying his popular influence, much as Pollan would prefer not to dwell on it. So, I will make one point regarding his most famous dictum, even though Pollan doesn’t, since it relates to the change of perspective at the core of the therapeutic and transcendental uses of psychedelics: “The only way out is in. Turn on, tune in, drop out.” He was not urging people to crank up the stereo, drop acid, and give up. As he explained whenever asked, he meant that the way to personal freedom is through inner space: turn on and expand your mind (yes, he advocated psychedelics to help with that), tune into yourself, and drop out of the rat race so many of us mindlessly run. Be free instead; create your own destiny. That’s not the same as saying “be a lump on a couch,” though I suppose one’s destiny could be that; some people achieve that destiny without psychedelics. Some past users are very hardworking indeed. Steve Jobs attributed his creativity in part to his experience with LSD. He gibed at Bill Gates for not having tried it, though Gates said he in fact did.

Since 2000 there has been a renaissance of experimental research into psychedelics (psilocybin and LSD in particular) at legitimate facilities, including at NYU, Columbia, and Johns Hopkins. (Even Schedule I drugs can be grudgingly granted legal exceptions for some experimental studies.) Once again the substances are proving useful in combatting addiction and depression. They also are proving valuable for relieving anxieties in terminal patients. Whether the trips of the dying are felt as spiritual or simply as the loss of ego, they seem to help bring peace of mind.

Substances as powerful as these can be dangerous (as is alcohol), of course, and the risks must be acknowledged. They expand most minds but have been known to shatter a few. That is reason enough in the minds of many to continue to outlaw them. So, it is anyone’s guess whether they ever will regain legal status even if just for (non-experimental) therapeutic uses. It seems unlikely, but back around 1900 the notion that it could be any business of government to restrict at all what people chose to put in their bodies also seemed unlikely: cocaine and laudanum (opium and alcohol) could be bought over the counter at the time. Times change. Perhaps Hofmann’s problem child will again be allowed to come out and play.


Original Broadway Cast: Walking in Space

Monday, August 20, 2018

Vicious Mythic Parties: Three Reviews


Halestorm: Vicious (2018)
Vicious (2018) is the fourth studio album by Halestorm, not counting CDs of cover songs. Their previous album, Into the Wild Life, got mixed reviews, largely because it wasn’t what fans expected. Lzzy Hale’s raw vocals, backed by guitar, bass, and brother Arejay Hale on drums, have been delivering basic hardcore power rock’n’roll since 2009. A few tracks on Into the Wild Life fit that description, but the band also experimented with various other sounds, including country and pop. It wasn’t bad, but much of the fan base wasn’t happy. Fans have no reason to complain about Vicious. “What doesn’t kill me makes me vicious,” sings Hale on the title track. The rock is back. From the opening song “Black Vultures” to the speed-rock “Uncomfortable” to the melodic “Killing Ourselves to Live” to the final sentimental (acoustic but un-silent) “The Silence” and everything in between, the album keeps the edge in sound and lyrics that rock should have, “just to make you uncomfortable.” Thumbs Up.

**** ****

Buffy the Vampire Slayer: Myth, Metaphor, and Morality by Mark Field
I didn’t watch Joss Whedon’s Buffy the Vampire Slayer during its initial run of 1997-2003. Those years were tumultuous for me personally, and not in a good way. Even had I the time back then, I would not have been inclined to seek out a campy horror show that appeared to be aimed at teenagers. Only several years later did I idly give some reruns a view and find myself impressed. Yes, it is a campy horror show aimed primarily at teenagers, but it is intelligently scripted and funny. (The shoestring budget first season is admittedly shaky, but it was a 1997 midseason replacement, so there are only 12 episodes in Season One to get through.) It is a show adults can enjoy, and one soon notices that the monsters, demons, and vampires are metaphors for the challenges we all face growing up.

Lest one think that is reading too much into it, the reading is shared by cultural critics and by the creators of the show. There have been more academic studies (called Buffy Studies – really) on Buffy than on any other TV show. Said James Marsters (who played the vampire Spike on the show), “I’m not at all surprised that the show in any form continues to live on. I don’t want to oversell this but it’s the same theme as Catcher in the Rye. It’s the same theme as Hamlet. How do you get through adolescence? ... I’m really glad Joss was able to find a metaphor to talk about something that is a serious subject with so much humor.” Whedon’s taste for existentialist philosophy informs the show (the vampire Angel, played by David Boreanaz, can be seen reading Sartre’s La nausée in one scene), along with the theories of Freud and Jung. Not always, of course. In the last episode of Season Four several main characters have meaningful metaphorical dreams; in each of them a fellow with slices of cheese makes a brief appearance. He doesn’t mean anything. Joss is just playing with the audience. Sometimes cheese is just cheese.

One of the more informative and accessible books on the subject is Buffy the Vampire Slayer: Myth, Metaphor, and Morality by Mark Field. In 682 pages he gives episode-by-episode analysis of the philosophical, psychological, and cultural references. I picked up the book mostly to see whether there was that much to say about a TV show. There is, because it is not just about a TV show but about philosophy, psychology, and culture. I sometimes find myself disagreeing with him, but I always find his commentary thoughtful and perceptive. Whether you’re revisiting the show or seeing it for the first time, if you want, in effect, an annotated Buffy, this will do as well as any.

**** ****

How to Talk to Girls at Parties (2017)
When writing about young people, authors often are tempted to set their stories in the places and eras of their own youth. It’s just easier to get the nuances right in everything from speech patterns to pop culture references. (A notable exception was Tom Wolfe, whose ear for campus dialogue in the 2000s in I Am Charlotte Simmons was as on target as it was for 1960s hippiedom in The Electric Kool-Aid Acid Test.) It’s a bonus for the storytelling if that era happens to be a particularly vibrant one. This is the case for author Neil Gaiman (b. 1960), whose How to Talk to Girls at Parties, originally a scifi short story and later a graphic novella, is set in punk-era London. The 2017 film version stars Elle Fanning, Nicole Kidman, and Alex Sharp.

In 1977 Enn (Sharp) and his friends leave a Croydon punk rock club at closing time to find an after-hours house party. Despite some wrong turns they find a house party, but it’s not the one they were seeking. These partiers are different. The boys assume they are some kind of cult from California, but they actually come from much farther away than that. They are manifesting in human form to accumulate certain types of data prior to “the eating”; the aliens have solved the population problem by consuming their own (adult) offspring. Gaiman’s original tale ends when the boys leave the party (“flee” might be a better verb), but this scene happens only 20 minutes into the 102-minute movie. The film’s plot continues as Zan (Elle Fanning), one of the aliens, asks Enn’s help to “further access the punk.” He is happy to comply, with a little help from punk scene queen Boadicea (Nicole Kidman).

Gaiman’s short story is a tightly plotted and themed one: Enn fears the alien siren song even as he is drawn to it. (The metaphor of clueless young men fumblingly attempting to interact with women is an obvious one.) The movie by contrast goes off in multiple directions at once: part romcom, part 70s retrospective, part youth rebellion, part scifi parody, and more. The result is messy, but not really a bad messy. This is not a great film, but it is agreeably weird. That’s enough to earn it a mild Thumbs Up.

Trailer: How to Talk to Girls at Parties

Monday, August 13, 2018

Finding Philo


It is currently raining in this part of NJ, which means my satellite TV signal is glitchy. So, this is a good moment to write about TV instead – something I’ve been meaning to do for a couple of days.

Saturday night I watched the 1941 The Wolf Man on MeTV, a nostalgia cable/satellite channel. The Wolf Man wasn’t the first werewolf movie, but it set the standard for all the ones that followed. I’m not sure why I watched it since I own a DVD of it, but somehow it drew me in. It and other Universal monster movies appeared frequently on non-network TV channels when I was a child; there was still a shortage of original content in those years, so old movies got a lot of air time. I don’t remember the first time I saw it, but it likely was prior to 1960 and certainly wasn’t much later. I know I had seen it multiple times before assembling an Aurora plastic model of the Wolf Man sometime in the early 60s. All of my elementary school friends had seen it, too, and that is the point I pondered (during commercials) on Saturday.

TV’s hundredth birthday is less than a decade away. In the past few decades we have seen Wunderkind dropouts design cutting-edge personal computers, programs, videogames, social media platforms, smart phones, apps, and other tech, but youth is nothing new in technology. In 1920, 14-year-old Philo T. Farnsworth of Rigby, Utah, showed his high school science teacher his designs and accompanying formulae for a scanning cathode ray television system. It was way over his teacher’s head, but the fellow was supportive and referred him to Brigham Young University. Philo soon moved to California. Despite working with limited funds out of his own workshop, by 1927 he had a fully functional television system from camera to screen. (Herbert Hoover, Secretary of Commerce at the time, had appeared in a rival AT&T television demonstration on April 7, 1927.) As so often happens when available components and theory make a new technology ripe, several inventors were simultaneously developing similar systems: notably Vladimir Zworykin, who had the resources of RCA at his disposal. In the patent dispute that followed, however, Farnsworth prevailed because his high school science teacher still had his sketches, thereby proving his work was prior.

Television broadcasting began in the UK and Germany in the 1930s and in the US in 1939, in which year Franklin Roosevelt became the first sitting president to be televised. Nonetheless, the eye-watering cost of early TV sets, the Depression, and then World War II prevented the technology from taking off commercially. Prior to the end of the war, home TV sets in the US numbered in the mere hundreds. After the war, pent-up civilian consumer demand quickly changed all that. US TV sets numbered in the millions by 1949. My parents bought their first set in 1948.

This collection of Vidal's 1950s TV screenplays still holds up well
Television broadcasting in the 1950s was better than one might expect of a fledgling medium. (I was a little young to judge it at the time, but I did like to watch reruns of The Little Rascals before school and acquired my taste for scifi after school.) Top-notch writers were hired for weekly dramas, vaudeville performers still on their game got their own shows, innovative series such as The Twilight Zone got a footing, and (bringing us back to The Wolf Man) older movies were introduced to a new generation. Because the channels were few, the effect was to homogenize popular culture in a way that radio hadn’t. People from all regions and walks of life watched the same shows and televised movies. So much was this the case that everything from water usage to restaurant attendance varied predictably depending on what was on. You could talk about the latest Ed Sullivan Show at a party and be pretty sure most people there had seen it. As late as the 1970s, shows like All in the Family and The Jeffersons had diverse audiences of 30 or 40 million per episode.

No longer. The proliferation of channels and the blurring of television and internet have split us into niche audiences. We can pick and choose as we please among near endless options. Viewership divides along ideological, ethnic, and generational lines (among others) even where one wouldn’t expect. That is not entirely a bad thing. Greater choice is something I’m certainly happy to have. It is not the job of video entertainment (in whatever format on whatever screen) to homogenize popular culture. It is worth noting, though, that it once did and now doesn’t.

On the other hand, some of the niches span the divides in curious ways. Look at the membership, for example, of classic movie or science fiction or detective fiction discussion groups on Facebook. The members have little in common beyond enjoyment of the genre. (By the way, the term “idiot box” is unjustified. Academic and IQ scores actually are higher in countries that watch more TV. Within countries there is a bell curve: in the US, academic scores are positively correlated with TV viewership up to 3 hours per day but fall off above that.) So, while we no longer can assume the people we meet have seen the same television and movies as we have, it’s easy enough to find some folks who have: so much so that if I had kept that Wolf Man model in the box it would be worth something. That’s not a common culture, but it will do.


Blondie’s ode to TV: Fade Away and Radiate

Monday, August 6, 2018

One If by Band, Two If by Seat


Two reviews from a chair in the den and one from a standing-room-only club:

**** ****

Tully (2018)

To call Tully a chick flick is to understate the case, which I mention both as a warning and an inducement depending on the taste of the viewer. I don’t mean the term in the old-fashioned sense of a particular brand of romcom or in the more recent sense of some athletic heroine singlehandedly thrashing a platoon of burly men twice her size, but rather something much less formulaic. A few words about the screenwriter Diablo Cody are a good place to start to tweak the definition.

Diablo Cody had an unusual path to becoming one of Hollywood’s star screenwriters. Back in 2005 she published a book called Candy Girl about her life as a Minneapolis stripper. It caught the eye of film producer Mason Novick who thought it had potential for a movie. He contacted Cody and asked her to send him a sample script on any subject in order to see if she could write as well for the screen as she did in her memoir. Over the next few weeks, mostly at a local Starbucks, she wrote a script titled Juno. The Candy Girl movie was never made, but Juno was. Her scripts since then include Jennifer’s Body (2009), the TV series United States of Tara (2009-2011), Young Adult (2011), Paradise (2013), and Ricki and the Flash (2015). All of them are female-centric and are off-beat, down-to-earth, and literary in an idiosyncratic mix. My favorite, BTW, is Young Adult with Charlize Theron as an author of young adult novels. It’s hard not to notice that Cody’s protagonists are aging along with her, generally tracking behind by a few years; the one out of sequence film (she got ahead of herself on this one) is Ricki and the Flash about an aging rocker. All of her characters are humanly flawed (or inhumanly, in the case of Jennifer’s Body) and their heroism, when it emerges, is of a (typically underappreciated) everyday kind. Cody wrote Tully after the birth of her third child.

In Tully Charlize Theron is Marlo, a forty-something suburban mother who early in the movie has her third child. Her kindergarten son is challenging and possibly challenged (it’s still an open question), which further frazzles her nerves. Her daughter requires her attention, too, while her husband’s long hours leave her with the domestic load. Sleep-deprived by her new baby, she finds her capacity to cope has been sorely exceeded. Seeing her exhaustion, Marlo’s rich brother suggests that she get a “night nanny” who, as the name indicates, would arrive only at night and do everything with the baby but nurse, for which single task she would wake up Marlo before taking over again until daybreak. A night nanny arrives at the door. She is a 26-year-old named Tully who in energy, outlook, and body tone is everything Marlo once was but no longer is. I can hear the reader saying “I know where this is going.” While writing, Cody obviously could hear potential viewers say the same thing, because Marlo (joking, but not) openly expresses worry that the arrangement is “like a Lifetime movie where the nanny tries to kill the family and the mom survives and she has to walk with a cane at the end.” It isn’t. It is something much stranger, and yet understandable.

Recommended, but definitely not for those who require explosions, car chases, superheroes, and villains scheming to take over Gotham City. (I like those, too, by the way, but not exclusively.) It also may deter some viewers from having kids, though I think that response was unintended.

**** ****

The Humans by Matt Haig
The British author Matt Haig approaches most of his novels – even his children’s literature – at a slightly bent angle: the family drama The Radleys is about a family of vampires, and The Last Family in England is basically King Henry IV, Part I but with dogs. In The Humans an extraterrestrial comes to earth when a Cambridge mathematician’s proof of the Riemann hypothesis regarding prime numbers threatens to end death and disease on earth while opening up access to the universe to humans without the need of pesky encumbrances such as spaceships. As is typical in stories of this kind, the existing advanced species Out There don’t much like the idea of unruly savage humans joining them in the wider cosmos. They know full well that humans have a history of inventing things they don’t use properly, e.g. “the atomic bomb, the Internet, the semicolon.” (By the way, years ago I blogged about the last of those in Save the Semicolon.)

The alien from Vonnadoria replaces the mathematician Andrew Martin and is tasked with killing anyone the real Professor Martin told about his breakthrough, including his wife and son. Even if they don’t know the details of his proof, the mere knowledge that there is one is dangerous, as it would prompt others to seek it. Despite his adaptive gifts, the alien makes social mistakes, the first being to arrive without clothes. He lives as Professor Martin and delays eliminating his family while he tries to find out how many people have learned about the professor’s success; fortunately, due to the cutthroat world of academia and the real Professor Martin’s own secretive personality, the number seems to be few. Much to the displeasure of his taskmasters, the Vonnadorian is slowly corrupted by his human form. He is tempted to go off task as he gains empathy with humans generally and affection for his wife, son, and dog in particular.

The basic premise of an alien awkwardly trying to hide in plain sight has been done many times before in books and on screen. The TV show 3rd Rock from the Sun got six seasons out of it. However, this is true of almost any scifi premise. It’s a little late in scifi’s day for much real originality in premises. What matters is how well the material is handled, and Haig does a good job of it. There are few better ways to display the absurdities of human life than from the perspective of a putative alien. The redeeming human characteristics discovered by the Vannadorian also make an unoriginal list, but are no less persuasive for that.

Thumbs Up for amusement value.  

**** ****

Gone Fishing at the Stanhope House
Smallish music clubs are apt to come and go, but every area has that venue – maybe more than one – that lingers for generations. The Stanhope House is one such place near me. The building started as a stagecoach stop in 1794 and had various lives after that as rooming house, tavern, and restaurant. Some 50 years ago it became a music club specializing in blues. Despite its cozy interior (a choppy floorplan makes it effectively smaller than the exterior suggests) it booked heavy talent in the 70s, including Muddy Waters, Willie Dixon, John Lee Hooker, Buddy Guy, and Stevie Ray Vaughan. The place was a frequent haunt of mine back in my 80s and 90s club-hopping days. Nowadays it mostly books local or lesser-known talent (as one expects of an out-of-the-way NJ club), but there still are surprising exceptions, including Samantha Fish a few days ago; unsurprisingly, her show sold out soon after going on the schedule.

Regular readers of this blog (there are a few) may recall my earlier mentions of the Kansas City blues guitarist. Her band keeps growing and now includes keyboard, horns, and violin. Samantha is not stylistically stuck in a single groove but churns out blues, country, rock and roll, and pop – not a lot of pop, but some, including the title song of last year’s Chills and Fever album. The album is worth a listen, by the way, as is Belle of the West, her country-tilted second album of 2017. Her live performances are by far the best, however. She tours constantly and still can be found in relatively modest venues even though she has outgrown them (http://www.samanthafish.com/tour/). I recommend finding one near you before that changes.


Samantha Fish @ Stanhope House, August 2018 – Heartbreaker