Wednesday, September 25, 2019

Winning the Zombie Vote


Nowadays we are battered constantly by political content in print, on the air, and online – never mind the stream on social media of ridiculously misleading memes created by professional propagandists. Even disregarding the silly stuff, there is so much content that I rarely feel the need to seek out more in bookstores, whether of the brick-and-mortar or the online kind: not of the of-the-moment “this politician is _______ [wonderful/awful/visionary/loony/transformative/criminal/or what-have-you]” variety anyway. (In elections I pick my own poisons for my own reasons.) I prefer the books I buy to be more elevating – or, just as satisfactory, more decadent. I sometimes do buy books with more generalized political content, however, and the two that occupied my night table last week were Can Democracy Work?: A Short History of a Radical Idea from Ancient Athens to Our World by James Miller and the idiosyncratic The Sky Is Falling: How Vampires, Zombies, Androids, and Superheroes Made America Great for Extremism by Peter Biskind. Both of them share a sense of unease that modern democracies in general and our republic in particular are becoming ungovernable.


Professor of politics and liberal studies James Miller is a former SDS member and recent Occupy Wall Street marcher, though age and experience have tempered some of his idealism. In his book, which will appeal especially to armchair historians, he takes the long historical view of democracy. He starts with a detailed account of ancient Athens, fast-forwards to the Enlightenment ideas and philosophers culminating in the French Revolution, continues through 19th-century trends, evolutions, and uprisings (including the short-lived Paris Commune), gives an overview of Wilsonian progressivism and the Russian Revolution, and finally tries to relate all that to the populism of the present day. He doesn’t really have an answer for the question he asks in the title, though he does say democracy always relies on widespread public faith in its basic tenets. That faith is pretty hit-and-miss over time and around the world. Most governments give at least a passing nod to the word democracy, either informally or right there in their names (e.g. the Democratic People's Republic of Korea), but a nod might be all it gets.

The problem, Miller notes, always has been that democracy does not equal liberal democracy – the latter defined as including the protection of individual rights not revocable by any majority. Most often it doesn’t. Liberalism in the sense of limitations on government (democratic or otherwise) didn’t even arise as a philosophy until the 18th century and isn’t on firm footing now. Historically, majorities have been quite willing to vote away the liberties of disliked individuals and minorities; they sometimes vote away their own liberties in pursuit of other ends. This shakes the faith in democracy of those who find themselves on the losing side of such votes, sometimes to an extent that undermines the system itself: “Whether democracy in America, or anyplace else, can flourish, either as a historically conditioned set of political institutions or as a moral vision, must remain, by the very logic of democracy, an open question.” He ends nonetheless on a guardedly optimistic note.

Disquiet about the future of democracy long predates Miller, of course. In fact, for most of history there was outright hostility to democracy, especially (unsurprisingly) in intellectual circles. Witness Aristotle, who tells us that there are three “true” or good forms of government: monarchy, aristocracy, and republic. He follows with a description of three “perversions” of these: tyranny, oligarchy, and democracy. The true forms, he says, are perverted into their evil twins when those in power pursue primarily their own or narrow interests instead of the common interest – which is to say, an interest that includes their opponents’. The U.S. Founders were avid classicists who took Aristotle to heart, and they deliberately tried to craft a constitutional republic that limited the power of the majority. The very word “democracy” didn’t fully lose its negative Aristotelian connotation on these shores until the era of Andrew Jackson.

A number of internet memes attribute false quotes (what a shock) on democracy to various Founders, but these actually turn up in their writings:
Alexander Hamilton: “Real liberty is never found in despotism or in the extremes of democracy.”
James Madison: "Where a majority are united by a common sentiment, and have an opportunity, the rights of the minor party become insecure."
John Adams: “There never was a democracy yet that did not commit suicide.”
Benjamin Rush: "A simple democracy is the devil's own government."
Elbridge Gerry (the “Gerry” in “gerrymander” btw): "The evils we experience flow from the excess of democracy. The people do not want virtue, but are the dupes of pretended patriots."

Events in the first half of the 20th century should give pause if nothing else does. Fascist and communist regimes – with ultimately shocking results – proved immensely popular once they took power; there were several plebiscites with 90% majorities. Fabian socialist playwright George Bernard Shaw took note of this in 1936 and wrote, “Parliaments are supposed to have their fingers always on the people's pulse and to respond to its slightest throb. Mussolini proved that parliaments have not the slightest notion of how the people are feeling, and that he, being a good psychologist and a man of the people himself to boot, was a true organ of democracy. I, being a bit of a psychologist myself, also understood the situation, and was immediately denounced by the refugees and their champions as an anti-democrat, a hero worshipper of tyrants, and all the rest of it.” Regrettably, he had a point. So did those refugees and their champions.

Peter Biskind, best known as an arts and movie reviewer, approaches the subject from a completely different direction. As a cultural commentator, he argues that movies and TV shows with extremist subtexts (or sometimes just plain texts) of both the left (Avatar) and the right (24) push the public away from consensus and toward the extremes. He is writing about the USA, but notes that Hollywood’s global reach influences publics elsewhere, too. In The Sky is Falling he writes, “After the election of Dwight D. Eisenhower in 1952, the government fell into the hands of a bipartisan coalition of center-right liberals and center-left conservatives, that is, Cold War Democrats and East Coast Republicans whose visions of postwar America were similar enough that they could see eye-to-eye on basic principles.” Wall Street-accommodating Democrats and social-program-tolerant Republicans didn’t especially like all aspects of each other’s policies, but they could accept enough of them to get along. Extremists, by contrast, consider even centrists on the other side to be beyond the pale; so when extremists dominate, the results are rancorous, divisive, and obstructionist in a manner with which we are all too familiar in the 21st century.

Biskind is at his best when discussing films and TV shows of the last several decades and whether they uphold (Game of Thrones) or subvert (The Dark Knight) mainstream consensus values – especially but not exclusively in the apocalyptic scenarios in which so much modern entertainment is set. Whether of the right or the left, extremist heroes ignore normal civilized conventions. As for the superheroes, 1950s TV Superman was very much of the mainstream. The X-Men are on the left. Biskind calls Deadpool the first alt-right superhero. It should be noted that the underlying political lean of a film is often very different from the avowed politics of its director; the internal logic of certain movies simply takes them in a particular direction regardless.

His remarks on film are worth reading simply for themselves, but is Biskind’s broader point right? Do our favorite TV shows and movies really “inflame our emotions” so that (as someone once wrote in an even more divided time) “Things fall apart; the centre cannot hold”? Maybe. But I suspect they reflect rather than lead the Zeitgeist. If a new consensus ever forms, so will films celebrating it. For the immediate future, our movies (and protagonists) are likely to be as immoderate as our elections.



Jack Teagarden - I Swung the Election (1939)

Sunday, September 22, 2019

Last Breeze of Summer


Dusk has arrived, thereby ending the last day of summer – not forever, one hopes, but for 9 months. There are parts of the world where this is a welcome relief. I don’t live in one of those parts. On my 5 acres there are 3 months of t-shirts and swimsuits, 6 months of long sleeves and sweaters, and 3 months of winter coats and gloves. Already this month the overnight air is chilly most nights while the leaves in the trees slowly turn red. Warm days are few enough to miss them when they are gone. Accordingly, I went swimming today even though the water temperature (thanks to those overnights) is in hypothermia range; the pool will close this week. Much of the rest of the day, I spent by the picnic table in the still warm daytime breeze: “those lazy-hazy-crazy days of summer” and all that. Now gone.
Leaves still mostly green but trending red

The ancients liked the autumn. The crops came in, which was always a good thing. A plurality of modern-day Americans list autumn as their favorite season, though at 29% it isn’t by a big margin. I don’t mind it. The season isn’t actually harsh around here, and a couple of the holidays are fun. Still, it’s hard not to be conscious of the slide toward winter. I’m more wistful about this time of year now than when I was young, even though one might assume summer vacation gave summer a brighter glow then. I was one of those strange kids who actually liked school (after age 11 anyway), so I welcomed September (though school started a few weeks before autumn proper). As for winter, I didn’t have to pay my own heating bills back then or pay for my own broken pipes, so it didn’t hold much menace.

So, for the next 6 months the sun will spend more time below the horizon than above it. I won’t toast its departure. I don’t feel much like celebrating it. I have just the right spirits on hand for when it returns to dominance on March 19, 2020 however – and again on June 20.


The Cure – The Last Day of Summer

Tuesday, September 17, 2019

All That and a Bag of …


This week’s Economist has a special section called (with the magazine’s usual penchant for puns) “Chips with everything.” It is about the inclusion of computer chips not just in items that we normally think of as high tech, but in things as mundane as lamps, doorbells, and fish tanks. (A Las Vegas casino’s security systems recently were hacked via its smart fish tank.) They talk to each other, too, as elements of pervasive Artificial Intelligences. Smart TVs (and it is getting hard to buy any other kind) watch the watchers and collect data on viewing habits, which they pass on to data mining AIs. Some TVs monitor conversations. Connecting them with the other household chips offers some interesting possibilities. With existing tech (though to my knowledge it is not yet done) a TV could, for example, enhance the experience of a horror movie by flickering your lights and turning fans on and off to make creepy sounds – and to judge for itself how and when to do it. Our cars already are apt to take over the driving (e.g. by braking or correcting for lane drift) from us if they think we are doing a bad job. Everything is getting smarter except possibly the people who increasingly rely on tech to do their thinking for them. One wonders if the driving skills of humans will atrophy as AIs take over. So, too, other skills.
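
To make the idea of household chips coordinating with one another a bit more concrete, here is a minimal sketch in Python of the publish/subscribe pattern such gadgets commonly use behind the scenes. Everything in it – the device names, the “scene” event, the flicker routine – is a hypothetical illustration, not any actual TV or smart-bulb API:

# A toy publish/subscribe bus standing in for a home-automation hub.
# All device names and events below are hypothetical illustrations.
import time
from collections import defaultdict
from typing import Callable, Dict, List

class HomeBus:
    """Routes named events from one gadget to any others that subscribe."""
    def __init__(self) -> None:
        self.subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self.subscribers[topic].append(handler)

    def publish(self, topic: str, payload: dict) -> None:
        for handler in self.subscribers[topic]:
            handler(payload)

def flicker_lights(payload: dict) -> None:
    # A real smart bulb would receive on/off or dimming commands; here we just log them.
    for _ in range(payload.get("pulses", 3)):
        print("lamp: off"); time.sleep(0.1)
        print("lamp: on");  time.sleep(0.1)

def spook_fan(payload: dict) -> None:
    print(f"fan: brief gust at {payload.get('intensity', 'low')} speed")

bus = HomeBus()
bus.subscribe("tv/scene", flicker_lights)
bus.subscribe("tv/scene", spook_fan)

# The "smart TV" decides a jump scare is imminent and tells the house about it.
bus.publish("tv/scene", {"mood": "jump_scare", "pulses": 2, "intensity": "medium"})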


Moore’s Law, which states that the number of transistors per chip doubles (and the cost per transistor roughly halves) every two years, continues to operate as it has for half a century. Repeated predictions that it was nearing its practical limit have proven premature. So our toys are ever smarter while their integration into the internet makes the emergent AIs ever eerier. For a time it seemed as though humans were getting brighter, too, albeit at a much slower pace. The Flynn Effect is the apparent increase in average intelligence that manifested in the 20th century. Average intelligence is by definition 100, so whatever the average number of correct answers on an IQ test might be in any given year is assigned the number 100. (The most common scale has a standard deviation of 15, meaning that roughly 68% of the population falls between 85 and 115; about 13.5% score between 115 and 130 and another 13.5% between 70 and 85; only about 2% score above 130 and about 2% below 70.) Yet the raw number of correct answers on the tests increased year by year in the 20th century. Judging by results on the original century-old tests (without renormalizing each year), it appears that average intelligence in advanced countries rose 3 points per decade.

Something seems wrong about this. Three points per decade over the five decades from 1952 (the year I was born) to 2002 comes to 15 points, which means the average 1952 test-taker was, by 2002 standards (and also by 2019 standards, but more on that in a moment), a full standard deviation below average – down around 85; to put it the other way around, the average present-day test-taker would score around 115 by 1952 standards, well above what then counted as average. This bothered James Flynn, who discovered the effect in the 1980s. “Why,” he asks, “did teachers of 30 years experience not express amazement at finding their classes filling up with gifted students?” They didn’t and they don’t. Quite the contrary. Despite much heavier homework loads, kids in high school today perform worse than students in the 1950s and 1960s. Today’s 12th graders have a smaller active vocabulary than 12th graders of 50 years ago. They have no better understanding of algebra or geometry, and without calculators their elementary math skills are worse. The SATs had to be made easier to keep the nominal scores from tanking. If you took the SATs in the 1970s, you should add 70 points to your verbal score and 30 points to your math score in order to adjust them to 21st century standards.
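
For the numerically inclined, here is a minimal sketch in Python of the arithmetic behind both the percentages above and this re-norming. It assumes only the conventional mean-100, SD-15 scale and uses the widely available scipy library:

# A minimal sketch of the IQ-scale arithmetic described above.
# Assumes the conventional scoring scale: mean 100, standard deviation 15.
from scipy.stats import norm

MEAN, SD = 100, 15
dist = norm(loc=MEAN, scale=SD)

# Share of the population within one standard deviation (85-115): about 68%
print(f"85-115:  {dist.cdf(115) - dist.cdf(85):.1%}")
# Between one and two standard deviations above the mean (115-130): about 13.5%
print(f"115-130: {dist.cdf(130) - dist.cdf(115):.1%}")
# Beyond two standard deviations (above 130): about 2.3%
print(f">130:    {1 - dist.cdf(130):.1%}")

# The Flynn Effect: roughly 3 raw-score points per decade.
# Over the 50 years from 1952 to 2002 that is about 15 points, so a
# 1952-average performance lands around 85 on 2002 norms -- a full
# standard deviation below the re-normed mean.
gain = 3 * (2002 - 1952) / 10
print(f"Implied shift: {gain:.0f} points; 1952 average on 2002 norms: {100 - gain:.0f}")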

How then to explain the 20th century rise in raw scores on IQ tests? A large part of it is that education and scholastic tests have steadily shifted to focus more on abstract reasoning (which is what IQ tests measure) and less on nuts-and-bolts facts and skills. In short, advertently or otherwise, educators increasingly have taught to the exam. Flynn uses the example of crossword puzzles. Almost all neophytes are bad at them, but eventually one learns to anticipate puns and unusual uses of words. With practice one becomes good at them. Something analogous seems to have gone on with IQ tests: practice. (It should be noted that none of the improvement was at the high end, which is to say that the raw scores of the top few percent who are naturally talented at abstract reasoning didn’t get any better over the decades.) Some of the Flynn Effect may well be developmental, however: a result of better childhood nutrition and health, in which case the effect should stall as good health and nutrition become more widespread. And, indeed, the effect stalled in Scandinavia in the 1990s and in most of the rest of the West in the early 2000s. In fact, scores went into reverse. See the 2006 paper Secular declines in cognitive test scores: A reversal of the Flynn Effect by Thomas Teasdale and David Owen documenting a decline in IQ scores in Denmark and Norway between 1998 and 2004: “Across all tests, the decrease in the 5/6 year period corresponds to approximately 1.5 IQ points, very close to the net gain between 1988 and 1998.” They surmised that the Flynn Effect is at an end in advanced countries, though still at work in less developed ones. Similar results soon turned up elsewhere. A few years later The Telegraph noted, “Tests carried out [in the UK] in 1980 and again in 2008 show that the IQ score of an average 14-year-old dropped by more than two points over the period. Among those in the upper half of the intelligence scale, a group that is typically dominated by children from middle class families, performance was even worse, with an average IQ score six points below what it was 28 years ago.”

Just as the 20th century Flynn Effect was (in my opinion) more apparent than real, so too with the 21st century declines in IQ. In other words, I don’t think kids are biologically dumber than kids 20 years ago any more than I think they are biologically smarter than kids 100 years ago. Something in the cultural environment changed that affected their average scores. I don’t think it is an accident that the declines in First World countries coincided with widespread access to the internet, which brings us back to artificial intelligences and chips for all. Outside of school, young people give every appearance of being far smarter in everyday life than I and my peers were at their age, but it is an appearance facilitated by smart phones with endless information at fingertips. When we didn’t know something back in the 1960s, we just stumbled around not knowing it until we asked someone (who might give us the wrong answer) or looked it up in a book, map, or other non-digital resource. Compared to the instantly informed youths of today, we were idiots. The credit for the difference, however, belongs to Steve Jobs.

Trivial Pursuit was a game for 1984 (when sales peaked), not 2019 when anyone with a phone can get a perfect score – or defeat any Jeopardy champion. Yet there is a difference between information and knowledge. It may seem like a waste of brainpower to remember historical dates, or a Robert Frost poem, or the atomic number of sodium when those are just a finger tap away, but it isn’t. Creativity, insight, and comprehension are all about making connections between different bits of knowledge in our own heads, not floating out there in a cloud. Knowledge also helps us think for ourselves and be less susceptible to propaganda perfectly tailored (by those AI data miners who know us so well) to appeal to our existing biases. Yes, our machines can be programmed to make those connections for us in a way that simulates (or perhaps is) creativity, but at that point the machines become earth’s sentient beings and it is time to hand over to them the keys to the planet.

I’m not a Luddite. I’m happy to let my phone remember other phone numbers and to let my GPS (on the first trip, not the second – or at least not the third) tell me when to turn right. I’ve no wish to go off the grid. I like having a world of information in the palm of my hand. Our easy reliance on the internet does give me some pause though, as does the proliferation of chips, AI, and surveillance technologies. Increasingly – and perhaps irreversibly – we have too much information and not enough knowledge.


Quiet Riot – Too Much Information

Tuesday, September 10, 2019

Place on the Food Chain


The quintessential American cuisine has far less to do with the entrees themselves than with the process by which they are prepared and served. It is fast food of a wide variety of types from the likes of Dairy Queen, McDonald’s, Pizza Hut, Taco Bell, KFC, et al. We also must include the presentations of the fast food chains’ slightly more dressed up cousins: Red Robin, Applebee’s, Red Lobster, IHOP, and so on. The chains offer consistency and reasonable pricing from coast to coast, which was their original attraction at a time when food safety regimens at roadside restaurants were…well…inconsistent. Consistency still matters to travelers on road trips. Most customers nevertheless are locals. You can’t tell where in the country you are by the architecture or menu of a McDonald’s (with a few exceptions such as a flying saucer McD’s in Roswell), but you can tell by the customers. At a time when the nation is increasingly divided along class lines (and other lines too), the customers remain a full cross-section of the local population: rich, poor, and all the shades in between patronize fast food outlets. An interesting account of the restaurants’ history and current place in the culture is Drive-Thru Dreams: A Journey through the Heart of America's Fast-Food Kingdom by Adam Chandler.


Chandler tells us that few of the common offerings by the chains were invented by them; at most they tweaked the recipes a little. French fries, for example, go back centuries. Thomas Jefferson, for one, requested from the White House chef, “Potatoes, deep-fried while raw, in small cuttings, served in the French manner.” A few years later Dolley Madison famously served ice cream at the White House, which is why there is a brand named after her. (She liked oyster flavor, of all things, which fortunately did not catch on generally.) It is unknown who invented the hamburger sandwich. Hamburger without the bun was regarded as a German specialty in the late 19th century and could be found described as such on many American restaurant menus. Someone (or many someones) must have tried it on a bun, but who is not recorded. By the 20th century it was no novelty.

The modern sandwich version of the hamburger was popularized (once again, not invented) in Wichita, Kansas, in 1916 by Walt Anderson, who sold burgers for 5 cents each at his hamburger stand – the forerunner of White Castle. At a time when ground meat was a chancy selection in many establishments, his open kitchen, where the beef could be seen being ground fresh, made his burgers a hit. So did the onions and spices he added. Promoter Billy Ingram happened to eat there and knew a hit when he tasted one. He teamed up with Anderson in 1921 to found White Castle and expand the chain well beyond Wichita with a standardized model soon copied by competitors. From the start the locations and site plans wherever possible were designed to be automobile-friendly, which Ingram rightly deemed a key to success. Despite being the earliest hamburger chain, White Castle was far surpassed in popularity by several others after WW2, but it still has dedicated fans. I have friends who on some sort of principle won't set foot in a McDonald's or Burger King but who will nonetheless drive the extra miles to a White Castle.

The founder-plus-promoter combo was a pattern for success repeated many times in the 20th century. Someone would open a restaurant or roadside stand that proved locally popular; a promoter with a vision partnered up with the owner and created a franchise. The most famous case was Ray Kroc who in 1954 espied the innovative and very successful drive-in McDonald’s, operated by the brothers Mac and Dick McDonald in San Bernardino. (The Big Mac was named for Mac McDonald, which was probably better than naming it for his brother.) Within a decade Kroc turned the franchise into not just a national but a global brand.

Perhaps counterintuitively (or then again perhaps not), what the founders and the promoters of the successful chains seem to have had in common was growing up poor. What for most people would have been a serious obstacle became instead a singular motivation. Glen Bell (Taco Bell) worked a series of menial jobs after WW2 before opening a taco stand. Al Copeland (Popeyes) said, “I never forget being poor. I know what it is and I don’t want it.” Colonel Sanders was broke time and again from failed businesses before his fried chicken recipe took flight. (The “Colonel” is legit, by the way: not as a military rank but as an honorific awarded by the Governor of Kentucky.) S. Truett Cathy (Chick-fil-A) was one of seven siblings in a public housing project. William Rosenberg dropped out of school at 14 and delivered telegrams before opening Dunkin’ Donuts. So, too, many of the most successful franchisees: Aslam Khan, for example, was born into poverty in Pakistan and emigrated to the US in the 1980s. He started as a dishwasher at a Church’s Chicken, rose to manager in three months because of his reliability and hard work, and eventually became a business owner with 97 Church’s outlets. His career from dishwasher to multimillionaire took 13 years.

The chains take a lot of heat from advocates of healthy eating. McDonald’s notoriously was waylaid in Morgan Spurlock’s movie Super Size Me, which documented Spurlock’s month of eating only at McDonald’s thrice daily. Spurlock ate 5000 calories per day and the results were not good. McD’s apologists respond that in fairness it’s hard to imagine 5000 calories per day of anything for a month being good for you. One can see both points of view, but few people really go into a fast food restaurant expecting health food. Nutritional factors aside, it’s easy to be snobbish against the chains, especially in parts of the country where non-chain midrange providers still prevail. As it happens one of those areas is my home state of NJ; independently owned diners in particular are abundant here: 525 at last count in a quite smallish state. The chains exist to be sure, but they are not as prominent as in much of the country and are mostly to be found on major highways. As a single man who dislikes cooking for himself, I eat out for either breakfast or lunch at modestly priced establishments 5 or 6 days per week, but I haven’t bought from a chain in all of 2019 to date. That’s not meant as some social statement, however: it’s just a reflection of the local options. I have no objection to Wendy’s and its ilk per se. When I’m on a road trip they are my most likely lunch stops, and for the very same reason they were for folks in 1921.

One segment of the customer base, it must be acknowledged, has had some trouble ordering from the fast food menus: traditionally the chains have not been vegan-friendly. The ground is shifting a little in this regard as we are admonished from some quarters to get less of our protein from traditional farm animals and more from plants and insects. A few of the chains offer veggie burgers nowadays though I’m not aware of any offering bugs. I’m not eager to get on board with either personally as a consumer, but it does raise the possibility of fortunes to be made from brand new franchises that offer nothing else. Kentucky Fried Crickets perhaps.


The Smithereens – White Castle Blues


Tuesday, September 3, 2019

Barber-ism


I have a far less pileous pate than I once did, but what hair there is still needs to be cut once a month or so. Well, strictly speaking it doesn’t need to be cut, but I don’t like the bedraggled look that otherwise ensues; I’m not enough of an old hippie to pull off the look successfully. I just look unkempt – and for some reason older and grayer. I still go to a traditional barbershop (not a stylist) in nearby Morristown that charges $15. I’ve been going there for decades: long enough to see three generations of clippers. It is a pleasant enough experience.

Not so when I was little. As a kid I hated getting haircuts. I don’t know why. I didn’t mind the result. Short hair was fine by me. (At age 6 I cut my own hair with scissors but made such a wreck of it that I needed a buzz cut to hide the damage.) There was just something about the barbershop process I didn’t like. Freud had an opinion that a dislike of haircuts stems from them being symbolic of cutting something else. Knowing Freud, you can guess what, but I’m quite certain that notion never crossed my mind as a child. By the time I was old enough to understand what Freud was talking about, the haircuts no longer bothered me. I didn’t really change my style of haircut dramatically over the years even though I was in high school in the late 60s when long hair on men became a social statement. I didn’t fully embrace either the style or the statement, so my hair was never longer than in the 1974 pic below (with my sister in San Francisco) and (except for that buzz cut incident) never shorter than it is now. In 1974, by the way, that length was pretty conservative, as was the lapel width, which looks excessive today. Most of my friends in the 60s and 70s (except in prep school, which had codes) were considerably shaggier.
 
Photos: 1962, 1974, and 2019


Upper middle class hippies largely gave up long hair when construction workers and other working class young men started wearing ponytails in the mid-70s. Author Tom Wolfe had predicted this and wasn’t above patting himself on the back for getting it right. Some genuine hippies stuck with it, of course. In his book The Baby Boom: How It Got That Way (And It Wasn't My Fault) (And I'll Never Do It Again) P.J. O’Rourke apologizes to subsequent generations for having “used up all the weird” in hair and clothes, thereby forcing them to rebel in more painful ways such as piercings and extensive tattoos. The current crop of Millennials and iGens (aka Gen Z) demonstrates he was wrong about that. Some of them sport hairstyles that would have startled folks at Woodstock in 1969. The hipster thing of ironically retro facial hair is also very non-60s/70s.

I obviously don’t have firsthand experience of the social implications of hair from the female perspective, but there are many books on the subject including Rapunzel's Daughters: What Women's Hair Tells Us about Women's Lives by Rose Weitz, a sociologist at Arizona State. She notes, for example, that (in general…always in general) straight hair comes off as more conservative and curly hair as more informal. Long hair is regarded as more sexually attractive but short hair as more professional. She recalls the remark of a female exec regarding the corporate management hierarchy that “you could draw a line: Above that line, no woman’s hair touched her shoulders.”


An interesting little book that covers the subject of hair compendiously, if not in great depth, is Hair by Scott Lowe. Hair, as Lowe writes on page 1, “has an incredible power to annoy your antagonists, attract potential lovers, infuriate your neighbors, upset your parents, raise eyebrows at work, find compatible friends, and allow you to create, or recreate, your identity.” Sometimes the style is a deliberate statement, as was the Afro in the 60s/70s, and sometimes it is an old tradition (such as beards in certain Islamist sects and among the Amish) that it would be a deliberate statement to buck. Metal rock bands of a certain flavor are called Hair Bands for obvious reasons. Shorn heads have long been a symbol of shame: famously, collaborators with Germans were forcibly shorn in liberated areas in WW2. Sometimes, however, shearing one’s head is a religious rite as among the Jains. There is, in short, a lot of symbolism emanating from those head and face follicles. Interpreting it across cultures can be a challenge.

For myself, however, I just sit in the barber chair and say, “Just shorten it up for me please.” “You got it,” is the usual response. There is no doubt a lot of symbolism in that simple exchange, but I’ll leave figuring out what it is to others – preferably not Freud.


George Thorogood and the Destroyers – Get A Haircut