Thursday, January 31, 2013

Devolution


Two weeks ago in the post The Big Brain Theory, I remarked on the inverse relationship between intelligence and fertility. Given the high heritability of intelligence (between 0.7 and 0.8 in most studies on a 0 to 1 scale), this might raise fears of breeding future generations of dullards. We can’t be sure how long this inverse relationship has held, but given that average human brain size (determined from skeletal evidence) peaked 20,000 years ago and has since shrunk by 150 cc, there is at least circumstantial evidence that it is a long-standing one. Yet, all through the 20th century (when IQ tests first became available), the numbers indicated something quite different: raw scores on IQ tests rose steadily decade by decade. This is the so-called Flynn Effect (see All in All I’d Rather Be Errol).

There are clues in the data to possible reasons. Scores improved most at the low end, less in the middle to upper-middle ranges, and not at all at the high end. This suggests that, during the course of the 20th century, more and more test-takers reached their natural potential, perhaps due to better living standards, better nutrition, and education targeted more at cognitive reasoning. High scorers of decades past already had reached their potential, so they didn’t improve. Since nutrition, living standards, and education are no longer improving in the First World, one might expect the Flynn effect to stall at some point.


The point arrived with the 21st century. The Telegraph noted that in the UK, “Tests carried out in 1980 and again in 2008 show that the IQ score of an average 14-year-old dropped by more than two points over the period. Among those in the upper half of the intelligence scale, a group that is typically dominated by children from middle class families, performance was even worse, with an average IQ score six points below what it was 28 years ago.” They are not alone. Similar declines are turning up in a number of advanced nations. Thomas Teasdale and David Owen, in their 2006 paper Secular declines in cognitive test scores: A reversal of the Flynn Effect, analyze the decline in scores in Denmark and Norway between 1998 and 2004: “Across all tests, the decrease in the 5/6 year period corresponds to approximately 1.5 IQ points, very close to the net gain between 1988 and 1998.” They surmise that the Flynn effect is at an end in advanced countries, though still at work in less developed ones. The reversal hasn’t shown up yet in the US, but, since the US has lower absolute scores than Scandinavia anyway (and some curiously Third World aspects to boot), Americans probably are just a few steps behind.

Darwin may be catching up with us after all. Advances in information technology, machine intelligence, and other mental aids are coming along at just the right time. It looks like we’re going to need them.


Why It May Be Advantageous Not To Be Smarter Than One's Fellows


Saturday, January 26, 2013

The Council Counsel


I live in a single-story house, but a portion of my basement is finished; I use this basement room as my library. This keeps the upstairs uncluttered, but (lazy as this sounds) the stairs mean that books tend to pile up on my coffee tables, because it is easy to say, “I’ll put these back downstairs later.” “Later” does eventually come, though, as it did last night when I carried a dozen books to the basement to file away on shelves. I also keep some memorabilia down there, such as the six yearbooks (grades 7-12) from my prep school years. After putting the books away, I happened to pick up the 8th grade yearbook and flick through it, looking at half-remembered faces and scrawled signatures. One student (I’m no longer sure who), in lieu of a signature, had written inside the front cover, “Vote Right Vote Knight.”

I do remember what this meant. John Knight, a junior, had run for President of the Student Council for the following year. I (and apparently the comment-scribbler as well) had supported him. He lost. My preferred candidate the previous year had lost, too, which accustomed me early to voting for losing candidates. Student Council elections in those days were raucous affairs held in the old gym (nowadays the school’s theater). There was much shouting, cheering, booing, speechifying, balled-up-paper tossing, and (eventually) voting. The whole exercise meant nothing, since the Student Council had no authority whatsoever. Arguably, authority didn’t even lie with the school administration but with the Board of Trustees, the owners of the school, whom we never met – Henry Luce (Time/Life) was the only one we knew by name. There is a worldview (especially common, curiously, among elements of both the radical Left and radical Right) to the effect that this is a pretty fair microcosm of democratic politics generally.

Whatever degree of truth there may be to that view, there is a widespread sense in the US that something is fundamentally amiss with our democracy. Even though we hear much about the “polarization” of the electorate, and even though election season never really ends on these shores, the most significant polarization is between voters and non-voters. For all the politicians’ bombast and all of the efforts to “turn out the base,” the US ranks 120th in voter turnout among the 169 countries for which numbers are available. That’s right in between the Dominican Republic and Benin.

I don’t think the reason is apathy. I think it is closer to despair – a conviction that all of the nasty things Republicans and Democrats in Congress say about each other are true, and that neither party deserves a vote. The opinion is nothing new either. Perhaps this sounds familiar: “Our president blames Congress, Congress blames the president, the Democrats and Republicans blame each other. Nobody steps up to the plate and accepts responsibility for anything.” That was Ross Perot in 1992.

Perot, who ran as an independent in 1992 (the Reform Party came later, for his 1996 run), came closer to winning the Presidency than any third party candidate other than Teddy Roosevelt. His campaign team included Hamilton Jordan, Jimmy Carter’s former chief of staff, and Ed Rollins, Ronald Reagan’s campaign manager. At the beginning of July 1992, Perot was, in fact, leading the Presidential race with a 39.5% plurality in the polls despite a message that was anything but politic by conventional standards. He wanted substantial increases in taxes, including on gasoline, plus major cuts in defense and entitlements, including Social Security, which, he said, were on an unsustainable course. The message of fiscal responsibility struck a chord with the public. Then, suddenly, he withdrew from the race. (The reasons make a strange story of their own, but they are not to the point.) His supporters felt undercut and abandoned. Then he changed his mind again and re-entered the race, but it was too late. Though Perot generally is thought to have won the first presidential debate, he had lost all momentum. On election day Clinton won with 43% of the popular vote, followed by Bush with 37% and Perot with 19%. On his second attempt in 1996, this time on the Reform Party ticket, Perot’s share of the vote dropped to 8%.

I didn’t vote for Ross Perot in 1992. (Even more quixotically, I voted for a fourth party candidate.) However, I thought about it, if only to help break the two-party duopoly. 1992 might well have been the last realistic chance for that.

What does Ross himself have to say about the matter? (Yes, he is 82 and still with us.) Asked by USA Today about a third party candidacy in the current environment, he said, “It's almost impossible to do it. It would be a very healthy thing if you could get it done and make it happen, but it's very difficult to do, and very few people would want to try…They know they're going to be butchered from day one for what they've done, and much of the media will participate actively in that.”

Of course, if the conspiracy theorists are right after all and there really is a Board of Trustees, its members simply would smile regardless.

From 1996


Saturday, January 19, 2013

The Big Brain Theory


The most highly rated sitcom in the US at present is The Big Bang Theory, now in its 6th season. The main characters are scientists who are brilliant in their various fields, but who (despite their charmingly child-like affection for comic books, video games, and sci-fi) lack basic social skills and common sense. The comic foil is Penny, the actress/waitress who lives across the hall, and who is their opposite in almost every way.

Outside of sitcoms, is there any truth to the stereotype of eccentricity and everyday ineptitude among the brightest of the bright? Most of us know someone who fits the description, and we’ve all heard anecdotes. Mathematician/philosopher Bertrand Russell, for example, was unable to follow his wife’s directions for steeping tea while she was away. Albert Einstein filled his closet with identical shirts, pants, and jackets in order to avoid making a fashion mistake. Dean Kamen, who has hundreds of patents and inventions (including the Segway scooter), has declared the Connecticut island on which he lives to be the independent Kingdom of North Dumpling, and has issued currency in units of pi. Are they exceptions? Not really. A bit extreme, perhaps, but otherwise unsurprising.

An interesting and readable book on the subject is The Intelligence Paradox: Why the Intelligent Choice Isn’t Always the Smart One by Satoshi Kanazawa. Kanazawa is an evolutionary psychologist at the London School of Economics and is Associate Editor of the Journal of Social, Evolutionary, and Cultural Psychology. Much of what he has to say is controversial, but none of it is simple assertion; he bases his conclusions on studies (a few of them his own) and data, and cites his sources. At the core of his analysis is the Savanna Principle, which states that humans evolved instincts and mental predilections to suit our hunter-gatherer ancestors prior to 10,000 years ago (the beginnings of agriculture), and that these continue to motivate us today. In the aspects of life that are not novel in evolutionary terms – making friends, dealing with enemies, caring for offspring, finding mates, etc. – high intelligence is no advantage; “common sense,” which is another way of saying our evolved predispositions, serves us better. High intelligence is useful in dealing with novel situations for which behaviors could not have evolved, since evolution requires long-term repeated exposure over many generations to work. It would be useful, for example, in figuring out how to cross a swollen river with a raft, which would not be a daily occurrence. Today, nearly all of us live in a deeply unnatural environment – an evolutionarily novel environment – and intelligence is highly correlated with success in laboratories, academia, business offices, and so on, none of which existed on the savanna. On the other hand, the tendency of intelligent people to bring intellectual analysis to aspects of life that are not novel in evolutionary terms often leads them to behaviors and opinions that are at the same time clever and boneheaded. They fail to employ common sense.

One would predict from the Savanna Principle that very bright people would have no advantage in the age-old pursuit of mates and reproduction. They don’t. Quite the contrary. As an example, 5000 gifted Americans have been tracked since the 8th grade by The Study of Mathematically Precocious Youth – at age 13 they scored more than 700 (out of 800) on the math SAT or more than 630 on the verbal. Nearly all have had exceptional career success, many of them in academia. Yet 64.9% of the men and 69% of the women remain childless at age 33, compared to 26.4% of the general population. Across the intelligence spectrum, from very dull to very bright (125 or higher on a standard IQ test, which is to say less than 5% of the population) – and adjusted for differences in fertility among different cultural groups – fertility and likelihood of marriage are inversely related to IQ. Very bright people are the least likely to marry and reproduce, and they have fewer children when they do. You might argue that this difference is by choice rather than from social ineptitude, but that reinforces the point: such a choice is an evolutionarily novel one.

Bright people make plenty of other evolutionarily novel choices. They are more likely to be vegetarian. They drink more than duller folk, and are more likely to binge drink. (Naturally fermented fruit was part of the diet of our ancestors, but it’s hard to find enough of that to get drunk; purposeful brewing began with farming, which supplied the necessary abundance of fruit and grain, and distilled hard liquors came along only after 700 AD. Without distilling, wine maxes out at 12-15% alcohol, at which point it kills off its own yeast.) They use more illegal drugs. Bright people commit fewer violent crimes – which fits the pattern, since violence (directed outside one’s own clan of 20-150) was common on the savanna, remains so today among modern hunter-gatherers, and even predates people; it is anything but novel. (See the October blog Tea and Chimpanzee for video of chimps raiding a neighboring group of chimps and eating the one they catch.) Bright folks do commit more white-collar fraud, an evolutionarily novel crime, however.

Kanazawa concludes, “Yes, intelligent people make better physicians, better astronauts, better scientists, and better violinists, because all of these pursuits are evolutionarily novel. But these are all the unimportant things in life.”

One could debate that point, but, to return to the first question, the sitcoms might well have it broadly right.


Penny Accompanies Sheldon and Amy on Their First Date. (Video embedding disabled by request, but follow this link to the clip: http://www.youtube.com/watch?v=-TIgftOZwy0)


Saturday, January 12, 2013

All about the Benjamins


Sumerian proverb inscribed on a clay tablet, c.2400 BC: “You can have a lord, you can have a king, but the man to fear is the tax-collector.” Little changes. Presently, the State of NJ is demanding a payment from me for 2011 that my bank statement shows was paid at the appropriate time. Perhaps Hurricane Sandy had something to do with the records mix-up. I should be able to clear up that one. Meantime, quarterly local property taxes loom; NJ has the highest in the nation, as it has for many years. 

One of the charming chores of the new year all of us face in the US is assembling data for federal taxes due on April 15. Federal, state, and local taxes are all up this year over last, but enjoy the current rates while they last, for they are headed much higher still. Since governments at all levels are plainly incapable of restraining spending (and since We the People are plainly incapable of electing any that are capable), taxes must go up. A lot. (Massive inflation to reduce the real value of the debt is the only other alternative, but the net effect of that – lower real personal income – is much the same.) So, this is a golden era for taxpayers – at least in the current century.

It has been a round 100 years since the 16th Amendment to the U.S. Constitution went into effect. There is a notion in some quarters that the income tax was unconstitutional prior to the amendment’s adoption in 1913. This is not quite right. Congress had the power to lay taxes, but there was a catch on “direct taxes”: “No Capitation, or other direct, Tax shall be laid, unless in Proportion to the Census or Enumeration herein before directed to be taken.” A direct tax is a tax on people or property; an indirect tax is a tax on an event, such as a sales tax, excise tax, or tariff. In 1894, populists in Congress approved a 2% income tax (the first peacetime income tax enacted) on incomes over $4000, a threshold that exempted 90% of the population. It was not a revenue measure, since the government ran a surplus, but a matter of fairness. (Old themes recur with frequency in taxation matters.) In 1895, however, the Supreme Court ruled in Pollock v. Farmers’ Loan and Trust Company that an income tax on certain types of income counted as a “direct tax”; the 1894 bill hadn’t apportioned the tax among the states according to the census, so it was nullified. (Apportionment means that a state with, say, 10% of the national population must supply 10% of the tax, however its incomes compare with other states’ – an impractical requirement for an income tax.) The apportionment requirement was an annoyance to legislators, so the 16th Amendment was proposed. It passed by the required majorities in Congress and the states with surprisingly little argument. The Senate vote was 77-0, the House 318-14. The Amendment reads:

“The Congress shall have power to lay and collect taxes on incomes, from whatever source derived, without apportionment among the several States, and without regard to any census or enumeration.”

It took effect early in 1913, in the last days of the Taft Administration. Later that year, during the Wilson Administration, an income tax free of apportionment was levied, with rates running from 1% at $20,000 up to 7% on incomes above $500,000 (more than $10 million in 2013 dollars). Rates varied enormously over the following decades, but until World War Two most income-earners didn’t earn enough to be required to file; as late as 1939, the height of the pre-War New Deal, only 5% of the population filed income tax returns.

Today, of course, nearly everyone files, and even those who don’t owe “income tax” still pay payroll taxes, sales taxes, communications taxes, energy taxes, property taxes (even renters pay these indirectly), transfer taxes, vice taxes, and so on.

Once again, this is nothing new. A Sumerian historian on another tablet complains that a shepherd has to pay five shekels to the ishakku (local king) of Lagash for each white fleece. The ishakku gets five shekels (and one more to his vizier) for each divorce. The perfume maker owes five shekels to the ishakku (plus one to the vizier and one to the steward) for each batch (it’s not clear what quantity). When a Lagashian died, he tells us, officials would show up to collect an inheritance tax of barley, bread, beer, and furnishings from his grieving relatives. The list goes on.

On occasion, a political leader will get uneasy about such exactions. In the second century AD, the Roman emperor Marcus Aurelius realized that the tax burden was exceeding the capacity of the citizenry to carry it. Accordingly, he admonished his troops who wanted a bonus for their hard fighting on the Rhine and Danube, “Anything you receive over and above your regular wages must be exacted from the blood of your parents and relations.” Nevertheless, to meet those regular wages, he sold off imperial furniture, jewels, and treasures (including his wife’s silks) in an auction rather than raise taxes. I don’t expect any such thing from contemporary politicians. (Marcus did debase the silver denarius to pay off debts in cheaper coins though – which is to say he opted for inflation.)

It could be argued that, in between the rules of extravagant tyrants at least, the Romans got fair value for their denarii. Whether or not we do ultimately doesn’t matter: we will pay our taxes regardless. It’s hard to improve on Ben Franklin’s famous remark on the subject, so I’ll simply quote it: "Our new Constitution is now established, and has an appearance that promises permanency; but in this world nothing can be said to be certain, except death and taxes." Margaret Mitchell in Gone with the Wind agreed: "Death, taxes and childbirth! There's never any convenient time for any of them." At least I don’t have to worry about one of those.

Saturday, January 5, 2013

Pygmalion 2.0


Last night I finished The Children of the Sky, Vernor Vinge’s sequel to his marvelously imaginative sci-fi novel A Fire Upon the Deep, set largely on a world where packs of telepathic canines are intelligent life-forms – the intelligent individuals are not the “dogs” one by one (they’re just dogs, pretty much), but the packs. I’d recommend Fire to anyone, but Children only to the hardest of hardcore Vinge fans. Nevertheless, it brought to mind something else for which Vinge is well known, even among those who are not fans of his novels.

It has been 20 years since Vinge presented his paper, The Coming Technological Singularity: How to Survive in the Post-Human Era. The singularity refers to the emergence of superhuman intelligence – aware intelligence – after which we will enter “a regime as radically different from our human past as we humans are from the lower animals.” He predicted the moment would arrive after 2005 but before 2030. He hasn’t yet altered that prediction. After the singularity, the future belongs to post-humans, who might or might not be just technology-enhanced humans.

Artificial Intelligence is an old concept. Self-aware computers and robots have been a staple of sci-fi almost from the start: Maria in Metropolis, HAL in 2001, and Colossus in The Forbin Project are obvious examples. Vinge was not the first to suggest that a bio-cultural sea-change would follow the appearance of true AI, nor did he invent the term “singularity” to describe it. As Vinge notes in his 1993 paper, the early computer scientist John von Neumann (1903-1957) used the term in a similar context. However, Vinge popularized the idea more successfully than anyone before him, and most discussions of the singularity begin with Vinge. He lists several routes by which superintelligence could arrive:

"--There may be developed computers that are 'awake' and superhumanly intelligent…
--Large computer networks (and their associated users) may 'wake up' as a superhumanly intelligent entity.
--Computer/human interfaces may become so intimate that users may reasonably be considered superhumanly intelligent.
--Biological science may provide means to improve natural human intellect."

The last option recalls the eugenics movement popular in intellectual circles a century ago. Even with the growing potential of genetic engineering, however, the inside track presently seems to belong to electronic/photonic technology. Already, an otherwise ordinary person connected to the internet can ace an IQ test, even though the credit for that belongs to his hardware rather than his wetware. Google Glass (demo below), scheduled for limited market release this year, offers a continuous internet connection with a heads-up display. This comes close to option #3.

The ultimate game-changer, though, would be true machine intelligence. Sci-fi is full of Terminators and other evil intelligent machines bent on destroying humanity, but nothing of the sort need be the case. Machines will have whatever values we give them. (On second thought, there is something scary about that.) A sci-fi novel worth a read is Saturn’s Children (2008) by Charles Stross. Humanity in his future solar system has ceased to exist, not because it was destroyed but because it just faded away; humans didn’t see the point of biological reproduction when their robots were so superior. The robots retain their human-like forms and values even though they make little sense in a universe with no more people, because to change would alter their identities – the essence of what they are.

Does tomorrow really belong to our souped-up tinkertoys? Maybe. Perhaps that is for the best, too. If the idea makes you uncomfortable though, you are not alone. Vinge himself remarks, “I think I'd be more comfortable if I were regarding these transcendental events from one thousand years remove ... instead of twenty.”

Google Glass Demo

HAL-9000 Would Rather Do It Himself in 2001: A Space Odyssey (1968)