This week’s Economist has a special section called
(with the magazine’s usual penchant for puns) “Chips with everything.” It is
about the inclusion of computer chips not just in items that we normally think
of as high tech, but in things as mundane as lamps, doorbells, and fish tanks. (A
Las Vegas casino's network was recently hacked via its smart fish
tank.) They talk to each other, too, as elements of pervasive Artificial
Intelligences. Smart TVs (and it is getting hard to buy any other kind) watch
the watchers and collect data on viewing habits, which they pass on to data
mining AIs. Some TVs monitor conversations. Connecting them with the other
household chips offers some interesting possibilities. With existing tech
(though to my knowledge it is not yet done) a TV could, for example, enhance
the experience of a horror movie by flickering your lights and switching fans on
and off to make creepy sounds, judging for itself how and when to do it. Our
cars are already apt to take over the driving from us (e.g. by braking or
correcting for lane drift) if they think we are doing a bad job. Everything is
getting smarter except possibly the people who increasingly rely on tech to do
their thinking for them. One wonders if the driving skills of humans will
atrophy as AIs take over. So, too, other skills.
Moore's Law, the observation
that the number of transistors per chip doubles (and the cost per transistor
roughly halves) every two years, continues to operate as it has for half a century. Repeated
predictions that it was nearing its practical limit have proven premature. So
our toys are ever smarter while their integration into the internet makes the
emergent AIs ever eerier. For a time it seemed as though humans were getting
brighter, too, albeit at a much slower pace. The Flynn Effect is the apparent
increase in average intelligence that manifested in the 20th
century. Average intelligence is by definition 100, so whatever the average
number of correct answers on an IQ test might be in any given year is assigned
the number 100. (The most common scale has a standard deviation of 15, meaning
that about 68% of the population falls between 85 and 115; roughly 13.5% are between
115 and 130 and another 13.5% between 70 and 85; only about 2% score above 130 and
2% below 70.) Yet the raw number of correct answers on the tests increased year by year
in the 20th century. Judging by results on the original, century-old tests
(without renormalizing each year), it appears that average
intelligence in advanced countries rose 3 points per decade.
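For the numerically inclined, here is a minimal Python sketch of the bell-curve arithmetic behind those bands, assuming (as the scale intends) that scores are normally distributed with mean 100 and standard deviation 15:

```python
# Percentages of the population in each IQ band, assuming scores are
# normally distributed with mean 100 and standard deviation 15.
from math import erf, sqrt

MEAN, SD = 100.0, 15.0

def cdf(score: float) -> float:
    """Fraction of the population scoring at or below `score`."""
    return 0.5 * (1.0 + erf((score - MEAN) / (SD * sqrt(2.0))))

def band(lo: float, hi: float) -> float:
    """Fraction of the population scoring between `lo` and `hi`."""
    return cdf(hi) - cdf(lo)

print(f"85-115:  {band(85, 115):.1%}")   # ~68.3%
print(f"115-130: {band(115, 130):.1%}")  # ~13.6%
print(f"70-85:   {band(70, 85):.1%}")    # ~13.6%
print(f">130:    {1 - cdf(130):.1%}")    # ~2.3%
print(f"<70:     {cdf(70):.1%}")         # ~2.3%
```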
Something seems wrong about this. At 3 points per decade, the
average test-taker in 1952 (the year I was born) scores, by 2002 standards,
at the very bottom of the normal range (85, and lower still by 2019 standards,
but more on that in a moment); to put it the other way around, the
average present-day test-taker is, by 1952 standards, intellectually gifted
(above 115). This bothered James Flynn, who discovered the effect in the 1980s.
“Why,” he asks, “did teachers of 30
years’ experience not express amazement at finding their classes filling up with
gifted students?” They didn’t and they don’t. Quite the contrary. Despite much heavier
homework loads, kids in high school today perform worse than students in the
1950s and 1960s. Today's 12th graders have a smaller active vocabulary than
12th graders of 50 years ago. They have no better understanding of
algebra or geometry, and without calculators their elementary math skills are
worse. The SATs had to be made easier to keep the nominal scores from tanking.
If you took the SAT in the 1970s, add about 70 points to your verbal score
and 30 points to your math to adjust them to 21st-century
standards.
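As a back-of-the-envelope check on the renorming arithmetic above, here is a small Python sketch; the 3-points-per-decade rate and the 70/30-point SAT adjustments are simply the figures quoted in the text, not official conversion tables:

```python
# Back-of-the-envelope renorming arithmetic from the text above.
FLYNN_RATE = 3.0  # IQ points per decade, the rate quoted in the text

def rescore_iq(iq: float, test_year: int, norm_year: int) -> float:
    """Re-express a score from test_year against the norms of norm_year."""
    return iq - FLYNN_RATE * (norm_year - test_year) / 10.0

# The average 1952 test-taker (100 by 1952 norms), judged by later norms:
print(rescore_iq(100, 1952, 2002))  # 85.0 -- the very bottom of the normal band
print(rescore_iq(100, 1952, 2019))  # ~79.9 -- below it

def adjust_sat(verbal: int, math: int) -> tuple[int, int]:
    """Nudge a 1970s SAT score to 21st-century standards (figures from the text)."""
    return verbal + 70, math + 30

print(adjust_sat(550, 600))  # (620, 630)
```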
How then to explain the 20th
century rise in raw scores on IQ tests? A large part of it is that education
and scholastic tests have steadily shifted to focus more on abstract reasoning
(which is what IQ tests measure) and less on nuts-and-bolts facts and skills.
In short, advertently or otherwise, educators increasingly have taught to the
exam. Flynn uses the example of crossword puzzles. Almost
all neophytes are bad at them, but eventually one learns to anticipate puns and
unusual uses of words. With practice one becomes good at them. Something
analogous seems to have gone on in IQ tests: practice. (It should be
noted that none of the improvement was at the high end, which is to say that
the raw scores of the top 2.5% who are naturally talented at abstract reasoning
didn’t get any better over the decades.) Some of the Flynn Effect may well be developmental,
however: a result of better childhood nutrition and health, in which case the
effect should stall as good health and nutrition become more widespread. And,
indeed, the effect stalled in Scandinavia in the 1990s and in most of the rest of the
West in the early 2000s. In fact, scores went into reverse. See the 2008 paper Secular declines in cognitive test scores: A
reversal of the Flynn Effect by Thomas Teasdale and David Owen, documenting
a decline in IQ scores in Denmark between 1998 and 2004: “Across all
tests, the decrease in the 5/6 year period corresponds to approximately 1.5 IQ
points, very close to the net gain between 1988 and 1998.” They surmised that
the Flynn Effect is at an end in advanced countries, though still at work in
less developed ones. Similar results soon turned up elsewhere. A few years
later The Telegraph noted, “Tests
carried out [in the UK] in 1980 and again in 2008 show that the IQ score of an
average 14-year-old dropped by more than two points over the period. Among
those in the upper half of the intelligence scale, a group that is typically
dominated by children from middle class families, performance was even worse,
with an average IQ score six points below what it was 28 years ago.”
Just as the 20th century
Flynn Effect was (in my opinion) more apparent than real, so too with the 21st
century declines in IQ. In other words, I don’t think kids are biologically
dumber than kids 20 years ago any more than I think they are biologically
smarter than kids 100 years ago. Something in the cultural environment changed
that affected their average scores. I don’t think it is an accident that the
declines in First World countries coincided with widespread access to the
internet, which brings us back to artificial intelligences and chips for all. Outside
of school, young people give every appearance of being far smarter in everyday
life than my peers and I were at their age, but it is an appearance facilitated
by smartphones with endless information at their fingertips. When we didn’t know
something back in the 1960s, we just stumbled around not knowing it until we
asked someone (who might give us the wrong answer) or looked it up in a book,
map, or other non-digital resource. Compared to the instantly informed youths
of today, we were idiots. The credit for the difference, however, belongs to
Steve Jobs.
Trivial Pursuit was a game for 1984 (when sales peaked), not 2019, when
anyone with a phone can get a perfect score – or defeat any Jeopardy champion. Yet there is a
difference between information and knowledge. It may seem like a waste of
brainpower to remember historical dates, or a Robert Frost poem, or the atomic
number of sodium when those are just a finger tap away, but it isn’t.
Creativity, insight, and comprehension are all about making connections between
different bits of knowledge in our own heads, not floating out there in a
cloud. Knowledge also helps us think for ourselves and be less susceptible to
propaganda perfectly tailored (by those AI data miners who know us so well) to
appeal to our existing biases. Yes, our machines can be programmed to make
those connections for us in a way that simulates (or perhaps is) creativity,
but at that point the machines become earth’s sentient beings and it is time to
hand over to them the keys to the planet.
I’m not a Luddite. I’m happy
to let my phone remember other phone numbers and to let my GPS (on the first trip, not the second – or at least
not the third) tell me when to turn right. I’ve no wish to go off the grid. I
like having a world of information in the palm of my hand. Our easy reliance on
the internet does give me some pause though, as does the proliferation of
chips, AI, and surveillance technologies. Increasingly – and perhaps irreversibly
– we have too much information and not enough knowledge.
Quiet Riot – Too
Much Information
Well said. I feel smarter today than 30 years ago, though granted most of that is age, but I do like having instant access to knowledge even if most of it is trivial.
I do sigh a bit when I mention some trivia and then hear the tapping of fingers as someone checks my facts. There is always someone who likes to prove that his phone is smarter than I am, which of course it is.