Friday, December 31, 2010

Hi Five

It is 5 hours EST until 2011, which is reason enough to mention that 5 is one of my favorite numbers. Yes, I have favorite numbers. Don’t you? I’m not alone on 5 either. The ancient Pythagoreans thought numbers were what the universe was all about, and 5 was a big deal for them. First of all, the number and the associated pentagram were symbols of health. Why?

[Pentagram diagram: segments AB and BC marked along one line of the star]

AB/BC is the Golden Ratio, (1 + √5)/2, or 1.61803… This ratio is also derivable from the Fibonacci sequence, which Fibonacci described in 1202 AD to explain the growth of rabbit populations – no kidding – and which includes the number 5 (1, 1, 2, 3, 5, 8, 13…). The ratio and sequence occur frequently in nature for reasons no one yet has figured out. They turn up in everything from leaf growth to seashells, and the Greeks believed the ratio produced the most aesthetically pleasing physical objects. The long sides of the Parthenon are 1.618 times the short sides.
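If you doubt the Fibonacci connection, a few lines of Python (mine, not the Pythagoreans’) show the ratios of consecutive Fibonacci numbers homing in on the Golden Ratio:

```python
# Ratios of consecutive Fibonacci numbers converge to the Golden Ratio.
from math import sqrt

phi = (1 + sqrt(5)) / 2      # 1.6180339887...

a, b = 1, 1
for _ in range(30):
    a, b = b, a + b          # next Fibonacci number

print(b / a)                 # 1.6180339887..., matching phi
print(phi)
```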
The Pythagoreans also associated 5 with marriage, but, hey, no number is perfect. (The mathematicians out there are shouting, “You’re wrong! Any number that is the sum of its positive divisors excluding itself is ‘perfect.’” That is true, but you know what I mean.) Marriage comes into the picture because 5 is the sum of the first “female” number, 2, and the first “male” number, 3 – they didn’t consider 1 to be a number since it is a unity.
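For any of those shouting mathematicians who want a demonstration, here is a quick, deliberately brute-force Python check of that definition:

```python
def is_perfect(n: int) -> bool:
    """True if n equals the sum of its positive divisors excluding itself."""
    return n > 1 and sum(d for d in range(1, n) if n % d == 0) == n

# The perfect numbers below 10,000:
print([n for n in range(2, 10000) if is_perfect(n)])   # [6, 28, 496, 8128]
```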
There are 5 Platonic solids – convex solids whose faces are identical regular polygons, with the same number of faces meeting at every vertex: tetrahedron, cube, octahedron, dodecahedron, and icosahedron.
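Why exactly 5? At every vertex at least 3 faces must meet, and their corner angles have to total less than 360 degrees. A little Python sketch of that constraint turns up exactly five (p, q) combinations – p-sided faces, q of them around each vertex:

```python
# Enumerate the Platonic solids from the vertex-angle constraint:
# q regular p-gons meet at each vertex, and their interior angles
# must sum to less than 360 degrees (p >= 3, q >= 3).
names = {(3, 3): "tetrahedron", (4, 3): "cube", (3, 4): "octahedron",
         (5, 3): "dodecahedron", (3, 5): "icosahedron"}

for p in range(3, 12):                     # faces are regular p-gons
    for q in range(3, 12):                 # q faces meet at each vertex
        interior_angle = 180 * (p - 2) / p
        if q * interior_angle < 360:
            print(p, q, names[(p, q)])     # prints exactly five solids
```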
5 is the largest number most people instinctively recognize without actually pausing to count. So, if you simply glance at a shelf with no more than 5 books on it, you will be able to say instantly how many are there. Unless you are unusual, if there are more than 5 books, you (however briefly) will have to count. Most higher animals lose track at 3, but somehow it is hard to be proud about doing only 2 better. This is one reason counting by 5s is easy; the 5 digits on each hand help too, of course. 5 has a prominent place in many number systems, such as the Roman (V=5, L=50, D=500). Incidentally, the classical Romans almost never used subtractive notation. They did not write IV for 4 or XC for 90 as we do today; nearly always they wrote IIII or LXXXX. The subtractive notation became popular in medieval times, perhaps to befuddle the peasantry further.
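If you’d like to write numbers the way a classical Roman would have, a short Python sketch of the purely additive style (no IV, no XC) does the trick:

```python
def to_roman_additive(n: int) -> str:
    """Roman numerals the classical way: purely additive, no subtractive pairs."""
    values = [(1000, "M"), (500, "D"), (100, "C"),
              (50, "L"), (10, "X"), (5, "V"), (1, "I")]
    out = []
    for value, symbol in values:
        count, n = divmod(n, value)    # how many of this symbol fit
        out.append(symbol * count)
    return "".join(out)

print(to_roman_additive(4))    # IIII (not IV)
print(to_roman_additive(90))   # LXXXX (not XC)
```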
Finally, on a personal note, 5 is the number of truly serious inamoratas who have entered and left my life to date, and all were a handful.


Sunday, December 26, 2010

To Dally in the Valley

I’ve seen only the trailers, but the movie Tron: Legacy, still in theaters at this writing, has not impressed critics with anything other than its graphics, which admittedly are excellent. These are enough to make the film notable, however, whether or not it is worth seeing as entertainment. In particular, a digitally created younger version of Jeff Bridges, playing opposite the actual older Jeff Bridges, is so well crafted that the animation isn’t perceptible. Virtual Jeff lives in the hills on the far side of the Uncanny Valley.
 
The existence of the Uncanny Valley was proposed in 1970 by roboticist Masahiro Mori. It refers to a peculiar variation in human emotional response to human-like images and figures. Mori noted that people generally respond positively to cartoon characters, robots, or dolls if the figures have some human features – think of Bugs Bunny, Robby the Robot in Forbidden Planet, or teddy bears. Human response gets more and more positive as more anthropomorphic features are added to the figures, but only up to a certain point. Then something happens. Past that point, as the images continue to get closer to the appearance of authentic humans, human receptivity plunges. We think they are creepy. If the images continue to improve in realism, however, we reach another point where the creepiness starts to diminish. Human receptivity rises again. The zone of verisimilitude in which humans respond negatively to the images and figures is the Uncanny Valley. The trick for roboticists, dollmakers, and animators is to keep on this side of the valley or get all the way to the other side. Reaching the other side is so technically difficult that only in recent years has it even become possible. Examples of imagery that didn’t make it to the farther hills can be found in The Polar Express (2004); the animation is very realistic, but off by just enough that critics dubbed this visually disturbing movie Zombie Express.

The most common explanation for our negative reaction to “close but no cigar” anthropic images is that they evoke our instinctive revulsion in the presence of sickness and death. The images look real but not healthy, and we don’t want to catch whatever is wrong with them.

CGI is now capable (barely) of creating images of people that are indistinguishable from live actors and therefore don’t creep us out. I’m an old-fashioned guy, however, and stuff made only of electrons and photons doesn’t impress me nearly so much as stuff with plenty of nucleons, i.e. real solid objects, robots in particular. The future of android robots is visible, even if it is not yet here. The very best Disney animatronics jump the Uncanny Valley, but only within the context of their specific displays and rides. The Japanese firm Kokoro manufactures much more versatile “actroids” that, if not yet out of the valley, at least have reached the far foothills. Nevertheless, I suspect I still have a long wait before I can order my Cherry 2000 (see http://www.youtube.com/watch?v=WkOfriUsB6A ).


Actroid Robot Manufactured by Kokoro


Monday, December 20, 2010

Virtually Wise

In the age of the Kindle and its clones, a personal library is an anachronism. Amazon offers 750,000 titles on its virtual bookshelves, and Google eBooks has already scanned some 15 million books, newspapers, and magazines into its records. A wafer-thin electronic pad costing less than $200 can access online not only more books than are in my personal library, but a couple of orders of magnitude more than are in the county library. I started filling my wooden bookshelves long before there was an alternative to paper-and-ink, however, and I couldn’t afford 750,000 books. I own merely a few thousand. This isn’t enough to impress anyone greatly, but visitors to my house nonetheless sometimes ask, “So, have you read all these books?”

My response is always, “Yes. I should be a lot smarter than I am, shouldn’t I?”

This is meant as a joke, but actually it is true. The odd bits of information that have stuck with me from all those pages probably wouldn’t see me through Are You Smarter than a Fifth Grader? Oh, I probably can pick out a line by Yeats from four possible answers, and I know who Basil the Bulgar Slayer was (beyond just some guy who slew Bulgars), but how likely is it anyone will ask those questions? I might well miss the Yeats question and I’d have to look up who succeeded Basil.

Reading whole books, solid or virtual, is an old-fashioned way of acquiring information, and one that is fading. Nearly anything we need to know is a few clicks away on our computer screen or iPhone, so why try to pre-stuff our heads with data? Any D student with a laptop can answer in seconds obscure questions that stump his unaided professor.

Is it fair to say these new intellectual capabilities make us smarter? Wiser? Or are we actually getting dimmer as our microchips take over?

I once mentioned to a friend that pocket calculators had diminished basic math skills. I gave as an example the extraction of square roots, which we learned to do in grammar school (at least I assume it is still taught there). Few adults remember how to do it; the omnipresence of calculators makes it unnecessary. My friend answered, “If I saw an employee trying to extract a square root by hand, I’d fire him.” He had a point. In the modern world this skill is about as useful as flint knapping.
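For anyone who has forgotten the method (i.e., nearly everyone), here is the grammar-school digit-by-digit procedure sketched in Python – which, of course, rather proves my friend’s point:

```python
def digit_by_digit_sqrt(n: int) -> int:
    """Integer square root by the grammar-school digit-by-digit method."""
    # Split the number into pairs of digits, working from the right.
    s = str(n)
    if len(s) % 2:
        s = "0" + s
    pairs = [int(s[i:i + 2]) for i in range(0, len(s), 2)]

    root, remainder = 0, 0
    for pair in pairs:
        remainder = remainder * 100 + pair   # "bring down" the next pair
        # Find the largest digit d with (20*root + d) * d <= remainder.
        d = 9
        while (20 * root + d) * d > remainder:
            d -= 1
        remainder -= (20 * root + d) * d
        root = root * 10 + d                 # append the digit to the root
    return root

assert digit_by_digit_sqrt(1522756) == 1234  # same answer pencil and paper give
```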

Is that D student with his laptop really smarter than his professor? Maybe, if you consider the student and his computer together as a unit. Still, I can’t help feeling uneasy about this answer.

If there is virtual intelligence, can there also be virtual wisdom? I suppose we first need to define wisdom before we can answer that question. Fortunately, the University of Chicago has in progress a four-year, $2,000,000 project called Defining Wisdom. I’ll not comment on how wise a use of $2,000,000 that may be. The University of Chicago has competition, too. At Butler University in Indiana, the computer scientist Ankur Gupta takes a quantitative approach. “The goal is to try to use data compression as a mathematical measure of wisdom,” he says. With or without a Kindle, I never would have thought of that.
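I can only guess at what Gupta has in mind, but the underlying intuition is easy to demonstrate: the better a text compresses, the more pattern and redundancy it contains. A toy illustration in Python – my own sketch, emphatically not Gupta’s method:

```python
import os
import zlib

def compression_ratio(data: bytes) -> float:
    """Compressed size divided by original size; lower means more structure."""
    return len(zlib.compress(data, 9)) / len(data)

print(compression_ratio(b"the quick brown fox " * 50))  # repetitive text: ratio well under 1
print(compression_ratio(os.urandom(1000)))              # random bytes: ratio near 1
```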

Thursday, December 16, 2010

On Doing It Twice

Despite the evidence of a few recent posts, I’m really not stuck in the 60s. The decade simply has been getting underfoot lately. It tripped me again the other night when a friend, who is some 20 years younger than I am, commented on a TV commercial for HP ePrint. The ad shows a happy baby in a stroller tearing through the countryside and around city streets to the song Brand New Key.

“I like that commercial and that song,” she said. “Who sings it?”

As it happened, I knew the answer. “Melanie.”

“Melanie who?”

“Melanie Safka. I might even have it on vinyl.” (I do. It’s on the Gather Me album.)

“On vinyl? It’s that old? I don’t remember it.”

“It’s older than you are. Some radio stations wouldn’t play it back then.”

“Why not?”

“Because it’s about a young girl exploring her sexuality.”

“You’re kidding. Then why are they using it for that ad?”

“I’m not kidding. Melanie was a little coy at the time, of course, since there was the issue with radio, but she didn’t deny it either. But that was 40 years ago. Hardly anyone remembers anything about it. Besides, it’s all metaphor. Lyrics are a lot more direct these days, so I doubt many people would read anything into it now.”

I also had to explain the literal lyrics: there were kids’ roller skates then that fitted over the top of sneakers or shoes and were tightened with a key. Perhaps these are still manufactured, but I haven’t seen any in a long time.

I remember Melanie well for a lot of reasons. The ultimate Flower Child performer, she was everything that was right about the 60s. You can’t get a better expression of the naïve but captivating counterculture world view than her song Beautiful People (http://www.youtube.com/watch?v=JeHtuwsUeRw if you have the time). She also is associated in my mind, through no fault of her own, with what was wrong with the decade. The 60s were a kind of party, and every party comes with a hangover. They are inseparable. (The current economic malaise is a hangover from a decade-long house-buying party.) I won’t list all the types of hangovers from the era, though there were a lot, including some we haven’t yet shaken off. I will mention a literal one though.

In 1971 my head hung over the toilet bowl. You know why. We’ve all been there – OK, maybe there are some exceptions who haven’t ever assumed this position, but they can’t be numerous. As my body strained to eject whatever was left inside of me (though there had been nothing left for the past ten minutes), the thought that kept going through my head was, “Why would anyone ever do this twice?” (It was another decade before I became a near-teetotaler, though.) The song on the stereo couldn’t have been more appropriate. It was Melanie’s Leftover Wine. To this day, I can’t listen to it without feeling queasy. Too bad. It’s a good song.


Saturday, December 11, 2010

Private Eyes

A side effect of the communications revolution – one missed by Marshall McLuhan – is the rise of the surveillance society. One may argue it was not entirely unanticipated; it is a central element of Orwell’s Nineteen Eighty-Four (1949), after all. However, Orwell conceived of video surveillance as an extension of secret police tactics already in use in the 40s – as an enhancement of the security state. That’s not quite the way it turned out.

To be sure, there are plenty of police and security applications of the technology. If you offer an alibi that, for example, you were driving to Manhattan at the time a crime occurred in Easton, police can check your EZ Pass statement and video footage from inside the Lincoln Tunnel to check your story. Yet, this is not quite the same thing as state authorities keeping a watchful eye on the masses. For the most part, we the people spy on each other. Surveillance isn’t centralized, but for that reason it is all the more pervasive. The security firm ADT, for example, currently runs a TV ad which depicts a parent at work using his laptop to check the camera in his home foyer; he smiles as he watches his teen daughter enter the front door after school. RFID tags originally intended to track pets are now marketed to track kids. Using a cell phone provides your GPS coordinates to the phone company. Scarcely a speck of ground is uncovered by the satellite imagery of maps.google (often accompanied by ground-level pics). A growing number of real-time cameras mounted on buildings and lampposts are publicly accessible on the net (a high-angle view of Times Square: www.earthcam.com/usa/newyork/timessquare/ ), and more than once such images have been used to catch a philandering spouse. Even a minor event such as a spat between high schoolers is likely to be captured by someone’s cell phone camera and posted on Facebook. With very little more integration of these technologies, anyone who cares to do so will be able to track another individual’s movements 24/7. The remarkable thing, at least to many of my generation, is how little concern all this seems to generate, especially among the young who have grown up with it.

To the extent electronic eyes were meant to deter crime, they haven’t worked. There are so many cameras everywhere that watchers simply can’t monitor them all. A mugging directly in front of a street-cam most likely will go unnoticed. Drug deals and other street crimes are funneled into blind spots or completed inside cars where the exchanges can’t be recorded clearly. Working the night shift alone in a convenience store is no safer than it was 40 years ago. The cameras do help solve crimes after the fact, though, which is reason enough to expect their coverage to go on expanding.

This is so different from my childhood when – with no cell phone or beeper – I played either alone or with friends (often biking to the school playground or local shopping center) with no “supervision” or expectation of any. Today, I suppose this would be considered a case of negligent parenting, but at the time it was the norm; we made fun of kids whose parents hovered over them more than this. Well into my adulthood there was a simple assumption of privacy pretty much everywhere. It still catches me a little off guard when I notice a camera pointed my way when grocery-shopping or just strolling down a sidewalk.

This is one of those “you can’t put the genie back in the bottle” developments that it does little good to bemoan. It is, however, a change significant enough to deserve more comment than the shrug it usually gets. An expectation of the lack of privacy is part of the shape of the modern world. We have met Big Brother, and we are he.

Tuesday, December 7, 2010

The 7th Revisited

Rather than rehash the subject in slightly different words, reprinted below is a post I wrote back in 2007 about Pearl Harbor. I see no reason to change it.

Iowa on the Oder

Back around the time I was born, this complaint about the war of the day (Korea) was commonplace: "It's the wrong war at the wrong time in the wrong place against the wrong enemy." It was, too. The same can be said for every war fought by the US at least since 1898.

Pacifism? No, not really. I'm all for hitting back. I'm just not keen on seeking out wildcats and poking them with sticks.

I want to distinguish between context and blame, since the two are too often confused. In particular, any attempt to place 9/11 in geopolitical context is likely to evoke charges of “blaming the victims” when no such thing is intended.

So, in recognition of the day, let’s instead consider the context of an attack we can view with more detachment: the one of 12/7/41. The context of the attack was a series of deliberate provocations of Japan by the US, the most damaging of which was an embargo of strategic materials including oil. (Back in the day, the US was the leading oil producer and exporter.) The provocations were not mindless, but were in response to Japan’s ongoing war with China; all the same, most Americans, while sympathetic to China, felt the Sino-Japanese war was "not our business." The Roosevelt Administration knew full well the consequence of its actions would be war between Japan and the United States. This was, in fact, the idea. Secretary of War Henry Stimson wrote the following in his diary shortly before the attack:

"[Roosevelt] brought up the event that we were likely to be attacked perhaps next Monday [December 1], for the Japanese are notorious for making an attack without warning, and the question was…what we should do. The question was how we should maneuver them into the position of firing the first shot without allowing too much danger to ourselves."(Source: The Pacific War by John Toland.)

Kido Butai, the Japanese carrier force, fired a more effective first shot than Stimson or Roosevelt anticipated.

It is important to understand this context. It does not excuse the Japanese strike on Pearl Harbor, nor does it blame the victims. The US provocations fell short of acts of war. Japan could have bought (or seized) the needed strategic materials in Southeast Asia without attacking the US, though its sea lanes would have been precariously exposed. A less warmongering government than the one in Tokyo at the time wouldn't have opted to attack. The ultimate blame therefore lay with Japan; after Pearl Harbor there was little left for the US to do but hit back. However, it is fair to ask if FDR should have poked this particular wildcat.

Similarly, more recent attacks on the US are not excused by the context, and the victims certainly are not to blame. Still it is fair to question the poking of wildcats.

Back in the 19th century, Otto von Bismarck predicted the next big war would be started by “some damn fool thing in the Balkans.” He also said, “The whole of the Balkans is not worth the life of a single Pomeranian grenadier.” He was prescient. After a damn fool thing in the Balkans in 1914, 18 million people died, including more than a few Pomeranian grenadiers.

Well, the whole of the Middle East and Central Asia is not worth the life of a single Iowan rifleman. The matter of oil complicates our approach to the region. It shouldn't. There are other sources and there are other fuels – all of which must be cheaper than the trillion dollars spent on the wars. Besides, regardless of the regime, what can any producer do with the stuff but sell it to the West?

Isolationism? Why not? More often than not, the isolationists have the better argument.

Wednesday, December 1, 2010

Velvet Mouse

Tom Wolfe once said that it never pays to be more than five minutes ahead of your time. A hippie in the 50s, a disco dancer in the 60s, and an Eminem-style rapper in the 70s all would have faced social scorn and ridicule. The innovative streamlined 1934 Chrysler Airflow failed not just because of the Depression but because the public wasn’t ready for the look. In the 1970s, Xerox developed a graphical interface that was essentially Windows, but home computers powerful enough to run it didn’t yet exist. The first generation of plate-size video discs appeared too soon, and was leapfrogged by later consumer electronics technology. F. Paul Wilson: “The late mouse gets the cheese.”

If you are one of those people more than two steps ahead, folks someday, when they look back, may point you out as a pioneer or trailblazer, but that doesn’t help your present bank account or land you a spot on The Tonight Show.

What brings all this to mind is the CD I randomly plucked out of the center console this morning while driving to work. It was The Best of the Velvet Underground. The original alternative rock group, The Velvet Underground is often called the “most influential rock band of all time.” Yet, in the 60s and 70s it was not a commercial success, despite (because of?) its association with Andy Warhol and The Factory. The dark moody lyrics didn’t fit the 60s Zeitgeist, which was a blend of rebellion and Love; besides, the sound didn’t rock and you couldn’t dance to it. The total effect of the music was just weird in a way that connected only with a relative handful of enthusiasts. Thanks to Warhol, almost everyone at the time who was moderately conscious of contemporary music knew the group existed, but hardly anyone could identify one of its songs. The band didn’t get airtime on the most popular radio stations. Not one of my friends in high school (1966-70) owned a Velvet Underground album, and, for all the prevalent music talk among them, I don’t recall any of my classmates so much as mentioning the band. The Velvet Underground was more than five minutes ahead of its time, and suffered for it.

Other musicians and lyricists noticed, even if most of the general public did not, and adapted their sounds accordingly. The Zeitgeist eventually caught up, and dark, moody weirdness went mainstream – more mainstream, anyway. The group’s albums sell far better now than they did when they were recorded in the 1960s, and veteran Lou Reed is better known and respected than ever.

I didn’t give the group a serious listen until 1974, nearly a decade late. The (sometimes literally) offbeat sound appealed to me then and still does today.

Tom’s point remains valid with regard to commercial success and social acceptance, but, all the same, there is something to be said for striking out into new territory – at least for those of us further back on the trail.