Sunday, May 29, 2011
On this Memorial Day weekend, I’d like to extend the remembrances to the Roman legions. Why? Because on this date 558 years ago, the Roman Empire fell.
Well, sort of. Ask 10 historians when Rome fell and you’ll get 10 different answers. Some popular dates for the fall: 293 AD, when Diocletian radically reorganized the administration; 330, when the first Christian emperor Constantine moved the primary imperial residence to Byzantium, aka Nova Roma, aka Constantinople, thereby beginning “the Byzantine Empire” (though this is a modern term never used by anyone who lived in it – they still called the place the Roman Empire); 363, at the death of Julian, the last pagan emperor; and 410, when the Visigoths sacked the city of Rome. The western emperor Honorius, who lived in Ravenna in 410, heard what the Visigoths had done, so he took bold action by declaring it illegal to wear pants – pants being Germanic attire.
The date for the fall of Rome most commonly found in schoolbooks is 476 AD. Nothing much happened then. Germanic troops in Roman service had been allowed to settle in Italy; their chief, Odoacer, in 476 deposed the useless western emperor Romulus Augustulus without even bothering to kill him. Yet Odoacer didn’t set up an independent kingdom. The Roman Senate made him a patrician, and the eastern emperor Zeno then appointed him to administer Italy, which remained part of the Roman Empire. Nothing at all changed for the residents of Italy. Theodoric the Great, king of the Ostrogoths, later invited Odoacer to dinner and killed him – “the Great” apparently didn’t refer to his qualities as a host. Theodoric, in turn, became a viceroy for Zeno. The emperor Justinian later restored Italy to direct Roman imperial rule, without any more nonsense about western emperors or Ostrogothic viceroys. The end of Justinian’s reign in 565 is another date commonly favored as “the end” of Rome.
My own preferred date, following Edward Gibbon, is May 29, 1453 at about 11 o’clock in the morning. On that day, after relentless battering by massive cannons, the walls of Constantinople fell. Turkish troops under Muhammad the Conqueror rushed the city, and the last emperor, Constantine XI Palaeologus Augustus, was killed fighting them in the rubble. The city was sacked. The Empire was unmistakably and irretrievably over.
The Romans are overrated. They surely knew how to build bridges and administer laws, but they were stolid authoritarians who hampered technological and social innovation. Still, the modern world owes much to them, and especially to the troops who defended them for 2000 years. Without the Legions holding the line for so long, we in the West might never have had the time, chance, or rich fertile soil to grow beyond them.
Is there a lesson we can take away from any of this? Perhaps that it’s a bad sign when the powers-that-be issue rules about pants.
Wednesday, May 25, 2011
The Dishonest Truth
We all know how to mislead without telling a lie; we simply are selective about what truths we mention.
All history books, even when accurate, are similarly selective. Popular conceptions of history are even more so. Here is an example. Do the names Hanson, Boudinot, Mifflin, Lee, Gorham, St. Clair, and Griffin ring a bell? If they do, you belong to fewer than 5% of Americans or to a vanishing percentage of anyone else. Many average-length general histories of the USA don’t mention any of these men at all.
So, who the hell are these guys? They are the first 7 presidents of the United States. For reasons that are unclear to me, most Americans simply choose to forget that our current Constitution, drafted in 1787 and in force with modifications since 1789 (the year Washington took office), is our second constitution. The first, the Articles of Confederation, was adopted in 1781. Presidents under it served 1-year terms. The first of them, John Hanson, presided over the adoption of a number of enduring symbols, including the Great Seal of the United States and, yes, the Presidential Seal still in use.
We see a similar selectivity in contemporary news – and not just political news. The trial of Casey Anthony, accused of killing her daughter, is currently a major news story in Florida, for example. Roughly 200 kids are killed by their mothers in this country every year. Why does this particular case (and perhaps one or two others) get relentless media coverage while nearly all the rest are just statistics? I don’t really have an answer.
There is something in this akin to the old Boss Tweed dictum, "It doesn't matter who votes – it matters who counts the votes." Who picks what is newsworthy matters more than who reports it or how. Selective news reporting always has been and remains the rule. (Analysts at MSNBC and Fox correctly accuse each other of it all the time.) At bottom, there is no way around this. We can’t possibly record or report everything, and so we rely on our own subjective judgments about what facts are important.
This necessary selectivity stirs up extreme conspiracy theorists (themselves notoriously selective), of course, since they see that information they deem important has been left out of official or mainstream accounts. Birthers and 9/11 Truthers accordingly scoff at each other while defending their own views by insisting, “Ah, but you didn’t mention or explain this detail.” And we probably didn’t. We didn’t think it mattered. In fact, going back again to the 18th century, why do we so seldom hear that 1776, in addition to being the year of the American Revolution, also was the year of the publication of Adam Smith’s The Wealth of Nations, the publication of Gibbon’s first volume of The Decline and Fall of the Roman Empire, and the founding by Adam Weishaupt of the subversive secret society the Illuminati in Bavaria? Have those in power conspired to downplay the coincidence so that we would miss it? I kid you not: some people think so – and think the Illuminati achieved power and are still at work pulling strings.
Are the conspiracy theorists crazy? Maybe, but their beliefs, right or wrong, don’t necessarily prove it. These people simply select and dismiss evidence eccentrically. They are useful, if for no other reason, because they force us to question how wisely we make our own selections.
By the way, if you Illuminati folks really are running things behind the curtain, you’re doing a lousy job.
Sunday, May 22, 2011
Vote for the Bikini Model of Your Choice, But Vote
In 2006 Myspace was the dominant social network. It had well over a hundred million users worldwide and was adding tens of thousands per day. In 2007 it confidently declared its victory over the upstart rival Facebook. I still have a Myspace account, but it is a pretty lonely place these days. Just out of curiosity, every week or two I’ll log on and dodge the passing tumbleweeds. For months, the number of my friends listed as online has been 0.
What happened? I don’t know exactly. Fashions come and go. Maybe Myspace will benefit from a nostalgia fad at some point. Meantime, there remain at least two signs of life on the site, though by all indications the first one is robotic. Whenever I do log on (and typically post a link to this blog site), almost instantly I get messages ostensibly from bikini-clad young ladies with names like Chloe or Misty or Brittany. “Hi, I’m new to Myspace and I noticed your profile. Message me at something-or-other-dot-yahoo-dot-com or click on the link below to see my private photos.”
Myspace deletes the profile pages of the bikini girls pretty quickly. Perhaps they shouldn’t. Some of the fun still to be had on the network comes from checking out these phony profile pages, though not for what the bikini girls supposedly post about themselves, which is pretty tame (the links, of course, go to places that are not). No, the fun is in the comments posted by … well … let’s be kind and call them “naïve” men. There are always some who will thank Brittany for noticing their profiles, and they then will suggest "getting together." They are the second sign of life. Perhaps they persist on Myspace solely in hopes of one day getting an answer from a bikini girl.
I don’t know what could possibly explain the guys' obviously futile behavior, but I suspect it is the same thing that explains why we vote for the same politicians (or their clones) in election after election.
Wednesday, May 18, 2011
The Queen's Mirror Was a Blabbermouth
Global life expectancy in the past century rose from 30 years to 67. Countries vary dramatically, as you might expect. Monaco currently tops the list with a life expectancy at birth of 89.73. Japan, number 5 overall, is foremost of the largish countries at 82.25. The position of the US on the list depends on what you count as a country. (Is Hong Kong to be counted separately from China? Or Jersey from the UK?) Using the UN’s definition, the US is number 34 at 78.37. Those are overall figures for both sexes. Women substantially outlive men almost everywhere.
Yet, in a sense, the human lifespan has not increased at all. An 80-year-old today is much the same as an 80-year-old was in 1911. Nor is an 80-year-old American much different from an 80-year-old Liberian, even though a Liberian has a life expectancy at birth of 41.84. (By the way, according to the Social Security Administration, if you’ve made it to 80 in the US, you’ll probably make it to 89.) Despite all our best science, humans do not age any more slowly than they ever did. All of the increase in life expectancy is thanks to reductions in premature deaths from infection, disease, and accidents – especially among infants. In 1900 the leading cause of death was not heart disease, as it far and away is today, but infection. So, nowadays we are more likely to stay on stage all the way through the Fifth Act, but we aren’t any more likely than actors of previous generations to stay past curtain calls.
Despite the never-ending claims of snake oil salesmen (“New and Improved! Rebuilds telomeres!”), the Fountain of Youth remains as elusive in 2011 as when Ponce de Leon scoured Florida for it in 1513. The annoying advice, “Eat right and exercise regularly,” still has some validity, but even this regimen buys us far less time and even less youthfulness than most of us would like for all that work.
This news article did catch my eye a while back, however:
“LAGOS (Reuters) – Nigeria's anti-narcotics agency confiscated 6.5 tonnes of marijuana Tuesday from the home of a man who claimed to be 114 years old."
He also claimed it was just for personal use. Draw your own conclusions.
Sunday, May 15, 2011
An Owl Hooted Three Times
I attended the wedding ceremony of a friend of mine a couple days ago on Friday the 13th. Obviously, the bride and groom are not very superstitious, and no doubt they had an easy time booking the facilities for the day. As if the calendar weren’t enough, prior to the ceremony a mirror in the room fell from the wall and broke. If a black cat ran by, I didn’t see it. The couple took it all in good humor, though I wonder if they’ll think back to the date and mirror when they hit the inevitable rough patches all relationships have.
I strive to be skeptical in my assessments of the world at large, to the point of being a fan of CSICOP and a reader of the Skeptical Inquirer. These days, out of politeness, I refrain from offering unsolicited statements of these views in general company since they always offend somebody. (I was ruder when I was younger, but who isn’t?) If asked, however, I will answer. If a discussion about some paranormal or supernatural subject arises, I will toss in my skeptical two cents. I don’t raise the subject myself though. Not only is there the politeness issue, but I no longer think it is possible to change minds in this matter. As with so many other issues, most (perhaps all) of us believe what we do not because of reason but because of our predispositions; we then choose what evidence to notice and what to ignore to support them. We rationalize rather than reason. The especial problem with superstition is that a superstitious tendency is hardwired into our very natures – and not just into human nature.
In 1948 B.F. Skinner published a classic paper called Superstition in the Pigeon. At this point, perhaps we should define “superstition” in a way that applies both to pigeons and people: an irrational belief, often accompanied by a ritual intended to influence an outcome. Skinner noted that if he fed pigeons on a purely random schedule, each pigeon would tend to repeat a behavior (twirling, pacing, head-nodding, or whatever) it was performing the first time it was fed. Since the bird now was performing this behavior more often, just by the odds it was likely to be fed again while performing it, which reinforced the behavior. So, every bird tended to acquire a ritual behavior that it associated with getting fed. There in fact was no connection between the rituals and the food, but that didn’t stop the birds.
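The feedback loop Skinner describes is easy to simulate. Here is a minimal sketch in Python (my own illustration, not Skinner’s protocol; the behaviors, feeding probability, and reinforcement rule are invented for the example): food arrives at random, yet whatever the bird happens to be doing when it arrives gets reinforced, so one arbitrary ritual tends to snowball.

import random

# Toy simulation of accidental reinforcement (illustrative only; the behaviors,
# feeding probability, and +1 reinforcement rule are arbitrary assumptions).
behaviors = ["twirl", "pace", "nod", "peck"]
habit = {b: 1.0 for b in behaviors}  # start with no preference

random.seed(1)
for _ in range(10000):
    # The bird performs one behavior, chosen in proportion to current habit strength.
    action = random.choices(behaviors, weights=[habit[b] for b in behaviors])[0]
    # Food arrives on a purely random schedule, unrelated to the behavior...
    if random.random() < 0.05:
        # ...but whatever the bird happened to be doing gets the "credit."
        habit[action] += 1.0

for b in behaviors:
    print(b, round(habit[b]))
# Typically one behavior ends up with most of the habit strength: early lucky
# coincidences make it more frequent, which invites more coincidences -- a ritual
# with no causal connection to the food.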
Humans aren’t much different. Baseball players, for example, are notoriously superstitious. This one insists on eating a Twinkie before a big game and that one carries a rabbit’s foot. What is more, the rituals really do have an impact. If the batter doesn’t get his Twinkie or the first baseman forgets his rabbit’s foot, he is likely to be distracted by the thought just enough to miss or fumble the ball more often. Even if the Twinkie-eater is skeptical enough to believe the association between pastry and batting skill is in his own mind rather than in any magical properties in Twinkies, he still may be distracted if he doesn’t get one.
Why would nature hardwire us to pick up irrational beliefs? Apparently because it is more dangerous to miss a pattern (between tall grasses and lurking lions, for example) than to see one that doesn’t exist. So, we infer patterns from random coincidences and end up performing senseless rituals to deflect nonexistent threats or to affect outcomes in impossible ways – e.g. avoiding cracks in the sidewalk while en route to buy a lottery ticket.
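The asymmetry is easy to put into numbers. A toy expected-cost comparison (the probabilities and costs below are invented purely for illustration) shows why a hair-trigger pattern detector can beat a strictly skeptical one even when nearly all of the “patterns” it sees are illusory.

# Toy expected-cost model of pattern detection (all numbers invented for illustration).
p_lion = 0.01            # chance a rustle in the grass really is a lion
cost_miss = 1000.0       # cost of ignoring a real lion
cost_false_alarm = 1.0   # cost of fleeing from nothing

skeptical = p_lion * cost_miss              # never flee: pay only for misses
jumpy = (1 - p_lion) * cost_false_alarm     # always flee: pay only for false alarms

print(f"skeptical: {skeptical:.2f}")  # 10.00 per rustle, on average
print(f"jumpy:     {jumpy:.2f}")      # 0.99 per rustle, on average
# The jumpy strategy wins even though 99% of its "detections" are wrong.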
Some even argue that openly superstitious people are happier than those of us who resist those impulses (with whatever degree of success). There actually is a condition called “depressive realism.” Depressed people, it seems, have a much more accurate and realistic world view than people generally, though it is not clear if depression makes them realistic or if realism makes them depressed.
While the condition apparently exists, I don’t really buy the argument's implication that superstition is good for you. It is possible to be both realistic and carefree – even if realistic and pessimistic are effectively synonyms. Moreover, I've seen plenty of unhappy mystics. Obsessive-compulsive behavior is just superstition in overdrive, after all, and doesn’t look like a cheery way to live to me. It doesn’t matter anyway as a practical matter. I couldn’t believe in a rabbit’s foot if I tried, though no doubt I believe many things just as silly.
In any event, I hope to attend my friends’ first anniversary, held not in exactly a year but on the nearest Friday the 13th. I suggest they make breaking a mirror part of the tradition.
Wednesday, May 11, 2011
Sparks
A longstanding fantasy of those who have endured too many bad relationships and awkward singles scenes (see previous blog) has been to bypass the whole business by literally manufacturing the perfect date instead – a logical idea for tool-making creatures such as ourselves. Witness the artificial love objects in Metropolis (1927), My Living Doll (1964), Cherry 2000 (1987), or, for that matter, "Pygmalion" – not the play by Shaw but the 2,000-year-old poem by Ovid in Metamorphoses. Nor is this purely a male fantasy. In the pleasant movie Making Mr. Right (1987), directed by Susan Seidelman, the heroine, unlike the hero in Cherry 2000 who falls for the real woman, ultimately opts for the robot over the real man.
Calvin Klein whiffed enough of this to take the precaution of trademarking the term "technosexual" for future use, though the company has no specific product in the works for it yet.
We already employ machine intelligence in war, in part because robots, as The Economist noted in a recent article, "have the potential to act more humanely than people. Stress does not affect a robot's judgment in the way it affects a soldier's." Sad, but hard to dispute. It is not a stretch to expect robots might love more humanely, too – or at least simulate love, which is close enough for many folks.
A roboticist named David Levy has gone to the trouble of writing a book about the subject titled Love and Sex with Robots: The Evolution of Human-Robot Relationships. He describes the high but, in his view, surmountable technical obstacles; mostly, though, he is interested in the social aspects. "Love with robots will be as normal as love with other humans," he says. The lumping of robots with “other humans” in that sentence gives me some pause, but otherwise his message is surprisingly unsurprising. It is no great revelation that people can love (and have sex with) almost anything; a truly humanlike machine should have no trouble getting dates. There are few insights in the book that most readers wouldn’t already have going in.
Some might point out that with birthrates already dropping below replacement level in all highly industrialized countries, these robots, if and when they arrive, could lead to a population crash as we abandon each other for techno-toys. Perhaps we need not fear extinction at the hands of robotic killers like the ones in the Terminator movies. Perhaps our machines will love us to death.
P.S.
My own tale of robotic love, “Going through the Motions,” can be found at http://richardbellush2.blogspot.com/
ANDROID GIRL (MUSIC VIDEO) from Aaron Potter on Vimeo.
Sunday, May 8, 2011
Singular Strategy
The major roads through Morristown, NJ, meet at the central square called the Green. Unsurprisingly, traffic jams are normal at rush hour. They are not normal at other times. It just isn’t that big a town. Yet, I encountered one last night. A quick look at the sidewalks revealed the reason: the Saturday night bar-hoppers were out in force and more were driving in. The same geography that snarls traffic at rush hour makes an ideal central location for singles bars, and Morristown has a bunch.
I feel increasingly outside the demographic in these bars with each passing year, and so I rarely enter one unless I happen to like a band playing there. More than a few of my younger male friends, however, have given up trolling such places for another reason: simple discouragement. One recently complained to me, “They’re called ‘singles bars’ because you enter and leave them alone.” Perhaps. Or perhaps he just employs the wrong strategy. I claim no personal expertise in this matter, but those who do point to two classic errors men make when attempting a pick-up: being indirect and being direct.
The trouble with being indirect is self-evident. You can’t very well expect to get a “yes” if you haven’t asked the question. With regard to the risk in being direct, I refer the reader to a classic study called Gender Differences in Receptivity to Sexual Offers co-authored by Russell D. Clark III of Florida State University and Elaine Hatfield of the University of Hawaii. They employed reasonably attractive women and reasonably attractive men for their experiment. (They reckoned that spectacularly attractive people – or ugly people for that matter – might get results that couldn’t be generalized to the rest of the population.) The task of each reasonably attractive woman was to approach one man after another at random. She would say to the man that she had noticed him before; then, she would invite him to meet her that night at her apartment for sex. The reasonably attractive men randomly approached women and said the same thing. The male response? 75% of men who were approached immediately agreed to sex. The minority who declined mostly cited commitments to girlfriends or spouses. None of the men seemed offended. The female response? If you guessed 0% agreed to sex you are right. Zero, as in not even one (out of 48). Some became angry. The authors reran the experiment (again with 48 male and 48 female subjects) a few years later and got the same results.
I don’t pretend to have insight into why such a traditional gender difference persists in the modern world. I merely present the evidence that “let’s have sex” might not be your best opening line if you’re a guy – at least if you are not a truly extraordinary guy. Most likely it would earn you 96 shoot-downs in a row. It apparently works if you’re a gal, though, which may be precisely the problem with it.
It is possible, of course, that the results of this experiment would be different if conducted at “last call” at the local pub instead of at noon on the campus quad. Does anyone smell a research grant to explore that one?
Wednesday, May 4, 2011
Edgar Allen Crow
There is a crow that visits my property and enjoys teasing my cat. He repeatedly alights just outside of Maxi’s lethal range. It seems to me a dangerous game, but he plays it so successfully that half the time my cat doesn’t bother lunging for him anymore. Crows are smart, and not just by bird standards. They have a brain/body mass ratio higher than any great ape other than humans; they rival even us. They recognize themselves in mirrors, thereby demonstrating a level of self-awareness. They make and use tools, such as hooked twigs for snaring larvae from holes in trees. None of that means they are as bright as we are, of course; absolute brain size also counts for something. Yet, they are bright.
High intelligence is rare among animals because it is less useful than one might imagine. Brains aplenty are not a better survival kit than a good set of claws and fleet feet. Primate and corvid level brainpower seems to evolve (when it does at all) as a second-best compensation in creatures lacking those more desirable attributes. The weight/power/wingspan mechanics of flight are a restraining influence on the size of brains in flighted birds, so our avian friends might have peaked with crows and parrots. However, once upon a time there existed close relatives of birds without wings; they had dexterous claws instead.
A bipedal dinosaur called Troodon formosus was about two meters from nose to tail and weighed some 50 kg (110 lbs). The troodon is closely related to known feathered dinosaurs, though we don’t know if it sported feathers of its own. The size of the brain case indicates it was pretty bright by dino standards. It might have been able to outwit an opossum (had there been any opossums). Maybe. This is impressive enough to have set paleontologists Dale Russell and R. Seguin thinking about how evolution might have proceeded had that asteroid not killed the troodons and other dinosaurs 65 million years ago. In full expectation, I’m sure, that the popular press would pick up on the story (scientists love the spotlight as much as anybody), in 1982 they published an article in which they described a hypothetical dinosaur with human-level intelligence. Troodon sapiens (pictured below with its ancestor) bears a striking resemblance to ET, but its body type is very human-like. Arguing from the principle of convergence, which is what makes dolphins look like fish even though they aren’t, they concluded there are predictable developments over time in body shapes. A humanoid form, they suggested, has many advantages for an intelligent species, and nature therefore would tend to select for it.
Paleontologist and science writer Stephen Jay Gould reacted at once with annoyance. He said that convergence is the exception, not the rule, in evolution. The rule is divergence. Convergence happens only in response to very specific environmental pressures (e.g. a streamlined shape for swimming). No such pressures tie intelligence specifically to a humanoid shape. Humans owe their body form to their peculiar primate ancestry. Accident and chance dominate the evolution of life; if the K-T (Cretaceous-Tertiary) boundary asteroid never had struck, he argued, subsequent lifeforms and the shape of any intelligent ones would be impossible to predict. He added that human-level intelligence arose only once on earth, which suggests it is a low probability event anyway.
Paleontologists on both sides of the issue talked at each other, but as is usual in such debates, no one did much convincing of the other. The debate hasn't ended yet.
I side with Gould on this one, but I do enjoy conjuring the image of a troodon-like dinosauroid on some parallel earth sitting in his study and writing a poem with the refrain, “Quoth the monkey, Nevermore.”
Sunday, May 1, 2011
End of the World
Following a visit to the Chiller Theater Convention in nearby Parsippany yesterday, I was in the mood for something dark to slide into the DVD player. Perhaps something creepy. Perhaps some mayhem. Ultimately, I settled on an end-of-the-world movie appropriately titled Last Night.
Apocalyptic and post-apocalyptic settings have been frequent in film (as in literature) almost from the beginning. The tone varies a lot. The ultimate message is sometimes hopeful (e.g. Things to Come, though its presage of the Blitz is almost eerie), sometimes hopeless (On the Beach), and sometimes a bit of both (World without End). Sometimes the setting is just a convenient one for adventure with no other tone or commentary intended (The Road Warrior). Last Night (1998), a low budget but well-written movie, is something else altogether. The cast includes Sandra Oh, David Cronenberg, and Sarah Polley.
It is 6 PM in Toronto. It is precisely six hours before the end of the world. The reason for the end is never specified. There are clues: the sky never turns dark during those final six hours and we get a glimpse of what appears to be (but can’t be) the sun. So, my guess is that a vagrant neutron star wandered into the solar system on a collision course with earth. It doesn’t really matter though. The mechanism of the end isn’t the point. What matters is that everyone in the world knows about the end. They’ve known about it for months. There is nothing anyone can do to prevent it, and this is the last night.
Most people in the movie act much the same as they always did, only more intensely. Some try to ignore reality, some seek out their families (which continue to display the usual rivalries and dysfunctions), some seek solitude, some turn to religion, some party, and some lose themselves in work. Yes, some engage in pillage, violence, and rapine, though not very many considering that there can be no legal consequences. One character tries to fulfill every one of his sexual fantasies before the end. One woman runs nonstop around Toronto the whole six hours. A fellow at the gas company makes thank-you calls to customers for their patronage. One couple plans to commit mutual suicide just before midnight so that they go out on their own terms rather than by an accident of nature.
The varied reactions are strange at first glance but they make a kind of sense. After all, in actual life we all really do face last night even if we don’t know precisely which one it is. An expectation of six more hours, six years, or six decades is all much the same in the scheme of things. We respond by behaving in parallel ways to the characters in the movie.
The script is funny in its own way. One woman is irked by another’s comment, “It’s hard on the children.” She answers, “I don't give a damn. People are always saying 'The children. Pity the children.' I'm tired of the children. They haven't lived, given birth, watched their friends die. I have invested 80 years in this life. The children don't know what they're missing.” This sentiment is not commonly voiced, but it may not be so very rare among 80-year-olds.
There is no happy ending, unless you count the obsessive runner who, as the world glares bright white, gleefully jumps and shouts, “It’s over! It’s over!” Yet, oddly, this is not a sad film. It may not be for everyone, but I recommend it.
Shameless Pitch:
My own post-apocalyptic novel (yes, I tried a hand at it) is Slog