Thursday, July 25, 2013

Salty Language

Everyday commodities are so intertwined with human events that it is possible to write history (local, regional, or world) from the perspective of any one of them. So, too, with everyday age-old technologies. I’ve read histories of spices, horse domestication, pork, astronomy (think Stonehenge), masonry, libraries, cod, rats, mapmaking, and shipbuilding, among many others. In college, I wrote papers of my own on the role of grain, calendars, and taxes in historical events. A well-written book of this sort can make it seem as though the chosen item is the key to understanding the whole human saga. Of course, this is never true. The item is just one part of the whole. Nonetheless, such focused histories are valuable, because they remind us that individual parts matter. There are moments when a better map, a better ship, a crop failure, or an ill-considered tax really does make a difference. A small thing, which we take for granted most of the time, can tip the scales at critical junctures.

At present I’m reading Salt: A World History by Mark Kurlansky, a prime example of the genre. Salt is something we take very much for granted in the modern world. (A salt is a compound formed by a chemical reaction between an acid and a base, and Kurlansky addresses several different chemicals, but in this instance I refer only to sodium chloride, common table salt.) When we think about salt at all nowadays, it is usually just to worry that we are eating too much of it. It is so cheap and plentiful that 51% of all the salt used in the US is simply thrown on roads in the wintertime. This wasn’t always the case. Though underground seams and domes of salt are plentiful, and the oceans are full of the stuff, getting at it traditionally has been a labor-intensive business. There are two traditional methods of collecting salt. Both are still used, though modern equipment makes the job easier. The first is the evaporation of sea water – or of water from brine springs. (More than one town has been named Salina after its salt springs.) The other is the dangerous business of mining. Rock salt is the only rock that humans eat – well, normal humans anyway.

All animals need salt. Carnivores, including humans who eat a lot of red meat or seafood, can get what they need through their diet. Herbivores need to make a special effort to find it, which is why salt licks are popular with horses and cattle, and why all agricultural peoples (but not hunters) have produced salt. The amount needed for metabolic purposes isn’t very large, though. What turned salt production into a major industry was something else. Prior to the advent of modern refrigeration a little more than a century ago, salt was the principal means of preserving food. It also was a useful disinfectant. Historically, people ate huge amounts of salted fish, salt pork, and salt beef, as well as vegetables and fruits pickled in brine, notably olives. Armies scarcely could move without large supplies of salted foods, which were among the few provisions that would keep. As late as the US Civil War, Confederate salt works were prime targets of Union raids. Said General William Tecumseh Sherman in 1862, “Salt is eminently contraband, because of its use in curing meats, without which armies cannot be subsisted.” It wasn’t just meats. The Roman habit of salting vegetables is the source of the word “salad.” Cato, in De Agricultura, advised: “If you want your cabbage chopped, washed, dried, sprinkled with salt or vinegar, there is nothing healthier.” When precious metals were scarce, Roman soldiers were sometimes paid in salt, hence the word “salary.” Salt long has had associations with sex and fertility, hence “salacious.” In ancient China, salt taxes were a major source of revenue – and of controversy. So they were in 18th-century France as well, though that particular tax regime ended badly for the royal government.

There is something pleasing about the way one can derive such a broad history from a single item. In a whimsical moment back in college, I considered writing a paper on the role of the marshmallow in history. After all, marshmallows have been found in ancient Egyptian tombs, and were considered both a tasty treat and a medicine. At the time they literally were made from the marshmallow plant (Althaea officinalis), as they continued to be through the 19th century. The plant extracts were replaced with gelatin in most recipes in the 20th. I chose another topic in the end because, in those pre-internet days, the research loomed as a bit daunting for what was a minor assignment. But perhaps I’ll return to the subject yet. Who knows? Perhaps marshmallows really will prove to be the key to understanding the whole human saga.


The Salt of the Earth


Thursday, July 18, 2013

World War R

Zombie movies have been a minor genre for more than 7 decades, but in the most recent decade they’ve been all the rage, sometimes as standard horror fare (e.g. Flight of the Living Dead), sometimes as comedy (Warm Bodies, Zombieland), and sometimes as adventure (World War Z, presently doing big box office). Budgets range from 0 to astronomical. They don’t do much for me, by and large, so I miss most of them. I never quite “got” them – even though the likable Kelli Maroney (Night of the Comet [1984]) was kind enough to sign a photo for me. Perhaps the issue for me is that – while plagues, in a general way, always are real and present dangers – a plague that turns humans into single-minded ravenous murderous cannibals doesn’t seem a very high risk. Never mind one that reanimates the dead.

On the other hand, I have seen and enjoyed killer robot movies, including the classic Westworld and the Terminator franchise. Arguably the genre is similar to the zombie films. Both scenarios might seem equally unlikely. Yet, strangely enough, the screenwriters are on firmer ground with the machines. The US Army wants a third of its ground combat vehicles to be unmanned within this decade. Northrop and BAE produce autonomous drones that, with little modification, could make their own decisions to fire munitions – though for the moment a human operator always makes that call. Robot sentries guard the border between the two Koreas, and have an “automatic” option (though a human has to decide to turn it on) which allows them to identify and fire on targets on their own. Anti-missile Gatlings on warships have a similar option, because no human is fast enough to do the job. Enough people find this unsettling that a Campaign to Stop Killer Robots has attracted serious support. UN investigator Christof Heyns warns, “War without reflection is mechanical slaughter... a decision to allow machines to be deployed to kill human beings deserves a collective pause worldwide.”

Well, that’s one way of looking at it. Some roboticists have a different view. An Economist article on machine intelligence and robotic warfare a few years ago noted the views of one, Ronald Arkin of Georgia Tech: “Dr. Arkin believes there is another reason for putting robots into battle, which is that they have the potential to act more humanely than people. Stress does not affect a robot’s judgment in the way it affects a soldier’s.” Yes, more humanely. Oddly enough, this is credible.

The robots still need a lot of work, of course, to be truly autonomous. First (and foremost), AI, while getting better, still falls far short of even a convincing simulation of consciousness. Second, machines need to be able to construct copies of themselves, as anticipated in the 1960s Berserker sci-fi stories by Fred Saberhagen. Finally, they need to be able to recharge without help – to live off the land.

Fortunately, this last problem was solved more than a decade ago. The robot Chew Chew (yes, really – look it up) has a microbial fuel cell (MFC) that breaks down biological material and converts the chemical energy into electricity. The robot was fed sugar, but the inventor, Stuart Wilkinson, notes that the ideal food for energy gain is meat. "Vegetation is not nearly as nutritious," he says.

So, there we have it. If we bring these three elements together, we can have intelligent, self-replicating, carnivorous killer robots. What could go wrong? Now there is a kind of zombie I find exciting.

[Book notes: I’m currently reading Neptune’s Brood by Charles Stross, a newly released sequel to Saturn’s Children (2008). Both are set in a future in which robots have supplanted people. Humanity faded away, not because the robots ate them, but because humans didn’t see the point of biological reproduction anymore. Worth a read. I don’t do many robot stories of my own, but I do have one, Going through the Motions, at http://richardbellush2.blogspot.com/2012/10/going-through-motions.html .]



Thursday, July 11, 2013

On Dog-Walking and the Neighborhood Effect

The cul-de-sac street on which I live has 20 houses. There are three smaller cul-de-sac offshoots of my street with a few houses each; so, altogether 36 houses including mine ultimately share a single exit onto the main road. (The word “main” may give a false impression.) Police love street arrangements like this, since they can block off a whole area at a single chokepoint. Firefighters hate them for the same reason. I don’t know what the total population of my neighborhood is, but a very generous supposition of 4 occupants per home would give a total of 144. This is almost certainly a significant overshoot; I don’t see many kids outside, and I know for sure of 2 homes, including mine, in which the occupancy is merely 1. So, 144 is a maximum, and something like 100 is much more likely.

When the street was built in the 1970s, I knew by name almost everyone on it. (What is now my house belonged to my parents at the time, but I still knew the neighbors.) I don’t anymore. Only 3 of the original 1970s buyers are still there, and I never got to know the replacements – some of the homes have sold three or four times since then. Accordingly, as is more common than not these days, I live amid strangers. Yet not quite. When I drive past someone walking a dog – or just walking – on my street, we exchange courteous waves and smiles; if we’re both walking we exchange verbal pleasantries as we pass. No such exchange happens once I turn onto the main road, even if the dog-walker is someone I’ve passed 500 times. (Dog-walkers tend to have routine schedules, so if you drive on a schedule you will see the same ones again and again.) Yes, I’ve tried, just as experiments. My waves on the main road – or any other road – merely get quizzical and somewhat suspicious stares back. This is the neighborhood effect, created not just by the chokepoint at the stop sign but by the population size on my home’s side of it.

Anthropologists studying ancient pre-civilized peoples and surviving hunter-gatherers encounter the numerical range 100-150 time and again; it is the common group size among people who lived or still live in a way matching the conditions in which humans evolved. (20-50, as a subdivision of a larger 150 group, also recurs, but for the moment I’m interested in the bigger number.) It is the group size within which people can interact socially and meaningfully without undue strain – in a larger population it’s hard for members to keep track of each other’s personal details. There is a correlation in primate species between cortex development and the size of the close social groups they can maintain. In her book I See Rude People, self-described “Advice Goddess” Amy Alkon noted that evolutionary psychologist Robin Dunbar did the math on this for humans and came up with 148.3. He then looked at 21 surviving hunter-gatherer groups in various environments and found the average village population to be 148.4. In the crowded conditions of more technically advanced societies, people find ways to re-create those group sizes. The average Christmas card list is 154. Armies commonly have units about this size, e.g. the Roman centuriae or US Army companies. Dunbar notes the sociological principle that groups larger than 150-200 need some sort of central authority to maintain order and cohesion, while smaller groups often can get by with informal arrangements and controls (i.e. peer pressure). This is one reason small towns typically are so safe. There is a practical problem to being a rogue in a small town. How, for example, can you hold up the liquor store when the owner and all the patrons know your mother?
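(For the curious, the arithmetic runs roughly like this – I’m recalling the coefficients from Dunbar’s 1992 paper, so treat them as illustrative rather than gospel. Across primate species, group size N scales with the neocortex ratio CR – neocortex volume divided by the volume of the rest of the brain – along a log-log regression line, approximately log10(N) = 0.093 + 3.389 × log10(CR). Humans have CR ≈ 4.1, so log10(N) ≈ 0.093 + 3.389 × 0.613 ≈ 2.17, which gives N ≈ 148.)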

I experienced a culture shock firsthand when I graduated high school and went to college. The student population at my prep school never exceeded 120, all grades combined; add in faculty and staff, and the total on campus hovered around 150. By the end of each school year, I knew every single one of those 150 by name, and knew at least something about the personal quirks of each. At GWU, on the other hand, where the student population was 25,000, I was unlikely ever to see (or at least recognize) again the majority of fellow students in any particular class I took. In any given class I knew the names of at most 1 or 2 fellow students (in some classes none), and only because I made a point of striking up an acquaintance. As for all the others, we would pass each other on the sidewalk with the indifference and lack of recognition shown by typical pedestrians in any large town or city. So, the different responses of dog-walkers should come as no surprise. Even though the folks in my current neighborhood don’t, in fact, socialize much, something about the size of the group created by the geography resonates with us on a primal level, so we smile and wave on this side of the stop sign but not on the other: us vs. them.

This, I think, has much to do with the rising rudeness and discourtesy much decried in the press lately – a rudeness evident in public places and starkly evident online. Few of the people we encounter nowadays belong to an “us” group of 150. They are outsiders, which in Paleolithic terms means they are no one we need care about – unless perhaps as fit targets for raiding parties. Amy Alkon’s solution to this is confrontational. When annoyed by another Starbucks patron shouting into his cell phone, for example, she took down his cell phone number, which he helpfully shouted, and called him the next morning. She told him that if he didn’t want to be called by strangers, maybe he shouldn’t shout out his number in a coffee shop. The story of how she recovered her stolen pink Nash Rambler is best read firsthand. She says that people are not naturally hard-wired to exert peer pressure on strangers (they’re outside our 150, and therefore “them” not “us”), so public rudeness normally goes unchecked. She makes a point of delivering checks. Keep in mind that this may work better (i.e. is less likely to lead to violence) for pretty redheads like Amy than for the rest of us.


Russell Brand’s Courtesy Lesson


Friday, July 5, 2013

Gimme Jimi

Some 3,000,000 people watched the fireworks over the Hudson River in NYC last night in person. That’s roughly equal to the entire population of Lithuania or of Panama. I wasn’t one of them; I’m not as fond of the crush of crowds and the boredom of traffic jams as I once was. It was a huge and colorful display, though, for those who like that sort of thing. Several smaller displays were in nearby parks and towns, but I passed on them, too. Fireworks have been part of the 4th of July on these shores since the first anniversary in 1777, when the Americans used precious gunpowder for celebrations in Philadelphia and Boston; New York was held by the British. Those fireworks didn’t hold a candle to the five-hour display staged by Peter the Great when his son was born, but they were enough to start a tradition. (I suspect my dad just lit a cigarette the day I was born; he didn’t quit smoking until 15 years later.)

I remember seeing professional displays of 4th of July fireworks at least twice as a toddler when my family lived in Whippany, NJ, which means my earliest attendance at such an event was no later than 1958. I liked them well enough, I suppose, but those two times pretty well satisfied my appetite for them. Nowadays I feel about them much the way most dogs do. I prefer the other tradition: barbecue. (Come to think of it, dogs agree with that, too – I’m sensing a pattern.)

Nonetheless, I did watch the NBC special, partly to see the crowd I was pleased didn’t include me and partly for the musical guests. Then the fireworks began, coordinated with musical selections and with lights on the Empire State Building. That brings us to the reason for this post. I have a suggestion about the fireworks music next time. The use of Jimi Hendrix’s ironic version of The Star-Spangled Banner at first seemed to be a welcome bit of brashness on the part of the show organizers. But only at first, because the central part of the tune was excised and replaced by traditional music. Dudes and Dudesses, when you do that, you merely call special attention to what you cut out. So either have the courage to play the whole song, or don’t even start. I recommend the whole thing.

Just an opinion.


Woodstock ’69