I don’t consume a lot of alcohol of any
kind. My alcohol consumption was immodest for a stretch in my 20s, after which I
became a near teetotaler for more than a decade, but both behavior patterns proved
to be (extended) phases. For the past score of years it has been fairly modest,
at least by the standards of the CDC. The CDC guidance for an adult male of
average size is no more than 14 drinks per week
and no more than 4 in any one day. (A standard “drink” in the US is defined as
14 grams of alcohol, which is the amount in 1.5 ounces [44.4 mL] of 80-proof [40%]
spirits.) I haven’t met, much less exceeded, either CDC limit in the current
century, and rarely came close. But when I do pour myself a little ethanol, it
is usually a high proof bourbon or rye. Neat: no ice, no mixers. I don’t care
much for sweet or mellow alcoholic beverages: the burn is part of the
point, much as the heat of a chili pepper is the point. This wasn’t always the
case. In my college years (legal drinking age was 18 then as it still is in
most of the world today) I didn’t like the harshness of unmixed spirits, so I either
opted for wine or would disguise spirits in cocktails aimed at those with a
sweet tooth: white Russians (long before The
Big Lebowski), grasshoppers, screwdrivers, Southern Comfort & Coke, and
the like. (Aside: sweeter than bourbon, Southern Comfort is pretty awful by
itself IMHO, but it is quite good as a mixer in lieu of bourbon.) Eventually
for simplicity’s sake I came to favor rum and Coke (aka Cuba libre) as my go-to
cocktail, rum being a spirit actually made from sugar and from sugar byproducts
such as molasses.
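As a quick sanity check on that "standard drink" definition above, here is a minimal Python sketch of the arithmetic (the ethanol density of roughly 0.789 g/mL is an assumption added for the illustration, not something from the CDC guidance):

```python
# Minimal sketch: converting a pour of spirits into US "standard drinks."
# Assumes ethanol density of ~0.789 g/mL; 14 g of alcohol = one standard drink.

ETHANOL_DENSITY_G_PER_ML = 0.789  # assumed density of pure ethanol
GRAMS_PER_STANDARD_DRINK = 14.0   # US definition cited above

def grams_of_alcohol(volume_ml: float, abv: float) -> float:
    """Grams of pure ethanol in a pour of a given volume and alcohol-by-volume."""
    return volume_ml * abv * ETHANOL_DENSITY_G_PER_ML

def standard_drinks(volume_ml: float, abv: float) -> float:
    """Number of US standard drinks in a pour."""
    return grams_of_alcohol(volume_ml, abv) / GRAMS_PER_STANDARD_DRINK

# 1.5 oz (44.4 mL) of 80-proof (40% ABV) spirits works out to ~14 g, i.e. one drink.
print(round(grams_of_alcohol(44.4, 0.40), 1))  # ~14.0
print(round(standard_drinks(44.4, 0.40), 2))   # ~1.0
```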
This hasn’t been my
tipple of choice since Star Wars: Episode
VI - Return of the Jedi was in theaters for the first time, but nostalgia for
that era likely influenced my decision last week to pick up a copy of And a Bottle of Rum: A History of the New
World in Ten Cocktails by Wayne Curtis. I like thematic histories. On my
shelves I have dedicated histories of salt, pork, cod, plagues, textiles,
shipping, and even rust, among many others. The elements of history – even
seemingly minor ones – are so intertwined that each can have profound impacts
in unexpected ways. Salt, for instance, may seem to have little to do with the
American Civil War, yet it was a strategic material necessary (in those
pre-refrigeration days) to preserve food to feed the troops, and its shortage
in the South was a real hardship; coastal Southern salt production facilities
were among the first targets of the Union Navy. This reveals the strength and
the weakness of histories that place one element – even a big one, be it class, race,
ideology, or what-have-you – front and center. It really is illuminating to
view history from each of those perspectives, but we mislead ourselves if we
think that only one lens is correct or that it gives us a full view. A few
hundred years ago it would have seemed normal to regard religion as the central
mover of history. By the late 20th century, however, the historical
world view of statesmen and academics on both sides of the Cold War had become
so thoroughly secularized that the Islamic Revolution in Iran (which had
nothing to do with class, liberalism, Marxism, capitalism, or democracy)
completely blindsided them. Rum is not remotely as important as anything
mentioned so far, not least because lots of substitutes for it always were and
are available. Yet neither is it negligible.
Sugar is not an
indigenous New World crop. It is native to Papua New Guinea from which it
spread slowly westward in ancient times. Alexander the Great in 325 BCE
encountered it in India, and the plant was carried further west in Hellenistic
times. It was not abundant in the West, however, until introduced to the Caribbean
islands, which proved to have the perfect climate and soil for it. It was the
crop that put the islands on a paying basis for French and English colonists in
the 17th century. It wasn’t a bonanza comparable to the gold and
silver treasures flowing from the Spanish territories, but (unlike the French
and English colonies on the mainland, which were a serious drain on homeland
resources) it was something. It was something with dire consequences: slave
labor on the sugar plantations. Sugar production creates a lot of waste in
stalks and molasses. We don’t know who first decided to ferment and distill
alcohol commercially from the waste, but the where is probably Barbados. The
oldest mention in print of the stuff (called Rumbullion by the author) is by a
visitor to the island in 1652. The earliest surviving example of the short form
“rum” is also from Barbados in a 1658 deed that mentions “cisterns for liquor
for rum.”
Rum caught on quickly
in England and the English colonies. It didn’t suit the French and Spanish, who
continued to prefer wine and brandy at home and in their colonies, but they
were happy to export their excess molasses to the East Coast of North America
where it was distilled into rum in copious quantities. Rum helped spark the
American Revolution. In 1763 England began to enforce the previously flouted
Molasses Act and followed it with the Sugar Act of 1764, which placed tariffs
on sugar and molasses thereby threatening American rum distilleries. The uproar
was so great that the tariff on molasses was reduced in 1766 to a paltry 1
penny per gallon, but by then the Americans were stirred up about other things
as well. When rebellion broke out, the molasses trade was disrupted anyway, of
course, so the colonists became whiskey distillers. Whiskey still dominates the
output of American distillers, though in 2021 Americans drank much more
imported vodka than their own whiskey. Despite that Yo Ho Ho
song, which we owe to Robert Louis Stevenson and Treasure Island, rum wasn’t big among Caribbean buccaneers until
the later years of the classic age of piracy. It wasn’t available in the early
years. Blackbeard, however, was an inveterate rummy (he mixed it with gunpowder
of all things), so at least there is one notable conformer to stereotype. Grog
(3-to-1 mix of water and rum plus a splash of lime) wasn’t a pirate thing at
all but a Royal Navy thing. Admiral Edward Vernon (nicknamed Old Grog after his grogram cloak)
decreed the watered and limed rum ration, which was specified in the naval code
in 1756. Rum for its first two
centuries was a rough, raw, and ragged drink. Facundo Bacardi y Maso changed
that when Bacardi’s distillery opened in Santiago, Cuba, in 1862. The rough edges
were smoothed out with aging and a filtration process that is still technically
secret (but probably uses sand and charcoal, much like Jack Daniel’s whiskey). The
result is a rum that is pleasant without a mixer. Other distillers soon
followed. Santiago is also the
birthplace of the Cuba libre (rum and Coke). The time and place are pretty
certain though the details vary with the teller. Since its introduction in
1886, Coca-Cola had been tried as a mixer for various spirits. In the most
common version of the story, during the Spanish-American War in 1898 American
troops in bars mixed rum and Coca-Cola and toasted their rebel allies, “Por Cuba libre!” In the US it did not
become a common cocktail before World War 2. During the war, however, it was the
drink of choice of sailors on Trinidad where there was a major US naval base
and no shortage of Caribbean rum. An island entertainer with the charming name
Lord Invader took note and modified a calypso tune originally written by
Trinidadian Lionel Belasco in 1906 with new lyrics and a new title. The locals
and the sailors liked it. Comedian Morey Amsterdam (readers of a certain age
will recognize him from The Dick Van Dyke
Show) heard the song when on a USO tour, got professional help back in the
States to polish the score, and introduced it to various singers. The Andrews
Sisters recorded Rum and Coca-Cola in
1944. It was the flip side of the single One
Meat Ball, which they expected to be a hit; instead Rum and Coca-Cola made a splash and One Meat Ball just a ripple. One couldn’t ask for better
advertising. Whether called a rum and Coke or a Cuba libre, it’s been a very
common bar order ever since. Side note: after the war the aging Lionel Belasco
sued and won for copyright infringement. Rum accounts for
about 10% of the US market for hard spirits. The percentage varies a little
from year to year as rum based drinks (Mai Tai, Piña Colada, Daiquiri, Mojito,
etc., each of which has its own connection to social history) go in and out of fashion,
but only the very expensive high end rums show any long term trend upward in
sales. I’m too cheap for those. I don’t see myself ever ordering any of the
more complicated rum cocktails either. Just for old times’ sake though, perhaps
one night I’ll once again toast “Por Cuba
libre!”
I had a curious encounter outside a
barber shop toward which I was headed because my hair had somehow succeeded at
the neat trick of being both thin and shaggy. On the sidewalk, before I entered,
a complete stranger, quite a bit younger than I, stuck a finger in my face and
said with what appeared to be genuine anger, “You’re going to get what’s coming
to you, buddy.” From the “buddy” rather than a name, I
gathered this was not someone who knew me from somewhere and whom I simply failed
to recognize. I suspected (and still do) that it was just a case of mistaken
identity. Perhaps someone on the street who looked vaguely like me had made
some remark to him or bumped into him or stolen his parking spot or something and
he assumed I was the fellow. Or perhaps he meant my type of person, whatever
type that might be, rather than me personally. Whatever the case, I was too taken
aback to respond before he sneered and walked off. Evidently he was prepared to
let karma wreak its own vengeance. Another approaching pedestrian who had
overheard this eyed me with a raised eyebrow. “It really would be pretty awful if
we all got what we deserved, wouldn’t it?” I remarked to him as he passed.
“Yeah, that is a scary thought,” he said. “Can you imagine?” Neither of us commented
further. I shrugged and entered the shop. The anonymous passerby and I were being
wry but serious, too. In some ways this evinced an old-fashioned world-view.
This is a narcissistic age, and more than a few of us seem to think that what we
deserve is admiration and mountainous swag rather than anything alarming. For
them, the words of the finger-pointer would be heartening. I don’t think there
is much risk of either outcome – not from karma anyway. I’m not a believer in cosmic
karma. We get away with some transgressions and are falsely accused of others. We
may be over- or under-rewarded and over- or under-punished for what we do – or don’t
do. Good and bad things happen to us, sometimes earned by our own actions and
sometimes randomly. Nor is there any guarantee of balance to those outcomes:
one or the other can predominate for no particular reason. There are few
observations triter than “life is not fair.” Fairness itself is a notoriously tricky ethical
concept anyway, especially for secularists. (If one has faith that morality is
inherent in the universe, then that is that.) Some thinkers such as John Locke tried
to derive ethics from nature. The atheistic Ayn Rand went further and devised a
severely rational system of ethics (Objectivism) that is self-consistent from
the fundamentals up. However, as in any rational system, her conclusions follow
only if you buy her premises; one first has to hold some “truths to be
self-evident.” Not everyone does, at least not the same ones. Marx certainly
didn’t. A nihilist has little patience with either. Nietzsche regarded competing
ethical systems as simply tools to achieve or maintain power: those in power
devise moralities that will justify keeping them there (as a matter of fairness)
while those out of power devise moral definitions that will justify deposing
the powers-that-be in favor of themselves (as a matter of fairness). Then there
is the age-old simple proposition that might is right. It is extraordinarily
difficult to dispute this formula on a purely rational basis. I am not a Platonist
(for many reasons that are off-topic here) but Plato was pretty good at putting
into the mouth of Socrates strong refutations of other philosophers – typically
by getting them to refute themselves. Yet one of his least satisfactory counters
(despite Plato having written both sides of the argument) is to the “might is
right” assertion of Thrasymachus in Book I of The Republic. Thrasymachus argued that justice is whatever the
stronger party says it is, whether the nobles in an aristocracy or the demos in
a democracy or, for that matter, a shepherd and his sheep: “and by the same
token you seem to suppose that the rulers in our cities – the real rulers – differ
at all in their thoughts of the governed from a man's attitude towards his
sheep.” Socrates counters that a shepherd must look after the interests of his
sheep (in essence, act justly) in order to do well for himself; an abusive (unjust) shepherd soon won’t have a
flock. Yet, this really doesn’t answer Thrasymachus’ point that the shepherd eats
the sheep, not the other way around. Existentialists also dismiss codes of ethics
as anything other than human-made and insist that none of us can escape the
freedom to choose his or her own, whether pre-packaged or original, but that we act in
bad faith if we refuse to accept the consequences of our choices.
Well, I do have some ethics to which I
try to adhere with varying degrees of success, though I freely admit that choosing
the premises for them had more to do with taste than anything more solid. They
are rather old-fashioned on the whole, which is why I chose the ominous interpretation
of the sidewalk prophecy rather than the auspicious one. However, since the
prophet was someone with whom I had no history of interaction, what’s coming to
me (with regard to him) would be nothing. Even were this otherwise though, the world
is too chaotic a place to be sure of outcomes most of the time. The Joker’s comment in The Dark Knight nonetheless comes to
mind: “The thing about chaos? It’s fair.”
While running errands yesterday on a round trip of no
more than 20 miles (32 km) I passed three road repair crews: two simply
patching potholes and a third stripping the road surface in preparation for
repaving. At all three sites the familiar aroma of asphalt was in the air. Getting
the work done now is a sensible precaution. There is not much more than a month
of more or less reliably favorable weather in this corner of the world. You
never know about November; it could be anything from a heat wave to a deep
freeze with any variety and quantity of precipitation. Emergency repairs aside,
roadwork not done by then had best wait until spring.
Last week while working outside my house I was approached by
a fellow about the asphalt on my own driveway. You know the pitch. Anyone with
a blacktop driveway has heard it: “I see your driveway is in bad shape. We’re
redoing your neighbor’s around the corner. Hey, we’re here, we have the
equipment. So, we’ll make you a special offer.” I passed on the multi-thousand
dollar special offer. I’m a cheap old bachelor. I don’t replace things like
windows, countertops, or appliances unless the current ones are actually
broken. (Unfashionable doesn’t count as broken.) Also, when repairs are within
my skill set I do them myself. The same goes for my driveway. I patch it when
it needs patching. It is not in bad shape overall despite the remarks of the
pitchman: there were no loose chunks or potholes except for one spot where the
driveway meets the road. Snow plows in the winter sometimes catch there and
cause damage. It has happened before and will happen again. It is an easy fix.
The special offer did at least prompt me (belatedly) to make it.
Some professionals distinguish among bituminous concrete,
blacktop, asphalt, and several other terms, but even the experts are
inconsistent in their usage. The words are used interchangeably in everyday
speech. It’s fair enough to call pretty much any thick hydrocarbon sludge “asphalt”
(with or without aggregate and whether mostly dry or mostly liquid) though
there are different mixes for different purposes. A more significant
distinction is the source. Natural asphalt can be found at or near the
surface in areas where it has gurgled up from deeper petroleum reservoirs. These
natural deposits range in viscosity from hard and crumbly to wet and sticky. So-called
“tar pits” such as those at La Brea are asphalt, not tar. Non-natural asphalt is
a byproduct of petroleum refining. When you refine crude oil by successively
separating out the various fuels and lubricants (kerosene, gasoline, diesel
fuel, etc.) you are left at the end with a residue of asphalt. You can’t help
it. Fortunately there is a market for that, too. Further, the formulations of
asphalt from refineries can be adjusted to suit specific needs. The kind you
dig out of the ground is catch-as-catch-can; each deposit has a unique admixture
of sand and other substances. Natural asphalt deposits were exploited by the earliest
civilizations for waterproofing. From Sumerian times onward asphalt was used to
secure cisterns, sewers, and boats against leaks. The Greek word “asphaltos” means “secure.” It supposedly
waterproofed the reed basket in which the future King Sargon as a baby was set
adrift in the Euphrates around 2300 BCE. Its first recorded use as pavement was 625
BCE in Babylon, for a road from King Nabopolassar’s palace to the north gate of the
city. His son Nebuchadnezzar paved more roads from the palace. The idea didn’t
catch on more broadly in ancient times, however. The
Romans, inveterate roadbuilders though they were, ignored the stuff as a paving
material. They used it to line baths, aqueducts, and drains. They used it to
caulk hulls. They didn’t surface roads with it. The reason was that Romans were
aware of asphalt’s weaknesses. They intended their roads to last, and last they
did. As late as the 18th century most of the best roads in Europe
were still the old Roman ones with their multilayer bases, proper drainage, and
fitted paving stones. Asphalt pavement is relatively inexpensive and provides a
great surface, but it does suffer from weather and traffic. It requires
maintenance. It doesn’t last. Today our calculations are different. The upfront cost of
building a four-lane interstate highway to Roman standards would be
prohibitive, and it still wouldn’t hold up to pounding by modern heavy
vehicles. We expect constantly to maintain and repair our roads, so relatively
cheap asphalt makes economic sense, as it has for well over a century. There is some competition from concrete,
which, though more expensive than asphalt initially, lasts longer, but
eventually concrete must be repaved, too. It is repaved with asphalt. Asphalt
is not only affordable straight from the refinery, it also is endlessly
recyclable. It can be torn up, heated, and laid right back down again. It is,
in fact, the most recycled material – more so than aluminum cans. In the US, the
EPA’s position since 2002 is that asphalt by itself is not a significant pollution
hazard. Be that as it may, I patched that pothole in the driveway yesterday.
I had a couple of bags in the barn. One was enough. If snow plows damage the
driveway again this winter, I’ll patch it again next spring. If only we could repair
the potholes in our lives so easily.
An unabashed carnivore (well, omnivore actually), I had lunch
last week at a local smokehouse. The Rolling Stones’ Gimme Shelter was playing on the oldies station. I commented to the
18-y.o. waitress while thumbing at the radio, “I missed the past 20 annual Farewell
Tours by these guys.” She shrugged and answered, “I don’t even know the band.”
Sigh. Fortunately she had delivered porcine goodness to the table to distract
me from the generation gap: candied bacon and a pulled pork sandwich with a
fiery bbq sauce. Distract me it did.
As Katharine Rogers explains in her book Pork: A Global History, pigs (Sus
scrofa) were among the earliest animals to be domesticated. (Dogs were
first by far, but as hunting companions.) It was once thought that farming and
husbandry (the Neolithic revolution) preceded the appearance of permanent
settlements and villages – that agriculture provided the prerequisite food
surpluses. The archaeological record reveals this to be backward. Today the
world’s remaining hunter-gatherers exist in extreme and marginal environments,
but this was not the case 10,000 years ago. They throve in temperate regions
rich in game and edible plants. In eastern Anatolia as early as the waning days
of the last ice age they could and did settle down in permanent villages while exploiting
only the local wildlife for food. They were well-fed enough to build impressive
stone monuments such as those at Göbekli Tepe. Yet, the growth of these settled
populations did put a strain on wild resources, so the deliberate planting of grains
and the deliberate taming of farm animals eventually began. Two of the most
significant animal domestications were cattle from aurochs and pigs from wild
boar: both events occurred about 10,000 years ago. (DNA studies show all modern
cattle to be descended from an original stock of only 80.) At very nearly the same
time as this was happening in the Near East (and for very much the same reasons)
pigs also were domesticated independently in China. Pigs were easier than cattle (which also had other uses such
as dairy) for settled people to raise on small plots and farms for food, so
pork became the more prevalent meat in relatively densely settled regions until
very modern times. Pork products have another upside: properly smoked or cured,
they can be preserved up to a year without refrigeration. (The smoking/curing
process is rushed and incomplete for ham, bacon, and sausages sold in
supermarkets today, so they do not last long even with refrigeration;
traditionally smoked meats are still available from craft producers but they
are pricey.) The very fact that pork was so ubiquitous in prehistory, the
ancient world, and the medieval world made the prohibitions against it by some
religions (notably Judaism and Islam) a greater mark of distinction than
otherwise would be the case. Pigs were once a far more common sight than they are today. In
the ultra-urbanized 21st century it is easy to forget that for the
bulk of history the overwhelming majority of the world’s population was not
only poor but rural. In 1900 (according to Our
World in Data) only 16% lived in cities. In 1800 it was 7%. Rural folk raised
pigs and chickens even on very small plots, and many of the animals were allowed to wander
freely. In wooded areas of Europe and North America pigs could feed themselves.
The free-ranging omnivores would run “hog wild” eating acorns, mushrooms (they
have an excellent nose for truffles), small animals, and just about anything
else. Even penned, however, pigs happily live on leftovers that otherwise might
be thrown away: table scraps, whey, brewers’ mash, etc. Samuel Sidney in The Pig (1860) writes, “There is no
savings bank for the labourer like the pig.” He explains that a piglet can be
bought for a trifle in spring or summer and grown on household scraps. The
owner then can sell the hams the ensuing winter for more than enough “to buy
another pig, and the rest will remain for his own consumption, without seeming
to have cost anything.” I know this advice was followed at least into the 1930s
since my paternal grandparents in the Depression did just that. Nowadays (since the
mid-19th century, actually) pigs on commercial farms are mostly corn-fed. This leads
to the peculiar hog/corn price cycle, of which elementary economics textbook authors
are so fond as an
illustrative example of price interactions. When corn (maize) prices are high,
hog farmers sell their pigs rather than pay for the pricey feed. The rise in
pork supply on the market drives down pork prices. The drop in demand for feed from
pig farmers in turn drives down corn prices, which soon prompts farmers to withhold
their pigs from market in order to fatten them up with cheap corn. The consequent
reduction in pork supply in markets drives up pork prices, which prompts farmers
to raise more pigs. This increases demand for corn feed, which pushes up corn
prices. So, hog and corn prices constantly cycle in opposite directions.
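Since those textbooks like the cycle as an illustration, here is a toy Python sketch of the feedback just described; every number in it is invented purely to make the alternation visible, so treat it as a cartoon rather than a model:

```python
# Toy sketch (invented numbers, not a real market model) of the hog/corn feedback:
# pricey corn thins the herd, a thin herd cheapens corn, cheap corn rebuilds
# the herd, and the two prices stay out of phase with each other.

def simulate_hog_corn_cycle(periods: int = 8, corn_price: float = 1.3):
    rows = []
    for period in range(periods):
        pricey_feed = corn_price > 1.0
        # Pricey feed -> farmers market hogs now (more pork, cheaper pork);
        # cheap feed -> they hold hogs back to fatten them (less pork, dearer pork).
        pork_supply = 1.2 if pricey_feed else 0.8
        pork_price = 2.0 - pork_supply
        rows.append((period, round(corn_price, 2), round(pork_price, 2)))
        # A thinned herd eats less corn, so next period's corn price falls
        # (and vice versa) -- the lag that keeps the cycle turning.
        feed_demand = 0.8 if pricey_feed else 1.2
        corn_price = 0.5 + 0.6 * feed_demand
    return rows

for period, corn, pork in simulate_hog_corn_cycle():
    print(f"period {period}: corn {corn:.2f}, pork {pork:.2f}")
```

The one-period lag between the herd's feed demand and the corn price is what keeps the toy run oscillating instead of settling down, which is the gist of the textbook story.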
Barring some other major disruption (e.g. bumper crops or crop failures) when
one price is high the other is low. By the late 19th century in Europe and the
Americas, beef shouldered aside pork not only as the preferred meat dish (as it
already long was among those who could afford it) but as the more common one. This
shift to beef hasn’t happened everywhere. In China the pig has held onto its #1
position in the 21st century. As yet, the (steadily rising) annual per capita consumption of all meats in
China remains below US levels, but for pork consumption in particular China has
the edge: 38 kg in China vs. 28 kg in the USA. Either number is a lot of pork. For
myself, I’m as happy with a braised pork chop as with a prime rib – sometimes
happier. There are those of a certain age who may chalk up to
nostalgia the seeming memory that commercially sourced hams and chops were
tastier in their youths. They really were. North American supermarket cuts are leaner
today – typically by 19% compared to half a century ago – in order to address
health concerns that a large portion of the public started taking seriously beginning
in the 1960s. (Remember “the other white meat” industry ads?) This does reduce
the calories in pork products, so there is that, but since fat enhances flavor
it comes at a cost. Ironically, there have been some second thoughts about dietary
fat among medical researchers in the past decade (see analysis published
in The American Journal of Clinical
Nutrition),
but these haven’t yet changed what is on the shelves. I’m aware of the concerns of vegetarians and vegans. (Those
who know me personally know just how close to home those views are.) I don’t
intend to engage in that debate here. Whatever it says about my
moral choices, however, while menus like that of the above referenced smokehouse
still exist, I’ll be ordering from them. Not many large animals (i.e. excluding mosquitos and worms and such) reciprocally
regard us as lunch at present. Though our proto-human ancestors were prey as
often as predator, in the modern world fewer than 2000 humans are killed and
eaten by large animals annually. As a point of interest, however, it’s long
been noted that humans taste like pork. Anthony Burgess, for one, confirmed
this. He wrote about his attendance shortly after WW2 at a ceremonial feast in
New Guinea; there, he partook of an offering “very much like a fine, delicately
sweet pork, which is what I thought it was.” He was shaken to learn it was a
warrior killed in a skirmish. He didn’t ask for seconds. Perhaps, however, this
flavor profile explains the aliens’ enthusiasm in that famous Twilight Zone episode.