Nothing in this world is permanent. The world itself is not
permanent: civilization far less so. The odds favor civilization getting
through the next 100 years, but there is a small risk it won’t. The risk of
disasters – natural and anthropogenic – occurring that don’t threaten
civilization but nonetheless kill millions is so high as to approach certainty,
e.g. the Spanish flu of 1918-19 and the Asian flu of 1957-58, the latter of
which was as deadly as COVID-19 though the public response at the time was pretty
casual. Back in 2008 the Future of Humanity
Institute at Oxford took a stab at calculating the odds of human-caused
catastrophic global events prior to 2100. The researchers divided risks into
three categories: events causing at least 1 million deaths, events causing at
least 1 billion deaths, and extinction events. The chance of extinction from a
genetically engineered virus was given at 2% by 2100. The existential risk from
Artificial Intelligence was estimated at 5%. The extinction risk from war came
in at 4%. The total risk of extinction from any and all anthropogenic causes
combined by 2100 came in at 19%. FHI focused on human-caused disasters, but natural
phenomena have in the past caused and will again cause global mass extinctions.
An outburst of vulcanism on a scale that formed the Deccan and Siberian traps
could do it. So could gamma ray bursts from a close enough supernova: one such
may have caused the Devonian extinction. Asteroid impacts are not just the stuff of
science fiction. Current estimates are that some 2000
asteroids large enough to threaten civilization have orbits that cross earth’s
orbit. Most are unidentified despite ongoing cataloging efforts by NASA and
other space agencies. Many times that number are too small to be globally
catastrophic but still large enough to obliterate a city. This was demonstrated
on many a dashcam in Chelyabinsk, Russia, back in
2013 when a 20-meter-diameter meteor vaporized with an explosive force
estimated at 500 kilotons. (This is about equivalent to the W88 thermonuclear
warhead carried by some US and British Trident missiles; the Hiroshima bomb was
15 kilotons.) Fortunately the rock exploded at an altitude of over 25
kilometers, which limited damage on the ground though 1000 people were still
injured by flying glass from the shock wave. A rock about this size enters
earth’s atmosphere every few decades, mostly over the ocean. Near misses by far
bigger ones occur with similar frequency.
[Image: Chelyabinsk meteor]
In 2029 the large asteroid Apophis will
pass inside the orbit of geosynchronous communications satellites. It will be
visible to the naked eye from the ground. The odds of it striking earth are
vanishingly small on that pass, but when it returns in 2068 the odds, while
still small, are not vanishingly so. It is impossible to calculate all the
possible perturbations to an asteroid’s orbit from gravitational interactions
and small impacts, so there is always some degree of uncertainty to these
predictions. The most dangerous rocks of all, of course, are the ones that no
one sees until they arrive. Sooner or later a big one will.
The good news is that (probably) nothing so earth-shattering will happen
within the lifetime of anyone alive today. Individually, we are much more likely to be done
in by mundane events than by anything celestial or geologic. Leaving aside
death by natural causes, which all of us who live long enough will face,
ordinary accidents pose a non-negligible risk. According to the Insurance
Information Institute the lifetime risk of accidental fatal poisoning is 1 in
63 (1 in 4990 in any one year). The lifetime risk of dying by automobile
accident is 1 in 107. Falling down stairs is 1 in 1652. The chance of death by
firearm is 1 in 289. The total lifetime risk of dying by an accident of any
type is 1 in 17 (1 in 1306 in any one year). Those are still pretty good odds
that our own klutziness won’t kill us. Most of us will get to live out our 4000
weeks.
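(A rough way to square those one-year and lifetime figures, assuming a typical life expectancy of about 79 years, which is my assumption rather than a number quoted above: for a small annual risk, the lifetime risk is approximately the annual risk multiplied by the years lived.)
$$1-\left(1-\tfrac{1}{4990}\right)^{79}\approx \tfrac{79}{4990}\approx \tfrac{1}{63},\qquad 1-\left(1-\tfrac{1}{1306}\right)^{79}\approx \tfrac{79}{1306}\approx \tfrac{1}{17}.$$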
Four Thousand Weeks is the title of a book by Oliver Burkeman, longtime
columnist on psychology for The Guardian.
He believes that calculating in weeks rather than years improves our sense of
just how fleeting life is. He does this not to be depressing but to nudge us
into focusing on what is important to make that limited span better. He reminds
us that every activity will have a “last time,” and we do well to treat each
one with the reverence we would show if we knew for a fact it was the last: “there
will be a last time we visit our childhood home, or swim in the ocean, or make
love, or have a deep conversation with a certain close friend.” And that is
without a rock falling on us from space. By and large, Burkeman makes good
points, but in a world filled with office hours, quarterly taxes, and monthly
bills that impose order on our time from the outside, his advice is easier
to give than to follow.
Like many people, I’m single at heart. This is something that
took me an inordinate amount of time to learn, but which all my romantic
companions discovered much sooner. (The maximum length of any of my
relationships, including my ill-fated marriage, was 3 ½ years.) There was, of
course, dating along the way to this self-discovery, done the old-fashioned way
via in-person encounters. “Old-fashioned” may be the wrong qualifier, however. Many
people made use of personal ads during my halcyon dating years, but not a
majority. In the 21st century the personals have largely (yet not
entirely) been superseded by dating sites and apps. A majority of Millennials
and GenZs do use online dating sites. (A substantial minority of older folks currently
use them, too.) These sites are really just a more complex incarnation of personal
ads, which have been a feature of newspapers for more than two centuries. In the American colonies personal ads first appeared in 1759
– a century later than in England. Francesca Beauman writes about the history
of advertising for romance in her book Matrimony,
Inc.: from Personal Ads to Swiping Right, a Story of America Looking for Love.
But for phrasing (the old ones are better written) little has changed in the
content of such ads since then despite radical social changes over the
centuries. As evolutionary psychologists would predict, men posting such ads mostly seek someone young
and pretty (many 19th century ads specify 18-24) while women mostly ask
for someone financially secure and reliable (many 19th century ads also
specify no drunkards). The 21st century is strangely not much
different than the 18th. Notes an article in the Telegraph,
“Research gathered in a scientific speed-dating study [http://www.pnas.org/content/104/38/15011.full]
reveals that when it comes to the rules of attraction people behave like
stereotypical Neanderthals.” The same pattern is reflected in ads and in dating
site behavior as in in-person speed dating.
From a modest start in the 18th century, personal
ads in America exploded in the 19th. A common type
was ads from men who had homesteaded a farm or built a business out West but were
unhappily single. Wrote, for example, A.B. Collins in the Leavenworth Times (1870): “I wish to make the acquaintance of a
lady of good character, between the ages of 22 and 30; good looking, good
disposition, understands and likes housekeeping, would like to live on a farm,
and if understands music please mention it.” As for female posters, this from
the Public Ledger, November 5, 1845,
was fairly typical. “WANTED – A HUSBAND. The qualifications requisite are
industry, sobriety and honesty; one that is capable of making a wife happy and
a home comfortable, not over 40 years of age and of gentlemanly appearance.”
The ads are weirdly fascinating, and were at the time even to
those who had no intention of answering them. Wrote Mark Twain (1867), “You may
sit in a New York Restaurant for a few hours, and you will observe that the
very first thing each man does, before ordering breakfast, is to call for the Herald – and the next thing he does is
to look at the top of the first column and read the personals… There is such a
toothsome flavor of mystery about them!”
Unsurprisingly, professionals used the personals to promote
business, though because of newspaper policies their intentions were thinly disguised,
e.g. from an 1872 Cincinnati Enquirer,
“A young lady of 20 would like the acquaintance of a nice middle-age gentleman
of means: object, pleasure during the summer months.” If anything, late 20th
century ads of this type in respectable publications tended to be even more
discreet, offering such services as “relaxation therapy for the tired executive.”
(Discretion isn’t a big feature of comparable 21st century internet
ads.) At least these ads were for voluntary trade, legal or not. But there was
always a legitimate fear among interested readers, as there is today, that an
advertiser was a more dangerous sort of criminal. In general, women had to be
more cautious, but men, too, could encounter the likes of Belle Gunness. In the
late 19th and early 20th centuries the enterprising Belle
placed newspaper ads for men seeking “companionship with wealthy widow.” The
ads required that the men have at least $1000 (about $35,000 today) to prove
they wouldn’t just be leeches. The men who answered the ads, mostly
Scandinavian and German immigrants, then disappeared. The number is still
uncertain but a couple dozen bodies were discovered buried at her farm after a
1908 fire brought things to light. More graves are probably still undiscovered.
Belle presumably died in the fire but her remains were never identified.
The bulk of the ads, however, were from people legitimately
seeking romance. The personal ads faded for a while in the mid-20th
century but never went away entirely. In the late 20th they made a
comeback and then were largely replaced by dating sites and apps in the 21st.
The self-advertising on these sites is more
thorough and indeed more personal (though probably no more reliable) than the
old personals. Once again, a majority of Millennials and GenZs use them.
Oddly, despite the ease of using them, Millennials and GenZs in general date
less than their parents and grandparents did at the same ages. Whether dating apps
somehow contribute to that decline is an open question. Other social factors
may be more important. The biggest complaint from social critics about the modern apps
is that they promote class tribalization. The filters allow one to consider
only those of similar class, education, politics, and values, and most users do
precisely that. My own suspicion, though, is that the use of filters in this
manner simply reflects the tribalization that already exists. I doubt the apps
make a big difference.
Be that as it may, I’m just glad to be out of it. For many of us
(pace Tennyson), ’tis better to have
loved and lost than to have loved and found. It’s relaxing to make peace with
that.
This morning’s non-vegan breakfast – three eggs over easy on
prime rib hash – is one I never would have ordered as a kid. I had a sweet
tooth (more of a sweet tooth) and so
was likely to order pancakes. I did like eggs but only scrambled. Back then, yolks
when runny seemed to me unpleasantly raw and yolks when firm looked and tasted
pasty. Blended with the whites, though, they were good. Tastes change. By my
20s I liked lots of things I didn’t as a kid, including runny yolks. Nowadays I
like eggs every which way. I rarely have pancakes.
[Image: over easy]
People have been eating eggs since before, strictly speaking, we
were people. Eggs – whether bird or reptile – are such rich nutrient sources
that few animals can resist raiding nests for them, and our ancestors were no
exception. Someone eventually got the idea of raising birds just for the
purpose. Red junglefowl (which still
exist in the wild) were bred into domestic chickens in India no later than 3200
BCE – possibly as early as 7500 BCE. Domestic chickens are documented in
ancient Egypt and ancient China by 1400 BCE, and eggs were part of the cuisine
in both places. Since ancient times, chickens overwhelmingly have been the
source of eggs consumed by humans, though ducks, geese, ostriches, quail, and a
few other birds are notable contributors.
We don’t know how much eggs contributed to ancient diets –
nobody really recorded data on that. Present-day annual per capita egg
consumption in the USA, however, is 286. This is way down from its 1945 peak of
404, but is still a substantial number. (This figure does not take account of
egg ingredients in baked goods and other products.) My own consumption is
higher: probably in the 500s by rough-and-ready calculation. China currently
leads the world with per capita consumption well over 300. Some 40% of all the
chicken eggs in the world are produced in China.
Though chefs around the world have created a wide array of exotic
egg dishes, most of us aren’t looking for anything so fancy in the morning. We want
our eggs simple and in a reasonable hurry. There are only so many ways to cook
eggs simply and in short order. So, it is not surprising that ancient recipes
differ little from modern day ones around the globe in every way but the
seasonings. The seven standard ways are: poached, soft boiled, and hard boiled (in water); over easy and sunny side up (fried); scrambled; and omelet (which basically
is scrambled eggs with one or more extra ingredients).
These preparations go by different names in different places
with minor local variations in preferred runniness (and a lot of variation in preferred
side dishes) but, by whatever names, they are found everywhere. There are a few
regional specialties of course. Chinese century eggs (which are aged for weeks
or months in clay, salt, wood ash, and quicklime) come to mind. So does balut,
a Filipino treat that is a boiled fertilized duck egg. Perhaps the strangest is American, though it is not as popular as it once was. The “prairie
oyster” is a raw egg in Worcestershire sauce (vinegar and tomato juice optional).
Supposedly it is good for a hangover, and I imagine it would (if nothing else)
get your mind off one. I’ve never felt motivated to try it.
Eggs have always been a common breakfast food, but in the
early 20th century in the US and some other Western countries there
was a trend to lighter, supposedly healthier, breakfasts promoted by cereal
makers such as Post and Kellogg. This trend reversed in the 1920s thanks in
large part to marketing guru Edward Bernays, Sigmund Freud’s nephew and author
of the how-to book Propaganda (1928).
Faced with a surplus of bacon, the Beech-Nut Packing Company hired Bernays, who
found 5000 doctors to say that the old high-protein farmer’s diet was healthy after
all (“doctors say…”). Bernays’ advertising campaign was a success: bacon sales
took off, along with sales of the eggs that bacon strips so tastily accompany. There
things stood until the 1960s when concerns about cholesterol and heart disease
suddenly gave eggs a reputation as unhealthy, which largely accounts for the
decline of per capita consumption in the US that persisted until the current
century.
As so frequently happens with dietary advice, however, there
has been a change of heart among medical professionals in recent years. According
to Harvard Health Publishing,
“For most people, an egg a day does not increase your risk of a heart attack, a
stroke, or any other type of cardiovascular disease.” The same article notes
that most cholesterol is produced by the liver in response to fats and
trans fats, rather than being absorbed directly from dietary sources. A recent study of type 2 diabetes patients in particular, for whom eggs are a commonly recommended
food, reached similar conclusions: “A healthy diet based on population guidelines
and including more eggs than currently recommended by some countries may be
safely consumed.” Be that as it may (and I haven’t the expertise to contribute to
the debate), my breakfast choice, truth be told, has less to do with health
than satisfaction. If that turns out to be an epitaph, I can think of worse.
Back in the ‘80s a 2 or 3 mile walk was part of my daily
regimen. That isn’t an impressive distance, but it was something. Like other
Boomers, I felt the dreaded 30-something decade of life weighing on me in the ‘80s,
so I responded by fighting the aging process aggressively with diet, exercise,
and vitamins. No doubt all of that was healthy in a general way, but aging
continued anyway, of course, so by the ‘90s my enthusiasm flagged and I slacked
off. These days I’m not sedentary exactly (I do most of my own yard and repair
work, which keep me on my feet), but I no longer walk just for the sake of exercise.
On the other hand neither do I circle the parking lot at the supermarket, as so
many drivers do, looking for the closest parking space to the door. My legs
aren’t broken, and I’m happy to walk an extra 200 feet. I do have a winding
trail through my woods (four of my five acres [two hectares] are wooded) that I
commonly walk, but that is more for peace of mind than for exercise. In truth
though, the best thing I could do for my health would be to resume my old ‘80s
regimen.
Age is no excuse, as was demonstrated as long ago as 1909 by
professional walker Edward Payson Weston. On March 15 of that year at the age
of 70 he began his walk from New York City to San Francisco. His goal was to
get there in no more than 100 days. The Last Great Walk by Wayne Curtis, which details the event, is worth a read.
Professional walking was a thing in the 19th and early 20th
centuries. (There still are walking races, of course, but they don’t draw press
and crowds as they once did.) Often an event was not a race but a solo long
distance challenge walk, as in “I’ll walk from Boston to Chicago in 35 days.” People
would take bets on whether the walker could do it, newspapers covered the walk,
commercial sponsors paid the walker’s expenses, and walkers would give paid
lectures at towns along the way. (Before the movies became big around 1913, people would go to lectures like that for entertainment – besides, a famous
walker was as close to a celebrity as someone in small town Missouri or
wherever was likely to see.) Weston’s first sponsored walk was in 1861 at age
22. He had bet that Douglas would win the 1860 election. If he lost the bet he would
try to walk from Boston to Washington DC in 10 days or fewer in time for
Lincoln’s inaugural address. He already was savvy enough to get sponsors for
the effort. He missed the speech by 4 hours, but though he technically failed he
had found a sport he loved. During the Civil War, though a civilian, he
delivered messages on foot for the Union army. After the war he took on
numerous challenge walks of hundreds of miles – in a few cases over 1000 miles –
and was cheered on by fans in the towns through which he walked. He succeeded
at making a living at it, which was fortunate since he wasn’t ever successful
at any other business undertaking.
The New York to San Francisco walk was meant to be a career
topper. Things went wrong with the weather from the start. Snows, strong
headwinds, and torrential rains dogged him along the way. In 1909 the USA did
not have much of a road system. What there was between towns was poorly
maintained, rutted, subject to washouts, and in places barely passable.
Accordingly, Weston for much of the trip walked on or next to railroad tracks,
especially in the western states. These also offered the best routes through
mountain passes and over rivers, though he admitted to feeling uncomfortably
trapped when crossing trestles or walking through tunnels. Only once did he
feel personally threatened: he attracted the attention of two hobos at one otherwise
empty railway stop and they followed him for a while, but he successfully
outpaced them until they gave up. Apparently this slow chase never did break
into a run. Weston averaged 38 miles (61km) a day not counting Sundays, which
he took off. The New York Times and
other papers reported on his progress, and there were always cheers from the
sidewalks when he passed through a town – and often free meals and other
generosities, though his sponsors normally paid for hotels and restaurants
anyway. By the time he reached Nevada, Weston knew he was behind schedule and
was kicking himself for cockily having added 323 miles early in the trip by
arcing through Albany and Buffalo instead of striding directly across
Pennsylvania. On July 14, 1909 Weston reached the 16th Street ferry
in Oakland, California. To avoid any hint of cheating he walked around town for
6 more miles to compensate for the length of the ferry trip to San Francisco.
He disembarked in San Francisco at 10:50 p.m. and at 11:10 arrived at the St.
Francis Hotel where a room was waiting for him. He had walked 3925 miles
(6317km) in 105 days, 5 hours, and 41 minutes – not counting 17 Sundays.
Despite receiving copious congratulations for his feat (and getting
paid for endorsements of clothing and footwear), Weston was upset he had missed
his 100 day deadline. “I do not feel inclined to close my career with a
failure,” he said after taking the train back to New York. He decided to repeat
a coast-to-coast walk the next year, but opted for a west-to-east direction
with prevailing winds at his back; he also chose a more southerly route. So, in
January 1910 he took the train to California. On February 1 he left Los Angeles
on foot for New York, this time allotting himself 90 days. He turned 71 on the
walk. Despite bending his route in order to see the Grand Canyon and the
Petrified Forest, on this occasion he beat his deadline. He reached New York in
78 days. He then announced his retirement, though he did in fact take on much
shorter challenges (in the range of hundreds of miles) from time to time. Long daily
walks just for his own fitness remained part of his life until age 88 when he
was hit by a taxi in NYC. The accident confined him to a wheelchair. Being
unable to walk took the life out of him and he died two years later.
Walking – specifically bipedal walking – made humans human.
Not opposable thumbs, which animals from opossums to pandas have. Not big
brains, which came much, much later. It was bipedal walking. It’s the most
natural thing we can do. It was a killer app that saved energy and allowed our
ancestors to travel phenomenal distances in search of new land and resources.
It freed up our hands to carry infants, weapons, and tools, which in turn gave
an advantage to individuals smart enough to make weapons and tools. The famous
trackway at Laetoli, more than 3.6 million years old, shows that our
ancestors, though chimp-like aside from their upright posture, already walked
like modern humans. They were on the path to a very different future. Our
ancestors’ penchant for walking increased as millennia passed and brains grew. Homo erectus walked all the way from
Africa to Indonesia. Modern humans, since exiting Africa some 70,000 years ago,
in a surprisingly short time walked not only all over Eurasia (with a hop
across the sea to Australia) but all the way across the Bering land bridge down
the Americas to Tierra del Fuego. Homo
ambulans (walking man) might be a better name for the species than Homo sapiens (wise man)… at least until
recently. Nowadays it might be Homo
sedens (sitting man), which isn’t working out well for us in terms of health.
Sitting was once a luxury in days filled mostly with walking. Like so many
former luxuries, we now get more of it than is good for us.
[Image: Laetoli footprints]
It’s easier to state the problem than fix it, even though
“fixing it” on an individual level simply means getting up and walking
somewhere. Inertia (which sounds so much better than laziness) certainly
afflicts me. I, for one, have no plans to walk to New York from my far-flung
suburb, much less hoof my way to San Francisco. For that matter I have no plans
to walk to the grocery store, which is about 4 miles (6km) away. I probably
won’t resume my old ‘80s daily walking regimen. But I’ll keep strolling on my
own wood trails and maybe will add an occasional walk in one of the area’s
parks. It’s not much, but it’s something. Those with better motivation for
healthy living than I would be well served by treading more ambitiously. But
watch out for taxis.