Sunday, June 27, 2021

Roads Not Taken

Almost every day I drive on hilly, winding, narrow Roxiticus Road. Though the road is an old one that appears on mid-18th century maps, it never developed into anything that can be called a thoroughfare. It retains all its original blind curves, both vertical and horizontal, and so demands a modicum of caution when driving on it. Yesterday I found myself behind a bicyclist just before three successive blind curves. Much to the annoyance of the driver of the Mercedes on my tail, I didn’t pass even when the cyclist slowed to a crawl going up a hill. It was the right decision since, sure enough, a car whizzed around the curve from the other direction at high speed. A pass would have been fatal. A simple choice such as whether or not to pass a bike can make all the difference in life. Giving in to impatience in such a circumstance is obviously wrong, but many simple choices are not so obvious, and they make all the difference, too.

Roxiticus Road (pre-summer).
Blind curve just past bridge.

The butterfly effect is a cliché but no less true for being one. The originator of the term was MIT meteorology professor Edward Lorenz. In 1961 he was testing early computer simulations of weather patterns. He repeated one simulation but rounded off a variable from 0.506127 to 0.506. The minuscule change radically altered the outcome. He came up with the butterfly example to illustrate the limits of modeling: small uncounted (and uncountable) variables such as the flapping of butterfly wings can have consequences all out of proportion to their size. I suspect, but don’t know, that Lorenz was a Bradbury fan. In 1952 Ray Bradbury published the story “A Sound of Thunder,” in which a time traveler drastically alters the future by stepping on a butterfly in prehistory. This may have influenced Lorenz’s choice of examples, which just as easily could have been called the starling effect or some such thing.
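For anyone who would rather see the effect than take it on faith, here is a minimal Python sketch. It is not Lorenz’s original twelve-variable weather model but the simpler three-variable system he published later, with the usual textbook parameters; the only point is that two runs whose starting values differ in the fourth decimal place – echoing the 0.506127-versus-0.506 rounding – end up bearing no resemblance to each other.

```python
# A minimal sketch of sensitive dependence on initial conditions, using the
# classic three-variable Lorenz system (not his original 1961 weather model).

def lorenz_step(x, y, z, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz system one step with simple Euler integration."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

def run(x0, steps=8000):
    """Integrate from (x0, 1, 1) and return the final state."""
    x, y, z = x0, 1.0, 1.0
    for _ in range(steps):
        x, y, z = lorenz_step(x, y, z)
    return x, y, z

# Two starting values that differ only in the fourth decimal place.
print("full precision:", run(0.506127))
print("rounded       :", run(0.506))
# After a few thousand steps the two trajectories have completely diverged.
```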
 
Most of the major events in my life (and probably yours) ultimately derived from small random occurrences and tiny offhand decisions that compounded over time. I attended George Washington University, for example, and my life surely would be profoundly different had I spent those four years not in DC but at some other college in some other place – if indeed I even would have finished four years in some other place. Yet, GWU wasn’t on my radar at all when I first began sending out applications my senior year of high school. I just happened to overhear a fellow senior, who had researched schools more diligently than I, praising the university to another student. I already had picked four possibles and wanted a fifth for backup, so on the basis of a recommendation that wasn’t even aimed at me I added it to my list as well. Had I been a couple of steps farther away, or a little less attentive to the conversations around me, life would have been... well... otherwise.
 
Many other small moments proved as or more consequential. I very nearly didn’t speak to the young lady who became my first serious love interest: it was very much a coin toss, and I certainly didn’t expect much to come of it either way. So, too, with my last love interest. (I’m pretty sure I’m done with that particular brand of insanity – and, whether of a good or bad kind, insanity it is.) Then there are the times I didn’t speak but almost did. My jobs, investments, car choices, home choices, and a myriad of other things all give an outward appearance of having been planned but, while planning gave them some kind of after-the-fact order and coherence, all of them in some sense blew my way from the flapping of butterfly wings. (Dwight Eisenhower had something like this in mind when he said, “Plans are useless, but planning is indispensable.”)
 
In the moments before sleep I often think how dissimilar things would be had I made this or that close call decision the other way. In truth there is no way to tell since countless other butterflies would have influenced the alternate outcome. I think about it anyway. If the multiverse interpretation is correct, somewhere out there are worlds where I (and you) did make other choices. If the determinists are correct on the other hand, everything that ever was and ever will be is already built into the structure of the universe; past, present, and future coexist, despite our limited perception, and choice is an illusion. I have no insight into which (if either) is correct. I do have a preference. There was a time when I found determinism (and with it the notion that the past with all its occupants physically continues to exist elsewhen) more comforting. Nowadays, though, I’m happy to live in an uncertain and ephemeral world of maybes. Reality is not subject to my preference or yours, of course, but until proven wrong that’s the interpretation I’ll go with. I’m curious to learn what minor and seemingly mindless decision today (posting this blog perhaps) leads to some unexpected big consequence down the road. Maybe nothing that happens today will affect the outcome of anything much. But maybe it will.

Janis Joplin – Maybe



Wednesday, June 23, 2021

Sibs

Today, for the first time in more than a year, I caught up with an old friend in an assisted care facility. Due to COVID, non-emergency visits by non-family members were forbidden until recently and were difficult to arrange until the past several weeks. Even now visits must be scheduled in advance, on limited days, in limited time windows. Uncertainties in my own schedule ruled out visits until today. The fellow has Parkinson’s at a severe enough stage that he needs this level of assistance, but he is otherwise healthy. Because of the other residents and staff he has not been entirely isolated, but he has been separated from friends and family for long stretches. It hasn’t been a fun year. It was good to catch up, though it is always somehow surprising on a non-conscious level to notice that someone one knew as a youth is a few courses past his salad days – if I didn’t know better, I’d suspect I was aging a bit myself. We’ve been friends for more than 20 years, but I’ve known him for 55 because he was part of my sister’s regular posse in high school. They never dated, but they hung out. It was only when I left at the end of the visit (limited to 30 minutes under current rules) that reflecting on this reminded me of the date. Today is my sister’s birthday.
 
Sharon (1950-1995) was 2 years and 5 months older than I, which is a huge difference in the childhood years and significant up through one’s teens – not so much afterward. Her arrival was fortuitously timed for my dad. North Korea invaded South Korea on June 25, 1950 (June 24 on this side of the date line). He was in the Naval Reserve, and under the selective service rules of the day he would have been recalled to active duty were he not a father. This was fortuitous for me, too, since… well… had he been recalled I wouldn’t be here at all.
 
Siblings play a major part in one’s own identity formation. (For an only child the lack of a sibling is a major part of identity.) A part of how we think of ourselves is always so-and-so’s older/younger brother or sister. Even if we hate a sib (some do), that, too, helps define us. There is an entire field of psychology dedicated to the subject, far too complex to summarize in a few hundred words here, but I’d recommend Sibling Identity and Relationships (pic below) for those interested in an in-depth elucidation. Suffice it to say for the purposes of this blog that though she’s been gone 26 years (and I’m on the dessert course with only the after-dinner drink to follow), a part of me is still Sharon’s little brother.

 
Sharon and I got along fine most of the time. As a tween and teen she definitely broadened my awareness of 60s culture, in part just by being older and in part because she confidently swam in it when by myself I might have just dipped in a toe. Sharon was the poet of the family, and I posted 100 of her poems online at Echoes of the Boom. They are worth a read. In my intro to them I wrote, “Sharon always was in sync with the times. She was a fine hippie in the Summer of Love, she discoed in the 70s, and she could out-yuppie Michael J. Fox in the Reagan years. There is much to be said for being in step, or at least so it seems to those of us whose footfalls are never quite right.” She did things in the right order: a wild child when young (to the point that it is still inappropriate for me to recount my best stories of her in a public forum like this) trending to a conservative lifestyle when older. (I did it the other way around, which had unfortunate consequences.)

1971

Older-sister/younger-brother is probably the combination and birth order least likely to promote sibling rivalry (at least in the absence of other sibs), but of course there is always some. I only wish there were more decades of it than there were.
 
Pup – Sibling Rivalry


Sunday, June 20, 2021

Walter Mitty Flies Again

Life slowly gets more social as the COVID restrictions continue to ease – emphasis on slowly. It is (or was before 2020) my wont to have solstice and equinox get-togethers at my house. The timing appealed (and still does) to the atavistic side of my nature – such celebrations date back to prehistory – and it doesn’t conflict with parties held by others on Memorial Day or Labor Day or whenever. For the first time since the autumnal equinox of 2019, I hosted one this weekend. It was nice to see some of the usual suspects again, though the turnout was less than half the usual number. This mirrors the experience of a friend who hosted a party last weekend. Though some of the missing truly had other obligations, it also is still hard for many of us to break social distancing habits, even with a vaccination card in one’s wallet.
 
Perhaps we’ve also lost some taste for sociality. One hopes that after a year of imaginary life via surfeits of Netflix, video games, YouTube videos, and daydreams, the real thing has not grown disappointing by comparison. But maybe it has for some of us. (The original Star Trek pilot, “The Cage” – rejected by execs, though its footage was later recycled into the two-part episode “The Menagerie” – had a premise something like this.) In truth, the past year probably just amped up pre-existing trends. Well before COVID, people were spending ever larger quantities of time in virtual worlds. Modern tech makes it very easy to immerse oneself in them… not that any of us needs tech to inhabit imaginary worlds. A Harvard study back in 2010 found that the participants (who numbered in the thousands) spent an average of 46.9% of their waking hours daydreaming. Said Dr. Matthew Killingsworth, one of the researchers: “This study shows that our mental lives are pervaded, to a remarkable degree, by the non-present.”
 
This is not necessarily a bad thing. Dr. David B. Feldman noted in Psychology Today that while there is a risk (for some) that real life, pale by comparison with fantasy, can lead to depression, daydreaming also can be beneficial. By and large, dreamers are more productive, refreshed, and creative. Albert Einstein was inspired to pursue relativity by daydreaming about walking on light waves. Writers and artists of all kinds rely heavily on their fantasies. Classic adventure/sci-fi writer Edgar Rice Burroughs (Tarzan, A Princess of Mars, Pellucidar, et al.) said, "Most of the stories I wrote were the stories I told myself just before I went to sleep." Gore Vidal in a 1963 article reassessing the 23 Tarzan novels picked up on this remark and asked, “How many consciously daydream, turning on a story in which the dreamer ceases to be an employee of I.B.M. and becomes a handsome demigod moving through splendid palaces, saving maidens from monsters (or monsters from maidens: this is a jaded time)?” Quite a lot, I’d venture.
 
The advantage and, at one and the same time, the disadvantage of accessing imaginary worlds through tech is that they are not products of our own imaginations – at least not primarily, though it is fair to suppose that each player puts a personal mental spin on the experience. The negative assessments of virtual lives in some corners of the press nonetheless may be excessive. Video games in particular are commonly blamed for everything from dulling minds to desensitizing gamers to violence. The evidence on this is mixed at best. Said researcher Tom A. Hummer, PhD, "Asking what are the effects of video games is like asking what are the effects of eating food... They can have benefits or detriments depending what you're looking at." They can hone coordination skills and they can be mentally refreshing. (So I’m told: I’m not a gamer.) They also can be addicting, however, and addictions seldom are a good thing. Whatever the case, solitary gaming – or even gaming with anonymous online players – isn’t much practice for re-entering social life in meatspace, as it is charmingly known.
 
There is a phenomenon on YouTube and other video platforms that strikes me as particularly telling: reaction videos. Most of us find pleasure in watching movies or shows with a companion, thereby making it a social experience. On a reaction video we anonymously watch people we don’t know watching a movie or listening to music. Some of these “content creators” have tens of thousands of followers. Via the comment sections they even offer a modicum of interaction, but this is still a far cry from real company. That such sites are so popular says something, though I’m not entirely sure what it is. The experience is fundamentally parasocial. Parasocial relationships (as between a fan and a celebrity) are imaginary – or at the very least one-sided. Leaving aside the fringe people (including stalkers) who delude themselves that such relationships are real, parasociality isn’t necessarily harmful unless it crowds out real-world relationships, and even then it is harmful only to oneself. Nor does it harm anyone else to live a virtual life to the point of, for example, eschewing actual dating in favor of LovePlus software and realdoll.com hardware, but it’s hard to count the result as a life achievement. (No wonder the US birth rate is at an all-time low.)
 
I suppose the trick is to emulate ER Burroughs, something I’ve not done as successfully as I ought: be a Prince or Princess of Mars in one’s head to one’s heart’s content, but use the imaginary experience to inform one’s real life goals and values. Reimagine a past experience but write a novel based on it that ends better than the real one. Walk on a light wave. See you on Barsoom.
 
 
Devil Doll – It's Only Make Believe


Sunday, June 13, 2021

Klaatu Barada Nikto

Nowadays UFOs are often called UAPs (Unidentified Aerial Phenomena). Why? Did ETs complain that UFO was an offensive term? To whom did they send the memo? I think it should be released to the public. Until then I’ll stick with UFO. By whatever alphabetical designation, the investigation of UFOs known as Project Blue Book was closed as long ago as 1969, officially because the US Air Force determined there was no evidence of a threat to national security even in the case of unexplained sightings. The Air Force also noted (as UFO skeptics always have) that the failure to determine a mundane explanation in some particular case doesn’t mean there isn’t one. Times have changed, but we’ll return to that in a bit.
 
As a kid and throughout my teens I loved books, magazines, and movies about UFOs. Even though I was in fact skeptical of their being extraterrestrial vehicles, I enjoyed imagining what it would be like if they were just that. So did many of my friends. I knew at least a few of them were true believers – or purported to be – but just how many surprised me. I found out my senior year of high school when I was sure a teacher had overreached. Students in the class were pooh-poohing some old superstitions (I no longer remember which ones or in what context) and disdaining the previous generations who believed them. The teacher, Mr. Drew, countered by saying that every generation has its own mythology and superstitions, and that he easily could provoke an emotional reaction from us by questioning one of our own. He took a deep breath and said simply, “UFOs.” My initial assumption was that he had made a bad gamble and that (barring one or two outliers) the class would respond with a collective shrug. I was wrong. Mr. Drew was right. A cacophony of challenging voices immediately arose, citing evidence of aliens-among-us. Apparently, after high school, folks grow only marginally more skeptical. Today, according to an Ipsos poll, just under half of American adults believe that at least some UFOs are extraterrestrial spacecraft.
 
The notion of ETs has long fascinated people. The Roman author (writing in Greek) Lucian in the second century CE wrote about a battle among extraterrestrials in his tall tale A True Story. Voltaire wrote of them in the 18th century in Micromegas. HG Wells’ War of the Worlds continues to be reimagined. ER Burroughs fantasized about Martian princesses. TV shows keep returning to the premise, such as People of Earth, which deserved a third season. Documentaries and pseudo-documentaries abound.
 
UFOs are back in the news this year with the release of footage from naval aircraft of objects that seem not merely to fly but to flit. In a replay of the “flying disc” moment (an ill-considered press release) in the 1947 Roswell incident, the Pentagon said it couldn’t rule out aliens. (BTW, about a decade ago I blogged about the iconic Roswell incident: Sip from the Saucer.) The Pentagon is being (what a surprise) disingenuous. “Can’t rule out” is deliberately near-meaningless. The phrase doesn’t mean the top brass is concerned that the objects are extraterrestrial. I regret that. I really do. I want them to be aliens. But the very fact that they are so obsessively cozy with US aircraft carriers in particular makes it far more likely they are drones: spy drones for some other military, false flag drones for our own, or both. Astrophysicist Adam Frank, whose job is to search for signs of extraterrestrial intelligence for NASA, agrees, saying the UFOs “don’t impress me.” He adds, “if the mission of these aliens calls for stealth, they seem surprisingly incompetent.” Civilian military analyst Tyler Rogoway reached the same conclusion. He notes that drones can do things piloted aircraft cannot (pilots prefer to survive their flights) and that the footage in any event is less impressive than much of the popular press suggests. Apparently bizarre maneuvers become nothing of the kind once the motions of the chasing aircraft and the camera equipment are taken into account. The crafts’ persistent interest in naval assets and disinterest in pretty much anything else is a major clue. If any members of my old senior class are reading this, some are probably shouting at me through the computer – maybe even reciting the Drake Equation (a way of guesstimating the number of alien civilizations). Nonetheless, the mundane explanation, merely by being possible, is also the more probable one. It’s a shame though.
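For any of those old classmates who want to recite it with numbers attached, here is a quick, purely illustrative Python sketch of the Drake Equation. Every input below is a guess I chose for the example rather than a measured value, which is rather the point: the answer swings by orders of magnitude depending on the assumptions fed into it.

```python
# A purely illustrative sketch of the Drake Equation:
#   N = R* x fp x ne x fl x fi x fc x L
# Every argument below is a made-up guess for the example, not a measured value.

def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime_years):
    """Estimate N, the number of detectable civilizations in the galaxy."""
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime_years

# Moderately optimistic guesses: star formation rate, fraction of stars with
# planets, habitable planets per system, fraction developing life, developing
# intelligence, producing detectable signals, and years those signals last.
print(f"{drake(1.5, 0.9, 2.0, 0.5, 0.1, 0.1, 10_000):.0f}")  # -> 135
```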


Just in case I’m wrong, however, I’m glad that Nabisco has taken a friendly approach to the matter. The company is offering Oreo cookies to the aliens. I don’t think that has been tried before. It would work for me. Were I an interstellar traveler, I'd be much more comfortable with it than with the kinky invitation etched on the Pioneer probes.


Sunday, June 6, 2021

Chow

Since the late 1970s I’ve generally maintained an eating pattern that some find peculiar: I eat every other day. Nowadays there is a certain cachet to fasting strategies for health reasons, but that wasn’t so much a thing back then. I had just experienced the normal but unwelcome post-college metabolism shift and found that the same calorie intake that kept me trim as a 20-year-old added 5 pounds (2.7 kg) a week at 25. There is no secret to losing weight (or to simply not gaining it): eat less. We all know this. But since “less” is almost certainly less than we like… well… we all know that struggle too. Wanting to halt weight gain but faced with the prospect of 1) feeling hungry (by eating less) every single day for the rest of my life or 2) feeling hungry (by eating nothing) every other day yet feeling sated on the alternate days, I found the latter easier. I still do. It’s not something I’d recommend to anyone else, but it works for me – when I get into the groove of it.
 
I say “generally maintained” because I’ve maintained it more often than not over the past 45 years. There have been long stretches (lasting years) when I’ve abandoned the strategy for various reasons. The results always were predictably bad. Some of those years have been recent, and the results got particularly bad (not helped by other aspects of 2020) by the time last year’s holidays arrived. So, I resumed every-other-day in December and slowly have drifted down about 30 pounds (13.6 kg) since then. The downward drift may continue a little until a new balance is struck, but just a little; further non-trivial losses would require even stricter measures. Mostly it’s now just about maintenance. Having been back in practice for 6 months, I find the regimen again second nature. However, I do admit to thinking about food a lot on fast days: not just about the thing itself but about things about the thing itself, such as Richard Wrangham’s book Catching Fire: How Cooking Made Us Human.


Cooking food makes calories (technically kilocalories, but I’ll stick with common usage) much more accessible. Wrangham argues this had the secondary effect in the distant past of freeing up time and energy for intellectual and cultural development. How far back did this start? The oldest reliably dated barbeque pit is 400,000 years old but dental evolution (a trend toward weak jaws and small molars) evidenced in hominin fossils hints that cooking may have started as much as a million years earlier. Humans don’t need the robust jaws and dentition of our great ape relatives. We get by on a much shorter and simpler digestive tract as well because cooked food is softer and easier to process. So, while chimpanzees and other apes spend a third of each day eating and chewing just in order to extract enough calories, modern human hunter-gatherers spend only 5% of their time at it. While it is the rare hunter-gatherer who gets fat (the lifestyle is very active and involves few pastries), the same biology poses a serious risk for those of us who are more sedentary and have overstocked pantries.
 
Body Mass Index (BMI) is the most common tool for categorizing people into weight classes, but it is a crude one. It has some value when applied to large populations, in which the errors tend to cancel out, but on an individual level the tool has serious limitations. It is calculated by dividing a person’s weight in kilograms by the square of the height in meters. A reading over 25 is regarded as overweight while a reading over 30 is regarded as obese. Take, for example, a 215-pound (97.5 kg) man who is 5’11” (1.8 m). He has a BMI of 30.09, so the fellow is obese. Yet that height and weight could describe three very different people: a body builder with a chiseled physique, a couch potato high in blubber but low in muscle, and someone in between. Calling at least one (maybe two) of the three obese is plainly silly in common parlance, yet they all have the same BMI. Still, BMI is a quick rough-and-ready guide for comparing a small number of average folks or large, mixed groups of people.
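For the record, here is that arithmetic in a few lines of Python; the 25 and 30 cutoffs are the standard categories mentioned above, and the example numbers are the ones from this paragraph.

```python
# BMI = weight in kilograms divided by the square of height in meters.

def bmi(weight_kg: float, height_m: float) -> float:
    """Return the Body Mass Index for the given weight and height."""
    return weight_kg / height_m ** 2

def category(value: float) -> str:
    """Map a BMI value onto the standard labels used in the text."""
    if value >= 30:
        return "obese"
    if value >= 25:
        return "overweight"
    return "normal or underweight"

# The example from the paragraph above: 215 lb (97.5 kg) at 5'11" (about 1.8 m).
value = bmi(97.5, 1.8)
print(f"BMI = {value:.2f} ({category(value)})")  # -> BMI = 30.09 (obese)
```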
 
By the numbers, 42% of adult Americans are obese. According to the CDC, which collects the data, there is “no significant difference in the prevalence of obesity between men and women overall or by age group.” In 1970, the year I graduated high school, 15% of adults were obese. I remember 1970 very well and, trust me, we didn’t starve ourselves. So what happened in the next half century? For one thing we followed the advice of the food gurus of the day (advice that lately has been called into question) by cutting back on what we were told was bad for us. Annual consumption of red meat has dropped 16 pounds (7.27 kg) per person since 1970. The percentage of fat in our diets has declined from 44% to 33% (though in absolute terms we eat more fat because we eat more calories). Egg consumption fell substantially after 1970 and dairy plummeted. Yet our substitutions (e.g. chicken for beef or sugary sodas for milk) are either little better or very much worse, and we eat far more of everything else. We eat, on average, 25% more calories per day than we did in 1970.
 
Average adult caloric intake in the USA in 2021 is about 3,600 calories per day. The USDA recommends 2,600 calories per day for moderately active men and 2,000 for moderately active women. For couch potatoes, cut those numbers back to 2,000 and 1,600 respectively. Yes, those numbers are “on average”: any one person could need substantially more or less based on individual physical characteristics and activities. There is vastly more to nutrition, of course, than calories, which are just a measure of available energy. They come packaged in proteins, fats, and carbohydrates. The right balance of these along with vitamins and minerals is crucial to health. Still, some things are best kept simple, and counting calories is a simple way to avoid surprises on the bathroom scale in the morning. There still might be unpleasantness, but not much surprise.
 
I have no advice to offer anyone on how to control weight. There are plenty of other people eager to do that – and to charge money for the privilege. Besides, I’ve often been bad at it. When I’m good at it, my methods are idiosyncratic to put it mildly. Most people find them crazy. But there is probably some kind of special crazy that does work for you. Best of luck finding it. For me, however, tomorrow is an on-day and I’m contemplating breakfast. (I’ll post this before going out the door in the morning.) I’m thinking maybe a chili jalapeno omelet with black coffee. No toast, but a side of bacon.
 
The Turtles – Food