Thursday, August 31, 2023

Strange Days

My mom and dad were born in 1928 and 1926 respectively, which by standard reckoning makes them both Silent Generation (b. 1926-1945) though my dad probably had more in common with the GI Generation, not least because he served in WW2. The time into which we are born does not compel us to have a particular mindset, but it does impel us. The milieu in which we grow up just seems normal to us. Indeed, we can reject the prevailing mores and world views of our generation and tribe, but to do so requires giving them more thought and reevaluation than most of us devote to the matter amid the distractions of day-to-day life. So it should be no surprise that both of my parents had views and values that today are called traditional – or by some folks reactionary. This mindset served them well in business and home life. It served my sister and me (both Boomers) even better by providing us a solid upbringing and enough prosperity to give us options in our own lives. We responded by criticizing their mindset none too politely. Youth rebellion is a constant in all eras of course.
 
I remember my mom saying to me around 1969 or 1970 that people and the prevailing social mores seemed normal and understandable to her (even if they needed improvement) until the early 1960s. “After that, everything and everyone just went completely crazy,” she said. It was a remarkable statement for someone who remembered the Great Depression and the Second World War, but I knew what she meant. There was something different about Charles Manson compared to criminals of the past. The Broadway musical Hair was very far from Oklahoma! Hippies were fundamentally different from 1950s greasers. It was hard to keep up with all the self-styled revolutions – social, personal, and political – in the 1960s. The sexual revolution, for one, was different than the plain old-fashioned cheating of earlier times. Over a very few years counterculture values had gone mainstream: they became less “counter” than just “culture.” To her and many like her it all seemed weird.
 
In 2023 I can relate better than I could then. In his book The Baby Boom: How It Got That Way...And It Wasn't My Fault...And I'll Never Do It Again, P.J. O’Rourke apologized to future generations of youth for Boomers “having used up all the weird.” He needn’t have worried. As it turns out we hadn’t come close to doing that. The 2020s are weirder than the 1960s. It is obvious in ways small and large. Unruly passenger incidents, once blamed on the pandemic, grow more frequent year by year. Americans of differing philosophies are as hostile to each other as they ever have been. The murder rate, after two decades of decline (the peak year was 1991), has rebounded to numbers not seen in 30 years. So too carjackings and daylight robberies. Tent cities on sidewalks – not seen even in the Great Depression – are regarded as normal. Reported mental health issues are higher among Millennials and Zoomers than in any previous generation. The gender debates that occupy far too much attention would have puzzled the most tuned-in, turned-on, and dropped-out hippie. Per capita alcohol use is up, as is drug consumption – not trippy psychedelics but narcotics. Yet, despite all this, young people are taking up those vices later than Boomers and Xers did. It all seems weird.


 
Perhaps none of this seems strange to those who have grown up in the last 20 or 30 years – a matter of the fish not noticing the water in which it swims. But they are strange days indeed for many of us with more miles on the odometer. If P.J. were still with us he could stop feeling guilty.
 
The Doors – Strange Days


Thursday, August 24, 2023

With Bells On

I’m pretty thorough about donating or disposing of (depending on condition) clothes I no longer use. The “if you haven’t worn it in two years” rule is a good one and most of the time I follow it. The few exceptions are items, some of them not even mine, that I’ve held onto for nostalgia reasons: my dad’s military uniforms, for example. I also kept a handful of samples of some of the wilder fashions of the 60s and 70s just in case I wanted to do a theme party or some such thing. Yet over the years they have decayed – colors fading, holes appearing – despite hanging unworn in a dark closet. I don’t know why, since those uniforms from 25 years earlier are still intact, but so it is. As a practical matter they would be unwearable even if they still fit (which they don’t), so I set nostalgia aside the other day and bade my sartorial past a permanent goodbye, including the green striped bell bottoms. The only bell bottoms left now in my closet are on my dad’s old sailor whites.
 
Bell bottoms used to be everywhere in my teens and twenties. This was so much the case that when they made a bit of a comeback I didn’t even notice until it was pointed out to me. They just look normal to me. This time they are primarily a women’s fashion, but in the 60s/70s they were worn by both sexes. The pics below of me and my sister in bells are both from 1970.

My mom and me, 1970

My sister Sharon

 
The origin of bell bottoms is usually traced to the US Navy around 1800. Though not official uniforms at the time, they became common as everyday shipboard work clothes. They were favored by sailors for being easy to roll up when swabbing the deck and performing other chores. They caught the eye of someone in the Royal Navy where they did become part of the uniform. This in turn back-influenced the US Navy, which then also officially adopted bells. By the end of the 19th century, classic blue denim bell bottom trousers were the standard US Navy shipboard work uniform. They remained so until 1999.
 
Civilian versions turned up from time to time. In the 1920s they were a minor women’s fashion trend. They often were called “yacht pants” because of the seafaring origin. They really took off with the general public, though, in the 1960s. Some publications credit the Army/Navy surplus stores, which sold cheap denim bells to impecunious hippies. Maybe. But maybe not. I remember those years, and most self-styled hippies were from solidly middle class families. They shopped more for fashion than price. Army/Navy surplus was indeed more of a thing than it is today, but I doubt the civilian fashion started there; more likely the Army/Navy stores just benefited from it. In any case, textile companies (jeans producers in particular) jumped into the field immediately. Bells, ranging from modest to bulging, had spread from the counterculture to the mainstream by the late 60s, even in semi-formal wear. It actually was harder to find straight legs – they were there but you had to look for them.
 
Precisely because bells were mainstream in the 1970s, ditching them became a statement of fashion rebellion, notably among punk rockers and their fans. By the 1980s, bells were passé though you still saw them on people who hadn’t refreshed their closets in a few years.
 
Enough years have passed for bells to re-emerge, this time without any significant social statement or context. It is just a style. Most of the US population, after all, was born after 1985 so there is no direct memory of the 60s/70s among those setting fashion trends.
 
My own bells, however, are gone. I do have “boot cut” jeans because… well… I wear boots, but the flares on those are so trifling as to be unnoticeable at a casual glance. They will have to do.
 
 
Derek & The Dominos (Eric Clapton lead guitar) – Bell Bottom Blues

 

Thursday, August 17, 2023

Festina Lente

 “Anything worth doing is worth doing slowly,” said underrated philosopher Mae West. I’ve slowed down a lot in recent years, not entirely by choice. The surprising thing is how little this has affected outcomes. It has affected the process substantially, removing much stress as a bonus. Oh, it means prioritizing: delaying or canceling some side projects rather than making frenetic attempts to multitask. The important stuff still gets done, however – if anything, more reliably on time.
 
There are times when hurrying is necessary. We might need to get to the airport to catch a flight or we might need to meet a deadline for a tax filing. Of course we wouldn’t be so rushed had we not crammed our days with so much other busyness that we left ourselves insufficient time for those tasks. Nonetheless, we all make that mistake sometimes, so such scrambles do happen. Generally, though, rushing accomplishes little. It leads to errors (having to do something over is not a time saver) and unhappiness.
 
There is a vast literature, both ancient and modern, on the value of slowing down. It ranges from fables (you know the classic) to Zen Buddhist tracts (e.g. Haemin Sunim’s The Things You Can See Only When You Slow Down) to medical studies. Corporate efficiency experts are on board too, even though some managers haven’t gotten the message. A recent article in Inc. Magazine is typical: Why Rushing Through Tasks Is Killing Your Productivity. “It's not about how much you do,” the article states. “It's about taking the time to give your best and refusing to sacrifice well-being.” In other words, doing something slowly but right is better than doing a lot of things quickly but badly – not just for you but for the company’s bottom line. Dave Mastovich, CEO and founder of MASSolutions, which he describes as a No Bullshit Marketing Consultancy, agrees. He says in a podcast titled Slow Down to Speed Up, “The myth about rushing is that if we move faster, and we move and rush, that is going to help us and save us time, when in reality, it doesn’t save us much time at all if we step back, and it definitely adds stress and impacts our health.” He further says that rushing rarely saves more than 3 or 4 percent in time while having negative effects, not just on you but on others, that more than negate any benefits.


 
In Psychology Today, Susan Biali Haas, M.D., describes working with clients who face burnout from their constant rushing, not just at work but with their families. She writes, “If you slow down and work and live more intentionally, and if you take better care of yourself, you’ll probably be way more effective. At everything in your life.” It helps ease that burnout too. She adds, “Forget the multi-tasking and allow yourself the luxury of being present with whatever you’re doing. We know from studies that multitasking doesn’t actually work.” She acknowledges that sometimes it is inescapable, but says not to allow it to become a default practice.
 
Rushing through meals is linked to obesity. There are numerous studies on this, e.g. Slow Down: Behavioural and Physiological Effects of Reducing Eating Rate, available through the NIH National Library of Medicine, which compared groups of fast and slow eaters. The abstract states, “Two hours post-meal, the slow rate group reported greater fullness (effect size = 0.7) and more accurate portion size memory (effect sizes = 0.4), with a linear relationship between time taken to make portion size decisions and the BOLD [blood-oxygen-level-dependent] response in satiety and reward brain regions.” Fast eaters ate more. That is but one health effect of rushing. Healthline lists, among other physical effects of “hurry sickness” (habitual rushing), trouble sleeping, changes in appetite, fatigue, headaches, stomach issues, and decreased immune health.
 
None of this is really a surprise. All of us (well, most of us anyway) know intuitively that habitually rushing around is bad for us physically and mentally. So why do we do it? I don’t really know, but I sometimes suspect it is for the same reason some people drink or do drugs. The world is a scary and hard place in many ways – and then you die. Over-occupying oneself is as effective at getting one’s mind off all that as a double shot of Jack Daniel’s – even though, strangely enough, probably no better for one’s health. Yet that is not all there is to the world (the Peggy Lee song notwithstanding). There are roses, too, and the age-old advice to stop and smell them is still good. (I won’t use the “smell the coffee” metaphor since caffeine seems to run counter to the point.)
 
As I mentioned in the beginning, my own slowdown has more to do with circumstances than any wise choices, but I notice the benefits all the same. I’m also planning a very leisurely lunch.
 
The Animals – Take it Easy Baby


Thursday, August 10, 2023

Pangloss Lenses

Murphy’s Law is most frequently referenced by engineers, but one hears it applied to other aspects of life as well. It is usually stated as “If anything can go wrong, it will,” often with the codicil “and at the worst possible time, in the worst possible way.” Sometimes it is given the further expansion, “No matter how idiot-proof something is, there always will be an idiot who rises to the challenge.” The sentiment is thousands of years old, but the term Murphy’s Law dates to 1949 at Muroc (later renamed Edwards) Air Force Base. Edward A. Murphy Jr. was an aerospace engineer who uttered a version of it when sensors failed to work on a rocket sled test.
 
Strictly speaking, it is untrue, at least with regard to one-off events. The universe is indifferent, not malicious. But probabilistically it is true. Given enough time, what can happen will. If a machine part can fail, with repeated use it sooner or later will – maybe the first time, maybe the thousandth, but it will happen. If a human can err in a particular way (e.g. by leaving shut the valves of the backup cooling system at Three Mile Island in violation of safety regulations), eventually someone will. Another example of human error involves the cargo door issue (later corrected) on the old DC-10 aircraft. The cargo doors were double doors that opened outward for greater convenience and more interior space. The latches were extra strong. There was not a single incident with the aircraft doors when they were properly closed. The engineering mistake was the assumption that ground loading crews would always close them properly. They didn’t. Improperly latched doors opened in flight on more than one occasion, with tragic consequences.
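To put a rough number on that intuition, here is a minimal back-of-the-envelope sketch (my own illustration, with a made-up per-use failure probability, not anything from Murphy or the Air Force): even a part with only a one-in-a-thousand chance of failing on any single use is all but certain to fail at least once over ten thousand uses.

```python
# A minimal sketch, assuming a hypothetical part with a small chance p of
# failing on any single use. The chance of at least one failure over n uses
# is 1 - (1 - p)**n, which climbs toward certainty as n grows -- the
# probabilistic reading of Murphy's Law.

p = 0.001  # hypothetical 1-in-1,000 failure chance per use
for n in (100, 1_000, 10_000):
    print(f"{n:>6} uses -> {1 - (1 - p) ** n:.1%} chance of at least one failure")
# roughly 9.5% at 100 uses, 63.2% at 1,000, and essentially certain at 10,000
```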
 
Murphy’s Law may seem a pessimistic way of looking at things, but actually it is not. Engineering schools dwell on it so that their students will strive to eliminate what “can go wrong.” If anything, it is optimistic to believe they can do that successfully.
 
Though there are individual exceptions, people in general are hardwired to be pessimistic about the world at large but optimistic about themselves: “Everything is going to hell in a handbasket, but I’ll make it.” This makes evolutionary sense. In the wild, you are more likely to survive if you pessimistically assume that a rustle in the brush is a predator whether it really is or not; you are also more likely to survive if you optimistically assume you can escape a predator by running rather than pessimistically assume you cannot and so just wait for the end.
 
At the personal level, optimists often are wrong. Studies show that pessimists judge their own prospects more accurately, a condition known as depressive realism. Nonetheless, optimists do better in life than pessimists, presumably because pessimists don’t bother to try as much. Optimists are healthier than pessimists. They live longer, make more money, and are happier. However, as with so many other things, optimism is best in moderation. A comparison of moderate optimists with extreme optimists published in the Journal of Financial Economics showed that moderates worked longer, saved more, invested better, and more often paid the full balances on credit cards. Extreme optimists (presumably betting that all would work out for the best regardless) worked less, saved less, invested more riskily, day traded more, carried more debt, and were far more likely to smoke. Unsurprisingly, the overconfidence displayed by extreme optimists often ends badly.
 

As I grow older I find my previous inclinations to outward pessimism and inward optimism reversing. For all its problems (many of them severe), the world at large is still a better place than it was 100 years ago or 50 years ago: richer, safer, cleaner. Today’s doomsayers are no more right than were those of the 1960s (e.g. The Population Bomb) who predicted worldwide starvation and catastrophic resource depletion well before 2000. That is not being a Pollyanna. There are a lot of real global challenges to face, but those folks predicting the extinction of civilization within this century are likely to be disappointed. On the other hand, on a personal level it is hard to ignore the detrimental effects of aging or the realities of actuarial tables. There is only so much optimism I can maintain about that. There is an unexpected benefit, however, to the lowering of future private expectations that comes with each passing year: being happy with what one has becomes easier.
 
Frank Sinatra – The Best is Yet to Come
[a temerariously optimistic song for Frank at age 64]


Thursday, August 3, 2023

Chances Are

Last night I watched The Cincinnati Kid, which I first saw in the theater as long ago as 1965. The cast is stellar, including Steve McQueen, Ann-Margret, Edward G. Robinson, and Tuesday Weld. The core of the film is a high stakes game of 5-card stud poker. It is a pretty good movie even though poker isn’t really my game.

from "The Cincinnati Kid"

I’ve played poker casually over the years – always with other casual players, always 5-card draw, and never for high stakes. In that kind of group I can hold my own, for I’m fairly disciplined at playing by the odds and not by intuition. But I know my limits. I’m not the best bluffer or recognizer of bluffs. My grasp of the odds is basic, not intimate. A seasoned amateur player, never mind a professional, would clean out my stake in short order. There was a poker game in my dorm at college every Saturday night, but I was sensible enough not to participate. I heard much wailing on Sundays from residents who had lost hundreds of dollars (in 1971 dollars). I didn’t want to be one of them. I’ve continued to avoid serious games ever since.
 
Playing cards date back to 1000 CE in China. They entered Europe via Egypt sometime around 1360. The suits of the cards varied from place to place and some of those variants still exist, but the current standard simplified suits of clubs, diamonds, hearts, and spades date to 1480 when French entrepreneurs began producing them with stencils. Numerous games were invented for them, but poker is surprisingly recent, dating back to 1829 in New Orleans. It (including the name) apparently derives from the French poque, which was played in Louisiana during its French days. Other than betting and bluffing, the two games are not very close, however, so someone (we don’t know who) at some point must have sat down and deliberately invented the distinct rules for poker. By 1834 the rules were tweaked into their current form.
 
There are numerous variants of the game, the most popular being 5-card draw, 7-card draw, 5-card stud, 7-card stud, and Texas Hold’em. Texas Hold’em, a community-card descendant of 7-card stud, made it into the World Series of Poker in 1971 and is currently the basis of the WSOP Main Event, last won by Daniel Weinman, who took home 12.1 million dollars.
 
I don’t encourage gambling in a general way, since some people find it addictive. But for non-addictive types who know and keep to their limits, there are some valuable lessons in the game of poker. Primarily, it attunes one’s sense of probabilities in a more realistic direction. Playing the odds is a long game, for one thing, not a short-term strategy. We intuitively tend to regard 70% odds in our favor as a pretty safe bet, for example. If you correctly calculate your odds of winning with a particular hand at 70%, however, you will still lose a not inconsequential 30% of the time. You must judge and hedge your bets accordingly in order to stay in the game. Thinking probabilistically rather than with unwarranted certainty is a skill worth cultivating off the poker table every bit as much as on it.
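To make that last point concrete, here is a rough simulation (entirely my own sketch, with invented bankrolls and bet sizes rather than real poker odds): it repeatedly takes an even-money bet that wins 70% of the time and counts how often the bettor goes broke anyway, depending on how much of the bankroll is risked on each hand.

```python
import random

# A rough sketch, assuming a hypothetical even-money bet won 70% of the time.
# Even with the odds in your favor, risking too large a fraction of your
# bankroll on each hand makes going broke the likely outcome.

def bust_rate(fraction, hands=200, trials=10_000, win_prob=0.70):
    """Fraction of simulated runs in which the bankroll is effectively wiped out."""
    busts = 0
    for _ in range(trials):
        bankroll = 100.0
        for _ in range(hands):
            stake = bankroll * fraction
            bankroll += stake if random.random() < win_prob else -stake
            if bankroll < 1.0:  # effectively broke
                busts += 1
                break
    return busts / trials

for fraction in (0.10, 0.50, 0.90):
    print(f"risking {fraction:.0%} of bankroll per hand -> "
          f"busted in {bust_rate(fraction):.1%} of runs")
# Cautious sizing almost never busts; risking 90% per hand busts nearly every
# time despite the 70% edge -- hence hedging your bets to stay in the game.
```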
 
Samantha Fish – Lay It Down