Sunday, May 29, 2022

Lex Luthor as a Role Model

There are (sometimes overlapping) fads in movies and popular entertainment: the noir of the 40s/50s, the Westerns of the 50s/60s, the sci-fi that followed the success of Star Wars, and so on. Examples of each genre turn up after its fad has generally passed, to be sure, but those examples stand out just for that reason. Since 2000, comic book-based superhero movies have consumed ever larger studio budgets and generated (despite the occasional bad miss) ever larger box offices. A few are solid works of art by any standards, but most are at best forgettable even if entertaining in the moment. It is remarkable just how long-lived this fad has been. A slew of films with budgets in the hundreds of millions of dollars are in production and post-production at this moment, while comic book publishers explore new characters in hopes of creating the next massive franchise. Superheroes must speak to something in 21st century public psychology. Fantasy and wish fulfillment are at the bottom of it, of course, but that is the case in every genre of entertainment. Plenty of psychologists and social observers have tried to identify what the current appeal of superheroes in particular might be. I don’t have any special additional insights, so I’ll leave the head-scratching about it to them.
 
I do insist, however, that it is the villain who makes a superhero movie watchable at all. (Not an original view, I know, but true.) The villain provides the necessary challenge and is almost always the better role. The best written ones are not mindless thugs, sadists, or psychopaths. The real world has a full share of those, and they cost us dearly without any upside. The best fictional villains are understandable on some level. Some are even sympathetic. Dr. Horrible is a case in point. Four years before he was handed 200,000,000 studio dollars to write and direct The Avengers, Joss Whedon whimsically made the silly but enjoyable Dr. Horrible's Sing-Along Blog for virtually zero dollars and then gave it away for free on the internet. The hero, Captain Hammer, is a pompous, posturing ass, while Dr. Horrible just seeks love and respect. Dr. Horrible is still on the wrong side, but we the viewers can’t help liking him more. So, too, with better-funded villains. It’s not enough for a villain to say “I want to rule the world” to be interesting. Rather, it’s more understandable to hear, “I want to rule the world to make it a better place, and it will be a better place when everyone just shuts up and does what I say.” This is every bit as villainous as the simple egotism of the former quote, and in many ways scarier, but we get it. After all, reader, wouldn’t the world be a better place if everyone would do what we say?


Becoming a superhero is a little out of reach. There is never a radioactive spider when you need one, and getting belted by gamma rays won’t make you the Hulk. It will just kill you. Fortunately, award-winning comics writer Ryan North tells us that it is entirely possible to become a supervillain in the real world in his book How to Take Over the World: Practical Schemes and Scientific Solutions for the Aspiring Supervillain. No superpowers are required. Money is. However, all the schemes in the book together add up to a cost of some 56 billion dollars. This is a lot, but not more than Congress commonly tacks on to some existing spending bill just as an afterthought. (One of the schemes in the book involves hacking elections, so that is one indirect way to raise the cash.) He makes a point of sticking to the scientifically possible, albeit often difficult: e.g. cooling the climate with high altitude sulfates (as volcanoes do naturally) with all the blackmail possibilities that suggests. He distinguishes between practical and possible. A secret lair, for example, is most practically and economically hidden in plain sight as a sublevel under a commercial building or even a private home, but it is possible to float a high altitude sphere for your lair. This idea actually dates to 1958. Buckminster Fuller noted that the cube volume law (volume increases according to the cube of linear dimensions, so a person twice as tall as another will have eight times the volume if relative body proportions remain the same) means that if you made a geodesic sphere large enough, the mass of the structural elements, which grows only with the surface area, would become a mere rounding error when calculating the total mass including the air inside. Heating the air inside (transparent greenhouse panels would do the trick) would make the whole object lighter than the surrounding air at normal altitudes, and up it would go. North calculates that a sphere 1.6 kilometers in diameter would work great. However, one can see how a 1.6-kilometer floating sphere would make a pretty easy and fat target.
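Out of curiosity I ran the numbers myself. The sketch below is only a rough back-of-the-envelope check, not North’s or Fuller’s own arithmetic: the 1.6 kilometer diameter comes from the book, while the 30-degree temperature rise and the 5 kilograms per square meter of skin are my own guesses for illustration.

```python
import math

# Back-of-the-envelope buoyancy check for a heated geodesic sphere.
# The diameter is from the book; the temperature rise and the areal
# density of the structure are illustrative assumptions, not North's figures.

DIAMETER_M = 1600.0          # 1.6 km sphere
AIR_DENSITY = 1.225          # kg/m^3, sea-level air at about 15 C
T_OUTSIDE_K = 288.0          # about 15 C
T_INSIDE_K = 318.0           # assume the greenhouse panels warm the air by ~30 C
SKIN_KG_PER_M2 = 5.0         # assumed mass of framework plus panels per m^2

r = DIAMETER_M / 2
volume = (4 / 3) * math.pi * r**3    # enclosed air grows with the cube of r
surface = 4 * math.pi * r**2         # structure grows only with the square of r

# At constant pressure an ideal gas's density scales with 1/temperature.
inside_density = AIR_DENSITY * T_OUTSIDE_K / T_INSIDE_K
lift_kg = (AIR_DENSITY - inside_density) * volume   # net buoyant mass
structure_kg = SKIN_KG_PER_M2 * surface

print(f"Gross lift from warm air: {lift_kg / 1e6:,.0f} thousand tonnes")
print(f"Structural mass:          {structure_kg / 1e6:,.0f} thousand tonnes")
print(f"Spare capacity for lair:  {(lift_kg - structure_kg) / 1e6:,.0f} thousand tonnes")
```

Even with those crude assumptions the warm air lifts a couple of hundred thousand tonnes against a skin weighing a few tens of thousands, which is exactly Fuller’s rounding-error point.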
 
North says his book “identifies hitherto unexploited weaknesses in our global civilization… In other words, this is a book of non-fiction about becoming a literal supervillain and taking over the world.” Since North himself hasn’t tried any of his schemes (to my knowledge), I suspect none is easy. But it is nice to know it can be done. Anyone have $56 billion to spare?
 
Theory of a Deadman – Villain


Sunday, May 22, 2022

Inner Worlds

I visited an old friend with cognitive issues (and no internet presence) recently. He is no longer good at distinguishing between events he imagines and events that actually happened. Under the circumstances, in his case this is mostly a harmless quirk. He told me he had purchased a house, for example. He didn’t, but he seemed to enjoy thinking he had, so I saw no reason to contradict him.
 
We are all Walter Mittys with elaborate fantasies spinning in our heads in which we are the stars. We generally know they are fantasies. When we believe them they are delusions. (We all harbor some delusions, too, but that is a subject for another time.) What is surprising is just how much time an average person spends daydreaming: “watching your own mental videos” as Yale professor emeritus Jerome Singer put it. According to numerous studies it is between 30% and 47% of our waking hours. Sometimes it’s intentional and sometimes we just zone out when reading something boring or performing a dull task. We usually snap back to the real world fairly quickly.

1947 adaptation of James Thurber's short story

Recent research suggests daydreaming ought not be considered a bad thing, assuming it doesn’t crowd out too much of real life. On the contrary, it helps us form our values and sense of self by testing them in extraordinary situations without actual risk. It exercises our brains to recall the past, imagine the future, and make novel connections among disparate pieces of information. It thereby enhances creativity. It is positively correlated with productivity at work. Smithsonian Magazine notes that research by scientists from the University of Wisconsin and the Max Planck Institute for Human Cognitive and Brain Sciences “suggests that a wandering mind correlates with higher degrees of what is referred to as working memory.” “Working memory” is what it sounds like: an ability to retain and call up memories even when faced with distractions. The upshot is that all the mental work we put into making coherent daydreams enhances our capacities in ways that are transferable to the real world.
 
Daydreams most obviously inspire authors and artists. Woody Allen, Tim Burton, and J.K. Rowling all credit their best work to daydreams. This is not a new observation. Sigmund Freud noted, “A piece of creative writing, like a daydream, is a continuation of, and a substitute for, what was once the play of childhood.” Yet they inspire scientists and engineers too. Robert Goddard, father of modern rocketry, credited his childhood daydreams of flights to Mars for his work in the field.
 
Are there downsides? Sure, for some people it is like a drug. For them real life can’t compare. Addicts of any kind have a hard time paying bills and keeping friends, and chronic daydreamers are no different. Others get depressed that real life never measures up to their imaginations. But, of course, it’s not supposed to measure up. I do not know, but I’d be willing to bet Elon Musk daydreams about colonies around Alpha Centauri. That is not happening in his lifetime or the lifetime of anyone alive today, but that doesn’t make what SpaceX has accomplished so far any less remarkable. Still others set their hearts on aspects of their fantasies, which of course is a recipe for heartbreak.
 
In general, though, we benefit from our mental excursions provided we remember the difference between the real and make-believe. If the day ever comes, however, when I say (and clearly believe) I bought a house when you know I didn’t, be kind and don’t argue.
 
Devil Doll – It’s Only Make Believe 


Sunday, May 15, 2022

The Charm of Nuclear Weapons

Every nation’s representatives say its weapons and military exist to deter aggression. No one ever believes it. This is nothing new. The Romans famously claimed all their conquests were in self-defense. The same goes for individual weapons systems; they are supposed to be scary enough to deter aggressors. The modern industrial arms race really picked up steam (literally as well as figuratively) in the late 19th century, especially (though not only) with regard to naval weaponry. Better armor called forth better guns to defeat it. New classes of vessels such as torpedo boats begat counter-vessels such as torpedo boat destroyers. On land (and, later, in the air) the firepower race accelerated. For a few decades, many people really believed these developments were making war “unthinkable.” Hiram Maxim, inventor of the Maxim machine gun, said, “Only a general who was a barbarian would send his men to certain death against the concentrated power of my new gun.” In World War I, generals from every combatant army did exactly that.
 
Yet there was a hint by the end of WW1 that some weapons really might be too scary to use. Experience in that war had indicated that there was no military advantage to be gained when both sides employed poison gas; it just pointlessly increased suffering without changing the balance of forces on the ground. So in WW2, even in desperate circumstances, no combatant resorted to it on the battlefield. However, later in the 20th century not all militaries were so circumspect, notably in the Iran-Iraq War (1980-88).
 
So far, however, nuclear weapons truly have been different. Since 1949, when the US monopoly on nuclear weapons was broken, they really have deterred. The risk of retaliation has made them too scary to use. Nine nations currently possess nuclear weapons, eight of them openly. (Israel doesn’t acknowledge possessing them but everyone assumes it does; South Africa dismantled its nukes after 1989.) Only China has announced a “no first use” policy; all the others reserve the right to use them not just in retaliation but in circumstances where the existence of the state is at risk. Possession of these weapons has not deterred proxy wars, but it has deterred the nuclear armed powers from tangling with each other directly. It must have occurred to both Moammar Gadhafi and Saddam Hussein that things might have gone differently had they been so armed. It clearly occurs to the leadership of Iran. The risk, of course, even assuming sane leadership all around, is miscalculation during diplomatic brinkmanship games. If one side or another feels existentially threatened, a limited tactical demonstration of nuclear force could escalate out of control. This always has been and remains the most likely scenario for a nuclear exchange if one ever happens.
 
M65 "Atomic Annie" tactical artillery


The West, at present, has been deterred from direct intervention in Ukraine as one obvious example, and for good reason. If anyone has doubts about that reason, I suggest visiting the website NUKEMAP, which allows the user to choose a geographical location and an explosive yield in kilotons and see the results. (Strategic ICBMs and SLBMs typically have warheads in the hundreds of kilotons; gravity bombs from bombers can be in the thousands of kilotons [aka megatons].)
 
According to the Bulletin of the Atomic Scientists, “As of early 2022, we estimate that Russia has a stockpile of approximately 4,477 nuclear warheads assigned for use by long-range strategic launchers and shorter-range tactical nuclear forces, which is a slight decrease from last year.” Of these, 1,588 strategic warheads are deployed: 812 on land-based missiles, 576 on submarine-launched ballistic missiles (SLBMs), and 200 at heavy bomber bases. Another 977 strategic warheads are in storage, though they could be activated in short order. A further 1,912 tactical warheads are operational: some are seaborne and others are intended for land battlefield use. (What is tactical and what is strategic has more to do with how a weapon is used than with explosive power per se, but in general tactical nukes are in the tens of kilotons while strategic nukes are in the hundreds of kilotons or more.) Another 1,500 warheads are retired but upgradable. That makes a total of 5,977 warheads.
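For anyone checking the addition, the sub-totals reconcile neatly with both the Bulletin’s 4,477-warhead stockpile and the 5,977 total; the little tally below uses only the figures quoted above.

```python
# Sanity check on the figures quoted above.
deployed_strategic = 812 + 576 + 200   # land-based + SLBMs + bomber bases = 1,588
stored_strategic = 977
tactical = 1_912
retired_but_upgradable = 1_500

stockpile = deployed_strategic + stored_strategic + tactical   # the Bulletin's 4,477
total = stockpile + retired_but_upgradable                     # the 5,977 headline figure
print(deployed_strategic, stockpile, total)                    # 1588 4477 5977
```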
 
US forces are comparable. So perhaps some caution – in both directions to be sure – is justified. If unconvinced, use the NUKEMAP calculator 5,977 times.


Sammy Salvo - A Mushroom Cloud (1961)


 

Sunday, May 8, 2022

Gray is the New Blond

Two years ago, at the beginning of the pandemic, the local supermarket reserved the hours of 7 a.m. to 9 a.m. for shoppers over 65, that being the age group most at risk. Much to my own surprise (my life feels shorter than that) I solidly qualified, so I arrived at 7:30, figuring that there would be few shoppers. My mistake. The store was jammed with old people. “Are there really all these seniors in town?” I wondered. Yes, there are. Thereafter I shopped after 9 with the general population; the store always was much emptier then.


 
I should point out that Mendham is no retirement community. The home prices are high while the taxes and living costs are simply ridiculous. I live here only because it is the family home my father built; my mother grew up (and is buried, next to my dad and sister) four miles from my door. I’m as local as one can get, and so will stay for sentimental reasons as long as I can somehow afford it. But it is, in a general way, a terrible place to retire. Instead, the town long has been known as a bedroom community with a reputation for good schools that draws young (or young middle-age) families with children. Yet somehow all these old folks are walking around nowadays. Thinking about it, I realize I see more of them than kids.
 
This reflects a trend that is not just local or national but global. The world is getting older, a consequence of declining birthrates and increasing longevity. In 1980, children aged 0-9 worldwide outnumbered persons over 60 by 1.1 billion to 0.4 billion. According to the UN, by 2030 those over 60 will outnumber those under 10 by 2.1 billion to 2.0 billion. Those over 80 are increasing the fastest, from 137 million in 2017 to a projected 425 million in 2050. In advanced countries the trend is more pronounced. In Japan the median age is 48. It is 46 in Germany, 47 in Italy, and 45 in Lithuania. In the US the median age is 38, which sounds young by comparison until one remembers that in 1960 it was 29.5. A higher fertility rate than in most First World nations (though still well below replacement level), along with immigration, has kept the US median age rising more slowly than elsewhere, but the trend is steadily upward all the same.
 
There are economic consequences to an aging population. A shrinking pool of younger workers means a shortage of skilled labor, which restrains production. On the upside, this should mean rising real wages. (I know they are falling at the moment due to inflation, but on a longer scale the pressure is upward.) This in turn promotes automation and thereby increases productivity per worker. On the downside for those workers, any higher wages will probably be taxed away in order to pay for the health care needs of the elderly as the dependency ratio rises. Accordingly, economic growth in aging countries tends to be anemic in all aspects other than health care and retirement services. Advanced countries with low immigration (e.g. South Korea and Japan) are facing absolute declines in population, with the biggest drop among the young. The growing proportion of elders in the population is well positioned to vote itself more benefits as well.
 
At the current time my taxes (federal, state, and [especially] local) well outweigh any benefits, and I don’t see that changing anytime soon – maybe never. So, I am not at the receiving end of any dependency ratio – rather the reverse. Still, I can understand the grumpiness so many of the young express toward gray hairs. All I can say is that you’ll join us (actually, replace us) sooner than you think. And you’ll then outnumber the kids by an even wider margin than we outnumber you now. Maybe you should start practicing the art of shaking a rake and yelling "Get off my lawn!" now. But if stores offer senior shopping hours, I recommend skipping them.

Bobby Cole - I'm Growing Old (1966)


Sunday, May 1, 2022

The Case for Consuming One's Fruits Fermented

A week ago Sunday I indulged in a “healthy” snack of freshly cut strawberries and blueberries. The quotation marks in the last sentence would not be necessary had I been a little less lackadaisical about thoroughly washing the berries first. Some hours later began that special agony, which most (all?) of us experience more than once in life. (I don’t know for a fact it was the berries – I didn’t take them to a lab for testing – but the only other possibilities are unlikely, each for its own reason.) I’m normally known for an iron stomach. I have omnivorous habits, no known food allergies, and a taste for the spiciest spices. But I met my match in berries.
 
In the past I’ve shrugged off the occasional food poisoning episode quickly. Not this time. Either this was a sterner bug, or I’m less resilient these days on account of age, or both. In the past, digestive system turmoil (the details and consequences of which I’ll politely decline to describe) was the whole of it, and it was over in 24 hours or so. This time it was almost the least of it. Instead, Chills and Fever opened the show. Intestinal Distress (accompanied by backup singers Torpor and Sleepiness) soon joined the act and occupied Center Stage for the next five days. Even now, though I finally feel OK, I still find it necessary to be uncharacteristically gentle and cautious with my food choices lest I risk an encore.
 
Prior to the 19th century, food poisoning was understood only in the chemical sense, e.g. a dash of arsenic mixed in the mashed potatoes from an impatient heir. An understanding of the biological type had to await the confirmation of the germ theory of disease. The first microbe to be identified as a specific culprit behind a specific illness of this type was Salmonella. In 1885 some very good early medical investigative work not only identified the bug responsible for an outbreak of food poisoning in Britain but traced it to a specific slaughtered sick cow. The list of known harmful food-borne biological contaminants (bacterial and viral) since then has grown very long indeed: Campylobacter, Listeria monocytogenes, Escherichia coli, Hepatitis A, etc. They all wreaked havoc long before people knew they existed, of course. According to two medical historians at the University of Maryland, Alexander the Great died in 323 BCE most probably from typhoid fever, which is caused by Salmonella typhi: it was something he ate.
Salmonella

No matter how proactive a country’s health system may be, individual cases and wider outbreaks will happen. Germ contamination, picked up from the soil, is just a feature of fresh produce, whether home grown or from the market. Any fresh raw poultry probably does carry Salmonella or Campylobacter, though proper cooking kills them. Hence the advice always to clean and cook food well. The CDC publishes a list of Foods That Can Cause Food Poisoning. Not many foods are absent from it.
 
Some 48 million Americans get food poisoning every year. This is an estimate, of course, since most of those afflicted don’t seek medical help. 128,000 are hospitalized, however, and 3,000 die. So, if last week was an average one, about 923,000 Americans were sharing my unpleasant experience, and some 2,519 had a far worse time of it. I’ll be taking precautions to avoid rejoining those statistics any time soon. I know it’s entirely the wrong lesson, but the canned fruit and vegetable aisle at the market is looking more attractive to me at the moment than the fresh produce aisle… at least until the next time one of those canned brands gets recalled. Maybe the wine aisle is best of all.
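For anyone who wants to check the arithmetic, the weekly figures are simply the annual estimates divided by 52; I am assuming the “far worse” group is the hospitalizations plus the deaths.

```python
# Weekly figures derived from the annual estimates quoted above.
ANNUAL_CASES = 48_000_000
ANNUAL_HOSPITALIZED = 128_000
ANNUAL_DEATHS = 3_000
WEEKS_PER_YEAR = 52

weekly_cases = ANNUAL_CASES / WEEKS_PER_YEAR                               # ~923,077
weekly_severe = (ANNUAL_HOSPITALIZED + ANNUAL_DEATHS) / WEEKS_PER_YEAR     # ~2,519

print(f"{weekly_cases:,.0f} fellow sufferers in an average week")
print(f"{weekly_severe:,.0f} hospitalized or worse in an average week")
```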
 
Samantha Fish – Chills and Fever