Saturday, February 15, 2020

Tides and Swarms


Last night after picking up a few groceries I drove through my smallish home town of Mendham and was surprised to find traffic backed up. At any time other than morning rush hour, it is always a surprise to find traffic backed up. The reason was cars coming in and out of The Black Horse Inn, a restaurant at the central crossroads that has been in business since 1749. I had forgotten it was Valentine’s Day. Apparently the day is alive and well, locally at least. According to MarketWatch, spending on gifts and entertainment for the holiday was up substantially this year, yet fewer people accounted for it. They are enough to have crowded upscale restaurants, it seems, but the number of over-18s who did anything to celebrate the day was nonetheless down 20% from a decade ago, with the biggest drop-off in the 18-35 range. Some 5% in that age group planned (with characteristic irony) anti-valentine activities.

This follows the general trend in the population toward lifetime singlehood and away from forming couples – and also away from having children. The extent to which this is true is masked by the 80/20 split which characterizes so much of culture and life. (See Dream Hoarders: How the American Upper Middle Class Is Leaving Everyone Else in the Dust, Why That Is a Problem, and What to Do About It by Richard Reeves.) The culture in traditional media is dominated by the upper 20%, and this class still marries at much the same rate as a half century ago. For everyone else the rate has fallen off a cliff – and not just for marriage but for romantic relationships of any kind. Singles are a majority of the US adult population, and many young people express no interest in ever being anything else. Unsurprisingly, the fertility rate keeps dropping. One doesn’t need to be married to have children, of course (40% of US births are out of wedlock), but single people tend to have fewer. In 2019 the US fertility rate declined to 1.7, its lowest level ever. While this is actually relatively high by first world standards, it is well below the 2.1 necessary to maintain a stable population without net immigration.

For those who might think the fertility decline stems from insufficient social supports for parents in the US, consider that countries with generous supports (e.g. the Nordic bloc) have even lower fertility rates. Finland, for example, has 105-day maternity leaves (fathers get 54 days) during which the Finnish Social Security agency pays a maternity allowance. Parental leave (albeit unpaid) with job security lasts another 158 days. There are daycare subsidies and a child home care allowance. Finland ranks fourth globally (after three Scandinavian countries) in gender equality. Yet, Finland’s fertility rate is 1.3. Apparently, something else is influencing these personal decisions.

The global fertility rate, by the way, is 2.4. This is half what the rate was 60 years ago, but it is still enough to keep global population rising from 7.8 billion today to 8.5 billion by 2030 – an increase about equal to the entire world’s population in 1800. If all this discussion of population seems a digression from Valentine’s Day, it is. But sort of not. The thoughts were stirred up by recently reading two books with similar titles but different emphases. Both are worth a look.

The Human Tide: How Populations Shaped the Modern World by Paul Morland describes the ways birth rates, death rates, migrations of people and peoples, and sometimes unpyramid-like population age pyramids shape history and politics. Some information is on a grand scale, some on a small (e.g. “Life expectancy for men in Glasgow is lower than for men in Gaza”), and much is in between. Demographics may not be destiny, but all else equal they very nearly are. The rise and fall of civilizations over the millennia are intimately tied to the size and distribution of their people.

A profound change in human affairs began a little over two centuries ago. Whereas populations once expanded and contracted in accord with war, disease, political (in)stability, and natural conditions, the industrial revolution broke humans out of the Malthusian trap, first in Europe and then sequentially in other parts of the world. Always there is a population explosion followed by a drop in fertility. There are occasional anomalies (e.g. the Baby Boom, which interrupted an early 20th century fertility decline) but these are short-term responses to unusual circumstances (e.g. Depression and World War). A few countries (Russia and Japan among them) already are contracting in absolute terms. Others are still rising (but aging) due to ongoing reductions in the death rate but will contract in the near future. Replacement of contracting traditional populations by migration will prevent declines in some places, but this is not without social stress. Demographics are reshaping cultures and global power accordingly.

The Human Swarm: How Our Societies Arise, Thrive, and Fall by Mark W. Moffett covers some of the same ground but with another perspective. He is particularly interested in the origins of division and unity, peace and conflict. Moffett ties human psychology (especially the us-versus-them dichotomy) to animal behavior, including (but not limited to) that of chimpanzees and bonobos. Although conflict tends to grab our attention, Moffett reminds us that humans have a remarkably peaceful tolerance of strangers. You cannot put 500 strange chimps together in a theater without a riot, but humans do this without a thought. Yet we engage in grand scale warfare (much of it civil) beyond the imagination of our anthropoid cousins. Moffett tells us that ethnicity and other forms of tribalism matter, not because of any biological basis they might have but simply because people themselves believe in them, with consequent identity politics that are sometimes benign and occasionally murderous. Moffett, like Morland, notes modern fertility decline but, also like Morland, can identify the conditions under which it occurs but cannot offer an explanation as such.

Perhaps there is no use overthinking it. It’s enough to say that above a certain level of economic and personal independence, more people choose maintaining that freedom and independence over having a large family – or even a romantic partner. As for those still seeking the latter, sometimes it works out. Sometimes not.



An Anti-Valentine Tune:
Puddle of Mudd – She Hates Me


Saturday, February 8, 2020

Go to the Matt

You never know with award-winning actor Matthew McConaughey. When not hawking Lincolns or Wild Turkey bourbon, he might turn up in anything from a lame romcom (Ghosts of Girlfriends Past), to an edgy neo-noir (Killer Joe), to a genuinely effective drama (Dallas Buyers Club). Quality is a roll of the dice. Based on Amazon recommendations, this past week I took a chance and a gander at two from 2019.

**** ****

The Beach Bum 
Written and directed by Harmony Korine, The Beach Bum features a protagonist (McConaughey) named Moondog who doesn’t rise to the level of anything so dignified as the title. We meet him in Key West where he is drunk, stoned, and (very successfully) lecherous – in fact, we rarely see him in any other condition. He is able to live a life of carefree dissipation because he is rich. More accurately, his wife Minnie is rich. Minnie is as unfaithful as Moondog, including with Moondog’s friend and pot dealer Lingerie (Snoop Dogg), but everybody is cool about each other’s sexcapades. Minnie calls Moondog back to Miami to attend their daughter’s wedding. Moondog shows up (still drunk, high, and lecherous), but is disappointed that his daughter Heather chose such a boringly normal partner. We learn that Moondog (as Minnie, Heather, and Moondog himself all agree) is a great man because he once wrote lewd, raunchy poetry (adolescent fare to judge by the samples given), though he stopped when he could afford just to be a bum.


Minnie and Moondog drive drunk the night of the wedding and Minnie is killed. (It is typical of the film that we see nothing of any effects on the driver [and passengers?] of the car with which they had a head-on.) Moondog discovers that Minnie’s will prevents him from inheriting her huge fortune unless he publishes another book of poetry. Until then he is broke, and his daughter Heather refuses to advance him money. Naturally, he has to drink, blow dope, and letch even more to get his creative juices flowing. Meantime he is recklessly destructive and even participates in the mugging of an old man for cash. But that is OK because he is a great artist, you see.

I have no objection to shameless hedonism as a conscious lifestyle choice. I at least understand nihilism, which posits no value (negative or positive) in either creation or destruction. Yet personal hedonism need not entail a brutal disregard for others, and Moondog isn’t a nihilist. On the contrary, he talks a lot about the positive value of fun. He is careless and destructive for the fun of it. He is a jackass. Ninety-six minutes is too long to spend in his company. The only thing worse would be sitting through one of his poetry readings.

Thumbs Down.

**** ****

Serenity
Low expectations after the previous film thankfully were exceeded in this one, but regrettably not by a lot. Serenity starts out promisingly enough as an apparent homage to classic noir. Promises, promises.

Baker Dill (McConaughey) is an Iraq War vet with PTSD. Taking their son Patrick with her, his wife Karen (Anne Hathaway) left him during the war for a rich man. Dill changed his name and retreated to a subtropical island where he operates a deep sea fishing boat for tourists: think Humphrey Bogart in To Have and Have Not but without the Vichy French. Chronically short of cash, Dill makes some extra as a gigolo for Constance (Diane Lane). He is obsessed by a big tuna named Justice who keeps eluding him, but his mind is taken off the fish when Karen shows up. She says that her husband Frank is an abuser. She says she has arranged this vacation so that Dill can take Frank fishing and kill him. Frank doesn’t know Dill is her ex. She offers Dill $10 million to do the job. Yet there is something screwy about the whole business that goes well beyond a murder scheme, and the more he tries to make sense of it the less sense it makes.

Though prettily filmed, Serenity misses on so many levels (including an absence of spark between McConaughey and Hathaway) that it is rescued to a degree by a “so bad it’s good” quality. But only to a degree.

This is not the shipwreck that The Beach Bum is, but ultimately (despite elements I can’t mention without spoilers) it gets a Thumbs Down.

**** ****

The good news is that 2020 should be a better year for Matthew. I haven’t seen The Gentlemen, currently in theaters, but it is getting generally positive reviews. As for Lincoln cars, they are too expensive for my taste. Wild Turkey isn’t bad for its moderate price, though the “101” variant is to be preferred over the standard bottle.


Serenity Trailer

Sunday, February 2, 2020

Malice Afterthought

As mentioned a couple of blogs ago, 2020 contains quite a lot of notable 50th anniversaries for me personally. In the broader world, 1970 was no more or less notable than 1971 and probably much less so than 1969, but in a purely solipsistic sense 1970 was a particularly memorable year. Among other things, it contained my final semester of high school, the first semester of college, and (at the tail end of November) registration for the draft on my 18th birthday. Beneath those surface events, 1970 was also the first full year my sense of identity had a firm footing.

I think most readers will know what I mean. Whatever one thinks of the old Freudian notions of personality formation in early childhood years, it is true that certain personality traits visible in a person’s childhood commonly remain visible in adulthood. Yet, there is less to this than meets the eye. Take a childhood trait of shyness as an example; even if the trait continues into adulthood, it can be expressed in a variety of different ways; what way a 10-year-old will express it at age 20 or 30 is unpredictable. Hence, there are shy criminals, professors, and army Rangers with quite different general personalities. Most of us firm up our final identities in our teens – not our life paths but our identities. Some take longer. I certainly feel I could have become a very different person today had influences during ages 13-16 been different, even if some early quirks would have carried over regardless. However, though much older (and I hope at least a fraction wiser), I’m very much the same person now as at 17. I don’t think anyone who knew me in 1970 would be surprised meeting me today.

For most of us the mid-teens – generally coinciding with high school – are pivotal and are burned into our memories like no 4-year period before or since. The “reminiscence bump” is a well-known phenomenon: even as senior citizens we remember our teens and 20s better than more recent decades. High school is particularly intense since it generally coincides with a lot of “firsts,” though some of us bloom earlier or later than others. Accordingly, there is a whole genre of high school movies, TV shows, and young adult novels that – though ostensibly aimed at teens – finds its largest audience in adults. YA (young adult) novels might seem the least likely medium to win an adult audience, given the teen protagonists and the coming-of-age themes. Yet a majority of their readers are adults, according to Publishers Weekly, with ages 30-44 the single largest demographic. For at least some movies, the median age of home viewers is as high or higher – especially for now-classic offerings such as 10 Things I Hate About You, the 80s John Hughes movies, Mean Girls, Buffy the Vampire Slayer (the series, not the movie), and so on. I’m not immune to them, even if they’re not my primary fare. Why does their popularity persist into dodderhood? Because we’ve all been there (metaphorically if we’re referring to the ones with vampires), and thanks to that reminiscence bump we are still more likely to identify with the teen protagonists than with the adults, be they villains or supporting cast. I must admit to at least one exception, which must mean I truly am getting old. In Buffy I tend to see things from the perspective of the middle-aged high school librarian Giles. (My skills in magicks and the black arts are not as refined as his, alas.)

All this brings us to a YA novel that I did recently pick up: Go Ask Malice by Robert Joseph Levy. Despite having first aired 23 years ago, the TV show Buffy the Vampire Slayer continues to inspire analyses, spin-offs, and sequels in multiple media. This epistolary novel (the same format as Bram Stoker’s Dracula) is a prequel about the harsh life of the character Faith before her arrival in Sunnydale. As viewers know, the character, after a promising start, goes over to the dark side in Season 3. (The famously existentialist Joss Whedon, the series creator, was no doubt punning on the philosophy’s notion of “bad faith.”) The novel is fully consistent with the show, but that can be noticed only by readers who watched the TV series – but then, who else would want to read the book? This is not really a Young Adult novel unless the definition of YA has gotten edgier than it used to be. Elements such as Faith’s alcoholic prostitute mother, her mom’s abusive boyfriends, her own loser boyfriends, her prison inmate dad (from whom she nonetheless picks up the phrase “five by five”), her evil foster parents, and her stint in a mental hospital make this book rougher than most YA fare. There is school violence, there are encounters with Bacchae (yes, really), a false rescue by her Watcher, and the brutal event that sends Faith on the road to Sunnydale. Upshot: the novel is not bad, but once again it is only for Buffy buffs.

And you thought your high school years were rough.


The Runaways - School Days (1977)

Sunday, January 26, 2020

No BF to Existentialists


As mentioned at various times in these blogs, in order to keep my home library of old-fashioned paper-and-ink books from exceeding my shelf capacity (“add more bookshelves” is no longer a desirable option), my rule-of-thumb is to keep a book only if in principle I might re-read it. Newly finished books that I never willingly would read again even if I had boundless time are not shelved at all. As I acquire new “keepers,” marginal titles on the shelves are culled out so the total shelved number (some 2500) remains about the same. In truth, most of the remaining books will not be re-read either simply because of limited time. However, if I’ve culled properly, any one of them ought to be re-readable if plucked out at random. I test the matter with some frequency by making just such random plucks, usually about one per week.

One recent out-pluck had been sitting on my shelf un-reread since 1972 when I was in college: Beyond Freedom and Dignity by BF Skinner. Back then it inspired a 20-page paper (see pic of cover page) that I wrote for an English class. The class was specifically for honing writing skills (i.e. it wasn’t a literature or general grammar class), and one of the assignments was a roughly 20-page research paper; the topic didn’t matter, since it was to be judged on form and presentation rather than content per se. The paper (The Conversion of a Reluctant Behaviorist) was an overview of Behaviorism that mostly cited Skinner’s formal studies but also referred to Beyond Freedom and Dignity. I did not know until afterward, when she handed me back the graded paper and discussed it with me, that the professor had been a student of BF Skinner. (What were the odds on that?) Perhaps that helped on the grade despite the supposed “content doesn’t matter” standard. It probably would not have helped had I mentioned I was lying… sort of. It simplified my task (hey, I had work to do in other classes) to explain straightforwardly how I found the tenets of Behaviorism to be convincing despite my initial misgivings. Adding a “yes, but” detailing my remaining misgivings would have required more nuance, more research, and more plain old work than I really wanted to put into this paper. Yet I had reservations then and still do.


My 2020 re-read hasn’t changed my opinion much. I was baffled by the book then, and I’m still at something of a loss today. I’ll say up front that I have a lot of respect for Skinner, the research scientist. He has more than proved his case that the Behaviorist school of psychology has a lot of merit. Though best known for his animal studies (e.g. the classic Superstition in the Pigeon), he argued the results are readily applicable to humans. In many ways they prove to be so. (Apparent failures in the technique on any one human are attributed to a lack of full information on that person’s reinforcement schedules outside the lab.) He turns the usual approach to psychology on its head (pun intended) by not tending first to the mind. Change the behavior via the proper reinforcement schedule, he says, and let the psyche take care of itself. If we like a behavioral change, our general mental state is likely to improve too. The approach is not without successes.

However, Beyond Freedom and Dignity is not about treating individuals. It is about treating society, and so it is political philosophy, not “science” despite the frequency with which he uses the word to dismiss anyone who disagrees with him as unscientific. Skinner is a strict determinist who doesn’t believe in free will.

I think I need to sketch out one personal view, which informs my response to this book: in my opinion the whole discussion of determinism and free will is academic. It’s a bit like discussions of whether time is real or if it is just an illusion created by the perception of entropy: as a practical matter, tomorrow will arrive for us whether the passage of time is “real” or not, so we’d better be ready for it and we’d better pay our bills before the end of the month. As for free will, as a practical matter we have it, whatever the ultimate underlying cosmic reality might be. We have to hold people accountable for their choices, which means we have to assume people make them. We can’t ignore criminal behavior, for example, on the grounds that the criminal had no choice or culpability because the crime was already built into the structure of the universe – were that true, Jeffrey Epstein, for one, ought never have been arrested. I am not about to surrender my freedom of choice because a determinist says I don’t have any. As a practical day-to-day matter I do. (Notice that “to surrender” also would be a choice.) It is notoriously hard to define consciousness – the meta-state of not only knowing but knowing that one knows – but it is safe to say that human minds are more complex than those of pigeons. We can consciously choose to alter our behavior even if the reinforcements remain the same. It’s not always easy, as any addict will tell you, but we can do it.

Skinner argues that since there is no such thing as freedom or autonomous beings, we should chuck the whole idea of personal liberty out the window and organize society on scientific principles (aka his principles) with a structure of reinforcements that would maximize human happiness. I’m not quite sure how we could choose to do that, since by his own argument we don’t really choose anything – what we do or don’t do is already predetermined. And whose definition of happiness?

Once again, I respect the work by Skinner that actually is scientific, but I can’t help thinking that in this book he has gone seriously wrong somehow. Skinner personally might have been a kind-hearted soul who genuinely wished for human happiness, but it’s not hard to see how easily his philosophy can be coopted by less kindly authoritarians. Besides, kindly authoritarians are often the most dangerous of all.

Will Beyond Freedom and Dignity go back on the shelf? Probably. I keep a lot of books with which I disagree. I’ll let it sit for a while longer on my desk, though, while I consider it.


BF on pigeons and people 


Thursday, January 23, 2020

On Being One’s Own Chauffeur


If you’re over 50, every day is the 50th anniversary of something personal, but of course some of those days are more memorable than others. For no obvious reason, it occurred to me while on the road this morning that I got my driver’s license 50 Januaries ago. US states vary in the minimum age for a license, ranging from as low as 14 in South Dakota to as high as 17 in my own state of New Jersey. (There currently is a “probationary license” for 17-year-olds in NJ with various time and passenger restrictions, but in 1970 a license was a license was a license.) My 17th birthday was at the end of November in 1969, but because of the holidays I wasn’t able to schedule the Driver’s Test before January 1970. I passed. So, one morning during this month 50 years ago, instead of getting on the school bus or cadging a ride from mom, I slipped behind the wheel of my dad’s car (with his permission) and drove off alone for the first time. Except for the destination (school), it was a liberating experience. It was another five years before the open road tempted me to follow it to the Pacific and back, but that morning’s seven miles to school were a first taste of mobile freedom. There are other personally notable 50th anniversaries coming up this year – high school graduation, for instance – but as a rite of passage getting a driver’s license in many ways mattered to me more.

1970 Jeepster 
My mom had sold her 1967 GTO (400 cu. in., 360 hp) a couple of months earlier. I’m sure there was a connection between that decision and my upcoming license. In its place she bought a 1970 Pontiac Grand Prix, an enormous coupe with a hood large enough to subdivide into plots for single family houses. It wasn’t a vehicle I ever would ask to drive. Sometimes my dad would let me borrow his 1968 Mercedes Benz 230, which he had bought two years earlier; with only 8000 miles on it, it was a bargain sale by a musician who had just lost an argument with the IRS. Most commonly, though, for the next three years I drove the 1970 Jeepster on which (for the most part) I had learned to drive. The Jeepster was an excellent vehicle for a newbie precisely because it was difficult: a 4-speed stick shift V6 with no power steering, no power brakes, and a clutch with scarcely any slippage, so the slightest error in foot pressure would stall the engine. The ignition was twitchy, too, so I habitually parked it on an incline so I could let it roll forward, turn the key, and engage second gear: it started every time that way. Every vehicle I’ve driven since has seemed easy.

The peculiar geography of New Jersey also makes it ideal (which is to say challenging) for new drivers. I know people who grew up in cities and who consequently find dark winding country roads scary. Conversely, I know drivers from the farmlands who are overwhelmed by urban traffic. NJ is a weird mix of urban, suburban, and rural: often in the same short trip. Perhaps something about that multiple experience contributes to a characteristic driving style responsible for NJ drivers being widely regarded as second only to Massachusetts drivers as people you don’t want on the same road as you. Oddly, despite their reputations for rude aggressive driving, your chances (according to the Centers for Disease Control and Prevention) of dying in an automobile (1.3% in the USA overall over a lifetime, 10.7 per 100,000 in any one year) are lowest in Massachusetts (5.6 per 100,000) and second lowest in New Jersey (6.3 per 100,000). I guess we’re more likely to cause accidents than have them. Montana, of all places, has the highest fatality risk at a whopping 23.3 per 100,000, though Montanan drivers do score high (3rd place) on the politeness scale.

My dad had been in the passenger seat during the preceding months in 1969 when I was learning to drive – other than during formal Driver’s Ed classes at school. I don’t recall anyone else ever being there. I’m sure that took steely nerves on his part as I lurched, stalled, over-braked, and oversteered in a vehicle that could easily be rolled due to its short wheelbase. I had my own experience attempting to impart lessons from a passenger seat several years later (see The Driving Lesson), and the results were not good, so I appreciate his daring. My dad managed to contain all but a few expressions of alarm during all that time.

Millennials and iGens (aka GenZ) are a puzzle to many of my generation and of GenX in regard to licenses, as in regard to so many other things. In large numbers they, of their own accord, are delaying or forgoing licenses. Only 77% of 20-24 year-olds currently have driver’s licenses. In 1983 92% did. Among teens the drop-off is steeper yet. Most states issue licenses at age 16, but only 25% of US 16-year-olds have them. 46% did in 1983. I don’t pretend to understand this. Apparently they don’t mind being chauffeured by mom and dad, an idea that was anathema (even when unavoidable) to my generation. Parental chauffeurs were a deal-killer when dating, for one thing, but then dating is also as old-fashioned as Blockbuster. Teens today more commonly hang out in groups rather than pair off for a burger (non-vegan back then) and a movie. Parents don’t seem to mind the extended chauffeur duty either.

Driver’s licenses in the US are almost as old as automobiles, New York in 1901 being the first state to issue them. There was no exam of any kind: just a fee. Not until the 1950s did the majority of states require driving tests. (All have since 1959.) The license was something that could be revoked, however, so it did serve a law-enforcement purpose. For the first 15 years of the 20th century, two types of passenger car drivers dominated the roads, such as they were. There were the early-adopter auto enthusiasts, who were likely to have a mechanical bent. Then there were chauffeurs, who doubled as mechanics. (A great period depiction of chauffeurs is in GB Shaw’s 1905 Man and Superman.) The reason was that the vehicles were unreliable. If you didn’t have the skills to repair your car yourself when it stopped for some reason (as it very likely would), it was best to have someone along who did. Chauffeurs were expensive (as they still are when they’re not your parents), but wealthy people overwhelmingly were the customers for cars anyway. ‘Chauffeur’ originally meant ‘stoker,’ so the word contains the notion of someone who keeps the engine running; this meaning was lost as reliability improved and chauffeurs became simply drivers.

Automobile reliability more than affordability (though both were important) was key to letting ordinary people be their own drivers; they could drive to town and back alone without a serious risk of being stranded on the way. That level of reliability and affordability had been achieved by 1920, and it transformed American life. For me personally, the year I became my own chauffeur was 1970, and, strange as it may seem to iGens happy to be passengers, it was a joyful moment.


Maria Muldaur - Me and My Chauffeur Blues

Sunday, January 19, 2020

The Check Is in the Mail


My checkbooks are presently on my desk next to the computer (it’s an L desk) and the latest bills next to them are waiting to be opened. I usually pay them on weekends: most commonly on Sunday. (It’s not an unbending rule: sometimes I go wild and pay them on Tuesday.) Yes, I pay all but a few of them the old-fashioned way, not online.

Checks preceded money, as such. Standard systems of weight for silver and gold (e.g. the 17-gram Babylonian shekel) were devised very early, which facilitated their use as standards of value, but the precious metals themselves came in any shape and size – hence the need to weigh them. Gold rings were commonplace for easy carriage. So, they were money only in a broad sense. The Lydians are credited with minting the first standard precious metal coins in the 8th century BCE and the Chinese with printing the first fiat folding currency (originally leather, later paper) in the 2nd century BCE; those are “money” by even the strictest definition. In ancient Sumer thousands of years before any of that, however, individual traders exchanged clay tokens notched to indicate quantities of sheep, grain, chairs, and other valuables. Those are personal checks. In ancient Rome shipping companies doubled as banks and issued checks cashable at ports-of-call so shippers and passengers didn’t have to carry coins. Modern customer checking accounts operated by banks issuing printed checks with serial numbers (to “check” on them) appeared in 18th century England, as did central clearinghouses for banks. By the 19th century they were a commonplace means for ordinary folks to settle debts large and small. The largest single check ever written by a private entity, by the way, was $9,000,000,000 from the Bank of Tokyo-Mitsubishi UFJ Ltd to Morgan Stanley in 2008. The largest personal check was $974,790,317.77, a divorce settlement from oilman Harold Hamm to his ex in 2015.
Abe Lincoln cashed an $800 check the day before he was shot.

Once again, as indicated above, I still use checks (paper not clay, though in principle a clay check arguably still should be valid) for most financial transactions, including paying recurring monthly bills. Though electric power companies, credit card companies, phone companies, and…well… pretty much every commercial enterprise offers (nags, actually) to take its recurring payments electronically directly from my account “for your convenience,” I resist allowing that whenever possible. Sometimes it isn’t, but all my important bills are paid by paper check. I realize that is showing my age. I have millennial acquaintances with no checkbooks at all; even when they receive a check, they snap a photo of it with an iPhone and deposit it electronically. I still prefer a hands-on approach, which has the added benefit of simplifying accounting. My deductible expenses are all right there in ink in the checkbook register. There is also the secondary benefit of focusing my attention on my expenditures.

The first checking account in my own name was in 1969 at 16. It was useful for depositing checks from a summer job, and I knew I would need an account when leaving for college the following year anyway. (I no longer recall if the bank rep, a local fellow whom I knew personally, asked for any parental signatures; perhaps the regulations required an adult signature to open an account, but while I remember sitting at his desk I don’t remember being accompanied.) Prior to then I operated entirely by cash, which was normal in the day. Teens didn’t carry credit cards back then; a substantial minority of adults didn’t either. A few banks already were experimenting with debit cards and ATMs, but most Americans had never heard of them, much less seen one. No teen had access to one. The bank where my account opened was gobbled up four or five times after 1969, but I still have it in its successor – something about which I hadn’t really thought until this moment.

I did appreciate the handiness of credit cards right away, though I didn’t qualify for a major one until 1975. By then they were all but essential when traveling. Try renting a hotel room without one. I pay them off by check, however, much as that seems to annoy the issuers, judging by their constant wheedles to switch to electronic payments.

Governments – tax authorities in particular – are fond of the shift to electronic payments (cryptocurrencies excepted): not just to them but in general. It is so much easier for them to track income flows that way. Their computers can monitor all our transactions and red-flag any anomalies. Given their druthers, most governments would stop printing money altogether in favor of going all-electronic; they don’t lest truly untraceable private currencies pick up the slack. Gold retains its appeal for many for just this under-the-radar characteristic. Cryptocurrencies appeal for the same reason. I actually seriously considered mining or buying some Bitcoins a decade ago, but hesitated because I didn’t really understand the blockchain record-keeping that is the basis of their value. By the time I read enough about it to grasp it, the profits had been made. The first known purchase made with Bitcoin was 10,000 Bitcoins for a $25 pizza in 2010, which established a value of US $0.0025 per coin. Today a single Bitcoin trades at $9,102. So, a pizza’s worth of coins acquired in 2010 would be worth $91,020,000 today. I missed out on that one, but at least I didn’t buy gold. Gold’s price has barely budged since 2010; even the measly interest offered by banks over that decade provided a better return.
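For anyone inclined to double-check the pizza math, here is a minimal back-of-the-envelope sketch in Python, using only the figures quoted above (the $9,102 figure is simply the spot quote at the time of writing, not a definitive price):

    # Sanity check of the pizza-to-fortune arithmetic, using the figures quoted above.
    pizza_price_usd = 25.00      # reported cost of the 2010 pizza purchase
    btc_paid = 10_000            # coins handed over for it
    btc_price_now = 9_102.00     # price per coin quoted in February 2020

    implied_2010_price = pizza_price_usd / btc_paid      # $0.0025 per coin
    value_today = btc_paid * btc_price_now               # $91,020,000

    print(f"Implied 2010 price per coin: ${implied_2010_price:.4f}")
    print(f"Those 10,000 coins today: ${value_today:,.0f}")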

Well, those bills on the desk won’t be paid in gold, Bitcoin, or electrons. The paper checks will be in the mail. Hey, at least they’re not clay.


John Lee Hooker - I Need Some Money

Thursday, January 16, 2020

Gaining Traction


As I’ve mentioned in the past, Peter Jackson is a filmmaker whose work I admire more than like. A big exception is the World War One documentary They Shall Not Grow Old, a stunning film that I both like and admire, though the theater in which I saw it was almost empty. But even though films such as Lord of the Rings and The Hobbit are not for me, I recognize them as remarkable moviemaking. For that reason, I skipped Mortal Engines in the theater and wasn’t particularly eager to see it on DVD, but didn’t fear hating it either. This instinct proved sound. Jackson’s own involvement in this film was peripheral, but his usual team of fx engineers were at the core of its production, so calling it a Peter Jackson movie is not entirely unfair. The movie is based on the Young Adult novel series by Philip Reeve.

In a dystopian world long after the “sixty minute war” destroyed the bulk of civilization, cities on giant caterpillar tractor treads wander around the wastelands consuming surviving smaller towns for fuel and resources: a system known as Municipal Darwinism. London is a big player in this system. Traditional static villages of the Anti-Traction League exist beyond a great wall in Asia, however, and they support subversive Anti-Tractionists in the Western mobile cities: a well-worn rapacious-West vs spiritual-East trope. London bigwig Valentine (Hugo Weaving) has a secret project in St Paul’s Cathedral that may allow the city to take on the wall. Opposing him are the Anti-Tractionist Hester (who has a personal as well as political grudge), the hapless Tom (Robert Sheehan) who follows Hester like a puppy-dog, and Anna Fang (Jihae) whose warrior chops are oddly impersonal. There is also a reanimated killer cyborg named Shrike, who is the only one in the movie to show any depth or character development.

Even a fantasy film benefits from exposing the human heart and human values. Little of either beyond the shallowest is on display here. Nonetheless, the movie has glitzy fx, nicely done action sequences, protagonists that are (though not engaging) not dislikable, and a more or less coherent plot – and it just looks good, which counts for something.

The Upshot: entertaining enough to watch once – twice, not so much. Thumbs ever so slightly tilted above the horizontal.

**** ****

It seemed appropriate to follow up the movie with a book about the sort of weapon likely to be used in a “sixty-minute war.”

Revisiting South Africa's Nuclear Weapons Program: Its History, Dismantlement, and Lessons for Today by David H Albright and Andrea Stricker is a heavily documented look at the nuclear weapons program of the only country ever to develop nuclear weapons and then give them up. South Africa built 8 active devices between 1979 and 1989, the year the decision was made to dismantle them. South Africa signed the NPT (Non-Proliferation Treaty) in 1991, which was followed by an IAEA inspection regime that was highly intrusive and not specifically required by the treaty; amid the contemporaneous transition from apartheid to a broadly representative government, Pretoria (grudgingly) acceded to it in order to help rebuild the country's international standing. Accordingly, there are detailed records of nuclear-related facilities (right down to storage sheds), of the production and disposition of enriched uranium, and of the dismantling procedures.

As the book explains, the biggest obstacle to building a fission nuclear weapon is not engineering the device but obtaining the fissile material to put in it. There are two practical options, both of which require a sophisticated industrial capacity: U235 or Pu239. Weapons grade for either is usually regarded as 90% pure, though as low as 80% U235 can work at a reduced yield. The advantage of Pu239 is that plutonium can be extracted from spent fuel rods of nuclear reactors, which is why the NPT requires strict accounting of spent fuel by signatory states. Uranium deposits occur naturally, but natural uranium is more than 99.274% U238, which is not bomb material; separating out the 0.72% U235 (other isotopes make up the difference) from natural uranium is a complex and laborious process. The advantage of uranium, however, is that the weapon itself can be much simpler, e.g. a gun device that shoots one chunk of U235 into another to create a critical mass; plutonium requires a more complicated implosion mechanism to create a critical density. South Africa took the uranium route.

The book reveals how a relatively minor power with stiff sanctions against it was still able to home-grow its own nuclear program. It reveals why: in this case because the South African regime felt its existence was threatened by Soviet-backed communist forces in Angola and Mozambique. It also reveals how it is possible to undo a decision to go nuclear. Once again, there is a reason why: in this case the end of the Cold War and of the Marxist threat. All four points are definitely relevant with regard to current and aspirational nuclear states.
  
As a history, the book is more informational than engrossing, but it is enough of the former for a Thumbs Up.


Trailer Mortal Engines