Sunday, July 26, 2020

Bzzzz


During NJ’s all-too-short summer I spend quite a lot of time outside in the early hours of evening. This is something I did less frequently as a kid. I liked the warm summer night air back then, to be sure, but didn’t much care for the insect life. That childhood home was in a valley near lots of water; the area is called Brookside for a reason. I presently live only 5 miles (8 km) from there horizontally, but a couple hundred feet (60 meters) higher vertically. The change in elevation makes all the difference when it comes to mosquitos, which prefer the lowlands. Mosquitos might seem a minor complaint, but going inside begins to seem like a very good idea when you are constantly swatting them while telltale high-pitched buzzes around the ears promise more bites to come. While mosquitos are not entirely absent in my current location, they are rare. Bears are more likely than mosquitos to prompt a decision to go inside, and bears don’t show up very often.

There was a time, however, when even the fairly low bite rate I currently experience would have posed a notable risk. We tend to forget just how much of the US, as far north as New England, was once plagued by malaria, dengue, yellow fever, and other mosquito-borne diseases. During the 1920s, when the US population was a third of its current size, 1,200,000 Americans caught malaria. This was an improvement over the 19th century rate. (My dad contracted malaria in the 1940s – also tinnitus from the quinine treatment – but that was in India during WW2.) Some progress was made against the disease in the early 20th century, but efforts were stepped up after WW2 with the National Malaria Eradication Program. According to a CDC website, “it consisted primarily of DDT application to the interior surfaces of rural homes or entire premises in counties where malaria was reported to have been prevalent in recent years… It also included drainage, removal of mosquito breeding sites, and spraying (occasionally from aircrafts) of insecticides. Total elimination of transmission was slowly achieved. In 1949, the country was declared free of malaria as a significant public health problem.” Nature rarely lets a handy virus vector such as the mosquito go unutilized, however. Zika and West Nile viruses, among others, have secured a limited but stubborn foothold in the US in recent decades.

In his microhistory (a look at history in terms of a single component) Mosquito, author Timothy C. Winegard reminds us that globally the mosquito remains humanity’s deadliest predator, killing an average of 2,000,000 people per year since 2000. More accurately, the diseases mosquitos transmit kill people; in addition to those mentioned, these include St. Louis, Japanese, and equine encephalitides along with chikungunya and Mayaro. By comparison, the average annual toll from homicide (including war) since 2000 is 475,000. Snakes kill 50,000, dogs 25,000, crocodiles 1,000, lions 100, and sharks 10. It is common for microhistorians to overstate their cases (the “to a man with a hammer everything looks like a nail” phenomenon), and Winegard is no exception, yet there is no doubt that mosquito-borne diseases have had a profound effect on demographics, empires, war, and colonization from ancient times to the present. Winegard’s book is as good a summary as any for the general reader. (The armchair entomologists out there will be harrumphing that different species of mosquito vary in their effects on human health; this is true, and the author covers this point, too.)

Mosquitos that look identical to modern ones existed 150,000,000 years ago – one premise of Jurassic Park that wasn’t fanciful. Humans have been swatting them since the genus Homo split off from the other great apes. People learned early on that the swamps where mosquitos were thickest were not just unpleasant but unhealthy. Sumerian tablets 5,200 years old describe malarial symptoms and blame the god Nergal, depicted as a mosquito-like being. Egyptian records also describe malaria, calling it swamp fever. Herodotus in the 5th century BCE describes Egyptian mosquito nets in sleeping quarters. The Indian physician Sushruta in the 6th century BCE also describes malaria; he squarely blames mosquitos and even distinguishes five species of them: “Their bite is as painful as that of a serpent and causes diseases.” The Romans wrongly thought the disease was spread by “bad air” (which is what “malaria” means) but correctly identified swamps and stagnant water as risky. It wasn’t until the late 19th century that the germ theory of disease finally explained how mosquitos vector diseases, even though the suggestion that they somehow do had been made many times over the preceding millennia.

The mosquito was never in danger of extinction even at the height of the war against it, but nowadays more circumspect use of pesticides and the protection of wetlands are boosting its numbers again. In the 21st century malaria vaccines have been developed, in part with funding from the Gates Foundation, which committed 2 billion dollars to malaria research. So far they seem to be no more than 39% effective, declining to less than 5% after seven years, but that is at least something. A novel approach to combatting mosquito-borne illnesses is to genetically modify mosquitos so they cannot carry them. If this can be done and enough of them are bred, they will spread the trait through the general mosquito population. Even this can be no more than a stop-gap, however, since the diseases will adapt in turn. What can one say but that if the mosquitos outlived the dinosaurs, they are pretty certain to outlive all of us, too.

But not tonight. Tonight will be another warm July night in these parts and I expect to enjoy it sans any high-pitched buzzing. If I hear a growl though, I’m going inside.


Nina Simone – Funkier Than a Mosquito's Tweeter


Sunday, July 19, 2020

One Man’s Trash


Friday is garbage day on my street: the general garbage, not recycling, which is Thursday. More accurately, it is garbage day for those on my street who have contracted with the same refuse collection company as I have. As in most places, the collection trucks beat rush hour traffic (not that there really is any during the Covid restrictions) by making their rounds at or before first light. Since I don’t much like stumbling out of bed at 5 a.m. (the days are long gone when I was still up at that hour) to wheel the bin down the dark, tree-lined 200 feet (61 meters) of my driveway to the street, my preference is to take it down the previous evening. There are two problems with that plan: raccoons and bears. Neither shows up regularly, but both do often enough to be worth taking into account.


snacking in the driveway
Raccoons are pesky rather than a serious problem. They rarely are able to tip the bin. They’ll open the lid, climb inside, and toss out some trash that I’ll have to pick up later when retrieving the empty container, but nothing worse than that. A bear, however, will do whatever it damn well pleases with the bin and its contents; it is apt to knock over the container, remove the bags, spread them open in a clear spot, and nose through them for something edible. The clean-up is neither quick nor easy; it’s bad enough in my own driveway, but far worse on the street. Yet bears show up on my property no more than once or twice a month (as far as I notice), so odds are I will get away with it if I put out the trash on a Thursday night. To quote Dirty Harry, “You've got to ask yourself one question: 'Do I feel lucky?'”

My parents commented to me back in the 90s that they never put garbage curbside when they were kids in the 1930s. (They also never mowed the lawn: sheep and goats took care of that.) Urban areas had well-developed garbage removal, of course, but it was a different matter outside of town. My mom lived on a dairy farm. My dad (though his father was primarily a builder) also lived on a small farm. Trash such as old rotting lumber would be piled in an out-of-the-way corner of the property where it could decay naturally. For paper, cans, rinds, and such they, like all their neighbors, dug holes; there wasn’t as much of that kind of garbage anyway. A large item, such as a couch that was beyond all hope of repair, could be self-carted to the local dump, which might be municipal (typically free to residents) or private (very cheap). The local dump was still a thing when I was a kid. Not everything dumped there proved kind to the groundwater beneath it, however, which is why dumps were shut down starting some 50 years ago in favor of (recycling centers aside) better-engineered, clay-lined landfills and “resource recovery facilities,” as the State of NJ euphemistically calls its incinerators. The incinerators do generate electric power, so to that extent they are recovering something. Unsurprisingly, trash pick-up has become expensive, and price increases continue to far outstrip general inflation. My bill has quadrupled in the past 20 years even though the volume has gone down as the list of items allowed in the general trash stream gets ever more restrictive.

I’ve ordered a few dumpsters over the years for things not suitable for either curbside trash or recycling, such as construction debris, e.g. the old shingles from when I reroofed my barn. Dumpsters also get pricier each year, but they are still the most cost-effective way to dispose of large weights and volumes. I always find plenty to put in them, but there is not always a clear line between what is garbage and what is not. The answer is largely circumstantial, and I frequently wrestle with the question. I’m not a pack rat. I haven’t added to the net quantity of stuff in my possession in decades. The general trend of incoming versus outgoing has long been toward the latter, yet I hesitate more than I should to thin out what was left behind by my parents, whose home this previously was. My dad was a builder, so the barn and the attic over the garage are full of random items such as screens that don’t fit any windows, non-standard size doors, nails, lag bolts, joist hangers, siding, PVC drain pipes, shingles, cases of 50-year-old passage locks, and so forth… literally including a kitchen sink. With the pine trim stored over the garage I could completely trim a 4000 square foot (372 square meter) house, with the caveat that no two rooms would match; very likely not all the trim in a single room would match, and most of it would be pieced. Some of this stuff could be sold on Craigslist, it is true, but for so few dollars as scarcely to be worth it. Besides, I rather like having on hand, for example, 40 different kinds of nails up to and including 60d spikes. I do use them. I’ll just never use all of them, or even a substantial percentage. The truth is, when the time comes to sell this place (whether by me or…well…someone after me) most of the stuff will end up in a dumpster – and perhaps much of it from there to a resource recovery facility.
kitchen sink

People have been generating garbage since they became people. In a millennium or two, archaeologists (if any exist among the robots who will have replaced us) likely will find our landfills to be valuable sources of information about 21st century biological beings. Present-day archaeologists spend much of their time digging through middens, as they prefer to call old garbage dumps. Prehistoric shell middens can be found on six continents along ocean fronts and lake shores. An artificial hill (Monte Testaccio) covering 20,000 square meters in Rome is an ancient dump made of no fewer than 25,000,000 broken amphorae, the standard Greco-Roman ceramic shipping containers. Even today the amphora shards retain residues of what they once contained, thereby giving insight into Roman commerce. A huge Roman-era 70-meter-deep layered landfill with much more varied trash (including food cast-offs such as fruit pits and fish bones) was recently discovered outside the walls of Jerusalem. Yuval Gadot, who led the 2013-14 dig, commented, “It looks like there was a mechanism in place that cleared the streets, cleared the houses, using donkeys to collect and throw away the garbage.” Sounds familiar. I wonder if the donkeys came by at dawn to beat rush hour.


Garbage Truck: Written by Beck. Performed by Michael Cera, Alison Pill, Mark Webber in Scott Pilgrim vs. the World

Sunday, July 12, 2020

The Ultimate Bad Day on Wall Street


Most Americans regard acts of political violence within US borders as aberrations – occasional deeds by the fringe few who exist in all places and times. How frequent must something be before the term “aberration” is simply wrong? People will differ in their answers to that, but ideologues willing to use violence as a tool either to obtain or retain power are not always fringe or few. Among them in the late 19th and early 20th centuries were numerous anarchists (mostly anarcho-communists) who openly promoted or defended violence – not civil disobedience à la Gandhi (“the means are the ends”) or labor strikes but lethal attacks. In concert with like-minded colleagues around the world, they took the position that since the established authorities enforced their control with violence, it was legitimate to use violence in turn against them and their supporters.

Like European counterparts who blew up Czar Alexander II and assassinated French President Carnot, anarchists in the US sometimes targeted individuals. There were, as examples, the attempted assassination of Henry Frick by Alexander Berkman (Emma Goldman’s confidant), the attempt on JP Morgan Jr. (he was shot twice) at his home by Erich Muenter (a former instructor at Harvard), and the successful assassination of President McKinley by self-described anarchist Leon Czolgosz. Czolgosz explained, “I shot the President because I thought it would help the working people, and for the sake of the common people.”

Others targeted industrial and government sites with dynamite. Bombings in the 1890s and 1900s numbered in the hundreds. These acts were encouraged by people such as Johann Most, who published Science of Revolutionary Warfare: A Handbook of Instruction Regarding the Use and Manufacture of Nitroglycerine, Dynamite, Gun-Cotton, Fulminating Mercury, Bombs, Arsons, Poisons, etc. In 1908 The New York Times noted a rate of about one bomb per month in New York City. A 1910 bomb at The Los Angeles Times killed 21. A bomb at the Preparedness Day parade in San Francisco in 1916 killed 10. Police tried to connect the San Francisco bombing to Berkman, who was in town with Emma Goldman editing his “revolutionary labor weekly” titled The Blast, but were unable to do so. On May Day 1919, 30 letter bombs were sent to business and political leaders; the one casualty was Senator Hardwick’s maid, who had her hands blown off. On the 2nd of June 1919 bombs went off simultaneously in 7 Eastern cities, one of them at the home of Attorney General A. Mitchell Palmer. This was the domestic backdrop of the Palmer raids during the 1919 Red Scare that ran roughshod over Constitutional protections. The deadliest bombing of the era was yet to come.

On the morning of September 16, 1920, a horse-drawn red wagon parked in front of the headquarters of J.P. Morgan and Company at the corner of Wall and Broad Streets in lower Manhattan. The wagon was packed with dynamite surrounded by window counterweights for shrapnel. It exploded at 12:01 PM, ideally timed for the lunch crowd. The detonation killed 38 people and wounded 143 more – many horrifically – while shattering windows for blocks. To this day damage to the exterior wall of the Morgan building is visible; inside the Morgan bank there was one death (a messenger boy) and several injuries from flying glass. Nearly all the casualties on the street were ordinary people: secretaries, chauffeurs, tradesmen, salespeople, pushcart vendors, etc. A poorly coordinated investigation by the Bureau of Investigation (precursor to the FBI) and the NYPD followed numerous leads and entailed several arrests, but none of those arrested were charged in the end.

The public was so horrified that many in the socialist and anarchist press (including Eugene Debs) insisted the explosion was just an accident: the unplanned explosion of a wagon destined for a construction site. They noted that eyewitnesses previously had seen a red DuPont truck or wagon in the area. Construction dynamite is not transported packed in shrapnel, however, so this was not a plausible hypothesis on the face of it; the DuPont truck in question was located and was found to have been carrying paint. As in the case of most bombings of the era, no one claimed credit though some defended it. The case is formally unsolved to this day.

An excellent account of the era in general and the Wall Street bombing in particular is The Day Wall Street Exploded by Beverly Gage, who teaches history at Yale. She provides background on the prominent anarchists and on their varying philosophies about violence. She also introduces us to the industrialists and bankers whom they targeted and to the major figures in local, state, and national governments who targeted them. The result is a solid, well-balanced account free of modern polemics – though it recounts historical polemics from all sides.

So, who did plant the Wall Street bomb? Authorities might have come close to a solution before getting distracted by wrongheaded ideas. The 1920/21 investigation mixed fumbling forensics and misguided conspiracy theories (including a supposed Soviet connection) with some surprisingly solid police grunt work. The horse (which was shredded) was newly shod with an unusual amount of caulking. The police found the farrier who did the work; the job was an anonymous cash transaction with someone the farrier described simply as (like himself) Italian. Among the many groups and people investigated by the authorities were the Galleanisti in Paterson, NJ, followers of Luigi Galleani, publisher of the Italian language Cronaca Sovversiva (Subversive Chronicle). Galleani was deported in 1919 due to suspicion of involvement in earlier bombings (including one that killed 10 at a police station), but his followers continued to meet. They issued a flyer threatening a dynamite campaign: “And deport us! We will dynamite you!” Among the Galleanisti were Sacco and Vanzetti, who on September 11, 1920, were indicted on murder charges in connection with a robbery in Braintree, Massachusetts. (The case was circumstantial, but not as flimsy as is sometimes represented – at least as co-conspirators if not as perpetrators; a conviction might be hard to obtain today on the same evidence, but an indictment would be easy.) In 1991 historian Paul Avrich concluded that Mario Buda (Galleanist, close friend of Sacco and Vanzetti, and implicated in the June 2 bombings) was the probable bomber and that his motive was retaliation for the indictment of Sacco and Vanzetti five days earlier. A century after the event this can’t be proven, but Avrich makes a compelling case.

After 1920 anarchist violence faded, replaced to some degree by enthusiasm for the new Soviet state, which discouraged individual terrorism. (Emma Goldman, however, became an anti-communist after two years in Russia, though she remained an anarchist.) Did the shootings and bombings achieve their political goals? Did they accomplish anything? The answer is pretty clearly no. Terror can work in an out-and-out civil war (e.g. the Bolsheviks in Russia) or when wielded by the unrestrained militias of a revolutionary political party during extreme national turmoil (e.g. Mussolini’s Black Shirts), but in generally stable polities, democratic or otherwise, it only brings public sympathy for the targets and hostility toward the perpetrators. (Nietzsche: “It is only because they [monarchs] have been shot at that they once again sit securely on their thrones.”) We saw much the same response after the 1995 Oklahoma City bombing. One may hope that, despite these past failures, no one tests the tactic yet again anytime soon.



Sunday, July 5, 2020

Does a Body Good


The title, as those old enough will know, is from a 1980s promotional campaign by the dairy industry. Whether milk does in fact do more good than harm has been a matter of debate before and since.

On left, great-grandparents Wilhelm and Theresa, c. 1900
My mom grew up on a dairy farm. Her father grew up on a dairy farm. His parents were dairy farmers. Unsurprisingly, there was quite a lot of dairy in my diet when I was a kid. By contemporary standards there is quite a lot today. There is one half-gallon (1.9 liters) in the fridge at this moment. Cow’s milk, full of calcium and potassium, was regarded as health food in the 1950s. (For babies, bottle feeding of various milk formulae was regarded as “scientific” and superior.) A family of four, we had two quarts (again, 1.9 liters) delivered to the house every day for the first dozen years of my life, which was pretty normal. Milkmen arrived near dawn, dropping off full bottles and taking back the empties for steaming and reuse. Homogenization was not as effective back then – especially from the smaller local dairies – so the cream tended to separate out. Shaking the bottle to blend the cream back in evenly was a minor thrill – except the one time in our kitchen when the cap came off mid-shake and the ceiling got soaked. Deliveries continued to our house until the mid-1960s when we switched over to milk in cartons from the supermarket, though the old milk box is still by my backdoor just for nostalgia’s sake.
1952

There has always been debate about the nutritional value versus the health risks of milk. In the late 1960s, however, anti-dairy hypotheses began to get the upper hand among health professionals. They were accompanied by diatribes against other sources of animal fats and cholesterol, including eggs and red meat. Americans listened. Over the next half century per capita milk consumption in the US dropped by 37%. Low-fat milk made up a rising proportion of sales. Annual red meat consumption per capita in the same period (source: USDA) dropped by 16 pounds, and annual egg consumption dropped from its high of 374 to 250. Sales of vegetables, chicken, fish, and fruits rose. This shift to what was purportedly a healthier diet was associated with a doubling of the obesity rate and an absolute increase in average adult weight of about 22 pounds (10 kilos) for both men and women. Correlation is not causation, but the numbers do give one pause.

Milking animals for food dates to prehistoric times when humans first began domesticating animals. This might seem odd since the majority of adult humans today are lactose intolerant and nearly all humans were 10,000 years ago. Lactose, the sugar in milk, is broken down by lactase, an enzyme plentiful in infants but sparse (though not absent) in most adults. Yet not everything is as it seems. Drinking milk may cause lactose intolerant people digestive distress (gassiness, loose bowels), but it isn’t actually dangerous. They still benefit from milk’s other nutritional value. In Neolithic circumstances, when simply obtaining enough calories was a challenge, the trade-off was worth it. Lactose tolerance via persistently high lactase production in adults then became a biologically favored trait in pastoral milking cultures. Lactose tolerance evolved completely independently in northern Europe and among the cattle-herding Masai of East Africa. It is a common (though still a minority) trait in Central and South Asia, and not actually rare in East Asia. Accordingly, India, not the United States, is the largest milk producer in 2020.

Mark Kurlansky is known for his microhistories: viewing long spans of history in terms of a single component. In the past I’ve enjoyed his histories on cod, salt, and paper, so when Amazon recommended Milk!: A 10,000-Year Food Fracas I needed little convincing. The book largely met expectations. He tells of the rise of prehistoric pastoralism and the various sources of milk exploited by the ancients (and us), including milk from mares, camels, donkeys, goats, and sheep. He describes the development of more easily digestible, storable, and transportable milk products (butter, yogurt, cheeses) from Sumerian times to the present. The book is full of recipes both ancient and modern. Cato the Elder, for example, took time out from demanding the destruction of Carthage to write down his cheesecake recipe in his book De Agricultura. Kurlansky tells of the history of ice cream fountains. He discusses historical and current health debates on dairy products, the role of milk in the growth of government regulation, pasteurization versus raw milk, and the goals of animal rights activists. He ends with the economics of dairy, which favor large-scale farming; hence, since 1970, some 600,000 dairy farms have closed in the US without a drop in production. There is room for artisanal dairy, however, since some people will pay much more for a specialty product, e.g. organic milk from a particular breed of cow or goat or yak or whatever.

Perhaps I was acclimated early by that initial formula, but I for one still like the stuff even in its basic supermarket-shelf form. It just might be time to pour myself a glass.


Ella Mae Morse – Milkman, Keep Those Bottles Quiet (1943)