Author Archives: Will Thomas

War Diary of John Trump, April 1945

Sunday April 1, 1945

After mess at St. Augustin, saw Bennett Archambault. Discussed with him Sam Goudsmit’s request for counter-intelligence personnel and agreed that I would work through him in supplying men for this sort of activity. He described the organization of CIOS, of which ALSOS forms a special part, and indicated his belief that this work need not be conducted as a sort of Arkansas land rush.1 Talked with John Chase regarding termination procedures. Charlie West reported that 42nd Bomb Wing has now had over 30 operational Shoran missions, that 16 planes are equipped, and that 55 per cent of their bombs are landing within a 400 ft. radius circle, with results regarded as 5 to 10 per cent better than their Norden bombing in Italy.2 Later walked to Notre Dame, where we listened to part of the Easter service, and then continued along the Seine.

The war news continues to be favorable. The British are now 90 miles east of the Rhine. The 7th Army has effected a Rhine crossing on a 10-mile front and has taken Heidelberg without a struggle. General Eisenhower has issued instructions to the German troops, giving them procedures for surrender.

Illustration of the principles of SHORAN precision bombing techniques. Source: “Graphic Survey of Radio and Radar Equipment Used by the Army Air Forces,” July 1, 1945.

Breaking Technology’s Rules in Times of Crisis, feat. John Trump

John Trump (1907-1985)

Technology is inseparable from policy, surrounded constantly by rules governing its design and use. Sometimes these rules are codified regulations and standards, but where official rules are absent there are always procedures and customs that pattern how technology is deployed and operates. A historian of technology would describe this point in shorthand by saying “technology is always political.”

Generally, such rules are in place for good reason: they make it possible for people to coordinate their use of the technology, they make it easier for more people to use, they amplify its benefits and limit its harms. Sometimes the rules are inadequate or harmful: they make use of the technology inequitable, they fail to limit its harms, they stifle its deployment and adaptation.

Such rules have played an important role so far in the story of the critical technologies of the COVID-19 crisis. The U.S.’s lag in testing is not simply a matter of getting a late start, but of overcoming restrictions that have prevented the use of tests developed in other countries and hampered the development and production of tests by private companies. Now regulatory hurdles are being modified in the race to develop not only tests but vaccines, treatments, and new production sources for protective equipment and ventilators. The failure to alter our rules has already surely cost us lives in a time of crisis, and the argument for fast-tracking is strong.

But we cannot change our rules haphazardly. There are, for instance, calls to remove barriers and expedite data-gathering in an apparent effort to move ahead with certain drugs that President Trump has favored, reportedly due to private lobbying.

To a certain extent, relaxing rules is a matter of accepting additional risk, but it is also a matter of making governance more intensive than we can usually afford it to be. Under ordinary circumstances, we often put in place rules that are more rigid than they absolutely need to be. Some will complain that such rules don’t make sense, but the fact is we cannot practically supervise the application of more intricate and flexible rules that would achieve the same benefits. There are simply too many things that need to be governed and not enough qualified people to do the governing. But, in times of crisis, we can afford to focus expert attention on the flexible application of rules because time is of the essence and our priorities have become much narrower.

The trick is to set up an organization that is capable of such technological governance. Which brings me to Donald Trump’s uncle, John Trump, who understood this point very well.


Kenneth Arrow and Formal Modeling as a Form of Criticism, Part 1

Introduction

Kenneth Arrow died on Feb. 21 at the age of 95. I am not a scholar of Arrow’s work, per se, but inasmuch as I’ve studied him in the context of my broader work, I’ve always found him to be a thoughtful and intriguing person. My book, Rational Action, even gives him the last word.

My point, channeled through Arrow, is that the people who developed fields like operations research and decision theory and who formalized economics were not advocating an exotic, revolutionary, or naive concept of rationality and governance. Rather, they worked to understand and explicitly describe rationality as it exists in the world and to use and improve on that rationality so as to improve decision making and policy. In 1957, Arrow described building formal (i.e., mathematics and logic-based) models of decision making as striving toward a final destination that could never be reached. But, drawing on Goethe’s Faust, he regarded the very act of striving as offering a chance at intellectual salvation. (“He who ever strives, him can we save” / Wer immer strebend sich bemüht, Den können wir erlösen.)

But to what end was Arrow actually striving? I would argue that, certainly early in his career, it was not primarily toward more faithful descriptions of reality—his craft remained far distant from that destination. Rather, his paramount interest was to use models to build an improved critical understanding of cutting-edge concepts and ideas—their presuppositions and logical consequences, their possibilities, and their limits. In this, Arrow was not so different from the humanistic (literary, historical, or philosophical) critic. Yet, his methods were, of course, very different.


Martin Shubik on the Flavors of Game Theory

Shubik, left, circa 1967

Beatrice Cherrier has asked me to put together a post on Martin Shubik’s informal tripartite classification of work in game theory:

  • High church
  • Low church
  • Conversational

Shubik discussed this classification in two retrospective, reflective articles.

Neither article is exactly what you would call philosophical. They belong to the ill-organized, often repetitious genre of commentary addressing 1) the epistemological status of formal modeling, and 2) the always-dicey relationship between theory development and practical application. Beatrice, I should note, is something of a connoisseuse of this important, under-respected literature, at least as it pertains to economics. Follow her on Twitter for occasional dispatches from its labyrinths.


Warren Weaver on the Epistemology of Crude Formal Analysis: Relativistic Cosmology and the “General Theory of Air Warfare”

Willem de Sitter and Albert Einstein discuss the equations governing the dynamics of the universe.

In a pair of earlier posts I discussed mathematician Warren Weaver’s opening address at the 1947 RAND conference of social scientists, in which he suggested that all the attendees shared a devotion to the “rational life.” Weaver made it clear that what he meant by the “rational life” was not a strict rationalism, but a kind of searching, open-ended approach to analyzing questions that decision makers were compelled to answer whether they analyzed them or not.

Weaver’s interest in such problems appears to have been primarily prompted by his experience in World War II, dealing with conundrums in the design and selection of military equipment. Weaver confronted these problems, first as an overseer of research on “fire control” (gun-aiming) devices, and then as chief of an organization called the Applied Mathematics Panel. He was particularly impressed by a body of analytical techniques first developed in Britain by a statistician named L. B. C. Cunningham, and referred to as the “mathematical theory of combat” or “air warfare analysis.” In brief, Cunningham’s theory combined expressions describing the specifications of alternative weapons systems and equipment configurations, the tactics of attackers and defenders, and the vulnerability of targets, and used them to derive expectation values for victory in combat.
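Cunningham’s actual equations are not reproduced here, but the flavor of this kind of expectation-value reasoning can be conveyed with a deliberately crude toy model. Everything below, the duel setup, the hit probabilities, the round structure, is invented for illustration and is not Cunningham’s theory; it shows only how combining equipment parameters yields a comparative figure of merit rather than a prediction.

```python
def victory_probability(p_hit_attacker, p_hit_defender, rounds):
    """Toy duel model: each round the attacker fires, then the defender.
    The first hit ends the engagement. Returns the attacker's probability
    of scoring the first hit within the given number of rounds."""
    p_win = 0.0
    p_both_survive = 1.0  # probability the duel is still undecided
    for _ in range(rounds):
        # Attacker wins this round only if the duel is still undecided
        p_win += p_both_survive * p_hit_attacker
        # Duel continues only if both shooters miss this round
        p_both_survive *= (1 - p_hit_attacker) * (1 - p_hit_defender)
    return p_win

# Compare two hypothetical armaments: higher single-shot lethality
# versus a lighter weapon that allows twice as many firing passes.
print(victory_probability(0.30, 0.20, rounds=3))
print(victory_probability(0.15, 0.20, rounds=6))
```

The point of such an exercise, as the paragraph above suggests, is not that either number would match combat data, but that the comparison forces the analyst to state assumptions about lethality and tactics explicitly.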

Various pursuit curves a fighter might follow in making an attack on a bomber. The image links to a post with further context.

It is important to note that, although these expectation values might be checked against data from actual combat, they were not imagined to provide accurate predictions. Rather, they provided a means of comparing different choices of design by making explicit and interrogating previously tacit assumptions that engineers made about the virtues of their various designs. When Weaver spoke, RAND was beginning to elaborate on these methods and to apply them to the design of more complex and prospective military technologies under the new label “systems analysis” (a label that would shift significantly in meaning in subsequent years).

To clarify the intellectual value of this analytical activity, Weaver compared its epistemology to the then-nascent field of relativistic cosmology.


The Task of the B-29 Flight Engineer


The drawing to the right is a detail from a figure in my book, showing the harried flight engineer of a B-29 bomber. Click for the full image, which is the cover of an instruction book for using the “flight engineer’s computer.”1

The “computer” is the slide rule that the flight engineer is holding. It was designed by physicist Alex Green (1919–2014) of the Twentieth Air Force’s Operations Analysis Section to aid in the management of fuel consumption over the course of long flights (see a picture of the slide rule here).2

In the drawing, the cause of the flight engineer’s stress is the complexity of the slide rule. Some years ago, Green replied to an inquiry I emailed to him concerning the illustration: “I instructed the artist of our topographical unit to show the flight engineer as the most hardworking member of the flight crew and to recognize that what we wanted him to do was impossible unless we gave him some extra arms. The flight engineers computer might have been the most complex slide rule in history with some six independent variables needed for a calculation of the fuel consumptions rate.”

One should also note that using such a slide rule was a task that came atop the flight engineer’s other, decidedly non-trivial duties. YouTube has increasingly become a treasure-trove of materials that would not previously have been readily available. Among materials that have recently appeared is the following training film for B-29 flight engineers, which nicely illustrates their typical responsibilities. An additional historical curiosity: one of the narrators is Ronald Reagan!

Optimization and the Gulag: A brief tour of certain 20th-century intellectual anxieties

Optimized construction? Prisoner laborers constructing the Baltic Sea-White Sea canal

In her recent New York Times Magazine essay, “A Sucker is Optimized Every Minute,” Virginia Heffernan posits that an increasing infatuation with “optimization” in our society is leading to cultural, economic, and political harms. Her themes and some of the topics she examines are very much in this blog’s wheelhouse, so I thought it would be useful to take a look at some of the ideas in her piece.

First, I’d like to point out that, if we stand back and think about the various associations Heffernan draws, they should seem bizarre. A good example is her concluding line, “Right there in my Apple Watch: a mini Gulag, optimized just for me.” Suppose she chose a slightly different metaphor, say comparing Spotify music-selecting algorithms to Auschwitz. The obvious distastefulness of the comparison would make it immediately apparent that the former and the latter simply exist in totally different moral, intellectual, and institutional universes.

Let’s leave aside the question of why it seems to be OK to rope Soviet forced-labor camps into clever cultural critiques. The fact is it is actually perfectly possible to follow Heffernan’s argument without undue bafflement. The reason has to do with our various inheritances from intellectual history.

The Traveling Salesman Problem: How much computational efficiency do you really need?

This post is inspired by a blog post by analytics and software engineer Nathan Brixius concerning recent media interest in the Traveling Salesman Problem (TSP). The TSP, for the uninitiated, asks for the minimum-distance route that visits each of a given set of points exactly once; as the number of points increases, the number of possible routes grows so quickly that certifying a route as optimal becomes computationally formidable. Thus, the practical problem is really to find an efficient algorithm that can produce provably good solutions.
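To get a feel for why brute force is hopeless, here is a minimal sketch in Python. This is a toy illustration of exhaustive search only, not the linear-programming methods or the heuristic approach discussed in this post; it checks every tour, which is exact but grows factorially with the number of points.

```python
import itertools
import math

def route_length(points, order):
    """Total length of the closed tour visiting points in the given order."""
    return sum(
        math.dist(points[order[i]], points[order[(i + 1) % len(order)]])
        for i in range(len(order))
    )

def brute_force_tsp(points):
    """Try every permutation: exact, but O((n-1)!) work, hopeless beyond ~12 points."""
    n = len(points)
    best = min(
        itertools.permutations(range(1, n)),  # fix point 0 to avoid counting rotations
        key=lambda rest: route_length(points, (0,) + rest),
    )
    return (0,) + best, route_length(points, (0,) + best)

# Four corners of a unit square: the optimal tour is the perimeter, length 4.0
tour, length = brute_force_tsp([(0, 0), (0, 1), (1, 1), (1, 0)])
print(tour, length)
```

For 48 stops this style of search would have to examine 47! tours, which is why the real work on the TSP, from the 1954 linear-programming paper onward, has been about proving optimality without enumerating routes.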

Randal Olson’s minimum-distance road trip

Come out with guns blazing, or lay out the welcome mat?

Michigan State computer science grad student Randal Olson developed, and blogged about, an algorithm to solve the Traveling Salesman Problem for a 48-stop tour of the United States. This is almost exactly the version of the problem featured in 1954 in the first publication to use linear programming methods to address the TSP. Olson’s approach was picked up by blogs at the Washington Post and New York Times websites as an interest story. Unfortunately, Olson also suggested that guaranteeing an optimal solution is computationally impossible; for 48 stops it is actually quite easy to prove optimality.

TSP expert Bill Cook, Professor of Combinatorics and Optimization at the University of Waterloo, quickly pointed out that the true shortest route—35,940 meters shorter than Olson’s—could be easily computed on an iPhone using his Concorde TSP app. Brixius writes that, good as it is to point out OR’s extensive work on the TSP, it was important to go gently on Olson’s misstatements so that the OR profession would not come out of the episode looking bad.

And it’s here where, as a historian of science who happens to study the history of OR, I find I recognize the issue from two complementary perspectives.


What did Warren Weaver mean when he spoke of “the rational life”?

In 1947 Air Force Project RAND—then a branch of Douglas Aircraft, but soon to become the independent RAND Corporation—decided that it needed to recruit social scientists to aid it in its studies of prospective military technologies. As a step forward it held a conference of social scientists that September. The director of natural sciences at the Rockefeller Foundation, mathematician Warren Weaver, delivered the conference’s opening remarks.

The beginning of Warren Weaver's speech to open the RAND conference on social science

The characteristically jokey opening to Warren Weaver’s opening remarks to the RAND Corporation’s 1947 conference on social science. People from technical fields moonlighting in the social sciences are prominently mentioned. The president of the New Jersey Telephone Company was Chester Barnard, who would soon become president of the Rockefeller Foundation. Document source: Papers of Edward L. Bowles, Box 44, Folder 4, Library of Congress Manuscript Division.

Asking the rhetorical question of why they had assembled there, Weaver began by explaining: “I take it that every person in this room is fundamentally interested and devoted to what you can just broadly call the rational life.”

As I note in a parallel post at Ether Wave Propaganda, the remark was first quoted in journalist Fred Kaplan’s 1983 book The Wizards of Armageddon, where it is truncated and explained in such a way that it appears to augur an attempt to marshal social scientists into an effort to “impose” a rational order on military strategy and national policy. The “rational life” quote has been used by a number of other authors since Kaplan’s book appeared, and the meaning of the term has always been taken for granted. This post explores what Weaver had in mind.


Max Weber on Rationality in Social Action, in Sociological Analysis, and in Modern Life

Max Weber (1864–1920)

For my first post on Rational Action, I’d like to offer a summary of Max Weber’s classic analysis of rationality and social action in his posthumously published Economy and Society (E&S, 1922).1 This subject has not exactly wanted for attention. Weber’s discussion is unquestionably an important reference in twentieth-century thinking about rationality, and we will no doubt have ample opportunity to link back to this post in the future.

A central feature of Weber’s sociology was his belief that sociological inquiry should be grounded in the analysis of how individuals attach “meanings” to their “social actions.” For Weber, an action was social and subjectively meaningful to the actor insofar as it embodied some consideration concerning how others had acted and would act. Individuals’ social actions collectively gave rise to observed forms of social organization.

Rationality and Social Action

In Weber’s view, social actions could be classified into four types: “instrumentally rational (zweckrational),” “value-rational (wertrational),” “affectual,” and “traditional,” though he noted that this list was not necessarily “exhaustive” (E&S 1.1.2).
