Kenneth Arrow died on Feb. 21 at the age of 95. I am not a scholar of Arrow’s work, per se, but inasmuch as I’ve studied him in the context of my broader work, I’ve always found him to be a thoughtful and intriguing person. My book, Rational Action, even gives him the last word.
My point, channeled through Arrow, is that the people who developed fields like operations research and decision theory and who formalized economics were not advocating an exotic, revolutionary, or naive concept of rationality and governance. Rather, they worked to understand and explicitly describe rationality as it exists in the world and to use and improve on that rationality so as to improve decision making and policy. In 1957, Arrow described building formal (i.e., mathematics and logic-based) models of decision making as striving toward a final destination that could never be reached. But, drawing on Goethe’s Faust, he regarded the very act of striving as offering a chance at intellectual salvation. (“He who ever strives, him can we save” / Wer immer strebend sich bemüht, Den können wir erlösen.)
But to what end was Arrow actually striving? I would argue that, certainly early in his career, it was not primarily toward more faithful descriptions of reality—his craft remained far distant from that destination. Rather, his paramount interest was to use models to build an improved critical understanding of cutting-edge concepts and ideas—their presuppositions and logical consequences, their possibilities, and their limits. In this, Arrow was not so different from the humanistic (literary, historical, or philosophical) critic. Yet, his methods were, of course, very different.
Neither article is exactly what you would call philosophical. They belong to the ill-organized, often repetitious genre of commentary addressing 1) the epistemological status of formal modeling, and 2) the always-dicey relationship between theory development and practical application. Beatrice, I should note, is something of a connoisseuse of this important, under-respected literature, at least as it pertains to economics. Follow her on Twitter for occasional dispatches from its labyrinths.
Willem de Sitter and Albert Einstein discuss the equations governing the dynamics of the universe
In a pair of earlier posts I discussed mathematician Warren Weaver’s opening address at the 1947 RAND conference of social scientists, in which he suggested that all the attendees shared a devotion to the “rational life.” Weaver made it clear that what he meant by the “rational life” was not a strict rationalism, but a kind of searching, open-ended approach to analyzing questions that decision makers were compelled to answer whether they analyzed them or not.
Weaver’s interest in such problems appears to have been primarily prompted by his experience in World War II, dealing with conundrums in the design and selection of military equipment. Weaver confronted these problems, first as an overseer of research on “fire control” (gun-aiming) devices, and then as chief of an organization called the Applied Mathematics Panel. He was particularly impressed by a body of analytical techniques first developed in Britain by a statistician named L. B. C. Cunningham, and referred to as the “mathematical theory of combat” or “air warfare analysis.” In brief, Cunningham’s theory combined expressions describing the specifications of alternative weapons systems and equipment configurations, the tactics of attackers and defenders, and the vulnerability of targets, and used them to derive expectation values for victory in combat.
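Cunningham’s actual equations were considerably more elaborate, but the flavor of deriving combat expectation values from weapon specifications can be conveyed with a toy sketch. In the illustrative Python below (a simple binomial model of my own devising, not Cunningham’s formulas; the function name, parameter names, and all numbers are invented for illustration), a weapon configuration is summarized by rounds fired in a pass, per-round hit probability, and the number of hits needed to destroy the target:

```python
from math import comb

def kill_probability(rounds_fired, p_hit, hits_to_kill):
    """P(at least hits_to_kill hits) assuming independent shots (binomial model).

    This is an illustrative stand-in for the kind of expectation value
    Cunningham-style analysis derived from weapon and target parameters.
    """
    return sum(
        comb(rounds_fired, k) * p_hit**k * (1 - p_hit) ** (rounds_fired - k)
        for k in range(hits_to_kill, rounds_fired + 1)
    )

# Compare two hypothetical gun configurations over a single firing pass:
# many fast, inaccurate rounds vs. fewer, more accurate, heavier rounds.
config_a = kill_probability(rounds_fired=40, p_hit=0.02, hits_to_kill=3)
config_b = kill_probability(rounds_fired=10, p_hit=0.05, hits_to_kill=1)
```

The point of such a calculation, as with Cunningham’s, is not prediction but comparison: the numbers force engineers’ tacit assumptions about rate of fire, accuracy, and lethality into an explicit, interrogable form.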
Various pursuit curves a fighter might follow in making an attack on a bomber. The image links to a post with further context.
It is important to note that, although these expectation values might be checked against data from actual combat, they were not imagined to provide accurate predictions. Rather, they provided a means of comparing different choices of design by making explicit and interrogating previously tacit assumptions that engineers made about the virtues of their various designs. When Weaver spoke, RAND was beginning to elaborate on these methods and to apply them to the design of more complex and prospective military technologies under the new label “systems analysis” (a label that would shift significantly in meaning in subsequent years).
To clarify the intellectual value of this analytical activity, Weaver compared its epistemology to the then-nascent field of relativistic cosmology.
The drawing to the right is a detail from a figure from my book, showing the harried flight engineer of a B-29 bomber. Click for the full image, which is the cover of an instruction book for using the “flight engineer’s computer.”1
The “computer” is the slide rule that the flight engineer is holding. It was designed by physicist Alex Green (1919–2014) of the Twentieth Air Force’s Operations Analysis Section to aid in the management of fuel consumption over the course of long flights (see a picture of the slide rule here).2
In the drawing, the cause of the flight engineer’s stress is the complexity of the slide rule. Some years ago, Green replied to an inquiry I emailed to him concerning the illustration: “I instructed the artist of our topographical unit to show the flight engineer as the most hardworking member of the flight crew and to recognize that what we wanted him to do was impossible unless we gave him some extra arms. The flight engineers computer might have been the most complex slide rule in history with some six independent variables needed for a calculation of the fuel consumptions rate.”
One should also note that using such a slide rule was a task that came atop the flight engineer’s other, decidedly non-trivial duties. YouTube has increasingly become a treasure-trove of materials that would not previously have been readily available. Among materials that have recently appeared is the following training film for B-29 flight engineers, which nicely illustrates their typical responsibilities. An additional historical curiosity: one of the narrators is Ronald Reagan!
Optimized construction? Prison laborers constructing the Baltic Sea-White Sea canal
In her recent New York Times Magazine essay, “A Sucker Is Optimized Every Minute,” Virginia Heffernan posits that an increasing infatuation with “optimization” in our society is leading to cultural, economic, and political harms. Her themes and some of the topics she examines are very much in this blog’s wheelhouse, so I thought it would be useful to take a look at some of the ideas in her piece. First, I’d like to point out that, if we stand back and think about the various associations Heffernan draws, they should seem bizarre. A good example is her concluding line, “Right there in my Apple Watch: a mini Gulag, optimized just for me.” Suppose she had chosen a slightly different metaphor, say comparing Spotify’s music-selecting algorithms to Auschwitz. The obvious distastefulness of the comparison would make it immediately apparent that the former and the latter simply exist in totally different moral, intellectual, and institutional universes. Let’s leave aside the question of why it seems to be OK to rope Soviet forced-labor camps into clever cultural critiques. The fact is that it is actually perfectly possible to follow Heffernan’s argument without undue bafflement. The reason has to do with our various inheritances from intellectual history.
This post is inspired by a blog post by analytics and software engineer Nathan Brixius concerning recent media interest in the Traveling Salesman Problem (TSP). The TSP, for the uninitiated, asks for the shortest closed route that visits each of a given set of points exactly once; as the number of points increases, certifying that a candidate route is in fact the shortest becomes computationally formidable. Thus, the problem is really to find efficient algorithms for finding such routes.
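To make the computational issue concrete, here is a minimal brute-force sketch in Python (the five coordinates are made up purely for illustration). It finds the provably shortest tour by checking every permutation, which is fine for a handful of stops but, since the number of permutations grows factorially, becomes hopeless long before 48 stops:

```python
from itertools import permutations
from math import dist

# Hypothetical planar "stops" (illustrative coordinates only).
points = [(0, 0), (3, 0), (3, 4), (0, 4), (1, 2)]

def tour_length(order):
    """Total distance of the closed tour visiting points in the given order."""
    return sum(
        dist(points[order[i]], points[order[(i + 1) % len(order)]])
        for i in range(len(order))
    )

def brute_force_tsp(points):
    """Exact but O(n!): try every tour starting from point 0."""
    best_rest = min(
        permutations(range(1, len(points))),
        key=lambda rest: tour_length((0,) + rest),
    )
    best_order = (0,) + best_rest
    return best_order, tour_length(best_order)

order, length = brute_force_tsp(points)
```

Methods like the linear-programming approach discussed below avoid this exhaustive search, which is what makes proving optimality for a 48-stop tour tractable.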
Randal Olson’s minimum-distance road trip
Come out with guns blazing, or lay out the welcome mat?
Michigan State computer science grad student Randal Olson developed, and blogged about, an algorithm to solve the Traveling Salesman Problem for a 48-stop tour of the United States. This is nearly the same version of the problem featured in 1954 in the first publication to use linear programming methods to address the TSP. Olson’s approach was picked up by blogs at the Washington Post and New York Times websites as an interest story. Unfortunately, Olson also suggested that guaranteeing an optimal solution is computationally impossible—for 48 stops it is actually very simple to prove optimality.
TSP expert Bill Cook, Professor of Combinatorics and Optimization at the University of Waterloo, quickly pointed out that the true shortest route—35,940 meters shorter than Olson’s—could be easily computed on an iPhone using his Concorde TSP app. Brixius writes that, good as it is to point out OR’s extensive work on the TSP, it was important to go gently on Olson’s misstatements so that the OR profession would not come out of the episode looking bad.
And it’s here where, as a historian of science who happens to study the history of OR, I find I recognize the issue from two complementary perspectives.
In 1947 Air Force Project RAND—then a branch of Douglas Aircraft, but soon to become the independent RAND Corporation—decided that it needed to recruit social scientists to aid it in its studies of prospective military technologies. As a first step, it held a conference of social scientists that September. The director of natural sciences at the Rockefeller Foundation, mathematician Warren Weaver, delivered the conference’s opening remarks.
The characteristically jokey opening to Warren Weaver’s opening remarks to the RAND Corporation’s 1947 conference on social science. People from technical fields moonlighting in the social sciences are prominently mentioned. The president of the New Jersey Telephone Company was Chester Barnard, who would soon become president of the Rockefeller Foundation. Document source: Papers of Edward L. Bowles, Box 44, Folder 4, Library of Congress Manuscript Division.
Asking the rhetorical question of why they had assembled there, Weaver began by explaining: “I take it that every person in this room is fundamentally interested and devoted to what you can just broadly call the rational life.”
As I note in a parallel post at Ether Wave Propaganda, the remark was first quoted in journalist Fred Kaplan’s 1983 book The Wizards of Armageddon, where it is truncated and explained in such a way that it appears to augur an attempt to marshal social scientists to “impose” a rational order on military strategy and national policy. The “rational life” quote has been used by a number of other authors since Kaplan’s book appeared, and the meaning of the term has always been taken for granted. This post explores what Weaver had in mind.
For my first post on Rational Action, I’d like to offer a summary of Max Weber’s classic analysis of rationality and social action in his posthumously published Economy and Society (E&S, 1922).1 This subject has not exactly wanted for attention. Weber’s discussion is unquestionably an important reference in twentieth-century thinking about rationality, and we will no doubt have ample opportunity to link back to this post in the future.
A central feature of Weber’s sociology was his belief that sociological inquiry should be grounded in the analysis of how individuals attach “meanings” to their “social actions.” For Weber, an action was social and subjectively meaningful to the actor insofar as it embodied some consideration concerning how others had acted and would act. Individuals’ social actions collectively gave rise to observed forms of social organization.
Rationality and Social Action
In Weber’s view, social actions could be classified into four types: “instrumentally rational (zweckrational),” “value-rational (wertrational),” “affectual,” and “traditional,” though he noted that this list was not necessarily “exhaustive” (E&S 1.1.2).