Jack Balkin's decline and Henry Farrell's... I don't know. It's all sad.
Henry Farrell and Marion Fourcade, "The Moral Economy of High-Tech Modernism"
Algorithms—especially machine learning algorithms—have become major social institutions. To paraphrase anthropologist Mary Douglas, algorithms “do the classifying.”1 They assemble and they sort—people, events, things. They distribute material opportunities and social prestige. But do they, like all artifacts, have a particular politics?2 Technologists defend themselves against the very notion, but a lively literature in philosophy, computer science, and law belies this naive view. Arcane technical debates rage around the translation of concepts such as fairness and democracy into code. For some, it is a matter of legal exposure. For others, it is about designing regulatory rules and verifying compliance. For a third group, it is about crafting hopeful political futures.3
The questions from the social sciences are often different: How do algorithms concretely govern? How do they compare to other modes of governance, like bureaucracy or the market? How does their mediation shape moral intuitions, cultural representations, and political action? In other words, the social sciences worry not only about specific algorithmic outcomes, but also about the broad, society-wide consequences of the deployment of algorithmic regimes—systems of decision-making that rely heavily on computational processes running on large databases. These consequences are not easy to study or apprehend. This is not just because, like bureaucracies, algorithms are simultaneously rule-bound and secretive. Nor is it because, like markets, they are simultaneously empowering and manipulative. It is because they are a bit of both. Algorithms extend both the logic of hierarchy and the logic of competition. They are machines for making categories and applying them, much like traditional bureaucracy. And they are self-adjusting allocative machines, much like canonical markets.

Understanding this helps highlight both similarities and differences between the historical regime that political scientist James Scott calls “high modernism” and what we dub high-tech modernism.4 We show that bureaucracy, the typical high modernist institution, and machine learning algorithms, the quintessential high-tech modernist one, share common roots as technologies of hierarchical classification and intervention. But whereas bureaucracy reinforces human sameness and tends toward large, monopolistic (and often state-based) organizations, algorithms encourage human competition, in a process spearheaded by large, near monopolistic (and often market-based) organizations. High-tech modernism and high modernism are born from the same impulse to exert control, but are articulated in fundamentally different ways, with quite different consequences for the construction of the social and economic order.
The contradictions between these two moral economies, and their supporting institutions, generate many of the key struggles of our times.
Both bureaucracy and computation enable an important form of social power: the power to classify.5 Bureaucracy deploys filing cabinets and memorandums to organize the world and make it “legible,” in Scott’s terminology. Legibility is, in the first instance, a matter of classification. Scott explains how “high modernist” bureaucracies crafted categories and standardized processes, turning rich but ambiguous social relationships into thin but tractable information. The bureaucratic capacity to categorize, organize, and exploit this information revolutionized the state’s ability to get things done. It also led the state to reorder society in ways that reflected its categorizations and acted them out. Social, political, and even physical geographies were simplified to make them legible to public officials. Surnames were imposed to tax individuals; the streets of Paris were redesigned to facilitate control.

Yet high modernism was not just about the state. Markets, too, were standardized, as concrete goods like grain, lumber, and meat were converted into abstract qualities to be traded at scale.6 The power to categorize made and shaped markets, allowing grain buyers, for example, to create categories that advantaged them at the expense of the farmers they bought from. Businesses created their own bureaucracies to order the world, deciding who could participate in markets and how goods ought to be categorized.

We use the term high-tech modernism to refer to the body of classifying technologies based on quantitative techniques and digitized information that partly displaces, and partly is layered over, the analog processes used by high modernist organizations. Computational algorithms—especially machine learning algorithms—perform similar functions to the bureaucratic technologies that Scott describes.
Both supervised machine learning (which classifies data using a labeled training set) and unsupervised machine learning (which organizes data into self-discovered clusters) make it easier to categorize unstructured data at scale. But unlike their paper-pushing predecessors in bureaucratic institutions, the humans of high-tech modernism disappear behind an algorithmic curtain. The workings of algorithms are much less visible, even though they penetrate deeper into the social fabric than the workings of bureaucracies. The development of smart environments and the Internet of Things has made the collection and processing of information about people too comprehensive, minutely geared, inescapable, and fast-growing for considered consent and resistance.

In a basic sense, machine learning does not strip away nearly as much information as traditional high modernism. It potentially fits people into categories (“classifiers”) that are narrower—even bespoke. The movie streaming platform Netflix will slot you into one of its two thousand–plus “microcommunities” and match you to a subset of its thousands of subgenres. Your movie choices alter your position in this scheme and might in principle even alter the classificatory grid itself, creating a new category of viewer reflecting your idiosyncratic viewing practices.

Many of the crude, broad categories of nineteenth-century bureaucracies have been replaced by new, multidimensional classifications, powered by machine learning, that are often hard for human minds to grasp.7 People can find themselves grouped around particular behaviors or experiences, sometimes ephemeral, such as followers of a particular YouTuber, subprime borrowers, or fans of action movies with strong female characters. Unlike clunky high modernist categories, high-tech modernist ones can be emergent and technically dynamic, adapting to new behaviors and information as they come in.
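The supervised/unsupervised distinction the essay leans on can be sketched in a few lines of code. This is a minimal toy illustration, not anything from the essay itself: a one-dimensional nearest-neighbor classifier (supervised: the categories are handed down in advance, like bureaucratic forms) and a tiny k-means clusterer (unsupervised: the categories are emergent, discovered from the data). All data, labels, and function names here are hypothetical.

```python
def supervised_classify(point, training_set):
    """Supervised: label a new point with the label of the nearest
    example in a hand-labeled training set (1-nearest-neighbor)."""
    nearest = min(training_set, key=lambda ex: abs(ex[0] - point))
    return nearest[1]

def unsupervised_cluster(points, k=2, iterations=10):
    """Unsupervised: toy 1-D k-means. No labels are supplied; the
    algorithm invents k groupings itself."""
    centroids = points[:k]  # naive initialization from the first k points
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda i: abs(centroids[i] - p))
            clusters[j].append(p)
        # recompute each centroid as the mean of its cluster
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return clusters

# Supervised: the categories ("drama", "action") are given in advance.
training = [(1.0, "drama"), (1.2, "drama"), (8.0, "action"), (8.5, "action")]
label = supervised_classify(7.9, training)  # -> "action"

# Unsupervised: the grouping is discovered from the data alone.
viewers = [1.0, 1.1, 1.3, 8.0, 8.2, 8.4]
groups = unsupervised_cluster(viewers)  # two self-discovered clusters
```

The asymmetry the essay describes is visible even at this scale: the supervised function can only ever reproduce the label scheme it was given, while the unsupervised one produces categories nobody named in advance.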
They incorporate tacit information in ways that are sometimes spookily right, and sometimes disturbing and misguided: music-producing algorithms that imitate a particular artist’s style, language models that mimic social context, or empathic AI that supposedly grasps one’s state of mind.8 Generative AI technologies can take a prompt and generate an original picture, video, poem, or essay that seems to casual observers as though it were produced by a human being.
Taken together, these changes foster a new politics. Traditional high modernism did not just rely on standard-issue bureaucrats. It empowered a wide variety of experts to make decisions in the area of their particular specialist knowledge and authority. Now, many of these experts are embattled, as their authority is nibbled away by algorithms that their advocates claim are more accurate, more reliable, and less partial than their human predecessors. ... At the end of the day, the relationship between high modernism and high-tech modernism is a struggle between two elites: a new elite of coders, who claim to mediate the wisdom of crowds, and an older elite who based their claims to legitimacy on specialized professional, scientific, or bureaucratic knowledge.32 Both elites draw on rhetorical resources to justify their positions; neither is disinterested.
---
1 Mary Douglas, How Institutions Think (Syracuse, N.Y.: Syracuse University Press, 1986), 91.
2 Langdon Winner, “Do Artifacts Have Politics?” Dædalus 109 (1) (Winter 1980): 121–136.
3 Virginia Eubanks, “The Mythography of the ‘New’ Frontier,” MIT Communications Forum, 1999.
4 James Scott, Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed (New Haven, Conn.: Yale University Press, 1998).
5 Robyn Caplan and danah boyd, “Isomorphism through Algorithms: Institutional Dependencies in the Case of Facebook,” Big Data & Society 5 (1) (2018): 1–12.
6 William Cronon, Nature’s Metropolis: Chicago and the Great West (New York: W. W. Norton, 1991).
7 Marion Fourcade and Kieran Healy, “Seeing Like a Market,” Socio-Economic Review 15 (1) (2017): 9–29.
32 William Davies, “Elite Power Under Advanced Neoliberalism,” Theory, Culture & Society 34 (5–6) (2017): 227–250; and Jenna Burrell and Marion Fourcade, “The Society of Algorithms,” Annual Review of Sociology 47 (2021): 213–237.
Farrell has a tag; Healy does too, but his link here is specific. Same with Farrell and boyd.
Norman Geras: What is your favourite song? Henry Farrell: My Bloody Valentine, 'Soon'.
Atomization: modernist technocracy—bureaucracy not as necessity but as utopia—dumbs down both managers and managed. Connoisseurs are replaced by experts—Healy is the author of "Fuck Nuance"—it passed review.
Everything is overlap: scroll down for James C. Scott; connoisseurship, like anarchism, is a hobby for rich tenured technocrats.
What Farrell's still too stupid to see—still too much the academic tempted by libertarianism—is that government bureaucracy is authoritarianism out of the academy, of priests and shepherds, and the bureaucracy of the market is the bureaucracy of wolves. Academic libertarianism, like academic radicalism in every form, is an oxymoron: a strict hierarchic elite engaged in collaborative, soi-disant disinterested reason, calling for... something else.
In a basic sense, machine learning does not strip away nearly as much information as traditional high modernism. It potentially fits people into categories (“classifiers”) that are narrower–even bespoke.
(The irony is that while high-tech modernist firms are happy to turn the market screw on everyone else, they strive to establish monopoly for themselves).22
The footnote is to Peter Thiel in the WSJ, "Competition Is for Losers."
The irony is there, but Farrell doesn't get it.