AI Scams Are the Point

The military’s role is thoroughly neglected in this book. AI Snake Oil does not mention the RAND Corporation, an indispensable appendage of the Cold War military-industrial complex that played a central role in hosting, funding, and connecting key pioneers of the AI research field. Early forays into simulation, modeling of human psychology, and problem solving in the field were conducted at RAND by RAND employees, alumni, and affiliated outfits. The Atlantic once called RAND “the paramilitary academy of United States strategic thinking,” owing to its penchant for fusing military and civilian personnel, assets, and ideas—in line with Dwight D. Eisenhower’s call to subordinate civilian scientific research to grand strategy. Nor does AI Snake Oil mention the Defense Advanced Research Projects Agency, or DARPA—save for one line about the role it played in funding the early internet. We get no mention of how DARPA funded the creation of major AI research labs at U.S. universities—again, as envisioned by America’s postwar plan to merge civilian and military research. One could go on like this, but the problem is clear: Though they set out to deconstruct contemporary myths about artificial intelligence, Narayanan and Kapoor present a mythology of their own, one that shows the industry in a generous light and ignores institutional features born of the field’s origins and historical development.

If you ignore the role of the military, the rise and fall and rise of neural networks is a story about hype cycles and stodgy peer reviewers and noble researchers and, more recently, greedy companies. But if you think for a second about the role of our postwar military at home and abroad, it makes sense that our artificial intelligence research keeps looping back to surveillance, data extraction, prediction, pattern matching, and social control. Given that the field came out of the Cold War’s military-industrial complex, is it any surprise that funders (namely venture capitalists and large technology firms) are designing and pursuing automated systems that prioritize such odious ends?

This depoliticized history leaves us with relatively meager proposals from Narayanan and Kapoor. Take their proposal to focus on curtailing demand for, rather than the supply of, AI snake oil. Yes, the dealers include firms eager to sell predictive AI, researchers who want flashy results, and sensationalist journalists (or public figures). Demand for AI snake oil, however, stems from “misguided incentives in the failing institutions that adopt” it.