Further Reading: Field Autopsy — Technology

Tier 1: Verified Sources

Nilsson, Nils J. The Quest for Artificial Intelligence: A History of Ideas and Achievements. Cambridge University Press, 2010. The definitive history of AI research from its origins to the early twenty-first century. Nilsson, a pioneer in the field, provides a balanced account of the debates between symbolic and connectionist approaches, including the AI winter and its aftermath.

Minsky, Marvin, and Seymour Papert. Perceptrons: An Introduction to Computational Geometry. MIT Press, 1969 (expanded edition 1988). The book widely credited with triggering the AI winter. Worth reading for what it actually proves (limitations of single-layer networks) versus what it was interpreted as proving (that neural networks in general are a dead end). The expanded 1988 edition includes a new chapter acknowledging multi-layer networks, but by then the damage had been done.

Perez, Carlota. Technological Revolutions and Financial Capital: The Dynamics of Bubbles and Golden Ages. Edward Elgar, 2002. A theory of how capital creates and sustains technology bubbles. Perez argues that the bubble-crash-deployment cycle is a recurring structural feature of technological revolutions, not an anomaly. Directly supports this chapter's analysis of capital-sustained error.

Cassidy, John. Dot.con: How America Lost Its Mind and Money in the Internet Age. Harper Perennial, 2003. The most comprehensive account of the dot-com bubble. Cassidy documents the incentive structures, narrative dynamics, and institutional failures that sustained the bubble, providing extensive evidence for the capital-sustained error framework.

Rumelhart, David E., Geoffrey E. Hinton, and Ronald J. Williams. "Learning Representations by Back-Propagating Errors." Nature 323 (1986): 533–536. The landmark paper that demonstrated backpropagation for training multi-layer neural networks, directly addressing the Perceptrons limitation. One of the most cited papers in the history of AI.

Krizhevsky, Alex, Ilya Sutskever, and Geoffrey E. Hinton. "ImageNet Classification with Deep Convolutional Neural Networks." Advances in Neural Information Processing Systems 25 (2012). The AlexNet paper — the demonstration that ended the AI winter. Won the ImageNet competition by a staggering margin and triggered the deep learning revolution.

Tier 2: Attributed Claims

The "three godfathers of deep learning" designation for Hinton, LeCun, and Bengio, and their joint receipt of the 2018 Turing Award, is well documented by the ACM.

The dot-com bubble's destruction of approximately $5 trillion in market value is widely cited in financial histories. Specific company details (Pets.com, Webvan) are documented in SEC filings, financial media, and multiple published accounts.

Musk's autonomous vehicle timeline predictions have been documented through public statements, earnings calls, and media interviews from 2015 onward. The gap between predicted and actual timelines is a matter of public record.

The crypto market's peak valuation (approximately $3 trillion in late 2021) and subsequent decline (to approximately $1 trillion by mid-2022) are documented in market data from CoinMarketCap and other tracking services.

LeCun's deployment of convolutional neural networks for check-reading at AT&T/NCR in the 1990s is documented in multiple published accounts and in LeCun's own academic publications.

Suggested Reading Order

  1. Start with Nilsson (The Quest for Artificial Intelligence) — for the full history of the AI debates
  2. Then Minsky and Papert (Perceptrons, expanded edition) — for the primary source of the AI winter
  3. Then the Rumelhart, Hinton, Williams paper (1986) — for the theoretical breakthrough
  4. Then the AlexNet paper (2012) — for the demonstration that ended the winter
  5. Then Perez (Technological Revolutions and Financial Capital) — for the theory of capital and technology bubbles
  6. Then Cassidy (Dot.con) — for the detailed anatomy of a capital-sustained error