
ISBN-13: 9780300273779
Hardcover

Introduction
“Making Sense of Chaos: A Better Economics for a Better World” is both a scientific manifesto and a memoir by J. Doyne Farmer, a pioneer in the fields of chaos theory and complex systems. The book presents a compelling argument for a paradigm shift in how we understand and manage the global economy. Farmer argues that standard economic theory—specifically neoclassical economics—has failed to predict major events or solve systemic problems because it relies on outdated assumptions like “equilibrium” and “perfect rationality” [1].
Instead, Farmer proposes a new framework known as Complexity Economics. He suggests viewing the economy not as a static machine seeking balance, but as a complex adaptive system, similar to a biological ecosystem or the weather. By leveraging big data and high-performance computing, Farmer posits that we can simulate the economy to make more accurate predictions and address existential challenges like financial instability and climate change [2].
Overview of the Key Points
Farmer details specific tools and models that differentiate Complexity Economics from the status quo.
Agent-Based Models (ABMs)
The core technological proposal of the book is the use of Agent-Based Models. Unlike standard models that often rely on a “representative agent” to stand in for all consumers, ABMs simulate millions of diverse individual agents (people, banks, and firms) interacting according to simple rules. These simulations allow complex, emergent behaviors, such as market crashes or booms, to arise naturally from bottom-up interactions, providing a more realistic picture of the economy [3].
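To make the mechanics concrete, here is a minimal agent-based sketch in Python. It is an illustration of the general technique, not code from the book; the two decision rules, all coefficients, and the price-impact assumption are invented for the example.

```python
import random

# Minimal agent-based market sketch (illustrative; not code from the book).
# Heterogeneous agents follow simple rules; price dynamics emerge bottom-up.

random.seed(42)
N = 1000
FUNDAMENTAL_VALUE = 100.0
prices = [100.0, 100.0]

# Assign each agent one of two simple decision rules.
agents = ["fundamentalist" if random.random() < 0.5 else "trend_follower"
          for _ in range(N)]

for t in range(200):
    p_now, p_prev = prices[-1], prices[-2]
    demand = 0.0
    for kind in agents:
        if kind == "fundamentalist":
            # Buy when the asset looks cheap, sell when it looks expensive.
            demand += 0.5 * (FUNDAMENTAL_VALUE - p_now)
        else:
            # Chase the recent trend, plus idiosyncratic noise.
            demand += 1.8 * (p_now - p_prev) + random.gauss(0.0, 0.5)
    # Price-impact assumption: average net demand moves the price.
    prices.append(p_now + demand / N)

print(f"final price after 200 steps: {prices[-1]:.2f}")
```

Even these two crude rules generate noisy boom-and-bust oscillations around the fundamental value, a pattern contained in no single agent’s rule.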
The Ecology of Strategies
Drawing from his experience in financial trading, Farmer describes financial markets as an evolutionary “ecology” rather than an efficient system. In this view, different trading strategies compete, mutate, and die out, similar to species in nature. This evolutionary perspective helps explain why markets are inherently unstable and prone to bubbles, challenging the traditional “Efficient Market Hypothesis” [4].
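One standard way to formalize such an ecology is replicator dynamics from evolutionary game theory. The sketch below is my illustration with made-up payoffs, not a model from the book: strategies whose returns beat the market average attract capital, and crowding erodes the returns of whichever strategy grows too large.

```python
# Replicator-dynamics sketch of an "ecology" of trading strategies
# (illustrative; all payoffs are invented). Each strategy's return falls
# as it gets crowded, so no single strategy takes over the whole market.

shares = {"value": 0.4, "trend": 0.4, "arbitrage": 0.2}
BASE_RETURN = {"value": 0.06, "trend": 0.08, "arbitrage": 0.05}
CROWDING = {"value": 0.05, "trend": 0.12, "arbitrage": 0.04}

for year in range(50):
    # Returns shrink as a strategy's share of capital grows (crowding).
    returns = {s: BASE_RETURN[s] - CROWDING[s] * shares[s] for s in shares}
    avg = sum(shares[s] * returns[s] for s in shares)
    # Replicator update: above-average strategies attract capital.
    shares = {s: shares[s] * (1 + returns[s] - avg) for s in shares}
    total = sum(shares.values())
    shares = {s: v / total for s, v in shares.items()}

print({s: round(v, 3) for s, v in shares.items()})
```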
The Weather Forecasting Analogy
A recurring analogy in the book is the comparison between economics and meteorology. Farmer notes that fifty years ago, weather forecasts were largely unreliable. However, through the application of physics-based models and massive data ingestion, meteorology has become a highly accurate science. He argues that economics must undergo a similar “scientific revolution,” abandoning abstract philosophy in favor of data-driven simulation [1].
Overview of the Key Themes
The book addresses several broad themes that extend beyond technical economics into policy and society.
The Failure of Standard Economics
Farmer critiques the current standard, particularly Dynamic Stochastic General Equilibrium (DSGE) models used by central banks. He argues these models failed to predict the 2008 financial crisis because they treat the financial sector as a neutral “veil” and assume the system always tends toward balance. He suggests that adhering to these flawed models blinds policymakers to systemic risks [2].
Climate Change and the Green Transition
A significant portion of the book applies complexity economics to the climate crisis. Farmer highlights Wright’s Law, which describes how costs drop predictably as production volume increases (learning curves). His models suggest that the transition to renewable energy will be significantly cheaper and faster than standard economists predict. He argues that an aggressive green transition could save the global economy trillions, rather than being a costly burden [3].
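In symbols, Wright’s Law says unit cost falls by a roughly constant fraction each time cumulative production doubles: cost(n) = cost(n0) · (n/n0)^(−b). A short sketch with illustrative numbers (not Farmer’s data):

```python
import math

# Wright's Law sketch: cost(n) = cost(n0) * (n / n0) ** (-b),
# where a 20% cost drop per doubling of cumulative production
# implies b = -log2(0.80). All numbers below are illustrative.

learning_rate = 0.20                   # cost falls 20% per doubling
b = -math.log2(1 - learning_rate)      # Wright's-Law exponent, ~0.32
c0, n0 = 100.0, 1_000                  # initial unit cost and volume

for n in [1_000, 2_000, 4_000, 8_000, 16_000]:
    cost = c0 * (n / n0) ** (-b)
    print(f"cumulative units {n:>6}: unit cost {cost:6.2f}")
```

Each doubling of the production run multiplies the unit cost by 0.8, which is why forecasts built on learning curves predict steep declines for technologies, like solar, that scale up quickly.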
Inequality and Networks
Farmer explores how the structure of economic networks naturally leads to wealth concentration. While standard models often ignore the specific web of who owes money to whom, Farmer’s network approach reveals how debt linkages can amplify inequality and systemic risk. He illustrates how network effects can trap wealth in specific clusters, exacerbating social divides [4].
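A toy exchange model hints at the mechanism. This is my illustration, not a model from the book: even symmetric random exchange concentrates wealth, and Farmer’s argument is that realistic debt networks can skew the outcome further.

```python
import random

# Toy exchange model (illustrative, not from the book): agents repeatedly
# pool and randomly re-split wealth in pairs. Even with symmetric rules,
# wealth concentrates into a skewed (exponential) distribution.

random.seed(0)
wealth = [100.0] * 500

for step in range(200_000):
    a, b = random.sample(range(len(wealth)), 2)
    pot = wealth[a] + wealth[b]
    split = random.random()          # random division of the pooled wealth
    wealth[a], wealth[b] = split * pot, (1 - split) * pot

wealth.sort(reverse=True)
top_10pct = sum(wealth[:50]) / sum(wealth)
print(f"share held by richest 10%: {top_10pct:.0%}")   # roughly a third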
Conclusion
Farmer concludes with a vision for a “Conscious Civilization.” He advocates for the creation of “Economic Observatories”—large-scale public institutions dedicated to gathering real-time economic data and running massive simulations. By simulating policy decisions—such as lockdowns or carbon taxes—before they are implemented, society can avoid disastrous unintended consequences. Ultimately, Farmer is optimistic: he believes that if economics transforms into a true science based on data and simulation, it can guide humanity through the crises of the 21st century [1].
Further Reading
- The Origin of Wealth by Eric D. Beinhocker
- Complexity and the Economy by W. Brian Arthur
- Forecast: What Physics, Meteorology, and the Natural Sciences Can Teach Us About Economics by Mark Buchanan
- The End of Theory: Financial Crises, the Failure of Economics, and the Sweep of Human Interaction by Richard Bookstaber
Sources
- [1] Publisher’s description: https://www.penguin.co.uk/books/284357/making-sense-of-chaos-by-farmer-j-doyne/9780141981208
- [2] Review: Making Sense Of Chaos: https://www.ictineducation.org/home-page/making-sense-of-chaos-a-better-economics-for-a-better-world
- [3] J. Doyne Farmer on Complexity Economics: https://jimruttshow.blubrry.net/j-doyne-farmer/
- [4] Making Sense Of Chaos Chapter Summary: https://www.bookey.app/book/making-sense-of-chaos
Quotes
Introduction
The core assumptions of mainstream economics don’t match reality, and the methods based on them don’t scale well from small problems to big problems and are unable to take full advantage of the huge advances in data and technology. For important global issues, such as climate change, standard economic theory’s wrong answers have provided a rationale for inaction, leading us to the brink of global catastrophe. Nobel Prizes have been given to economists whose theories were based on idealized arguments with no empirical support, providing fodder for neoliberal policies that led to extreme inequality and fueled sociopolitical polarization.
To be clear, this book is mostly about macroeconomics, the branch of economics that focuses on large-scale phenomena like inflation and unemployment, and finance, which is about money and investments. We will focus exclusively on economic theory, rather than economic data analysis using statistical methods. This is called econometrics in economics and machine learning in the field of artificial intelligence. (In both cases, models are formed simply by fitting functions to data – economists tend to favor linear functions, while computer scientists prefer neural nets.)
Economics is fundamentally harder than physics because, unlike planets, people can think, and their behavior can change as a result. Economic theory has to explain the behavior of agents, like households or firms, who make decisions, such as what to consume, where to work, how much to work, where to invest, what to produce, how much to produce, what price to charge and how to innovate. There is a standard template for building economic theories that incorporate agents’ ability to reason, which evolved over the last 150 years and gelled into its present form in the 1970s. This template, which I will call standard economic theory, assigns each agent a utility function describing her preferences. For example, in standard macroeconomic models, households get utility from consumption and firms get utility from making profits while minimizing risk. Under standard economic theory, each agent makes decisions that make her own utility as large as possible. To find these decisions and understand their economic consequences, economists write down and solve equations that express this in mathematical terms. This is the basis of all economic theories that are taught in textbooks, and all the theoretical models that economists use to evaluate new policies and provide guidance to the economy.
Complexity economics is completely different from standard economic theory. To begin, standard economic theory and complexity economics use very different models for how agents reason and make decisions. Standard economics usually assumes forward-looking agents who are intelligent enough to reason their way through any problem. … Complexity economics, conversely, assumes from the outset that agents are boundedly rational, meaning they make imperfect decisions and have limited ability to reason. Our models for boundedly rational agents are less prescribed – they can be based on psychology experiments, census data, expert opinions or a variety of other sources. Agents can learn to achieve goals, but they typically only partially achieve them.
The theoretical frameworks of standard economics and complexity economics are also very different. Standard economics draws heavily on mathematical methods from the theory of optimization. This is like finding the highest peak in a landscape (except that here it is an abstract landscape representing utility). Standard economists like to prove theorems about their models whenever they can. We complexity economists also use mathematics, but usually do so to understand something observed in simulations. The mathematical tools are taken from a much broader palette, using methods and concepts from fields as diverse as dynamical systems, statistical physics, ecology and evolutionary biology.
Last but not least, standard economic theory assumes from the outset that transactions only take place when supply equals demand, a state that is called equilibrium. Complexity-economics models, in contrast, do not necessarily assume equilibrium, but regard it as an emergent property when it happens.
I believe we should follow what I call the principle of verisimilitude: Models should fit the facts and their assumptions should be plausible. Assumptions that seem wrong from the outset are more likely to lead to false conclusions than plausible assumptions. We need to replace ‘as-if’ reasoning with ‘as-is’ reasoning.
The principle of verisimilitude recognizes that models need to contain the key features of the phenomena they attempt to explain, but they needn’t be literal representations of the world; models are abstractions, and we don’t need to capture every detail. When we talk about building simulations, we aren’t trying to create The Matrix. Verisimilitude just means capturing the essential components as realistically as we can. Good models should be as simple as possible, but no simpler. Agent-based models can be complicated or simple; computers easily keep track of details that are complicated to state mathematically, so we can include as many features as we need to. Most important, we can easily add new features without changing existing features, incrementally increasing the verisimilitude of our models of the world.
What is a Complex System?
The Project taught me the scientific method, drilling it deep into my soul in a way that graduate school never could. We developed the theory, designed the experiment, executed some difficult engineering and tested our hypotheses both in the lab and in the casino. It also taught me that randomness is a subjective concept that depends on one’s state of understanding. Better science and better technology can make something that seems random become predictable.
Complex-systems science is the study of emergent phenomena. These occur when the behavior of a system as a whole is qualitatively different from that of its individual parts. A good example is the brain: An individual neuron is a relatively simple device that takes in stimuli from other neurons and produces new stimuli. An individual neuron is not conscious, but somehow the 85 billion neurons in the human brain work together to produce consciousness and thought.
Digital technology enables us to create analogs of the real world inside the computer, sometimes called digital twins. Essential events in the real world have their counterparts in a simulated world. Scientists now use computer simulations to study just about everything – galaxy formation, protein-folding, the brain, epidemics, battle tactics and traffic jams.
The simulations used in complexity economics are called agent-based models. As the name emphasizes, the individual building blocks of these models – such as households, firms or governments – make decisions; they have agency.
There are many advantages to using simulations rather than mathematical equations, but one of the most important is the ability to model the diversity of the actors in the world – what economists call heterogeneity.
Perhaps the biggest advantage of complexity economics is its ability to solve hard problems. A model is tractable if it is easy to build and use to answer questions. Standard economic theory yields tractable models in simple settings, but this breaks down as things get complicated – it becomes too hard to solve the equations, and it becomes increasingly difficult to add new features to a pre-existing model. As a result, when a problem gets complicated, mainstream economists are forced to oversimplify by leaving things out.
Making realistic models requires good data. Agent-based models are naturally suited to make use of the vast quantities of data currently available. Worldwide, tens of millions of firms and billions of households make trillions of transactions annually. It would be extraordinarily useful to build a detailed map of the global economy, but, remarkably, no such map exists. If we built one, it would show us the structure of the economy in detail and enable us to track its changes over time. We are now developing agent-based economic models at the level of individual firms and households that can use such data. These models study the economy from the bottom up: Macroeconomics emerges from microeconomics.
Economists generally simply assume that prices are at equilibrium, defined as the point where supply equals demand. The idea of equilibrium originated in the latter half of the nineteenth century with the work of the French economist Léon Walras. He justified how the price could get to the intersection of supply and demand via a procedure called tatonnement (French for ‘trial and error’) that was used in the Paris stock market at the time. Because this process is cumbersome and time consuming, it is no longer used in modern stock markets. In fact, in most modern markets transactions are made at prices where supply does not equal demand, and in some markets – like housing markets – they can be very different. As we will see, one of the contributions of complexity economics (my own work included) is in understanding how prices move when markets operate out of equilibrium, and demonstrating that this can be important.
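Tâtonnement is easy to state as an algorithm: raise the price when demand exceeds supply, lower it when supply exceeds demand. The sketch below uses linear supply and demand curves chosen purely for illustration:

```python
# Tatonnement sketch: the price adjusts in proportion to excess demand
# until supply equals demand. Linear curves chosen purely for illustration.

def demand(p):          # buyers want less as the price rises
    return 100.0 - 2.0 * p

def supply(p):          # sellers offer more as the price rises
    return 20.0 + 3.0 * p

price, eta = 1.0, 0.05  # starting price and adjustment speed
for step in range(100):
    excess = demand(price) - supply(price)
    if abs(excess) < 1e-6:
        break
    price += eta * excess   # raise the price if demand exceeds supply

print(f"equilibrium price ~ {price:.3f}")   # analytic answer here: 16
```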
What is a complex system? By definition, a system is complex if it has emergent properties. I’ve already mentioned the example of the brain. We humans don’t just think – we are conscious, with a sophisticated model of ourselves, the world around us and how we fit into it. Consciousness isn’t present in the individual building blocks – it emerges from the interaction of billions of neurons. In human brains and other complex systems, emergence happens when building blocks are connected together to give rise to behavior qualitatively different from what any of the building blocks can do alone.
Complex systems like the brain or an ant colony are called adaptive complex systems. They are distinguished from ordinary complex systems with simpler emergent behaviors by the fact that their properties have evolved over time, through a process of selection.
My goal in this book is to set out a vision for how we can build models that make better economic predictions. Surprisingly, many economists do not agree on the importance of this goal. To quote two prominent Harvard economists, David Laibson and Xavier Gabaix: ‘Predictive precision is infrequently emphasized in economics research […] In this sense, economic research differs from research in the natural sciences, particularly physics. We hope that economists will close this gap. Models that make weak predictions (or no predictions) are limited in their ability to advance economic understanding of the world.’ When I emphasize this in my talks, economists often respond that the central goal of economics is to provide a conceptual framework for thinking about the world and evaluating policy choices. I wholeheartedly agree that having a conceptual framework is central, but, in my view, this only makes the quest for better predictions even more important. If we can’t make reliable predictions, then how do we know if the conceptual framework is correct?
I should be clear about what I mean by the word prediction. Niels Bohr is said to have declared that ‘prediction is very difficult, especially if it’s about the future’. This remark seems ironic; aren’t all predictions about the future? Actually, no, not at all. Consider Boyle’s Law: In 1662, Robert Boyle invented a device that allowed him to control the volume of air inside a container and showed that the air pressure is inversely proportional to the volume. In other words, if you know the volume, you can predict the pressure, and vice versa. This is scientific prediction, true at any point in time – present, past or future.
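In symbols (standard physics, not specific to the book), Boyle’s Law at fixed temperature is:

```latex
PV = k \quad\Longrightarrow\quad P = \frac{k}{V},
\qquad\text{so halving } V \text{ doubles } P .
```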
Although complex systems can be complicated, they don’t need to be complicated. The words complicated and complex are not synonyms! Complicated means that something has many moving parts, whereas complex means it exhibits emergent behavior – that is, it does things that are not easily predicted a priori.
Chaos can be simple or complicated. Simple chaos and complicated chaos lie at two ends of the same spectrum, but their behavior is quite different. In simple chaos (like the Lorenz equations), the motion can be described using only a few variables. For complicated chaos, like the weather, many factors act independently, and many variables are required to describe the motion. For instance, models for predicting the weather depend on the air temperature, pressure, humidity, cloud cover, wind speed and direction, not to mention the local topography, the height and type of the clouds, the current amount of rain, the temperature of the land, and so on. Because the list of numbers needed to describe the current state of the weather is long, we say it has many degrees of freedom.
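For reference, the Lorenz equations mentioned above form a three-variable system; in their standard form (with the classic parameters σ = 10, ρ = 28, β = 8/3):

```latex
\frac{dx}{dt} = \sigma\,(y - x), \qquad
\frac{dy}{dt} = x\,(\rho - z) - y, \qquad
\frac{dz}{dt} = xy - \beta z
```

Three degrees of freedom suffice for chaos here, whereas a weather model needs thousands.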
The economy is organized as an ecosystem of specialists. In biology, the word ecosystem refers to a collection of species who interact with and affect each other. Each is a specialist, with its own unique strategy for extracting energy from the environment in order to survive and reproduce.
Understanding this ecosystem means thinking about the economy in terms of networks, which provide a universal language describing the operations of complex systems.
Representing a system as a network begins by identifying its building blocks and their interactions: asking, ‘What are the most important nodes?’ and ‘Are there communities? If so, what are they?’ (Mathematically speaking, a community is a set of nodes that interact with each other much more strongly than they do with other nodes.) Websites provide a good example. The most important websites are not necessarily those with the most traffic or those that are linked to the largest number of other websites. Rather, they are the websites that are linked to the largest number of other important websites. Although this sounds circular, network theory provides a way to unravel this conundrum and identify the websites that are the most important.
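The standard way network theory unravels this conundrum is eigenvector centrality, the idea behind Google’s PageRank: a node’s importance is proportional to the summed importance of its neighbors, and power iteration resolves the apparent circularity. A minimal sketch on a toy four-node graph of my own construction:

```python
# Eigenvector-centrality sketch: a node's score is proportional to the
# sum of its neighbors' scores. Power iteration resolves the apparent
# circularity. The 4-node adjacency matrix below is a toy example.

adj = [
    [0, 1, 1, 1],   # node 0 links to 1, 2, 3
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],   # node 3 links only to 0
]
n = len(adj)
score = [1.0] * n

for iteration in range(100):
    new = [sum(adj[i][j] * score[j] for j in range(n)) for i in range(n)]
    norm = sum(new)
    score = [s / norm for s in new]   # normalize to keep scores bounded

print([round(s, 3) for s in score])  # node 0 comes out most central
```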
…it’s helpful to describe the way the economy produces things in network terms. Economic activity in the modern economy is highly specialized. The production network reflects the division of labor in producing goods and services. The nodes are companies that provide those goods and services, and the links are the transactions among companies or between companies and households; households can be thought of as a special kind of industry that consumes the products of the other industries and provides them with labor.
A simulation is often preceded by building a network model. You can think of the network as the skeleton, and the simulation as the living tissue filling out the skeleton and enabling the body to function. A simulation goes beyond a network model by capturing the dynamics: How do the nodes and links, and their properties, change over time? Simulation is a bottom-up approach, acting at the level of the individual building blocks, letting the collective behavior of the system emerge. By definition, emergent behavior is bottom-up, and simulation is often the only method for understanding it.
Simulation in the social sciences is more challenging than it is in the physical sciences. Atoms don’t think or make choices, while stars obey the laws of gravity, which are precise, inviolable and well understood. But there are no laws that can reliably tell us how people will behave in any possible situation. People have agency; they make decisions, which are much more difficult to predict than physical processes. There is now a huge body of work in psychology and sociology that helps us understand how people behave; nonetheless, our models for human behavior are still imprecise and controversial, and, most important, they don’t cover all aspects of economic decision-making.
Agent-based models have been very successful for problems where decision-making is reasonably simple. A good example is traffic modeling, in which drivers are the agents. In a traffic simulation, each simulated driver interacts with simulated drivers in nearby cars, speeding up and slowing down as needed. Traffic jams are a simple example of an emergent phenomenon: As soon as one driver slows down on a crowded road, those behind must slow down too. Cars begin to pile up, which causes even more slowing down, until the traffic comes to a near-halt. Agent-based models do a good job of predicting the conditions that can lead to traffic jams, and they help city planners design transportation systems.
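A classic instance of such a simulation is the Nagel–Schreckenberg model (a standard traffic ABM; the book may have a different one in mind). Three local rules per driver are enough for jams to emerge; the parameters below are illustrative:

```python
import random

# Nagel-Schreckenberg traffic model: a ring road of cells, each car with an
# integer speed. Three local rules per step are enough for jams to emerge.

random.seed(1)
ROAD, V_MAX, P_DAWDLE = 100, 5, 0.3
cars = {pos: 0 for pos in random.sample(range(ROAD), 30)}  # position -> speed

for step in range(100):
    updated = {}
    positions = sorted(cars)
    for i, pos in enumerate(positions):
        v = min(cars[pos] + 1, V_MAX)                 # 1. accelerate
        gap = (positions[(i + 1) % len(positions)] - pos - 1) % ROAD
        v = min(v, gap)                               # 2. brake to the gap
        if v > 0 and random.random() < P_DAWDLE:      # 3. random dawdling
            v -= 1
        updated[(pos + v) % ROAD] = v                 # move forward
    cars = updated

jammed = sum(1 for v in cars.values() if v == 0)
print(f"{jammed} of {len(cars)} cars are stopped")    # emergent jam
```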
Agent-based models have also become essential tools in epidemiology. Epidemic simulations track the movements and interactions of individual people, who can be healthy, infected or immune. Such models can incorporate transportation routes and flight schedules connecting cities and the locations of hospitals and schools. Simulations can investigate ‘what-if’ scenarios and, in some cases, they have accurately forecast the likely effect of vaccination protocols or other public-health measures, helping to achieve the best possible outcomes with the least effort.
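In the same bottom-up spirit, a toy susceptible-infected-recovered (SIR) agent model lets the outbreak curve emerge from individual contacts. The parameters are illustrative, not a calibrated epidemic model:

```python
import random

# Toy agent-based epidemic: each agent is susceptible, infected or recovered.
# Random daily contacts spread infection; the epidemic curve emerges.

random.seed(7)
N, CONTACTS, P_INFECT, DAYS_SICK = 1000, 8, 0.05, 7
state = ["S"] * N            # S = susceptible, I = infected, R = recovered
days_left = [0] * N
for patient_zero in random.sample(range(N), 5):
    state[patient_zero], days_left[patient_zero] = "I", DAYS_SICK

for day in range(120):
    infected = [i for i in range(N) if state[i] == "I"]
    for i in infected:
        for j in random.sample(range(N), CONTACTS):   # random daily contacts
            if state[j] == "S" and random.random() < P_INFECT:
                state[j], days_left[j] = "I", DAYS_SICK
        days_left[i] -= 1
        if days_left[i] == 0:
            state[i] = "R"                            # recover with immunity
    if not infected:
        break

print(f"total ever infected: {state.count('R') + state.count('I')} of {N}")
```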