Case Study 2: Information in Economics and Communication

"The most important single central fact about a free market is that no single person or combination of persons directs it." -- Milton Friedman

What directs it, instead, is information -- flowing through prices, constrained by bandwidth, corrupted by noise, and distorted by asymmetry.


Two Systems, One Architecture

This case study examines information operating in two domains that, at first glance, seem to have little in common: market economies and communication networks. One moves goods and services; the other moves signals and messages. One is studied by economists; the other by engineers. But when viewed through the lens of information theory, the two domains reveal a shared architecture: both are networks for transmitting information under constraints, and both are limited by the same fundamental laws -- channel capacity, noise, bandwidth, and the consequences of information loss.


Part I: The Economy as a Communication Network

Hayek's Information Problem

In 1945, Friedrich Hayek posed what remains the deepest question in economics: how can an economy work when the knowledge needed to make it work is scattered across millions of minds, none of which possesses more than a tiny fraction of the whole?

The question is more radical than it sounds. Consider what it takes to produce a simple consumer product -- say, a cotton shirt. Someone must know how to grow cotton in a specific climate and soil. Someone must know how to harvest and process the raw fiber. Someone must know how to spin it into thread. Someone must know how to weave thread into fabric. Someone must know how to cut and sew fabric into a shirt. Someone must know how to ship the shirt from the factory to the store. Someone must know what styles and sizes consumers want. Someone must know what price consumers will pay.

No single person knows all of this. The knowledge is distributed -- scattered across farmers in Mississippi, machine operators in Bangladesh, designers in Milan, logistics specialists in Singapore, and retail analysts in New York. A central planner who tried to coordinate all of these activities would need to collect all of this knowledge in one place, process it, and issue instructions. The informational demands would be staggering.

Hayek's insight was that the price system accomplishes this coordination without any central collection or processing of knowledge. When the price of cotton rises, every participant in the cotton supply chain receives a signal: cotton is scarcer or more in demand. The farmer plants more. The mill looks for substitutes. The shirt manufacturer considers different materials. The retailer adjusts prices. Each participant responds to the price signal using only their local knowledge -- the farmer does not need to know why cotton prices rose, only that they rose. The price compresses the relevant global information into a local signal.

The Bandwidth of Prices

Hayek's argument can be stated more precisely in Shannon's terms. The price system is a communication channel with a specific bandwidth -- a maximum rate at which it can transmit information.

A single price is a number. In a market with many goods, the price system transmits a vector of numbers -- one price for each good -- that updates continuously. The bandwidth of this system is the rate at which these numbers change and the precision with which they are communicated. A stock market, with thousands of prices updating hundreds of times per second, has high bandwidth. A village market, where a few dozen goods trade at prices that change daily or weekly, has low bandwidth.

But bandwidth is only part of the story. The information content of prices depends not just on how fast they change but on how much uncertainty they resolve. A price that stays the same for months contains little information -- nothing has changed, nothing is surprising. A price that moves sharply in response to new developments -- a crop failure, a technological breakthrough, a change in government policy -- contains a lot of information. The Shannon entropy of the price stream determines how much genuine information the market is transmitting, as distinct from how much data it is generating.
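
To make this concrete, here is a minimal sketch -- in Python, with invented data -- of how one might estimate the entropy of a discretized stream of price changes. The symbols and the two toy markets are illustrative assumptions, not market data.

    import math
    from collections import Counter

    def entropy_bits(symbols):
        """Shannon entropy (bits per symbol) of an observed symbol stream."""
        counts = Counter(symbols)
        n = len(symbols)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    # Invented daily price moves, discretized to three symbols.
    quiet_market = ["flat"] * 28 + ["up", "down"]             # barely moves
    active_market = ["up", "down", "flat", "up", "down"] * 6  # moves constantly

    print(entropy_bits(quiet_market))   # low: the stream resolves little uncertainty
    print(entropy_bits(active_market))  # higher: each observation is more surprising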

This distinction matters. Modern financial markets generate enormous volumes of price data -- millions of transactions per second. But how much of this data is information? High-frequency trading generates vast quantities of price changes, but many of these changes are noise -- artifacts of algorithmic strategies interacting with one another rather than of genuine new information entering the market. The bandwidth of the market has increased enormously. Whether the information throughput has increased proportionally is a different and much harder question.

The Soviet Information Catastrophe

The most dramatic test of Hayek's information argument came from the experience of centrally planned economies, particularly the Soviet Union.

Soviet central planning attempted to replace the price system with a bureaucratic hierarchy. The State Planning Committee (Gosplan) set production targets for every enterprise in the economy. To do this rationally, Gosplan needed to know the costs, capabilities, and constraints of every factory, farm, and mine in the country. It needed to know consumer preferences, resource availability, and technological possibilities. It needed to process all of this information and produce a coherent plan -- millions of interlocking decisions, updated regularly.

The information-theoretic impossibility of this task was identified by the economists Ludwig von Mises (in 1920) and Hayek (in 1945), before Shannon's theory provided the formal framework. The central planner faces a channel capacity problem: the amount of information that must be collected, transmitted to the center, processed, and transmitted back as instructions vastly exceeds the bandwidth of any bureaucratic communication channel. Forms must be filled out, reports must be written, instructions must be issued, compliance must be monitored. Each step introduces delay, noise, and the possibility of distortion.

The result, in practice, was exactly what information theory would predict: chronic misallocation. Factories produced goods that nobody wanted. Shortages of essential items coexisted with surpluses of useless ones. Agricultural output was planned without adequate information about local soil conditions, weather patterns, or crop diseases. The central plan was always out of date, because by the time information traveled from the periphery to the center and back, conditions had changed.

The Soviet economy did not fail because central planners were stupid or malicious. It failed because the information-processing task they had taken on exceeded the channel capacity of their institutional apparatus. The price system distributes this processing across millions of participants, each handling only local information. The central plan concentrates it in a single institution, which is overwhelmed.

Information Asymmetry in Markets -- Beyond Lemons

The main chapter introduced Akerlof's Market for Lemons. Here we explore information asymmetry in more depth, examining how it shapes three critical markets.

The Labor Market. Employers face a version of the lemons problem when hiring. Job candidates know their own abilities, work ethic, and intentions far better than any employer can determine through interviews and references. Employers respond to this information asymmetry in several ways, all of which can be understood as information-processing strategies.

Signaling. Michael Spence (who shared the Nobel Prize with Akerlof) showed that education serves partly as a signal of ability rather than a source of ability. A college degree does not necessarily make you a better worker. But obtaining a degree is costly (in time, money, and effort), and high-ability individuals can obtain it more cheaply (it requires less effort for them). This cost differential makes the degree a credible signal: the willingness to incur the cost reveals information about the candidate's type. The degree reduces information asymmetry by transmitting information about the candidate's unobservable qualities.
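
A toy calculation makes the separating logic concrete. The figures below are illustrative assumptions, not Spence's data: the degree works as a signal only if the wage premium it commands is worth the cost for high-ability candidates but not for low-ability ones.

    # Invented figures: the degree commands a wage premium of 30 for any holder;
    # it costs a high-ability candidate 20 to earn and a low-ability candidate 45.
    wage_premium = 30
    cost_high_ability = 20
    cost_low_ability = 45

    # A separating outcome requires that earning the degree pays off only for the
    # high-ability type, so holding the degree reveals the candidate's type.
    separating = cost_high_ability < wage_premium < cost_low_ability
    print(separating)  # True: only high-ability candidates acquire the degree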

Screening. Employers design hiring processes to extract information: aptitude tests, work samples, trial periods, probationary employment. Each mechanism is a channel for transmitting information about the candidate's quality from the candidate (who knows) to the employer (who does not). The effectiveness of each mechanism depends on its signal-to-noise ratio -- how much genuine information about quality it extracts relative to how much irrelevant noise it produces.

Reputation. Over time, workers build reputations. Past employers provide references. Industry networks share information. Online platforms aggregate reviews. Reputation systems are, in information-theoretic terms, distributed memory -- a way of storing and transmitting information about quality across transactions and over time, reducing the information asymmetry that makes each individual transaction risky.

The Insurance Market. Insurance is the domain where information asymmetry's consequences are most precisely understood and most extensively studied.

The fundamental challenge is this: the people who buy insurance know more about their risk than the insurance company does. Sicker people are more likely to buy health insurance. Worse drivers are more likely to buy car insurance. Property owners in flood zones are more likely to buy flood insurance. This is adverse selection -- the pool of insured people is biased toward higher risk, because higher-risk individuals have more to gain from insurance.

Insurance companies respond with their own information-processing strategies: detailed application forms (extracting information from the applicant), actuarial tables (processing statistical information about risk categories), risk-adjusted pricing (using observable characteristics as proxies for unobservable risk), and deductibles and copayments (aligning the insured person's incentives with risk reduction). Each of these mechanisms is an attempt to reduce information asymmetry -- to bring the insurer's information closer to the insured person's information.

When these mechanisms fail -- when the information gap is too large to bridge -- insurance markets can collapse entirely. This is why some risks are uninsurable: the information asymmetry is so severe that no pricing scheme can attract a balanced pool of customers. The market for lemons becomes a market for no one.
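
The unraveling can be sketched as a toy simulation. The figures are invented, and buyers are modeled crudely -- each person buys only if the premium does not exceed their own expected claims, ignoring risk aversion -- but the spiral is visible: every repricing drives out the lowest-risk remaining customers.

    # Invented population: each person's expected annual claim cost.
    expected_claims = [200, 400, 600, 800, 1000, 1200, 1400, 1600, 1800, 2000]

    premium = sum(expected_claims) / len(expected_claims)  # priced to the full pool
    for year in range(6):
        # Only people whose expected claims are at least the premium still buy.
        buyers = [c for c in expected_claims if c >= premium]
        if not buyers:
            print("market collapses: no one left to insure")
            break
        print(f"year {year}: premium={premium:.0f}, buyers={len(buyers)}")
        # The insurer reprices to the average cost of the remaining, riskier pool.
        premium = sum(buyers) / len(buyers)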

Financial Markets. The 2008 financial crisis can be understood, in part, as an information catastrophe. Mortgage-backed securities were constructed by bundling thousands of individual mortgages into complex financial instruments. The complexity of these instruments made their true risk nearly impossible to assess. The chain from the original mortgage (where the borrower and the local lender had detailed information about risk) to the final investor (who held a slice of a pool of thousands of mortgages in a structure designed by financial engineers using models with hidden assumptions) involved multiple steps, each of which degraded information.

At each step, noise increased. The local lender's detailed knowledge of the borrower's creditworthiness was reduced to a credit score. The credit score was aggregated with thousands of others into a tranche rating. The rating was assigned by agencies (Moody's, Standard & Poor's) using models that assumed housing prices would never decline nationwide simultaneously. Each compression step -- from detailed knowledge to credit score to tranche rating to bond rating -- lost information. By the time the securities reached final investors, the signal (the true risk of the underlying mortgages) had been drowned in noise (the complexity of the securitization structure, the unreliability of the rating models, the opacity of the instruments).
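
A rough sketch of the idea, with invented numbers: if each stage of the securitization chain can distinguish only a fixed number of risk categories, the bits of risk distinction that survive shrink at every step.

    import math

    # Hypothetical chain: how many distinguishable risk categories survive each stage.
    stages = {
        "local lender's file": 64,  # detailed knowledge of the borrower
        "credit score band": 8,     # compressed to a handful of bands
        "tranche rating": 3,        # senior / mezzanine / equity
    }
    for stage, categories in stages.items():
        print(f"{stage}: {math.log2(categories):.1f} bits of risk distinction")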

The crisis was, in Shannon's terms, a channel capacity failure: the financial system was attempting to transmit risk information through a channel (the securitization chain) that lacked the bandwidth to carry it accurately. The result was precisely what Shannon's theory predicts when you transmit above the channel capacity: errors, and lots of them.


Part II: Communication Networks -- Shannon's World Made Physical

From Theory to Infrastructure

Shannon's theorems were not merely theoretical insights. They were engineering blueprints. The entire infrastructure of modern communication -- cell phones, the internet, satellite communication, digital television, Wi-Fi, Bluetooth -- is designed and built using Shannon's framework.

Consider the problem that Shannon solved. In the 1940s, telephone engineers faced a practical challenge: how to transmit the maximum number of telephone conversations over a single cable. The cable had a fixed bandwidth (determined by its physical properties) and a fixed noise level (determined by interference, thermal noise, and other sources). Shannon proved that the maximum number of conversations -- the channel capacity -- was determined by a simple formula involving only the bandwidth and the signal-to-noise ratio.

This formula, known as the Shannon-Hartley theorem, is: C = B log2(1 + S/N), where C is the channel capacity in bits per second, B is the bandwidth in hertz, and S/N is the signal-to-noise ratio expressed as a ratio of powers (not in decibels). The formula says that channel capacity grows linearly with bandwidth but only logarithmically with the signal-to-noise ratio -- doubling the signal-to-noise ratio does not double the capacity. The formula sets an absolute ceiling. No communication system, no matter how cleverly designed, can exceed it.
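
A short worked example, using the classic textbook figures for a voice-grade telephone line (roughly 3 kHz of bandwidth and about 30 dB of signal-to-noise ratio, assumed here for illustration):

    import math

    def shannon_capacity_bps(bandwidth_hz, snr_linear):
        """Shannon-Hartley channel capacity in bits per second."""
        return bandwidth_hz * math.log2(1 + snr_linear)

    bandwidth = 3000            # hertz
    snr = 10 ** (30 / 10)       # 30 dB expressed as a linear power ratio (1000)

    print(shannon_capacity_bps(bandwidth, snr))      # ~29,900 bits per second
    # Doubling the signal-to-noise ratio adds only about one bit per hertz:
    print(shannon_capacity_bps(bandwidth, 2 * snr))  # ~32,900 bits per second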

Modern communication systems operate remarkably close to this ceiling. A 4G LTE cellular network, for example, achieves data rates within a few percentage points of the Shannon limit for its bandwidth and noise conditions. The entire history of communication engineering since 1948 has been, in essence, the story of approaching Shannon's limit more and more closely -- not by transcending the limit, but by developing increasingly sophisticated coding schemes that extract more and more of the available capacity.

Error-Correcting Codes: Shannon's Second Revolution

Shannon's channel coding theorem proved that reliable communication over a noisy channel is possible -- as long as you transmit below the channel capacity. But Shannon's proof was existential, not constructive: he proved that good codes must exist, but he did not show how to build them.

The search for practical codes that approach Shannon's limit drove decades of research in coding theory. The progression is instructive:

Hamming codes (1950): Richard Hamming, Shannon's colleague at Bell Labs, developed the first systematic error-correcting codes. Hamming codes can detect and correct single-bit errors by adding a small number of check bits to each message. They were a breakthrough, but they operated far below the Shannon limit.

Reed-Solomon codes (1960): These codes, widely used in CDs, DVDs, QR codes, and deep-space communication, can correct multiple errors by operating on blocks of symbols rather than individual bits. Reed-Solomon codes brought communication systems closer to the Shannon limit and proved remarkably practical -- they are the reason a scratched CD can still play music.

Turbo codes (1993): Claude Berrou and colleagues developed turbo codes, which came within a fraction of a decibel of the Shannon limit. Turbo codes use an iterative decoding process -- a feedback loop that progressively refines the estimate of the transmitted message. They were adopted for 3G and 4G cellular standards and for deep-space communication.

LDPC codes (1960/1996): Low-density parity-check codes, originally invented by Robert Gallager in 1960 and rediscovered in 1996 by David MacKay and Radford Neal, achieve performance essentially at the Shannon limit. They are used in Wi-Fi (802.11n and later), 5G cellular networks, and satellite communication.

The trajectory is remarkable: from Shannon's proof in 1948 to practical codes that approach his limit in the 1990s, the engineering community spent half a century closing the gap between what Shannon proved was possible and what could actually be built. The gap is now essentially closed. Modern communication systems extract nearly all the information capacity that physics allows.
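
To make the earliest step in this progression concrete, here is a minimal sketch of a Hamming(7,4) code: four data bits are protected by three parity bits, and a single flipped bit can be located and corrected from the parity checks alone.

    def hamming74_encode(data_bits):
        """Encode 4 data bits into a 7-bit Hamming codeword."""
        d1, d2, d3, d4 = data_bits
        p1 = d1 ^ d2 ^ d4   # parity over positions 3, 5, 7
        p2 = d1 ^ d3 ^ d4   # parity over positions 3, 6, 7
        p3 = d2 ^ d3 ^ d4   # parity over positions 5, 6, 7
        return [p1, p2, d1, p3, d2, d3, d4]

    def hamming74_correct(codeword):
        """Locate and flip a single corrupted bit, then return the 4 data bits."""
        c = list(codeword)
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
        error_position = s1 + 2 * s2 + 4 * s3   # 0 means no single-bit error found
        if error_position:
            c[error_position - 1] ^= 1
        return [c[2], c[4], c[5], c[6]]

    message = [1, 0, 1, 1]
    received = hamming74_encode(message)
    received[5] ^= 1                               # the channel flips one bit
    print(hamming74_correct(received) == message)  # True: the error is corrected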

The Internet as an Information Network

The internet is, in Shannon's terms, a network of channels -- billions of channels connecting billions of devices, each channel with its own bandwidth, noise characteristics, and capacity. The engineering of the internet is, at every level, an exercise in information theory.

At the physical layer, fiber optic cables transmit light pulses through glass fibers with extraordinarily high bandwidth and low noise. The Shannon limit of a single optical fiber is in the range of hundreds of terabits per second -- far more than any current system uses.

At the network layer, protocols like TCP/IP manage the transmission of data packets across multiple channels, routing around failures, retransmitting lost packets, and managing congestion. These protocols are, in information-theoretic terms, error-correction and flow-control mechanisms -- they ensure reliable transmission over a network of individually unreliable channels.
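
The core retransmission idea can be sketched in a few lines. This is not TCP -- it is a stripped-down stop-and-wait loop over a channel with an invented loss rate -- but it shows how reliable delivery is assembled from an unreliable channel plus acknowledgment and retransmission.

    import random

    def send_reliably(packets, loss_probability=0.3, max_attempts=10):
        """Resend each packet until it gets through (or give up)."""
        delivered = []
        for packet in packets:
            for _attempt in range(max_attempts):
                lost = random.random() < loss_probability  # the channel drops it
                if not lost:
                    delivered.append(packet)               # receiver acknowledges
                    break
            else:
                raise RuntimeError(f"gave up on {packet!r}")
        return delivered

    print(send_reliably(["pkt-1", "pkt-2", "pkt-3"]))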

At the application layer, compression algorithms (JPEG for images, MP3 for audio, H.264 for video) exploit Shannon's source coding theorem to reduce the number of bits needed to represent a message. A raw, uncompressed photograph might require 50 megabytes. JPEG compression reduces this to 2-5 megabytes by exploiting statistical redundancy in the image -- patterns that are predictable and therefore carry little information. The compression algorithm identifies and removes the low-information content, preserving only the bits that carry genuine surprise.
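
The principle is easy to demonstrate with a general-purpose lossless compressor (zlib here, rather than the lossy codecs named above): predictable, redundant data shrinks dramatically, while random data -- all surprise and no redundancy -- barely shrinks at all.

    import os
    import zlib

    redundant = b"blue sky " * 10_000   # 90,000 bytes of pure repetition
    random_bytes = os.urandom(90_000)   # 90,000 bytes of pure surprise

    print(len(zlib.compress(redundant)))     # a few hundred bytes: redundancy removed
    print(len(zlib.compress(random_bytes)))  # roughly 90,000 bytes: nothing to remove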

The entire architecture of the internet -- from the physics of the fiber to the protocols of the network to the algorithms of the applications -- is built on Shannon's foundation. The bit is not merely the unit of measurement. It is the unit of design. Every engineering decision in the communication stack is, at bottom, a decision about how to store, transmit, or process bits more efficiently.


The Synthesis: Markets and Networks

The parallel between economic networks and communication networks is not a vague analogy. It is a structural identity.

Both systems transmit information through channels with limited bandwidth. The price system transmits information about supply and demand through the channel of market transactions. The internet transmits information through the channel of electromagnetic signals in cables and airwaves.

Both systems are degraded by noise. Market prices are distorted by rumors, manipulation, irrational exuberance, and panic. Communication signals are distorted by interference, thermal noise, and crosstalk.

Both systems face channel capacity limits. The market cannot process more information than its participants can generate and interpret. The communication network cannot transmit more information than its physical channels can carry.

Both systems use error correction. Financial markets use auditing, regulation, disclosure requirements, and reputation systems to correct for informational errors. Communication systems use Hamming codes, Reed-Solomon codes, turbo codes, and LDPC codes to correct for transmission errors.

Both systems suffer from information asymmetry. In markets, sellers know more than buyers. In communication, senders may encode messages in ways that receivers cannot fully decode (encryption is the deliberate creation of information asymmetry; fraud is the malicious creation of it).

And both systems are constrained by the same fundamental law: Shannon's theorem. The capacity of any information-processing system -- whether it moves prices or packets -- is determined by its bandwidth and its signal-to-noise ratio. No amount of institutional design can make a market process more information than its structure allows. No amount of engineering can make a cable carry more bits than physics permits.

The bit is the currency of both domains. When an economist talks about "price discovery" and an engineer talks about "channel coding," they are talking about the same thing -- the transmission of information under constraints -- in different vocabularies. Shannon's framework unifies them. And this unification is not merely intellectually satisfying. It is practically powerful: insights from communication theory can illuminate market design, and insights from market economics can illuminate network architecture.

The universal currency is the bit. And it spends the same everywhere.