Net Neutrality and Information Inequality

October 1, 2024

This piece originally appeared in National Affairs.

Long before Andrew Carnegie became a titan of industry, he worked as a messenger boy in the Pittsburgh office of the Ohio Telegraph Company. In his autobiography, Carnegie recalled how he made $2.50 a week carrying messages back and forth between paying customers and telegraph operators. He was fond of that time — and of the perks that came with the job.

"There were wholesale fruit stores, where a pocketful of apples was sometimes to be had for the prompt delivery of a message," Carnegie wrote. "One great excitement of this life was the extra charge of ten cents [that] we were permitted to collect for messages delivered beyond a certain limit. These 'dime messages,' as might be expected, were anxiously watched, and quarrels arose among us as to the right of delivery."

The phenomenon Carnegie was observing is not unique to the telegraph: When it comes to communications, people have always been willing to pay a premium for priority services. Even today, people pay extra for Priority Mail Express — the fastest, most expensive service the U.S. Postal Service offers. They do so based on the tacit premise that not all information is equal.

Over the last 20 years, however, this conventional assumption has come under scrutiny. Increasingly, scholars have applied venerable ideas about the equality of man to information concerning men — and, by extension, the rules by which that information is transmitted. If all information is equal, as these experts contend, then all information must be treated equally.

As this theory has filtered out of the ivory tower, legal and political advocates have seized on it to build a framework for the equitable transmission of information. Dubbed "network neutrality" (or simply "net neutrality") by its proponents, this blend of utility-style common-carriage regulations and novel rules for the internet era has sparked a fierce debate among policymakers.

In most of these discussions, all information is presumed to be of equal value. But this assumption is false. As the history of telecommunications demonstrates, information is inherently unequal, and attempting to treat all information equally is likely to result in less innovation, less competition, and lower-quality services across the board. To maximize the economic, technological, and societal benefits of modern telecommunications networks, policymakers must recognize this fact and allow information to be treated unequally.

Dumb Pipes, Smart Operators

Efficiency often requires inequality. Some communications professionals today discount this principle, but those who designed earlier communications systems did not. Caesar Augustus, for example, built the most successful communications system in the ancient world in part by acknowledging the trade-off between equality and efficiency.

Known as the cursus publicus, the system was primarily intended for official use, providing rapid and reliable communication services for the government, military, and nobility across the empire. After it became apparent that not all messages were equally urgent, the Romans divided the cursus publicus into a fast lane and a slow lane: the cursus velox and the cursus clabularis. The cursus velox was further divided by the swiftness of the vehicle: horses, which were the quickest, followed by slower mules and still slower oxen. This split met the empire's needs for both speed and capacity in communications, contributing significantly to the administrative efficiency that characterized imperial Roman governance.

More modern forms of communication demonstrate this same principle. Take, for instance, the telegraph. Telegraph wires are a classic example of "dumb pipes" — components of a network that transmit all information the same way. There is no way to prioritize message delivery across the wires themselves, no cursus velox for sending urgent official business; each message must "wait in line," as it were, before being transmitted. The primary bottleneck for telegraph systems thus manifests not over the wires, but among telegraph operators — the firms and individuals who decide which messages take precedence over others. This is where prioritization protocols come into play.

In the telegraph's heyday, people sitting in telegraph offices did most of the prioritizing. Telegraph operators adopted extensive protocols, or rules, regarding how to process messaging traffic. An 1866 employee handbook for the Western Union Telegraph Company, for example, notes that railroad and intracompany communications should always be given priority; that free messages of an "unofficial character" were to be limited and approved by a manager; that when an error was made, a message could only be delayed for correction if the delay did not affect the message's value; and that theaters, shows, and concerts always had to pay full freight. These protocols provided the operator's employees with a guide for deciding whether to prioritize a given message. The operators thus acted as checks on the system, managing traffic and preventing customers from misusing the telegraph services by claiming priority for non-urgent messages.

Like the Romans before them, telegraph operators succeeded because they understood that not all information is equal, and they adopted prioritization mechanisms reflecting this understanding. As a result, they were able to send great volumes of messages quickly and efficiently.

A second prioritization mechanism was also present in telegraph systems: the market. Carnegie's anecdote at the outset of this essay is an excellent example of market-based prioritization, both formal and informal: Informally, a messenger might receive apples as a reward for prompt delivery; more formally, customers could pay an extra dime to have their messages delivered beyond a certain limit.

Indeed, the history of the telegraph is replete with payments and dealmaking in exchange for priority services. In the 1840s, when inventor William Cooke sought to expand England's first telegraph beyond its original 13-mile test line, he cut a deal with the Great Western Railway to fund the project in exchange for free, priority use of the line. Similarly, the Atlantic Telegraph Company promised the British and American governments that it would carry official messages between the two for free if the countries provided money and ships to lay the first transatlantic cable. And in the late 1860s, Wall Street speculators were more than happy to pay for the privilege of priority services when the Gold Indicator Company began to provide a constant stream of information on stock and commodity prices using the newly invented stock ticker.

As telegraph operators recognized, paid prioritization is not inherently evil or undemocratic: It is simply a way of using market checks to efficiently manage network traffic.

Military communications in the mid-20th century offer an example of what happens without such checks. During the Vietnam War, when receiving up-to-the-second information about operations in the field was paramount, the U.S. Army Signal Corps developed a system for prioritizing messages sent over a combination of radio and telephone networks. Soldiers issuing communications from the field would label each message with a code identifying its priority level. The highest level, "Flash," indicated that the message was of the utmost importance and had to be transmitted immediately.

In a 1972 retrospective report, the U.S. Army outlined the flaws of the Flash system:

One of the most significant problems encountered in message switching in Vietnam was the abnormal amount of traffic using high-precedence indicators. The rapid growth in traffic resulted in completely distorted precedence distribution, in which up to 50 percent of all traffic was classified as "Immediate" or "Flash." The Joint Chiefs of Staff found it necessary to adopt a "Superflash" category in order to make sure that all the real "Flash" action was properly disseminated. General Van Harlingen, 1st Signal Brigade commander from mid-1967 until February 1969, noted in his final debriefing report that, as a result of the burgeoning of message traffic, there was a slowdown in the delivery of messages; and a situation had developed in which there was constant danger of losing messages...one of the most grievous sins in the communications-electronics community.

The Army Signal Corps had devised what appeared to be a reasonable method of prioritizing communications. But without centralized protocols or a market mechanism to check against misuse, everyone insisted that his message was more urgent than the others, generating confusion and backlogs. In the end, the overall quality of the network deteriorated. Had the United States not pulled out of Vietnam in 1973, the Joint Chiefs might have needed to create another designation — perhaps "Super-duper Flash" — once soldiers on the ground learned that Superflash was the new top designation.

When the number of messages is low and bandwidth is plentiful, it's easy to run an egalitarian network. But as use of a network increases and diversifies, network operators are forced to prioritize certain messages over others in order to carry out their services efficiently. And as with other limited resources, one of the best ways to efficiently and effectively distribute bandwidth is by granting the laws of supply and demand the freedom to work unabated.

Of course, the Army couldn't make troops in the field pay for prioritized treatment. But the point remains: Without some sort of restriction on traffic, all senders will insist that their messages take precedence. Thus, effective networks require checks against congestion and exploitation.
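
This dynamic can be illustrated with a toy model. The short Python sketch below (using made-up message volumes and an invented one-in-a-hundred share of genuinely urgent traffic) simulates a queue in which flagged messages jump ahead of everything else: when only a few senders use the flag, urgent traffic clears almost immediately; when half of all traffic is flagged, the label loses its meaning and genuinely urgent messages wait behind thousands of others.

```python
import random

def average_flash_delay(flash_fraction, n_messages=10_000, seed=0):
    """Toy model: every message marked "Flash" jumps ahead of unmarked traffic.

    Assumes (arbitrarily) that 1 in 100 messages is genuinely urgent, while
    senders mark `flash_fraction` of all traffic as Flash. Returns the average
    number of messages a genuinely urgent message waits behind.
    """
    rng = random.Random(seed)
    queue = []  # (is_flash, is_truly_urgent), in arrival order
    for _ in range(n_messages):
        truly_urgent = rng.random() < 0.01
        is_flash = truly_urgent or rng.random() < flash_fraction
        queue.append((is_flash, truly_urgent))

    # Service order: all Flash traffic first (in arrival order), then the rest.
    service_order = [m for m in queue if m[0]] + [m for m in queue if not m[0]]
    positions = [i for i, m in enumerate(service_order) if m[1]]
    return sum(positions) / len(positions)

for fraction in (0.01, 0.10, 0.50):
    print(f"{fraction:4.0%} of traffic marked Flash -> urgent messages wait behind "
          f"~{average_flash_delay(fraction):,.0f} others on average")
```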

The Inequality of Packets

As telecommunications networks have expanded and evolved, the ways in which traffic is prioritized have changed in kind. In the days of the telegraph and the earliest telephones, people sitting at switchboards did most of the prioritizing. These individuals were soon replaced with electromechanical switches and, ultimately, digital switching. The latter is a prerequisite for a uniquely powerful communication tool of the modern age: the internet.

When it comes to prioritizing traffic over the internet, the processes and protocols have become increasingly complex. To understand the importance of treating information unequally in modern telecommunications networks, it is important to have at least a rudimentary understanding of how the internet works.

At its core, the internet is a network of networks that communicate with each other using standardized protocols. The components involved are often referred to collectively as the "technology stack." At the "top" or "edge" of the stack sit the user-facing platforms and interfaces that most people interact with on a daily basis. At the networking layer, routers, switches, and load balancers apply various protocols to facilitate data transmission within the stack as well as to the broader internet. The middle layers of the stack can be thought of as the internal plumbing for the various platforms and their interactions with one another: They are composed of computing resources and services (data centers, cloud infrastructure, web servers, etc.) that process data. Finally, at the bottom of the stack sits the physical infrastructure — the fiber-optic cables and cellular-network towers — that carries the bits and bytes exchanged by all of the layers above.

To illustrate this process, imagine a user wishes to stream a movie on a smartphone. This interaction begins at the user-facing edge layer, where the streaming app on the phone serves as the interface. The app sends a request for data through the networking layer, where protocols and routers facilitate communication. The request then reaches the middle layers, where web servers hosted in a cloud data center process it. These servers access the necessary movie files (likely stored on backend servers or databases) and send the data back through the network. The physical infrastructure — in this instance a combination of cellular towers and fiber-optic cables — carries all of these data through the middle layers of the stack and back to the edge layer, allowing the user to enjoy the movie.
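
To make the layering concrete, here is a minimal Python sketch that composes an application-layer message (a plain HTTP request) and hands it to a TCP socket; everything below that point, from IP routing to the physical links, is handled by the operating system and the intervening networks. The host name is a placeholder, not something drawn from the example above.

```python
import socket

# A minimal sketch: an application-layer message (HTTP) riding on the transport
# layer (TCP). The layers below -- IP routing, the physical links -- are handled
# by the operating system and the networks in between.
HOST = "example.com"  # placeholder destination; any public web server would do

with socket.create_connection((HOST, 80)) as conn:   # open a TCP connection
    request = f"GET / HTTP/1.1\r\nHost: {HOST}\r\nConnection: close\r\n\r\n"
    conn.sendall(request.encode("ascii"))             # send the application-layer payload
    response = b""
    while chunk := conn.recv(4096):                   # read until the server closes
        response += chunk

print(response.split(b"\r\n")[0].decode())            # e.g. "HTTP/1.1 200 OK"
```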

During this process, data are separated into pieces known as packets, which are transmitted independently across the networks to their destination and reassembled. So long as the network has enough bandwidth to handle the traffic, the suite of protocols that dominates online data transmission (collectively known as the Transmission Control Protocol and Internet Protocol, or TCP/IP) routes traffic on a first-come, first-served basis. In other words, the logic that underpins the functioning of the internet is non-discriminatory.
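
A minimal sketch of that process, assuming an arbitrary message and packet size: the sender splits data into numbered packets, the network may deliver them in any order, and the receiver reassembles the original by sequence number. Real TCP/IP adds headers, acknowledgments, and retransmission on top of this basic scheme.

```python
import random

MESSAGE = b"Not all information is equal, but every packet is routed the same way."
PACKET_SIZE = 8  # bytes of payload per packet; arbitrary for illustration

# "Packetize": split the message and tag each piece with a sequence number.
packets = [(seq, MESSAGE[i:i + PACKET_SIZE])
           for seq, i in enumerate(range(0, len(MESSAGE), PACKET_SIZE))]

# Packets may take different paths and arrive out of order.
random.shuffle(packets)

# The receiver reorders by sequence number and reassembles the message.
reassembled = b"".join(payload for _, payload in sorted(packets))
assert reassembled == MESSAGE
print(reassembled.decode())
```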

Of course, not all packets are equally important or urgent, and some internet traffic is more sensitive to delay than the rest. Latency — the delay between an action and a response — illustrates the point. In many applications, latency is not particularly consequential: When sending an email, users don't typically expect instantaneous delivery — a slight delay is usually acceptable. When conversing on a video call, however, latency can radically degrade the quality of the service — as anyone who has ever accidentally talked over someone on such a call can attest.

There are also circumstances in which network latency can mean the difference between life and death. In 2001, doctors in New York City successfully removed the gallbladder of a patient in Strasbourg, France. This surgery — dubbed the Lindbergh Operation for being the first transatlantic surgical procedure — was only possible because France Télécom dedicated an entire transatlantic fiber-optic link to the operation to ensure that the system's latency was less than two-tenths of a second. Without a dedicated, low-latency fiber link, the delay between the doctors' actions and the response of the robot in Strasbourg would have been too great to safely perform the procedure.
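
Some back-of-the-envelope arithmetic (using assumed figures, not numbers reported for the operation) shows how tight that budget is. Light travels through optical fiber at roughly two-thirds of its speed in a vacuum, and a New York-to-Strasbourg fiber route is assumed here to run about 7,000 kilometers.

```python
# Rough latency budget for a transatlantic telesurgery link (illustrative figures only).
SPEED_IN_FIBER_KM_S = 200_000   # roughly two-thirds of the speed of light in a vacuum
ROUTE_KM = 7_000                # assumed fiber-route length, New York to Strasbourg

one_way_ms = ROUTE_KM / SPEED_IN_FIBER_KM_S * 1_000   # ~35 ms
round_trip_ms = 2 * one_way_ms                        # ~70 ms
budget_ms = 200                                       # the two-tenths-of-a-second threshold

print(f"Propagation alone uses ~{round_trip_ms:.0f} ms of the {budget_ms} ms budget,")
print(f"leaving ~{budget_ms - round_trip_ms:.0f} ms for video encoding, robot control,")
print("and switching -- little room for queuing delays on a congested shared path.")
```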

The Lindbergh Operation was exorbitantly expensive and undertaken only to prove the feasibility of telesurgery. While most robotically assisted surgeries that occur today use dedicated internal networks to minimize latency and interference, experts continue to experiment with and develop telesurgery technology with the hopes of one day providing affordable world-class medical care to anyone, anywhere. For that dream to become reality, hospitals will require dedicated low-latency connections with a low risk of interference.

As the Lindbergh Operation demonstrates, this is doable. However, it requires broadband networks to dedicate infrastructure to a sole purpose: transmitting information to and from the doctor and the patient. Allowing hospitals to pay a premium for dedicated end-to-end services would be the simplest way to make this arrangement work.

Another instance in which recognizing information inequality is paramount comes from the cutting edge of wireless broadband-network technology. Using a technique known as "network slicing," a single physical network can be segmented into distinct virtual networks, or "slices," each tailored to meet specific requirements — higher bandwidth, lower latency, enhanced security, etc. These slices can then be further customized and optimized for different kinds of services, including emergency communications, autonomous vehicles, "internet of things" devices, and even (in theory) telesurgery.

There are numerous benefits to such a technique. By using the same physical infrastructure to create multiple virtual networks, network slicing reduces the need for additional physical resources, thereby cutting capital and operational expenses. Additionally, each network slice can be isolated and secured independently such that a security breach of one slice does not compromise other slices, thereby increasing the overall security of the network. Slices can also be dynamically created, modified, and decommissioned relatively quickly based on changing demands, thereby granting the network greater flexibility. Finally, each slice can have its own quality-of-service settings tailored to the needs of the specific service or application it supports, which improves overall user experience.
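
As a concrete sketch, the Python fragment below describes a few hypothetical slice profiles as plain data and picks the one that satisfies an application's requirements. It is not a real 5G configuration interface (actual slicing is standardized by 3GPP and configured through operator tooling); the names and numbers are invented for illustration.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SliceProfile:
    """One virtual 'lane' carved out of shared physical infrastructure."""
    name: str
    max_latency_ms: float      # worst-case delay the slice is engineered for
    min_bandwidth_mbps: float  # guaranteed throughput
    isolated: bool             # whether traffic is kept separate from other slices

# Hypothetical profiles, tuned to the kinds of services named in the text.
SLICES = [
    SliceProfile("emergency-communications", max_latency_ms=50,  min_bandwidth_mbps=10,  isolated=True),
    SliceProfile("autonomous-vehicles",      max_latency_ms=10,  min_bandwidth_mbps=50,  isolated=True),
    SliceProfile("iot-sensors",              max_latency_ms=500, min_bandwidth_mbps=1,   isolated=False),
    SliceProfile("telesurgery",              max_latency_ms=20,  min_bandwidth_mbps=100, isolated=True),
]

def pick_slice(required_latency_ms: float, required_bandwidth_mbps: float) -> SliceProfile:
    """Choose the least-demanding slice that still meets an application's needs."""
    candidates = [s for s in SLICES
                  if s.max_latency_ms <= required_latency_ms
                  and s.min_bandwidth_mbps >= required_bandwidth_mbps]
    if not candidates:
        raise ValueError("no slice meets these requirements")
    return min(candidates, key=lambda s: s.min_bandwidth_mbps)

print(pick_slice(required_latency_ms=30, required_bandwidth_mbps=60))
```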

Autonomous vehicles stand to benefit greatly from network slicing in 5G networks. Dedicated, low-latency, ultra-reliable broadband connections are essential to the safe and efficient operation of such vehicles, which are in constant communication with servers running computations to prevent crashes. Network slicing can ensure that the bandwidth and connectivity needs of autonomous vehicles are met even in congested network environments — a crucial factor in maintaining consistent performance and safety standards.

In this sense, network slicing represents a novel way to distinguish between different kinds of traffic: It divides a network into separate "lanes," each with its own customized features to meet varying requirements, without forcing the rest of the network to meet those requirements. This is only possible when information can be treated unequally.

Whether it be greater speed, reduced latency, more security, or some combination thereof, it makes perfect sense for internet-service providers (ISPs) to offer different services to broadband customers based on their unique needs — and they are more than justified in charging varied rates for these services. Such an arrangement demands that private networks be given the freedom to innovate and experiment with different features and products without the threat of government intervention looming over their heads. But it is precisely these sorts of arrangements that have spurred a theory of information flows — exchanges of information among people, processes, and systems — that threatens to throttle progress.

A Theory of Information Equality

Before the late 1990s, when the internet was mostly a curious oddity for hobbyists, all internet connections were equally terrible. Dial-up connections that relied on old telephone wires were more of an add-on feature than a service unto themselves — an arrangement that naturally created bottlenecks. That changed when digital-subscriber-line (DSL) services emerged, and changed again with the introduction of dedicated cable and wireless networks. As a result, internet speeds grew exponentially over time, and consumers were generally content.

Yet some academics grew concerned. They contended that in this new broadband environment, ISPs' private interests would conflict with the public's interests and, in turn, erode the equality of the internet. This concern was voiced most forcefully by Columbia Law School professor Tim Wu — the same man who would later become one of the driving forces behind the Biden administration's antitrust crusade against Big Tech.

In a seminal article published in 2003, Wu observed that ISPs' positions as gatekeepers to the internet would enable them to prioritize or deprioritize traffic for certain websites, services, and applications, thereby harming consumers and stifling innovation. He argued that ISPs should be forced to treat all data on the internet equally, without discriminating based on the user, content, site, platform, application, or mode of communication. He called his translation of a broad theory of information equality into public policy "network neutrality."

Two years later, the Federal Communications Commission (FCC) issued guidance outlining the principles intended to foster an open and egalitarian internet. In 2010, the FCC attempted to turn that guidance into binding regulations with the first Open Internet Order, which was later struck down as an overreach of the FCC's statutory authority. Then in 2015, the commission circumvented this decision by reclassifying broadband as a telecommunications service under Title II of the Communications Act, thereby giving it the authority to regulate broadband in line with Wu's vision.

Like the 2010 order, the 2015 order converted the theory of information equality into binding law. Among its numerous restrictions were prohibitions on broadband providers' blocking, degrading, or giving favorable access to any lawful content or application. As former FCC deputy general counsel Jonathan Nuechterlein and Colorado attorney general Philip Weiser explain in their book Digital Crossroads, net-neutrality advocates transformed the design principle of packet equality "into a normative policy judgment that the Internet's constituent [internet protocol] networks should remain 'dumb' in the sense that they should not 'know' what content...packets contain," making it impossible for them to discriminate among them.

Since 2015, net-neutrality rules have flip-flopped whenever the White House has changed hands. Under former chairman Ajit Pai, the FCC repealed net neutrality with the Restoring Internet Freedom Order in 2017. Now, under President Joe Biden, the commission has reimposed net-neutrality rules, which are being challenged in federal court. Though the current FCC has mustered every argument possible to justify net neutrality, the underlying theory remains the same: To protect consumers and promote competition, all information must be treated equally.

The Rise and Fall of Western Union

Wu's insistence on government-imposed information equality is grounded in a particular breed of progressivism that views nearly everything through the lenses of economic concentration and market power. Its theories received their most thorough articulation in the writings of the trustbusters of the late 19th and early 20th centuries. One of the most prominent of these individuals was Supreme Court justice Louis Brandeis, who believed that "bigness" — the economic and political power of large corporations or trusts — is an inherent threat to democratic values and economic fairness.

In his more recent work, Wu has gone so far as to compare broadband services to the abusive Standard Oil monopoly of the trustbuster era. In effect, he is asserting that ISPs' gatekeeper status over the internet is no different from John Rockefeller's colluding with the railroad cartels to lock out competition. And, according to Wu, the solution should be the same: imposing common-carriage-style regulations — a special set of rules that apply to services holding themselves out as willing to provide transportation to the public — on the offending parties.

In drawing this comparison, Wu is arguing that net-neutrality regulations will eliminate the threat of private control over the means of communication — just as railroad regulation freed the transportation industry from monopolization. But that comparison is crude and inaccurate.

A more telling example of how common-carriage regulations affect markets is the rise and fall of Western Union. Founded in 1851, the company rode a series of acquisitions, ingenuity, and good fortune to a monopoly over the telegraph industry in under two decades. Western Union then proceeded to use its monopoly position to rack up an impressive assortment of abuses: The firm was accused of political corruption, insider trading, censorship, anti-competitive practices, and price gouging. In response to these and similar charges levied against other industries (most notably the railroads), Congress imposed utility-style common-carriage regulations on Western Union through amendments to the Interstate Commerce Act of 1887 in 1906 and 1910.

Though regulating Western Union may have prevented the company from further abusing its monopoly status, it also hampered its ability to compete as the marketplace evolved. As a common carrier, Western Union was subject to pricing and service requirements that limited its flexibility in responding to emerging competitive threats — most notably the telephone. Similarly, its legally enforced obligation to provide services in a non-discriminatory manner limited Western Union's ability to attract and keep customers by offering varied services to match diverse needs and preferences.

Over the years, the telephone industry — which was not subject to common-carrier regulations — began to challenge the telegraph's monopoly on telecommunications. Western Union attempted to compete, but it proved unable to keep up with the evolving marketplace and emerging technological advances. After losing an important patent suit in 1879, the company quickly deteriorated and was eventually acquired by AT&T in 1909.

Western Union's failure to foresee the disruptive potential of the telephone cannot be discounted in understanding its fall. However, common-carriage regulations also played a major role in preventing the company that would have been AT&T's natural rival from competing at full force. With its primary competitor's hands tied, AT&T itself became one of the most enduring monopolies in the history of the United States.

As the Western Union example illustrates, mandating information equality might be intended to level the playing field, but it often impedes competition by imposing compliance costs on existing firms and limiting their capacity to differentiate their services. It may also reduce the economic incentive for disruptive innovation. This was true of antiquated telecommunications services, but it is especially true of the internet.

A Free Market for Information

In the 1990s, many digital pioneers believed that the internet was somehow immune from the traditional laws of economics. John Perry Barlow and other founders of the Electronic Frontier Foundation believed that the digital era would usher in a new age of civilization that superseded centuries-old legal concepts of property, citizenship, and movement. Others, like economist George Gilder, believed that innovation would make bandwidth so plentiful as to be virtually free. Wu and other information-equality advocates drew on these views in developing their own theories and policy proposals.

Fortunately, the Clinton administration favored pragmatism over utopian egalitarianism. Recognizing the internet's limitless potential — and the fact that utility-style regulation could squash that potential — President Bill Clinton laid the groundwork for a light-touch regulatory framework. Congress refrained from imposing common-carriage regulations on the burgeoning market for ISPs in the landmark Telecommunications Act of 1996. Likewise, the Framework for Global Electronic Commerce ensured the government would treat other layers of the technology stack with similar deference.

In 2001, as President Clinton was leaving office, computer programmer and author Charles Platt published a eulogy for the egalitarian vision of a flat-rate internet. His conclusion was prescient: "Millions of broadband users have already voted with their wallets for high-speed access. Fast information, not free information, will drive and shape the future of the Net."

The Clinton administration's deference to markets and private innovation allowed the internet to mature where egalitarian virtues and heavy-handed government regulation may have strangled it in the crib. The question, then, should not be how to fit internet infrastructure within the confines of Barlow's, Gilder's, and Wu's egalitarian vision, but rather how to maintain and improve the environment that has allowed the internet to flourish. The answer lies in accepting the inequality of information.

Take, once again, the example of telesurgery. Under a net-neutrality regime, an arrangement whereby a hospital could purchase a dedicated low-latency link to ensure patient safety during operations would be forbidden — or, at the very least, it would require the hospital to request and be granted an exemption from the overarching regulations. That a rule would require explicit exemptions for such obviously beneficial innovations as telesurgery is itself an indictment of the principle.

Information equality is an extraordinarily rigid principle: It does not allow for experimentation or innovation unless the new service is guaranteed to be perfectly fair. In the case of cutting-edge technologies like network slicing, principles of information equality could hamper or prohibit innovation — and, by extension, competition. The virtual nature of network slicing means that most of its current uses would likely not be subject to net-neutrality rules, but it is far from clear that future uses will fall outside the regulations' scope. In fact, in its most recent order reimposing net-neutrality rules, the FCC explicitly declined to clarify whether network-slicing techniques would be permissible under the new rules. As a consequence, wireless providers will hesitate to invest in or experiment with network slicing and other innovative techniques for fear of running afoul of regulators.

A free market for information grounded in the presumption that information is inherently unequal, by contrast, would allow ISPs greater freedom to experiment and innovate. Unlike the standardized approach of net neutrality, market-driven models allow ISPs to offer a variety of service tiers and packages, thereby enhancing consumer choice and, in turn, competition among firms. It would also allow ISPs the freedom to customize services for specific uses, such as telesurgery, without having to jump through regulatory hoops for permission.

Created Unequal

The history of communication, from the telegraph to the internet, underscores that not all information is created equal. This inherent inequality has practical implications for how networks should be regulated. The theory of information equality — and, by extension, net neutrality — overlooks the nuanced reality of diverse needs and the benefits of market-driven prioritization.

To successfully navigate the digital age, we must recognize information's fundamental inequality, and leverage that understanding to foster a more efficient, innovative, and adaptable telecommunications landscape. Embracing a free market for information flows will provide the flexibility and dynamism we'll need to meet the ever-evolving demands of the future.
