Colloquium #3: June 5, 2001

A New Approach to Communications Regulation


Introduction to the Werbach Paper

Shumpei KUMON (Executive Director, GLOCOM)

GLOCOM has more than a hundred Fellows, selected members from the academic, business, and public sectors in Japan and abroad who help us carry out collaborative activities worldwide. One of these activities is the GLOCOM Platform, to which we have begun inviting Fellows to contribute essays.

The first such essay is by Kevin Werbach, formerly of the FCC and currently an editor and regular contributor to Esther Dyson's newsletter "Release 1.0." GLOCOM very much values the insights found in Release 1.0, along with George Gilder's "Gilder Technology Report," David Isenberg's "Smart Letter," and CANARIE-NEWS.

Kevin's well-known thesis is that the information revolution has changed the structure of the info-telecommunications industry from vertical (services tied to equipment) to horizontal (layers of networks, applications, and content), and that regulatory models must change accordingly.

Werbach's essay below is written from this perspective, and provides an important and critical counterpoint to the hands-off policy being adopted by the FCC under the Bush Administration.


Third-Generation Communications Regulation

Kevin Werbach (Editor, Release 1.0)

Communications regulators in the United States, Europe and Asia face similar challenges. In an era of rapid technological change, they must meet several goals: promoting competition, furthering deregulation, fostering innovation, encouraging investment and protecting consumers. The trouble is that these goals often conflict. For example, eliminating price regulation or interconnection requirements on dominant incumbents may preclude competition and result in poorer service quality, while aggressive market-opening requirements may cause operators to scrap investments in beneficial new services.

New cross-cutting services such as broadband and wireless data pose the starkest problems. The flashpoints, however, vary from country to country. In the United States, battles have been fought over residential broadband services delivered by cable television operators and local telephone companies. Also, the merger of AOL and Time-Warner led to a debate over the openness of next-generation services such as enhanced instant messaging and interactive television. In Japan, wireless Internet services will likely be the battleground, thanks to the tremendous success of NTT DoCoMo.

Traditional regulatory models fail in the current environment. This has happened before. The first generation of communications regulation was built around the concept of natural monopoly. Governments licensed or owned the sole providers of services such as public voice telephony. This approach gradually came under fire due to changing views about government and the impact of new technologies such as computers. Today's second-generation communications regulation focuses on competition and market forces rather than command-and-control rulemaking as the primary means to achieve governmental goals. Most industrialized nations have adopted this general philosophy.

Third-generation communications policy begins where these previous efforts leave off. It concentrates less on what government seeks to achieve than on how to get there. It is built on a bottom-up understanding of real-world conditions, rather than top-down models that fail to reflect the true situation on the ground. It is animated by the recognition that although the timing and paths forward are unknowable, the endpoints of the current technological revolution in communications are inevitable. The networks of the future will be data networks that carry services such as voice and video, rather than the reverse. And they will span many endpoints: wired and wireless, mobile and stationary, handsets and computers.

Vertical vs. Horizontal Regulation

The basic flaw of current regulatory models is that they emphasize horizontal service categories rather than vertical layers. The task of the regulator is to classify a service - voice or data, cable or telephony - and then mechanically apply the requirements that flow from that classification. New services that don't fit existing categories neatly, such as wireless Internet access, pose problems for this approach, and it is here that third-generation regulation is starting to coalesce.

In an increasingly digital world, regulation should track the architecture of the Internet. The Internet has two essential architectural elements: end-to-end design and layering. End-to-end structure means that intelligence resides at the edges. A new service can be deployed by connecting any two client devices, without requiring approval or configuration inside the network.

Layering means that higher-level functions, such as application logic, are separated from lower-level ones such as congestion buffering and traffic routing. Layering in an end-to-end environment means that services can be moved up or down the stack. Internet protocol (IP) telephony, for example, takes a service - voice - previously delivered at one level and recreates it at a higher level on top of an Internet data stream.
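
The layering idea can be sketched in a few lines of Python. This is only an illustration; the class and field names below are invented for the example rather than drawn from any real VoIP implementation. A voice frame is treated as ordinary application data and wrapped, layer by layer, before reaching the physical network, which never needs to know that the bytes it carries happen to be telephony.

    from dataclasses import dataclass

    @dataclass
    class Packet:
        layer: str       # which layer added this wrapper
        header: dict     # layer-specific control information
        payload: object  # whatever the layer above handed down

    def ip_telephony_send(voice_frame: bytes) -> Packet:
        """Recreate 'voice' as ordinary data on top of an Internet stream."""
        app = Packet("application", {"codec": "G.711"}, voice_frame)
        transport = Packet("transport", {"protocol": "UDP", "port": 5004}, app)
        network = Packet("network", {"protocol": "IP", "dst": "192.0.2.1"}, transport)
        # The physical layer (DSL, cable, wireless ...) simply carries the packet;
        # nothing inside the network needs to know the payload is telephony.
        return network

    print(ip_telephony_send(b"\x00\x01\x02").header)  # {'protocol': 'IP', 'dst': '192.0.2.1'}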

For regulatory purposes, it makes sense to think of the Internet as composed of four layers:

  • content
  • applications or services
  • logical infrastructure (addressing and signaling)
  • physical

Communications policy should be developed around these four vertical layers, rather than the horizontal categories employed today. This frees regulators to evaluate not just the services being offered, but also the interfaces between layers. Open interfaces on the Internet allow competitors to circumvent a bottleneck at one layer by deploying services over another layer, and prevent companies that control lower-level services from prejudicing or precluding services at higher layers. For example, Internet service providers do not determine the content users can access across their networks, nor can application providers force users to choose a particular delivery network.

Regulators should focus on these interface questions in deciding how to treat a new service. NTT DoCoMo, for example, uses its control over physical platforms (the wireless network) and logical infrastructure (its billing system) to influence the content and services available to users. The appropriate question is whether there are truly effective alternative mechanisms for users to gain access to those services.
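
The question of "effective alternatives" can be expressed as a simple check over the four layers. The sketch below is a hypothetical Python illustration with made-up data loosely echoing the DoCoMo example above; a layer becomes a candidate for regulatory scrutiny when it offers users neither an open interface nor an alternative provider.

    # Hypothetical data: which providers a user can reach at each layer, and
    # whether the interface to that layer is open to third parties.
    LAYERS = {
        "content":                {"providers": ["portal A", "portal B"], "open_interface": True},
        "applications/services":  {"providers": ["mail", "messaging", "voice"], "open_interface": True},
        "logical infrastructure": {"providers": ["carrier billing"], "open_interface": False},
        "physical":               {"providers": ["wireless network"], "open_interface": False},
    }

    def bottlenecks(layers: dict) -> list:
        """Layers where users have neither an open interface nor an alternative provider."""
        return [name for name, info in layers.items()
                if len(info["providers"]) < 2 and not info["open_interface"]]

    print(bottlenecks(LAYERS))  # ['logical infrastructure', 'physical']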

The Seductive Idea of Symmetry

Another illustration of the problems with the existing conceptual model for communications regulation is the debate over symmetric versus asymmetric regulation. Symmetry exerts a powerful influence on our thinking in many areas. Researchers have shown that symmetry of facial features is a major element in subjective assessments of beauty across cultures. Particle physicists seek to explain the deep structure of the universe based on symmetries among families of subatomic particles.

Symmetry is important in law and regulation - two people or companies in the same situation should be subject to the same rules, if those rules are not to be considered arbitrary. Sometimes, though, symmetry can be deceiving. In arguing for symmetric treatment of two things, you must show that they are fundamentally the same. A company with substantial market power is generally not the same as a new entrant, even if they offer the same service. That is the basis of antitrust law (or competition policy as it is generally labeled outside the US). Monopolies may be treated differently than their competitors, regardless of how they attained their monopoly, because an unregulated monopoly can quash the competitive, efficient markets that best promote growth and innovation.

Incumbents with market power always argue for symmetric regulation, because they win regardless of how the specific rules come out. Either everyone is subject to significant regulatory obstacles, which make it difficult for new entrants to gain critical mass (incumbents have the most experience and the greatest resources with which to comply with, or manipulate, regulatory requirements), or the rules are light for everyone, which allows incumbents to consolidate their existing market power.

Looking to the Future

These points don't mean that regulators, in Japan or elsewhere, must always intervene to force service providers to open their platforms to competitors. Before the emergence of the Web, online services such as AOL were closed and proprietary. An AOL user could not exchange email with a CompuServe user or with someone connected through a university network or independent Internet service provider. Today all major online services offer interoperable Internet email, and AOL users have full access to the Web. AOL had no choice but to open up in response to market forces. Even though AOL was the largest online service, the open Internet was larger.

Regulation always has a cost, and it should be used only when that cost seems justified by the associated benefits. Whenever possible, regulators should look for incentive-based routes to achieve their policy goals. For wireless services, the best situation is one in which many competitors have the spectrum and resources to offer services, so that no one service provider can afford to keep its network closed. Technical standards are also important. One reason for DoCoMo's success is that the HTML-based standard it chose for content authoring made it easy for third parties to create services. By insisting on choice and open interfaces, regulators will create the conditions for market forces to work in their favor.

We are early in the development of third-generation regulatory models. Now is the time, however, to set the conceptual framework for emerging services such as broadband, interactive television, and high-speed mobile data. With technology and markets changing, regulatory approaches must change as well.

Copyright © Japanese Institute of Global Communications