© Cambridge University Press 2007 and Cambridge University Press, 2009.
Visit the CSIO website at: www.csio.econ.northwestern.edu.
E-mail us at: csio@northwestern.edu.
THE CENTER FOR THE STUDY
OF INDUSTRIAL ORGANIZATION
AT NORTHWESTERN UNIVERSITY
Working Paper #0056
Coordination Costs and Standard Setting:
Lessons from 56K Modems*
By
Shane Greenstein
Northwestern University
and
Marc Rysman
Boston University
December, 2004.
* The authors thank Angelique Augereau, Joe Farrell, Tim Feddersen, Alicia Shems, Victor Stango, and
many seminar participants for useful remarks on our studies of the 56K modem market. We thank Bill
McCarthy, Simoa Campus, Ken Kretschmer, Michael Seedman, and Richard Stuart for sharing their
observations about events in this market. Rysman acknowledges the support of NSF grant SES-0112527.
All errors are ours alone.
Abstract
The authors offer a detailed analysis of the coordination costs behind the standardization of 56K
modems. They focus primarily on market events and standard-setting activities during early
deployment. They argue that the canonical model for a standards war is misleading in the case of
56K. They present alternative questions to those the model poses and examine different views on how
market events during deployment influenced negotiations within the International
Telecommunications Union and vice versa.
INTRODUCTION
We offer a detailed analysis of the coordination costs behind the standardization of 56K
modems. Although the canonical model of a standards war could be applied to the case of 56K
modems, we argue here that the model is misleading and instead offer up alternative questions for
understanding how market events during deployment influenced negotiations and vice versa.
There are three phases to a canonical model of a standards war: First, an economic
opportunity arises from a technical upgrade. Second, competition develops between different
implementations of that upgrade. Third, resolution of the conflict occurs when one of the
implementations wins in a competitive market or a publicly spirited standard-setting organization
(SSO) becomes involved in resolving the conflict (for a review, see, e.g., Stango, 2004, or Farrell,
1996). There are extensive case studies describing a variety of ways for winning a competitive
war between fixed specifications. There are also a variety of reasons why an SSO chooses to
make a specification a focal point1 for further development (see, e.g., Chapter 8 of Shapiro and
Varian, 1999).
On the surface, parallels to the canonical model can be seen in the 56K modem standards
war. It did involve a fight between two seemingly symmetric network technologies, each of
which provided a possible specification for improving modem speeds beyond 33K. Two large
camps of firms formed around each specification, even though service providers and users would
have benefited from a single standard from the outset. Eventually, an SSO, the International
Telecommunications Union (ITU), intervened with a new standard specification that gained
widespread popularity and settled the war. This intervention was useful in that the market
appeared to grow rapidly afterwards.
1 A focal point is a specific option or characteristic that all the major players choose even when there are
several other feasible and plausible options.
Despite these parallels, we argue that the canonical model is misleading for 56K
modems. We support this argument with a detailed study of the standardization process. For the
case of 56K, outsiders have access to adequate, though not excessive, documentation of key
events, such as interviews and industry reports in trade magazines, as well as statistical
information about deployment prior to the standard’s emergence. We exploit this detail by using
an eclectic mix of methodologies, weaving together “case study” evidence, interviews with
industry participants, interpretations offered by second-hand sources, and novel statistics.
Our methodology and data lead us to differentiate the 56K standards war from the
canonical model. We highlight four contrasts. First, in a canonical model, each party thinks it can
win the standards war. With 56K modems, neither camp of firms thought it could win the war on
its own. In actuality, neither one did. Second, in the canonical model the designs sponsored by
alternative camps are fixed. With 56K modems an alternative specification became the standard.
Third, in the canonical model design and competition occur in sequence. In contrast, with 56K
modems, design, negotiation, and the market process occurred concurrently and over time rather
than in some simple ordering. Finally, the canonical model treats standards as arbitrary focal
points and either does not look at their origins or has a superficial description of their origins.
With the case of 56K modems, creating a focal point at the ITU was extremely costly. These
fundamental differences in perspective and behavior from the canonical model lead us to pose
questions other than those the model poses and, as a result, examine different views on how market
events during deployment influenced negotiations with the ITU and vice versa.
We begin by focusing on understanding the factors that shape the costs of coordinating
on a new standard—specifically on economic factors that shaped deployment prior to February
1998 when the V.90 standard emerged. The costs of coordination were primarily borne during
the early deployment of the 56K modem, a period when users could ostensibly choose between
two competing specifications, X2 and Flex. An important feature of the modem market was that
consumers only signed with an Internet service provider (ISP) within their local calling area.
Hence, competition occurred in distinct local markets, and decision making was fragmented. As a
result, we can do statistical analysis normally not available in other examples of technology
deployment. Borrowing from our companion paper, Augereau, Greenstein, and Rysman (2004,
hereafter AGR), we show that ISPs tended to split across X2 and Flex—not only nationally but
also within local markets. While the network features of the product created incentives to
coordinate on a single standard, local competition created great pressure to differentiate across the
technologies.
We next trace the relationship between early deployment and negotiations within the
ITU, for which there are several competing interpretations. We interpret this process as the
cost of creating a focal point and pay special attention to the role of intellectual property (IP). To
be sure, we could also focus on why the ITU’s intervention was beneficial, but as there is little
dispute that the benefits were large, that insight is not particularly novel. More interesting, we
highlight two common and sharply contrasting views about the relationship between deployment
and negotiations. One view emphasizes the way in which market events strongly shape
negotiations. The other view argues that decisions were based on engineering choices, not on
business incentives. We argue for a middle ground between these two views.
Events of this case illustrate how some aspects of firm participation inside the SSO varied
with market circumstances and IP holdings, while other aspects did not. The situation compelled
participation and managerial attention of all interested parties, but each came to the SSO with
asymmetric negotiating positions. We argue that had those positions been different, behavior
would also have been different: more or less urgent, and more or less inclined towards
compromise.
Our study adds to the comparatively small number of close economic studies of standards
wars. (See Stango (2004) for a review of the literature on such wars.) As with other studies in
this vein, we identify conundrums for the canon by analyzing important aspects of these events
that either fit or do not fit canonical models. Shapiro and Varian (1999, Chapter 8) include a brief
summary of announcements by firms in different camps of the 56K modem war as of the end of
1997. We also offer evidence on facets of behavior where previous research is incomplete. We
analyze how deployment activity shaped the incentives of parties in negotiations and how the
negotiations in the ITU shaped behavior and outcomes. Understanding the parties’ asymmetric
positions and their relationship to deployment is crucial, we argue, for understanding the behavior
and outcomes in this particular standards war, as well as in other standards wars.
We now provide a short literature review of related studies. In the following section, we
provide an outline of the industry and setting. We then pose alternative questions and analyses
than those to which the canonical model points. In answering these questions and conundrums,
we examine different views on how market events shaped negotiations and suggest that each view
is incomplete. We then offer an alternative analysis of the case of the 56K standards war.
Our study follows in the spirit of several rich analyses of the role of standards during the
diffusion of new communications technology, such as Besen and Johnson’s (1986) study of FM
radio and color television and Farrell and Shapiro’s (1992) rich study of the standards war leading
to the specification for HDTV in the United States. Our setting differs because standard setting
takes place in an SSO, not under the auspices of a regulator that can mandate standards, such as
the Federal Communication Commission (FCC). Standard setting in an SSO requires a different
framework, one that understands the factors shaping the negotiation between firms.
Our emphasis also bears resemblance to Von Burg’s (2001) study of the multiple
implementations of the Ethernet, and Dranove and Gandal’s (2003) study of the DVD/DiVX war.
There are key differences in our study from the previous ones. In both previous studies, market
events determined the choice between alternative specifications, each of which had its
commercial sponsors. In Von burg’s study of the Ethernet, three specifications competed in the
marketplace and an SSO endorsed all three, whereas in our study we examine how an alternative
standard, the V.90, arose at the ITU to replace the two competing specifications, X2 and Flex. In
Dranove and Gandal’s study, there were two technically different formats competing, as
compared to the two similar formats competing in our study; and one of those specifications
quickly failed in the marketplace. Also, in the DVD/DiVX war, firms tried to bypass the SSO,
whereas with X2 and Flex, firms believed that working with the SSO was an inevitable
eventuality. Thus, our study of the ways companies worked with SSOs as they competed with
each other and the relationship between SSOs and companies is based in different market
circumstances. As a result, we highlight a different set of relationships between deployment and
negotiations at the SSO. This leads to a very different set of insights about the costs of
coordination.
B. INDUSTRY AND SETTING2
The broad outline of events is not in dispute. Before 1997, the fastest available modem speed was
33K. In early 1997, competing consortiums introduced two types of 56K modems almost
simultaneously, X2 and Flex. The two had identical performance characteristics but were
incompatible: if a consumer chose a different modem than his or her ISP used, the consumer
was reduced to speeds of 33K or worse.
worse. These products exhibited network effects in the sense that when more consumers picked a
modem, more ISPs would be attracted to it and the ensuing competition would lead to cheaper,
better, and more reliable service for the consumer. Nevertheless, sales in the first year went much
slower than the two sides had hoped.
In February 1998, ongoing negotiations between the industry participants at the ITU led
to the ratification of a new standard, the V.90. It was incompatible with both of the previous
technologies without a proper upgrade of equipment. The V.90 gained almost immediate
widespread acceptance, and sales of modems to both ISPs and consumers grew rapidly.
2 A more in-depth discussion of these issues can be found in Rickard’s (1997a, 1997b, 1998) studies.
We now explain the details behind the broad outline of events. A modem allows a
computer to send and receive data over a telephone line. The speed at which a modem can down-
and upload data is measured in bits per second (bps), so a 33.6K modem can send and receive
33.6 kilobits (33,600 bits) of data every second. In the early days of the Internet, modem users
typically dialed a telephone number that connected them directly to the computer with which they
wanted to exchange data. Modem users could only connect to computers that also maintained
modems. Numerous bulletin boards sprang up devoted to a wide variety of issues, where readers
could post questions and comments. Most exchanges were in “character mode,” which used very
little memory, so modem speed was not an important issue.
Two changes occurred in the mid-1990s. The first was the rise of ISPs, which allowed
users to dial a single number and connect to any computer on the Internet. This meant that only
computers associated with ISPs had to maintain modem banks to receive phone calls. Although
ISPs charged a fee, consumers often gained because they could access the entire Internet through
a local telephone call.3 Many bulletin board moderators transformed into ISPs as they already
had the basic technology (banks of modems) to do so. This led to a very unconcentrated industry.
In 1997, about 93% of the U.S. population had access to a commercial ISP by a local phone call
(Downes and Greenstein, 1999). An important feature of concentrating modem usage at ISPs was
that ISPs often found it worthwhile to invest in digital connections to the local telephone
company switch, which meant that ISPs had fast, high-volume connections to the Internet.
A second change in the mid-1990s was the rise of the World Wide Web. The Web
provided a protocol for transferring data over the Internet, which allowed for the widespread use
of graphics and digital photographs. This change greatly enhanced both the demand for Internet
access and the importance of consumer connection speed.
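A quick back-of-the-envelope calculation suggests why connection speed became a binding constraint once pages carried graphics. The 100 KB page size is our own illustrative figure, not one from the text:

```python
# Illustrative only: transfer times at the modem speeds discussed in the text.
# The 100 KB "graphics-heavy page" is a hypothetical size chosen for the example.
def transfer_seconds(size_bytes, speed_bps):
    """Seconds to move size_bytes over a link of speed_bps (bits per second)."""
    return size_bytes * 8 / speed_bps

page_bytes = 100 * 1000  # hypothetical Web page with images, 100 KB
for label, bps in [("33.6K", 33_600), ("56K", 56_000)]:
    print(f"{label}: {transfer_seconds(page_bytes, bps):.1f} s")
# prints "33.6K: 23.8 s" and "56K: 14.3 s"
```

In character mode a few kilobytes per exchange made speed irrelevant; at Web page sizes, the difference between the two generations of modems was seconds per page on every click.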
3 The ISPs also offered e-mail accounts and access to the World Wide Web.
These two changes made 56K a potentially valuable technology. Up until early 1997,
33.6K modems were the fastest available for use with analog telephone lines. Rockwell
Semiconductor was practically a monopolist (over 80% market share) in the production of
modem chipsets, or the internal hardware of a modem. They licensed their technology to over 100
resellers that produced modems under different names. The most successful of these was U.S.
Robotics, with about a 40% market share in retail modem sales.
The adoption of digital circuitry between ISPs and the telephone companies allowed for
the elimination of one analog-to-digital transformation, which allowed for theoretical modem
speeds of up to 56K. U.S. Robotics recognized this possibility first and began work on their X2
modem.4 Worried that they would be shut out of this new market, Rockwell quickly began work
on their own 56K modem. After joining with Motorola and Lucent in this endeavor, their product
was called K56Flex, or Flex. Due to setbacks at U.S. Robotics and a remarkable production run
at Rockwell, both brought their product to market at essentially the same time, February 1997.
Some product reviews suggest there were problems with Flex up until July. It is clear from
contemporary reports that within 6 months the two technologies worked equally well, though
there could be variability between them depending on local connection characteristics.
The cost of the new modems depended on the purchaser. Modems for consumers were
initially priced at around $200, as compared to $100 for 33K modems. For ISPs, the conversion
depended on their technology. Since the 1980s, the telephone network had been undergoing a
gradual upgrade to a digital system. If an ISP was in an area that had been fully upgraded, it
could offer 56K by simply buying a few consumer-grade 56K modems. If an ISP’s connection to
the telephone network had not been upgraded, it would have to invest in T1 lines or ISDN lines,
which represent high-quality digital connections to the Internet.
4 Much of the market was at 28K, which used the same basic technology as 33K, and 56 = 28 × 2: Hence
the name X2.
Because racks of consumer modems had high maintenance and administrative costs, they
were an inefficient way to offer 56K to more than a few customers. As a result, ISPs tended to
invest in a Remote Access Server, a large server that came equipped with high-quality modems
and required T1 lines or ISDN lines. For instance, in March 1997, U.S. Robotics sold the Total
Control Network Hub that connected forty-eight ports to two T1 lines for $44,126, or $919.29 per
port.5
The price per port could be driven down to around $500 for larger servers. Digital lines
such as T1 lines had installation costs around $2000. Monthly charges for digital lines were
around $50 per port, as opposed to $20 or $30 for analog lines. Note that many ISPs had already
invested in Remote Access Servers and T1 or ISDN lines, as they were also an efficient way to
handle 33K modems. The ISPs could simply upgrade their server. Doing so cost $50 to $100 per
port and was sometimes offered for free as the standards battle intensified. The ability to upgrade
depended on the server—U.S. Robotics servers could be upgraded only to X2, most other servers
could be upgraded only to Flex. The result was that upgrade costs were much higher for some
ISPs than for others.
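The asymmetry in upgrade costs is easy to see from the figures quoted above. The dollar amounts come from the text; only the arithmetic is ours:

```python
# Back-of-the-envelope arithmetic for the per-port figures quoted in the text.
hub_price, ports = 44_126, 48          # U.S. Robotics Total Control Network Hub
new_server_per_port = hub_price / ports
print(f"New server: ${new_server_per_port:.2f} per port")   # $919.29, as in the text

upgrade_per_port_high = 100            # upgrading an existing compatible server
# An ISP already owning a server upgradeable to its chosen standard paid an
# order of magnitude less per port than one buying new equipment:
print(new_server_per_port / upgrade_per_port_high)          # roughly 9x cheaper to upgrade
```

Because U.S. Robotics servers upgraded only to X2 and most others only to Flex, this roughly ninefold cost gap meant an ISP's installed equipment largely dictated which standard was cheap for it to adopt.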
The Development of the V.90 Standard
Throughout this time period, there were deliberations over standard setting at the
Telecommunications Industry Association (TIA) and the ITU. The TIA is an organization of
private firms in the United States, and it has representation at the ITU. The ITU is an
organization of the United Nations, which sets standards for telecommunication issues under its
ITU-T branch. Typically, negotiations on a standard start at the TIA and then are moved to the
5 Each connecting consumer requires one port. Because consumers do not all connect at once, ISPs
typically required one port for every three or four consumers. The number of ports that a typical ISP
maintained at a given point-of-presence ranged from fifty to many thousands.
ITU. Negotiations may simultaneously continue at the TIA, as they did to some extent in this
example.
The ITU-T has both government and “sector” members. Sector members are typically
private firms. Currently, ITU-T has 189 member states and more than 650 sector members, 128
from the United States. The Department of State, the Department of Commerce and the FCC
represent the U.S. Government. A sector membership costs between $20,000 and $50,000
annually and, for U.S. companies, requires approval by the Department of State. All members
may participate in any working group, such as Study Group 16, which handled the 56K modems.
The negotiation is based on submissions, typically proposals for the potential standard, along with
documentation of technical characteristics and possibly performance data. The ITU requires a
consensus vote to approve a standard.
The ITU was holding meetings with industry participants as early as November 1996 and
claimed that it would announce a standard for 56K modems about two years after the introduction
of the modem. It is important to keep in mind several points when evaluating the progress of the
market during ITU negotiations. First, it is not clear how credible the ITU’s scheduling claims
were. Two years would be very quick relative to previous ITU decisions. Farrell (1996) reports
that similar organizations delivered standards in five years, on average. Second, the ITU had no
enforcement power in this case; it served only to create a focal point.6 In theory, if one
technology could emerge as the market standard, the ITU’s decision might not matter. Therefore,
it was crucial that all the major players chose to support and participate in creating the ITU’s
standard even when other specifications (their own) were available.
Our evidence below suggests that market participants did not believe it was realistic for
one of the pre-existing specifications to win in the market. Nevertheless, even two years was
6 With some technologies, the ITU can compel member governments to use approved technologies in
government contracts (but even this relies on the United Nations’ enforcement power). But in this industry,
the ITU has no enforcement power.
considered a long time in this industry, which may explain why the technology sponsors
competed as if they were trying to win a standards war, as opposed to waiting
for the ITU decision. Certainly, if the ITU decision dragged on for years, as it had with some
other standards, then competing vigorously was the only sensible strategy.
As it turned out, 56K modem sales to ISPs went very slowly relative to what the market
could have supported.7 Barely 50% of ISPs had adopted 56K by October 1997, with almost none of
the large ISPs (AOL, AT&T, UUNET, MSN, GTE, Bell-South, EarthLink) adopting. Although
there is some evidence that X2 sales were greater than Flex sales, most evidence suggests that
sales to consumers were relatively low (we present more evidence of this below). Rockwell and
U.S. Robotics felt that the source of these problems was the standards battle.
With strong industry support, the ITU announced the V.90 standard, an amalgam of X2
and Flex, in February 1998. At the time, this was regarded as the fastest decision the ITU had
ever reached (ITU Press and Public Information Service 1998). Although the V.90 was incompatible
with either of the previous two standards, sales were strong and there was widespread adoption by
both ISPs and consumers.
In summary, the events of this case appear to have all the elements of a canonical
standards war. There was an economic opportunity arising from a technical upgrade of modems,
and all parties believed this opportunity would be valuable for users and vendors. There was a
conflict between different implementations of that upgrade, but these implementations did not
appear to be technically or functionally different from each other. A publicly spirited SSO
became involved and promulgated a new specification as the standard, apparently
to the benefit of all parties and users. Nevertheless, as we argue in the subsequent section, several
7 A descriptive article on the ITU web site contains quotes from industry experts such as: “The market was
drying up. … people had stopped buying 56K modems;” and “A split was there for a short time” (ITU
Press and Public Information Service 1998).
questions and conundrums arise that the canonical model does not address. These issues are
important for understanding how market events affected negotiations and vice versa.
C. QUESTIONS AND CONUNDRUMS
In this section, we use the events surrounding the deployment of the 56K standard to illustrate
broad principles about standardization processes. Our analysis stresses why in this instance a
canonical model of the standards war is misleading or at least underspecified. We explore
different approaches for characterizing standardization processes and stress the role of
deployment. We develop quantitative and qualitative evidence about the interaction between
deployment and the standardization process at the ITU. Much of this is based on interviews with
market participants. Specifically, we discuss the following nine conundrums and questions:
1. Did ISPs have incentives to coordinate?
2. What incentives to coordinate did the modem makers have?
3. Why are focal points with 56K so costly?
4. How do intellectual property conflicts shape the costs of negotiation?
5. How do the voting structure and rules at the SSO shape the costs of coordination?
6. Why do SSOs not encourage the use of side payments?
7. Does standardization lead to technical improvement?
8. How do participants in standard-setting processes use all the available information?
9. Are SSOs substitutes for each other?
Our analysis of deployment shows why the product’s network features created incentives
to coordinate on a single standard, but local competition created great pressure to differentiate
across the technologies. In addition, we stress that it is not possible to understand the behavior of
market participants without understanding their asymmetric market positions and the negotiation
process. The interaction of these asymmetries and negotiations receives the most attention in our
study, especially as we identify and characterize different common viewpoints. We ultimately
argue that had positions been different, then behavior would also have been different, that is,
more or less urgent, and more or less inclined towards compromise.
Readers should keep in mind that our analysis is necessarily speculative, and the
methodology must rely on our interpretation of a relatively small number of interviews and
articles in the trade press. Most lessons are not “proven” in the sense of statistical analysis or
mathematical proof. We also identify places where questions are open because we cannot “test”
between differing claims and interpretations for what occurred in the 56K market. With that
caveat in mind, we turn to the results from our case study of the 56K modem market.
1. Did Internet Service Providers have incentives to coordinate?
The ISPs that adopted 56K modems before the V.90 was available made a choice
between one of two existing technologies. As in standard models of network effects,
they had an incentive to coordinate on the same technology as their rivals, which would raise
the possibility that they were using the technology that ultimately would become the market
standard. However, ISPs had a countervailing incentive. They could adopt the technology that
was less popular to take advantage of larger margins available in the admittedly smaller market.
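The trade-off can be made concrete with a deliberately stylized calculation. The functional forms and numbers below are our own illustration, not a model from the paper: suppose an ISP's expected margin rises with its standard's chance of becoming the national winner but falls with the number of local rivals on the same standard.

```python
# A purely hypothetical illustration of the countervailing incentives described
# above. The functional forms and numbers are ours, invented for exposition.
def expected_margin(rivals_same_standard, national_share):
    win_prob = national_share                # proxy: larger national share -> likelier winner
    margin = 1 / (1 + rivals_same_standard)  # more local rivals on your standard -> thinner margins
    return win_prob * margin

# A local market with 4 rivals on X2 (national share 0.577) and 1 on Flex (0.423):
print(expected_margin(4, 0.577))   # join the crowd on X2: 0.1154
print(expected_margin(1, 0.423))   # differentiate onto Flex: 0.2115
```

Under these invented parameters, the less popular standard offers the higher expected payoff in a crowded local market, which is the differentiation incentive at work.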
Our companion paper, AGR, explores this issue in detail.8 Here, however, we provide
some simple statistics suggesting that ISPs preferred differentiation to coordination. In other
words, we answer: No, ISPs did not have incentives to coordinate with local competitors.
Building on directories of ISPs, we construct a data set on adoption decisions in October 1997,
after the products were widely available but before it was clear the ITU would soon reach a
decision. For 2233 ISPs, we observe their adoption decision (X2, Flex, both, or neither) as well
as a list of telephone numbers that could be used to connect. Merging with a database on local
8 In this section, we analyze deployment of 56K modem technology as of October 1997 and summarize the
more extensive statistical work of AGR. Our primary purpose for showing this data is to emphasize the
geographic dispersion of deployment and decision making by ISPs in the United States, which were factors
in raising coordination costs.
telephone calling areas allows us to determine which consumers could call which ISPs. As
consumers almost always sign up with an ISP in their local calling area, we take local calling
areas as independent markets.
Several issues arose in construction of the data. First, we observe fewer than half of the
ISPs in existence, though the ones we miss tend to be small and probably would not have adopted
in any event. Second, we observe only a single decision for each firm, not what their decision
was in each location. Nevertheless, our understanding is that most firms actually did make a
single decision for all of their locations simultaneously. Third, some telephone switches may be
part of multiple local calling areas. In these circumstances, we arbitrarily assign switches to a
single local calling area. Detailed empirical models in AGR suggest that assignments do not
affect the results.9
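The data construction just described can be sketched in miniature. All ISP names, telephone prefixes, and choices below are invented for illustration; the actual data come from ISP directories merged with local calling-area records:

```python
# A toy sketch of the data construction described above: assign each ISP's
# dial-in numbers to local calling areas, then tally adoption decisions
# (X2, Flex, both, neither) by area. All names and prefixes are invented.
from collections import defaultdict

# telephone prefix -> local calling area; a switch belonging to several areas
# is arbitrarily assigned to one, as in the text
prefix_to_area = {"617-555": "Boston", "617-556": "Boston", "312-555": "Chicago"}

# ISP -> (adoption decision, dial-in prefixes); one decision per firm,
# applied to all of its locations, as the text describes
isps = {
    "AlphaNet": ("X2",      ["617-555"]),
    "BetaISP":  ("Flex",    ["617-556", "312-555"]),
    "GammaCo":  ("neither", ["312-555"]),
}

adoption_by_area = defaultdict(lambda: defaultdict(int))
for name, (choice, prefixes) in isps.items():
    for area in {prefix_to_area[p] for p in prefixes}:  # count each firm once per area
        adoption_by_area[area][choice] += 1

print(dict(adoption_by_area["Boston"]))   # {'X2': 1, 'Flex': 1}
```

Treating each local calling area as an independent market then amounts to analyzing each entry of `adoption_by_area` separately.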
In Table 1, we show the adoption rates in October. By this time, only about half of the
ISPs had deployed. Moreover, the vast majority of non-deploying ISPs were large, so the
percentage of customers served by 56K was much lower than a half. About 8% of adopters
actually adopted both technologies.
Our method creates 2,298 local calling areas. Local calling areas have relatively few
firms in each one. The average number of ISPs in a calling area is fifteen with a standard
deviation of 20.8. However, there are 738 calling areas with only one ISP and the median number
is only three. In Table 2 we show average adoption rates by local calling area. Again, there are
only a few adopters in each calling area. The average number of adopters in October 1997 is
about six ISPs per calling area. Flex leads X2 when tallied by ISP (as in Table 1), while X2 leads
Flex when tallied by locale (as in Table 2).
To discuss local interactions, our approach here is to compare the national adoption rate
with the adoption rate in each local calling area. If the rates are close to the same, it suggests that
9 Data came from the Directory, Boardwatch. See Augereau, Greenstein and Rysman (2004) for a more
detailed discussion of the data and a more detailed statistical analysis.
ISPs were differentiating from each other. If local markets are characterized by agglomeration on
one standard or the other, it suggests network effects were important.
We look only at ISPs that adopt X2 or Flex and ignore ISPs that adopt neither or both.
We look only at the 1,595 markets in which there are at least two such ISPs. Among such firms,
57.7% adopted X2. We are interested in calculating the number of markets in which adoption
approximated this rate of 57.7% and term these markets highly differentiated. As a point of
comparison, we compute what we would have expected if ISPs had made their decisions
independently, with a 57.7% chance of adopting X2 and a 42.3% chance of adopting Flex.
In Table 3, we report the percentage of markets for which the adoption rate falls within a
given window, where the window roughly brackets the average national adoption rate. For
instance, we see in the first row that in 17% of 1,595 markets, the portion of firms adopting X2
fell between 55% and 60%. There are 13,613 separate firm-market combinations. If each one of
these had adopted X2 with the probability 57.7%, we would have expected only 11% of markets
to fall within this 55-60% window. The results in rows 2 and 3 tell a similar story for larger
windows.
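The benchmark in Table 3 follows from a simple binomial calculation. The sketch below (in Python; it is ours, not part of the original analysis) computes, for a hypothetical market of a given size, the chance that the X2 share lands in a window when each firm adopts independently. The paper's 11% figure aggregates such probabilities over the actual distribution of market sizes, which we do not reproduce here.

```python
from math import comb

P = 0.577  # national X2 share among ISPs adopting exactly one of X2 or Flex

def prob_share_in_window(n, lo, hi, p=P):
    """Probability that, among n independent adopters, the fraction choosing
    X2 lands in [lo, hi], when each adopts X2 with probability p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1) if lo <= k / n <= hi)

# Hypothetical examples: a market of 9 ISPs can hit the 55-60% window
# (5 of 9 adopt X2), while a market of 2 ISPs never can (shares 0, .5, 1).
print(round(prob_share_in_window(9, 0.55, 0.60), 3))  # 0.258
print(prob_share_in_window(2, 0.55, 0.60))            # 0
```

Because many calling areas have only two or three adopters, shares near 57.7% are unreachable there, which pulls the independence benchmark well below the observed 17%.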
The results in Table 3 show that the number of differentiated markets is much higher than
would be expected if the firms were choosing independently. In other words, contrary to what one would expect, there is no geographic clustering at the local level. In AGR, we establish the statistical significance of this result and account for numerous possible complications: that ISPs make only a single choice across markets, that some switches are in multiple local calling plans, that firm characteristics and demographic variables have an impact, and that ISP decision making is possibly endogenous.
When we brought up in interviews our hypothesis that ISPs competitively differentiated from each other, we received mixed responses: some subjects found it believable while others found it implausible. We were struck that the interview subjects with a closer relationship to the smaller ISPs found it plausible, as we believe our result is mostly driven by the smallest ISPs. No subjects provided a convincing alternative explanation for these results.
2. What incentives to coordinate did the modem makers have?
There appear to have been ample incentives to coordinate, but for different reasons than
one might have expected from studying canonical models of standard setting. It is crucial to
understand what issues participants considered open and what issues they considered settled.
This case illustrates how participants can be both certain about some aspects of a standard and
uncertain about others.
In this instance, everyone had similar expectations about participation: market participants acted with the belief that an ITU standard eventually would emerge. The open questions were when and with what features; nobody could forecast with certainty which specific proposal would emerge. Similarly, the ITU's announcement of the V.90 in February 1998 was widely regarded as fast by historical norms. Given that this situation was confrontational and many others at the ITU were not, the announcement came sooner than the most optimistic forecast from when the process started two years earlier. Yet no one ever doubted that such an announcement would arrive eventually.
This raises the related question about what participants expected prior to the ITU
standard. Participants acted with the belief that there could be nothing more than a temporary de
facto standard arising from the market success of one specification or the other. In other words,
participants could not forecast how long the market process would continue and how it would
proceed, but nobody acted as if this was the only possible mode for standard setting. Nor did the
market process alone, or in conjunction with the TIA, provide an opportunity for standard setting
equivalent to that in the ITU process.
In this light, we can guess why both parties found it in their interest to cooperate with the
ITU process even though—after a short period of competition—the X2 standard seems to have
had an advantage over Flex in sales and deployment. First, we consider the camp formed around
the Flex specification, where the interpretation appears straightforward. The ISPs with server
equipment that aligned them with Flex were suffering in the market, and the Rockwell group
risked losing them as customers. Hence, the Rockwell group had a clear incentive to agree to a
standard that put it on better technical footing.
More surprising, U.S. Robotics never considered ignoring an ITU standard even though it
believed it was winning the standards war. The economic incentive for this stance is not
transparent in retrospect. Why agree to an ITU standard and allow the Rockwell group to begin
marketing substitutable products? Why abandon de facto standardization on X2 through market
processes if that provides a lead and adequate profitability?
We can catalogue several related reasons. The first reason was grounded in the history of
the market. The Flex group had the most established participants in the industry; Lucent, and
particularly Rockwell, were dominant in the previous technology. Despite X2's stronger market position, U.S. Robotics believed that it could not standardize the worldwide market without Rockwell's participation, at least not in a reasonable time. Similarly, Rockwell and Lucent could
not act unilaterally and push through their standard without consulting U.S. Robotics. A second reason, which strengthened the first, is that U.S. Robotics believed its advantage in the pre-standard
market could be maintained in the post-standard market. For instance, U.S. Robotics established
a shelf-space advantage in consumer modems, which it felt had lasting power. Together, these
points led U.S. Robotics to believe that the “market-growing” features of an ITU standard
outweighed the competitive impact of a public standard.10 A third reason is one of status. Because
Rockwell was historically dominant and U.S. Robotics was regarded as an upstart, agreement between the two represented a symbolic victory for U.S. Robotics because it attained status as a major equipment manufacturer. The ITU standard-setting process ratified this status.
These observations motivate an interesting counterfactual question about how the market process shaped the ITU standard. What would have happened if Flex had been dominant in the pre-standard market? Would market participants have treated the ITU standard as inevitable in that case? Given that Rockwell would have been dominant in both the 56K technology and previous technologies, it would seem to have been in a position to impose a proprietary standard. Our conjecture is that the Rockwell group still would have encouraged an ITU standard, but the open question is whether the standard's specification would have looked different. For example, would Rockwell's negotiators have taken a different stance in the face of IP held by others?
Such counterfactual questions are hard to resolve, by definition. It is especially difficult because the eventual standard was a combination of different specifications, a compromise among many. Would the combination have differed if the market positions had been different? There are generally two views on this hypothetical question, both of which we discuss in further detail below. One conjecture is that the standards process has its own momentum and largely ignores market position, because other issues, such as resolving conflicts over IP or the technical merits of a proposal, are paramount to the speed of the outcome and the type of specification that results. The other view is that market position informs the urgency of all parties and contributes to a firm's willingness to compromise in specific ways. Had the market positions differed, so too would behavior at the ITU, which might have affected the eventual speed of decision making and the chosen design.
10 Another explanation for coordination between the U.S. Robotics and Rockwell groups is that they were participating in some sort of repeated game. Note that many features of this market are not conducive to collusion. U.S. Robotics had not produced chipsets in the past, so this was at best the beginning of a repeated game. Moreover, as new standards appear only once every few years, there would be a long delay before the “punishment phase.” Finally, the next technology (broadband) was expected to be extremely different from 56K modems.
3. Why are focal points with 56K so costly?
In the canonical analysis of standards wars, focal points have a role in settling the
standards battle. As options that grab the attention of all the major players, they provide a
coordinating device when all parties need one. An emerging standard must combine a bundle of components that all interoperate. A standard is a public good, providing non-rivalrous
information about designs to any manufacturer, resulting in a set of goods that collectively work
better together than they would have in the absence of the public good.
In the canonical model of a standards war, it does not matter how focal points arise. They
are typically modeled as the outcome of a “sunspot” or a “public coin-flipping.” The crucial
feature of a focal point is its steadfastness after it emerges. Steadfastness arises from one of
several sources: strong and transparent economic incentives from key sponsors; a mandate from a
government agency; the presence of difficult-to-change investments by many interested parties;
and historical precedent that cannot be erased. In the standards canon, it is not essential whether
arriving at that point was costly or not, only that it is difficult to change once there.
On the surface, the deployment of 56K modems fits the canon, because it illustrates why
a focal point here had such benefits. Use of a focal point avoids the type of geographic
fragmentation that occurs when firms are diffusing competing technological specifications. It also
fits the canon in the sense that the ITU had precedent on its side and a promise to follow a
predictable process in the future. It had provided industry standards for successive generations of
this technology and, as we previously stated, industry participants believed that the ITU would do
so again using much the same decision-making process. In addition, buyers in other countries looked to ITU standards before purchasing equipment, so the ITU standard potentially served a gate-keeping function as well.
But the ITU is a much more costly mechanism for negotiating a focal point than the low-cost mechanisms typically found in theoretical models, such as sunspots or public coin-flipping. In addition to the membership costs previously detailed, participation requires sending delegations to meetings that take place throughout the United States and Canada, and (for other standards) throughout the world. Meetings require submissions with potentially expensive documentation of technical claims. The negotiations themselves have their own costs. All participants recalled the pain of the brokering associated with 56K. All sides involved lawyers, engineers, and marketing executives at many firms. Nobody called this easy.
An obvious reason why this process is so costly is that negotiating an agreement has
nontrivial explicit costs. An additional reason is that designing a new technology requires an
investment of research and development. A third reason may be some inefficiencies in the ITU
system relative to some optimal SSO, but we doubt this is important. Indeed, it can be costly to
choose between alternative approaches to a technical problem even when the disagreement is
entirely within a single firm. Note that a benefit and a potential reason for the costliness is that
once an agreement is reached, market players are less likely to revisit the standard-setting
process, which raises the likelihood of implementation of any given standard.
This last observation takes on more salience in light of our next few remarks about negotiations over IP. One of the major costs in the V.90 process involved working through all the parties' IP claims.
4. How do intellectual property conflicts shape the cost of negotiation?
In the most naïve models, a standards war is portrayed as solely a fight between producer and user surplus. That is, consumers lose when two proprietary implementations of a technical opportunity vie to capture the producer surplus and thereby delay deployment. In this scenario the SSO's only role is to represent the potential foregone surplus of users and to make vendors act less selfishly. In the case of 56K, the SSO's primary purpose was different.
The ITU serves as a forum for negotiations between parties who choose to participate.
This set normally includes the conflicting parties as well as others. If users show up to represent
their interests in the negotiations, then they have a voice too, but there is nothing about the
negotiations process that guarantees user interests will be central, or even present. Nor is there
any compelling law mandating a specific outcome from negotiations. Firm activity is voluntary.
Why do firms use this forum to negotiate? While there were many potential issues to
negotiate, the most worrisome in the case of 56K was that a proposed specification might infringe
upon IP held by several firms. Resolving IP issues was the primary activity performed during the
negotiations. No other factor was as crucial for achieving agreement on the specification of the
V.90. Accordingly, protection of IP appears to be the most prominent feature motivating
participation.
To a professional manager in communication equipment markets or a consultant familiar
with standardization cases, the importance of resolving IP issues is not surprising; however, it is
surprising how little attention this topic receives in the canonical framework. We suggest that
although there are some well-understood legal issues, there are few economic frameworks for analyzing the role of IP at SSOs. More generally, there is no framework for how IP shapes
negotiating costs.
One view of how IP shapes negotiation costs is that patents are simply bargaining chips
useful for achieving a desirable outcome from the SSO (such as delaying the adoption of the
standard). For instance, if one firm holds a patent necessary for solving the technological issue in
question, that firm is in a position to negotiate or delay to its advantage.
Of particular importance to our sources was the use of patents to influence standing in the
post-standard market. Formally, all firms would have to pay licensing fees to use the patents of
other firms covering the standard. But it was widely understood that firms that held patents over
the standard would cross-license their patents to each other, thereby ensuring free use of the
standard to patent holders. This feature meant that firms prioritized the inclusion of their patents
in the final standard.11
The second view emphasizes the procedural and cultural momentum that shaped the
negotiations. According to this view, business decisions are based on engineering choices, not on
the economic incentives of participating organizations. The principal goal is to walk out with the
best technical standard according to the evaluators’ engineering norms without regard for the
impact on private interests directly.
Under this view, negotiation within an SSO is very different from simple bilateral
negotiation between parties. For example, the debate over IP was not solely a legal debate, as it
might have been if IP lawyers negotiated a bilateral agreement outside the purview of the SSO.
Instead, because the debate occurred inside an SSO such as the ITU, it became subject to pre-
existing decision-making rules for including or excluding features of a standard. That is,
participants closely scrutinized the claims about the functional contribution of a technology
covered by a patent and vigorously debated the technical merits of proposals. The resolution
of these disputes was partially tempered by engineering norms of the participants at the ITU
subgroups.
Resolving disputes requires appreciation of the minute level of engineering detail and
legal nuance embedded in a patent. It is not possible to resolve issues by mechanical means or
nondiscretionary decision-making norms. These observations point toward the importance of
formal and informal rules at SSOs for resolving conflicting business interests or conflicting
technical claims. As a practical matter, SSOs cannot resolve such matters without a combination, or clash, of myriad views from firm participants, administrative staff, technical talent, and legal expertise.
11 For example, “Companies want a piece of their technology in the standard so that others will have to pay
a licensing fee for the use of the technology,” said an executive at 3Com (ITU Press and Public
Information Service, 1998).
We take a middle ground between these views, especially as design and negotiation occurred concurrently for 56K. On the one hand, the view of IP as a negotiating tool rings true because participants would have opposed a standard that did not include their IP. That IP was important in assuring patent-holders' positions in the post-standard market was clearly on the minds of many of the people involved. On the other hand, there is also more than just a grain of truth to the belief that agreement was dictated by the “best technology” rather than by strategic concerns. This is partly reflected in everyone's belief that a standard would eventually emerge from the ITU and in the degree of control held by engineers associated with the ITU.
In the case of 56K it is clear (in retrospect) that all firms approached negotiations over IP
issues with a sense of urgency about reaching an outcome and a sense of cooperation, or, at a
minimum, non-obstructionism to a point. Participants perceived that an ITU standard would help
virtually all parties, particularly if done sooner rather than later. As was previously noted,
multiple factors, including the market positions of the firms, contributed to those perceptions and,
hence, these choices. To be sure, the outcome was ultimately constrained by many of the
important technical details that shaped the precise specification, which inevitably resulted in
costly negotiations. Without a sense of urgency and cooperation, however, the negotiations would
have been even more costly, and would almost certainly have reached resolution at a later date.
5. How do the voting structure and rules at the SSO shape the costs of
coordination?
Voting structure
Assigning authority for dispute resolution is an important facet of negotiation costs; for
example, SSOs in general can resolve disputes via consensus voting or majority voting. We find
that market participants have thoughtful and sophisticated assessments of how particular SSOs
resolve disputes. Such assessments include views about where an SSO vests authority to resolve
disputes and what biases arise as a result of these assignments.
The ITU uses a consensus voting structure and requires nondiscriminatory licensing
practices. This structure is important for resolving IP disputes. The open question is whether the
specification of a standard is affected by these negotiation rules or whether the outcome would be
the same under any set of rules for resolving disputes—both in the case of 56K and in general.
There are two contrasting views about consensus voting, consistent with the two camps
we previously identified. One view— consistent with the first camp—stresses the strategic
behavior of participants. Firms want their IP in any given standard, and they try to have the
standard modified to include their patents. A consensus voting process gives them great leverage
to do so. This process might not create the best technology available, but it does create one that
all participants will approve. That is, patents may be included just to help the working group
achieve a consensus in favor of the proposed standard.
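The leverage created by consensus rules can be seen in a toy calculation (ours, not the paper's): under unanimity every participant holds a veto, so the chance a proposal passes falls off sharply relative to majority rule. The member count and approval probability below are hypothetical.

```python
from math import comb

def pass_prob(n, p, rule):
    """Probability an n-member group approves a proposal when each member
    independently favors it with probability p, under a given voting rule."""
    if rule == "consensus":          # unanimity: any single member can veto
        return p ** n
    if rule == "majority":           # strict majority of n votes
        return sum(comb(n, k) * p**k * (1 - p)**(n - k)
                   for k in range(n // 2 + 1, n + 1))
    raise ValueError(rule)

# With 10 members, each 80% likely to favor the proposal (hypothetical numbers):
print(round(pass_prob(10, 0.8, "consensus"), 3))  # 0.107
print(round(pass_prob(10, 0.8, "majority"), 3))   # 0.967
```

The gap illustrates why, under consensus, sponsors may sweeten a proposal (for instance, by including a holdout's patents) to secure the last few votes.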
The alternative view—consistent with the second camp—is that technical merit plays an
important role in determining inclusion. Through discussion, it is possible to exclude
“unimportant” technologies that degrade the functioning of the standard. It is through such a
process of review and open debate that a superior hybrid technology emerges. In this view,
consensus voting ensures that all participants are heard and ensures that the ITU considers all
known options.
Majority voting can have very different implications. In a process based on majority
voting, there is much less scope for firms to ensure the consideration of their technology, for
better or for worse. We cannot make a blanket statement about the efficiency of majority voting
over consensus voting. One of our contacts works with both the IEEE (i.e., Institute of Electrical
and Electronics Engineers) and the ITU. The IEEE is based on majority voting, and he reports that outcomes are easily manipulated. When a vote arises that is important to a particular firm, that firm will send a large number of people (e.g., twenty) to the meeting. Most of these
attendees do not know what is going on in the meeting, but a group leader signals to them how to
vote. One may wonder at the efficiency implications of such a system.
The canonical approach to analyzing a standards war treats the choices as fixed. There is
no general framework for thinking about a negotiated specification at an SSO. Hence, the canon
does not provide much guidance beyond the conventional wisdom, namely, that consensus
procedures lead to better technologies whereas majority voting leads to quicker agreements.
Verifying this convention requires evidence about a wide cross-section of cases well beyond the
case of 56K and the scope of this article.13
Rules
A contrast of views pervades the debate about the comparative relevance of the ITU’s
requirement that participants agree to license their related patents at a “fair and reasonable rate.”
Some believe this rule works as intended, whereas others focus on how this rule raises
coordination costs. In particular, the ITU requires that any participant holding a patent that may
affect a proposed standard must disclose the patent. The participant must also agree to license that
patent at a “fair and reasonable rate” and do so in a manner that is nondiscriminatory. This does
not imply that licensing is cheap, nor does it mean that patented technology will become widely
available at some price. If firms deviate from some consensus view of what constitutes a reasonable price, they can be sanctioned in other SSO actions or, at worst, taken to court for violating the rules of participation.
Lemley (2002) provides an excellent discussion of the various legal issues that arise from
this type of requirement and the interaction of SSOs and IP more generally. Lemley stresses the
importance of handling IP for the success of an SSO, and one of his central policy
recommendations is that SSOs develop clear statements that are similar to the one at the ITU.
13 Indeed, the case of 56K even seems to mildly defy this wisdom, as it was widely regarded as both
beneficial and comparatively fast.
The ITU requirement is there to ensure that the ITU is not unknowingly making
proprietary technologies into international standards and that a standard can be implemented
easily after the ITU has endorsed it. The goal of the ITU requirement is that any firm can make
use of the standard whether or not that firm participated in the standards process.
One may question whether this rule accomplishes its stated goal. If the ITU requirement
operated as intended, licensing patents associated with a standard would be straightforward.
However, if that were the case, firms would not be willing to expend so many resources ensuring
that they can cross-license relevant patents after the standard is promulgated.
Assuming that it is difficult to make a standard without infringing on someone's IP, at least in part, we conjecture that there are two possible behavioral norms for any negotiating session. One is “unaggressive”: all firms deliberately avoid staking claims over their own IP, volunteer their IP without fuss, and compromise specifications emerge quickly, even when they come close to violating someone's patents. The other is “aggressive”: all firms attempt to include their patented technologies in an eventual compromise specification, claim broad importance for them, and achieve a cross-licensing deal to their benefit.
We conjecture that the first norm, unaggressive behavior, cannot survive in the presence
of at least one firm acting according to the second norm, aggressive behavior. That is, if one firm
tries to include a patented technology and make broad claims about it, then it is in the interest of
all firms to do the same. In anticipation of that outcome, it is in every firm’s incentive to try to get
any advantage they can from making their patented claims earliest.
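The unraveling logic above can be sketched as a one-shot game with hypothetical payoffs (the numbers are invented for illustration, not estimates from the case):

```python
# A minimal sketch of the conjecture: with these assumed payoffs, "aggressive"
# is a dominant strategy, so an all-unaggressive norm cannot survive one deviator.
# Keys: (firm 1 strategy, firm 2 strategy); values: (payoff to 1, payoff to 2).
# Assumed ordering: quick agreement benefits both (3, 3); a lone aggressive firm
# gets its IP into the standard at the other's expense (4, 1); mutual aggression
# raises negotiating costs for everyone (2, 2).
payoffs = {
    ("unaggressive", "unaggressive"): (3, 3),
    ("unaggressive", "aggressive"):   (1, 4),
    ("aggressive",   "unaggressive"): (4, 1),
    ("aggressive",   "aggressive"):   (2, 2),
}

def best_response(opponent_strategy):
    """Firm 1's payoff-maximizing reply to a fixed opponent strategy."""
    return max(["unaggressive", "aggressive"],
               key=lambda s: payoffs[(s, opponent_strategy)][0])

print(best_response("unaggressive"))  # aggressive: deviating from the norm pays
print(best_response("aggressive"))    # aggressive: so all firms end up aggressive
```

Under these assumptions the game has the structure of a prisoner's dilemma, which is consistent with the observation that all participants staked IP claims early.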
The ITU rule can be interpreted as an attempt to promote the first norm, where all
volunteer the patents freely and without fuss, thereby lowering negotiating costs for everyone. If
the second behavioral norm holds, however, the negotiation costs are likely to be high whether or
not the rule is present. That is, negotiation costs are high in situations where there are many
conflicting claims over IP. In such a case, it is unclear whether the rule about licensing alters
behavior or even helps.
In the 56K standard negotiations, participants placed an emphasis on getting their IP
included in the standard. On the surface, this appears to be aggressive behavior. For reasons we
next explain, we conjecture that this would have arisen under virtually any set of consensus rules
and licensing norms. And it made negotiating costs high.
6. Why do SSOs not encourage use of side payments?
In any basic economic model of negotiations, the objective of negotiations is to identify
the set of common solutions that yield net benefits to all parties. It is a common property of such
models that all parties can use side payments to enlarge the set of possible outcomes that leave all
parties better off.
The negotiations at SSOs, in contrast, typically do not include side payments, and the
negotiations at the ITU for 56K modems followed the SSO convention. Why SSOs follow this
convention is puzzling, since such a habit drives up costs.
First, we examine why costs are raised by the absence of side payments. Consider one
naïve model of the negotiating process—the joint-surplus maximizing model—which, if side
payments were present, would correctly describe negotiating behavior. This model requires side-
payments for an agreement to arise in any setting where participants have very asymmetric assets.
In such a model, participants in a standard-setting process always choose the technology that
maximizes joint participant surplus. The SSO could simply use side payments to compensate
participants who would lose relative to some alternative technology. Firms with inferior
technology could be paid to vote with the best technology.
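The joint-surplus logic can be made concrete with a small numerical sketch (the firms, technologies, and payoffs are invented for illustration):

```python
# Illustrative only: hypothetical payoffs to three firms under two candidate
# technologies. With side payments, the SSO picks the joint-surplus maximizer
# and transfers compensate firms that lose relative to the alternative.
payoffs = {
    "tech_A": {"firm1": 10, "firm2": 1, "firm3": 1},   # joint surplus 12
    "tech_B": {"firm1": 2,  "firm2": 4, "firm3": 4},   # joint surplus 10
}

best = max(payoffs, key=lambda t: sum(payoffs[t].values()))
other = "tech_B" if best == "tech_A" else "tech_A"

# Side payments: each firm that does worse under `best` than under `other`
# is compensated for the shortfall by the winners.
transfers = {f: max(0, payoffs[other][f] - payoffs[best][f])
             for f in payoffs[best]}

print(best)       # tech_A
print(transfers)  # {'firm1': 0, 'firm2': 3, 'firm3': 3}
```

Here firm1's gain from tech_A (8) covers the required transfers (6), so everyone can be left at least as well off as under tech_B; this is the enlargement of the agreement set that side payments provide.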
Second, we examine the puzzle of why SSOs do not use side payments, and find that the
conundrum is more complex when we highlight the relationship between negotiations and
deployment. If side-payments solutions were observed often, then it would not be so important
for firms to place their IP in the standard. In practice, it seems that the major form of payment for
a vote is to include the voter’s IP in the standard, which brings the benefit of allowing the firm to
participate in the post-standard market. This is obviously a crude method of payment and its use
is puzzling in comparison to side payments, which are much more efficient.
The following observation highlights the relevance of the absence of side payments. Agreement can
be difficult when a firm has relevant IP but does not plan on participating in the post-standard
market. In these cases, the IP holder expects licensing payments, which makes the rest of the
participants wary of ratifying a particular standard. Our interview subjects noted that a key to the
quick agreement on the V.90 was that all the participants who had relevant IP also were
producers in the post-standard market. All market players anticipated participating in market
processes after the standard was announced and were willing to cross-license their patents,
allowing for production without licensing fees.
We conjecture that the absence of side-payments here arises for many of the same
reasons contracting breaks down between private parties in the face of uncertainty. When the
economic value of agreement depends on the resolution of some uncertainty in the future—such
as the level of demand—the contract must specify how that future state will be measured and how
payoffs between parties relate to that measure. Such state-contingent contracts between bilateral
settings are particularly hard to forge when there are different views about the likely value of
future events or when discussions about contracts reveal too much about a party’s competitive
position and strategic plans for the future. It is also hard to enforce such contracts if events cannot
be measured in a verifiable manner beyond opportunistic reinterpretation. We conjecture that in a
multilateral setting, such as standard-setting negotiations at an SSO, such factors greatly interfere
with the emergence of written state-contingent contracts.
The absence of written contracts specifying how parties will benefit or lose in the event
of certain outcomes does not eliminate the need for some sort of mechanism for paying off parties
for resolving their differences. In the absence of a written agreement, we conjecture that parties
favor economic payoffs that are contingent on deployment and market success, where each
party’s market success is a trade secret, by default, and not subject to reporting biases or other
legal disputes about enforcement.
Consistent with our remarks above, there are two views about the relevance of these
issues for the case of 56K. One view highlights the technical constraints placed on the outcome
and, accordingly, diminishes the importance of side payment considerations. Another view, and
the one to which we are sympathetic, highlights the sense of urgency and cooperation with which
parties approached the negotiations as they deployed infrastructure into the marketplace. In that light, firms only reaped the benefits from agreement by accelerating the deployment of 56K and making additional sales. In that sense, the lack of side payments heightened incentives to reach agreement and thus start selling.
7. Does standardization lead to technical improvement?
A standards war determined in the market typically leads to one proprietary technology
becoming the standard or to no standardization at all. This simple observation underlies a
seeming advantage for SSOs, namely, that they have a greater set of options than a market
process. The 56K modem case illustrates the issue concretely. The ITU could (1) endorse one
party’s specification as standard without change; (2) endorse no specification from any party; (3)
endorse a specification that combines elements of standards presently proposed; and (4) endorse a
specification that combines elements of present proposals, but add additional elements to make
the resulting compromise palatable to all relevant parties. On this basis one might naively
conclude that because, unlike markets, SSOs have options 3 and 4 available, they are superior to
markets. That is, an SSO may take the best of several proprietary technologies and create a
technology superior to any that an individual firm or consortium would have created on its own.
Such a view is naïve because it ignores the negotiation process for reaching a focal point.
In this case, even if all parties desire a standard, the consensus system at the ITU essentially
excludes options 1 and 2. That is, these first two options were extremely unlikely even given both
sides’ interest in achieving a standard. Hence, determining standards in an SSO rarely involves a
pure expansion of options. Instead, it biases the outcome toward a different type of choice. Is it a
better or worse choice? Once again, the canonical framework for a standards war does not
consider the trade-off, so we have little prior literature to guide our understanding.
How should one think about the potential costs and advantages of combining
technologies? There are two key costs. One involves the short-run cost of designing a standard
for the issues under consideration. The second, more subtle, cost is that of designing a standard in
anticipation of what is likely to occur in the near future, as new technical opportunities for
upgrades arise. Events in 56K illustrate each of these.
First, there are the costs of simply writing a standard. We were initially surprised that
56K modems did not undergo an enormous improvement at the ITU. While there is some limited
evidence that both technologies for 56K modems were improving after their introduction, we
came across no declarations in the public press that the V.90 was a noticeable technical
improvement over X2 and Flex. Our evidence is admittedly weak, in that we have no evidence that
the V.90 was not better, but we are struck by the lack of public discussion of any improvements in it.
Of course, this lesson applies only to 56K modems and does not necessarily extrapolate to other technologies.
The second lesson is more transparently illustrated by events here. The V.90 was not the
last standard for 56K modems to come from the ITU. There were further upgrades with the V.91
and V.92, which clearly were superior to their predecessor by objective engineering norms—and
were widely acknowledged as such. Hence, even if one is unsure about the improvement
embedded in the first standard, there seems little dispute that the first agreement created a unified
base specification on which to build further improvement.
This gives rise to counterfactual questions about what would happen in the absence of
agreement or in the presence of a longer delay or a different type of agreement. Would such
upgrades have occurred as quickly if a proprietary technology had been the choice for the V.90?
Similarly, in the absence of an ITU standard, would de facto market standards advance more
quickly, less quickly or at comparable rates? If standards are negotiated by consensus among
firms, is it more efficient to have the same partners negotiate with each other? We conjecture that
familiarity lowers negotiation costs because participants are familiar with each other’s business
concerns, IP holdings, and market positions, as well as other factors that shape the costs of
negotiations. As noted, the canon does not provide a framework for considering these open
questions, either in the case of 56K modems or in general.
8. How do participants in a standard-setting process use all available information?
Models of negotiation tend to emphasize that disputes arise from the asymmetric
positions of the parties and from private information strategically kept from each other. While these
factors might have been relevant to some parts of the negotiation in the case of 56K, the issues
associated with making decisions in the face of market uncertainty and conjectures about the
future direction of technology were much more pressing. Participants based their decisions and
actions on the best available information, but, despite that, consensus forecasts about the future
sometimes turned out to be wrong. Said another way, it is easy to model negotiations as if no
uncertainty is present, but doing so is naïve and potentially a misleading way to understand the
biases inherent in using SSOs to resolve standards wars. It is easy to look back on events with
perfect hindsight, or with information about how market trends worked out, but doing so runs the
risk of being historically inaccurate.
For 56K modems, part of the impetus for reaching an agreement so quickly stemmed
from the belief that 56K modems would be quickly eclipsed by broadband technologies such as
digital subscriber lines and cable modems. That is, many participants believed that both the technical value
of upgrading dial-up modems and the market opportunity for deploying 56K modems as a
business would be short-lived. A lengthy ITU process would risk missing the height of the market
opportunity. Of course, within a few years it was obvious that this consensus forecast about the
speed of diffusion for the replacement technology was wildly overoptimistic.14
In light of the dot-com bust and other overoptimistic forecasts about the rise of the
Internet, one may forgive a forecasting mistake such as this. The preconception about broadband
seems to have been held by every market observer. Yet, as of mid-2004, broadband technologies
were only beginning to displace the 56K modem in personal computer communications. As Gandal,
Gantman, and Genesove (2004) point out, even that understates the staying power of 56K. ITU
standards for 56K modems are still the dominant interface for many technologies, such as fax
machines and cellular telephones.
More to the point, this case illustrates how forums such as the ITU can allow
misconceptions to shape outcomes in ways that might not occur in market processes. Markets
would arrive at an outcome on the basis of firms’ strategies, whether or not those strategies were
independently determined or reflected a consensus about the future. In contrast, SSOs magnify
the error that arises from a wrong consensus.
This observation complicates comparisons of SSOs to markets. This case illustrates
precisely why it is difficult to make blanket assertions. All participants thought the window for
the 56K market would be short and, therefore, negotiated with a sense of urgency. This urgency
was important in coming to resolution in the face of so many costly negotiating obstacles. In this
sense, the mistaken forecast about the near future contributed to reaching resolution, something
that might not have occurred if no SSO existed.
We now turn to another example of the role of these forums in settling disagreements
under uncertainty. Specifically, even in 2004, we repeatedly encountered the observation that
market processes in 1997 were difficult to document and that market information was inherently
ambiguous. Even in retrospect, some factors remain in dispute. Some participants continue to
14 This was a source of great amusement for some of the participants we interviewed.
issue charges about vaporware (mostly about Flex) and express skepticism about publicly stated
commitments. While each firm could track its own sales of modems, only one trade magazine,
Boardwatch, published something resembling a survey of use among service providers. Different
associations provided their members with different viewpoints about actions.15 In summary,
decision making necessarily took place amid interpretive confusion built upon factual
ambiguity.
In this light, one view of SSOs is as forums offering an opportunity for firms to compare
their views, share information, and reduce ambiguity. This information aggregation can be about
more than just the technical merits of various approaches to a given problem. It can be about the
nature of demand and the reconciliation of alternative visions of the path along which the
marketplace will develop.
Relatedly, and more understandably, there is disagreement and inherent ambiguity about
the consequences of paths not taken. We have found former participants expressing different
opinions about what an interim agreement at the TIA might have looked like in the absence of
compromise at the ITU. Moreover, we previously highlighted how fragmented the market
experience was across the United States, so it is no surprise that, even in retrospect, participants
provide distinct assessments of whether sales were strong or weak prior to the agreement at
the ITU. They also provide different views on whether sales would have continued to be strong
or weak had the emergence of the standard been delayed.16 We speculate that some of these
differences are consistent with previously stated positions, and some are simply attempts to save face. To
our ears, these disputes will never be resolved.
On the surface, a lack of resolution on such matters is not, per se, of much interest to
anyone, with the exception of a market historian or a participant with a stake in how history gets
15 This can be seen, for example, in the wide array of sources quoted by Shapiro and Varian (1999).
16 For example, contrast the following press release in the ITU Press and Public Information Service’s 1998
Plenipotentiary Conference (see www.itu.int/newsarchive/press/PP98/PressRel-Features/Features3.html).
told. But it is interesting for this study because it highlights a trade-off between market processes
and negotiated forums. In de facto market standardization processes, such unresolved
disagreements are not relevant except insofar as they shape firm strategies that affect market
outcomes. In negotiations, however, such disagreements can play a role in shaping consensus
outcomes. Hence, in the face of market uncertainty, we perceive a role for such forums in
aggregating fragmented information among multiple parties, and we also perceive the possibility
that such forums can allow misconceptions to shape outcomes in ways that would not occur in a
market process.
9. Are SSOs substitutes for each other?
Do SSOs compete with one another for jurisdiction? One might view SSOs as arbitrators
that compete to have disagreements brought under their purview. In that sense, SSOs choose their
structure to attract the most “disputes.” For instance, Besen and Farrell (1991) report that the
ITU was losing importance relative to regional private SSOs, such as the IEEE, and that it responded
by dropping the requirement that countries vote before a standard can be approved. Since
countries vote only once every four years, bypassing this requirement allowed the ITU to
promulgate standards more quickly.
In that light, we can reinterpret some of the events of the 56K modem war, starting with
the question of why the Rockwell and U.S. Robotics groups chose to bring their
dispute to the ITU. Earlier we asked why they came to the ITU instead of allowing market
processes to carry on. In this section we ask why they chose the ITU instead of the IEEE or some
other SSO.
We believe this question arises because the canonical model has an incomplete view of the
negotiation process. To illustrate, we offer a few potential answers for
why the ITU, and no other forum, served as the SSO. First, as was previously noted, there was
precedent. The ITU was the source of all previous modem standards and so had both expertise
and infrastructure in its favor.17 In addition, there were structural advantages at the ITU for
modems. The ITU has international jurisdiction, and an ITU standard meant that producers
could immediately begin producing for all areas of the world. Although, because of U.S.
influence on technology, the IEEE has a de facto jurisdiction extending beyond the United States, our
sources say that the internationalism of the ITU was perceived as an advantage.
The ITU also was better able to negotiate the regulatory requirements. An FCC cap on
the modulation within phone wires limited the new modems to 56K. In fact, speeds greater than
56K would have been achievable in some foreign countries, but those countries were willing to
accept this limit in order to achieve an international standard. Presumably, coordinating these issues
was easier when done through the ITU, which has a long-standing relationship with both the FCC
and international telecommunication regulators, than through the IEEE.
Finally, the status of these institutions in business culture played a role in why the ITU
was chosen. Rockwell had a history in defense contracting and was considered an establishment
firm. U.S. Robotics was closely associated with ISPs and was considered an upstart. The ITU is
the most established SSO in the world. One source claimed the ITU’s “establishment
credentials” made it an acceptable venue for Rockwell to negotiate with this new competitor. We
find it difficult to translate these ideas about credentials into modern economic language, but
found them provocative nonetheless. To say the least, this notion is not part of the canonical
model of forum shopping during standards wars.
These reasons for using the ITU rather than another SSO slightly alter our earlier
interpretation of the coordinating advantages of using a negotiated forum. That is, not only
will using an SSO help coordinate actors who have different interests or who face different
17 This of course raises the question of why modems ever were brought to the ITU in the first place many
years earlier, but we leave that aside.
geographically independent competitive situations, but it will also help coordinate geographically
distinct markets around the globe.
A theoretical model in which SSOs choose their voting rules or other such characteristics
and compete for market share in the standards market would be exceptionally interesting (see
Lerner and Tirole 2004 for a start), but our research points out important constraints on such
models. There exist important asymmetries between existing SSOs in their ability to coordinate
otherwise fragmented market actors. Moreover, since there are multiple sources of fragmentation
in need of potentially different types of coordination devices, no single forum will be superior for
all situations. We conjecture that different forums will possess different comparative advantages,
and these advantages cannot be shaped without constraint, nor are these advantages free of an
SSO’s history and long-standing formal and informal norms for resolving disputes.
D. CONCLUSION
We provide detailed analysis of a standards war and the costs of coordinating a solution
to it. On the surface, this case has the three key elements found in a canonical standards war: an
economic opportunity arising from a technical upgrade, a conflict between different
implementations of that upgrade, and a resolution to the conflict, this time through the
involvement of a publicly spirited SSO. Taking advantage of the detailed information available,
we focus on the earliest period of deployment and analyze the interaction of market participants
with the behavior of the SSO. Our main point is that this canonical model is misleading or
incomplete with regard to the costs of coordination.
Incompleteness arose in several general areas. There is a large difference between a
situation in which a regulatory agency intervenes and one in which firms voluntarily negotiate
with an SSO. Yet, most previous cases of standards wars involve the FCC, a government agency
that can mandate standards. Because regulatory concerns are paramount in understanding the
activities of the FCC, the literature on such standards wars provides a set of insights that simply
does not carry over to cases in which an SSO is involved. The costs of negotiating in an SSO are
shaped by a very different set of determinants.
In this case the ITU had no power to mandate a standard. The ITU can issue a
specification, which can then act as a focal point. This specification is negotiated and need not
correspond directly to any specification already for sale. For understanding this
outcome, it was more essential to understand that the ITU has its own idiosyncratic set of rules
and precedents. While different from the concerns of regulators, these rules and procedures
give momentum to events and push them in directions that might overlap with, or be orthogonal
to, the concerns of users or of those with economic interests. Moreover, these activities involve
individuals with long-standing professional relationships with each other and with the SSO,
factors that also shape the negotiations and outcomes.
The canon is also incomplete in its analysis of the subtle ways in which the costs of
coordination vary with firm behavior and market circumstances. Participation is voluntary on
some levels and not others. One can see this nuance in three ways. First, all firms in this
marketplace were members of this organization. It was inevitable that they would confront each
other’s claims over IP and marketing goals. Moreover, an ITU standard was necessary for serving
international markets. So, no matter how the market progressed, it was necessary for each
firm to consider its negotiating position and come to these meetings with a position, whether it
was strategic or not. This is not a mandated standard in the sense of a regulatory body mandating
involvement of all interested parties and compelling adoption through legal means. Yet, there is a
sense in which the situation compelled participation and managerial attention of all interested
parties, and the focal point compelled use. We know of no model in the canon that properly
captures how economic incentives led to this outcome.
Second, all firms took for granted that an ITU standard was inevitable, though many were
uncertain about what it would look like and when it would emerge. A firm’s market position
then shaped its negotiating position. For many firms, such a standard was valuable for
their marketing purposes, and their marketing opportunity had a short window. Those perceptions
of the marketing opportunity informed participatory behavior, making some parties less
obstructionist than they might have been under different market circumstances. They also made
others take a more urgent stance and pressed them to compromise sooner rather than later. Had
market positions been different, negotiations would also have been different: more or less urgent,
and more or less inclined toward compromise.
Moreover, market-oriented events help crystallize forecasts. They also show where the
market opportunity will move, and thus help all parties be more foresighted about which IP is
relevant for cross-licensing purposes and which is not, and about which factors are relevant for the
post-standard market opportunity, a key factor in reaching a compromise. We find it useful to
describe this behavior as asymmetric negotiating positions brought about by asymmetric market
positions. Again, we know of no model in the canon that captures these features.
Third, once the standards process gets started, the inevitability of the focal point becomes
a potential factor in market events. There is a strong possibility that a standards war ending
with yet another specification simply adds more uncertainty to the marketplace. The uncertainties
encompass significant outcomes, such as the speed of announcement, the nuances of bargaining
positions, and the inevitability of a final specification. Even without this process, there were concerns
among service providers that their investments would be orphaned. With this process reaching a
likely outcome, these investments became contingent on the outcome. For example, it is striking
that market participants, knowing the history of this ITU committee, did not forecast that the
process would resolve itself quickly. Yet once it became apparent that the ITU committee
might defy its own history, it was in all the parties’ interest to wait just a few months more.
Yet, once again, no model in the canon places emphasis on how the management of the
negotiating process at the SSO feeds back into market events.
From the perspective of the economic canon, our close study of the details of events here
suggests that the model of the standards war needs modification in several important respects. We
conclude that the canon needs to address several open questions: What circumstances lead all
firms to be compelled to participate in a voluntary standard-setting process and when do
circumstances not do so? What factors shape negotiating positions, which can range from being
obstructionist to urging compromise? Under what circumstances can the standard-setting process
feed back into market events, either slowing the market down by sowing
uncertainty or speeding it up by ending concerns about orphaning? Such questions are essential
for analyzing the costs of coordination in full, and for understanding the
extent of the public benefits that might arise from a delay in the emergence of a standard.
From the perspective of policy towards SSOs, our study details important costs in the
process for 56K modems. There are explicit costs, such as membership and negotiation costs, and
implicit costs, such as a procedure that leads to a suboptimal technology or an inefficient handling
of IP rights. However, this analysis should not be seen as a criticism of the ITU or SSOs in
general, and particularly not in this instance. Indeed, for the ITU to provide a resolution to a
difficult standards problem within eighteen months seems a remarkable accomplishment.
Compared to the alternatives of regulation or pure market processes, SSOs may often be a
superior coordination mechanism. Moving from a situation where knowledge and technology are
dispersed among independent firms to one in which the market is coordinated on a single
standard has inevitable costs. Our paper merely details what might be thought of as the “true
costs” of coordinating through an SSO in what surely was one of the better circumstances. One
can only imagine these costs in circumstances where the outcomes were not as beneficial to so
many parties.
We offer these questions with a few caveats in mind. Our conclusions and observations
depended on getting accurate information from participants gracious enough to speak with
researchers. We have focused the study primarily on the period prior to the issuance of the V.90
standard. It is clear that events did not suddenly stop after this. The market grew and lasted longer
than many participants expected. The ITU also upgraded the 56K modem standard several more
times. A full appreciation of these later events might generate different insights about what really
turned out to matter for later outcomes. Also, and relatedly, we have largely eschewed
welfare analysis in favor of identifying and characterizing the nuances of firm behavior. We
identified trade-offs between different types of rules in an SSO and between different types of
firm strategies in their negotiating position, but we did not fully develop these observations. A
fully specified model would be required to analyze all welfare trade-offs, and we do not attempt
to make such an assessment here.
References.
Augereau, A., S. Greenstein and M. Rysman. 2004. Coordination vs. Differentiation in a
Standards War: 56K Modems, Mimeo, Boston University.
Besen, S. and J. Farrell. 1991. The Role of the ITU in Standardization: Pre-eminence,
Impotence or Rubber Stamp? Telecommunications Policy 15(4), pp. 311–21.
Besen, S. and L. Johnson. 1986. Compatibility Standards, Competition and Innovation in the
Broadcasting Industry. Santa Monica, CA: Rand Corporation.
Downes, T. and S. Greenstein. 1999. Do Commercial ISP’s Provide Universal Service? In
Competition, Regulation and Convergence: Current Trends in Telecommunications
Policy Research, eds. S. Gillett and I. Vogelsang, 195–212. Mahwah, N.J.: Lawrence
Erlbaum Associates.
Dranove, D. and N. Gandal. 2003. The DVD vs. DIVX Standard War: Empirical Evidence of
Network Effects and Preannouncement Effects, Journal of Economics & Management
Strategy 12(3): 363–86.
Ellison, G. and E. Glaeser. 1997. Geographic Concentration in U.S. Manufacturing
Industries: A Dartboard Approach, Journal of Political Economy 105(5): 889–927.
Farrell, J. 1996. Choosing Rules for Formal Standardization, Mimeo, University of
California, Berkeley, CA.
——— and C. Shapiro. 1992. Standard Setting in High-Definition Television, Brookings
Papers on Economic Activity: Microeconomics, 1–77.
Gandal, N., N. Gantman, and D. Genesove. 2004. Intellectual Property, Standardization
Committees and Market Competition, Mimeo, Tel Aviv University.
ITU Press and Public Information Service, 1998, From Competition to Cooperation: The
Road to E-Commerce, ITU Plenipotentiary Conference, October 12 – November 6,
www.itu.int/newsarchive/press/PP98/PressRel-Features/Features3.html (accessed
5/5/2004)
Lemley, M. 2002. Intellectual Property Rights and Standard-Setting Organizations,
California Law Review, 90, 1889.
Rickard, J. 1997a. 56K Modems: The Battle Continues, Boardwatch, March.
———. 1997b. U.S. Robotics Launches the New Battle: 56Kbps Modems,
Boardwatch, January.
———. 1998. The 56K Modem Battle, Boardwatch, March.
Rysman, M. and S. Greenstein. Forthcoming. Testing for Agglomeration and Dispersion,
Economics Letters.
Shapiro, C. and H. Varian. 1999. Information Rules: A Strategic Guide to the Network
Economy, Cambridge, MA: Harvard Business School Press.
Stango, V. 2004. The Economics of Standards Wars, Review of Network Economics 3(1):
1–19.
Von Burg, U. 2001. The Triumph of Ethernet, Technological Communities and the Battle for
the LAN Standard, Stanford University Press.
... Although this coordination can be articulated in many different ways, Standard Setting Organizations (SSOs) are one of the most common arrangements. 1 Farrell and Saloner (1988) studies how standards are decided, and it shows that, compared to a de facto standard setting model, SSOs constitute a superior mechanism because although consensus may take longer to be reached, it tends to lead to fewer errors. 2 1 Under the umbrella term SSO we include informal industry consortia as well as more formal standards development organizations. 2 Greenstein and Rysman (2007) illustrates the positive role that cooperation can play in standard ...
... 1 Farrell and Saloner (1988) studies how standards are decided, and it shows that, compared to a de facto standard setting model, SSOs constitute a superior mechanism because although consensus may take longer to be reached, it tends to lead to fewer errors. 2 1 Under the umbrella term SSO we include informal industry consortia as well as more formal standards development organizations. 2 Greenstein and Rysman (2007) illustrates the positive role that cooperation can play in standard ...
Article
This paper studies the effects of a Standard Setting Organization (SSO) imposing a licensing cap for patents incorporated into a standard. In particular, we evaluate the "Incremental Value" rule as a way to reward firms that contribute technology to a standard. This rule has been proposed as a means of avoiding patent hold-up of licensing firms by granting patent holders compensation equal to the value that their technology contributes to the standard on an ex-ante basis, as compared to the next best alternative. Our analysis shows that even in contexts where this rule is efficient from an ex-post point of view, it induces important distortions in the decisions of firms to innovate and participate in the SSO. Specifically, firms being rewarded according to this rule will inefficiently decide not to join the SSO, under the expectation that their technology becomes ex-post essential at which point they may negotiate larger payments from the SSO.
... While, for instance, the ITU only spent around two years in helping the market settle the v.90 standard, such an approval was widely regarded as 'fast' in historical terms. 286 The timing issue, coupled with uncertainty about the ultimate choice of the ITU, would arguably affect firms' strategies, thereby undercutting the likelihood for coordination and convergence. ...
... While, for instance, the ITU only spent around two years in helping the market settle the v.90 standard, such an approval was widely regarded as 'fast' in historical terms. 286 The timing issue, coupled with uncertainty about the ultimate choice of the ITU, would arguably affect firms' strategies, thereby undercutting the likelihood for coordination and convergence. ...
Full-text available
Article
Voluminous studies have documented the rise of international standards and their ramifications for the World Trade Organization (WTO), though most of these studies have focused on environment, food safety, public health, and financial regulations issues. An equally important, yet less explored, area is the information and communications technology (ICT) industry. This article seeks to contribute to the literature by examining the concept of an international standard in the ICT industry and its implications for the WTO. Drawing upon empirical data, this article makes four claims. First, today, the WTO policymakers are facing a ‘balkanized’ standard-setting paradigm in the ICT sector. Global standard-setting in the ICT industry is no longer the sole domain of the ‘Big Three’: the International Organization for Standardization (ISO), the International Electrotechnical Commission (IEC), and the International Telecommunications Union (ITU). Numerous industry consortia, mostly based in the USA, have emerged on the scene and in some way compete with the Big Three. Second, this paradigm shift engenders intense legal and political interest among major trading partners in the WTO, namely the USA and the EU. Applying the current WTO jurisprudence to this new paradigm, this article suggests that certain consortia may qualify as ‘international standardizing bodies’ for the purpose of the WTO. To the extent that standards developed by these consortia are recognized by the WTO, firms operating outside the US-based standardizing environment would bear higher costs in global trade. Additionally, this article argues that, while the Big Three seeks to respond to evolving market demands, their structural changes undercut the legitimacy as an international standardizing body. Fourth, intellectual property in the ICT standard-setting context is an eminent threat to the WTO. 
Ambiguities in licensing rules of the standardizing bodies—be they the Big Three or the industry consortia—may provide loopholes for emerging economies moving up the global value chain to use selectively an international standard.
... Building on this, Farrell (1996) emphasizes the tradeoffs between the delays inherent in achieving consensus and the benefits of avoiding a costly standards war. At an empirical level, Greenstein and Rysman (2007), examine the role of the ITU in establishing standards for the 56K modem market and conclude that the alternatives of regulation or the market would not have overcome the social costs of coordination any more easily. More recently, Simcoe (2012) demonstrates that the slow-down in standards production within the IETF (Internet Engineering Task Force) between 1993 and 2003 can be linked to distributional conflicts created by the rapid commercialization of the Internet. ...
Article
Technology standards refer to the specifications that provide users and vendors with a common platform and ensure compatibility between components of a technological system. These technical “rules of the game” are being increasingly set in standards development organizations (SDOs). In this paper, I ask the question: how do actors operating in these venues address the challenges posed by the anticipatory and collective nature of the specifications they are establishing? Through an in-depth analytic narrative of the Ethernet LAN (local area network) standard, I indicate how actors engage in an ongoing process of extension generation, ratification and incorporation. In imagining alterations to a specification, approving timely modifications and crafting an identity for a rule even as it changes, they manifest “pragmatic agency” in these contexts. In exercising such agency, SDO's substantially increase the functionality of an existing standard as well as boost its long-term competitive viability.
... SSOs apportion votes based on revenue categories (Greenstein and Rysman 2007). Without unanimity, a technology owner needs the support of either the necessary threshold proportion of SSO members or at a minimum the most powerful SSO members for its patented invention to be included in the standard. ...
Article
The quote in the title refers to a recurring principle in the Antitrust Guidelines for the Licensing of Intellectual Property, issued jointly by the US Department of Justice and the Federal Trade Commission in 1995. That report states that The Agencies' general approach in analyzing a licensing restraint under the rule of reason is to inquire whether the restraint is likely to have anticompetitive effects and, if so, whether the restraint is reasonably necessary to achieve procompetitive benefits that outweigh those anticompetitive effects. We apply this standard of evaluation to recent proposals for joint licensing negotiations in standard setting contexts, which have been offered as a solution to the problem of opportunistic licensing and patent hold up. We find that, to the contrary, joint negotiations are not reasonably necessary to prevent hold up. Instead, other more moderate policy solutions that take advantage of existing institutional features within standard setting bodies have a greater likelihood of preventing hold up without running the risk of anticompetitive licensee collusion that is present with joint negotiations. In particular, we posit that standard setting bodies should set voting rules to obtain majority support in the selection of technologies for a standard and should consider means of encouraging ex ante bilateral negotiations. In addition, competition authorities could focus on the enforcement of non-discriminatory licensing as a means of preventing anticompetitive opportunistic hold up.
Chapter
Some policymakers, courts, and academics have expressed concerns that when a firm’s patents are incorporated into a standard, the patents gain importance and can bestow on the patent holder market power that can be abused when the standard is commercialized. This paper extends the existing literature on the effect that standards can have on patents. This analysis has two aims: first, to better understand how an SSO might confer importance on included patents and second, to move closer to an empirical understanding of the impact of a standard on included patents. The authors create a dataset of patents named to voluntary standard setting organizations, as well as the patent pools that sometimes develop around such standards. The authors rely on proxies to capture a patent’s importance or value.
Article
The Institute of Electrical and Electronics Engineers, Inc. (IEEE) is a nonprofit organization whose aim is "to advance global prosperity by fostering technical innovation, enabling members' careers and promoting community worldwide". The IEEE was established in 1963, following the merger of the American Institute of Electrical Engineers (AIEE) and the Institute of Radio Engineers. The IEEE also plays a major role in the development of standards for IT, telecommunications, and power-related products and services through its Standards Association. Membership of this organization is open to "individuals who by education or experience give evidence of competence in an IEEE designated field of interest". The IEEE is organized into 319 sections in ten geographical regions. These are further sub-divided into 1676 chapters that bring together local members with similar technical interests, as well as 39 societies and five technical councils. The IEEE raises funds from membership dues, assessments, and fees. Keywords: American Institute of Electrical Engineers (AIEE); exchange membership privileges; Institute of Electrical and Electronics Engineers (IEEE); telecommunications
Article
The Distributed Management Task Force, Inc. (DMTF) is a non-profit industry organization whose aim is to develop, adopt, and promote interoperable management initiatives and standards for desktop, enterprise, and internet environments. The DMTF was established in 1992 in order to address the management complexities faced by businesses due to the proliferation of disparate networks, systems, applications, and management software. The DMTF has four levels of public membership, namely leadership, participation, monitoring, and sponsored. The authorities of the DMTF are the Board of Directors, the Technical Committee, the Marketing Committee, and the Interoperability Committee. The primary source of income for the DMTF is the annual dues paid by its members. Standard setting organizations are gaining an "increasingly important" role in transnational economic governance, and the DMTF plays an important part in developing and unifying management standards and initiatives for desktop, enterprise, and internet environments. Keywords: Distributed Management Task Force (DMTF); public membership; Technical Committee; transnational economic governance
Article
How and why did the U.S. commercial Internet access market structure evolve during its first decade? Commercial Internet access market structure arose from a propitious combination of inherited market structures from communications and computing, where a variety of firms already flourished and entrepreneurial norms prevailed. This setting nurtured innovative behavior across such key features as pricing, operational practices, and geographic coverage. Inherited regulatory decisions in communications markets had a nurturing effect on innovative activity. Ongoing regulatory decisions also shaped the market’s evolution, sometimes nurturing innovation and sometimes not. This narrative and analysis informs conjectures about several unique features of U.S. market structure and innovative behavior. It also informs policy debates today about the role of regulation in nurturing or discouraging innovative behavior.
Article
Some policymakers, courts, and academics have expressed concerns that when a firm’s patents are incorporated into a standard, the inclusion can create market power for the patent holders that can then be abused when the standard is commercialized. This paper offers a critical assessment of that proposition. Our analysis has two aims: first, to better understand exactly how an SSO might confer market power on included patents and second, to move closer to an empirical understanding of the market power proposition. We create a dataset of patents named to voluntary standard setting organizations, as well as the patent pools that sometimes develop around such standards, with the goal of providing some suggestive measurements of the effect standardization might have on market power. As it is extremely difficult to measure market power directly, we rely on proxies capturing a patent’s importance or value. We find that some SSOs do appear to enhance some included patents’ importance, but most do not. Moreover, the effects change over time, across standards, and across patents. Thus, we conclude that, on average, inclusion in an SSO tends to enhance a patent’s value, but for any particular patent named to a particular standard a positive effect is not inevitable. Instead, a broad range of effects is possible, some even negative but most equal to zero.
Article
Formal standardization - explicit agreement on compatibility standards - has important advantages over de facto standardization, but is marred by severe delays. I explore the tradeoffs between speed and the quality of the outcome in a private-information model of the war of attrition and alternative mechanisms, and show that the war of attrition can be excessively slow. I discuss strategies to reduce delay, including changes in intellectual property policy and in voting rules, early beginnings to standardization efforts, and the use of options.
Article
The role of institutions in mediating the use of intellectual property rights has long been neglected in debates over the economics of intellectual property. In a path-breaking work, Rob Merges studied what he calls "collective rights organizations," industry groups that collect intellectual property rights from owners and license them as a package. Merges finds that these organizations ease some of the tensions created by strong intellectual property rights by allowing industries to bargain from a property rule into a liability rule. Collective rights organizations thus play a valuable role in facilitating transactions in intellectual property rights. There is another sort of organization that mediates between intellectual property owners and users, however. Standard-setting organizations (SSOs) regularly encounter situations in which one or more companies claim to own proprietary rights that cover a proposed industry standard. The industry cannot adopt the standard without the permission of the intellectual property owner (or owners). How SSOs respond to those who assert intellectual property rights is critically important. Whether or not private companies retain intellectual property rights in group standards will determine whether a standard is "open" or "closed." It will determine who can sell compliant products, and it may well influence whether the standard adopted in the market is one chosen by a group or one offered by a single company. SSO rules governing intellectual property rights will also affect how standards change as technology improves. Given the importance of SSO rules governing intellectual property rights, there has been surprisingly little treatment of SSO intellectual property rules in the legal literature. My aim in this article is to fill that void. To do so, I have studied the intellectual property policies of dozens of SSOs, primarily but not exclusively in the computer networking and telecommunications industries. 
This is no accident; interface standards are much more prevalent in those industries than in other fields. In Part I, I provide some background on SSOs themselves, and discuss the value of group standard setting in network markets. In Part II, I discuss my empirical research, which demonstrates a remarkable diversity among SSOs even within a given industry in how they treat intellectual property. In Part III, I analyze a host of unresolved contract and intellectual property law issues relating to the applicability and enforcement of such intellectual property policies. In Part IV, I consider the constraints the antitrust laws place on SSOs in general, and on their adoption of intellectual property policies in particular. Part V offers a theory of SSO intellectual property rules as a sort of messy private ordering, allowing companies to bargain in the shadow of patent law in those industries in which it is most important that they do so. Finally, in Part VI I offer ideas for how the law can improve the efficiency of this private ordering process. In the end, I hope to convince the reader of four things. First, SSO rules governing intellectual property fundamentally change the way in which we must approach the study of intellectual property. It is not enough to consider IP rights in a vacuum; we must consider them as they are actually used in practice. And that means considering how SSO rules affect IP incentives in different industries. Second, there is a remarkable diversity among SSOs in how they treat IP rights. This diversity is largely accidental, and does not reflect conscious competition between different policies. Third, the law is not well designed to take account of the modern role of SSOs. Antitrust rules may unduly restrict SSOs even when those organizations are serving procompetitive ends. 
And enforcement of SSO IP rules presents a number of important but unresolved problems of contract and intellectual property law, issues that will need to be resolved if SSO IP rules are to fulfill their promise of solving patent holdup problems. My fourth conclusion is an optimistic one. SSOs are a species of private ordering that may help solve one of the fundamental dilemmas of intellectual property law: the fact that intellectual property rights seem to promote innovation in some industries but harm innovation in others. SSOs may serve to ameliorate the problems of overlapping intellectual property rights in those industries in which IP is most problematic for innovation, particularly in the semiconductor, software, and telecommunications fields. The best thing the government can do is to enforce these private ordering agreements and avoid unduly restricting SSOs by overzealous antitrust scrutiny.
Article
56K modems were introduced under two competing incompatible standards. We show the importance of competition between internet service providers in the adoption process. We show that ISPs were less likely to adopt the technology that more competitors adopted. This result is particularly striking given that industry participants expected coordination on one standard or the other. We speculate about the role of ISP differentiation in preventing the market from achieving standardization until a standard setting organization intervened.
Article
We test empirically for network effects and preannouncement effects in the DVD market. We do this by measuring the effect of potential (incompatible) competition on a network undergoing growth. We find that there are network effects. The data are generally consistent with the hypothesis that the preannouncement of DIVX temporarily slowed down the adoption of DVD technology.
Article
This article argues that the historic pre-eminence of the ITU in setting international telecommunications standards is likely to be increasingly threatened by the regional standards organizations (RSOs) and by formal or informal coordination among the RSOs. The interests the RSOs represent are powerful enough that the ITU cannot ignore agreements among them, nor is it likely to be able to set standards if they cannot agree. At the same time, their size, structure, and procedures are likely to make them more effective at agreeing on standards than the ITU, despite the latter's attempts to improve its procedures.
Article
This paper discusses the prevalence of Silicon Valley-style localizations of individual manufacturing industries in the United States. A model in which localized industry-specific spillovers, natural advantages, and random chance contribute to geographic concentration motivates new indices of geographic concentration and coagglomeration. The indices contain controls that facilitate cross-industry and cross-country comparisons. The authors find almost all industries to be more concentrated than a random dart-throwing model predicts but the degree of localization is often slight. They also discuss which industries are concentrated, the geographic scope of localization, coagglomeration patterns, and other topics. Copyright 1997 by the University of Chicago.
Article
Policymakers face an increasing number of questions regarding whether markets efficiently choose technological standards. In this essay I survey the economic literature regarding standards, focusing on arguments that markets move between standards either too slowly or too swiftly.